I'll admit it. I love my wearables. I love competing against my former CTO and out-walking him. (It may be a rare day, but it's a good one.) I love that my mother, who lives thousands of miles away, can call for help via her Lifeline. It's one less worry in a sea of worries.
But my love -- and my patience -- is being tested. With all the Apple Watch enthusiasm, and the blockbuster early sales of the iPhone 6, I find that there is a growing pit in my stomach. I find myself not laughing nearly as hard at the South Park episode where the "Apple Police" pursue Kyle for clicking on the iTunes Terms of Service (TOS). When the grain of truth turns into a bushel, well, it's just not funny.
During the last eight months, I have been working on a research project focused on the future of voice and communications. If there is one crystal-clear finding thus far, it is that people loathe the idea of their voice being recorded -- particularly when they cannot opt out of it (e.g., customer service calls), cannot delete it later, or, worse yet, are unaware that it's even happening.
When it comes to privacy, people feel that their voice is a line in the sand. It's one thing to take my metadata, but leave my voice the (insert expletive here) alone.
So why is it that Siri, the star of the Apple Watch, is allowed to record me unannounced? Search Apple's terms of service, and you won't find details on the practice. Furthermore, Siri's FAQs fail to speak to voice recording at all. Thanks to Wired magazine, we know more: a reporter last year substantiated that Apple retains voice recordings for six months, then de-identifies them and retains them for another 18 months, purportedly for research purposes.
Although I cannot say exactly what "de-identifies" means, I can say this: Rest assured, your voice is being recorded and used to benefit Apple.
The violation here is subtle in interface, huge in practice. It sets a precedent for recording our words without the usual safeguards.
Part of the problem is that Apple and other Over The Top (OTT) players -- meaning companies that deliver audio, video, and other media over the Internet without the involvement of a traditional distributor (e.g., Multi-System Operator or MSO) -- lump voice into the general data bucket. But voice is not data. Voice is a part of our identity in a way that text can never claim. It feels to us like the most personal of our Personally Identifiable Information (PII).
Now you may love to hate your local telecom provider, but safeguarding voice is a near religion for insiders with roots in the phone business. In the US, there is a long history of telecoms vigorously appealing warrants to keep consumer voice private. In Germany, which has some of the strictest privacy laws in the world, Deutsche Telekom was ranked the country's most trusted brand among Internet and mobile providers.
We have laws in the US designed to protect consumers from unannounced recordings, and far stricter ones in the EU. So, how exactly is this unannounced recording possible?
Voice interfaces are new, and we don't necessarily recognize the difference between talking to a device and talking through one. Talking to a device feels like talking to a person. Since the vast majority of our human-to-human conversations are ephemeral, we tend to think of this new conversational model the same way. And yet, it is not the same. Siri is not a person, of course, so she is not covered under laws governing conversations. We should be treating "her" like a recording device. This is where our humanness bites us. It's way too easy to anthropomorphize her, particularly when she answers our knock-knock jokes.
The way Apple benefits from our recorded voice has me worried about other boundaries the Apple Watch might cross. It has access to our vitals too. One particular vital gives me real pause: Heart Rate Variability (HRV). HRV is not only as unique as a fingerprint, but changes in it can signal elevated anxiety, Post-Traumatic Stress Disorder (PTSD), and a pending heart attack.
Thinking back to my mother, the Apple Watch could be far more helpful than the Lifeline in preventing a catastrophic outcome by calling 911 long before she realizes she is in danger. This would be a massive leap forward for healthcare and telemedicine, which I am eager to embrace. And yet, it also means that our most personal health data is being transmitted from our watch to our iPhone over Bluetooth. And sadly, far too few people understand just how hackable Bluetooth is.
With the speed of technological change fast approaching a blur, legislation cannot possibly stay in sync. As individuals, we need to make our concerns known and demand transparency around privacy practices. Given the asymmetrical power relationship between Apple and other OTT players (I'm looking at you, Google) and their end users, providing clear and well-summarized privacy policies and Terms of Service is the very least they can do.
For the full story, see EBN sister site InformationWeek.