There is no place in the world where you feel as safe, or as alone, as in your bedroom. It’s why we feel comfortable there doing the most vulnerable things humans do. For most of recent history, a listening device within the four walls of one’s bedroom was a tool for evil, found only in science fiction novels and police states. Over the past few years, though, something dramatic has begun to change in our culture, something no oppressive regime, no science fiction writer, could have predicted. People are now willingly placing microphones in their own bedrooms. In fact, they’re paying money to do it.
Millions of people today own an Amazon Echo, a Google Home or an Apple Watch. These devices are designed to wake up and start listening when a certain phrase is spoken: “Alexa”, “Okay Google”, or “Hey Siri”. With the Apple Watch, you don’t even need to say anything: simply raising your arm has the same effect.
The added convenience of not having to get up and press a button poses an inherent technical problem for developers: for a device to respond to a spoken phrase, its microphone has to be listening at all times, and its detection will never be perfect. People mishear one another all the time, so it’s reasonable enough that our virtual assistants occasionally do, too. Plenty of words and phrases can sound like “Alexa”, depending on the intonation and accent of the speaker; “Siri” sounds a lot like “Syria”. Amazon, Google and Apple are constantly trying to improve their AI to minimize these false positives. In doing so, however, they’re walking a fine line.
Last month, a whistleblower told The Guardian that Apple hires third-party contractors to listen to and evaluate commands picked up by Siri. The customer recordings they receive tend to be trivial, but not always. The whistleblower reported having overheard multiple drug deals, doctor’s appointments in which sensitive medical details were discussed, and sexual encounters. These recordings aren’t completely anonymous, either: they come paired with the location of the originating device, as well as certain contact information and app data from the user. If nothing else, the content of the recordings themselves often includes personal information.
You might expect this news to have come as a surprise. Except it wasn’t a surprise, and it was barely even news. Just a couple of weeks earlier, the Belgian public broadcaster VRT NWS revealed that Google employees and contractors regularly listen to clips of customers’ Google Home recordings. In April of this year, Bloomberg revealed the same about Amazon’s Echo.
The problem with this seemingly ubiquitous industry practice isn’t just that tech companies are listening in on citizens’ lives without their consent (although that would surely be enough): it’s that third-party firms are often the ones doing the listening.
On some level, we all know that tech companies are mining our data. We know that Facebook’s algorithm draws on our Likes to target advertisements. We know Apple uses our locations to do the same. What’s new is that human beings are on the other end of those data channels more often than we’d expect, and that those people aren’t necessarily employees of Apple or Facebook. They work for obscure, privately owned companies.
It’s unsettling that somebody, somewhere in the world, may have overheard something you were doing in your own bedroom. If you own a virtual assistant, it’s not out of the question that it has already happened. What’s even more frightening is the prospect that those contractors lack the infrastructure to protect your bedroom recordings in the way that Apple might. As the Apple whistleblower told The Guardian:
“[. . .] there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people].”
Major conglomerates employ tens, even hundreds, of privately owned contractor companies to carry out all kinds of business functions. Hackers target those smaller companies as a way to get at their larger counterparts. Just recently, for example, we reported on a lone hacker who managed to breach U.S. Customs and Border Protection through a small contractor in Tennessee.
What might happen, then, if millions of virtual assistant recordings, taken from cars, doctors’ offices and bedrooms around the world, were to fall into the wrong hands?
It’s unlikely that tech giants will stop gathering our data as long as we keep using their products. However, there are steps Google, Amazon and Apple can take to regain some semblance of public trust. The easiest of those steps involves improving how they deal with contractors. Strict procedures and safeguards must be put in place to ensure contractors handle the data they’re given with care. That means prioritizing information security, vetting employees, and stripping identifying information from recordings. Additionally, advanced, AI-driven network monitoring can help big data companies keep a watchful eye on how contractors handle their customers’ data. That way, a small privacy breach can be spotted and addressed before it becomes a big, company-wide problem.
There’s a saying in sports: a team is only as strong as its weakest player. The same principle applies to privacy in cyberspace. What you do in your bedroom is only as private as the weakest link in the chain of people and companies that can hear it.
About the author:
Nathaniel Nelson writes the internationally top-ranked “Malicious Life” podcast on iTunes, hosts programs on blockchain and SCADA security, and contributes to AI and emerging tech blogs.