People Who Buy Smart Speakers Have Given Up on Privacy, Researchers Find

If you find always-listening smart devices creepy, but bought an Amazon Echo anyway, you’re not alone. A recent study from researchers at the University of Michigan found that people who own smart speakers are aware of the risks, but feel resigned to the idea that the erosion of privacy is now a fact of life.

“What was really concerning to me was this idea that ‘it’s just a little bit more info you give Google or Amazon, and they already know a lot about you, so how is that bad?’” said Florian Schaub, an assistant professor in the University of Michigan School of Information and a co-author of the study. “It’s representative of this constant erosion of what privacy means and what our privacy expectations are.”

Smart home devices—like internet-connected speakers, TVs, and microwaves—have been involved in multiple privacy scandals. This year, a couple’s private conversation was recorded by their smart speaker and then sent to a random contact. In 2015, people discovered that a buried line in the privacy policy for Samsung’s smart TVs meant that everything you say could be captured and sent to a third party. Like all internet-connected tech, such devices are also susceptible to data breaches or hacks.

Yet despite knowledge of these risks, many people say they’re resigned to the idea that we’re going to be spied on and there’s nothing we can do about it.

This kind of privacy nihilism isn’t new, and we’ve long understood that while people say they value privacy, they often act in ways that undermine their privacy protections (by not changing their passwords, for example). But Schaub and his colleagues wanted to investigate how this played out with internet-connected, voice-enabled speakers in particular, like the Amazon Echo. Schaub told me they were curious if people’s behavior with the device might reveal a more conscious effort to preserve privacy.

The researchers interviewed 17 smart speaker users and 17 people who had deliberately chosen not to buy a smart speaker, and had the smart speaker users keep a weekly log of how they used the device. Schaub and his team found that speaker owners weren’t bothering to take steps to protect their privacy. Many devices have a mute button that allows the user to turn off the microphone, for example, but the researchers found most users had never used it.

It was also rare for users to go through their activity logs, where they can review and delete recordings. Instead of using this feature to protect personal privacy, the researchers found users were actually using it to spy on housesitters and babysitters.

Smart speaker makers like Amazon say the devices only record after they hear a special “wake word” (like “Alexa” or “Hey, Google”), something many users said gives them peace of mind. But Schaub said this requires us to deeply trust the companies we buy the devices from.

“We really have to trust Google and Amazon that they respect people’s privacy and adhere to what they’re describing,” Schaub said. “It’s still a fact that you are putting a live microphone into your home and your intimate spaces, and it’s software that decides whether it’s recording with a trigger word or all the time.”

Device functions with huge privacy implications could change over time, Schaub said, and users might not always be up to date on the latest changes.

Schaub said he found the results of the study concerning, particularly the fact that many users seemed apathetic to the idea of companies getting more information about their private lives—if we’re fine with the constant erosion of our privacy, it might no longer exist one day.

If users were more demanding of privacy protections, Schaub said, a few simple changes to the technology could improve users’ privacy without sacrificing convenience. The mute button, for example, could be voice-activated—since that’s how everyone uses their speaker—rather than requiring a manual button press.

“For some reason, it’s not possible to verbally tell Alexa to stop listening for ten minutes, or say ‘forget everything you heard in the last hour,’” Schaub said. “I don’t think it would be very hard for companies to implement these kinds of features.”