Amazon workers eavesdrop on your conversations with Alexa

Daniel Fowler
April 11, 2019

Amazon customers have complained that their privacy has been violated after it emerged that the company employs hundreds of people to listen to recordings of them speaking to their Alexa devices.

Alexa software is designed to continuously record snatches of audio while listening for its wake word, Bloomberg alleged, adding that clips reviewed by workers included what they believed to be a sexual assault and a child screaming. Engadget has reported that "two workers from Romania said they had to listen to what could've been sexual assault" and that "they were apparently told that they couldn't do anything about it, because it's not Amazon's job to interfere".

Sometimes these workers hear recordings they find upsetting, or possibly criminal.

Not only is Alexa listening when you speak to an Echo smart speaker, an Amazon employee is potentially listening, too.

The Alexa smart assistant answers your questions through your Amazon smart speaker, and privacy-conscious users have long worried about who else might be listening. Today's revelation at least partly confirms the validity of those concerns.


But Amazon insisted that "all information is treated with high confidentiality", citing "multi-factor authentication to restrict access, service encryption, and audits of our control environment to protect it". Employees annotate as many as 1,000 audio clips in a nine-hour shift. "We only annotate an extremely small number of interactions from a random set of customers in order to improve the customer experience", an Amazon spokesperson told Fox News in a written statement.

Bloomberg also said workers used internal chat rooms to help identify muddled words, as well as to share amusing recordings. The marketing materials for the Echo claim "Alexa lives in the cloud and is always getting smarter", only hinting in the lengthy Alexa FAQ that "We use your requests to Alexa to train our speech recognition and natural language understanding systems". Amazon and Google allow the voice recordings to be deleted from your account, for example, but this may not be permanent: the recordings could continue to be used for training purposes. "A global team reviews audio clips in an effort to help the voice-activated assistant respond to commands", the newswire wrote.

"You don't necessarily think of another human listening to what you're telling your smart speaker in the intimacy of your home", Florian Schaub, a professor at the University of Michigan who has researched privacy issues with smart speakers, told Bloomberg. Google also has humans working on Assistant, but audio recordings are intentionally distorted, and there's no account data associated with them.

Google's Duplex digital assistant, revealed at the Google I/O conference last year, appears to work better than every other speech recognition software. One of the workers interviewed said they transcribe up to 100 recordings a day in which Alexa was triggered by something other than the wake word.

The retention of the audio files is purportedly voluntary, but this is far from clear in the information Amazon gives users.

