Web giant admits Alexa IS listening

Tech giant Amazon recently admitted that staff are listening to private and sometimes disturbing voice recordings from Alexa to improve the voice assistant’s understanding of human speech.

They are listening to, transcribing and even joking about the private conversations customers have with Alexa on their smart speakers.

The shocking admission yet again raises ethical questions over the future of AI smart assistants in the home. How exactly are tech companies like Amazon gathering personal information, what are they doing with all this personal data, and what should they be doing with it?

Ethical issues

Smart assistants, and the use of recordings taken in the privacy of a customer’s home, have been a long-standing ethical issue for tech firms.

Hundreds of Amazon workers are listening to thousands of audio clips from Alexa-enabled devices, some of which contain extremely private information. Recordings are being analysed and transcribed by staff before being fed back into the software.

Rather disturbingly, in one case a suspected sexual assault was heard but not reported to police.

If you trawl through Amazon’s lengthy T&Cs, there is no mention of staff listening in to recordings.

However, technically, users have given permission for the human verification: the company makes clear that it uses data “to train our speech recognition and natural language understanding systems” and gives users the chance to opt out. But the company doesn’t explicitly say that the training will involve workers in Costa Rica, India, the United States and other countries around the world listening to those recordings.

It is also believed that both Apple and Google – makers of Siri and Assistant respectively – operate similar systems, employing human reviewers to eavesdrop on conversations.

When approached by Bloomberg News, the tech giant said that ‘an extremely small sample of Alexa voice recordings’ was analysed by staff, and that it has ‘procedures in place’ for workers to follow when they hear something distressing.

In a statement given to Bloomberg, Amazon said: “We take the security and privacy of our customers’ personal information seriously. We only annotate an extremely small sample of Alexa voice recordings in order [to] improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone.

“We have strict technical and operational safeguards and have a zero-tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow. All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption and audits of our control environment to protect it.”

How to protect your privacy and turn off Alexa settings

If you own an Amazon device, there is a way to turn the setting off via the app to prevent Amazon employees from listening in. The option to share this information is enabled by default. Here’s how you can turn it off:

  • Open the Alexa app on your phone.
  • Tap the menu button on the top left of the screen.
  • Select “Alexa Account.”
  • Choose “Alexa Privacy.”
  • Select “Manage how your data improves Alexa.”
  • Turn off the button next to “Help Develop New Features.”
  • Turn off the button next to your name under “Use Messages to Improve Transcriptions.”