Alexa is always listening…

By Anonymous | March 2, 2022

We have seen a tremendous rise in the use of virtual personal assistants in the last few years. It is now common for families to have smart home devices installed throughout their homes, everything from doorbells to voice-activated lights. Alongside this rapidly evolving technology comes a new wave of concerns about consumer privacy and data protection.

Of course, to provide a personalized experience, virtual personal assistants collect and store information about how you interact with them.

Let’s focus on Amazon’s Alexa smart home ecosystem, which has repeatedly come under the spotlight for privacy violations and bugs that have compromised users’ privacy.

As Shannon Flynn reports, [“Alexa is an always-on device, so it’s constantly listening for the wake word and will record anything that comes after it”](https://www.lifewire.com/can-alexa-record-conversations-5205324). The light on Alexa devices turns blue to indicate when they are recording, but people aren’t necessarily watching for these visual cues. There is an inherent risk of Alexa accidentally recording conversations it should not. Amazon’s explanation for these incidents has always been that Alexa was activated by a word that sounded like its wake word, “Alexa,” and consequently began recording. In one extreme case, a couple in [Portland had a private conversation recorded and forwarded to one of their contacts, and they only found out because that person called to tell them they had received the message](https://www.theguardian.com/technology/2018/may/24/amazon-alexa-recorded-conversation).
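
To make the accidental-trigger risk concrete, here is a minimal, hypothetical sketch of how an always-on wake-word loop works in principle. This is not Amazon’s code; the names, the text-similarity check, and the end-of-request heuristic are all illustrative assumptions. The point is that the device keeps only a short rolling buffer until something resembling the wake word appears, and everything after a trigger, including a false one, gets captured and sent onward.

```python
# Hypothetical illustration of an always-on wake-word loop.
# This is NOT Amazon's implementation; the names and thresholds are made up
# to show why a near-miss word can start a recording by accident.
from collections import deque
from difflib import SequenceMatcher

WAKE_WORD = "alexa"
TRIGGER_SIMILARITY = 0.7   # hypothetical: how close a word must be to count as the wake word
ROLLING_BUFFER_SIZE = 16   # hypothetical: short local buffer kept while idle

def sounds_like_wake_word(word: str) -> bool:
    """Crude text-similarity stand-in for acoustic wake-word detection."""
    return SequenceMatcher(None, word.lower(), WAKE_WORD).ratio() >= TRIGGER_SIMILARITY

def upload_to_cloud(recording: list[str]) -> None:
    """Stand-in for shipping the captured audio to remote servers."""
    print("uploaded:", " ".join(recording))

def listen(audio_stream) -> None:
    """Continuously scan the stream; record and upload everything after a trigger."""
    rolling_buffer = deque(maxlen=ROLLING_BUFFER_SIZE)
    recording: list[str] = []
    triggered = False

    for chunk in audio_stream:               # each chunk is one word, standing in for an audio frame
        if not triggered:
            rolling_buffer.append(chunk)     # idle audio is overwritten as the buffer rolls over
            if sounds_like_wake_word(chunk):
                triggered = True             # a false positive starts recording just the same
        else:
            recording.append(chunk)          # everything after the trigger is kept
            if chunk.endswith("."):          # naive "end of request" heuristic
                upload_to_cloud(recording)   # sensitive audio leaves the home at this point
                recording, triggered = [], False

# A private conversation in which the name "Alexis" falsely matches the wake word:
listen(iter("we told Alexis about the surprise party on Friday .".split()))
```

In this toy run, the name “Alexis” is close enough to the wake word to start a recording, and everything said afterwards ends up “uploaded,” which mirrors how the Portland couple’s conversation left their home without anyone intending it to.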

These accidentally recorded conversations are also uploaded to Amazon’s servers immediately, which poses an even larger risk: sensitive and personally identifiable information is now being transmitted and stored by Amazon. While the driving force behind Alexa is artificial intelligence and machine learning, there is also a [“human quality-control team that reviews user recordings to ensure Alexa’s accuracy,” as reported by Robert Earl Wells III](https://www.lifewire.com/is-alexa-safe-to-use-4780145). So in addition to sensitive data being stored on Amazon’s servers, Amazon employees may end up with access to very precise information about specific users. Users can delete accidentally recorded information, but only if they know a recording was made in the first place, which the Portland couple learned only by chance.

Amazon has recently introduced new restrictions on these human reviews and given users the option to opt out completely. There are also ways to disable the “always listening” feature individually on each Alexa device, but doing so makes voice activation unusable: users have to manually activate the microphone before Alexa will listen and respond. While this is a safer option in terms of privacy, your speech and interaction data is still sent off to Amazon’s servers.

Exactly when Alexa stops recording after being activated is not immediately clear. The intention behind recording conversations in the first place is for Alexa to learn more about its users and provide a more personalized experience.

The most you can do to protect your privacy is to delete your recordings regularly and, at the very least, mute Alexa so that your conversations are not recorded and analyzed by Amazon. We don’t have a clear understanding of what exactly Amazon does with the data beyond using it to further personalize experiences for customers, but there is tremendous potential for the company to make assumptions about you based on the voiceprints and potential biometrics it is logging.

You are responsible for protecting your own privacy when it comes to smart devices like Alexa, so take advantage of the mechanisms that Amazon provides to increase your protection instead of sticking to the factory defaults.

References:
https://www.forbes.com/sites/tjmccue/2019/04/19/alexa-is-listening-all-the-time-heres-how-to-stop-it/?sh=f9bc6395e2d2
https://www.lifewire.com/is-alexa-safe-to-use-4780145
https://www.lifewire.com/can-alexa-record-conversations-5205324
https://www.theguardian.com/technology/2018/may/24/amazon-alexa-recorded-conversation