Personal voice assistants eavesdropping

If you are a user of Siri, Alexa, or Google Assistant, you’re familiar with “false alarms”: you utter a completely innocent phrase and suddenly one of these voice assistants springs into action. This can be funny at times, but it has become quite a privacy issue. Researchers managed to produce a list of more than 1,000 word sequences that incorrectly trigger the devices. A brief summary of their yet-to-be-published paper “Unacceptable, where is my privacy?” can be found here.

And since these devices record portions of what’s said and share them with their manufacturer, potentially private conversations (or fragments of them) can end up on the servers of the manufacturers. A year ago, The Guardian reported that Apple employees sometimes have access to sensitive conversations.

Source: Uncovered: 1,000 phrases that incorrectly trigger Alexa, Siri, and Google Assistant | Ars Technica

Photo by Morning Brew on Unsplash
