We’ve all been there. You are talking to your friend or watching something on TV and suddenly your voice assistant comes to life. What caused that? No one said any words that seemed to resemble “Hey, Google” or “Hey, Siri.”
Well, according to new research, there’s a growing list of words and phrases that can wake your digital voice assistants. Besides being an annoyance, unintended trigger words can lead to a breach of privacy. Not only can law enforcement request access to logs, but voice snippets are occasionally sent to these companies for quality control, where people analyze the recordings to (hopefully) improve the voice assistants.
You can see why this is an issue. It’s one thing for Google, Amazon, Facebook, or Microsoft to analyze you asking a voice assistant about movie times; it’s another for human contractors to hear about your sex life, your finances, or anything else that was never intended for them.
The researchers used various words and television shows (Game of Thrones, Modern Family, and more) to see what would trigger various voice assistants. In total, over 1,000 words and phrases were found to trigger the voice assistants. Alexa, for example, would react to words like “election,” “unacceptable,” and “a letter.” Google Assistant was tricked by “Ok, cool.” Siri was tripped up by “a city.” The list goes on.
Look, these systems are going to make mistakes. We all speak differently, and the technology behind these voice assistants has to account for that. Just know that what happens in your house might not stay in your house. Ars Technica reports that the full study has not yet been released, but we’ll update this with an official link when it’s available.
- How to locate and delete voice commands on Google Assistant, Siri, Alexa, and Cortana
- Great, researchers have found a way to hack Alexa, Google Home, and Siri with laser pointers
- Yes, Apple is listening in on your Siri requests, but it’s all done in-house and you can opt-out
- PSA: Don’t use Siri, Alexa, or others to ask for customer service numbers, you might get a scammer