Apple Siri, Google Assistant, and other digital assistants are triggered without your knowledge


Here are the 1000+ words that can trigger Apple Siri, Google Assistant, Amazon Alexa, and Microsoft Cortana into recording your private conversations

On a bright morning, you are discussing buying a new washing machine or an oven with your wife. A while later, when you open Google or Facebook, you immediately find advertisements for washing machines and ovens splashed across your screen. Has this happened to you?

It has happened to me and many others, including Thomas Brewster of Forbes, who tweeted about it a month back.

While many would say it is not possible, a group of security researchers at Ruhr-Universität Bochum and the Max Planck Institute for Security and Privacy in Germany have discovered that certain words spoken in households can trigger the personal digital assistants you may have on your PC, laptop, smartphone, smart TV, or any other smart device.

The group found that virtual assistants like Apple Siri, Google Assistant, Amazon Alexa, and Microsoft Cortana are triggered by a specific set of words and start recording your private conversations after those words are spoken. The researchers discovered that even dialogue from a TV serial, movie, or smartphone recording playing in front of a virtual assistant could activate it to record the conversations taking place thereafter.

The researchers have identified nearly 1,000 words that could trigger Apple Siri, Google Assistant, Amazon Alexa, and Microsoft Cortana without the knowledge or consent of users. Some of the identified trigger words and phrases are "election," "tobacco," "unacceptable," "a letter," "OK, cool," "chill," "OK, who is reading," "hey Jerry," "a city," "Montana," "duet," "love," "pose," "what the," "are you," "silly boy," "train," and the names of some metropolitan cities.

Here is Amazon’s Alexa getting triggered during a Game of Thrones episode:

Here is Apple’s Siri getting triggered:

From the two videos, it can be seen that certain phrases spoken by the actors on TV trigger the respective devices. Once triggered, these virtual assistants start recording the conversations and relay them back to their parent servers. Here are specific words that can trigger each virtual assistant:

  • Alexa: “unacceptable,” “election,” and “a letter”
  • Google Home: “OK, cool,” and “Okay, who is reading”
  • Siri: “a city” and “hey jerry”
  • Microsoft Cortana: “Montana”
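To see why phrases like these slip through, note that wake-word spotting accepts approximate matches rather than exact ones, so anything that scores "close enough" to the wake phrase can cross the acceptance threshold. The following is an illustrative sketch only, not any vendor's actual detection pipeline: real assistants compare acoustic and phonetic features, not characters, and the wake phrases and threshold values below are assumptions for the demo.

```python
# Illustrative sketch only: real assistants score acoustic/phonetic
# features, not characters, but the false-trigger mechanism is the
# same -- a near-miss score crosses the wake threshold.
from difflib import SequenceMatcher

# Assumed wake phrases, for illustration.
WAKE_WORDS = {
    "alexa": "alexa",
    "google home": "ok google",
    "siri": "hey siri",
    "cortana": "cortana",
}

def similarity(a: str, b: str) -> float:
    """Crude character-level stand-in for an acoustic similarity score."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def false_triggers(phrase: str, threshold: float) -> list[str]:
    """Assistants whose wake phrase scores 'close enough' to the phrase."""
    return [name for name, wake in WAKE_WORDS.items()
            if similarity(phrase, wake) >= threshold]

print(false_triggers("montana", 0.6))    # → ['cortana']
print(false_triggers("a letter", 0.45))  # → ['alexa']
```

Lowering the threshold makes the detector more responsive but also more prone to exactly the kind of false wake-ups the researchers documented; that trade-off is tuned by each vendor.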

The researchers said that they had analyzed voice assistants from Amazon, Apple, Google, Microsoft, and Deutsche Telekom, as well as three Chinese models by Xiaomi, Baidu, and Tencent. They will be releasing their study on Tuesday. While the major companies have declined to comment on the triggers, a developer working on Amazon Alexa said that all digital assistants are made sensitive to a set of words, regardless of whether those words are spoken by a human or played from a computer or TV.

This is really frightening for privacy lovers. If such words can trigger virtual assistants to wake up and record conversations without the knowledge of the owner, it is only natural that you see an ad for a washing machine or oven when you open Google or Facebook. Somewhere down the line, a portion of your conversation has been logged on some distant server.
