Coincidentally, the voice-activation mechanism that drives home AI assistants is also the biggest chink in their armor. Burger King's recent TV commercial showed just how easy it is to hijack a home through these voice-activated devices.
Advances in AI have paved the way for the emergence of home automation technologies and set off a race in Silicon Valley. Smart virtual assistants, in the form of voice-enabled speaker devices, gradually started finding their way into homes, with Amazon's Echo and Google Home being the market trailblazers.
Play a song, control the lights or air-conditioning, check the weather or news headlines, shop online or perform a web search... it is now possible to complete everyday tasks just by asking your AI assistant speaker. All you need is your voice to trigger the device. Unfortunately, that means anyone with a voice can do the same, including disembodied voices like those coming from the TV screen.
In the famous tale “Ali Baba and the Forty Thieves” from the “Arabian Nights” collection, the protagonist uses the magic phrase “Open, Sesame!” as a vocal password to get into a cave replete with treasures. Ali Baba overheard the thieves pronounce the phrase and used it to move the voice-activated rock.
In a similar way, Burger King used the phrase “OK, Google” to trigger Google Home and get to another kind of treasure: free advertising.
Recently, Burger King aired a 15-second TV spot that triggered Google Home and tricked it into reading an altered section of the Wikipedia page about BK's Whopper sandwich.
This is actually a simple "hack" that doesn't involve any code or computers. Google Home is activated by the phrase "OK, Google" and then reacts to the spoken request that follows. BK's commercial featured an employee who uttered the phrase "OK Google. What is the Whopper burger?" Naturally, Google Home devices in homes around the country were triggered and automatically searched for the information on the web.
Those Google Home owners were regaled with the Wikipedia definition of the Whopper burger.
Abuses and Countermeasures
Some unkind users even made edits to the Wikipedia page so that the Whopper's ingredients included "a medium-sized child" and "rat and toenail clippings." Google and Wikipedia both reacted quickly: the former blocked the ad from triggering its devices, and the latter blocked further edits to the Whopper Wikipedia page. How much more will these organizations have to do to stop further abuse?
Burger King may have gotten the idea from a Super Bowl LI Google Home ad, which inadvertently triggered a response from the Google Home devices of Super Bowl viewers.
Despite its questionable legality and disregard for privacy, BK's stunt revealed the scope of abuse to which these systems are vulnerable. Both Google and Amazon say they have countermeasures to prevent such incidents, but neither explains exactly how.
One possible countermeasure to the misuse of these devices could be in the use of biometric data. Using fingerprints or iris scans, perhaps these devices could be inextricably linked to one user or a group of users and would therefore not be susceptible to an external hack.
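To make the idea concrete, here is a minimal sketch of how a device might gate commands on speaker verification: compare an embedding of the incoming voice against the embeddings of enrolled household members and reject anything that doesn't match. The function names, the toy three-dimensional embeddings, and the 0.85 threshold are all illustrative assumptions, not how Google Home or Echo actually work.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two voice-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_enrolled_speaker(embedding, enrolled, threshold=0.85):
    """Accept a command only if the voice matches an enrolled user closely enough."""
    return any(cosine_similarity(embedding, e) >= threshold for e in enrolled)

# Hypothetical example: a TV ad's voice fails to match the household's profiles,
# while an enrolled owner's voice passes.
household = [[0.9, 0.1, 0.4], [0.8, 0.2, 0.5]]
tv_voice = [0.1, 0.9, 0.2]
owner_voice = [0.88, 0.12, 0.42]

print(is_enrolled_speaker(tv_voice, household))     # False
print(is_enrolled_speaker(owner_voice, household))  # True
```

A real system would derive the embeddings from an acoustic model rather than hand-set vectors, but the gating logic, match against enrolled profiles or refuse the command, is the part that would have stopped a TV commercial from issuing requests.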
Yet, in India, where biometric IDs have been issued to over 1 billion people, the use of biometrics is heavily criticized for being the “world’s largest mass surveillance project.” The project is touted as a deterrent to corruption and fraud, but it is unlikely that such intimate methods of data collection will ever be implemented without strong pushback.
What Does Siri Know About Burger King?
Apple’s senior vice president, Philip Schiller, has nothing nice to say about Google Home or Amazon Echo (and Alexa), so he’s going to “say nothing at all.”
Yet, ahead of Apple's developer conference, WWDC, KGI Securities analyst Ming-Chi Kuo said there is a 50% chance that Apple will unveil its own AI assistant speaker during the June 2017 event.
The post Burger King and Other Brands can Hack Your Google Home or Amazon Echo appeared first on Edgy Labs.