US pair’s private chat sent to coworker by AI bug.
It’s time to break out your “Alexa, I Told You So” banners – because a Portland, Oregon, couple received a phone call from one of the husband’s employees earlier this month, telling them she had just received a recording of them talking privately in their home.
“Unplug your Alexa devices right now,” the staffer told the couple, who did not wish to be fully identified, “you’re being hacked.”
At first the couple thought it might be a hoax call. However, the employee – over a hundred miles away in Seattle – confirmed the leak by revealing the pair had just been talking about their hardwood floors.
The recording had been sent from the couple’s Alexa-powered Amazon Echo to the phone of the employee, who is in the husband’s contact list. She forwarded the audio to the wife, Danielle, who was amazed to hear herself talking about their floors. Suffice to say, this episode was unexpected: the couple had not instructed Alexa to send a copy of their conversation to anyone else.
“I felt invaded,” Danielle told KIRO-TV. “A total privacy invasion. Immediately I said, ‘I’m never plugging that device in again, because I can’t trust it.'”
The couple then went around their home unplugging all their Amazon Alexa gadgets – they had them all over the place to manage various smart home devices, including a thermostat and security system – and then called the web giant to complain about the snooping tech.
According to Danielle, Amazon confirmed that it was the voice-activated digital assistant that had recorded and sent the file to a virtual stranger, and apologized profusely, but gave no explanation for how it may have happened.
“They said ‘our engineers went through your logs, and they saw exactly what you told us, they saw exactly what you said happened, and we’re sorry.’ He apologized like 15 times in a matter of 30 minutes and he said we really appreciate you bringing this to our attention, this is something we need to fix!”
She said she’d asked for a refund for all their Alexa devices – something the company has so far declined to give.
Alexa, what happened? Sorry, I can’t respond to that right now
We asked Amazon for an explanation, and today the US giant responded confirming its software screwed up:
Amazon takes privacy very seriously. We investigated what happened and determined this was an extremely rare occurrence. We are taking steps to avoid this from happening in the future.
For this to happen, something has gone very seriously wrong with the Alexa device’s programming.
The machines are designed to listen constantly for the “Alexa” wake word, filling a one-second audio buffer from their microphones at all times in anticipation of a command. When the wake word is detected in the buffer, the device records what is said until there is a gap in the conversation, then sends the audio to Amazon’s cloud system to transcribe it, figure out what needs to be done, and respond.
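That always-listening loop can be sketched as a toy Python model. To be clear, this is our illustration, not Amazon’s firmware: the buffer length matches the one-second figure above, but the wake-word detector and silence check are hypothetical stand-ins passed in as callables.

```python
from collections import deque

SAMPLE_RATE = 16_000     # assumed mono sample rate for the sketch
BUFFER_SECONDS = 1.0     # the one-second buffer described above

class WakeWordListener:
    """Toy model of the always-listening loop: keep a rolling one-second
    audio buffer, start capturing only once the wake word is spotted,
    and hand back the utterance when the conversation goes quiet."""

    def __init__(self, detect_wake_word, is_silence):
        self.buffer = deque(maxlen=int(SAMPLE_RATE * BUFFER_SECONDS))
        self.detect_wake_word = detect_wake_word  # hypothetical detector
        self.is_silence = is_silence              # hypothetical VAD check
        self.recording = None                     # None means asleep

    def feed(self, chunk):
        """Process one chunk of samples; return an utterance if one ended."""
        if self.recording is None:
            self.buffer.extend(chunk)             # keep only the last second
            if self.detect_wake_word(self.buffer):
                self.recording = []               # wake up, start capturing
        else:
            self.recording.extend(chunk)
            if self.is_silence(chunk):            # gap in the conversation
                utterance, self.recording = self.recording, None
                return utterance                  # would be sent to the cloud
        return None
```

The point of the structure is that nothing outside the rolling buffer is retained until the detector fires – which also means a false positive on the wake word silently flips the device into recording mode.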
The talking, always-listening system is remarkably effective, which has led to it becoming an extremely successful consumer product and sparked competing voice-controlled gizmos from Google and Apple.
Amazon has since been doing everything it can to position Alexa as a foundational technology, opening it up to apps, tying in smart-home products so voice commands can be used to make changes inside a house and, more recently, allowing it to access contact lists and make phone calls.
Which all sounds terrific until it goes wrong and your device acts like a bug, recording what you say in the privacy of your own home and sending a recording to a seemingly random contact.
The truth is that in its determined effort to expand Alexa’s usefulness, and so consolidate its lead in the market, Amazon has been moving too fast. All too often in recent months the devices have wrongly heard their wake word – something users tend to discover only when the device provides an unexpected response to a question it wasn’t asked.
The voice recognition and AI system behind Alexa is also far from perfect, leading to misunderstandings. So long as those misunderstandings and unexpected responses are not too frequent though, users put up with it because of the usefulness of the product overall.
The problem with constantly increasing what the device can do, however, is that a misunderstanding can have a far greater impact than providing a nonsensical response. The device is now expecting to hear commands that allow it to interact with a huge range of features and services – from calling people to warming up the house – and it appears Amazon has turned the dial too far in allowing Alexa to act immediately on what it thinks it heard, rather than double-check a command if it isn’t clear.
Presumably, in this case, the system not only heard its wake word incorrectly but then also misinterpreted the conversation as asking it to send the recording to the person in the contacts list. Which it then did, and at some point decided that was the end of the conversation and went back to sleep.
Whether the device announced what it was doing – as it is designed to – and wasn’t heard, or failed to announce it at all, will be important to know. Although for the end user, it’s a distinction that may not actually matter.
Clearly there is going to be some serious fallout from the situation since everyone’s fear about such a system has just been realized.
Amazon will be hotly debating how to respond and how much information to provide over what went wrong. We have no doubt that the company will inform us in a few days that it has discovered the issue and fixed it so it will never happen again. And we expect to see some plausible reason why this was a one-off.
But the truth is that Alexa devices can easily be turned into bugs if there is a hardware or software mistake, and we are willing to bet that in its haste to constantly update its devices, Amazon let a big one through.
And now, dear readers, enjoy yourselves in the comments section. ®
Updated to add
A spokesperson for Amazon has been in touch with more details on what happened during the Alexa Echo blunder, at least from their point of view. We’re told the device misheard its wake-up word while overhearing the couple’s private chat, started processing talk of wood flooring as commands, and it all went downhill from there. Here is Amazon’s explanation:
The Echo woke up due to a word in background conversation sounding like “Alexa.” Then, the subsequent conversation was heard as a “send message” request. At which point, Alexa said out loud “To whom?” At which point, the background conversation was interpreted as a name in the customers contact list. Alexa then asked out loud, “[contact name], right?” Alexa then interpreted background conversation as “right.” As unlikely as this string of events is, we are evaluating options to make this case even less likely.
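Amazon’s step-by-step account reads like a dialogue flow that accepts its best guess at every stage rather than bailing out when unsure. Here is a hypothetical Python sketch of that failure mode – the function names, contact name, and confidence threshold are all invented for illustration, not taken from Amazon’s code:

```python
# Toy reconstruction of the dialogue flow Amazon describes: each step
# acts on its best guess, however weak - the "dial turned too far" problem.

def send_message_flow(hear, contacts, send):
    """hear() returns (text, confidence); any guess above a very low
    bar is accepted, so background chatter can drive the whole flow."""
    CONFIDENCE_BAR = 0.1                     # assumed, deliberately low

    text, conf = hear()                      # background chat misheard
    if text != "send message" or conf < CONFIDENCE_BAR:
        return "back to sleep"
    print("To whom?")
    name, conf = hear()                      # chatter taken as a name
    match = next((c for c in contacts if c == name), None)
    if match is None or conf < CONFIDENCE_BAR:
        return "back to sleep"
    print(f"{match}, right?")
    reply, conf = hear()                     # chatter taken as "right"
    if reply == "right" and conf >= CONFIDENCE_BAR:
        send(match)                          # the recording goes out
        return "sent"
    return "back to sleep"
```

With a higher bar – or a requirement for an unambiguous, wake-word-prefixed confirmation – any one of the three weak matches would have aborted the chain and the recording would never have left the house.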