AppleInsider is supported by its audience and may earn commission as an Amazon Associate and affiliate partner on qualifying purchases. These affiliate partnerships do not influence our editorial content.
An Oregon family's Amazon Echo recorded household audio and sent it to one of the husband's employees, something Amazon blamed on a rare bug that it intends to fix. [Updated with additional Amazon explanation]
The employee called the family two weeks ago to inform them of what had happened and to warn them to unplug their Alexa devices, according to KIRO 7. They did so, and while the husband initially disbelieved the story, the employee was able to share details from the audio files, such as a conversation about hardwood floors.
The wife in the family, Danielle, called Amazon multiple times. An Alexa engineer investigated and confirmed the situation through logs, without detailing how the incident might have happened. The company did offer to de-provision the family's Alexa communications so they could continue to control smart home devices, but Danielle said she's been fighting with representatives to secure a refund.
"Amazon takes privacy very seriously," the company said in a statement to KIRO. "We investigated what happened and determined this was an extremely rare occurrence. We are taking steps to avoid this from happening in the future."
Critics of smart speakers by Apple, Amazon, Google, and others have worried that the devices could be used as spy tools, whether by corporations, governments, or independent hackers. By design, the speakers have to communicate with remote servers to interpret voice commands, performing only a basic amount of processing locally.
Update: An Amazon spokesperson contacted AppleInsider to provide the following statement:
"Echo woke up due to a word in background conversation sounding like 'Alexa.' Then, the subsequent conversation was heard as a 'send message' request. At which point, Alexa said out loud 'To whom?' At which point, the background conversation was interpreted as a name in the customer's contact list. Alexa then asked out loud, '[contact name], right?' Alexa then interpreted background conversation as 'right.' As unlikely as this string of events is, we are evaluating options to make this case even less likely."
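The failure mode Amazon describes can be pictured as a four-step confirmation dialog in which each step happened to misfire on ordinary background conversation. The sketch below is purely illustrative and not Amazon's actual code; every function name, state, and matching rule here is an assumption made to show how a chain of individually plausible misinterpretations can end in an unintended message.

```python
# Illustrative sketch (not Amazon's implementation): a simplified dialog
# state machine for a wake -> intent -> contact -> confirm flow.

def run_dialog(transcripts, contacts):
    """Walk a hypothetical message-sending dialog.

    `transcripts` is the sequence of phrases the recognizer produced
    from overheard speech; returns the contact a message would go to,
    or None if the flow is abandoned at any step.
    """
    it = iter(transcripts)

    # Step 1: wake word. A similar-sounding word starts listening.
    if "alexa" not in next(it, "").lower():
        return None

    # Step 2: intent. Background speech is heard as "send message".
    if "send message" not in next(it, "").lower():
        return None

    # Step 3: device asks "To whom?"; the next phrase is fuzzily
    # matched against the contact list (here: naive substring match).
    heard = next(it, "").lower()
    match = next((c for c in contacts if c.lower() in heard), None)
    if match is None:
        return None

    # Step 4: device asks "[contact name], right?"; any phrase heard
    # as "right" confirms, and the recording is sent.
    if "right" not in next(it, "").lower():
        return None
    return match

# Each step misfires on ordinary conversation:
overheard = ["...alexa...", "...send message...", "...john...", "...right..."]
print(run_dialog(overheard, ["John", "Mary"]))  # -> John
```

The point of the sketch is that the confirmation step only helps if the user is actually attending to the device; when the "confirmation" is itself mis-heard background speech, each checkpoint in the chain silently passes.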