Following Apple's decision to temporarily halt Siri grading as it evaluates the program's privacy safeguards, Amazon and Google this week followed suit and updated their respective policies on human reviews of recorded voice assistant audio.
Apple on Thursday suspended its Siri grading program, which seeks to make the virtual assistant more accurate by having workers review snippets of recorded audio, after a contractor raised privacy concerns about the quality control process.
Now, Apple's competitors in the space, namely Google and Amazon, are making similar moves to address criticism about their own audio review policies.
Shortly after Apple's announcement, Google in a statement to Ars Technica on Friday said it, too, halted a global initiative to review Google Assistant audio. Like Siri grading, Google's process runs audio clips by human operators to enhance system accuracy.
Unlike Apple's Siri situation, however, a contractor at one of Google's international review centers leaked 1,000 recordings to VRT NWS, a news organization in Belgium. In a subsequent report in July, the publication claimed it was able to identify people from the audio clips, adding that a number of snippets were of "conversations that should never have been recorded and during which the command 'OK Google' was clearly not given."
The VRT leak prompted German authorities to investigate Google's review program and impose a three-month ban on transcribing Assistant voice recordings.
"Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate. This paused reviews globally," Google told Ars Technica.
Google did not divulge the halt to global reviews until Friday.
Amazon is also taking steps to temper negative press about its privacy practices and on Friday rolled out a new Alexa option that allows users to opt out of human reviews of audio recordings, Bloomberg reports. Enabling the feature in the Alexa app excludes recorded audio snippets from analysis.
"We take customer privacy seriously and continuously review our practices and procedures," an Amazon spokeswoman said. "We'll also be updating information we provide to customers to make our practices more clear."
Amazon came under fire in April after a report revealed the company records, transcribes and annotates audio clips recorded by Echo devices in an effort to train its Alexa assistant.
While it may come as a surprise to some, human analysis of voice assistant accuracy is common practice in the industry; it is up to tech companies to anonymize and protect that data to preserve customer privacy.
Apple's method is outlined in a security white paper (PDF link) that notes the company ingests voice recordings, strips them of identifiable information, assigns a random device identifier and saves the data for six months, during which time the system can tap into the information for learning purposes. Following the six-month period, the identifier is erased and the clip is saved "for use by Apple in improving and developing Siri for up to two years."
Apple does not explicitly mention the possibility of manual review by human contractors or employees, nor does it currently offer an option for Siri users to opt out of the program. The company will address the latter issue in a future software update.
24 Comments
Odd wording and you have the timeline wrong anyway.
"Google did not divulge the halt to global reviews until Friday" is incorrect.
German authorities announced on Thursday that Google had previously paused its program, and that the German Data Protection Commissioner would additionally bar the company from running it for the next three months. In the same statement, the German agency recommended that Apple and Amazon follow Google's lead.
"In a statement released Thursday (and due to time zone differences, I believe Wednesday here in the US), Germany's data protection commissioner said the country was investigating reports that contractors listen to audio captured by Google's AI-powered Assistant to improve speech recognition. In the process, according to the reports, contractors found themselves listening to conversations accidentally recorded by products like the Google Home. In the statement, the German regulator writes that other speech assistant providers, including Apple and Amazon, are 'invited' to 'swiftly review' their policies."
Nothing may come of it, but common business practices may not be acceptable going forward, and certainly more transparency will be required.
Of some note, Ireland had already been looking at Apple and Siri due to a citizen complaint, but Apple's response to them at the time was that no disclosure was necessary since the recordings were anonymised. That defense tactic has probably now changed.
Gatorguy: I believe you misread the article and jumped to conclusions. Mikey's article clearly states that Google did not reveal that they had paused the reviews globally until Friday. Your statement doesn't contradict that at all -- it refers exclusively to the pausing of audio review in Germany.
Interesting that you're so quick to defend Google that you'd make a careless error like that.
Also unchanged: of the three companies, Apple was the only one that was anonymizing its voice clips before all this controversy even started, as per its white paper. Anonymizing is not 100 percent foolproof against identifying someone (for example, if they identify themselves in the recording, or it's obviously a famous person with a distinctive voice, etc.), but it was and is better than what was previously the policy at Google and Amazon, which left identifying information intact.
Apple had no choice but to stop doing this. This was obvious.
Good to see the others doing the same. But also a bit unnerving to see that what Apple was doing in this regard was qualitatively no different from what Google and Amazon were doing.
Such a big FUCK UP.
'Grading' sounds like 'culling.' Unbelievable that this term is used, wow.