Amazon, Google follow Apple's lead on voice assistant review policies

After Apple temporarily halted Siri grading to evaluate the program's privacy safeguards, Amazon and Google this week followed suit, updating their respective policies on human review of recorded voice assistant audio.

Apple on Thursday suspended its Siri grading program, which seeks to make the virtual assistant more accurate by having workers review snippets of recorded audio, after a contractor raised privacy concerns about the quality control process.

Now, Apple's competitors in the space, namely Google and Amazon, are making similar moves to address criticism about their own audio review policies.

Shortly after Apple's announcement, Google in a statement to Ars Technica on Friday said it, too, had halted a global initiative to review Google Assistant audio. Like Siri grading, Google's process runs audio clips by human reviewers to improve system accuracy.

Unlike in Apple's case, however, Google's pause came after a contractor at one of its international review centers leaked 1,000 recordings to VRT NWS, a news organization in Belgium. In a subsequent report in July, the publication claimed it was able to identify people from the audio clips, adding that a number of snippets were of "conversations that should never have been recorded and during which the command 'OK Google' was clearly not given."

The VRT leak prompted German authorities to investigate Google's review program and impose a three-month ban on the transcription of voice recordings.

"Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate. This paused reviews globally," Google told Ars Technica.

Google did not divulge the halt to global reviews until Friday.

Amazon is also taking steps to temper negative press about its privacy practices and on Friday rolled out a new Alexa option that allows users to opt out of human reviews of audio recordings, Bloomberg reports. Enabling the feature in the Alexa app excludes recorded audio snippets from analysis.

"We take customer privacy seriously and continuously review our practices and procedures," an Amazon spokeswoman said. "We'll also be updating information we provide to customers to make our practices more clear."

Amazon came under fire in April after a report revealed the company records, transcribes and annotates audio clips recorded by Echo devices in an effort to train its Alexa assistant.

While it may come as a surprise to some, human analysis of voice assistant accuracy is common practice in the industry; it is up to tech companies to anonymize and protect that data to preserve customer privacy.

Apple's method is outlined in a security white paper (PDF link) that notes the company ingests voice recordings, strips them of identifiable information, assigns a random device identifier and saves the data for six months, over which time the system can tap into the information for learning purposes. Following the six-month period, the identifier is erased and the clip is saved "for use by Apple in improving and developing Siri for up to two years."

Apple does not explicitly mention the possibility of manual review by human contractors or employees, nor does it currently offer an option for Siri users to opt out of the program. The company will address the latter issue in a future software update.
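
To make the white paper's retention schedule concrete, here is a minimal sketch, in Swift, of a pseudonymized two-stage retention pass along the lines Apple describes: clips carry a random identifier for six months, then keep only the recording data for up to two years. The types and names (StoredClip, retentionPass) are hypothetical illustrations, not Apple's actual implementation or API.

import Foundation

// Illustrative model only: a stored clip tagged with a random identifier
// that is not linked to an Apple ID or any other account information.
struct StoredClip {
    var deviceIdentifier: UUID?   // random identifier; nil once it has been erased
    let transcript: String
    let capturedAt: Date
}

// One pass over stored clips applying the two retention stages described above:
// erase the random identifier after six months, discard the clip after two years.
func retentionPass(_ clips: [StoredClip], now: Date = Date()) -> [StoredClip] {
    let sixMonths: TimeInterval = 182 * 24 * 60 * 60
    let twoYears: TimeInterval = 2 * 365 * 24 * 60 * 60

    return clips.compactMap { clip -> StoredClip? in
        let age = now.timeIntervalSince(clip.capturedAt)
        if age > twoYears {
            return nil                      // dropped entirely after two years
        }
        var kept = clip
        if age > sixMonths {
            kept.deviceIdentifier = nil     // identifier erased after six months
        }
        return kept
    }
}

Under a scheme like this, a clip older than six months can no longer be tied back to the device that produced it by the identifier alone, which is the anonymization claim debated in the comments below.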



24 Comments

gatorguy 13 Years · 24627 comments

Odd wording, and you have the timeline wrong anyway.
"Google did not divulge the halt to global reviews until Friday" is incorrect.

The Germans on Thursday announced that Google had previously paused their program and would now additionally be restricted from using it for the next three months by the German Data Protection Commissioner. In the same statement, the German agency recommended that Apple and Amazon follow Google's lead.

"In 
a statement released Thursday, (and due to time zone differences I believe Wednesday here in the US) Germany’s data protection commissioner said the country was investigating that contractors listen to audio captured by Google’s AI-powered Assistant to improve speech recognition. In the process, according to the reports, contractors found themselves listening to conversations accidentally recorded by products like the Google HomeIn the statement, the German regulator writes that other speech assistant providers, including Apple and Amazon, are “invited” to “swiftly review” their policies. 

Nothing may come of it, but common business practices may not be acceptable going forward, and certainly more transparency will be required.

Of some note, Ireland had already been looking at Apple and Siri due to a citizen complaint, but Apple's response to them at the time was that no disclosure was necessary since the recordings were anonymised. That defense tactic has probably now changed.

chasm 10 Years · 3625 comments

Gatorguy: I believe you misread the article and jumped to conclusions. Mikey's article clearly states that Google did not reveal that they had paused the reviews globally until Friday. Your statement doesn't contradict that at all -- it refers exclusively to the pausing of audio review in Germany.

Interesting that you're so quick to defend Google that you'd make a careless error like that.

Also unchanged: Apple among the three companies was the only one that was always anonymizing its voice clips before all this controversy even started, as per their white paper. Anonymizing is not 100 percent foolproof against identifying someone (for example, they identify themselves in the recording, or it's obviously a famous person with a distinctive voice, etc.), but it was and is better than what was previously the policy at Google and Amazon, which left identifying information intact.

anantksundaram 18 Years · 20391 comments

Apple had no choice but to stop doing this. This was obvious.

Good to see the others doing the same. But also a bit unnerving to see that what Apple was doing in this regard was qualitatively no different from what Google and Facebook were. 

knowitall 11 Years · 1648 comments

Such a big FUCK UP.
'Grading' sounds like 'culling'; unbelievable that this term is used, wow.

gatorguy 13 Years · 24627 comments

chasm said:
Gatorguy: I believe you misread the article and jumped to conclusions. Mikey's article clearly states that Google did not reveal that they had paused the reviews globally until Friday. Your statement doesn't contradict that at all -- it refers exclusively to the pausing of audio review in Germany.

Interesting that you're so quick to defend Google that you'd make a careless error like that.

Also unchanged: Apple among the three companies was the only one that was always anonymizing its voice clips before all this controversy even started, as per their white paper. Anonymizing is not 100 percent foolproof against identifying someone (for example, they identify themselves in the recording, or it's obviously a famous person with a distinctive voice, etc.), but it was and is better than what was previously the policy at Google and Amazon, which left identifying information intact.

Wrong, Chasm. Beginning with the headline, the article is not entirely accurate.

For example, The Verge reported Thursday that Google had voluntarily stopped the reviews at some earlier date, which could have been weeks or days earlier, without specifying whether the pause applied to just the EU (it was never just Germany) or worldwide. Common sense would say worldwide, since it was a contractor problem that needed to be addressed, not a regional one.

Ars didn't report the story until Friday and added the word "globally," which still doesn't show "Google and Amazon following Apple's lead." Factually, it was the German authorities telling Apple and Amazon they should seriously consider following Google's lead. I assume you read what was announced. Saying Google didn't divulge the program pause until Friday is a half-truth at best, IMO.

No, that is not defending Google; it's pointing out that this had nada to do with anything Apple chose to do with their program. Rather than Apple being proactive and everyone else following their lead, Apple for their part was forced into it to avoid a formal inquiry by yet another country's data privacy commissioner, IMO, though it's reasonable to assume Apple might have done so of their own volition at some point.

Also, Google was not supplying voice snippets for review that were not anonymized. They did not "leave identifying information intact" for contractors. What we don't know is whether Apple was more similar to Amazon, including things like location, gender, and device ID, for instance; not that it would make those recordings as presented personally identifiable anyway. The leak implies some form of identifying data was attached, but Apple has volunteered very little information, nor has Google for that matter. They are both pretty vague and not exactly forthcoming, avoiding comment unless pushed into it.