
Irish data protection body questioning Apple over Siri grading practices

The Siri interface icon in iOS 13. Credit: Apple


A day after the Siri whistleblower lamented that there were no investigations into Apple's Siri quality monitoring, the Irish Data Protection Commission said it is again questioning Apple over its grading practices.

News of the DPC's contact with Apple comes hours after Thomas Le Bonniec wrote to European data regulators and outed himself as the whistleblower who first revealed, in July 2019, that human contractors were listening to and grading Siri requests.

In a statement to Reuters, the DPC said it is "in contact" with the Cupertino tech giant. The regulator acted after Le Bonniec's letter on May 20, which pushed for further investigations into Apple's human-based Siri practices.

Irish DPC Deputy Commissioner Graham Doyle said that the regulatory body "engaged with Apple on this issue when it first arose last summer," adding that Apple "has since made some changes."

"However, we have followed up again with Apple following the release of this public statement and await responses," Doyle said.

The deputy commissioner added that the European Data Protection Board is currently working on producing guidance specifically in the area of voice assistant technology.

In the May 20 letter, Le Bonniec wrote that "it is worrying that Apple ... keeps ignoring and violating fundamental rights and continues their massive collection of data." He added that Apple should be "urgently investigated."

The former Apple contractor first took issue with the company's Siri policies in the summer of 2019, revealing to The Guardian that humans were listening to a select number of anonymized Siri recordings. While Apple had long disclosed that some Siri recordings were manually reviewed, it had not made explicitly clear that human contractors were doing the listening.

Although Siri queries are not linkable to individual users, the whistleblower mainly took issue with the fact that some recordings allegedly contained highly sensitive and personal conversations or situations.

In response to the complaint and ensuing controversy, Apple reviewed its Siri practices and implemented several changes, including making Siri optimization an opt-in policy and not retaining audio recordings of interactions with the digital assistant.




2 Comments

chasm 10 Years · 3624 comments

This will probably go nowhere, or nearly so, since Apple has already implemented both clear language about how Siri works on its privacy website AND an easy-to-reach opt-out if one wants it.

It's funny to me how Amazon, Microsoft, and Google never seem to come up for this type of scrutiny even though they also use human graders, admit freely that said graders listen to personally-identifiable clips (unlike Apple), and to the best of my knowledge do not have an opt-out (at least not one that users can easily find).

Slight correction: Win 10 has an opt-out during installation, but I can't seem to find it after that point (but perhaps that's just me).

gatorguy 13 Years · 24627 comments

chasm said: It's funny to me how Amazon, Microsoft, and Google never seem to come up for this type of scrutiny even though they also use human graders, admit freely that said graders listen to personally-identifiable clips (unlike Apple), and to the best of my knowledge do not have an opt-out (at least not one that users can easily find).

Slight correction: Win 10 has an opt-out during installation, but I can't seem to find it after that point (but perhaps that's just me).

Google also uses anonymized clips, just like Apple does. Always has. In fact, the default is opt-out (deny): you have to actively opt in (allow) to voice reviews. You obviously didn't bother checking; do you prefer making it up as you go along? https://www.blog.google/products/assistant/doing-more-protect-your-privacy-assistant/