
Apple suspends Siri quality control program, will let users opt out in update

Apple has temporarily suspended its Siri quality control program after a Guardian exposé last week claimed private contractors are privy to "highly sensitive recordings," revelations that immediately raised eyebrows among privacy advocates.

In a statement to TechCrunch, Apple said it has suspended the Siri response grading program as it reviews the initiative designed to determine whether the virtual assistant is being inadvertently triggered. The company will also allow users to opt in or out of Siri grading as part of a forthcoming software update.

"We are committed to delivering a great Siri experience while protecting user privacy," an Apple spokesperson said. "While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading."

Last week, Apple came under fire when a report from The Guardian cited a Siri grading program employee as saying the process can inadvertently reveal a user's identity, personal information and other private material.

In an effort to make Siri more accurate, Apple employs contractors who listen to snippets of Siri queries uploaded from devices like iPhone and HomePod. The goal, according to the company, is to determine whether the assistant was invoked purposely or by mistake, a judgment that can only be made by a human operator.

While Apple takes steps to anonymize collected data and disassociate evaluated recordings from device owners, the identities and private information of users can sometimes be gleaned from overheard audio, the contractor said. Further, the contractor claims some audio clips feature "private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data."

"You can definitely hear a doctor and patient, talking about the medical history of the patient," the person said. "Or you'd hear someone, maybe with car engine background noise — you can't say definitely, but it's a drug deal. You can definitely hearing it happening."

The contractor also questioned Apple's transparency on the subject, positing that the company does not go far enough to disclose to consumers how the grading process is conducted and what it entails.

Apple responded to the claims, saying "a small portion" of Siri requests — less than 1% of daily activations — are evaluated by personnel for quality control purposes. Reviewers must adhere to "strict confidentiality requirements" and conduct their work in "secure facilities," Apple said, adding that typical recordings subject to grading are a few seconds long.

While Apple does inform users of ongoing Siri quality control initiatives in its terms and conditions, the language used is vague and does not specifically state that audio clips will be recorded and reviewed by other people.

Apple's move to temporarily halt Siri grading is in line with the company's well-cultivated public image as a bastion of consumer privacy. With critics lambasting Google, Facebook and others for harvesting user information, Apple wields privacy as a potent marketing tool, promising customers that its products and services provide world-leading data security.



35 Comments

revenant 15 Years · 610 comments

I honestly see the dilemma, but how do you test real-world conversations, voicing, regional dialects?

I will happily let Apple do what they do; I believe they are far and away better at privacy than the other jokers.

mobird 20 Years · 758 comments

What does it take to invoke Siri to listen in on a complete conversation as it is suggested? Most people know that it is a stretch to get Siri to do the most basic task asked of it.

Soli 9 Years · 9981 comments

This wasn't a big issue since the data was already anonymized and yet they're taking even more steps to help ensure user security. I wish more companies acted this way.

kimberly 10 Years · 434 comments

mobird said:
What does it take to invoke Siri to listen in on a complete conversation as it is suggested? Most people know that it is a stretch to get Siri to do the most basic task asked of it.

 :D :D :D