
Apple announces plans to improve Siri's privacy protections for users

Apple has completed its review of its Siri privacy practices, and the company is making a few changes going forward to further ensure users' privacy and data safety.

Beyond just its privacy page, Apple on Wednesday reiterated that Siri isn't used to build a marketing profile and that any collected data is never sold. Apple says that it uses Siri data only to improve Siri, and is "constantly developing technologies to make Siri even more private."

The company shared more details about the grading process as well.

"Before we suspended grading, our process involved reviewing a small sample of audio from Siri requests — less than 0.2 percent — and their computer-generated transcripts, to measure how well Siri was responding and to improve its reliability," wrote Apple. "For example, did the user intend to wake Siri? Did Siri hear the request accurately? And did Siri respond appropriately to the request?"

Apple says that its internal review has resulted in a few changes.

  • Users will be able to opt in to help Siri improve by learning from the audio samples of their requests. Those who choose to participate will be able to opt out at any time.
  • Apple will no longer retain audio recordings of Siri interactions, and will continue to use computer-generated transcripts to help Siri improve.
  • When customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions.
  • Apple will also "work to delete" any recording which is determined to be an inadvertent trigger of Siri.

On July 28, a "whistleblower" who was allegedly a contractor for Apple detailed a program that used voice recordings to improve the responses from Apple's voice assistant. The report said that a small team of contractors working for Apple was tasked with determining whether the Siri activation was accidental or intentional, whether the query fell within the range of Siri's capabilities, and whether Siri responded properly.

The main thrust of the report claims that Apple does not explicitly disclose to consumers that recordings are passed along to contractors — but Apple does tell users that some queries are manually reviewed, and has since the release of the service. Despite the information having been public-facing for at least six iterations of iOS, the "whistleblower" advised that they were concerned over the lack of disclosure, especially considering the contents of some recordings containing "extremely sensitive personal information."

The nature of the information, sometimes unintentional and not part of the query, is wide-ranging, the whistleblower said.

"You can definitely hear a doctor and patient, talking about the medical history of the patient," said the source. "Or you'd hear someone, maybe with car engine background noise - you can't say definitely, but it's a drug deal. You can definitely hearing it happening," they advised.

The whistleblower went on to state there are many recordings "featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data."

Allegedly, there wasn't a procedure in place to deal with sensitive recordings, with the whistleblower stepping forward over the suggestion the data could be easily misused.

Wednesday's discussion by Apple also reminds users that as much processing as possible is done on the device, that Siri uses as little data as possible to deliver an accurate result, and that the contents of Siri queries are not returned to Apple.

Furthermore, even before the announced changes, Apple's Siri used a random identifier — a long string of letters and numbers associated with a single device — to keep track of data while it's being processed, rather than tying it to your identity through your Apple ID or phone number — a process that we believe is unique among the digital assistants in use today. For further protection, after six months, the device's data is disassociated from the random identifier.
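To make that mechanism concrete, here is a minimal sketch in Swift of how a random device identifier can stand in for a user's identity and then be dropped after a retention window. It is purely illustrative: the types, names, and the six-month interval are assumptions drawn from the description above, not Apple's actual implementation.

import Foundation

// Illustrative sketch only; not Apple's implementation. It models the idea of
// tagging request data with a random device identifier instead of an account,
// and dropping that association after a retention window (six months here).
struct SiriRequestRecord {
    let deviceIdentifier: UUID?   // random identifier, never an Apple ID or phone number
    let transcript: String
    let receivedAt: Date
}

struct PseudonymizedStore {
    // Hypothetical retention window of six months, per the article.
    let retentionInterval: TimeInterval = 60 * 60 * 24 * 30 * 6

    private(set) var records: [SiriRequestRecord] = []

    mutating func ingest(transcript: String, from deviceIdentifier: UUID) {
        records.append(SiriRequestRecord(deviceIdentifier: deviceIdentifier,
                                         transcript: transcript,
                                         receivedAt: Date()))
    }

    // Disassociate old records from their random identifier, keeping only the
    // transcript for aggregate analysis.
    mutating func disassociateExpired(now: Date = Date()) {
        records = records.map { record in
            guard now.timeIntervalSince(record.receivedAt) > retentionInterval else {
                return record
            }
            return SiriRequestRecord(deviceIdentifier: nil,
                                     transcript: record.transcript,
                                     receivedAt: record.receivedAt)
        }
    }
}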

Apple has also published a support page with frequently asked questions about Siri privacy and grading.



24 Comments

agilealtitude 6 Years · 165 comments

Disappointing.

"will continue to use computer-generated transcripts"
"
only Apple employees will be allowed to listen to audio samples"
"work to delete"

Jargon I would have expected from other companies.  Not Apple.

gatorguy 13 Years · 24627 comments

The main thrust of the report claims that Apple does not explicitly disclose to consumers that recordings are passed along to contractors -- but Apple does tell users that some queries are manually reviewed, and has since the release of the service. Despite the information having been public-facing for at least six iterations of iOS, the "whistleblower" advised that they were concerned over the lack of disclosure, especially considering the contents of some recordings containing "extremely sensitive personal information."

Where does Apple specifically mention it? The Privacy Page clipping AI has been using was from iOS 6 AFAICT, but I've not been able to find any disclosure of queries being reviewed since then, and never by outside contractors anyway. Is there a more recent link to it, or is "six iterations of iOS" meant as a backhanded way of saying that Apple removed it from the privacy policy a few years ago?

FWIW Google too says recordings may be reviewed, using so many words to say so in a very legal and broad manner, but never says by humans employed by a 3rd party contractor. They should be far more transparent and no doubt will be going forward. They also intimated a few weeks ago that they will be moving the review program in-house just as Apple plans to. TBH I would have assumed that was always the case and more than surprised Amazon, Apple and Google were all sending recordings to outside companies and facilities. Was it just cheaper to do so? 

Mike Wuerthele 8 Years · 6906 comments

gatorguy said:
The main thrust of the report claims that Apple does not explicitly disclose to consumers that recordings are passed along to contractors -- but Apple does tell users that some queries are manually reviewed, and has since the release of the service. Despite the information having been public-facing for at least six iterations of iOS, the "whistleblower" advised that they were concerned over the lack of disclosure, especially considering the contents of some recordings containing "extremely sensitive personal information."

Where does Apple specifically mention it? The Privacy Page clipping AI has been using was from iOS6 but I've not been able to find any mention of queries being reviewed since then, and never by outside contractors anyway. Is there a more recent link to it, or is "six iterations of iOS" meant as a backhanded way of saying that Apple stopped disclosing it a few years ago? FWIW Google too says recordings may be reviewed, using so many words to say so, but never says by humans employed by a 3rd party contractor.

The clipping is substantively the same in iOS 6 through 11. Discussion of review has been in the privacy page since inception.

Nobody explicitly says "by humans," relying on terms like "agents" and "affiliates" but what else would review them?

rogifan_new 9 Years · 4297 comments

These are good changes. It’s too bad it took Apple being embarrassed by a salacious news story for it to happen.

rogifan_new 9 Years · 4297 comments

agilealtitude said:
Disappointing.

"will continue to use computer-generated transcripts"
"only Apple employees will be allowed to listen to audio samples"
"work to delete"

Jargon I would have expected from other companies.  Not Apple.

How do you expect Apple to improve Siri then?