
Siri 'whistleblower' details drug deals & sex heard during manual reviews

A 'whistleblower' has taken issue with Apple's lack of disclosure that it has contractors listening to anonymized Siri queries — but the company has said all along that it does.

Siri has been the subject of repeated improvements over the years, as Apple continues to refine the voice recognition systems it uses to minimize the possibility of incorrect answers to questions. As part of this, a selection of recordings where potential issues are detected is passed along to workers for analysis.

A report from the Guardian reveals a small number of recordings are passed on to contractors working for Apple, who are tasked with determining whether the Siri activation was accidental or deliberate, whether the query fell within the range of Siri's capabilities, and whether Siri responded properly.

The main thrust of the report claims that Apple does not explicitly disclose to consumers that recordings are passed along to contractors — but Apple does tell users that some queries are manually reviewed, and has since the release of the service. Despite the information having been public-facing for at least six iterations of iOS, the "whistleblower" said they were concerned over the lack of disclosure, especially given that some recordings contain "extremely sensitive personal information."

The nature of the information, sometimes captured unintentionally and not part of the query itself, is wide-ranging, the whistleblower said.

"You can definitely hear a doctor and patient, talking about the medical history of the patient," said the source. "Or you'd hear someone, maybe with car engine background noise - you can't say definitely, but it's a drug deal. You can definitely hearing it happening," they advised.

The whistleblower goes on to state there are many recordings "featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data."

Allegedly, there isn't a procedure in place to deal with sensitive recordings, and the whistleblower stepped forward over the suggestion the data could be easily misused. Citing a lack of vetting for employees and the broad amount of data provided, the whistleblower suggested "it wouldn't be difficult to identify the person you're listening to, especially with accidental triggers" like names and addresses, particularly for "someone with nefarious intentions."

Apple confirmed "A small portion of Siri requests are analysed to improve Siri and dictation," but added that the data is kept as secure as possible.

"User requests are not associated with the user's Apple ID," the company continued, "Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements."

Apple added that a random subset of less than 1% of daily Siri activations is used for grading, with recordings typically only a few seconds in length.

The report follows similar privacy-related stories about Google Assistant and Amazon's Alexa, where teams had access to some customer data logs with recordings for similar review purposes.

In the Amazon case, the captured voice data was associated with user accounts. In the Google case, a researcher provided a voice snippet that Google had retained for analysis to the reporter who made the request — but Google says that the samples aren't identifiable by user information. How the user was identified by the researcher isn't clear.



74 Comments

AppleExposed 6 Years · 1805 comments

No Apple IDs are stored and they're one random clip at a time.

I can see the concerns but keeping it anonymous means we shouldn't worry.

bonerpope 6 Years · 2 comments

If {sounds like a drug deal}, go back to sleep

zoetmb 17 Years · 2655 comments

There's a big difference between "hearing" a Siri request ("where's the closest pizza place") and hearing conversations when a request isn't specifically being made. Which is it? The former doesn't bother me, but the latter does.

But if they were listening to me using Siri, all they would hear is a lot of annoyance and cursing after I asked the initial query.   

Mike Wuerthele 8 Years · 6906 comments

zoetmb said:
There's a big difference between "hearing" a Siri request ("where's the closest pizza place") and hearing conversations when a request isn't specifically being made. Which is it? The former doesn't bother me, but the latter does.

But if they were listening to me using Siri, all they would hear is a lot of annoyance and cursing after I asked the initial query.   

I'm sure some of what's caught are inadvertent triggers. That's the whole point of process improvement.

mystigo 16 Years · 183 comments

I bet they hear Siri being called "idiot" a lot. That's my go to response when it tries to take me to an address 3 states away instead of the same address 3 miles away. I have literally had it complain that it could not find a route to England from the US. The assumption that I was looking for a way to drive across the Atlantic to the UK is just flawed on so many levels. It has zero sense of geographical context. And it seems like it would be so easy to fix. Just assume I want the closest match. Idiot.