A 'whistleblower' has taken issue with Apple's lack of disclosure that it has contractors listening to anonymized Siri queries -- but the company has said all along that it does.
Siri has repeatedly been the subject of improvements over the years, as Apple continues to refine its voice recognition systems to minimize the possibility of incorrect answers to questions. As part of this, a selection of recordings where potential issues are detected is passed along to workers for analysis.
A report from the Guardian reveals a small number of recordings are passed on to contractors working for Apple, who are tasked with determining whether the Siri activation was accidental or deliberate, whether the query was within the range of Siri's capabilities, and whether Siri acted properly.
The main thrust of the report claims that Apple does not explicitly disclose to consumers that recordings are passed along to contractors -- but Apple does tell users that some queries are manually reviewed, and has since the release of the service. Despite the information having been public-facing for at least six iterations of iOS, the "whistleblower" said they were concerned over the lack of disclosure, especially given that some recordings contain "extremely sensitive personal information."
The nature of the information, sometimes captured unintentionally and not part of the query, is wide-ranging, the whistleblower said.
"You can definitely hear a doctor and patient, talking about the medical history of the patient," said the source. "Or you'd hear someone, maybe with car engine background noise - you can't say definitely, but it's a drug deal. You can definitely hearing it happening," they advised.
The whistleblower goes on to state there are many recordings "featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data."
Allegedly, there isn't a procedure in place for dealing with sensitive recordings, and the whistleblower stepped forward over concerns the data could easily be misused. Citing a lack of vetting for employees and the broad amount of data provided, the whistleblower suggested "it wouldn't be difficult to identify the person you're listening to, especially with accidental triggers" that capture details like names and addresses, particularly for "someone with nefarious intentions."
Apple confirmed "A small portion of Siri requests are analysed to improve Siri and dictation," but added that the data is kept as secure as possible.
"User requests are not associated with the user's Apple ID," the company continued, "Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements."
Apple added that a random subset of less than 1% of daily Siri activations is used for grading, with recordings typically only a few seconds in length.
The report follows similar privacy-related stories about Google Assistant and Amazon's Alexa, where teams had access to some customer data logs with recordings for similar review purposes.
In the Amazon case, the captured voice data was associated with user accounts. In the Google case, a researcher provided the reporter who made the request with a voice snippet that Google had retained for analysis -- but Google says the samples aren't identifiable by user information. How the researcher identified the user isn't clear.