
Apple must face class action lawsuit over Siri privacy intrusions


Apple must face allegations that its Siri voice assistant listened in on users' private conversations, after a judge on Thursday ruled that a majority of claims in a proposed class action lawsuit can move forward.

U.S. District Court Judge Jeffrey White said plaintiffs can pursue claims that Siri recorded and saved private conversations after being accidentally triggered by the "Hey Siri" prompt, reports Reuters. Plaintiffs also allege, seemingly without evidence, that Apple released information gathered in the recordings to third parties.

According to the original filing, which dates back to 2019, one user asserts that a private conversation with his doctor about a "brand name surgical treatment" led to targeted ads for that treatment. Others claim similar circumstances involving Air Jordan sneakers, Pit Viper sunglasses and Olive Garden.

The suit piggybacks on a whistleblower report from July 2019 that claimed an internal Siri grading program could inadvertently reveal a user's identity, personal information and other private material.

Apple, like other operators of voice assistant technology, runs (or ran) ongoing programs to increase the accuracy of its product. In Siri's case, contractors were tasked with analyzing snippets of queries uploaded from devices like iPhone and HomePod to determine whether the assistant was invoked purposely or by mistake.

It was later learned that contractors were privy to sensitive recordings ranging from supposed drug deals to sexual encounters.

While Apple does inform users of ongoing Siri quality control initiatives in its terms and conditions, the language used is vague and does not specifically state that audio clips will be recorded and reviewed by other people.

Apple suspended the Siri grading initiative in August 2019 and implemented options for users to opt out of the program in an ensuing software update. The California class action was filed five days later.

Judge White ruled that plaintiffs can pursue claims that Apple violated the federal Wiretap Act and California privacy law, and committed breach of contract, the report said. A claim of unfair competition was tossed.



7 Comments

mcdave 19 Years · 1927 comments

I guess on-device Siri came too late.

The main issue is that most people don’t use the on-device correction/re-training that Siri has provided for years. Siri should listen for the swearing/frustration then invite the user to help improve its understanding.

cpsro 14 Years · 3239 comments

All frivolous except the Olive Garden. How embarrassing!

elijahg 18 Years · 2842 comments

mcdave said:
I guess on-device Siri came too late.

The main issue is that most people don’t use the on-device correction/re-training that Siri has provided for years. Siri should listen for the swearing/frustration then invite the user to help improve its understanding.

The main issue is Apple refuses to improve Siri's intelligence. On-device correction won't fix Siri's lack of understanding of the most basic commands, like "add 5 minutes to my 10 minute timer" which results in "I've changed it to 5 minutes". The audio clips Apple was reviewing are supposed to be going toward improving it, but it's still the embarrassing disaster it's always been.

dewme 10 Years · 5775 comments

I did notice on my camera-equipped Echo Show this morning that it had a tip about how to search for "find a barber near me." Okay, it's been a couple of weeks since my last trim, but I can probably hold out for another week. But maybe Amazon thinks otherwise ... based on ??? Hmmm. Now Alexa is suggesting tips for "where can I get tacos" and it's Thursday, not Tuesday. Total insanity. I guess the real question is, should I get the haircut before or after I get the tacos?  I'll ask Alexa.

Fidonet127 5 Years · 598 comments

elijahg said:
mcdave said:
I guess on-device Siri came too late.

The main issue is that most people don’t use the on-device correction/re-training that Siri has provided for years. Siri should listen for the swearing/frustration then invite the user to help improve its understanding.
The main issue is Apple refuses to improve Siri's intelligence. On-device correction won't fix Siri's lack of understanding of the most basic commands, like "add 5 minutes to my 10 minute timer" which results in "I've changed it to 5 minutes". The audio clips Apple was reviewing are supposed to be going toward improving it, but it's still the embarrassing disaster it's always been.

Apple isn’t refusing to improve Siri, nor is it a disaster. Yes, it can’t do some things and it does make mistakes. I find it good enough, and it’s head and shoulders above the rest on privacy. That privacy has also limited how quickly it has improved and what it can do. That doesn’t mean it deserves such hyperbole.