Apple will face allegations that its Siri voice assistant listened in on the private conversations of its users, as a judge on Thursday ruled that a majority of claims in a proposed class action lawsuit can move forward.
U.S. District Court Judge Jeffrey White said plaintiffs can pursue claims that Siri recorded and saved private conversations after being accidentally triggered by the "Hey Siri" prompt, reports Reuters. Plaintiffs also allege, seemingly without evidence, that Apple released information gathered in the recordings to third parties.
According to the original filing, which dates back to 2019, one user asserts that a private conversation with his doctor about a "brand name surgical treatment" led to targeted ads for that treatment. Others claim similar circumstances involving Air Jordan sneakers, Pit Viper sunglasses and Olive Garden.
The suit piggybacks on a whistleblower report from July 2019 that claimed an internal Siri grading program could inadvertently reveal a user's identity, personal information and other private material.
Apple, like other operators of voice assistant technologies, fields — or fielded — ongoing programs to improve the accuracy of its product. In Siri's case, contractors were tasked with analyzing snippets of queries uploaded from devices like iPhone and HomePod in an attempt to determine whether the assistant was invoked purposely or by mistake.
It was later learned that contractors were privy to sensitive recordings ranging from supposed drug deals to sexual encounters.
While Apple does inform users of ongoing Siri quality control initiatives in its terms and conditions, the language used is vague and does not specifically state that audio clips will be recorded and reviewed by other people.
Apple suspended the Siri grading initiative in August 2019 and implemented options for users to opt out of the program in an ensuing software update. The California class action was filed five days later.
Judge White ruled that plaintiffs can pursue claims that Apple violated the federal Wiretap Act and California privacy law, and committed breach of contract, the report said. A claim of unfair competition was tossed.
Comments
I guess on-device Siri came too late.
The main issue is that most people don’t use the on-device correction/re-training that Siri has provided for years. Siri should listen for the swearing/frustration, then invite the user to help improve its understanding.
All frivolous except the Olive Garden. How embarrassing!
I did notice on my camera-equipped Echo Show this morning that it had a tip about how to search for "find a barber near me." Okay, it's been a couple of weeks since my last trim, but I can probably hold out for another week. But maybe Amazon thinks otherwise ... based on ??? Hmmm. Now Alexa is suggesting tips for "where can I get tacos" and it's Thursday, not Tuesday. Total insanity. I guess the real question is, should I get the haircut before or after I get the tacos? I'll ask Alexa.