Alex Stamos, former Facebook security chief, says Apple's approach to CSAM scanning and iMessage exploitation protections may have done more harm than good for the cybersecurity community.
Once iOS 15 and Apple's other fall operating systems are released, Apple will introduce a set of features intended to prevent child exploitation on its platforms. These features have sparked a fiery online debate over user privacy and the future of Apple's reliance on encryption.
Alex Stamos is currently a professor at Stanford and previously served as security chief at Facebook. During his tenure there, he encountered countless families harmed by abuse and sexual exploitation.
He wants to stress the importance of technologies such as Apple's in combating these problems. "A lot of security/privacy people are verbally rolling their eyes at the invocation of child safety as a reason for these changes," Stamos said in a tweet. "Don't do that."
The tweet thread laying out his views on Apple's decisions is extensive, but it offers some insight into the issues raised by Apple and outside experts alike.
In my opinion, there are no easy answers here. I find myself constantly torn between wanting everybody to have access to cryptographic privacy and the reality of the scale and depth of harm that has been enabled by modern comms technologies. Nuanced opinions are ok on this.
— Alex Stamos (@alexstamos) August 7, 2021
The nuance of the discussion has been lost on many experts and concerned internet users alike. Stamos says both the EFF and NCMEC reacted with little room for conversation, using Apple's announcement as a stepping stone to advocate for their equities "to the extreme."
Information from Apple's side hasn't helped the conversation either, Stamos says. For example, the leaked internal memo in which NCMEC called concerned experts "screeching voices of the minority" comes across as harmful and unfair.
Stanford hosts a series of conferences around privacy and end-to-end encryption products. Apple has been invited but has never participated, according to Stamos.
Instead, Apple "just busted into the balancing debate" with its announcement and "pushed everybody into the furthest corners" with no public consultation, said Stamos. The introduction of non-consensual scanning of local photos combined with client-side ML might have "poisoned the well against any use of client side classifiers."
The implementation of the technology itself has left Stamos puzzled. He notes that on-device CSAM scanning isn't necessary unless it's a precursor to end-to-end encryption of iCloud backups; otherwise, Apple could simply perform the same scanning server-side.
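To make that distinction concrete, here is a minimal sketch of the client-side matching idea. This is not Apple's actual pipeline: the real system uses a perceptual NeuralHash, a match threshold, and a private set intersection protocol, while this illustration substitutes a plain SHA-256 digest and an in-memory hash set.

```swift
import Foundation
import CryptoKit

// Heavily simplified stand-in for on-device matching. The SHA-256
// digest and plain Set are illustrative assumptions; a real system
// needs a perceptual hash so near-duplicate images still match.
struct KnownImageDatabase {
    // In Apple's design, the hash database ships inside the OS image.
    let knownDigests: Set<Data>

    func matches(_ imageData: Data) -> Bool {
        let digest = Data(SHA256.hash(data: imageData))
        return knownDigests.contains(digest)
    }
}

// Client-side check: runs before upload, so it still works even if
// the payload is end-to-end encrypted afterwards. This is why Stamos
// reads the design as groundwork for encrypted iCloud backups.
func flagBeforeUpload(_ imageData: Data, db: KnownImageDatabase) -> Bool {
    return db.matches(imageData)
}

// Server-side alternative: the same check on Apple's servers, which
// is possible today precisely because iCloud Photos uploads are not
// end-to-end encrypted.
func scanOnServer(_ plaintextImage: Data, db: KnownImageDatabase) -> Bool {
    return db.matches(plaintextImage)
}
```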
The iMessage system doesn't offer any user-facing reporting mechanism either. So rather than alerting Apple to users abusing iMessage for sextortion or for sending sexual content to minors, the system leaves the child with a decision that Stamos says they are not equipped to make.
As a result, their options for preventing abuse are limited.
What I would rather see:
1) Apple creates robust reporting in iMessage
2) Slowly roll out client ML to prompt the user to report something abusive
3) Staff a child safety team to investigate the worst reports
— Alex Stamos (@alexstamos) August 7, 2021
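As a hypothetical sketch of what the first two steps of that flow might look like in practice, consider an on-device classifier whose only job is to prompt the user, with reporting left entirely to the user's choice. The MessageClassifier protocol, the threshold, and the report callback below are all invented for illustration; none of this is a real Apple API.

```swift
import Foundation

// Hypothetical user-initiated report flow along the lines Stamos
// describes. Every name and value here is an assumption.
protocol MessageClassifier {
    /// Returns a score in 0...1 estimating how likely a message is abusive.
    func abuseScore(for message: String) -> Double
}

struct ReportPromptFlow {
    let classifier: MessageClassifier
    let promptThreshold = 0.9  // conservative: prompt only on high scores

    /// Runs entirely on-device. Nothing is sent anywhere unless the
    /// user explicitly agrees to file a report.
    func handleIncoming(message: String,
                        promptUser: (String) -> Bool,
                        fileReport: (String) -> Void) {
        guard classifier.abuseScore(for: message) >= promptThreshold else {
            return
        }
        if promptUser("This message may be harmful. Report it?") {
            // In Stamos's proposal, reports would go to a staffed
            // child-safety team (step 3).
            fileReport(message)
        }
    }
}
```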
At the end of the thread, Stamos suggests that Apple may be implementing these changes because of the regulatory environment, pointing to the UK Online Safety Bill and the EU Digital Services Act as possible influences.
Alex Stamos isn't happy with the conversation surrounding Apple's announcement and hopes the company will be more open to attending workshops in the future.
The technology will be introduced in the United States first and then rolled out on a per-country basis. Apple says it will not allow governments or other entities to coerce it into repurposing the technology to scan for other content, such as terrorism-related material.
25 Comments
“The implementation of the technology itself has left Stamos puzzled. He notes that on-device CSAM scanning isn't necessary unless it's a precursor to end-to-end encryption of iCloud backups.”
Nice intentions…
…not worth the bits they are written with.
I just want to see Cook explain to shareholders the crash in Apple’s stock price as he announces leaving the Chinese market for refusing to scan for pictures of the Dalai Lama, Pooh, HK Protests, etc.
You know, just like Apple left the Chinese market when China asked that VPN apps be removed from the App Store, or when China and Russia mandated that all iCloud servers for their countries' users be within their jurisdiction, or when the US government wanted iCloud backups to remain unencrypted.
Apple is always quick to point out that they comply with all the laws of the countries they operate in, so they will punt and point the finger at the authoritarian regimes' laws as they obediently comply. And US authorities will use this as an excuse to not have national security disadvantages over other countries, etc.
In the good old days before electronic communications, law enforcement couldn't tap into anything, and they still managed to prosecute crime; they just had to put in more actual shoe leather during investigations. These days, some guy thinks the only time he should get up from his office chair is to present evidence in court (unless that's a video conference, too). The demands of law enforcement are simply an expression of laziness.
You know, in Brazil we have an expression: "Boasting about killing the jaguar after it's already dead!" It doesn't really translate, but it resonates with some of the reactions we've been seeing this past week. You know who doesn't give a damn? Regular Apple customers. You know, people with real jobs and life worries, who don't abuse children, and who invest in Apple hardware because they think it is better than the competition's. They are also people who don't agonize over some very technical and narrow definition of privacy, particularly when it's been polluted by personal (ahem, monetary) interests, and when their own Congress is barely capable of following even the most straightforward aspects of technology.
Talk about an overblown "first world" problem.
Remember, people, this is the former 'Facebook security chief'. Facebook and security are mutually exclusive terms. It's like trying to put a square peg into a round hole. It ain't happening.