Tests confirm macOS Finder isn't scanning for CSAM images

The macOS Finder isn't scanning your images for illegal material.

Apple isn't checking images viewed within the macOS Finder for CSAM content, an investigation into macOS Ventura has determined, with analysis indicating that Visual Lookup isn't being used by Apple for that particular purpose.

In December, Apple announced it had given up on plans to scan iPhone photos uploaded to iCloud for Child Sexual Abuse Material (CSAM), following considerable backlash from critics. However, rumors apparently lingered alleging that Apple was still performing checks in macOS Ventura 13.1, prompting an investigation from a developer.

According to Howard Oakley of Eclectic Light Co. in a blog post from January 18, a claim started to circulate that Apple was automatically sending "identifiers of images" that a user had browsed in Finder, doing so "without that user's consent or awareness."

The plan for CSAM scanning would have involved an on-device check of images for potential CSAM content, using a hashing system. The hash of each image would then be sent off and checked against a list of known CSAM files.
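
As an illustration of that "hash locally, compare against a known list" flow only, here is a minimal Swift sketch. Apple's proposal used a perceptual NeuralHash and a private set intersection protocol rather than a plain cryptographic hash and a local lookup, so the SHA-256 digest and in-memory set below are stand-ins, not the real mechanism.

```swift
import Foundation
import CryptoKit

// Stand-in list of known hashes; the real system would not ship this in the
// clear, and would use perceptual hashes rather than SHA-256.
let knownHashes: Set<String> = []

// Hash an image file locally and check it against the known list.
func imageMatchesKnownList(at url: URL) -> Bool {
    guard let data = try? Data(contentsOf: url) else { return false }
    let digest = SHA256.hash(data: data)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```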

While the idea of scanning images and sending a neural hash to Apple describing an image's characteristics could feasibly be used for CSAM scanning, Oakley's testing indicates it isn't actively being used that way. Instead, it appears that Apple's Visual Lookup system, which allows macOS and iOS to identify people, objects, and text in an image, could be mistaken for performing that sort of scanning.

No evidence in tests

As part of the testing, macOS 13.1 was run within a virtual machine, and the application Mints was used to scan the unified log of activities on the VM instance. On the VM, a collection of images was viewed for one minute in Finder's gallery view, with more than 40,000 log entries captured and saved.

If the system were being used for CSAM analysis, there would be repeated outgoing connections from "mediaanalysisd" to an Apple server for each image. The mediaanalysisd process is a component of Visual Lookup, which lets Photos and other tools display information about items detected in an image, such as "cat" or the names of objects.
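
That check can be approximated without special tooling. The sketch below, which assumes a recent macOS with the standard `log` command, pulls the last minute of unified log entries attributed to mediaanalysisd; Mints performs a similar capture of the unified log in more detail.

```swift
import Foundation

// Query the unified log for recent mediaanalysisd activity using the
// built-in `log show` command.
let process = Process()
process.executableURL = URL(fileURLWithPath: "/usr/bin/log")
process.arguments = ["show", "--last", "1m",
                     "--predicate", "process == \"mediaanalysisd\""]

let pipe = Pipe()
process.standardOutput = pipe

do {
    try process.run()
    process.waitUntilExit()
    let output = String(decoding: pipe.fileHandleForReading.readDataToEndOfFile(),
                        as: UTF8.self)
    let entries = output.split(separator: "\n").filter { $0.contains("mediaanalysisd") }
    print(entries.isEmpty ? "No mediaanalysisd entries in the last minute"
                          : entries.joined(separator: "\n"))
} catch {
    print("Failed to run log show: \(error)")
}
```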

Instead, the logs showed no entries associated with mediaanalysisd at all. A further log extract was found to be very similar to Visual Lookup's behavior in macOS 12.3, indicating the system hasn't materially changed since that release.

Typically, mediaanalysisd doesn't contact Apple's servers until very late in the process, as it first requires neural hashes generated by on-device image analysis. Once those are sent off and a response is received from Apple's servers, the returned data is used to identify elements within the image to the user.
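
Visual Lookup's internals aren't public, but the on-device stage it relies on can be pictured with Apple's public Vision framework, which performs comparable local classification before any server is involved. This is an analogy, not the actual mediaanalysisd code path.

```swift
import Foundation
import Vision

// Classify an image entirely on-device with the public Vision API.
// Labels such as "cat" are computed locally from the image data.
func classifyLocally(imageAt url: URL) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: url, options: [:])
    try handler.perform([request])
    let observations = request.results as? [VNClassificationObservation] ?? []
    return observations
        .filter { $0.confidence > 0.3 }
        .map { "\($0.identifier) (\(Int($0.confidence * 100))%)" }
}
```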

Further trials determined that there were some other attempts to send data off for analysis, but these were made to enable Live Text to function.
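
Live Text's plumbing isn't documented either, but the equivalent on-device step is visible in the public Vision text-recognition API; the recognition itself happens locally, as in this sketch.

```swift
import Foundation
import Vision

// Recognize text in an image locally, the same kind of work Live Text
// performs before anything about the content is looked up.
func recognizeText(imageAt url: URL) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate
    let handler = VNImageRequestHandler(url: url, options: [:])
    try handler.perform([request])
    let observations = request.results as? [VNRecognizedTextObservation] ?? []
    return observations.compactMap { $0.topCandidates(1).first?.string }
}
```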

In his conclusion, Oakley writes that there is "no evidence that local images on a Mac have identifiers computed and uploaded to Apple's servers when viewed in Finder windows."

Images viewed in apps with Visual Lookup support do have neural hashes produced, which can be sent to Apple's servers for analysis. However, trying to harvest those neural hashes to detect CSAM "would be doomed to failure for many reasons," Oakley writes.

Local images viewed in a QuickLook Preview also undergo normal analysis for Live Text, but "that doesn't generate identifiers that could be uploaded to Apple's servers."

Furthermore, Visual Lookup can be disabled by turning off Siri Suggestions. External mediaanalysisd look-ups could also be blocked using a software firewall configured to block port 443, though "that may well disable other macOS features."

Oakley concludes the article with a warning that "alleging that a user's actions result in controversial effects requires full demonstration of the full chain of causation. Basing claims on the inference that two events might be connected, without understanding the nature of either, is reckless if not malicious."

CSAM still an issue

While Apple has abandoned the idea of performing local CSAM detection, regulators still believe it isn't doing enough about the problem.

In December, the Australian e-Safety Commissioner attacked Apple and Google over a "clearly inadequate and inconsistent use of widely available technology to detect child abuse material and grooming."

Rather than directly scanning for existing content, which would be largely ineffective given Apple's fully end-to-end encrypted photo storage and backups, Apple instead seems to be taking a different approach: detecting nudity in photos sent over iMessage.



20 Comments

racerhomie3 7 Years · 1264 comments

Someone post this on Louis Rossmann's reaction video. It would be good to see this being cleared up

jdw 18 Years · 1457 comments

While forced local scanning on a Mac by government order is a frightful 1984-style nightmare for all citizens (law abiding or not), the upside is that Little Snitch would likely work to block any outgoing transfers.

The concern here is the same as CSAM scanning on the iPhone. It's more than a matter of personal privacy. It's a concern centered on the possibility that an error could result in a law abiding person being reported to law enforcement, which cares more about filling quotas and busting so-called bad guys than anything else.  Having an accused person's time wasted, or worse, being arrested for something they didn't do only because a computer secretly misread a file on their computer is something no citizen of any nation should stand for.  

So how do law enforcers deal with law breakers?  How they always have — which doesn't include privacy invasions like file scanning without a search warrant.  It may not be the ideal approach in light of the tech we have today, but it's the only approach that protects citizens from unlawful search and seizure.

UNLK_A6 7 Years · 8 comments

That's Eclectic Light Company, not Electric Light Company. You should make that correction in the article.

entropys 13 Years · 4316 comments

jdw said:

The concern here is the same as CSAM scanning on the iPhone. It's more than a matter of personal privacy. It's a concern centered on the possibility that an error could result in a law abiding person being reported to law enforcement, which cares more about filling quotas and busting so-called bad guys than anything else.  Having an accused person's time wasted, or worse, being arrested for something they didn't do only because a computer secretly misread a file on their computer is something no citizen of any nation should stand for.  

It was always more than that. The concern was that child abuse was a smoke screen for a broader purpose, e.g. identifying people who did not agree with an authoritarian government by what pictures they were looking at.

fastasleep 14 Years · 6451 comments

jdw said:

Having an accused person's time wasted, or worse, being arrested for something they didn't do only because a computer secretly misread a file on their computer is something no citizen of any nation should stand for.  

That’s not how hashes work, at all.