After scathing research funded by Meta itself found that Instagram was inducing body image issues in teenage girls, Mark Zuckerberg emailed a baffling, logically fallacious complaint to his staff that Apple was facing less criticism over similar issues.

I don't know how it isn't obvious to Mark Zuckerberg that there's a giant difference between a public social media platform and private messaging.

Rolling back a bit, this all came to light because New Mexico is suing Meta. The New Mexico Attorney General's filing points to Meta's public claims that its platforms are safe for teens, even as the company's own internal research said that Instagram, at least, absolutely wasn't, and was contributing heavily to body image issues in teens.

An email from Zuckerberg, revealed during discovery in that lawsuit and reported by The Verge on Thursday, lays bare the Meta CEO's complaint. And as you'd expect, given his history over the years, it's more logical fallacies about Apple.

From the email:

"Apple, for example, doesn't seem to study any of this stuff. As far as I understand, they don't have anyone reviewing or moderating content and don't even have a report flow in iMessage. They've taken the approach that it is people's own responsibility what they do on the platform, and by Apple not taking that responsibility upon themselves, they haven't created a staff or plethora of studies examining the tradeoffs in their approach. This has worked surprisingly well for them."

There is a spam reporting feature in Messages, so he's at least partially wrong on that front. Apple also includes the ability for anyone to have explicit images blocked from view in Messages, a feature that is enabled by default for users under 18, so he's wrong there too.

And he's 100% wrong about Apple not doing studies.

At about the same time that Meta's body image study found the company in the wrong, Apple commissioned its own studies on child sexual abuse material (CSAM). Apple was clear that it had done so when it announced its CSAM detection effort, and again when it shut that effort down.

Zuckerberg whined to staff that Apple isn't held to the same standard.

"When Apple did try to do something about CSAM, they were roundly criticized for it," Zuckerberg said. "[This] may encourage them to double down on their original approach."

Zuckerberg is right about one thing, though. When Apple tried to implement on-device CSAM scanning after those studies, the company was widely panned for it, mostly because people didn't understand how it worked and considered it a privacy violation.

The Meta CEO went on to complain that because his platforms moderate content and must disclose the volume of it in reporting, his companies look like they host more of this material than anywhere else.

Let's make this simple: Apple is not responsible for what Meta's users post. There is no one-to-one comparison possible here.

Being angry about it in an email to staff is ridiculous.

Apple does not operate two social media networks full of user-generated content moderated under the shield of Section 230 of the Communications Decency Act. Saying that Apple somehow gets preferential treatment is ridiculous, because Apple clearly isn't in the same Section 230 position.

It's evergreen. At least once a year, we can count on Mark Zuckerberg complaining about Apple for one reason or another. The complaints are generally baseless, certainly self-interested, and usually petulant.

But it's okay. He's getting what he wants, and Apple is going to start verifying users' ages. That way, Zuckerberg can blame Apple when somebody skirts the age verification and sees something they shouldn't on Zuckerberg's incredibly poorly moderated platforms.

It's never Facebook that's the problem, apparently. It's never Instagram, which proved to itself that it is absolutely damaging the mental health of America's teenage girls.

Nope. It must be because Apple isn't doing enough to help Meta, on Apple's dime. Again.