
New Mexico Sues Meta Over CSAM on Facebook and Instagram

Rohan Goswami, reporting for CNBC:

Facebook and Instagram created “prime locations” for sexual predators that enabled child sexual abuse, solicitation, and trafficking, New Mexico’s attorney general alleged in a civil suit filed Wednesday against Meta and CEO Mark Zuckerberg.

The suit was brought after an “undercover investigation” allegedly revealed myriad instances of sexually explicit content being served to minors, child sexual coercion, or the sale of child sexual abuse material, or CSAM, New Mexico attorney general Raúl Torrez said in a press release.

The suit alleges that “certain child exploitative content” is ten times “more prevalent” on Facebook and Instagram as compared to pornography site PornHub and adult content platform OnlyFans, according to the release.

This follows the recent and ongoing investigative reporting by The Wall Street Journal into child porn rings on Instagram, and the ways in which Instagram’s content algorithms send these deviants further down their perverted rabbit holes.

Which in turn leads the Muskateers paying for Twitter/X to ask questions like “Why do advertisers remain on Facebook and Instagram but have such a massive problem with X, which bans such content?”

No content is more objectionable than CSAM. Make no bones about it: Meta has both a content moderation problem and a PR fiasco on its hands. They have got to stamp this out, or advertisers will start abandoning their platforms. But there are huge differences between Meta and X. Meta does not want CSAM or even CSAM-adjacent material on its platforms. Their current content moderation infrastructure already quashes a shocking amount of it. They need to do better, and I think most people believe they want to. The objectionable material on Twitter/X, on the other hand — the racism, the antisemitism, the outright Nazism — is explicitly permitted in the name of “free speech”. And in terms of perception, which is what advertisers care most about, Twitter/X is now defined by its number-one user, Elon Musk.

Also, more cynically, ads on Instagram work — advertisers gain more in sales than they spend on the ads. That’s less true — and perhaps not true at all — on Twitter/X.

Meta’s big legal problem isn’t that they’ve looked the other way on CSAM, but that they’ve deliberately looked the other way at under-13 users signing up for Instagram accounts, and purposely optimized their algorithms to engage teens. It doesn’t pass the sniff test that they’d want CSAM on Instagram; it easily passes the sniff test that they’d want to hook kids on the platform as young as possible.

 ★ 
