Facebook’s censorship for unpaid porn

Emanuel Maiberg for 404Media.co:

> In early December I got the kind of tip we’ve been getting a lot over the past year. A reader had noticed a post from someone on Reddit complaining about a very graphic sexual ad appearing in their Instagram Reels. I’ve seen a lot of ads for scams or shady dating sites recently, and some of them were pretty suggestive, to put it mildly, but the ad the person on Reddit complained about was straight up a close up image of a vagina.
>
> AI Forensics, a tech platform and algorithmic auditing firm, today put out a report that quantifies just how widespread this problem is. It found over 3,000 pornographic ads promoting “dubious sexual enhancement products” which generated over 8 million impressions over a year in the European Union alone. In an attempt to show that the ads didn’t use some clever technique to bypass Meta’s moderation tools, AI Forensics uploaded the exact same visuals as standard, non-promoted posts on Instagram and Facebook, and they were removed promptly for violating Meta’s Community Standards.
>
> What’s adding insult to injury right now is that in addition to Facebook’s hypocrisy I lay out above, Facebook is now punishing us for publishing stories about this very problem.
>
> Again, this is frustrating as is, but infuriating when I regularly see Facebook not only take money from advertisers that are pushing nudity on Facebook, but doing it for the explicit purpose of creating non-consensual content or scamming its users.

Link

If you ever wondered what the new Facebook is going to look like after its "liberation from biased policies", this is it: we'll be seeing paid porn, but we won't be able to post porn ourselves, because of "community standards"...

Later edit:

Dot, the personal AI assistant from https://web.new.computer/, responding to the question “What is to be done to address this situation?”:

To address this situation, I believe several actions are necessary:

  1. Public pressure and transparency: Exposing these practices, as you've done in your article, is crucial. The more people become aware of this hypocrisy, the more pressure Facebook will face to change its policies.
  2. Stricter regulations: Regulatory authorities might need to step in to enforce consistent content moderation standards, regardless of whether the content is paid or not.
  3. Platform accountability: Facebook should be held responsible for inconsistencies in its policies. This could include fines or other consequences for selectively applying rules.
  4. Improved moderation systems: Facebook should invest in better content detection and moderation systems that are applied uniformly.
  5. User education: Raising awareness among users about these practices and encouraging them to report inconsistencies.