Sex trafficking was "widely tolerated" on Meta's sites, according to unsealed court docs (wait until you see the stats) 😬


Joel Abbott

Nov 24, 2025

So, yeah.

This report is based on claims in a court filing that was unsealed last Friday. That filing is part of a major lawsuit against four social media companies.

During court proceedings, Instagram's former head of safety and well-being, Vaishnavi Jayakumar, testified that Instagram would give sex traffickers and prostitutes SIXTEEN CHANCES before finally banning them for violating its community standards.

"You could incur 16 violations for prostitution and sexual solicitation, and upon the 17th violation, your account would be suspended," Jayakumar reportedly testified, adding that "by any measure across the industry, [it was] a very, very high strike threshold." The plaintiffs claim that this testimony is corroborated by internal company documentation.

Misgendered a transvestite? IMMEDIATE BAN.

Trafficking kids? We'll give you 17 strikes!

The brief, filed by plaintiffs in the Northern District of California, alleges that Meta was aware of serious harms on its platform and engaged in a broad pattern of deceit to downplay risks to young users. According to the brief, Meta was aware that millions of adult strangers were contacting minors on its sites; that its products exacerbated mental health issues in teens; and that content related to eating disorders, suicide, and child sexual abuse was frequently detected, yet rarely removed. According to the brief, the company failed to disclose these harms to the public or to Congress, and refused to implement safety fixes that could have protected young users.

"Meta has designed social media products and platforms that it is aware are addictive to kids, and they're aware that those addictions lead to a whole host of serious mental health issues," says Previn Warren, the co-lead attorney for the plaintiffs in the case. "Like tobacco, this is a situation where there are dangerous products that were marketed to kids," Warren adds. "They did it anyway, because more usage meant more profits for the company."

This lawsuit sheds more light on what happened last year on Capitol Hill when lawmakers grilled Facebook founder and Meta CEO Mark Zuckerberg. During questioning, Zuckerberg actually stood up and apologized to those who had been victimized on his sites.

(I should note that during those same hearings in D.C., a whistleblower also accused Meta of giving our data to China.)

Back to TIME's report from this week:

The plaintiffs' brief, first reported by TIME, purports to be based on sworn depositions of current and former Meta executives, internal communications, and company research and presentations obtained during the lawsuit's discovery process. It includes quotes and excerpts from thousands of pages of testimony and internal company documents. TIME was not able to independently view the underlying testimony or research quoted in the brief, since those documents remain under seal.

But the brief still paints a damning picture of the company's internal research and deliberations about issues that have long plagued its platforms. Plaintiffs claim that since 2017, Meta has aggressively pursued young users, even as its internal research suggested its social media products could be addictive and dangerous to kids. Meta employees proposed multiple ways to mitigate these harms, according to the brief, but were repeatedly blocked by executives who feared that new safety features would hamper teen engagement or user growth.

Besides allowing sex trafficking, prostitution, and grooming by pedophiles, Meta is also accused of downplaying the results of a 2019 internal study that showed teens' mental health improved when they stopped using its apps.

TIME notes that Meta was slow to take action to protect teens, even though it knew its platforms were allowing adult strangers to interact with them, because executives worried that safeguards would slow growth.

An internal 2022 audit allegedly found that Instagram's "Accounts You May Follow" feature recommended 1.4 million potentially inappropriate adults to teenage users in a single day. By 2023, according to the plaintiffs, Meta knew that it was recommending minors to potentially suspicious adults and vice versa.

It wasn't until 2024 that Meta rolled out default privacy settings to all teen accounts.

In response to the report, Meta told TIME, "We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions."

Meta is not the only company being sued by the coalition of 1,800+ plaintiffs. YouTube, Snapchat, and TikTok are also in the crosshairs.

The lawsuit has been making its way through the courts for years, but TIME's request for additional documentation was denied, as much of the case is still being kept under wraps.

The fact that the lawsuit has so much evidence and the backing of so many plaintiffs - from students to parents to activists to state attorneys general - may explain why social media companies have seemed so mellow lately.

There's also the fact that Meta complied with the Biden admin's censorship efforts regarding Covid, the 2020 election, and other thoughtcrime.

No wonder Zuck was so eager for a rebrand this year!!

