In October 2019, a BBC investigation found a 16-year-old girl from Guinea being advertised for sale on Instagram. Her price was approximately $3,800. The listing included her photograph. She was one of dozens of domestic workers the BBC found being sold on Facebook and Instagram in Kuwait and Saudi Arabia, but her age made her listing particularly valuable to buyers. She was a child, and Meta’s platform was the marketplace.
In 2021, the Texas Supreme Court ruled unanimously that three women who had been sex trafficked as teenagers could proceed with their lawsuits against Facebook. They had been recruited on the platform, groomed by traffickers who exploited Facebook’s friend suggestion algorithm, and sold. Federal law, the court wrote, does not create “a lawless no-man’s land on the internet” where companies can “knowingly or intentionally participate in the evil of online human trafficking” without consequence.
These are not anomalies. In federal trafficking cases involving online recruitment, Facebook accounts for 59 to 65 percent of victim recruitment; for child victims specifically, 65 percent. Meta platforms generated 85 percent of all reports of child sexual abuse material to the National Center for Missing and Exploited Children in 2022. Meta is the trafficking infrastructure, and one man controls it.
Mark Zuckerberg’s company conducted an internal study in 2018 to understand how human trafficking works on its platforms. They mapped what they called “the entire life cycle: recruitment, facilitation, and exploitation.” Then they did approximately nothing about it for years.
When Vaishnavi Jayakumar became Instagram’s head of safety and well-being in 2020, she discovered that Meta had a policy governing accounts engaged in “trafficking of humans for sex” that allowed sixteen violations before suspension. Jayakumar testified under oath that this was “by any measure across the industry, a very, very high strike threshold.” Meta has never publicly acknowledged the policy existed.
Think about what that means. Meta built an enforcement system designed to tolerate trafficking. Not one that failed to catch it. One that caught it and allowed it to continue fifteen more times before taking action.
In July 2020, an internal document listed “sex trafficking and sexual solicitation networks on Instagram” as a known vulnerability that “had not been prioritized.” That same month, someone asked in an internal chat: “What specifically are we doing for child grooming?” The answer: “Somewhere between zero and negligible.”
This is what Meta employees wrote to each other when they thought no one would see it. Somewhere between zero and negligible. For child grooming. Two years after the company had mapped the entire trafficking lifecycle on its platforms.
The 2018 study wasn’t the only early warning. Guy Rosen, then Meta’s vice president of integrity, wrote an internal email that year acknowledging the friend suggestion algorithm had become a tool for predators. Traffickers would send friend requests to a potential victim’s classmates, and Facebook’s system would then recommend the trafficker directly to the victim based on mutual connections. The New Mexico Attorney General’s office later characterized this as a “virtual victim identification service.” The algorithm handed traffickers artificial credibility and direct access to children. Rosen identified the problem in 2018. The company kept operating the same system.
Zuckerberg received direct warnings. Senator Richard Blumenthal referenced an internal memo about trafficking that “was written to you” at a January 2024 hearing. New Mexico Attorney General Raúl Torrez stated Zuckerberg was “absolutely directly warned.” Arturo Béjar, a former Meta engineering director whose own teenage daughter was constantly solicited on Instagram, wrote directly to Zuckerberg in 2021 with data showing 13 percent of users aged 13 to 15 had received unwanted sexual advances in the previous week. Zuckerberg never responded.
Eight days before the BBC published its trafficking investigation, Apple informed Meta it would remove Facebook and Instagram from the App Store unless the company acted. What happened next reveals everything about Meta’s priorities. In one week, Meta removed more than 130,000 pieces of Arabic-language trafficking content and launched new detection tools. One week.
An internal document confirmed the company had known: “Throughout 2018 and H1 2019 we conducted the global Understanding Exercise in order to fully understand how domestic servitude manifests on our platform.” Meta had mapped the networks. Meta possessed the capability to act. Meta simply hadn’t bothered until someone threatened the business.
By early 2021, with the Apple threat safely past, the urgency had faded. When CNN investigated that October using search terms from Meta’s own internal research, reporters easily found active Instagram accounts offering domestic workers for sale. Meta removed the accounts only after CNN asked about them. The emergency had passed. Normal operations resumed.
That same year, Zuckerberg sent a text message that courts later made public. Asked about child safety as a priority, he wrote that he wouldn’t call it his “top concern” because he had “a number of other areas I’m more focused on like building the metaverse.”
During this period, Meta’s own presentation estimated 100,000 children per day received sexual harassment on the company’s platforms. Court documents show Zuckerberg “shot down or ignored” requests from Nick Clegg, then Meta’s president of global affairs, to increase child safety funding. Brian Boland, an eleven-year Meta vice president, said under oath: “My feeling then and now is that they don’t meaningfully care about user safety. It’s not something they spend a lot of time on. I really think they don’t care.”
Former researchers Jason Sattizahn and Cayce Savage testified before the Senate in September 2025 about how Meta suppressed safety research. The company required legal review of any research on trafficking, suicide, or child exploitation. Sattizahn testified that lawyers “threatened our own jobs” and asked, “You wouldn’t want to have to testify publicly if this research was to get out, would you?” In one VR study in Germany, a teenage boy reported his younger sibling, under ten, had been sexually propositioned multiple times by adults. Meta ordered the recording deleted and all written records destroyed.
A June 2023 Wall Street Journal investigation conducted with the Stanford Internet Observatory found Instagram’s algorithms actively connected buyers and sellers of child sexual abuse material. The researchers identified 405 accounts selling this content. When users searched terms likely to surface illegal material, Instagram displayed a warning, then offered a button labeled “See results anyway.”
Jayakumar also discovered Instagram lacked any mechanism to report child sexual abuse material. The platform offered one-click reporting for spam, copyright violations, and firearms. Reporting child exploitation required navigating multiple generic menus. She raised this repeatedly. She was told a dedicated button “would require too much work to build.”
Before Meta implemented end-to-end encryption on Messenger, Facebook reported 22 million instances of child exploitation to NCMEC in 2021. WhatsApp, already encrypted, reported 1.3 million. Meta’s leadership understood precisely what encryption without detection safeguards would do.
Technical solutions existed that could preserve both privacy and detection capability. Apple had developed one. Meta itself had contributed detection algorithms to industry databases. The company could have encrypted messages and maintained its ability to identify abuse. It chose to do one without the other.
In December 2023, Meta rolled out default encryption without those safeguards. NCMEC called it “a devastating blow to child protection.”
The results arrived in 2024: 7 million fewer reports than 2023. Meta didn’t have to blind itself. It chose to.
Attorneys general from 42 states filed coordinated lawsuits against Meta in October 2023. Court filings unsealed over the following two years produced much of this evidence: the seventeen-strike policy, the metaverse text, the suppressed research. New Mexico’s investigation found child exploitative content more than ten times more prevalent on Facebook and Instagram than on Pornhub. A federal judge denied Meta’s motion to dismiss in May 2024.
Two years of litigation have produced increasingly damaging revelations. Documents unsealed November 22, 2025 show Meta halted internal studies when researchers found causal links between platform use and psychological harm. Court filings allege the company ordered destruction of recordings documenting child sexual harassment in virtual reality research. New Mexico’s case goes to trial in February 2026. Attorney General Torrez has stated that if he gets his day in court, “Meta should be very, very concerned.”
Federal accountability remains absent. The Trump administration has announced no criminal investigation despite whistleblower testimony describing destroyed evidence of child exploitation. Trump fired both Democratic FTC commissioners in March 2025, reducing the agency to its lowest membership since its creation. Meanwhile, Meta spent $19.79 million on federal lobbying through September 2025, deploying 87 lobbyists to defeat child safety legislation. Mark Zuckerberg met with Trump at least six times this year, settled a lawsuit with Trump for $25 million, most of it going to his presidential library, and donated $1 million to his inauguration. Meta won the FTC’s antitrust trial on November 18. The attorneys general suing Meta aren’t waiting for Washington. Washington has been purchased.
The legal infrastructure for accountability exists at the state level for any attorney general willing to use it.
Mark Zuckerberg built the platform where the majority of documented online trafficking recruitment takes place. His company studied exactly how that trafficking operated and sat on the knowledge for years. When Apple threatened the business, his company removed 130,000 pieces of trafficking content in a single week. When no business interest was threatened, his company’s efforts on child grooming amounted to “somewhere between zero and negligible.” He stated in writing that child safety ranked below the metaverse. His company suppressed research, destroyed data about child exploitation, and rolled out encryption without detection safeguards, eliminating reporting of millions of abuse cases annually.
Meanwhile, someone at Meta decided that building a button to report child sexual abuse would require too much work.
If you want to understand how we got here, my book Conservatism: America’s Personality Disorder traces the psychological and institutional roots of a movement that protects corporate predators while claiming to defend children. Available here: https://a.co/d/e9ht1vj
If you want to know what comes next, the Introduction to Soft Secession lays out the state-level strategies that are actually working. The attorneys general and governors in this article aren’t waiting for federal action. Neither should you. Available at The Existentialist Republic store: https://theexistentialistrepublic.myshopify.com/products/intro-to-soft-secession
References
Axios. (2025, November 18). Meta wins landmark antitrust case over Instagram, WhatsApp acquisitions. https://www.axios.com/2025/11/18/meta-instagram-whatsapp-antitrust-ftc
BBC News. (2019, October 31). Slave markets found on Instagram and other apps. https://www.bbc.com/news/technology-50228549
Béjar, A. (2023, November 7). Testimony of Arturo Béjar before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law. U.S. Senate Committee on the Judiciary. https://www.judiciary.senate.gov/imo/media/doc/2023-11-07_-testimony-_bejar.pdf
Blackburn, M. (2025, October). Blackburn eviscerates Meta for lobbying against Kids Online Safety Act [Press release]. U.S. Senate. https://www.blackburn.senate.gov/2025/10/technology/video-blackburn-eviscerates-meta-for-lobbying-against-kids-online-safety-act-grills-google-on-gemma-ai-technology-fabricating-news-stories
Boland, B. (2022, September 14). Testimony before the Senate Committee on Homeland Security and Governmental Affairs. U.S. Senate. https://www.hsgac.senate.gov/wp-content/uploads/imo/media/doc/Testimony-Boland-2022-09-14.pdf
CNBC. (2025, November 18). Meta wins FTC antitrust trial that focused on WhatsApp, Instagram. https://www.cnbc.com/2025/11/18/meta-wins-ftc-antitrust-trial-that-focused-on-whatsapp-instagram.html
Horwitz, J., & Blunt, K. (2023, June 7). Instagram connects vast pedophile network. The Wall Street Journal. https://www.wsj.com/articles/instagram-vast-pedophile-network-4ab7189
Human Trafficking Institute. (2021). 2020 federal human trafficking report. https://traffickinginstitute.org/wp-content/uploads/2022/01/2020-Federal-Human-Trafficking-Report-Low-Res.pdf
In re Facebook, Inc., 625 S.W.3d 80 (Tex. 2021).
In re Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, No. 4:22-md-03047-YGR (N.D. Cal. Nov. 22, 2024) (plaintiffs’ omnibus brief unsealed).
Meta Platforms, Inc. (2023, December 6). Launching default end-to-end encryption on Messenger. Meta Newsroom. https://about.fb.com/news/2023/12/default-end-to-end-encryption-on-messenger/
National Center for Missing & Exploited Children. (n.d.). End-to-end encryption. https://www.missingkids.org/theissues/end-to-end-encryption
National Center for Missing & Exploited Children. (2025). NCMEC releases new data: 2024 in numbers. https://www.missingkids.org/blog/2025/ncmec-releases-new-data-2024-in-numbers
NBC News. (2025, January 29). Meta donates $1 million to Trump’s inaugural fund after Zuckerberg’s Mar-a-Lago meeting. https://www.nbcnews.com/tech/social-media/meta-donates-1-million-trump-inauguration-fund-zuckerberg-mar-lago-mee-rcna184014
NBC News. (2025, March 18). Trump fires both Democratic commissioners at Federal Trade Commission. https://www.nbcnews.com/politics/trump-administration/trump-fires-both-democratic-commissioners-federal-trade-commission-rcna196991
New Mexico Department of Justice. (2023, December 5). Attorney General Raúl Torrez files lawsuit against Meta Platforms and Mark Zuckerberg to protect children from sexual abuse and human trafficking [Press release]. https://nmdoj.gov/press-release/attorney-general-raul-torrez-files-lawsuit-against-meta-platforms-and-mark-zuckerberg-to-protect-children-from-sexual-abuse-and-human-trafficking/
NPR. (2025, January 29). Meta agrees to pay Trump $25 million to settle lawsuit over Facebook and Instagram suspensions. https://www.npr.org/2025/01/29/nx-s1-5279570/meta-trump-settlement-facebook-instagram-suspensions
OpenSecrets. (2025). Meta lobbying profile. https://www.opensecrets.org/federal-lobbying/clients/summary?id=D000033563
Sattizahn, J., & Savage, C. (2025, September 9). Testimony before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law. Tech Policy Press. https://www.techpolicy.press/transcript-us-senate-hearing-on-examining-whistleblower-allegations-that-meta-buried-child-safety-research/
Stanford Internet Observatory. (2023, June 7). Addressing the distribution of illicit sexual content of minors online. https://cyber.fsi.stanford.edu/news/addressing-distribution-illicit-sexual-content-minors-online
Time. (2025, October). The AG putting Big Tech on trial. https://time.com/7327229/raul-torrez-new-mexico-meta-lawsuit/
Time. (2025, November 22). The allegations against Meta in newly unsealed court filings. https://time.com/7336204/meta-lawsuit-files-child-safety/
U.S. Senate Committee on the Judiciary. (2024, January 31). Big Tech and the online child sexual exploitation crisis [Hearing transcript]. https://www.judiciary.senate.gov/hearings/big-tech-and-the-online-child-sexual-exploitation-crisis
U.S. Senator Richard Blumenthal. (2024, January). Internal Meta documents released to the Senate Judiciary Committee. https://www.blumenthal.senate.gov/download/13124-meta-documents