"What Is a Lost Childhood Worth?" The Sexual Exploitation Case that Could Set New Rules — and Vast Costs — for Social Media
The case New Mexico is making against Meta — the first to go to trial — could cascade across 1,500 lawsuits and punish social media companies at a new scale.
New Mexico’s population is small. As a user base, it’s even smaller. The financial penalty a New Mexico judge could impose is, in Meta’s terms, a rounding error. But if Meta loses to state Attorney General Raúl Torrez in his landmark case against the social media giant, the damages could change the industry forever.
If the court establishes a per-user damage figure for connecting children to sexual predators through platform design, as Torrez alleges, that number travels. California, Texas, New York, Illinois, Florida — states with tens of millions of Meta users — are all watching this case. The math is straightforward and the implications are staggering: a per-user price set in a victory for Torrez in New Mexico becomes the template for what this litigation eventually costs Meta at scale.
This is not a case about what users posted on Meta's platforms. That function — Meta as a neutral host shielded from liability for third-party content — is what Section 230 of the Communications Decency Act was designed to protect. Torrez's case doesn't go there. It goes to the question of whether a social media company is responsible for harms that result from its design choices. The algorithm. The "people you may know" feature that internal Meta documents, introduced as evidence, allegedly show was connecting groomers to children. The engagement-maximizing machine that Arturo Béjar — Facebook's former engineering director for safety, who worked at Meta on and off from 2009 to 2021 — testified about at trial. According to reporting from KRQE, Béjar told the jury that the platform is "very good at connecting people with interests — and if your interest is in little girls, it will be very good at connecting you with little girls."

Meta’s defense is that the evidence is taken out of context, that the platforms have safety features, and that the company employs tens of thousands of people in trust and safety roles. In response to the rising number of lawsuits, the company posted a lengthy statement that reads, in part:
Despite the snippets of conversations or cherry-picked quotes that plaintiffs’ counsel may use to paint an intentionally misleading picture of the company, we’re proud of the progress we’ve made, we stand by our record of putting teen safety first, and we’ll keep making improvements.
The company’s lawyer told the jury Meta designed its platforms to be “fun and entertaining, not to harm teens or anyone else.” In a statement during the trial reported by the Santa Fe New Mexican, a Meta spokesperson said the company has made “meaningful changes” to its platforms and has been honest with parents about the risks. What the internal documents Torrez obtained in discovery allegedly suggest, however, is that safety concerns were repeatedly overruled when they threatened engagement — and that Zuckerberg himself, per evidence presented at trial, prioritized engagement above safety when the two collided.
"Time and again," Torrez told me, "the pattern that you would see in the company is whether it was addressing the addictive features — infinite scroll, likes, some of the other ways in which content was amplified by the algorithm — someone in the company would raise their hand and say, this is problematic, this is dangerous to kids. It's harming people. And then someone on the engagement side, the profit side, basically overrule what had been found by the safety people, because they would say, well, look, this is gonna impact engagement."
That’s the tobacco parallel Torrez is reaching for. The suit alleges this isn’t just about whether Meta’s platforms are dangerous. It’s about whether Meta executives knew they were dangerous while publicly representing them as safe. That gap — between internal knowledge and external marketing — is where Big Tobacco lost.