Meta's Legal Reckoning Has Finally Arrived

Two juries. Two states. Twenty-four hours. That is all it took for the legal landscape of American social media to shift in a way that no Congressional hearing, no voluntary pledge, and no carefully worded press release ever managed to produce. In the span of a single day this week, Meta was handed back-to-back jury verdicts that, taken together, represent something the technology industry has spent decades and billions of dollars trying to prevent: accountability.

I am not a person who uses the word "historic" lightly. It is the most abused adjective in journalism. But on March 24 and 25, 2026, juries in New Mexico and Los Angeles did something that genuinely warrants the designation.


$375 Million and a One-Day Deliberation

The first blow landed in Santa Fe. A New Mexico jury deliberated for approximately one day before ordering Meta to pay $375 million in civil penalties — the maximum allowable under state law at $5,000 per violation — for willfully violating the state's Unfair Practices Act. The jury found that Meta misled consumers about the safety of its platforms and enabled child sexual exploitation. (The Guardian)

One day. For a case that ran nearly seven weeks.

The case originated from a 2023 lawsuit filed by New Mexico Attorney General Raúl Torrez, itself sparked by a two-year Guardian investigation exposing child sex trafficking markets on Facebook and Instagram. Torrez's office then ran "Operation MetaPhile," an undercover sting in which law enforcement officers posed as children on Meta's platforms. The result, as Torrez described it, was that these fake profiles were "simply inundated with images and targeted solicitations" from child abusers. Three men were arrested for attempting to prey on children through Meta's networks. (Ars Technica)

Among the more damning revelations at trial: Meta's 2023 decision to encrypt Facebook Messenger by default had, according to internal messages, been made with the knowledge that it would curtail the company's ability to disclose approximately 7.5 million child sexual abuse material reports to law enforcement. Law enforcement witnesses and representatives from the National Center for Missing and Exploited Children testified that Meta's AI-driven moderation had generated high volumes of "junk" reports that were, in the words presented to the jury, "useless" for actual investigations. [1]

And then there was the testimony of Mark Zuckerberg and Instagram chief Adam Mosseri, who stated in taped depositions that harms to children — including sexual exploitation — were inevitable on Meta's platforms given their scale. The jury, apparently, agreed that "inevitable" is not a defense.



The Los Angeles Verdict: Defective by Design

The second verdict arrived the following day from a Los Angeles courtroom, and it carries a different but equally significant legal weight. A jury found Meta and Google's YouTube negligent in causing severe mental health harm to a 20-year-old woman identified as Kaley, or KGM, awarding $6 million in total damages — $3 million compensatory, $3 million punitive — with Meta bearing 70% of the liability. (NPR)

Kaley began using YouTube at age 6 and Instagram at 11. She developed severe depression, body dysmorphia, and suicidal thoughts. The jury deliberated for over 40 hours across nine days before reaching its conclusion.

What makes this verdict structurally important is not the dollar amount — $6 million is, to Meta, the rough equivalent of a rounding error — but the legal finding itself. For the first time, a jury has ruled that social media platforms constitute defective products, engineered to exploit the developing brains of children. The specific features the jury found defective read like a product manager's engagement playbook: infinite scroll, constant notifications, autoplaying videos, beauty filters. The plaintiff's legal team characterized this architecture as a "digital casino." The jury agreed. (CNBC)

Internal Meta documents introduced at trial showed executives discussing strategies to attract "tweens" despite the platform's stated 13-year age minimum. One memo noted that 11-year-olds were four times as likely to keep returning to Instagram compared with competing apps. Another document read: "If we wanna win big with teens, we must bring them in as tweens."

Zuckerberg, who testified in person, told the jury that keeping young users safe had always been a company priority. The jury deliberated for nine days before deciding otherwise.


The Section 230 Workaround That Changes Everything

For years, Section 230 of the Communications Decency Act served as the technology industry's legal moat. The law broadly protects platforms from liability for content created by their users, and it has been deployed with remarkable effectiveness to get cases dismissed before they ever reached a jury.

Both of these verdicts circumvented Section 230 entirely — not by challenging the law, but by stepping outside the terrain it covers. Plaintiffs in both cases focused not on what users posted, but on how the platforms were designed. That is a distinction with enormous consequences. [2]

New Mexico's approach proved that state consumer protection statutes, already on the books, can establish platform liability without waiting for federal legislation. The Los Angeles case proved that a defective product theory — the same legal framework used against automobile manufacturers and pharmaceutical companies — can be successfully applied to algorithmic recommendation systems.

New Mexico Attorney General Torrez has been explicit about his ambitions. He is not satisfied with the $375 million. In the trial's second phase, beginning May 4, he will ask the court for something more disruptive than a fine: court-ordered changes to Meta's algorithms, mandatory age verification systems, removal of predators from platforms, and restrictions on encrypted communications for minors. He also wants independent audits with access to Meta's internal systems. (The Meridiem)

This is the part that should make every technology company's legal department very uncomfortable.


The Cascade Problem

A financial penalty, however large, is ultimately a cost of doing business. A court order to redesign your platform's architecture is something else entirely.

If a New Mexico judge orders specific algorithmic modifications, Meta's engineers would be building to judicial specifications rather than product roadmaps. If the same legal theory is replicated — and Torrez has explicitly stated his hope that it will be — across Texas, California, Florida, and the other 46 states, Meta and every platform like it could face 50 different compliance regimes, each with different judicial orders, different technical specifications, and different oversight mechanisms.

Federal legislation, whatever its flaws, at least creates uniform requirements. State-by-state judicial remedies do not. The compliance complexity of that scenario is not merely expensive. It is, for a company whose products are built on unified global infrastructure, potentially unworkable.

The Los Angeles case is a bellwether — a test case tied to approximately 2,000 other pending lawsuits from parents and school districts. A separate federal trial in the Northern District of California is scheduled for this summer. TikTok and Snap settled before the Los Angeles trial began. The comparison to Big Tobacco litigation of the 1990s, once a rhetorical flourish, is beginning to look like an operational forecast.


What Meta Says

Meta has announced appeals in both cases. Its spokesperson called the New Mexico verdict the result of "sensationalist, irrelevant arguments" and "cherry-picking select documents." On the Los Angeles verdict, the company stated that teen mental health is "profoundly complex and cannot be linked to a single app."

These are not unreasonable arguments in isolation. Mental health causation is genuinely difficult to establish. Platform scale does create genuine moderation challenges. And appeals courts may well find grounds to overturn or reduce these verdicts.

But the appeals process does not undo the legal framework that produced these verdicts. It does not un-prove the theory. Other state attorneys general do not need to wait for Meta's appeals to conclude before filing their own suits using the same playbook. The liability finding, for now, stands. And the remediation phase in New Mexico begins in five weeks.



A Reckoning, Not a Resolution

I want to be precise about what these verdicts are and are not. They are not the end of this fight. Meta is a company worth well over a trillion dollars with some of the best appellate lawyers money can buy. The road from jury verdict to structural platform change is long, and it runs through courts that may be more sympathetic to the defendants than the juries were.

But these verdicts are, without question, an inflection point. They prove the legal theory works. They demonstrate that juries — ordinary people, not legislators or regulators — look at the internal documents, hear the testimony, and find the behavior unacceptable. They establish that the design of a social media platform can be treated as a defective product under existing law.

For a decade and a half, the technology industry operated on the premise that Section 230 made it essentially litigation-proof on these questions. That premise is now empirically false. The dam, as one plaintiff's attorney put it this week, is breaking.

The children who were harmed did not have the luxury of waiting for Congress to act. It turns out, neither did the law. [3]

Footnotes

[1] Meta must pay $375 million for violating New Mexico law in child exploitation case, jury rules

[2] US jury verdicts against Meta, Google tee up fight over tech liability shield

[3] Meta loses trial after arguing child exploitation was "inevitable" on its apps