Meta is a Harm Engine With a PR Budget

KJS 3.26

This week a New Mexico jury made history.

For the first time, a jury held Meta accountable at trial for its platforms’ dangers to children — finding the company willfully engaged in “unfair and deceptive” and “unconscionable” trade practices and ordering it to pay $375 million in damages.

Meta’s response: “We respectfully disagree with the verdict and will appeal.”

Of course they will. Disagreeing with accountability is the only consistent policy Meta has ever held.

Let’s walk through the full record of what this company actually is. Not what it says it is. What it has done, in documented internal communications, under oath, in court, and on the public record for two decades.

They Knew the Kids Were Being Harmed. They Did Nothing.

This is not a theory. This is their own research.

Meta’s own internal studies showed that 13.5% of teen girls in the UK had more frequent suicidal thoughts linked to Instagram use. 17% experienced the worsening of an eating disorder. 32% of teen girls who already felt bad about their bodies felt even worse after using the platform.

They conducted this research. They compiled these statistics. They reviewed these findings at the executive level. And then former engineering director Arturo Bejar — who personally contacted Mark Zuckerberg, Sheryl Sandberg, and Instagram head Adam Mosseri with the evidence — testified before the Senate that his warnings went unaddressed. He never heard back from Zuckerberg at all.

Meta’s internal culture, according to Bejar, is “see no evil, hear no evil.” The company knew. The company chose revenue over the mental health of children. Not once. Over years. Repeatedly. Documented. In their own emails.

The New Mexico jury this week also considered internal Meta documents acknowledging child sexual exploitation on its platforms, the company’s failure to enforce its own ban on users under 13, the role of its algorithms in prioritizing harmful content, and the prevalence of content about teen suicide. The jury found thousands of separate violations. It awarded the maximum penalty for each one.

That’s not a coincidence. That’s a verdict on a culture.

They Stole 7.5 Million Books. Zuckerberg Personally Approved It.

While building their AI systems, Meta faced a choice. License the intellectual property of writers, scholars, and creators — or steal it.

Internal communications show Meta employees concluded that licensing books legally would be “unreasonably expensive” and “incredibly slow.” When a senior management employee raised concerns about lawsuits, that employee was persuaded to proceed anyway. They downloaded LibGen — a Russian pirate repository of 7.5 million stolen books and 81 million research papers — and used it to train Llama 3. The decision was approved by “MZ.” Mark Zuckerberg.

Court documents filed in January allege Zuckerberg approved the use of the LibGen dataset knowing it contained pirated material.

Meta didn’t just download the stolen library. Internal communications show they used BitTorrent to access it — which means they were simultaneously uploading it to other users. Meta didn’t just steal from authors. They distributed stolen intellectual property to others. That is independently illegal under copyright law, regardless of any fair use argument.

Authors whose life’s work was taken include Ta-Nehisi Coates, Sarah Silverman, and thousands of writers who will never see a dollar. As one author put it: “It’s not just theft of the work; it’s theft of the work to create something to replace us.”

The company’s defense: fair use. The argument that stealing your work to build a product that competes with and eventually replaces you is legally protected innovation. Courts are still deciding. The moral verdict is already in.

They Lobbied to Divide Congress and Kill Reform.

After whistleblower Frances Haugen’s 2021 testimony threatened bipartisan momentum for social media regulation, Meta didn’t change its practices. It deployed its Washington lobbying team to tell Republican staffers that Haugen was “trying to help Democrats,” while telling Democratic staffers that Republicans were focused on unrelated culture war issues. The goal, explicitly, was to “muddy the waters, divide lawmakers along partisan lines and forestall a cross-party alliance” against Meta.

It worked. No federal legislation passed. The children stayed on the platforms. The algorithms kept running. The data kept flowing. The revenue kept growing.

Pattern of Harm

Meta has now been found liable by a jury — for the first time — for child exploitation and consumer deception. It faces active litigation from over 40 states over teen mental health. It faces copyright infringement suits from authors whose work was stolen to train an AI. It has been exposed by multiple whistleblowers, in multiple congressional hearings, across multiple years, for knowingly causing harm to children and choosing not to stop.

Each time: a statement of disagreement. A promise to appeal. A PR campaign about all the good work happening inside the company. And then nothing changes.

The New Mexico attorney general said the verdict “should send a clear message to Big Tech executives that no company is beyond the reach of the law.”

The message is correct. The reach of the law has been slow, and Meta has been counting on that slowness for twenty years. The company is worth over a trillion dollars. The $375 million verdict is roughly 0.03% of its market cap. It will appeal. It may win on appeal.

But something is shifting. Juries are no longer impressed by the innovation defense. States are no longer accepting the “we care about safety” press release. And the record — the internal documents, the whistleblower testimony, the stolen library, the dead children — is now fully public.

Meta is not a technology company that occasionally makes mistakes. It is a company that has repeatedly, knowingly, and profitably chosen harm over safety, theft over licensing, and political manipulation over accountability.

That’s not a bug. That’s the business model. And the CEO, the board, and everyone in between approved it.