Friday, March 27, 2026

Meta and YouTube ordered to pay $6 million in social media addiction trial. What’s the case?

The Los Angeles case has quickly become a legal flashpoint, closely watched for how it could reshape the trajectory of future lawsuits.


New Delhi: A pair of jury verdicts in California and New Mexico may mark a turning point in how the internet is governed in the US, with Meta and Alphabet’s Google found liable for harms linked to their platforms’ design.

In California, the Los Angeles jury found Meta and Google liable for a young woman’s depression and suicidal thoughts after she said she became addicted to Instagram and YouTube at a young age, ordering them to pay a combined $6 million in damages. 

In a separate New Mexico case, jurors ordered Meta to pay $375 million after finding that the company misled users about the safety of its products for young users and enabled the sexual exploitation of children on its platforms.

The rulings have put a crack in the broad legal shield that has so far protected tech companies, one built on the premise that platforms are not responsible for what users post. Both verdicts begin to challenge that premise, suggesting that the way platforms are engineered, from recommendation systems to endless scroll, can itself be a source of harm, and therefore liability.

The implications are significant, not least because of the sheer scale of these platforms: Meta owns Facebook and Instagram and has over 3.5 billion users, while nine out of ten American teenagers aged 13-17 use YouTube, according to a 2025 Pew Research Center report.

“This is the first time in history a jury has heard testimony by executives and seen internal documents that we believe prove these companies chose profits over children,” said Joseph VanZandt, one of the lawyers for the plaintiff, identified as KGM. 

Concern over social media use has been intensifying worldwide. In 2024, US Surgeon General Dr. Vivek Murthy urged the introduction of warning labels on platforms, citing their links to mental health harms among adolescents.

By December, Australia had moved to bar children under 16 from accessing social media, while countries including Malaysia, Spain, and Denmark are weighing similar restrictions.

The accusations 

The plaintiff, whose first name is Kaley, filed a lawsuit in 2023 against Meta, Snap, YouTube and TikTok. She said she began using social media at six and alleged the platforms contributed to personal harm, including body dysmorphia and self-harm thoughts.

KGM testified that as a child she turned to social media as both a creative outlet and a refuge from bullying at school. She said she spent hours each day on Instagram, posting hundreds of photos with beauty filters to conceal her insecurities — something she linked to the onset of body dysmorphia.

Meta, however, pushed back, arguing that her mental health struggles were rooted in familial abuse and instability, while YouTube maintained it is not a social media platform and said its features were not designed to be addictive.

In a separate New Mexico case, the matter centred on whether Meta’s public assurances matched internal concerns, including around decisions such as expanding end-to-end encryption on Messenger despite warnings about reduced ability to detect abuse. 

The jury was presented with evidence that included the 2024 arrest of three men accused of targeting children through Meta’s platforms and attempting to meet them in person. The arrests were part of a sting operation conducted by undercover agents, referred to as “Operation MetaPhile” by the attorney general’s office.

The New Mexico court also examined how Meta’s 2023 rollout of end-to-end encryption on Facebook Messenger, its direct messaging service, limited investigators’ access to critical evidence.

Prosecutors argued that the platform had been used by predators to groom minors and circulate child abuse material, and that encryption made it harder to detect and investigate such activity. 

The companies have said they plan to appeal, setting the stage for a prolonged legal battle that could reach higher courts.




The Section 230 legal shield

What makes these verdicts stand out is not just the outcome, but the fact that they made it this far at all. For years, thousands of similar lawsuits have been filed against tech companies, yet very few ever see the inside of a courtroom.

The answer lies in Section 230 of the Communications Decency Act, a 1996 law that has long protected online platforms from liability for content posted by users.

Often described as the legal backbone of the modern internet, Section 230 has enabled companies to argue that they are intermediaries, not publishers, and therefore not responsible for harm caused by third-party content.

Courts have historically interpreted this protection broadly, leading to the early dismissal of many cases against tech firms. In these lawsuits, too, companies including Meta and Google invoked Section 230, asking judges to throw the cases out. 

At the heart of the trial was a single question: were the harms caused by third-party content, which would be protected under Section 230, or by the companies’ own design choices?

This distinction proved crucial.

To find Meta and YouTube liable, jurors had to establish all four elements of negligence under California law: a duty of care, a breach of that duty, causation, and resulting harm. 

Causation quickly became the battleground. 

Over the course of the five-week trial in California, top Meta executives, including Mark Zuckerberg and Adam Mosseri, took the stand, insisting that Instagram could not be described as “clinically” addictive. 

But the law does not require platforms to be the sole cause of harm. Under the “substantial factor” test, the question is whether their conduct played a meaningful role.

And the jury found that it did.

Internal documents presented during the trial, including research and company communications, established that the dangers to younger users were known. 

That evidence strengthened the argument that the harm was not incidental, but stemmed from how the platforms were designed. 

“Courts are increasingly trying to distinguish claims about platform functionality or platform conduct from claims that would really just impose liability for third-party speech,” said Gregory Dickinson, an assistant professor at the University of Nebraska College of Law who studies the intersection of tech and the law.

Several lower courts have already suggested that design choices may not be protected, but appellate courts, whose rulings set binding precedent, have yet to weigh in.

The expected appeals in these cases could change that.

Thousands of lawsuits are already pending against companies including Meta, Google, Snap, and ByteDance, alleging that their platforms contributed to a youth mental health crisis.

More than 2,400 such cases have been consolidated in federal court in California alone. Similar arguments are also being tested against other platforms, including gaming sites.

The US Supreme Court has so far avoided directly ruling on the scope of Section 230, though it has signalled interest. In a recent dissent, justices warned against continued delay in addressing the issue, writing that social media companies have increasingly used the law as a “get-out-of-jail-free card”.

(Edited by Aamaan Alam Khan)
