
US courts finally say what parents knew all along—social media is built to hook children

Regulation and litigation may correct the extremes, but they cannot replace the everyday discipline of parenting or the gradual development of self-control.


As children stay glued to screens like Fevicol, US courts have finally said it: social media platforms may be addictive by design. Not “oops, by mistake,” not a “side effect,” but properly engineered, like a perfectly spiced bag of chips, except the spice here is dopamine.

In what can only be described as a “finally, the adults have entered the chat” moment, two major court decisions in the US have put Big Tech under the scanner. This time, it wasn’t just for data privacy, but for children’s safety and mental health.

In California, a jury recently held Meta and Google responsible for the depression and anxiety of a woman who had compulsively used social media as a child, awarding her $6 million in damages—half for harm suffered and half as punishment. Meta, unsurprisingly, got the bigger share of the bill.

Meanwhile, in New Mexico, another court ordered Meta to pay a staggering $375 million for misleading users about how safe its platforms were for children, including exposure to explicit content and online predators.

For Indian parents reading this, the reaction is likely: “Beta, we didn’t need a jury to tell us this.” Because this experiment has already been running in homes across the country. Give a child a smartphone. Say, “Just 10 minutes.” Return later, and the child is still there scrolling with the focus of a UPSC aspirant—except the subject is reels, memes, and increasingly questionable content. 

Designed to keep children hooked

What these verdicts do is formally acknowledge something long suspected: social media platforms are not neutral tools. They are carefully designed systems, optimised to keep users engaged—especially young ones who don’t yet have the brakes fully installed.

The endless scroll? That’s not a feature—it’s a trap. It’s like being at an Indian wedding buffet where someone keeps refilling your plate before you can say no. Notifications? They arrive with the persistence of parents checking on your whereabouts with texts that read: “You haven’t checked your phone in 5 minutes… everything okay?” Then there’s the algorithm—arguably the hardest-working employee in Silicon Valley. It’s like your local mithaiwala who knows what you like, what you watch, what keeps you hooked—and serves it to you nonstop, whether it’s good for you or not.

And then there are likes. These are not tools; they are emotional gym trainers. They push, pull, reward, and punish—keeping users engaged like contestants in a never-ending reality show where the prize is more screen time and digital snacks, scientifically salted to keep you reaching back into the bag. And children, being less equipped with impulse control than adults, were always going to be the most enthusiastic consumers.

The troubling part, highlighted in the New Mexico case, is not just design—but disclosure. The court found that Meta had misled users about safety. In other words, parents were told, “All under control,” while the reality was closer to “Handle with extreme caution.” Of course, Meta disagrees and will appeal. It insists it works hard to protect users and that policing harmful content at scale is difficult. Fair point. But courts seem to be saying: difficulty is not an excuse when the stakes are children’s well-being.

And that’s where the humour turns slightly uncomfortable. Because for years, society treated excessive screen time like a minor nuisance—something to nag kids about between dinner and bedtime. Now, courts are treating it as something far more serious: a public health issue with corporate accountability attached.

The warning signs were there. Parents complained. Teachers worried. Even kids admitted, “It’s such a waste of time.” But it took lawsuits, evidence, and verdicts for the system to officially say: yes, there’s a problem.

Now, of course, there are calls for regulation. But policy moves slowly; it is not exactly a Zomato delivery. It is more like Indian Railways—inevitable, but don’t expect it to arrive on time. By the time strong laws are in place, today’s kids may have moved on to something even more addictive—maybe VR classrooms where homework feels like a video game, or AI friends who are available 24/7 and never judge you (unlike your real ones).

Still, these verdicts matter. They tell Big Tech, unequivocally, that you can’t build systems to capture attention endlessly and then pretend it’s the user’s fault for getting hooked. And for now, parents everywhere can take a small victory. Next time they say, “keep your phone away,” they have legal backing.

Whether this leads to real change remains to be seen. Tech companies are very good at rebranding. Tomorrow, they might say, “We don’t make users addicted—we enhance engagement experiences.” Sounds reassuring. But one thing has changed: accountability is no longer theoretical. For the first time, courts are saying what parents have known all along—the apps did exactly what they were designed to do. The problem is, kids were part of the design.

And this is not just an American story. Recent global trends suggest that governments are increasingly willing to step in where families and platforms have struggled to find balance. Indonesia’s decision to restrict social media access for children under 16—covering platforms from YouTube and TikTok to Roblox—marks a first in Southeast Asia, echoing Australia’s earlier move toward a similar ban. 

Even China, often cited for its strict controls, allows a heavily curated version of TikTok for children that prioritises educational and scientific content over pure entertainment. India, despite banning TikTok, continues to grapple with the quality of content on domestic platforms, where regulation has not necessarily translated into healthier digital environments. 


Big Tech’s accountability & parenting

These developments raise an important question: is restricting access the answer, or does it risk avoiding the deeper issue? While such bans may empower families and signal seriousness, they also reflect a growing discomfort with leaving digital exposure entirely to individual choice.

At the same time, the recent US verdicts invite a more nuanced reaction. The New Mexico decision, grounded in misleading claims about safety and exposure to harm, is easier to welcome. The California verdict, however, also reflects the growing influence of contingency-fee litigation—where lawyers take a percentage of large settlements—raising questions about incentives. 

In the US, trial lawyers working on contingency are among the biggest contributors to political funding, and this expanding litigation culture sits uneasily with a system that also values individual freedom and personal agency.

This brings us to a harder question that cannot be outsourced to courts or governments: where does responsibility ultimately lie? Platforms are undoubtedly designed for engagement, sometimes aggressively so. But in a society that values choice, it is equally the responsibility of parents, guardians, and eventually the users themselves to exercise judgement. Children do not arrive on social media without context—they are shaped by supervision and socialisation. Blaming platforms alone risks oversimplifying a complex ecosystem of design, behaviour, and responsibility. 

Regulation and litigation may correct the extremes, but they cannot replace the everyday discipline of parenting or the gradual development of self-control. A controlled platform may reduce harm, but an unprepared user will still find risk. The real challenge, therefore, is not choosing between corporate accountability and personal responsibility, but recognising that both must coexist—and that neither can fully substitute for the other.

Poonam Khaira Sidhu is a former IRS officer, University of Michigan–trained LLM and economist, and published author with articles in international journals and Indian newspapers, specialising in international tax and public policy. Views are personal.

(Edited by Ratan Priya)
