Adam Mosseri, the head of Instagram, spent part of this week in a Los Angeles courtroom arguing a point that will sound familiar to anyone who’s ever scrolled past midnight: he doesn’t believe people can be clinically addicted to social media. Mosseri drew a line between what he called “problematic use” and a medical diagnosis, likening doomscrolling to being hooked on a Netflix show: something you can do for hours, and that can harm your life, but that is not the same as a clinical addiction.
The testimony came during a landmark trial that could reshape how courts treat social platforms. Plaintiffs are asking jurors to hold companies like Meta and Google responsible for harms to young people; the case’s bellwether plaintiff, identified only as KGM, says she spent up to 16 hours a day on the apps and suffered real-world consequences. Mosseri acknowledged that such extreme use is “problematic,” but he pushed back against the idea that it meets the bar for a clinical disorder.
That distinction — problematic versus clinical — matters a lot in law and medicine. Clinicians use strict criteria to diagnose addictions: persistent behavior despite harm, loss of control, and measurable impairment in daily functioning. The World Health Organization has recognized gaming disorder as a diagnosable condition in ICD-11, but it has not classified social media use the same way. Public-health researchers and clinicians are still debating whether social-media behaviors fit the same diagnostic mold.
At the same time, a growing body of research links heavy social-media use to anxiety, depression, sleep disruption, and attention problems — especially among adolescents. Editorials and special issues in psychiatric journals have flagged the mental-health risks of excessive smartphone and social-media use, even while stopping short of declaring a new, universally accepted diagnosis. The science is messy: correlation is common, causation is hard to prove, and individual vulnerability varies widely.
Mosseri’s courtroom framing is as much a legal strategy as it is a public-relations posture. If social-media use is not a recognized clinical disorder, the argument goes, then companies can’t be held to the same standard of responsibility as a maker of an addictive drug or a casino. Plaintiffs counter that platforms are engineered to maximize engagement — using algorithms, notifications, and design tricks that exploit human attention — and that those mechanics can create patterns of compulsive use with real harms. The jury will have to weigh technical design choices against medical definitions.
There’s also a generational and cultural angle. For many adults, “addiction” conjures images of substance dependence with clear withdrawal syndromes. For younger users, the harms are often social and developmental: lost sleep, missed schoolwork, body-image issues, and constant pressure to perform online. Clinicians say those harms can be severe and long-lasting even if they don’t fit neatly into existing diagnostic categories. That tension — between lived experience and diagnostic language — is central to the debate.
From a product perspective, Instagram and other platforms have rolled out safety features aimed at younger users: time limits, prompts to take breaks, and changes to recommendation systems. Critics argue that those measures are often cosmetic or insufficient and that design incentives still favor engagement above wellbeing. Mosseri pointed to product changes in his testimony as evidence that the company takes safety seriously, while plaintiffs say those changes don’t undo years of design that prioritized growth.
So what should you take away from this? First, heavy social-media use can be harmful even if it’s not yet classified as a clinical addiction. Second, context matters: a teenager who spends hours a day on an app and shows declining school performance or mood changes presents a different clinical picture than an adult who binges on streaming video for a weekend. Third, the legal system is still catching up to the technology; how courts interpret “addiction” could influence regulation, product design, and corporate accountability for years to come.
If you’re worried about your own or a loved one’s use, practical steps matter more than labels. Simple moves — turning off nonessential notifications, setting device-free hours, using built-in screen-time tools, and talking openly about how online time affects mood and sleep — can reduce harm. For parents, the conversation is less about policing and more about modeling habits and creating predictable boundaries. The debate over whether social media is a clinical addiction won’t change the fact that for some people, the apps are causing real, measurable harm.
The courtroom will keep sorting the legal questions, and researchers will keep refining the clinical ones. In the meantime, Mosseri’s testimony is a reminder that language matters: calling something an “addiction” carries legal and medical weight, but it doesn’t change the lived experience of someone who can’t sleep, can’t focus, or feels worse after logging on. Whether the label changes or not, the conversation about how to design, regulate, and live with these platforms is only getting louder.