Meta Platforms CEO Mark Zuckerberg testified this week in a Los Angeles courtroom in a closely watched trial that is examining whether leading social media platforms were intentionally designed in ways that contribute to youth addiction and mental health harm.
The case is among the first in the United States to reach a jury on claims that companies such as Instagram and YouTube engineered features that encouraged compulsive use among children and adolescents. Its outcome could shape future legal standards governing platform design and corporate responsibility.
The lawsuit was brought by a California woman identified in court filings as KGM, now 20, who alleges she began using Instagram and YouTube at a young age and developed addictive usage patterns that worsened her depression and led to suicidal thoughts. The complaint argues that the companies prioritised user engagement and growth among minors despite internal awareness of research indicating potential psychological risks to teens.
TikTok and Snapchat, which were initially named as defendants, settled before trial proceedings began. The case now continues against Meta and Google’s YouTube.
During testimony in Los Angeles Superior Court, Zuckerberg addressed Meta’s policies on age restrictions, safety measures and platform design. He outlined Instagram’s prohibition on users under 13 and described tools deployed to detect and remove underage accounts. He acknowledged that verifying age and enforcing compliance remain technically difficult.
Attorneys for the plaintiff questioned Zuckerberg about internal company documents suggesting sustained efforts to increase user engagement, including initiatives aimed at extending the amount of time users spend on social media applications. These materials form a central part of the argument that certain design choices may have encouraged habitual use among young users. Meta maintains that its product development strategies were not intended to exploit minors and that the company has invested in safety features and protective measures.
The trial has also examined Instagram’s safety updates introduced after 2019, including changes to age settings and teen protections, as well as broader internal decision-making processes related to youth wellbeing. The proceedings unfold amid mounting scrutiny of social media companies across the United States, where thousands of similar lawsuits allege that algorithmic design and engagement-driven features have contributed to mental health challenges among young people.
Legal observers note that the case could become a significant benchmark for how courts interpret claims of psychological harm linked to digital platforms.
A verdict in favour of the plaintiff could influence ongoing litigation and potentially accelerate regulatory efforts aimed at tightening oversight of social media companies and their responsibilities toward young users.