Social Media Giants Face Extensive Lawsuits Over Addiction and Child Exploitation Claims
Major social media companies, including Meta Platforms (owner of Facebook and Instagram), Google (parent of YouTube), Snap Inc., and ByteDance (owner of TikTok), are currently facing a series of lawsuits across the United States. These legal challenges primarily allege that platforms are intentionally designed to be addictive, contributing to mental health issues among young users, and in a separate proceeding, accuse Meta of fostering environments conducive to child exploitation. Two high-profile trials are underway: one in Los Angeles, California, focusing on personal injury claims related to mental health, and another in Santa Fe, New Mexico, initiated by state prosecutors concerning child exploitation.
Ongoing Trials and Key Allegations
Los Angeles Superior Court: Youth Mental Health and Addiction Claims
A bellwether trial began in Los Angeles Superior Court, involving a 20-year-old plaintiff identified as KGM, who, along with her mother, is suing Meta and Google. KGM alleges that the intentional design of social media platforms led to her developing an addiction at a young age, resulting in mental health issues such as depression, anxiety, body dysmorphia, self-harm, and suicidal ideation. She started using YouTube at age six and Instagram at age nine.
- Plaintiff's Claims: KGM states she became compulsive in her use due to features like infinite scroll, autoplay videos, constant notifications, and recommendation algorithms. She testified about creating multiple accounts to "buy" likes for perceived popularity, experiencing a "rush" from notifications, and using cosmetic filters extensively, which she believes contributed to body dysmorphia. Her attorneys argue she was a vulnerable user and that the companies prioritized profits over youth well-being.
- Defendants' Response: Meta and Google deny wrongdoing. They argue that KGM's mental health challenges stemmed from a difficult home life and other factors preceding her social media use. They assert that their platforms do not intentionally harm children and highlight various safety features and parental controls. Adam Mosseri, head of Instagram, testified that he does not believe users can be "clinically addicted" to social media but acknowledges "problematic use," which he likened to extended television viewing. He denied that Instagram targets teens to maximize profits, stating that teens generate less revenue. Meta CEO Mark Zuckerberg is also scheduled to testify.
- Settlements: Snap Inc. and TikTok, initially named as defendants in KGM's case, reached undisclosed settlement agreements with the plaintiff before the trial commenced.
Santa Fe, New Mexico: Child Exploitation Allegations
Jury selection concluded for a stand-alone trial in Santa Fe, New Mexico, where state prosecutors are suing Meta. This lawsuit, initiated by Attorney General Raúl Torrez, alleges that Meta knowingly fostered an environment conducive to predators targeting children for sexual exploitation on platforms like Facebook and Instagram.
- Prosecution's Claims: The lawsuit stems from a state-conducted undercover investigation, "Operation MetaPhile," in which proxy social media accounts posing as minors documented sexual solicitations. Prosecutors allege that Meta's design choices and profit motives prioritized engagement over child safety, leading to unmoderated groups related to commercial sex and the facilitation of child sexual abuse material (CSAM). Allegations include Meta potentially benefiting financially from ads placed alongside content that sexualized children, as well as internal estimates that 100,000 children experience online sexual harassment daily. Further claims suggest that Mark Zuckerberg approved minors' access to AI chatbot companions despite internal safety warnings about potential sexual interactions.
- Meta's Response: Meta refutes the civil accusations, describing the prosecutors' methodology as "sensationalist." The company states its commitment to supporting young people and highlights a decade of efforts, including collaboration with experts and law enforcement, in-depth research, and the implementation of Teen Accounts with built-in protections and parental management tools. Meta has denied that its platforms are designed to exploit children. Mark Zuckerberg has provided a deposition, portions of which may be presented in court.
Legal Strategies and Broader Context
Legal Framework and Challenges:
Plaintiffs in both sets of trials are attempting to circumvent Section 230 of the Communications Decency Act, which generally shields online platforms from liability for user-generated content. Attorneys are focusing on the companies' design choices—such as algorithms, notifications, and other engagement features—rather than the content itself, as the source of alleged harm. Legal teams suggest these trials could set new precedents, drawing parallels to the lawsuits against tobacco companies in the 1990s.
- Objectives: Plaintiffs seek financial damages and injunctive relief, aiming to mandate changes in platform design and establish industry-wide safety standards.
- State-Level Approach: New Mexico's prosecution is leveraging consumer protection and public nuisance laws, an approach that could establish a novel legal avenue for states to hold social media companies accountable, potentially bypassing typical immunity provisions.
- Judicial Rulings: In the Los Angeles case, a judge ruled that jurors must consider the companies' design choices, not just user-generated content. In New Mexico, Meta's attempts to dismiss the case based on Section 230 and the First Amendment were denied, with the court focusing on platform design and non-speech issues.
Scientific Debate on Addiction:
While lawsuits frequently refer to "social media addiction," the scientific community continues to debate the classification. Experts like Ofir Turel (University of Melbourne) and Dr. Jessica Schleider (Northwestern University) acknowledge potential harms but express reservations about the term "addiction" outside a medical context, preferring "problematic use" or "use disorders." They note that while platforms use behavioral techniques found in gambling (e.g., intermittent reinforcement), the severity of consequences and withdrawal symptoms typically differ significantly from substance addictions. Brain imaging research has shown some similarities between excessive social media use and excessive gambling. The American Academy of Pediatrics recommends the term "problematic use."
Related Legal Actions:
These trials unfold amid numerous other legal proceedings against social media corporations. Over 40 state attorneys general have filed federal lawsuits against Meta alleging harm to young people and exacerbation of the youth mental health crisis through deliberate platform design. A separate federal multi-district litigation (MDL), involving over 235 plaintiffs that include families, school districts, and attorneys general from nearly three dozen states, is scheduled for June in San Francisco. Snap Inc. also faces a separate lawsuit from New Mexico prosecutors alleging that its platform similarly facilitated child sexual exploitation.
The outcomes of these ongoing trials could significantly influence how social media platforms are designed, potentially leading to substantial financial implications for tech companies and impacting industry-wide safety standards.