Major Social Media Trials Underway: Child Safety, Addiction, and Mental Health at the Center
In recent weeks, several major legal proceedings have commenced against prominent social media companies, including Meta (parent company of Facebook, Instagram, and WhatsApp), Google's YouTube, TikTok, and Snap. These trials, taking place in both state and federal courts, stem from allegations that the design and operation of these platforms have caused harm to children and adolescents.
The cases, which involve claims of addiction, sexual exploitation, and negative mental health impacts, are being watched closely as potential precedents for more than a thousand similar lawsuits.
Trial in New Mexico: Public Nuisance and Consumer Protection Claims
The Case
New Mexico Attorney General Raúl Torrez initiated a public nuisance trial against Meta in Santa Fe state court. The lawsuit, filed in 2023, alleges that Meta's platforms—Facebook, Instagram, and WhatsApp—created a hazardous environment for children, enabling sexual exploitation, solicitation, sextortion, and human trafficking.
The state argues that Meta prioritized user engagement and profit over child safety and misrepresented the safety of its platforms, violating state consumer protection laws.
The trial began in early February and is expected to last approximately seven weeks.
The Investigation
The state's case is based on an undercover investigation where agents created proxy social media accounts designed to appear as minors. The investigation documented sexual solicitations and monitored Meta's response to such activities. Evidence presented included details from "Operation MetaPhile," which led to the 2024 arrest of three men charged with preying on children via Meta platforms.
Key Allegations
- Meta knowingly allowed unmoderated groups related to commercial sex and facilitated the spread of child sexual abuse material (CSAM).
- Internal documents reportedly suggest Meta estimated approximately 100,000 children on Facebook and Instagram experience online sexual harassment daily.
- A 2020 Meta estimate indicated that 500,000 children received sexually inappropriate communications, including grooming, on Instagram daily. Meta has stated the technology used for this estimate was broad and may have included interactions that were not inappropriate.
- The "People You May Know" recommendation algorithm was identified as a significant factor in connecting predators with victims, implicated in 79% of identified cases in 2018.
- In 2018, approximately 30% of adults whose accounts were disabled for targeting children returned to the platform and resumed their behavior.
- New allegations emerged during the lawsuit claiming Meta may have financially benefited from placing advertisements alongside content that sexualized children.
- A recent filing alleges that CEO Mark Zuckerberg approved minors' access to AI chatbot companions despite internal safety warnings that the bots could engage in sexual interactions.
Proposed Remedies
The state is asking the judge to order Meta to:
- Add age verification for New Mexico users
- Prohibit end-to-end encryption for users under 18
- Cap minor users' usage at 90 hours per month
- Limit engagement-boosting features (e.g., infinite scroll, autoplay)
- Detect 99% of new CSAM
Meta's Response
Meta disputes the allegations, describing the prosecution's methodology as "sensationalist." The company states it has strict, longstanding rules against child exploitation and has invested billions in detection technology and safety features.
Meta highlighted its introduction of "Teen Accounts" in 2024, which automatically apply stricter settings for users under 18, including private profiles by default and limited messaging capabilities. The company also notes its transparency regarding content removal data.
Regarding the proposed remedies, Meta has argued that achieving a 99% CSAM detection rate is mathematically impossible because it would require knowing the total amount of CSAM as a denominator. The company also noted that barring encrypted messaging could drive users to other platforms.
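Meta's denominator argument can be restated in terms of recall, the standard measure of detection rate. A minimal sketch (illustrative only; the function and figures are hypothetical, not drawn from the filings):

```python
# Recall, the "detection rate" the proposed remedy implies, is defined as:
#     recall = items_detected / items_actually_present
# The denominator includes material no one has found, so it is
# unobservable in practice; any claimed recall is an estimate built on
# an assumed ground truth.

def recall(detected: int, total_present: int) -> float:
    """Detection rate; undefined when the true total is unknown."""
    if total_present <= 0:
        raise ValueError("true total unknown or zero: recall is undefined")
    return detected / total_present

# Only with an *assumed* ground truth (say 1,000 items, 990 detected)
# can a 99% rate even be computed:
print(recall(990, 1_000))  # 0.99
```

This is why a court-ordered 99% target is hard to verify: the metric can only be audited against an estimated, not a known, denominator.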
Key Testimonies
- Mark Zuckerberg and Instagram head Adam Mosseri stated in taped depositions that harms to children, including sexual exploitation, are inevitable on Meta's platforms. Zuckerberg stated that when serving billions of people, a small percentage will be criminals, and platform perfection is not an achievable standard.
- Zuckerberg testified that Meta has improved in identifying underage users, acknowledging a desire to have achieved this sooner.
- Zuckerberg authorized end-to-end encryption for Facebook Messenger in 2023, despite warnings from child safety groups about potential risks to children. In a deposition, he stated that user privacy afforded by encryption was a more pressing issue.
- Mosseri stated Meta has developed technology to identify accounts with suspicious behavior to prevent them from interacting with youth accounts.
- A law enforcement officer testified that reports of child sexual abuse material from the platform decreased following encryption.
- Educators and law enforcement officials discussed alleged harms and crimes observed on Meta platforms.
- Whistleblowers provided testimony about internal company discussions.
Judge's Ruling on Section 230
Meta's attempts to have the case dismissed based on Section 230 of the Communications Decency Act and the First Amendment were denied in June 2024.
The court's ruling focused on the lawsuit's emphasis on Meta's platform design and non-speech issues, rather than liability for user-generated content.
New Mexico prosecutors have clarified that they are not seeking to hold Meta accountable for the content itself, but for its role in disseminating that content through its algorithms.
Potential Outcome and Impact
If violations are found, fines could reach $5,000 per violation, which prosecutors estimate could amount to billions of dollars. Any court order would apply only to Meta's operations in New Mexico, though Meta could extend the required changes elsewhere or cease operating in the state. A sweeping win could influence other lawsuits against tech companies.
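The scale implied by the per-violation fine is worth making explicit. A quick back-of-the-envelope calculation (the billion-dollar target is the prosecutors' own order-of-magnitude estimate, not a court figure):

```python
# Implied violation count behind a multi-billion-dollar estimate,
# given the statute's $5,000-per-violation cap.
fine_per_violation = 5_000          # USD, per violation
target_total = 1_000_000_000        # USD, one billion as a reference point

violations_needed = target_total // fine_per_violation
print(violations_needed)  # 200000
```

In other words, each billion dollars of exposure corresponds to roughly 200,000 individual violations, which gives a sense of how many findings the state's estimate presumes.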
Bellwether Trial in California: Addiction and Mental Health Claims
The Case
A jury trial against Meta (Instagram and Facebook) and Google's YouTube commenced in Los Angeles Superior Court. The plaintiff, identified as KGM (now 20), alleges that these platforms were intentionally designed to be addictive, causing her to develop mental health issues including depression, anxiety, body dysmorphia, and suicidal ideation.
This trial is the first of over 1,500 similar personal injury cases currently pending against major social media companies.
TikTok and Snap were initially named as defendants in this case but reached undisclosed settlement agreements with the plaintiff before the trial commenced.
KGM's Allegations
KGM states she began using YouTube at age 6 and Instagram at age 9. Her lawsuit claims she "developed a compulsion to engage with those products nonstop" due to their "addictive design" and "constant notifications."
She testified about setting up multiple accounts to interact with her own posts and "buy" likes to appear popular. She described receiving a "rush" from notifications and struggling to set usage limits.
Instagram's appearance-altering filters were a significant part of her use; she estimated that "almost all" of her Instagram photos used a filter.
She reported not experiencing negative feelings associated with body dysmorphia before her social media and filter use began. The lawsuit also alleges she experienced bullying and sextortion on Instagram and that the company was slow to respond.
Key Testimonies
- KGM: Testified that her early social media use led to addiction and worsened her mental health. She described her upbringing, noting both positive memories and challenging periods at home.
- Adam Mosseri (Head of Instagram): Testified that he does not believe users can be "clinically addicted" to the app, while acknowledging the possibility of "problematic use," which he compared to watching television for an extended period. He stated that teens generate less revenue for the platform than other demographics. Internal documents from 2019 showed that Meta executives debated banning appearance-altering filters, with one email noting expert consensus on their harm; Instagram initially banned all face-distorting filters but later revised the policy.
- Mark Zuckerberg (Meta CEO): Testified that Meta has improved in identifying underage users, noting some users misrepresent their age and that the company removes identified underage users.
- Victoria Burke (KGM's former therapist): Testified that she diagnosed KGM with body dysmorphic disorder and social phobia at age 13. She indicated social media contributed to KGM's mental health issues but was not the sole cause, describing it as "a contributing factor, not a causation factor." Her notes mentioned in-person bullying, school stress, and family issues as other sources of anxiety.
Company Defenses
- Meta: Argues that the plaintiff's mental health challenges stemmed from a difficult family life, preceding her social media use. The company points to evidence showing KGM faced challenges before using social media. Meta has implemented various safety features including default privacy protections, content limits for teen accounts, parental supervision tools, and AI for identifying minor users. A Meta spokesperson stated the lawsuits "misportray our company."
- YouTube: Spokesperson José Castañeda called the allegations "simply not true," asserting that providing a "safer, healthier experience" for young people is central to their work. YouTube cites services and policies developed with experts, content restrictions, AI for minor identification, and parental control tools.
- TikTok and Snap: Both companies reached settlements with KGM's lawyers, with terms undisclosed. Both denied wrongdoing. Snap previously stated its design differs from traditional social media.
Legal Significance and Scientific Debate
This trial is the first in which social media platforms must defend against mental health harm claims before a jury. Legal experts suggest a plaintiff victory could have significant financial implications and could prompt changes in platform design.
The plaintiffs are focusing on platform design features (algorithms, notifications, infinite scroll, autoplay) as the source of harm, rather than user-generated content, in an attempt to bypass Section 230 protections.
The scientific community continues to debate whether social media is truly "addictive." The American Academy of Pediatrics recommends using the term "problematic use." Some brain-imaging research has shown excessive social media use is associated with brain differences comparable to those seen in excessive gambling.
Available evidence indicates an association between social media use and mental health outcomes, but research findings are mixed and the average negative impact at a population level is reported as small. The relationship may be bidirectional, where existing poor mental health could also influence social media engagement.
Outcome and Potential Impact
The jury will determine if Meta and YouTube are liable for harms to KGM. A finding against the companies could result in damages and necessitate changes to their platforms.
The outcome could influence over 1,000 similar personal injury cases currently pending, as well as cases from school districts and state attorneys general.
Concurrent Legal Context
These trials are occurring amid numerous other legal proceedings against social media corporations:
- Over 40 state attorneys general have filed federal lawsuits against Meta, alleging the company harms young people and exacerbates the youth mental health crisis.
- A separate multi-district litigation (MDL) is scheduled for June in San Francisco, involving over 235 plaintiffs, including families, school districts, and attorneys general from nearly three dozen states.
- New Mexico prosecutors have also filed a lawsuit against Snap Inc., alleging similar child sexual exploitation facilitated by its platform, with a trial date not yet scheduled.
- Congressional hearings have seen lawmakers question tech executives regarding content that may promote bullying, drug abuse, and self-harm among youth.
Key Definitions and Legal Context
Section 230 of the Communications Decency Act: A federal law that generally protects online platforms from liability for content posted by third parties. Courts are now examining whether this immunity applies to claims based on platform design features rather than user-generated content.
Bellwether Trial: A test case used to evaluate jury reactions and potential verdicts in a group of similar lawsuits, helping establish legal precedent and potentially guide settlement negotiations in other cases.