
Juries Find Meta and Google Liable in Separate Cases Over Platform Design and Child Safety



Two recent jury verdicts in the United States have found the technology companies Meta and Google liable for harms related to their social media and video platforms.

The outcomes are seen as potential precedents for thousands of similar pending lawsuits. Both companies have stated they disagree with the verdicts and plan to appeal.

Los Angeles Case: Platform Design and User Harm

A jury in Los Angeles County Superior Court concluded that Meta Platforms, Inc. (owner of Instagram) and Google LLC (owner of YouTube) were negligent in the design of their platforms. The jury found the companies failed to adequately warn young users of potential dangers, and that this negligence was a substantial factor in causing harm to the plaintiff.

The case was brought by a 20-year-old woman identified in court documents as KGM or Kaley. She testified that she began using YouTube at age six and Instagram at age nine, developing patterns of extensive, daily use during her childhood.

Her legal team argued that specific platform design features—including infinite scroll, autoplay functions, and persistent notifications—were engineered to promote compulsive engagement, particularly among young users.

The Verdict and Damages

After approximately nine days of deliberation, the jury awarded the plaintiff $3 million in compensatory damages. Jurors subsequently recommended an additional $3 million in punitive damages, having found the companies acted with malice, oppression, or fraud.

The jury assigned 70% of the responsibility to Meta and 30% to Google. The judge has final authority over the awarded amounts.

Company Defenses and Legal Context

  • Meta argued that the plaintiff's mental health challenges were unrelated to social media use, citing other factors in her personal life.
  • Google contended that YouTube is a streaming platform, not a social media service, and disputed the plaintiff's reported usage time. Both companies cited existing safety features and parental controls.

TikTok and Snapchat, also named in the original lawsuit, reached confidential settlements before trial. The judge instructed jurors not to consider the content users viewed on the platforms, as liability protections under Section 230 of the Communications Decency Act generally shield companies for user-posted content. The case focused instead on product design and operations.

New Mexico Case: Child Safety and Consumer Protection

A jury in Santa Fe, New Mexico, found Meta liable for violating the state's Unfair Practices Act. The verdict followed a nearly seven-week trial brought by the state's Attorney General.

State prosecutors alleged that Meta prioritized profits over user safety, made false or misleading statements about platform safety, and engaged in "unconscionable" trade practices that exploited children's vulnerabilities. The case included evidence from an undercover investigation where state agents, posing as children under 14 on Facebook and Instagram, documented receiving sexual solicitations and explicit material.

The Verdict and Next Steps

The jury found Meta liable on all counts, determining the company had concealed information about the dangers of child sexual exploitation and impacts on children's mental health. The jury identified thousands of violations, resulting in a civil penalty of $375 million.

A second phase of the trial, scheduled for May, will proceed without a jury. A judge will determine if Meta's platforms constitute a public nuisance and whether the company should be required to fund public programs or implement specific platform changes, such as more effective age verification.

A Meta spokesperson stated the company disagrees with the verdict and will appeal, affirming its commitment to user safety and investment in safety measures.

Research on Platform Design and Compulsive Use

Parallel to the legal actions, research into platform design has drawn comparisons to other industries. Cultural anthropologist Natasha Dow Schüll, whose work focused on video slot machine design, identified four features present in gambling machines that she states now appear in social media and other apps:

  • Solitude: Interaction primarily between the user and device.
  • Bottomlessness: Continuous, endless content streams (e.g., infinite scroll).
  • Speed: Rapid pace of interaction and content delivery.
  • Teasing: Algorithms that deliver content close to, but not exactly, a user's perceived preference to encourage continued engagement.

Schüll described the combination of these features as a "recipe for overuse." Neuroscientist Jonathan D. Morrow noted the "teasing" feature involves apps withholding a perceived reward to maintain engagement.

Separate scientific research on adolescent social media use has identified behaviors that mirror symptoms of addiction, such as withdrawal and impaired functioning. Researchers have suggested platform modifications to reduce compulsive use, including limiting notifications, disabling infinite scroll and autoplay for minors, implementing default privacy settings, and establishing more robust age verification.

Broader Legal and Regulatory Context

The Los Angeles and New Mexico cases are among the first of numerous similar lawsuits to reach trial.

  • Pending Litigation: Thousands of cases are pending in U.S. courts against social media companies, filed by individuals, school districts, and state attorneys general. Over 40 state attorneys general have filed suits against Meta alleging its platforms contribute to a youth mental health crisis.
  • Bellwether Status: The Los Angeles case is considered a "bellwether" trial, meaning its outcome could influence the resolution of many similar claims.
  • Regulatory Actions: In response to concerns about online safety, some governments are pursuing new regulations. In Australia, the government is consulting on a proposed "digital duty of care" for tech companies. In the U.S., the Kids Online Safety Act (KOSA), which proposes design standards for platforms used by minors, passed the Senate in 2024 and awaits action in the House of Representatives.

Legal experts note that the lawsuits against Meta and Google focus on product design and liability, a legal strategy that attempts to bypass traditional protections like Section 230 by arguing the platforms themselves are defective products.

Credit rating agency Moody's has reported over 4,000 pending cases targeting 166 companies for alleged addictive software design, including cases against video game makers, online gambling apps, and AI chatbot developers.

Company Statements and Market Reaction

Following the verdicts, Meta and Google reiterated their positions.

  • A Meta spokesperson stated, "We respectfully disagree with the verdict and will appeal," adding that teen mental health is complex and not attributable to a single app.
  • A Google spokesperson said the company plans to appeal and stated the Los Angeles case "misrepresents YouTube," characterizing it as a responsibly built streaming platform.

After the Los Angeles verdict, Meta's stock price fell over 7%, and Alphabet's (Google's parent company) share price dropped more than 2%. Following the New Mexico verdict, Meta's stock rose approximately 5% in after-hours trading.