Meta CEO Mark Zuckerberg Testifies in Landmark Social Media Harm Trial
Mark Zuckerberg, CEO of Meta, testified in a Los Angeles jury trial addressing allegations that Meta and Google's YouTube intentionally designed their social media platforms to be addictive, contributing to mental health challenges in young users. This trial, centered on a lawsuit brought by a 20-year-old woman identified as "Kaley" and her mother, is considered a bellwether case with potential implications for over 1,500 similar lawsuits across the United States.

Trial Focuses on 'Defective Product' Allegations

The trial focuses on whether social media platforms can be classified as "defective products" designed to exploit vulnerabilities in young users. Plaintiffs allege that features such as infinite scroll, autoplay, likes, beauty filters, and push notifications were engineered to maximize engagement.

Kaley's lawsuit specifically claims that her compulsive use of YouTube, beginning at age 6, and Instagram, starting around age 9, contributed to body dysmorphia, anxiety, depression, and suicidal thoughts, alongside experiences of bullying and sextortion.

Meta and Google, which owns YouTube, deny these allegations. They contend that their platforms do not directly cause mental health issues, arguing that such challenges are complex and often stem from various factors, including difficult childhoods. Prior to the trial, TikTok and Snap, also initially named as defendants, reached settlements with the plaintiff.

Zuckerberg Addresses Addiction, Engagement Goals, and Platform Design

During his testimony, Zuckerberg addressed several key areas, including questions about addiction and engagement goals, age verification, beauty filters, and his public image.

Addiction and Engagement Goals

When questioned if increased usage indicated addiction, Zuckerberg responded, "I don’t think that applies here." He was confronted with internal documents, reportedly from Instagram CEO Adam Mosseri's past testimony, indicating goals to increase daily user engagement time on Instagram to 40 minutes in 2023 and 46 minutes in 2026. Zuckerberg clarified that Instagram previously held such goals but later shifted its focus to utility, assuming that valuable content would naturally lead to more usage.

He also addressed emails from 2014 and 2015 that indicated his own goals to increase user engagement time, stating that he did not recall the specific context of a decade-old email and that the company's objective is to build useful services for connecting people. He maintained that his 2024 Congressional testimony, in which he stated the company did not set goals for teams to maximize time on apps, was accurate.

Age Verification

Zuckerberg stated that Meta's platforms include age limits (13+) in their terms of service and remove identified underage users who misrepresent their age. He suggested that operating system and app store providers, such as Apple and Google, are better suited for comprehensive age verification.

Beauty Filters

He defended the company's decision to allow beauty filters as a form of user expression, stating that Meta chooses not to recommend them, even though experts have linked such filters to body image issues in young girls.

Public Image

Zuckerberg addressed questions regarding his media training, referring to internal communications about appearing "authentic" as "feedback" and acknowledging his perceived challenges with public appearances.

Instagram CEO Adam Mosseri's Prior Testimony

Last week, Instagram CEO Adam Mosseri testified, distinguishing between clinical addiction and "problematic use" of social media. He stated that extensive social media use, even up to 16 hours daily, may not constitute addiction, and that what counts as problematic use varies from person to person.

Mosseri also explained Meta's decision to modify a prior ban on image filters that altered users' appearance, following internal discussions about the filters' potential negative mental health effects. He further stated that protecting minors is beneficial for the business.

Internal Documents Fuel Plaintiff's Case

Plaintiff attorneys presented internal Meta documents as evidence of the company's alleged intent:

  • A 2020 document reportedly indicated that 11-year-olds were four times more likely than older users to use Facebook frequently, despite the platform's minimum age of 13.
  • Documents outlining goals to increase the time 10-year-olds spend on Instagram, and a 2018 document stating, "If we wanna win big with teens, we must bring them in as tweens."
  • Meta researchers reportedly found that teenagers reporting negative body image from Instagram often encountered more "eating disorder adjacent content."
  • Leaked internal documents by whistleblower Frances Haugen over four years ago suggested Meta was aware its platforms could be detrimental.

Plaintiff's lawyers argued that these documents, along with platform design features like infinite scroll and push notifications, demonstrate a deliberate intent to make social media apps difficult for young users to disengage from.

Meta and Google Defend Platform Design and Safety Measures

Meta maintains that it has no incentive for users to experience harm, asserting that user safety, particularly for teenagers, is a priority for sustaining its business. A Meta spokesperson stated the company's disagreement with the lawsuit's allegations, expressing confidence in demonstrating its commitment to supporting young people.

Meta has implemented safety features, including parental controls, default privacy settings for young users, content restrictions, and options to limit notifications or screen time. Meta also cited a National Academies of Sciences finding that existing research does not establish a definitive link between social media and changes in children's mental health. The company's legal counsel suggested that Kaley's mental health challenges originated from a difficult childhood, with social media serving as a creative outlet.

Courtroom Recording Incident

During the proceedings, Judge Carolyn B. Kuhl issued a warning to members of Zuckerberg's group, including his executive assistant Andrea Besmehn, for wearing Meta AI glasses in the courtroom. Recording is prohibited in the courtroom, and the glasses have recording capabilities. The judge warned of potential contempt of court if any recording had occurred and mandated deletion of any such content.

Broader Legal Battle and Industry Implications

This trial is part of a broader legal landscape, with over 1,500 similar lawsuits filed by families and school districts against social media companies. Legal experts and plaintiffs' attorneys have drawn parallels to the tobacco trials of the 1990s, which led to increased public warnings and educational efforts.

Plaintiffs' lawyers are employing product liability laws to argue that social media platforms are inherently unsafe products due to their design, a strategy that aims to circumvent Section 230 of the Communications Decency Act, which typically shields tech companies from liability for user-generated content.

Meta is also involved in a separate consumer protection trial in New Mexico, where the state's attorney general alleges the company failed to prevent child sexual exploitation on its platforms. Furthermore, hundreds of lawsuits from school districts are anticipated to proceed to trial later this year.

Families Share Their Stories

Parents of individuals allegedly affected by social media use have been attending the court proceedings. Lori Schott, a parent whose daughter died by suicide in 2020, stated her daughter's body image issues were influenced by social media. Julianna Arnold, whose daughter died from fentanyl poisoning after allegedly meeting a dealer on Instagram, expressed a desire to inform other parents about potential harms, advocating for "guardrails" on tech companies. Joann Bogard, whose son died in 2019 after attempting a "choking challenge" learned from YouTube videos, has also expressed ongoing concerns regarding social media safety.