Mark Zuckerberg grilled in ‘Big Tobacco’ social media trial in Los Angeles – What did the Meta chief say? What to know

Meta CEO Mark Zuckerberg was grilled repeatedly in a Los Angeles courtroom on Wednesday as a landmark case over the harms of social media moved into a pivotal phase. The plaintiffs argued that platforms like Instagram were deliberately designed to keep young users addicted despite internal warnings.
The case, heard in Los Angeles Superior Court, is one of several major cases this year that experts have likened to the industry’s “Big Tobacco” moment. At the heart of the case is an allegation from a 20-year-old woman known in court records as KGM, who said her obsessive use of Instagram and YouTube worsened her depression and suicidal thoughts. Her case is one of about 20 bellwether trials used to test how juries respond to arguments about harmful product design rather than individual pieces of content.
Zuckerberg questioned about underage Instagram users and age verification failures
The focus of Wednesday’s testimony was whether Meta had taken adequate steps to keep children under 13 off Instagram, which requires users to be at least 13 years old.
Zuckerberg acknowledged that the company had improved its ability to detect underage users but said it had not moved fast enough. “I always wish we could have gotten there sooner,” he told the court.
He said some users lied about their age when joining Instagram, and Meta removed accounts determined to be underage. Plaintiffs’ attorneys challenged the reliability of that system, arguing that the company relied too heavily on official policies rather than enforceable barriers.
“Do you expect a nine-year-old to read all the fine print? Is that the basis for swearing that no children under 13 are allowed?” a lawyer asked.
After repeated questions about age verification, Zuckerberg responded: “I don’t understand why this is so complicated.”
Meta chief says company’s responsibility must include users’ well-being
As the plaintiffs sought to frame Meta’s products as a public health issue rather than a consumer choice, Zuckerberg was asked what duties a tech company owes its users.
“I think a reasonable company should try to help the people who use their services,” he said.
The statement reflected a broader theme that ran throughout the hearing: Social media companies, like manufacturers in previous waves of product-liability lawsuits, should be held accountable not just for what appears on their platforms, but for the way those platforms are built and the behavior they encourage.
Beauty filters: Zuckerberg says Meta consults stakeholders but prioritizes freedom of expression
The statement also harkened back to a long-running debate within Meta over Instagram’s beauty filters, which critics say contribute to distorted self-image and anxiety among young users.
Zuckerberg said Meta consulted with “various stakeholders” about the use of filters, but did not name them.
Plaintiffs’ attorneys questioned him about internal messages suggesting he lifted the ban on certain filters because he believed the restriction was excessive.
“It sounds like something I would say and feel,” Zuckerberg replied. “It feels a little oppressive.”
He was pressed on why the company allowed the feature even after receiving guidance from experts that beauty filters had negative effects, especially on young girls.
Plaintiffs’ attorneys cited a University of Chicago study in which 18 experts identified beauty filters as a feature that harmed young girls. Zuckerberg said he saw the feedback and discussed it internally, but the decision ultimately came down to freedom of expression.
Engagement goals and internal metrics: Zuckerberg opposes ‘company goals’
The hearing also examined whether Meta set explicit goals to increase time spent on Instagram, a question at the center of claims that the platform is designed for addiction.
Zuckerberg pushed back against the idea that increasing engagement is a company goal. He was questioned about a 2015 email thread in which he emphasized that improving engagement metrics was a pressing issue. Zuckerberg said the email chain may have included the words “company goals,” but the comments could have been aspirational, and he insisted Meta did not have those goals.
The plaintiffs then presented evidence that included Instagram chief Adam Mosseri’s goals of increasing daily interaction time to 40 minutes in 2023 and 46 minutes in 2026.
Zuckerberg said Meta uses internal milestones to measure itself against competitors and “deliver the results we want to see,” and that the company is developing services to help people connect.
Judge warns against recording statements with AI smart glasses
Courtroom etiquette became a flashpoint after Judge Carolyn B. Kuhl warned that anyone who recorded Zuckerberg’s testimony using artificial intelligence smart glasses would be considered in contempt.
“If you have done this, you must delete it, otherwise you will be charged with contempt of court,” the judge said. “This is very serious.”
The warning came after members of Zuckerberg’s security team were photographed wearing Meta Ray-Ban AI glasses outside the courtroom. Recording in court is not allowed.
Board control and media awkwardness: Zuckerberg revisits old remarks
Lawyers also asked Zuckerberg about his previous statements suggesting Meta’s board could not meaningfully remove him because of his voting power.
“If the board wants to fire me, I can elect a new board and start over,” he said, referencing remarks on Joe Rogan’s podcast.
He also acknowledged his discomfort with the public questioning.
“I think I’m known for being a little bit bad at it,” he said.
Zuckerberg said in the courtroom that he was “very bad” with the media.
A lawsuit based on design, not content, aims to bypass technology’s traditional legal shield
The hearing marked the first time Zuckerberg has faced a jury in a civil case over child safety concerns. For years, tech companies have relied on federal protections under Section 230 of the Communications Decency Act, which largely shield them from liability for user-submitted content.
The plaintiffs in this case pursued a different strategy. Their arguments center not on individual posts or videos, but on product design: features they say are built to maximize engagement, reward compulsive use, and keep users scrolling.
This approach has so far allowed cases to evade the industry’s most familiar legal defense.
Leading cases and what’s at stake for the tech industry
The Los Angeles case involving KGM is one of about 20 landmark trials designed to gauge jury reaction before hundreds of similar claims move forward.
TikTok and Snap reached settlements before the first trial but remain defendants in other lawsuits tied to the broader litigation.
Zuckerberg’s testimony came about a week after Mosseri took the stand. Mosseri disputed the science behind social media addiction, saying users cannot be “clinically addicted.” He described children’s heavy Instagram use as “problematic use” comparable to “watching TV for longer than you feel good about.”
While psychologists do not classify social media addiction as a formal diagnosis, researchers have documented the harmful consequences of compulsive use among young people, and lawmakers around the world have voiced concerns about addictive designs.
Meta disputes the role Instagram plays in KGM’s mental health
Meta’s defense sought to acknowledge KGM’s mental health issues while disputing that Instagram played a significant role in exacerbating them.
Paul Schmidt, one of Meta’s lawyers, said in an earlier opening statement that the company acknowledged KGM’s mental health issues but argued that Instagram was not the primary factor. Schmidt cited medical records that suggested the real issue was a difficult home life.
Families and lawyers say court can do what Congress can’t
The case is being watched closely by families who argue that the legislative process has stalled despite years of hearings and public scrutiny.
Two years ago, Zuckerberg faced similar questions during a tense congressional hearing on online child safety. In January 2024, he addressed grieving parents and apologized, promising continued investment in protecting children.
Some families are not convinced.
“His apology — if you call it that — was mostly empty,” said John DeMay, whose 17-year-old son Jordan died by suicide in 2022, hours after he became the target of an online sextortion scam on Instagram. “He basically said they did everything they could to stop and prevent these events from happening, but unfortunately that’s not the case.”
DeMay, who travels frequently to Washington to advocate for child safety online, said he now trusts the courts more than Congress.
“I’m hopeful that this case will win, but if we don’t, we still won because we showed the world, with the evidence on the record, that they did one thing and said another,” he said.
Meta faces more lawsuits as child safety allegations spread across states
Meta is also fighting a separate lawsuit in New Mexico, where prosecutors accused the company of violating consumer protection laws by failing to disclose what it knew about potential harm to children. Meta denied the allegations.
Instagram has added safety features aimed at younger users in recent years, but advocacy groups argue those tools remain inconsistent.
A 2025 review by Fairplay, a nonprofit focused on reducing the impact of big tech on children, concluded that “less than a fifth are fully functional and two-thirds (64%) are either largely ineffective or no longer available.”
Former employees also expressed concerns about the company’s internal culture. Kelly Stonelake, a former Meta employee, said she went on medical leave in February 2023 after facing harassment and retaliation for raising concerns about children’s safety. Last year, she filed a lawsuit against Meta for allegedly silencing women.
The suit alleges that Meta collected data about children without parental consent and disclosed it to other adults, creating “an environment that we know is rife with harassment and bullying.”
Why is the hearing being called social media’s “Big Tobacco” moment?
The phrase emerged as shorthand for a legal and political showdown: an attempt to establish that social media companies, like tobacco companies in previous decades, knew their products could harm users but failed to take decisive action.
Zuckerberg’s appearance has placed Meta’s internal deliberations, and the unresolved tensions between free speech, safety and business incentives, under a level of courtroom scrutiny rarely seen in the tech industry.
And the outcome for the broader industry could shape not just financial responsibility but also the design norms that have defined social media for more than a decade.