Social media’s Big Tobacco moment arrives with landmark child harm verdicts

The tide is finally turning on social media. In just the last two days, two separate juries, in New Mexico and California, have found social media companies responsible for harming children. No U.S. court had ever previously found social media platforms liable for the harm they cause to children. This is a historic moment.
The verdict at the trial in Los Angeles is notable because the lawsuit used tort law to hold social media companies liable for the mental health damages suffered by an individual plaintiff, referred to in the lawsuit as "Kaley." The jury found Meta and YouTube liable for negligence in designing and operating an addictive product that was harmful to children, and for failing to warn users of these harms. Never before has a case like this reached trial, let alone ended in a verdict against the companies.
For more than a decade, children, young victims and parents have suffered countless harms from social media, including suicide, self-harm, eating disorders, anxiety and depression, but have been unable to achieve justice. If another product, such as a defective toy or poisoned food, were harming children, parents would already have their day in court.
Not so for social media. Tech companies have hidden behind the massive immunity shield of a law called Section 230, which says online platforms are not liable for harm resulting from the speech of third parties they host. That's why years of lawsuits filed against social media platforms that harm children have been dismissed outright under Section 230.
Family members of the victims spoke to reporters outside Los Angeles Superior Court in Los Angeles on March 25 after a jury found Meta and YouTube were negligent in a lawsuit alleging their platforms contributed to harmful behavior among young users. (Kayla Bartkowski/Los Angeles Times via Getty Images)
Not anymore. This tort case took a new legal approach and focused solely on social media's product design (recommendation algorithms, "likes," autoplay, infinite scrolling and notifications), which is addictive and harmful to children regardless of the content it delivers.
The strategy worked. The jury saw evidence of what these companies knew. For example, when Meta co-founder and CEO Mark Zuckerberg took the stand at the trial, he was asked about his decision to allow beauty filters that mimic plastic surgery on Instagram after 18 of Meta's internal experts warned that they were harmful to young girls and could contribute to body dysmorphia.
He tried to brush this off, saying: "I think a lot of times it's paternalistic to tell people they can't express themselves that way."
Jurors also saw internal emails and pitch decks with lines like "young people are the best," "omg yall IG is a drug" and "we're basically pushers." The jury could clearly see that these platforms were designed to be addictive, that these companies knowingly harmed children and that they failed to warn users. As the plaintiff's lead attorney, Mark Lanier, said at a press conference after the verdict, "We sent a message with this that you will be held liable for the features that encourage addiction."
Thousands of other cases are still awaiting trial in the state of California alone, and with this first favorable outcome, companies will be encouraged to settle those cases rather than return to court. Meta and YouTube, as well as other platforms named in pending lawsuits such as TikTok and Snapchat, should all be ready to pay. Take the $6 million in damages awarded in this single verdict and multiply it by thousands. This is a Big Tobacco moment for Big Tech.
Big Tech's allies and sympathizers are trying to argue that this decision diminishes parents' responsibility for raising healthy children. FIRE Executive Vice President Nico Perrino, for example, tweeted: "Kaley says she started using YouTube at age 6 and Instagram at age 9, and told the jury she was on social media 'all day long' as a child." "Where were her parents?" he added.
They're asking the wrong question. The problem is not absentee parenting, but addictive products that lack meaningful parental controls and strong age verification. As I explain in my book "The Tech Exit," social media platforms actively circumvent parents to reach their children; they recruit ever-younger users and, as this trial in Los Angeles makes clear, they do not effectively age-restrict their platforms or require any parental consent.
The best outcome for these pending cases is not just huge payouts to the victims, but also a restructuring of the way social media companies do business. One of the most important pending cases, a multidistrict lawsuit by 40 state attorneys general that goes to trial this summer, could do just that.
In 1998, attorneys general for 52 states and territories signed the Master Settlement Agreement (MSA) with the four largest tobacco companies in the United States to resolve dozens of state lawsuits seeking to recover billions of dollars in health care costs related to the treatment of smoking-related diseases.
That agreement changed the industry forever: it barred tobacco companies from targeting young people in their advertising, banned the use of cartoons that appeal to children in ads or on packaging, prohibited payments to promote tobacco products in movies, TV, music and video games, and provided money for states to fund anti-smoking campaigns, among other restrictions.
As part of a potential master settlement agreement, attorneys general could similarly seek robust age verification measures to keep out minors, parental consent requirements for social media accounts, or even a requirement that platforms raise the minimum account age from 13 to 16.
A settlement agreement could also require companies to disable certain addictive features, such as recommendation algorithms, infinite scrolling, autoplay and "likes," for minors under a certain age. Social media doesn't need to be addictive. This first favorable verdict matters because it suggests the pending multidistrict litigation could lead to a Big Tobacco-style settlement that would completely change the social media industry.
CLICK HERE TO READ MORE FROM CLARE MORELL



