New Mexico seeks child safety restrictions on Meta apps and algorithms in trial's 2nd phase

SANTA FE, N.M. (AP) — New Mexico state prosecutors are seeking fundamental changes to Meta’s social media practices and algorithms to protect children in the second phase of a landmark case over allegations that platforms like Instagram pose public safety dangers.
Opening statements will be made Monday at a three-week hearing to decide whether platforms owned by Meta, which also owns Facebook and WhatsApp, constitute a public nuisance under state law.
In the first phase, jurors ordered Meta to pay $375 million after finding that the company intentionally harmed children's mental health and hid what it knew about child sexual abuse on its platforms.
Prosecutors are now asking the judge to order fundamental changes aimed at curbing addictive features, strengthening age verification and preventing child sexual exploitation through default privacy settings and closer monitoring.
Meta has vowed to appeal the jury verdict and warned that it could end Instagram and Facebook services in New Mexico if it is forced to comply with unworkable orders.
“The fact that we are having this hearing at all is itself a remarkable outcome,” said Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University School of Law in California. “This theory is not well accepted as applied to the internet, and it does not fit the internet well.”
New Mexico Attorney General Raúl Torrez said the jury verdict pierced the air of invincibility that shields tech companies from liability for material on their platforms under Section 230, a 30-year-old provision of the U.S. Communications Decency Act.
In a separate case, a Los Angeles jury found both Meta and YouTube liable for harm to children, echoing longstanding concerns about the dangers of social media.
New Mexico prosecutors are demanding that Meta help address the mental health crisis among children through a series of measures and changes, including redesigning the algorithms that make content recommendations so they no longer prioritize constant interaction.
Prosecutors are also targeting other features linked to compulsive use, such as “infinite scrolling” that constantly loads new content; push notifications; and default settings that display counts of “likes” and shares. Their lawsuit also calls for improved age verification and other steps aimed at preventing child sexual abuse.
New Mexico also wants child accounts on Meta platforms to have an associated parent or guardian, as well as a court-supervised child safety monitor to track improvements over time.
Executives said the company has continually improved child safety and addressed compulsive use, and that many of the prosecutors' demands are unwarranted.
Meta plans to call a number of technical experts as witnesses to argue that the demands are impractical, if not impossible, and could force it to “ignore the realities of the internet.”
The company also argues that its platforms are being unfairly singled out among the hundreds of apps used by young people, which could leave children vulnerable on platforms with less robust protections.
The company is invoking the free-expression protections that have shielded social media for decades.
“The state’s proposed mandates violate parental rights and inhibit free expression for all New Mexicans,” Meta said in a statement last week.
The case was the first to go to trial among those filed by more than 40 state attorneys general alleging Meta contributed to the youth mental health crisis. Many seek relief in U.S. federal court.
Torrez, the state attorney general, said this puts the case in a unique position to not only “try and change the paradigm of how this company does business, but also how Big Tech in general is expected to do business going forward.”
Prosecutors risk wading into murky legal waters by seeking the authority to mandate age verification, Goldman said.
“In practice, a court decision saying Facebook must implement age verification would not have clear support from Supreme Court precedent,” he said. “The Supreme Court might uphold it. We don't know.”
The first phase of the trial saw six weeks of testimony from witnesses including teachers, psychiatric experts, government inspectors, senior Meta officials and whistleblowers who had left the company.