Closing arguments began Monday in a landmark trial in New Mexico where social media conglomerate Meta is accused of misleading its users about how safe its platforms are for children.
Jurors will take up the case after the arguments and six weeks of testimony from scores of witnesses, including local teachers, psychiatric experts, state investigators, top Meta officials and whistleblowers who left the company.
The case in New Mexico state court is among the first to reach trial in a wave of litigation involving social media platforms and their impacts on children.
New Mexico prosecutors have accused Meta — which owns Instagram, Facebook and WhatsApp — of prioritizing profits over safety in violation of state consumer protection laws. They have raised concerns about the safety of complex algorithms, and a variety of messaging features and settings.
“It’s clear that young people are spending too much time on Meta’s products, they’ve lost control,” prosecution attorney Linda Singer told the jury in closing statements. “Meta knew that and it didn’t disclose it.”
At the same time, Singer said testimony and evidence at trial showed Meta’s algorithms had been recommending sensational and harmful content to teenagers, and failing to truly enforce its minimum user age of 13.
“The safety issues that you’ve heard about in this case weren’t mistakes. … They were a product of a corporate philosophy that chose growth and engagement over children’s safety,” Singer said. “And young people in this state and around the country have borne the cost.”
Attorneys for Meta dispute the claims and say the company incorporates protections for teenagers and weeds out harmful content, while also acknowledging that some potentially harmful posts get past its safety nets.
Singer urged jurors to impose a civil penalty of more than $2 billion against Meta, based on the maximum $5,000 penalty per violation on two counts of consumer protection violations, and an estimated 208,700 monthly users of Meta platforms under the age of 18 in New Mexico.
“Over the course of a decade Meta has failed over and over again to act honestly and transparently, failed to act to protect young people in this state,” Singer said. “It is up to you to finish this job.”
A second phase of the trial will follow, with a judge deciding whether Meta created a public nuisance and should be on the hook financially to fund programs to address alleged harms to children.
Attorney General Raúl Torrez filed suit in 2023, accusing Meta of creating a marketplace and “breeding ground” for predators who target children for sexual exploitation and failing to disclose what it knew about those harmful effects. State investigators created social media accounts posing as children to document online sexual solicitations and the response from Meta.
Meta attorneys have said the company is honest with platform users about rigorous but imperfect efforts to enforce bans on child sexual abuse material. They also accuse prosecutors of cherry-picking evidence and conducting a shoddy investigation.
Meta executives emphasized at trial that the company continuously improves safety and addresses compulsive social media use without infringing on free speech or censoring users.
But the prosecution on Monday said that public assurances about safety from Meta executives, including Mark Zuckerberg and Instagram head Adam Mosseri, often didn’t square with internal studies and communications at the company.
“It was included in Meta’s internal research — again this was research that didn’t get disclosed by Meta — one-in-three teens experienced problematic use,” Singer said. “They knew these kids were struggling with problematic use — again, addiction.”
A jury assembled from residents of Santa Fe County, including the politically progressive state capital city, will decide whether Meta violated the state’s Unfair Practices Act on two counts, including “unconscionable” trade practices.
A finding of willful violations would open the way for possible fines of up to $5,000 per violation. Prosecutors say that could add up to billions of dollars, while Meta said it would seek a different calculation.
Tech companies have been protected from liability for material posted on their social media platforms under Section 230, a 30-year-old provision of the U.S. Communications Decency Act, as well as a First Amendment shield.
Prosecutors say New Mexico is not seeking to hold Meta accountable for content on its platforms, but rather its role in pushing out that content through complex algorithms that proliferate material that can be addictive and harmful to children.
In California, a jury already is sequestered in deliberations on whether Meta and YouTube should be liable for harms caused to children using their platforms. The bellwether case could impact how thousands of similar lawsuits against social media companies are likely to play out.