For years, parents, teenagers, pediatricians, educators and whistleblowers have argued that social media is detrimental to young people's mental health and can lead to addiction, eating disorders, sexual exploitation and suicide.

For the first time, juries in two states took their side.

In Los Angeles on Wednesday, a jury found both Meta and YouTube liable for harms to children using their services. In New Mexico, a jury determined that Meta knowingly harmed children’s mental health and concealed what it knew about child sexual exploitation on its platforms.

Tech watchdog groups, families and children’s advocates cheered the jury decisions.

“The era of Big Tech invincibility is over,” said Sacha Haworth, executive director of The Tech Oversight Project. “After years of gaslighting from companies like Google and Meta, new evidence and testimony have pulled back the curtain and validated the harms young people and parents have been telling the world about for years.”

While it's too soon to tell if this week's outcomes will lead to fundamental changes in how social media platforms treat their young users, the dual verdicts signal a changing tide of public perception against tech companies that is likely to lead to more lawsuits and regulation. For years, the companies have argued that the harms their platforms cause to children are mere byproducts: unintentional and inevitable consequences of broader societal issues, or of bad actors circumventing safeguards. They pushed back against the notion that psychological harms could be the result of social media use and downplayed research that showed otherwise.

When asked during his testimony in the Los Angeles trial whether people tend to use a platform or product more if it's addictive, Meta CEO Mark Zuckerberg said, "I'm not sure what to say to that. I don't think that applies here."

The verdicts show the public's growing willingness to hold the companies responsible for harms and demand meaningful changes in how they operate. What's not apparent, at least not yet, is whether the companies will take heed. Both Meta and Google said they disagree with the verdicts and are exploring legal options, including appeals.

Arturo Béjar, a former Meta engineering director who raised alarms about Instagram's harms inside the company for years before testifying in Congress in 2023, said jury trials “level the playing field” for these trillion-dollar companies. But he cautioned that it will take actual regulation to rein them in.

“One thing that I saw working inside the company that effectively led to behavior change was when an attorney general or the FTC stepped in and required things of the company,” he said. “Both New Mexico and Los Angeles and all the attorneys general that are part of this process have really an extraordinary opportunity and the ability to ask for meaningful change.”

While both cases focused on harms to children, there are key differences between the two. New Mexico's lawsuit was filed by state Attorney General Raúl Torrez in 2023. State investigators built their case by posing as children on social media, then documenting sexual solicitations they received as well as Meta’s response. The jury was asked to determine if Meta violated New Mexico's consumer protection law.

The Los Angeles case had a single plaintiff, who goes by the initials KGM, against Meta, Google's YouTube, TikTok and Snap. TikTok and Snap settled before trial. The plaintiff argued that design features on the platforms of the two remaining defendants, Meta and YouTube, were engineered to be addictive, especially for young users. Because thousands of families have filed similar lawsuits, KGM and a handful of other plaintiffs have been selected for bellwether trials — essentially test cases for both sides to see how their arguments play out before a jury, which could eventually lead to a broader settlement reminiscent of the Big Tobacco and opioid litigation.

By focusing on deliberate design choices and product liability, the lawsuits were able to sidestep Section 230, which generally exempts internet companies from liability for the material users post on their services. Past lawsuits, which have focused on how the platforms distributed content, often failed on these grounds.

“For the first time, courts have held social media platforms accountable for how their product design can harm users,” said Nikolas Guggenberger, an assistant professor of law at the University of Houston Law Center. “This is a new legal territory that could reshape an industry long shielded by Section 230. Platforms will have to rethink their focus on engagement at any cost, which has outlived itself.”

The final outcome of the cases could take years to resolve pending appeals and settlement agreements, but experts say the shift in the public's sentiment and understanding of social media's dangers is already happening. In a 2025 Pew Research Center poll, for instance, 48% of teens said social media harms people their age. In 2022, only 32% said the same.

Amid social media's reckoning, however, artificial intelligence chatbots are emerging as the next frontier in the fight to make technology safer for young people.

“You can ban today's harm, but how do you know what tomorrow is going to bring?” said Sarah Kreps, a professor and director of Cornell University’s Tech Policy Institute. Whether it's another social media app, AI or some other new technology, she added, new things will crop up.

“And people will flock to those because where there’s demand you will see a supply come to meet that demand,” she said.

Linda Singer, an attorney representing the plaintiff, left, shakes hands with attorney Kevin Huff, representing Meta, after they made closing arguments, Monday, March 23, 2026, in state court, in Santa Fe, N.M., in a trial where the social media conglomerate is accused of misleading its users about how safe its platforms are for children. (Eddie Moore/The Albuquerque Journal via AP, Pool)
