LOS ANGELES — The world’s biggest social media giants are heading to court for the first time in a wave of landmark trials that will determine whether their platforms are responsible for harming children.
On Tuesday, jury selection began in Los Angeles for the first of the lawsuits, which will proceed in both state and federal court. More than 1,600 plaintiffs — including over 350 families and over 250 school districts — accuse the owners of Instagram, YouTube, TikTok and Snap of knowingly designing addictive products harmful to young users’ mental health.
The plaintiffs allege that Instagram, Facebook, YouTube, TikTok and Snap “have rewired how our kids think, feel, and behave,” according to the class action master complaint.
Ahead of the trial’s start date, TikTok and Snap reached a settlement with the plaintiff in the first California state case. But both companies remain defendants in a series of similar lawsuits expected to go to trial this year.
Mark Lanier, lead trial lawyer for the plaintiff in the first case, said that he is open to settlements with Meta and Google, as well, but that his ultimate hope is that the trial will “produce transparency and accountability.”

“Transparency in that we would like for all of the records that are confidential [to] become public so that the public can see that these companies have been orchestrating an addiction crisis in our country and, actually, the world,” Lanier told reporters outside the courtroom Tuesday. “We also want accountability. We want these companies to be held accountable for the damage that they’ve done to individual people.”
The trials are kicking off with the case of a 20-year-old woman identified in court as K.G.M., who was a minor at the time of the incidents outlined in her lawsuit. Lanier believes his client’s case will be a “bellwether” for the hundreds of similar cases still pending in state court.
Meta CEO Mark Zuckerberg is expected to testify in February, according to Meta’s legal team, and the head of Instagram, Adam Mosseri, might take the stand as well. Snap CEO Evan Spiegel was also expected to testify but no longer will after a settlement was reached.
If the jury finds for the first plaintiff, the social media companies could be ordered to pay damages set by the jury and to change the designs of their platforms. The verdict could also set the tone for whether the tech giants choose to fight or settle the cases still to come.
The terms of TikTok’s and Snap’s settlements with K.G.M. were not disclosed. TikTok did not immediately respond to a request for comment.
“The Parties are pleased to have been able to resolve this matter in an amicable manner,” a Snap spokesperson wrote in an email.
The Tech Oversight Project, a nonprofit tech watchdog, published a report Sunday featuring unsealed court documents, including internal emails, messages and slide decks, that point to attempts by Meta, Google, Snap and TikTok to make their platforms more appealing to young people.

“This settlement should come as no surprise because that damning evidence is just the tip of the iceberg,” Sacha Haworth, executive director of the Tech Oversight Project, said in a statement. “This was only the first case — there are hundreds of parents and school districts in the social media addiction trials that start today, and sadly, new families every day who are speaking out and bringing Big Tech to court for its deliberately harmful products.”
Matt Bergman, founding director of the Social Media Victims Law Center — which is representing about 750 plaintiffs in the California state proceeding and about 500 in the federal proceeding — said he is eager to hear CEOs testify about “why their profits were more important than our kids’ lives.”
Nearly three years ago, the group filed the country’s first case alleging that social media companies had harmed a child, Bergman said. But the firm was told that such cases “could never progress to trial because of Section 230,” a 1996 amendment to the Communications Act of 1934 that shields internet companies from liability for content their users post.
“Due to the incredible dedication of these families and the hard work of many lawyers and judges, we are now able to go to trial despite Section 230,” Bergman said. “And companies, for the first time, are going to be held accountable for the clear and present danger their platforms have inflicted on young people.”
In an email statement, Meta, which owns Instagram and Facebook, highlighted its record of safety updates, including the introduction of Teen Accounts and tools for parental control. A spokesperson wrote that the company is “proud of the progress we’ve made, and we’re always working to do better.”
“We strongly disagree with these allegations and are confident the evidence will show our longstanding commitment to supporting young people,” the spokesperson wrote. “For over a decade, we’ve listened to parents, worked with experts and law enforcement, and conducted in-depth research to understand the issues that matter most.”
Google similarly rebutted the lawsuits’ claims about YouTube.
“Providing young people with a safer, healthier experience has always been core to our work. In collaboration with youth, mental health and parenting experts, we built services and policies to provide young people with age-appropriate experiences, and parents with robust controls,” Google spokesperson José Castañeda wrote in an email. “The allegations in these complaints are simply not true.”
The jury trials come two years after the Senate Judiciary Committee grilled top executives from Meta, TikTok, X, Snap and Discord about alleged shortcomings related to the safety of young people on their platforms.
The Tech Oversight Project described the cases as “the most significant social media accountability litigation to date.”
