Minors and parents suing Meta Platforms Inc.’s Facebook and other technology giants over children’s social media addictions won an important ruling advancing their collection of lawsuits in a California court.
A state judge on Oct. 13 threw out most of the claims but said she’ll allow the lawsuits to advance based on a claim that the companies were negligent: that they knew the design of their platforms would maximize minors’ use and prove harmful. The plaintiffs argue social media is designed to be addictive, causing depression, anxiety, self-harm, eating disorders and suicide.
More than 200 such suits filed across the country have been assigned to two judges in California—one in state court in Los Angeles and the other in federal court in Oakland. Judge Carolyn B. Kuhl’s ruling applies only to the cases in state court. Her decision is part of a larger battle in which statewide social media bans pit concerns about privacy and national security against personal freedoms and the use of wildly popular apps—especially among young users.
In the California case, lawyers representing minors cleared a legal hurdle that allows them to pursue a claim that Facebook, Instagram, Snap Inc., TikTok Inc. and Alphabet Inc.’s YouTube knew that the physical harms of social media were “foreseeable and substantial,” Kuhl wrote in the ruling. The judge pointed to the “obvious inequality” between “unsophisticated minors” and the internet companies “who exercised total control over how their platforms functioned.”
Internet companies have long relied on Section 230 of the Communications Decency Act, a federal statute that has consistently shielded them from liability over comments, ads, pictures and videos on their platforms. Importantly, Kuhl ruled that laws protecting free speech and Section 230 don’t stop the negligence claim in the collection of California cases from going forward.
Kuhl ruled the social media companies could be held liable for the allegations because they are “based on the fact that the design features of the platforms—and not the specific content viewed by plaintiffs—caused plaintiffs’ harms.”
“This decision is an important step forward for the thousands of families we represent whose children have been permanently afflicted with debilitating mental health issues thanks to these social media giants,” lawyers for the plaintiffs said in a statement. “We are determined to use every legal tool at our disposal to hold these companies accountable for their actions and reach a just resolution.”
Google defended its practices in a statement Friday.
“Protecting kids across our platforms has always been core to our work,” José Castañeda, a Google spokesperson, said. “In collaboration with child development specialists, we have built age-appropriate experiences for kids and families on YouTube, and provide parents with robust controls. The allegations in these complaints are simply not true.”
The other companies didn’t immediately respond to requests for comment on the ruling, but they too have defended their practices in the past. Antigone Davis, Meta’s global head of safety, responded to one of the lawsuits in March, saying the company wants teens to be safe online and offers more than 30 safety tools for kids and families, including supervision and age-verification technology.
The judge also tossed out seven other claims in the lawsuit, including an argument that the companies should be held liable for the defective design of their platforms. The concept of product liability was “created in a different era to solve different problems,” Kuhl wrote. Social media present “new challenges” under the law, she said, because they’re not tangible. “One cannot reach out and touch them,” she said.
Lawyers representing minors in the similar collection of lawsuits filed in federal court also face a motion by the companies to dismiss the litigation.
The case is Social Media Cases, 22STCV21355, Superior Court of the State of California, County of Los Angeles.