Chasing Platforms Instead of Ambulances: Social Media Liability Trial Kicks Off in California

If the case succeeds, it will open the floodgates for attorneys to sue deep-pocketed companies regardless of whether they actually caused harm.


Proceedings have commenced in a trial poised to redefine the boundaries of free expression on the internet. The first-of-its-kind test case will determine whether social media companies can be held liable for kids’ emotional distress purportedly caused by the “addictive” design features of platforms such as Instagram and YouTube. As the bellwether for more than a thousand consolidated cases (JCCP 5255) in the Los Angeles Superior Court alone, it mirrors a larger wave of similar litigation filed nationwide in state and federal courts by families, school districts, and state attorneys general.

Both TikTok and Snapchat settled the case shortly before jury selection last week, leaving Meta and YouTube as the remaining defendants. But if the case succeeds, it will open the floodgates for plaintiffs’ attorneys to sue deep-pocketed social media companies regardless of whether they actually caused the alleged harms.

The case centers on a young woman, identified as K.G.M., who alleges she became hooked on social media platforms starting at age ten. Adopting the litigation playbook used against the tobacco and opioid industries, the plaintiff argues that these tech companies intentionally engineered their platforms to be addictive. She claims these design choices directly fueled her struggles with anxiety, depression, and body dysmorphia.

The timing of the trial is remarkably symbolic, as February 8 marks the 30th anniversary of Section 230 of the Communications Decency Act. Often called “the twenty-six words that created the internet,” Section 230 shields digital platforms from liability for the speech of their users. This trial is one of the law’s greatest tests.

The plaintiffs are attempting an end run around Section 230 by arguing that they aren’t suing over the third-party content hosted by social media platforms but over the companies’ “product design.” They claim features such as infinite scrolling, push notifications, and recommendation algorithms are defective products that caused K.G.M. emotional distress.

A glaring weakness in the case, and one that bears directly on how far Section 230 shields the tech companies, is causation. Did the design of the platforms cause K.G.M.’s anxiety and depression, or was it the content she consumed? If the answer is the latter, Section 230 bars the suit.

The litigation strategy of this suit, and of the others like it, is predicated on the fallacy of inferring causation from correlation. Plaintiffs’ lawyers will doubtless present a litany of bad science to paper over the difference between the platforms’ products causing these problems and merely existing alongside them. The litigation also depends on stretching the clinical label “addictive” to pathologize high-engagement features as inherent product defects.

Several studies have cast serious doubt on the causal relationship between social media use and mental distress. A 2022 umbrella review in Current Opinion in Psychology noted that 21 of 25 research summaries admitted their data captured only snapshots of a single moment in time, meaning that “causal conclusions are not warranted.” More recently, a December 2025 study published in the Journal of Public Health found that the evidence does “not support the widely held view that adolescent technology use is a major causal factor in their mental health difficulties.”

Perhaps most importantly, this lawsuit runs headfirst into the First Amendment’s guarantee of free speech. As the International Center for Law & Economics (ICLE) recently made clear in an amicus brief filed in a similar social media case in Massachusetts, features like infinite scrolling, push notifications, and recommendations are neither design defects nor mere conduct. Rather, they constitute expressive activity and, as the Supreme Court held in Moody v. NetChoice, protected editorial choices about how a platform conveys content. The ICLE brief emphasizes that “how a newspaper presents its content is as much part of its ‘editorial control and judgment’ as the content itself,” arguing that “it is much the same here” for social media platforms.

Read more at National Review