Lawsuit: YouTube, Snapchat, TikTok are 'defective products' that should be held liable for harm

Social media apps on a smartphone. (Dado Ruvic/Illustration/Reuters)

A federal lawsuit is taking a novel approach to challenging social media companies, arguing that such platforms are “defective products” and that the companies that own and operate them should be held liable for the harms they cause to their users.

Plaintiffs in over 100 cases, which were consolidated into one multidistrict lawsuit, allege that platforms such as YouTube, Snapchat, Facebook and TikTok are addictive by design and can result in self-destructive behavior in adolescents.

The lawsuit, filed in the Northern District of California, where many of the defendants are based, alleges that excessive use of these platforms, or “products,” is associated with depression, disordered eating and insomnia, and can lead to attempted or actual suicide or other forms of self-harm.

Social media companies, the plaintiffs allege, are aware of these “product defects,” and yet they continue to market them to adolescents.

Whereas previous lawsuits against social media companies and internet service providers have centered on how they curate and moderate their content, the question at the center of this case is whether the platforms and their algorithms can be considered products — and, if so, whether the companies that own and operate them can be held liable for product designs that are alleged to cause or contribute to harm.

A girl looking distressed sits in the dark, her face lit by her smartphone's screen. (Getty Images)

“The legal standard for product liability — if the court agrees that these social media platforms are products — is that these products are inherently dangerous,” said Jason Schultz, a clinical law professor at New York University. “Which means you’d have to really show that the use of these platforms is harmful not just for some users, but for most users.”

This requires the plaintiffs to present “very clear” scientific evidence that the use of social media platforms, and the interactions with their algorithms, is the cause of, and not just associated with, harm. “I have yet to see a definitive study that says in these circumstances, with these people, if you have this sort of algorithmic interaction, they will suffer these harms. But maybe that evidence is out there,” Schultz said.

Another key question is whether Section 230(c)(1) of the Communications Decency Act, which protects internet service providers from liability for publishing their users’ content, immunizes social media companies from the plaintiffs’ product liability claims.

“It will be incumbent upon the plaintiffs to show that social media platforms are what’s addictive,” said Robert Kozinets, a communications professor at the University of Southern California, “not cellphones and the content on them.”

The U.S. Supreme Court. (Celal Gunes/Anadolu Agency via Getty Images)

“From the evidence I’ve seen,” Schultz said, “there’s an argument to be made that it’s not just the algorithm itself that is causing harm, but the algorithm in conjunction with particular sorts of content.”

If the court accepts this argument, the plaintiffs are likely to run into problems with CDA Section 230, Schultz said, “as it allows social media companies to then say, ‘Look, we didn’t create the content.’”

A team of attorneys representing the plaintiffs is drafting a new consolidated complaint — one organized set of pleadings for future plaintiffs to sign on to, which is due on Feb. 14 — and has said it expects more plaintiffs to join.

It's not the only case taking aim at social media platforms. Earlier this month, the Seattle School District filed a lawsuit in the Western District of Washington against many of the same social media companies named in the multidistrict lawsuit, alleging that the companies’ “misconduct has been a substantial factor in causing a youth mental health crisis.”

“This mental health crisis is no accident. It is the result of [social media companies’] deliberate choices and affirmative actions to design and market their social media platforms to attract adolescents,” the suit asserts.

The recent lawsuits came after Frances Haugen, a former Facebook employee, leaked internal documents in 2021 indicating that executives at Meta (formerly Facebook) and related companies were aware of the products’ harm to users, but prioritized engagement and screen time for the sake of profit.

Antigone Davis, Facebook's head of global safety, addresses a roundtable discussion on cyber safety and technology at the White House in 2018. (Chip Somodevilla/Getty Images)

"They know that algorithmic-based rankings, or engagement-based rankings, keeps you on their sites longer,” Haugen said in her testimony to Congress after the leak. “You have longer sessions, you show up more often, and that makes them more money.”

Social media companies have largely refrained from commenting on lawsuits but have pushed back against assertions that they place profits over the well-being of their users. In their defense, they have pointed to their safety efforts, but also said the onus is on parents to ensure that adolescents are using their products in a healthy fashion.

Antigone Davis, global director of safety at Meta, said in a recent statement to Axios that the company wanted adolescents to be safe online. "We'll continue to work closely with experts, policy makers and parents on these important issues,” Davis said.

In a separate statement, Google spokesperson José Castañeda said the company has “invested heavily in creating safe experiences for children across our platforms and have introduced strong protections and dedicated features to prioritize their well-being.”

As the case continues, the social media companies named as defendants are likely to move to dismiss the consolidated complaint once it is filed. The judge in this case, however, will likely refrain from ruling on such a motion, Schultz said, until the Supreme Court releases its opinion in two cases involving Section 230(c)(1).
