A federal court ruled on Tuesday that Meta, ByteDance, Alphabet, and Snap must face lawsuits alleging their social platforms have adverse mental health effects on children. US District Judge Yvonne Gonzalez Rogers rejected the social media giants’ motion to dismiss dozens of lawsuits accusing the companies of running platforms that are “addictive” to kids. School districts across the US have sued the four companies, alleging their platforms cause physical and emotional harm to children. Meanwhile, 42 states sued Meta last month over claims Facebook and Instagram “profoundly altered the psychological and social realities of a generation of young Americans.” This order addresses the individual suits, covering “over 140 actions” brought against the companies.
Tuesday’s ruling states that the First Amendment and Section 230, which says online platforms shouldn’t be treated as the publishers of third-party content, don’t shield Facebook, Instagram, YouTube, TikTok, and Snapchat from all liability in this case. Judge Gonzalez Rogers notes that many of the claims laid out by the plaintiffs don’t “constitute free speech or expression,” as they have to do with alleged “defects” on the platforms themselves. Those include insufficient parental controls, the lack of “robust” age verification systems, and a difficult account deletion process.
“Addressing these defects would not require that defendants change how or what speech they disseminate,” Judge Gonzalez Rogers writes. “For example, parental notifications could plausibly empower parents to limit their children’s access to the platform or discuss platform use with them.”
However, Judge Gonzalez Rogers still threw out some of the other “defects” identified by the plaintiffs because they’re protected under Section 230, such as not offering a beginning and end to a feed, recommending children’s accounts to adults, the use of “addictive” algorithms, and not putting limits on the amount of time spent on the platforms.
“Today’s decision is a significant victory for the families that have been harmed by the dangers of social media,” the lead lawyers representing the plaintiffs, Lexi Hazam, Previn Warren, and Chris Seeger, say in a joint statement. “The Court’s ruling repudiates Big Tech’s overbroad and incorrect claim that Section 230 or the First Amendment should grant them blanket immunity for the harm they cause to their users.”
Google spokesperson José Castañeda says the allegations in these complaints are “simply not true,” adding that the company has “built age-appropriate experiences for kids and families on YouTube, and provide parents with robust controls.” Snap declined to comment, while Meta and ByteDance didn’t immediately respond to The Verge’s request for comment.
Numerous lawsuits have argued that online platforms include “defective” features that hurt users, but these claims — including a high-profile suit over harassment on Grindr — have often been thrown out in court. As more studies show evidence of the potential harm social platforms may be causing children, lawmakers have pushed to pass new laws specifically targeting child safety, including requirements for age verification. This ruling doesn’t determine that social platforms are causing harm or hold them legally liable for it, but it could still pave the way for a slew of safety claims even without new laws — and make the legal defense against them harder.
The ruling underscores growing concern over the impact of social media platforms on children’s mental health. As the cases move forward, the question of how to protect young users while accounting for the complexities of online platforms will remain at the center of both the litigation and the broader regulatory debate.