Seattle’s public school district is suing Facebook, Instagram, TikTok, YouTube, Snapchat, and their parent companies. The lawsuit, filed on Friday in a U.S. District Court, alleges that these social media platforms have been a primary factor in a “youth mental health crisis,” and that the companies have knowingly exploited, manipulated, and targeted young people for profit at the expense of their mental health.
The district argues in its 91-page complaint that the tech giants have intentionally engineered addictive platforms, cashed in on the vulnerability of still-developing brains, and algorithmically suggested harmful content to young users.
Ultimately, the school district blames these social media companies for the rise in mental health and behavioral issues that teens are bringing into classrooms, which has made the task of educating them more difficult, according to the suit. District officials point to a 30% increase in self-reported feelings of sadness and hopelessness among the student body between 2010 and 2018, as well as a rise in student suicide plans and attempts over that period.
In an effort to manage those challenges, the school district says it has had to take expensive actions like hiring more mental health counselors, creating curricula on social media and mental health, adjusting and enforcing school policies around social media use, and increasing disciplinary resources. Even all of these changes, however, haven’t been enough.
“Plaintiff cannot keep up with the increased need for mental health services because of the youth mental health crisis,” the lawsuit claims. So the Seattle schools are seeking accountability from the social media companies and meaningful change in how they operate, along with damages and compensation.
In similar past cases, tech companies have used Section 230 of the Communications Decency Act as a legal shield. Under the law, digital publishers are not responsible for third-party content posted on their platforms (i.e., Meta is not liable for anything its users post on Instagram and Facebook). The Seattle case, however, aims to get around this fundamental protection by targeting the design of social media sites, not their content. The school district claims that the mounting incentives to spend more and more time scrolling, and the algorithms that dictate what users see, cause harm too, not just what’s in the posts.
“Defendants have maximized the time users—particularly youth—spend on their platforms by purposely designing, refining, and operating them to exploit the neurophysiology of the brain’s reward systems to keep users coming back, coming back frequently, and staying on the respective platforms for as long as possible,” says the complaint.
Some psychology research, along with both internal and external reports on social media company practices, seems to support many of the new lawsuit’s claims. Studies have shown, for instance, that social media use and increased smartphone use may be linked to sleep deprivation and accompanying depression. A 2022 Pew analysis found that more than half of teenagers surveyed would have a hard, or very hard, time giving up social media. Meta’s own internal research suggested that Instagram is toxic to some teen users, particularly girls, as it cultivates and amplifies body image issues. And Facebook has known for years that its algorithms boost time spent on its site to users’ detriment.
However, establishing a direct link between increased social media use and worsened mental health is very difficult, because so many variables affect mental health. And many experts dispute the use of the term “addiction” as applied to social media platforms altogether.
This isn’t the first attempt to sue social media companies over alleged mental health or youth harms in the U.S. Past suits, however, have mostly focused on individual cases. For instance, the mother of a 10-year-old who died in 2021 sued ByteDance over allegations that a TikTok challenge caused her child’s death. And in April, the mother of a Wisconsin 17-year-old who died by suicide sued Meta and Snapchat for “knowingly and purposely” creating harmful and addictive products. The FTC has also forced Epic Games to make Fortnite’s interface design less deceptive (and fined the company half a billion dollars).
California legislators even tried to pass a bill banning addictive social media features and explicitly making tech companies liable for resulting harms to children. The bill failed, but more than 30 states currently have some form of proposed or pending legislation aimed at regulating social media.
Gizmodo reached out to Meta (Instagram and Facebook’s parent company), Alphabet (Google and YouTube’s parent company), TikTok (owned by ByteDance Inc.), and Snapchat (owned by Snap Inc.) for comment.
“We want teens to be safe online,” wrote Meta’s head of global safety, Antigone Davis, in a statement emailed to Gizmodo. Davis’ statement cited tools the company has developed “to support teens and families,” like age verification, parental controls, and notifications encouraging breaks. It further read: “we don’t allow content that promotes suicide, self-harm or eating disorders, and of the content we remove or take action on, we identify over 99% of it before it’s reported to us.”
Past cases, though, like the death of 14-year-old Molly Russell in the U.K., have demonstrated that harmful content like self-harm promotion does slip through the cracks. In the lead-up to her suicide, Russell interacted with more than 2,000 Instagram posts relating to self-harm, suicide, and depression.
A Google spokesperson likewise responded by highlighting the efforts he said the company has taken to make its platforms safer for children and teens, such as screen time reminders and content blocks.
TikTok and Snapchat did not immediately respond to requests for comment.