Seattle public schools are suing the tech giants behind TikTok, Facebook, Instagram, YouTube and Snapchat, accusing them of creating a “mental health crisis among America’s youth.” The 91-page lawsuit filed in a US District Court claims tech giants exploit the addictive nature of social media, leading to an increase in anxiety, depression and thoughts of self-harm.
“The growth of defendants is a product of the choices they’ve made to design and operate their platforms in ways that leverage their users’ psychology and neurophysiology to spend more and more time on their platforms,” the complaint said. “[They] successfully exploited the vulnerable brains of young people, hooking tens of millions of students across the country in positive feedback loops of overuse and abuse of defendants’ social media platforms.”
The harmful content sent to users includes extreme diet plans, encouragement of self-harm and more, according to the complaint. This allegedly led to a 30 percent increase between 2009 and 2019 in students reporting feeling “so sad or hopeless… for two weeks or more in a row that [they] stopped doing some usual activities.”
The complaint argues that the defendants’ misconduct was a substantial factor in causing a youth mental health crisis, marked by increasingly high proportions of youth struggling with anxiety, depression, thoughts of self-harm and suicidal ideation. Rates of mental health problems among children have risen steadily since 2010, and in 2018, suicide became the second leading cause of death for young people.
This in turn leads to a decline in academic performance, making students “less likely to attend school, more likely to use substances and more likely to act out, all of which directly impact the ability of Seattle Public Schools to fulfill its educational mission.”
Section 230 of the US Communications Decency Act shields online platforms from liability for content posted by third parties. However, the lawsuit argues the provision does not protect social media companies from liability for recommending, distributing and promoting content “in a way that causes harm.”
“We have invested heavily in creating safe experiences for children across our platforms and have introduced strong protections and dedicated features to prioritize their well-being,” a Google spokesperson told Axios. “For example, through Family Link, we offer parents the ability to set reminders, limit screen time, and block specific types of content on supervised devices.”
“We’ve developed more than 30 tools to support teens and families, including supervisory tools that help parents limit the amount of time their teens spend on Instagram and age-verification technology that helps teens have age-appropriate experiences,” Meta’s global head of safety, Antigone Davis, said in a statement. “We will continue to work closely with experts, policymakers and parents on these important issues.” TikTok has yet to respond, but Engadget has reached out to the company.
Critics and pundits have recently accused social media companies of exploiting teenagers and children. Meta whistleblower Frances Haugen, for example, testified to Congress that “Facebook products harm children.” Eating disorder expert Bryn Austin wrote in a 2021 Harvard paper that social media content can send teens into a “dangerous spiral.” And the issue has caught the attention of lawmakers, who proposed the Kids Online Safety Act (KOSA) last year.