States’ lawsuits pile up against social media companies
A multistate coalition filed suit against Facebook parent Meta, accusing it of damaging young people’s mental health and illegally harvesting their data. It is the latest in a long line of actions taken against social media platforms to reckon with the growing youth mental health crisis.
Lawsuits against social media companies are multiplying as state attorneys general and school districts look to hold the platforms to account, claiming they are addictive to young people, harm their mental health and fail to comply with federal laws.
Last week, a bipartisan coalition of 33 state attorneys general became the latest to sue. Tuesday’s filing in a California federal court targets Meta, the parent company of Facebook and Instagram. Another nine attorneys general brought separate lawsuits against Meta in their own states.
The filings by the 41 states and Washington, D.C., are the latest in a long line of actions taken against Meta and other social media giants, as elected officials reckon with the youth mental health crisis.
Earlier this month, Utah sued the parent company of TikTok, and litigation by more than 600 school districts nationwide is ongoing against the parent companies of Facebook, TikTok, Snapchat and YouTube. The suits all allege that the platforms cause mental and emotional harm to young people. U.S. Surgeon General Vivek Murthy said earlier this year that social media is an “important driver of that [mental health] crisis—one that we must urgently address.”
Legislation designed to counter the effects of social media also appears to be gathering steam in some states as lawmakers look to get the youth mental health crisis under control. The new proposals come on the heels of Montana’s TikTok ban earlier this year, as well as various restrictions on youth social media use in Utah. New York Attorney General Letitia James announced two bills this month that would ban online platforms from collecting and sharing minors’ personal data without consent and would limit social media platforms’ addictive features.
Lawsuits and legislation may not have easy paths, however. A federal judge called Indiana’s suit against TikTok “political posturing” in June. Meanwhile, a federal judge temporarily blocked Arkansas’ Social Media Safety Act, on the grounds that its age verification requirements placed an undue burden on users and undermined their free speech. That law is blocked while a broader lawsuit plays out.
The multistate lawsuit makes several accusations against Meta and its social media platforms: that its business model focuses on increasing young people’s engagement so it can make more money; that it “falsely represents” the safety of its platforms and claims they are not designed to encourage compulsive use by minors; that it misleads its users and the public about the amount of harmful content; and that the platforms cause young people “significant physical and mental damage,” something Meta is aware of.
The lawsuit also accuses Meta of not complying with the Children’s Online Privacy Protection Act, which imposes various requirements on platforms, including obtaining verifiable parental consent before collecting personal data from users under 13.
“Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem,” James said in a statement about the multistate lawsuit.
The complaint notes that the company has been concerned about falling youth engagement on its platforms in recent years. In order to draw young people in, it employs technologies designed to maximize their time spent using the sites. Those technologies and features include engagement-based, not chronological, feeds; infinite scroll; push notifications; temporary content that disappears after a limited time; and video-based content.
The complaint argues that these features induce “compulsive and extended use” among young people, and that “Likes” and other social comparison features negatively impact young people’s self-esteem.
Meta is accused of knowing the risks its platforms and their design features pose to young people’s mental health, but misrepresenting and downplaying those risks to the public. The complaint also says the company chooses not to crack down on underage use because doing so could hurt its bottom line.
Similar arguments about the effects of social media platforms on youth mental health are made in the state-level lawsuits, as well as in the litigation by school districts.
Attorney General Brian Schwalb of Washington, D.C., who filed one of the separate lawsuits, cited statistics from a 2021 youth risk behavior survey to make his case. The survey found that almost half of girls in the district self-reported episodes of psychological distress, while 28% of middle school students and 36% of high school students reported seriously considering suicide.
Meanwhile, over 70% of D.C. high school students reported spending more than three hours a day on screens, and over two-thirds of middle school students reported the same. Other states have found similar upticks in youth mental health issues.
“Children are particularly susceptible to addictive technologies, and Meta has exploited these vulnerabilities, putting its quest for advertising revenue over the psychological and emotional well-being of young people,” Schwalb said in a statement.
In their suit against TikTok, Utah Gov. Spencer Cox and Attorney General Sean Reyes allege that the video-sharing app illegally baits children into addictive and unhealthy use, misrepresents how safe its platform is and deceptively says it is independent of ByteDance, its Chinese parent company.
Like the Meta lawsuit, Utah’s suit against TikTok accuses it of designing addictive features to keep young people “endlessly scrolling,” and in doing so, making more money. Cox said in a statement that given the mental health impacts of TikTok and other social media platforms, “it’s time to intervene.”