Did Supreme Court ruling come too late to reverse ‘chilling effect’ on 2024 disinformation?
While the justices found in Murthy v. Missouri that two states lacked standing to sue over alleged government censorship, observers warned that the case may have already damaged efforts to crack down ahead of November.
The Supreme Court’s recent ruling, which left governments free to ask social media platforms to remove misinformation, may give state and local officials heart ahead of November’s elections. But some observers say it might be too late.
Justices ruled 6-3 last week in Murthy v. Missouri that the states of Louisiana and Missouri and the five individual social media users named in the case did not have the standing to sue. The plaintiffs alleged that the Biden administration pressured Facebook to censor their speech by deleting posts complaining about mask and vaccine requirements during the COVID-19 pandemic.
Experts warned after the decision, though, that the case’s existence created a “chilling effect” on collaboration between governments and social media companies from the get-go. Some inflammatory hearings in a House subcommittee have further discouraged any teamwork. The effect could be difficult to overcome ahead of an election season already dominated by worries about misinformation, artificial intelligence and overwhelmed election workers faced with increased physical and cyber threats.
“A combination of this case and the full attention that has been brought to bear by the House weaponization committee hearings on this topic has really deterred both social media platforms and governmental officials from wanting to further engage in those collaborations,” said Matt Seligman, a fellow at the Stanford Constitutional Law Center, during a Tech Justice Law Project briefing after the decision.
A major source of collaboration on removing election mis- and disinformation has traditionally been between social media companies and two federal agencies—the Cybersecurity and Infrastructure Security Agency, or CISA, and the Federal Bureau of Investigation—which flag content to the platforms that they believe should be taken down.
Those efforts also include free resources and advice for state and local election officials on the best ways to dispel common disinformation narratives and how to communicate effectively with their residents in the face of efforts to undermine them.
But experts said that relationship has cooled in recent years, amid accusations of bias and censorship by the government. That allegation was central in this case. Writing in dissent, Justice Samuel Alito accused government officials of “coercive” censorship that was “blatantly unconstitutional.”
Those who helped bring the case maintained that government agencies working with social media platforms to remove dis- and misinformation amounts to censorship. In a post to X, formerly Twitter, Louisiana Attorney General Liz Murrill said the 6-3 majority “waves off the worst government coercion scheme in history.”
Seligman said that kind of rhetoric may have convinced platforms to back off.
“I think the big open question going forward as we head into the very heat of the 2024 election cycle is to what extent governmental officials at the federal, local and state levels, and the social media platforms are willing and able to restore and restart those collaborative efforts,” he said. “Even if the governmental officials feel like, ‘OK, we're in the clear now,’ the social media platforms have been deterred from reaching out to governmental officials for help they really could use.”
Social media companies and CISA, meanwhile, expressed optimism. Jen Easterly, director of CISA, said in an emailed statement she was “pleased” with the Supreme Court’s decision.
“As we have made clear from the beginning, CISA does not and has never censored speech,” Easterly said. “Every day, the men and women of CISA execute the agency’s mission of reducing risk to U.S. critical infrastructure in a way that protects Americans’ freedom of speech, civil rights, civil liberties, and privacy.”
Social media companies say they are getting tooled up. Facebook parent Meta said it has invested millions of dollars in “teams and technology” to protect election integrity and has around 40,000 people working on “safety and security,” although some remain skeptical they will be able to deal with the expected flood of disinformation and foreign influence operations.
Meanwhile, Elon Musk’s ownership of X has others worried about its potential impact on November’s vote, especially as a recent survey from the Pew Research Center found that 59% of its U.S. users say a reason they use it is to keep up with politics or political issues.
"The risk that government officials will improperly pressure platforms to censor speech is real,” Samir Jain, vice president at the Center for Democracy and Technology, said in a statement. “At the same time, government communications can provide useful information that lets platforms make more informed moderation decisions.”
While the case had generated a lot of heat and noise, especially after District Judge Terry Doughty ruled last July that the federal government had “apparently engaged in a massive effort to suppress disfavored conservative speech,” observers said the Supreme Court’s decision showed there is still plenty of scope for platforms to work with governments and the research community to root out dis- and misinformation.
“[This] ruling hopefully serves as a deterrent for future cases that are weak on facts and heavy on conjecture that are intended to silence rigorous study of how misinformation and disinformation moves across the internet,” Brandi Geurkink, executive director at the nonprofit Coalition for Independent Technology Research, which works to maintain the right to study the impact of tech on society, said during the Tech Justice Law Project briefing. “Today's decision, in my view, sends this really clear message that the justice system in America requires proof and not just political posturing.”