Missouri rule would require algorithm ‘transparency’ on social media
Attorney General Andrew Bailey unveiled the first-in-the-nation rule that would allow users to choose who moderates their content, in a bid to end what he called “monopoly control” over content moderation.
Amid a flurry of action in his first few hours in office, President Donald Trump signed an executive order that he said would restore freedom of speech and end federal government censorship online.
Conservatives have long griped that their views are censored on social media, and those complaints have borne legislative fruit: Florida and Texas both passed laws banning the censorship of conservatives online. Those laws went before the U.S. Supreme Court last year, only to be sent back to the lower courts for more study of their implications for the First Amendment.
Meanwhile, Trump’s order accused the Biden administration of “censoring Americans’ speech on online platforms…in a manner that advanced the Government’s preferred narrative about significant matters of public debate.”
Just days before Trump took office, another state looked to act against censorship and to protect free speech. Missouri Attorney General Andrew Bailey recently unveiled a first-in-the-nation regulation that would require social media users in the state to be offered algorithmic choice. The rule would require platforms to be transparent about the algorithms they use and offer users the option to select alternatives.
Bailey unveiled the rule under the Missouri Merchandising Practices Act, the state’s consumer protection law. It would declare that it is an “unfair, deceptive, fraudulent, or otherwise unlawful practice” for social media platforms to operate unless users can select a third-party content moderator they choose, rather than rely on the platform’s content moderation.
“Social media companies are supposed to provide a space where users can share views, content and ideas,” Bailey said. “Instead, Big Tech oligarchs have manipulated consumers’ social media feeds for their own purposes and exercised monopoly control over content moderation. To that end, I am invoking my authority under consumer protection law to ensure Missourians get to control the content they consume on social media. With this rule, Missouri becomes the first state in the nation to enshrine transparency and accountability for Big Tech into law at this scale. Big Tech companies who run afoul of this regulation will be held accountable.”
Bailey outlined a series of criteria that platforms must meet to satisfy his requirement for algorithmic choice, ahead of the next phase, in which he will collect public comment and hold public forums to “collect additional evidence about the deceptive practices of the social media companies.”
Under the proposed rule, users must be provided with a choice screen upon activating their account and then at least every six months to choose a content moderator; no selection would be chosen by default; the screen where they choose must not favor the social media platform’s content moderator over those of third parties; and third-party content moderators must be permitted interoperable access to data on the social media platform so they can properly moderate.
Social media companies also would be prevented from moderating, censoring or suppressing content on their platforms if a user would be able to view that content with their chosen content moderator. Bailey called the rule “the first prong of a comprehensive offensive to protect free speech in 2025.”
This rule is the first of its kind in the United States and represents the latest attempt by state lawmakers to regulate social media. That has taken many forms so far, including the efforts in Florida and Texas to prevent what they see as conservative censorship. In addition, several other states have looked to restrict the platforms’ use among minors with legislation that at times has struggled to stand up to legal scrutiny.
And experts appear skeptical that the rule can stand up to legal challenge, or even function as intended. A group of attorneys with the law firm WilmerHale wrote that it raises several issues, including around data privacy, given the requirement that a third party be able to moderate content and access someone’s account.
The group also questioned Bailey’s assertion that the regulation will follow the “roadmap” set out by the Supreme Court in deciding the Florida and Texas cases, as he does not explain that roadmap or how the state will navigate the rule’s implications for the First Amendment. And they said it is unclear how many third-party content moderators will be available to choose from, or whether companies will invest the time and resources to be recognized as such.
“While AG Bailey’s stated goal is to ensure transparency around the algorithms social media companies use, this proposed regulation appears to go further than mere transparency,” the WilmerHale attorneys wrote.
Bailey said transparency is key for this new regulation, as well as the ability for users to have power over what they see online.
“Americans should control the content they consume on social media, not Big Tech oligarchs,” Bailey said in a post to X, formerly Twitter. “Let the best algorithm win."