States Try to Stop Political Deepfake Videos
State lawmakers are increasingly focused on deceptively edited videos, a fast-spreading technology that advocates say has the potential to disrupt elections. But are bans constitutional?
Deepfakes—video clips convincingly altered to make it look like people said or did something they didn’t—have been called the newest threat to democracy, a dangerous tool that can be used to sway voters in the weeks leading up to an election. Legislators in several states began taking action against the problem this year, including in Texas, which became the first state to criminalize the production and distribution of political deepfake videos under a new law that took effect this month.
The law amends the state’s election code to criminalize deepfake videos that are created “with intent to injure a candidate or influence the result of an election” and are “published and distributed within 30 days of an election.” Doing so is a Class A misdemeanor, punishable by up to a year in county jail and a fine of up to $4,000.
From a constitutional perspective, it’s unclear whether deepfake technology can be banned outright, as doing so may infringe on makers’ First Amendment rights. But because the legislation specifically defines a deepfake in terms of election interference, the Texas law may stand up to legal challenges, according to an analysis from the Texas Senate Research Center.
“This technology likely cannot be constitutionally banned altogether, but it can be narrowly limited to avoid what may be its greatest potential threat: the electoral process,” the analysis says.
But enforcement could be difficult. The Texas law criminalizes the distribution of videos as well as their creation, meaning someone living in the state who disseminates a video made elsewhere could still be criminally liable. Yet tracing a viral video back to its source can take considerable time and resources, with no guarantee of success.
“It’s a fair question how many people are going to be prosecuted,” said Matthew F. Ferraro, a senior associate at law firm WilmerHale who specializes in deepfakes and disinformation. “With this law, legislators are likely trying to make a statement that the behavior is wrong, then basically doing their best to prevent it from happening.”
Legislators in California passed a similar measure last week, banning both residents and entities from sharing deepfake videos of political candidates within 60 days of an election. Radio and television stations could still share those videos, provided they acknowledge “that there are questions about the authenticity” of the material. News outlets would also not be held liable for broadcasting paid content (namely, political ads) that is deceptively edited.
Despite those exemptions, media organizations took issue with the legislation. The California Cable and Telecommunications Association protested the measure, as did the California News Publishers Association, which said the bill threatens free speech while failing to grant new rights to victims, who already are able to sue for defamation and libel.
The bill “fails to make any provision for speech protected by the First Amendment,” the association wrote on its website. “Though the bill creates limited exceptions from liability where a disclosure is provided identifying the image or recording as being manipulated, those exceptions are almost certainly insufficient to ensure that constitutionally protected speech is not punished.”
The bill was vetted by legal experts, including Erwin Chemerinsky, dean of the law school at the University of California, Berkeley, who said that the measure would regulate speech but not infringe on the First Amendment.
“False speech, at times, is protected, but often the government is allowed to prohibit it without running afoul of the Constitution,” he wrote in the Sacramento Bee, noting laws against perjury and false advertising.
“Most importantly, the court has said that speech which is defamatory of public officials and public figures has no First Amendment protection if the speaker knows the statements are false or acts with reckless disregard of the truth,” he continued. “The Court has explained that the importance of preventing wrongful harm to reputation and of protecting the marketplace of ideas justifies the liability for the false speech.”
The bill awaits Gov. Gavin Newsom’s signature. If it becomes law, California would become the third state to criminalize deepfake videos and the second, after Texas, to do so specifically in connection with election integrity. In July, Virginia became the first state to target deepfake videos, amending an existing ban on non-consensual pornography to include the technology, defined in the legislation as “a falsely created videographic or still image.” Violators can receive up to a year in prison and a fine of up to $2,500.
Deepfake technology is not new, Ferraro noted, and has been used in movies for decades (notably, in Forrest Gump, which contained archival news footage of several presidents doctored to include actor Tom Hanks in character). But the malicious use of it, and the ability for a single video clip to instantly go viral, has put the issue front and center for lawmakers.
“I actually think they’ve moved pretty quickly,” he said. “It’s not that the ability to do this was foreign—it’s the ability for many more people to do it quickly, easily and with very little source video on that scale that is, I think, the game-changer.”
It’s likely that legislators will continue to address the technology, not by attempting to regulate the industry but instead by policing its usage, he said.
“Growing public alarm over the negative impacts of manipulated media has already resulted in increasingly aggressive legislative action to try to protect the public and the political process from these new harms,” he said. “To that end, I think legislators are likely to write bills that address not deepfake technology per se but the particular conduct that such technology can be used for—like creating pornography of a non-consenting person or disrupting elections.”
Kate Elizabeth Queram is a Staff Correspondent for Route Fifty and is based in Washington, D.C.