Digital literacy, not bans, should shape states' approach to social media
Protecting children from harmful content is important, but states should help young people understand the platforms’ risks and make informed decisions about what to view, experts say.
States should focus on promoting digital literacy among young people and helping them make informed decisions about staying safe on social media, rather than imposing their own restrictions, according to panelists at a Route Fifty event.
In the absence of federal regulation of social media, several states have implemented their own regulations, primarily with the aim of protecting children from the worst content found on various platforms. Utah was the first state in the nation to require social media companies to verify users’ ages and to impose a curfew on minors’ use of the platforms.
Meanwhile, Louisiana is among those to restrict access to websites with adult content, while Montana turned heads with the first statewide ban on TikTok.
“State policymakers are passing legislation on online safety, frankly, because Congress isn't taking action,” said John Perrino, a policy analyst at the Stanford Internet Observatory, during a panel discussion at the GovExec State and Local Government Tech Summit. “Politicians look around their neighborhood, their community, and they understand the concern among parents—many are parents or grandparents themselves.”
While it may be tempting for state lawmakers to demonstrate that they are protecting young people with these bills, Perrino said they must proceed with caution.
Rather than banning young people from accessing content outright or requiring users to verify their ages, experts said states should focus on giving teenagers and older children tools to help them stay safe, such as the ability to block unwanted messages, harassment and other harmful content.
“Solutions that will probably work best don't take things away from children, they give them tools to stay protected,” Perrino said.
Andy Green, an assistant professor of information security and assurance at Kennesaw State University, said that while there are bad actors online and minors must be careful, elected officials also should not tell children that criminals are “lurking under your bed.”
“We can either teach kids how to deal with it and put some guardrails in place, or we can take the other approach, which is to put your head in the sand and try and block the bad world from getting to you,” Green said. Both he and Perrino noted that tech-savvy young people will always find ways around restrictions. Many likely already use virtual private networks, which route a user’s traffic through a remote server so that websites see the server’s IP address instead of the user’s, letting users mask their physical locations and skirt state-based restrictions.
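The mechanism is straightforward to demonstrate. In the minimal sketch below (which assumes the public ipify echo service at api.ipify.org, a real service that simply returns the caller’s IP address), a website can only geolocate the address a request arrives from; with a VPN active, that address belongs to the VPN server, not the user.

```python
import urllib.request

def apparent_ip() -> str:
    """Ask a public echo service which IP address our requests appear to come from."""
    # ipify returns the connecting address as plain text.
    with urllib.request.urlopen("https://api.ipify.org") as resp:
        return resp.read().decode("utf-8")

# Without a VPN, this prints the user's real public IP, which geolocation
# databases map to a rough physical location. With a VPN active, it prints
# the VPN server's address instead -- and that server's location is all a
# state-based restriction check can ever see.
print(apparent_ip())
```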
Requiring social media platforms to verify the ages of their users has been popular with state governments. Advocates say it helps protect minors from the most harmful content on the platforms and can help ease the teen mental health crisis that U.S. Surgeon General Vivek Murthy blamed on social media use.
But while states have said they will rely on age verification technology to enforce those requirements, the solution is still riddled with limitations and privacy concerns.
“My concern from a technical perspective is the lack of enforceability of these bans that state legislators are putting forward,” said Green. “[It] shows a real lack of awareness of how these things actually work.”
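The enforceability gap is easiest to see in the weakest form of age verification: self-attestation, which trusts whatever birth date a user enters. Anything stronger means collecting IDs or biometric data, which is where the privacy concerns arise. The sketch below is purely illustrative; the function name and threshold are assumptions, not any platform’s actual implementation.

```python
from datetime import date

MINIMUM_AGE = 18  # illustrative threshold; requirements vary by state

def is_old_enough(birth_date: date, today: date | None = None) -> bool:
    """Self-attested age check: trusts whatever birth date the user supplies."""
    today = today or date.today()
    # Subtract one year if this year's birthday hasn't happened yet.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return age >= MINIMUM_AGE

# Nothing stops a minor from typing in date(1990, 1, 1); the check passes.
print(is_old_enough(date(1990, 1, 1)))  # True
```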
Also gaining popularity in statehouses are so-called “age-appropriate design” laws. First launched in the United Kingdom, the standards for online services protect children’s data and include limiting data collection, checking users’ ages, switching off geolocation services and providing privacy by default.
California initiated a similar effort with its “Age-Appropriate Design Code Act.” Although it is facing legal challenges, that law requires web services likely to be accessed by children to carry out a risk assessment for users under 18. It also restricts the use of dark patterns, which are generally defined as manipulative design features that could trick users into giving away more data.
If social media companies are required to assess the potential harm resulting from their features and recommendation engines and “implement mitigation strategies to prevent some of those harms, I think that’s positive,” Perrino said. “It just needs to be more clearly defined.”
Montana’s TikTok ban was the most extreme example of state-level action against a social media platform, but both Perrino and Green reiterated their views that the law is unenforceable and unconstitutional. Green said he expects to see a “complete reversal” of the law, especially as it is already the subject of multiple lawsuits.
In addition to the TikTok bill, Perrino noted that Greg Gianforte, Montana’s Republican governor, directed state agencies to ban “any application that provides personal information or data to foreign adversaries from the state network,” targeting apps like Chinese-owned WeChat and Telegram, which has connections to Russia.
Perrino said actions that apply to state computer systems do far more to protect Montanans’ data and privacy than bans on personal app use. Plus, they have the added benefit of being enforceable.
If legislators are to produce better bills that can withstand court challenges and have the outcomes they desire, Green said there must be a stronger relationship between lawmakers and researchers who can advise them. That relationship must go both ways, he said: Elected officials must be more inclined to listen, while academics must not purport to be the only ones with expertise.
When it comes to security, experts tend to treat end users as if they “don't know anything about technology,” Green said. “And I think that's a horribly condescending way to talk to people.”