The Facebook logo edited with the colors of the Democratic and Republican parties

Facebook will ban new political advertising in the week before the upcoming U.S. presidential election, the company said Thursday. The announcement marks the biggest change at the social media platform in the lead-up to the election, but experts say the move might not go far enough to prevent abuse on the service.

The ban aims to curb the spread of misinformation that is already proliferating on social media platforms as election day approaches. Misleading political messages can lead to voter suppression and potentially to violence, say both Facebook executives and outside experts. 

“The U.S. elections are just two months away, and with Covid-19 affecting communities across the country, I’m concerned about the challenges people could face when voting,” Facebook CEO Mark Zuckerberg said in a blog post. “I’m also worried that with our nation so divided and election results potentially taking days or even weeks to be finalized, there could be an increased risk of civil unrest across the country.”

Despite those concerns, Facebook is not banning political ads entirely, and such ads won’t be subject to fact-checking, unless they seek to mislead people about how or where to vote, intimidate them into staying away from polling places, or suggest that they vote illegally. It’s also against Facebook policy for an ad to claim people will get COVID-19 if they take part in voting.

The presidential campaigns and other advertisers can continue to promote any ad posted by Oct. 27 as long as it has been seen by at least one Facebook user, according to a company spokesperson.

Still, banning new political advertising marks a significant about-face from Zuckerberg, who previously argued that the company has no place regulating political speech.

“It’s marginally helpful, though seven days is not enough,” says Justin Brookman, director of privacy and technology policy at Consumer Reports. “A lot of people will have already voted by then, especially if people are voting early by mail because they’re scared their ballots won’t count otherwise. There’s a strong argument Facebook should ban political ads in general, but if they’re going to do a moratorium, a month—and a restriction on how ads are targeted at specific users—would have been more useful.”

The political ad ban adds to a number of new and previously enacted Facebook policies that try to limit some kinds of misinformation. The company says it will add a label with a link to vetted information on any post that claims victory for a candidate before official results are in, or casts doubts on the election’s outcome, including posts from the campaigns or the candidates themselves.

Starting this week, users will also see a link to the company’s Voting Information Center, a hub for information about the election and how to register to vote. It will appear at the top of their Facebook and Instagram feeds.

Too Little, Too Late?

Some experts say these changes won’t make a meaningful difference in protecting the upcoming election.

“It is a pretty toothless attempt,” says Melissa Ryan, CEO of Card Strategies, a consulting firm that specializes in combating online disinformation. Preventing advertisers from refreshing the content of their ads while letting them continue to run anything that has already been posted isn’t likely to make a difference, she says. Under the policy, Ryan notes, advertisers could show a large number of ads to just a few Facebook users each before Oct. 27, then widely roll out whichever of those ads they want right up to the election.

More important, according to Ryan, is that the company isn’t making any changes to its policies about misinformation.

A Consumer Reports analysis of social media misinformation policies found that advertisers face the strictest rules—but Facebook’s official policy is to avoid fact-checking political ads.

“There is no change to that policy,” a Facebook spokesperson tells CR. “However, voter suppression and COVID misinformation that violate our community standards is removed and rejected in ads regardless of who runs it.”

As CR found earlier this year, it can be easy to get even dangerous health ads approved by the company’s advertising platform.

Ads aren’t the only source of misinformation on Facebook, or the most important one. During the 2016 election season, experts say, disinformation posted by fake Russian accounts was seen by 126 million Facebook users. That appears to have been far greater than the number of users who saw ads purchased by organizations associated with a Russian intelligence agency.

Another problem identified by CR’s Brookman and others with the 2016 election was the use of microtargeting, in which small groups of voters, grouped by characteristics such as race and voting district, saw misleading ads that weren’t available to most Facebook users. That meant that journalists and others were unable to report on their accuracy or provide missing context. Facebook’s new rule doesn’t address that issue.

Facebook continues to let users, as opposed to advertisers, post misinformation about political and social issues as long as they don’t provide false information about the process of participating in voting or the U.S. Census. Posts by the candidates and their supporters are largely unregulated on the platform.

“TV stations will fact-check ads and not run things that are untrue. I maintain that they as a platform are not responsible enough to be running ads and have this large role in our democracy,” Ryan says.

Facebook’s position differs from that of some of its competitors. Most notable is Twitter, which banned political ads altogether late last year, arguing that the risks posed by political interests using social media to hypertarget misleading information at susceptible voters outweigh the benefits of letting people post whatever they want online.

How to Protect Yourself From Misinformation

For those worried about their exposure to misinformation and other toxic content on social media, there are steps you can take to protect yourself—including a setting that consumers can use right now to limit political ads on their Facebook account altogether. Check out Consumer Reports’ guide to fine-tuning what appears on your social media feed for detailed instructions on advertising settings and other tools.