  • Media literacy tips promoted by Facebook were found to help thousands of U.S. study participants distinguish between truthful and false headlines.
  • Facebook has declined to reveal how many users read the tips and whether they affected the spread of misinformation.
  • The results suggest that simple instructions can help people be savvier news consumers.

Back in 2017, Facebook users in 14 countries found a colorful banner at the top of their news feeds with a link to 10 “tips to spot false news.”

It came soon after a Russian social media propaganda effort had targeted the U.S. presidential campaign, and while rampant misinformation was helping incite a genocide in Myanmar.

The list had basic media literacy tips, including gimmes like checking a story's claims against several widely known news sources. For instance, if a story reports a major scandal that no established outlet—say, ABC News or The Wall Street Journal—has mentioned, it’s less likely to be true.

After a few days, the banners quietly disappeared, though you can still find the list of tips on Facebook with a Google search. And in the years since, misinformation experts have largely dismissed the potential of that kind of simple media literacy lesson to help protect consumers against everything from fake coronavirus or cancer cures to political conspiracy theories.

Along with social media platforms, experts have focused instead on more active precautions: removing accounts associated with propaganda operations, flagging false information with warning labels or just deleting them, and boosting more reliable information in people’s news feeds.


But three years later, scientists tested the effectiveness of Facebook’s media literacy tips, and to their surprise, the lessons appear to work.

In a study published Monday, a group of academic researchers showed the recommendations to half of nearly 5,000 American research participants. The researchers found that the participants were significantly better at separating truthful information from false news than the other half, who hadn’t read the tips. The study also included a large group in India, with similar results among highly educated people and less success with other participants. 

“My expectation was that we wouldn’t find large effects—if any,” says Andy Guess, a Princeton professor who studies social media and civil society and was a researcher behind the new study. But that’s not what happened.

The American participants who read the list and answered some questions about it were 26.5 percent better at distinguishing truthful news headlines from false ones than those who hadn’t been given the list, the study found.

The false headlines included claims favoring Republicans and Democrats. A left-leaning headline said “VP Mike Pence Busted Stealing Campaign Funds to Pay His Mortgage Like a Thief.” A Republican-leaning headline read “Lisa Page Squeals: DNC Server Was Not Hacked by Russia.”

Several weeks later, most of the participants were asked to rate a larger group of headlines. Accuracy dropped somewhat among the U.S. cohort, but it still did better than the group that hadn’t read the tips.

“It’s heartening to see evidence that what feels like a really simple media literacy intervention had some ability to improve news-quality discernment,” says Claire Leibowicz, a misinformation expert at the Partnership on AI, a nonprofit that convenes academics, nonprofits and companies to study technology issues.

Leibowicz wasn't involved in the study, which was published in the academic journal Proceedings of the National Academy of Sciences.

The tips could be helpful in judging any kind of information, whether political in nature or not. For instance, one tip is to look closely at the web address in a link. Sometimes knockoff news sites try to imitate a reputable source by changing a small detail in the URL—something like “consumersreports.org” instead of the authentic “consumerreports.org.”
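For readers comfortable with a little scripting, that hostname check can be sketched in a few lines of Python. The function name and example addresses below are illustrative, not from Facebook's list:

```python
# A minimal sketch of the URL-checking tip: extract the hostname from a
# link and compare it against the domain you expect. The addresses here
# are illustrative examples, not real spoof sites.
from urllib.parse import urlparse

def hostname_matches(link: str, expected_domain: str) -> bool:
    """Return True only if the link's hostname exactly matches the expected domain."""
    host = urlparse(link).hostname or ""
    return host.lower() == expected_domain.lower()

print(hostname_matches("https://www.consumerreports.org/article", "www.consumerreports.org"))  # True
print(hostname_matches("https://consumersreports.org/article", "www.consumerreports.org"))     # False
```

Note that an exact comparison is what catches the one-letter spoof; a looser check like "does the link contain the brand name" would pass both addresses.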

While these results are encouraging to experts, there’s plenty the new study leaves unanswered. For one, participants were asked to sit down and read the 10 media literacy tips. But in real life, when Facebook slapped a link to the list on top of people’s news feeds, researchers don’t know how many read them or scrolled right by without clicking.

The study also found something puzzling: Even participants who became more skeptical of false news didn’t say they were less likely to share it. And the researchers don't know whether the tips did more to change user behavior when they were disseminated on Facebook in 2017.

Facebook has those answers. The company should know how many people clicked on the media literacy list, how long they spent on that page, whether they later changed their reading or sharing habits, and how long any effects lasted. But the company didn’t share that data after it ran its banner links, even with First Draft, the organization that helped design the recommendations.

“These scholars did an amazing job of looking at the scale of the intervention with the tools they had available, but I'm just so disappointed that there isn’t a way for an independent audit of what happened on the platform,” says Claire Wardle, a co-founder of First Draft.

Facebook declined to answer CR’s questions about its 2017 intervention or any conclusions it drew from users’ responses to it.

Flagging Misleading Content

In recent years, Facebook and other social media platforms have launched new initiatives to label or remove misleading content, along with some hate speech and incitements to commit violence. Twitter and Facebook now flag some misleading posts, videos, and images, and sometimes include links to rumor-busting reporting on the topic.

Last Friday, for example, Twitter labeled a satirical video that President Trump tweeted as “manipulated media.” In the video, an innocent scene of toddlers playing was doctored to make it look like a politically charged news report on racism from CNN. (Later, Twitter and Facebook both disabled the video in response to a copyright claim.)

But making those choices can draw social media platforms into debates over their own potential political biases.

More critically, experts say, the sheer scale of posts, links, videos, and images uploaded every minute means a company will never be able to root out all the misleading and harmful ones on its platform, even with the help of automated moderation systems.

That’s why equipping users with the know-how to sort the good from the bad could be helpful, some experts say. But others argue the platforms should do more to help consumers put such fake-news-spotting tips to use.

“We should expect individuals to exercise judgment about what they view, how they assess it and how they share—that’s a fair expectation to place on people,” says Sam Gregory, program director at Witness, a nonprofit organization that studies how photo and video evidence can create social change. “That said, platforms have the ability to facilitate that.”

For instance, one of Facebook’s 10 tips was to look into the origin of photos that accompany suspicious posts or news stories. “Sometimes the photo may be authentic, but taken out of context. You can search for the photo or image to verify where it came from.” Gregory says that social media companies should make that easier—the platform could show, for example, the date when the image was first posted, right in the news feed.

(You can do this on your own by copying the link to an image and pasting it in Google’s “Search by Image” function. This trick is called a reverse image search, and it’s a critical tool for internet detectives.)
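That trick can also be scripted. The short Python sketch below builds such a search link; the `searchbyimage` endpoint is Google's long-standing reverse-image-search URL, which the service may now redirect to Google Lens, and the image address is purely a placeholder:

```python
# Build a Google reverse image search link for a given image URL.
# Assumption: the classic /searchbyimage endpoint, which Google may
# redirect to Lens. The image URL below is a placeholder, not a real photo.
from urllib.parse import urlencode

def reverse_image_search_link(image_url: str) -> str:
    """Return a Google reverse image search URL with the image address percent-encoded."""
    return "https://www.google.com/searchbyimage?" + urlencode({"image_url": image_url})

print(reverse_image_search_link("https://example.com/photo.jpg"))
```

Percent-encoding the image address (via `urlencode`) matters: a raw URL pasted into a query string would be misread at the first `&` or `?` it contains.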

Twitter experimented with a similar tool recently when it showed some users a prompt if they were about to retweet an article they hadn’t clicked on yet. “Want to read this before retweeting?” it asked. Experts told CR this extra step seemed like a valuable cue that could slow the spread of misinformation.

But several of Facebook's 10 “false news” tips are easy for a news consumer to pick up on their own, without nudges from a social network.

One tip in particular sums up the attitude experts say you should take with all kinds of information on social media: “If shocking claims in the headline sound unbelievable, they probably are.”