Facebook ditches red flags in its attempt to help stop spread of 'fake news'
Facebook says it's ditching the red warning icons on news stories, and instead, will show "related articles" next to the disputed news stories. (Josh Edelson/AFP/Getty Images)


Facebook says it's ditching the red warning icons on news stories that its third-party fact-checkers have flagged as "fake news."

Instead, the social media giant will show "related articles" next to the disputed news stories, the company announced Wednesday in a news release.

The company said the move is part of its effort to stop the spread of false news. Facebook said a year of research showed that the "red flag" icon, which it launched last December, had the opposite of its intended effect: people appeared to become more entrenched in their beliefs when presented with the disputed warning.

Facebook has come under fire for selling Russian-paid propaganda that some progressives say may have affected the 2016 presidential election. Facebook denied that the ads affected the election, BBC News reported.

Were there other findings in the research?

According to Facebook, the disputed flags didn't make it easy for people to understand what information was false; users had to click through additional steps to find the fact-checkers' explanations.

• At least two third-party fact-checking organizations had to deem the information false before it could be flagged. Facebook said that made flagging difficult in countries with "very few fact-checkers."

• Fact-checkers often used different ratings such as "false," "unproven," and "true." The disputed flags only applied to "false" ratings. Also, sometimes fact-checkers disagreed on an article's rating.

How did Facebook conduct its research?

The company said it traveled around the world over the past year talking to people about misinformation they encountered on the social media site.

It's not immediately clear how many people the company's researchers interviewed.

Facebook said it spoke with people from different backgrounds, conducted formal lab usability testing, talked with people individually and in groups, and even made in-home visits.

Facebook said it's launching a second initiative to better understand how people determine the accuracy of the information they receive.

It's unclear how the company plans to track that information, but it says the effort will not affect users' News Feeds.
