Facebook CEO Mark Zuckerberg announced on Tuesday that the company is rolling out a program rating the trustworthiness of news organizations based on user feedback.
Content will be ranked and either promoted or suppressed based on how it performs in trustworthiness surveys completed by Facebook users.
Zuckerberg says the new initiative is part of an effort to help users find "common ground" following criticism that his company's platform has damaged democracy.
"It's not useful if someone's just kind of repeating the same thing and attempting to polarize or drive people to extremes," he said.
Tens of thousands of Facebook employees will begin monitoring posts to minimize fake news and reduce propaganda, and artificial intelligence will also be used in the effort. The company has vowed to spend billions on the sweeping change.
In an additional crackdown on election meddling, Zuckerberg acknowledged that Facebook's new vetting process will mean the company is "essentially going to be losing money running political ads." But the new processes will likely take years, if not a decade, to become fully functional.
The changes were announced at Facebook's F8 developer conference, where Zuckerberg met with news outlets such as The New York Times, CNN, Atlantic Media and the Huffington Post.
This comes amid rising complaints from conservative outlets, which claim their traffic has been suppressed by the social media giant.
In March, Fox News host Tucker Carlson said, "Facebook is not a neutral host; it has a political agenda. It's an act of ideological warfare, and it's far more worrying than anything that Cambridge Analytica has done, or is accused of doing."
But David Rand, a Yale psychologist, and a colleague conducted an experiment testing the impact of Facebook's new user poll on news sites following its initial implementation. The results showed that sources like Fox News were rated trustworthy, while propaganda sites like the left-leaning Daily Kos and the right-leaning Breitbart were rated untrustworthy in the researchers' poll.
Rand said of the findings: "It's totally consistent to — actually — an extent I'm pretty shocked about. If this is a result of [Facebook] implementing that policy, it looks like it's working reasonably well."