YouTube announced Friday that it will stop recommending so-called conspiracy videos for users who are not seeking such topics. In a blog post, the company gave three examples of such content: Theories that the earth is flat, miracle cures for a serious illness, and "blatantly false claims about historic events like 9/11."
"Less than one percent" of the content on YouTube will be impacted, although that is still a significant number, since there are billions of videos posted on the website.
"To be clear, this will only affect recommendations of what videos to watch, not whether a video is available on YouTube," the blog post states.
The company did not elaborate on the full range of videos that would be considered offensive. Like its parent company, Google, YouTube does not reveal how its algorithms work or what criteria are used to decide whether content is inappropriate. Machine-learning algorithms, not people, will apply those criteria.
In 2017, YouTube began demonetizing videos it deemed to have a controversial religious message or "supremacist" content. Some of those videos now appear behind a message screen stating that the material may be inappropriate or offensive, and other features, such as likes and comments, are disabled.
YouTube indicated that it will not take down the targeted videos and will still recommend them to users who subscribe to a channel that offers "conspiracy theory" content. Users will also be able to search for the content.
"We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users," YouTube wrote.
The post implied the new policy is similar to YouTube's effort to crack down on "clickbaity videos with misleading titles and descriptions ('You won't believe what happens next!'). We responded by updating our system to focus on viewer satisfaction instead of views, including measuring likes, dislikes, surveys, and time well spent, all while recommending clickbait videos less often."
YouTube has come under fire for recommending conspiratorial content to users who have not searched for it, the New York Times reported. Some have also accused the site of deepening the country's political divide through "extreme" content. But YouTube, along with other social media giants, has also faced criticism for censoring conservative viewpoints, although they all deny doing so.
How will it be enforced?
A combination of "machine learning" and people will be used to implement the new policy, according to YouTube's post.
"We work with human evaluators and experts from all over the United States to help train the machine learning systems that generate recommendations," the blog post states. "These evaluators are trained using public guidelines and provide critical input on the quality of a video."