Twitter will attach warnings to, and in some cases hide, posts it deems misleading, false, or disputed regarding COVID-19, the Associated Press reported.
Depending on the perceived problem with the tweets in question, Twitter may assign a warning underneath a tweet alerting users that it contains disputed information, or it may attempt to direct users to a link with verified information from public health experts.
Other situations might call for a tweet to be completely covered by a warning message stating that "some or all of the content shared in this tweet conflict with guidance from public health experts regarding COVID-19," AP reported.
This initiative is not a full fact-checking operation, Twitter said, and the site will not outright label things as misinformation. From AP:
Twitter won't directly fact check or call tweets false on the site, said Nick Pickles, the company's global senior strategist for public policy. The warning labels might direct users to curated tweets, public health websites or news articles.
"People don't want us to play the role of deciding for them what's true and what's not true but they do want people to play a much stronger role providing context," Pickles said.
This policy change supplements Twitter's existing practice of removing tweets it views as harmful, such as those pushing fake coronavirus cures or claiming that social distancing and masks don't help slow the spread of COVID-19.
YouTube recently announced that videos containing information that contradicts guidance from the World Health Organization are subject to removal, with CEO Susan Wojcicki citing examples such as videos claiming that hydroxychloroquine prevents COVID-19 or that taking vitamin C cures it.
"Anything that would go against World Health Organization recommendations would be a violation of our policy. And so remove is another really important part of our policy," she told CNN's Brian Stelter.