Meta asks Facebook oversight board whether COVID-19 misinformation policies are still 'appropriate'

Meta, the parent company of Facebook, has asked its oversight board to review whether the platform's COVID-19 misinformation policies are still "appropriate," signaling the company may take a step back from censoring entire categories of false claims about the virus.

In January 2020, Facebook adopted a sweeping misinformation policy providing for the removal of false claims about the emerging pandemic that "presented unique risks to public health and safety." The company banned posts that compared the coronavirus to the flu, for example, as well as posts that promoted off-label use of drugs like hydroxychloroquine or ivermectin or suggested the virus may have leaked from a lab. The company also targeted skeptics of the COVID-19 vaccines.

As a result of these policies, Meta has removed more than 25 million pieces of content since the start of the pandemic, Nick Clegg, Meta's president of global affairs, said in a blog post.

"Meta began removing false claims about masking, social distancing and the transmissibility of the virus. In late 2020, when the first vaccine became available, we also began removing further false claims, such as the vaccine being harmful or ineffective. Meta’s policy currently provides for removal of 80 distinct false claims about COVID-19 and vaccines," Clegg wrote.

But now, Clegg wrote, "the time is right" for Meta to reconsider its heavy-handed censorship policies.

"The world has changed considerably since 2020. We now have Meta’s COVID-19 Information Center, and guidance from public health authorities is more readily available. Meta’s COVID-19 Information Center has connected over two billion people across 189 countries to helpful, authoritative COVID-19 information," Clegg wrote.

Acknowledging that the pandemic has "evolved" with the successful development and widespread use of vaccines, as well as better information from public health authorities, Clegg wrote that Meta is seeking an advisory opinion on whether it should continue to label or take down content that promotes COVID-19 misinformation.

"Meta is fundamentally committed to free expression and we believe our apps are an important way for people to make their voices heard. But some misinformation can lead to an imminent risk of physical harm, and we have a responsibility not to let this content proliferate," Clegg wrote.

"The policies in our Community Standards seek to protect free expression while preventing this dangerous content. But resolving the inherent tensions between free expression and safety isn’t easy, especially when confronted with unprecedented and fast-moving challenges, as we have been in the pandemic. That’s why we are seeking the advice of the Oversight Board in this case. Its guidance will also help us respond to future public health emergencies."

Meta's oversight board consists of an international team of independent academics, law professors, journalists, human rights activists, and other experts with backgrounds in global politics and digital content moderation. The board's purpose is to review appeals of Facebook's content moderation decisions and independently determine whether Meta acted correctly under its own policies. While the board's rulings are "binding," Meta remains responsible for implementing them.

Most notably, the oversight board ruled in May 2021 that Facebook's decision to suspend former President Donald Trump was "justified," but that the company was wrong to suspend him indefinitely.
