Facebook Will Soon Warn You If You Like Or React To Coronavirus “Misinformation”


By John Vibes / Truth Theory

Social media platforms have been increasing their content moderation efforts amid the ongoing coronavirus pandemic. This week, Facebook announced that it will begin warning users who have liked, reacted to, or commented on posts that are deemed to be “misinformation.”

In a blog post announcing the new measures, Guy Rosen, Facebook’s vice president of integrity, said that the company will treat the World Health Organization as its arbiter of truth and will downrank content that deviates from the organization’s current guidelines. The site will rely on a network of 60 fact-checking organizations to make sure that posts in your timeline reflect what the WHO is saying.

“Ever since COVID-19 was declared a global public health emergency in January, we’ve been working to connect people to accurate information from health experts and keep harmful misinformation about COVID-19 from spreading on our apps. We’ve now directed over 2 billion people to resources from the WHO and other health authorities through our COVID-19 Information Center and pop-ups on Facebook and Instagram with over 350 million people clicking through to learn more. But connecting people to credible information is only half the challenge. Stopping the spread of misinformation and harmful content about COVID-19 on our apps is also critically important,” the post read.

However, the World Health Organization’s guidance since the beginning of the outbreak has been heavily criticized around the world, because the group appeared to have dragged its feet in declaring a pandemic, and initially backed up claims by the Chinese government that there was no evidence of the virus spreading between humans. The organization has also come under fire for its response to previous outbreaks, as the New York Times noted.

The company says that it has already removed hundreds of thousands of posts that it determined were misinformation that “could lead to physical harm.”

“We’re going to start showing messages in News Feed to people who have liked, reacted or commented on harmful misinformation about COVID-19 that we have since removed. These messages will connect people to COVID-19 myths debunked by the WHO including ones we’ve removed from our platform for leading to imminent physical harm. We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook. People will start seeing these messages in the coming weeks,” the statement continued.

The post also announced a $1 million grant for international fact-checking organizations.
