
Facebook says it has removed “hundreds of thousands” of pieces of false information about COVID-19, including dangerous bogus treatments and posts that contradict public health advice.
Olivier Douliery / AFP via Getty Images
In a new push to stop the spread of dangerous and false information about the coronavirus, Facebook will begin telling people when they have interacted with posts promoting fake cures, hoaxes and other false claims.
Over the next few weeks, Facebook users who have liked, reacted to or commented on potentially harmful, debunked content will see a message in their news feed directing them to the World Health Organization’s “Myth busters” page. There, the WHO dispels some of the most common falsehoods about the pandemic.

“We want to connect people who may have interacted with harmful, false information about the virus with the truth from authoritative sources in case they see or hear these claims again on Facebook,” wrote Guy Rosen, Facebook’s vice president of integrity, in a blog post.
The new feature goes beyond Facebook’s current efforts to curb dangerous misinformation about the virus on its network. Until now, it has only warned users when they share a post that fact-checkers have labeled false.

Facebook users who have liked, reacted to or commented on potentially harmful, debunked content will see a message in their news feed directing them to the World Health Organization’s “Myth busters” page.
This week, UN Secretary General António Guterres warned that the world is facing “a dangerous epidemic of disinformation” about the coronavirus. And on Wednesday, the global advocacy group Avaaz released a study claiming that millions of users have been exposed to coronavirus-related misinformation on Facebook.
The study highlighted conspiracy theories that the virus was created by the World Health Organization and the Gates Foundation; bogus treatments such as oregano oil and garlic; and the potentially deadly claim that consuming chlorine dioxide, an industrial bleach, will “destroy” the virus.
None of this is true.
“Not only is Facebook the epicenter of disinformation, but more dangerously, people’s lives are at risk because they are not told that this content is false,” said Fadi Quran, campaign director at Avaaz. He said the new alerts are “a huge step forward.”
A Facebook spokesperson said, “We share Avaaz’s goal of reducing misinformation about COVID-19 and appreciate their partnership in developing the notifications that we will now show to people who have engaged with harmful misinformation about the virus that we have since removed. However, their sample is not representative of the Facebook community, and their findings do not reflect the work we have done.”
Avaaz reviewed 104 posts and videos in six languages, published between January 21 and April 7, that independent fact-checkers had found to be false. The study found that these posts were shared more than 1.7 million times and generated 117 million views.

In 43 cases, the posts were still available on Facebook without a warning label noting that fact-checkers had debunked their claims. Avaaz said that after it shared the list of posts with Facebook, the company removed 17 of them.
Avaaz found that Facebook could take up to three weeks to apply warning labels or remove content that fact-checkers had deemed false. Facebook has declined to say how long it generally takes to label or remove posts that violate its policies.
Rosen said Facebook has removed “hundreds of thousands” of pieces of virus-related misinformation that could cause “imminent physical harm,” including posts promoting fake treatments or contradicting advice on social distancing.
For other debunked claims, including conspiracy theories about the origin of the virus, Facebook limits how many people see those posts and displays “warning labels and strong notifications” when people view or try to share them.

Facebook put warnings on 40 million posts in March, based on 4,000 articles that fact-checkers found false, Rosen said. “When people saw those warning labels, 95% of the time they did not go on to view the original content,” he said.
Like many companies, Facebook has sent most of its workers home during the pandemic. It is now relying more on automated systems to monitor and flag posts, which the company says could lead to more mistakes.
Avaaz is also examining misinformation on Twitter and YouTube, Quran said, to see how those platforms enforce their policies.
Editor’s note: Facebook is one of the sponsors of NPR.