Did You Fall For A Coronavirus Hoax? Facebook Will Let You Know

Facebook says it has removed "hundreds of thousands" of pieces of misinformation about COVID-19, including dangerous fake cures and posts contradicting public health advice.
Olivier Douliery / AFP via Getty Images

In a new move to stop the spread of dangerous and false information about the coronavirus, Facebook will start telling people when they've interacted with posts about bogus cures, hoaxes and other false claims.

In the coming weeks, Facebook users who liked, reacted to or commented on potentially harmful debunked content will see a message in their news feeds directing them to the World Health Organization's "Myth busters" page. There, the WHO dispels some of the most common falsehoods about the pandemic.

"We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook," wrote Guy Rosen, Facebook's vice president for integrity, in a blog post.

The new feature will go beyond Facebook's current attempts to keep dangerous misinformation about the virus off its network. Up until now, it has been notifying users only when they share a post that fact-checkers have labeled false.

This week, U.N. Secretary-General António Guterres warned that the world is facing "a dangerous epidemic of misinformation" about the coronavirus. And on Wednesday, the global advocacy group Avaaz released a study saying millions of users have been exposed to coronavirus-related misinformation on Facebook.

The study pointed to conspiracy theories claiming the virus was created by the World Health Organization and the Gates Foundation; posts touting false cures such as oregano oil and garlic; and the potentially lethal recommendation that drinking chlorine dioxide, an industrial bleach, will "destroy" the virus.

None of those are true.

"Not only is Facebook an epicenter of misinformation, but more dangerously, people's lives are being put at risk, because they're not being informed that this content was false," said Fadi Quran, campaign director at Avaaz. He said the new alerts are "a huge step forward."

A Facebook spokesperson said: "We share Avaaz's goal of reducing misinformation about COVID-19 and appreciate their partnership in developing the notifications we'll now be showing people who engaged with harmful misinformation about the virus we've since removed. However, their sample is not representative of the community on Facebook and their findings don't reflect the work we've done."

Avaaz examined 104 posts and videos in six languages, posted between Jan. 21 and April 7, that had been rated as false by independent fact-checkers. The study found these posts were shared over 1.7 million times and racked up 117 million views.

In 43 cases, the posts were still available on Facebook without any warning label indicating that fact-checkers had debunked their claims. After Avaaz shared the list of posts with Facebook, the company removed 17 of them, the group said.

Avaaz found it could take up to three weeks for Facebook to post warning labels or remove content that fact-checkers rated as false. Facebook declined to say how long it typically takes to flag or remove posts that violate its policies.

Rosen said Facebook has removed "hundreds of thousands" of pieces of virus-related misinformation that could lead to "imminent physical harm," including posts that promote fake cures or contradict advice about social distancing.

For other debunked claims, including conspiracy theories about the virus's origin, Facebook limits how many people see those posts and shows "strong warning labels and notifications" when people view them or try to share them.

In March, Facebook displayed warnings on 40 million posts linked to 4,000 articles that fact-checkers had found false, Rosen said. "When people saw those warning labels, 95% of the time they did not go on to view the original content," he said.

Like many companies, Facebook has sent most of its workers home during the pandemic. It is now relying more heavily on automated systems to monitor and flag posts, which the company says could lead to more mistakes.

Avaaz is also examining misinformation on Twitter and YouTube to see how those platforms are enforcing their policies, Quran said.

Editor's note: Facebook is among NPR's sponsors.

Copyright 2020 NPR. To see more, visit https://www.npr.org.