COVID-19 vaccine false claims: Facebook on Thursday said it would remove false claims about COVID-19 vaccines that have been debunked by public health experts, following a similar announcement by Alphabet Inc's YouTube in October.

The move expands Facebook's current rules against falsehoods and conspiracy theories about the COVID-19 pandemic. The social media company says it takes down coronavirus misinformation that poses a risk of "imminent" harm, while labeling and reducing the distribution of other false claims that fall short of that threshold.

Facebook said in a blog post that the global policy change came in response to news that COVID-19 vaccines will soon be rolling out around the world.

Two drug companies, Pfizer Inc and Moderna Inc, have asked U.S. authorities for emergency use authorization of their vaccine candidates. Britain approved the Pfizer vaccine on Wednesday, jumping ahead of the rest of the world in the race to begin the most crucial mass inoculation program in history.

Misinformation about the new coronavirus vaccines has proliferated on social media during the pandemic, including through viral anti-vaccine posts shared across multiple platforms and by different ideological groups, according to researchers.

A November report (https://firstdraftnews.org/long-form-article/under-the-surface-covid-19-...) by the nonprofit First Draft found that 84 percent of the interactions generated by the vaccine-related conspiracy content it studied came from Facebook pages and Facebook-owned Instagram.

Facebook said it would remove debunked COVID-19 vaccine conspiracy theories, such as the claim that specific populations are being used to test the vaccines' safety without their consent, along with other misinformation about the vaccines.

"This could include false claims about the safety, efficacy, ingredients or side effects of the vaccines. For example, we will remove false claims that COVID-19 vaccines contain microchips," the company said in a blog post. It said it would update the claims it removes based on evolving guidance from public health authorities.

Facebook did not specify when it would begin enforcing the updated policy, but acknowledged it would "not be able to start enforcing these policies overnight."

The social media company has rarely removed misinformation about other vaccines under its policy of deleting content that risks imminent harm. It previously removed vaccine misinformation in Samoa where a measles outbreak killed dozens late last year, and it removed false claims about a polio vaccine drive in Pakistan that were leading to violence against health workers.

Facebook, which has taken steps to surface authoritative information about vaccines, said in October that it would also ban ads that discourage people from getting vaccines. In recent weeks, Facebook removed a prominent anti-vaccine page and a large private group - one for repeatedly breaking COVID misinformation rules and the other for promoting the QAnon conspiracy theory. 

This story has been taken from a news agency.