Facebook bans false COVID-19 vaccine claims


PALO ALTO, Calif. (Reuters) – Facebook Inc said Thursday it would remove false claims about COVID-19 vaccines that have been debunked by public health experts, following a similar announcement made by Alphabet Inc’s YouTube in October.

FILE PHOTO: A 3D-printed Facebook logo is placed on a keyboard in this illustration taken March 25, 2020. REUTERS/Dado Ruvic/Illustration

The move expands Facebook’s current rules against lies and conspiracy theories about the pandemic. The social media company says it removes misinformation about the coronavirus that poses a risk of “imminent” harm, while labeling and reducing the distribution of other bogus claims that fall below that threshold.

Facebook said in a blog post that the global policy change came in response to news that COVID-19 vaccines will soon be rolled out around the world.

Two pharmaceutical companies, Pfizer Inc and Moderna Inc, have applied to U.S. authorities for emergency use authorization of their vaccine candidates. Britain approved Pfizer’s vaccine on Wednesday, putting it ahead of the rest of the world in the race to begin the most crucial mass inoculation program in history.

Misinformation about the new coronavirus vaccines has proliferated on social media during the pandemic, including through viral anti-vaccine posts shared across multiple platforms and by different ideological groups, researchers say.

A November report by the nonprofit First Draft found that 84% of the interactions generated by the vaccine-related conspiracy content it studied came from pages on Facebook and Facebook-owned Instagram.

Facebook said it would remove debunked COVID-19 vaccine conspiracies, such as claims that vaccine safety is being tested on specific populations without their consent, as well as vaccine misinformation.

“This could include false claims about the safety, efficacy, ingredients or side effects of vaccines. For example, we will remove false claims that COVID-19 vaccines contain microchips,” the company said in the blog post. It said it would update the list of claims it removes based on evolving guidance from public health authorities.

Facebook did not say when it would start enforcing the updated policy, but acknowledged that it “could not start enforcing these policies overnight.”

The social media company has rarely removed misinformation about other vaccines under its policy of removing content that risks imminent harm. It previously removed vaccine misinformation in Samoa, where a measles outbreak killed dozens of people late last year, and false claims about a polio vaccination campaign in Pakistan that led to violence against health workers.

Facebook, which has taken steps to disseminate authoritative information about vaccines, said in October it would also ban ads discouraging people from getting vaccinated. In recent weeks, Facebook has taken down a leading anti-vaccine page and a large private group – one for repeatedly breaking COVID-19 misinformation rules and the other for promoting the QAnon conspiracy theory.

Reporting by Katie Paul and Elizabeth Culliford; editing by Nick Zieminski