Facebook will start removing false claims about Covid vaccines, the company said, as the UK prepares to roll out the Pfizer/BioNTech vaccine.
It is Facebook’s strongest move to date to prevent its platform from being used to promote anti-vaccination rhetoric.
Under the new rules, content that makes false claims about Covid-19 vaccines will be removed from Facebook and Instagram once those claims are debunked by public health experts. The company says it is an extension of an existing policy to remove false claims about Covid-19, which has been applied to 12 million pieces of content since March.
“Given recent news that Covid-19 vaccines will soon be rolled out around the world, we will also begin over the next few weeks to remove false claims about these vaccines that have been debunked by public health experts on Facebook and Instagram,” a spokesperson for the company said. “This is another way we apply our policy to remove misinformation about the virus that could lead to imminent physical harm.”
Such content “could include false claims about the safety, efficacy, ingredients or side effects of vaccines,” Facebook said. That covers claims that coronavirus vaccines contain microchips, as well as conspiracy theories known to be false, such as the idea that specific populations are being used without their consent to test the vaccine’s safety.
“As it is early and the facts about Covid-19 vaccines will continue to evolve, we will regularly update any claims that we remove based on advice from public health authorities as they learn more.”
The policies are significantly stricter than Facebook’s typical anti-disinformation rules, under which bogus claims are labelled as such and demoted by curation algorithms, but not removed from Facebook and Instagram entirely. The distinction, the company said, is that misinformation about Covid and Covid vaccinations “could lead to imminent physical harm.”
As recently as 2018, Facebook allowed anti-vaccination content to spread unchecked on its platform. In July of that year, the company first introduced its policy of removing misinformation that can lead to physical harm, and in the years that followed it applied the policy sparingly to vaccination content, focusing primarily on the most clearly false claims about specific vaccines and immunization programs.
Since then, the company has continued to tighten the rules. In 2019, it banned advertisements containing misinformation about vaccines in an effort to curb the spread of ‘vaccine hoaxes’, and announced that it would remove groups and pages that spread anti-vaccine misinformation.
It was only during the pandemic that Facebook took further action. Last month, the company banned all ads that discourage people from getting vaccinated, a step up from simply banning misinformation in such ads. Ads on the platform can now only oppose vaccination if they do so from a political standpoint, such as opposing politicians or laws that would make vaccination compulsory.
The subject has a personal connection for Facebook’s founder and CEO, Mark Zuckerberg. His philanthropic organization, the Chan Zuckerberg Initiative, has embarked on a flagship effort to “cure all diseases”, with a number of research avenues underway, including a particular focus on vaccinations through the Chan Zuckerberg Biohub.