Facebook has vowed to crack down on misinformation about the Covid-19 vaccine, but misleading posts are still easy to find


Facebook has struggled for years to tackle anti-vaxxer content. Late last year, it established new rules to combat misinformation about the Covid-19 vaccine, after pledging two years ago to reduce the spread of anti-vaxxer content. But misleading and alarmist content about Covid vaccines, along with outright misinformation, continues to circulate on the platform at a time when the stakes could not be higher: misinformation about the vaccine can mean the difference between life and death.

Four of the top 10 search results for “vaccine” on Instagram, which is owned by Facebook, were for anti-vaccination accounts, including “vaccinetruth,” “vaccinefreedom,” “antivaxxknowthefacts,” and “cv19vaccinereactions,” according to a series of searches conducted by CNN Business from several different Instagram handles over the past two weeks.

Shortly after, Instagram updated its search interface on mobile devices to feature three credible results, including the CDC’s account, followed by a “See more results” prompt. Users who tap that option then see a number of anti-vaccination accounts, in what is arguably the digital equivalent of shoving a messy room’s clutter under the bed.

Some of these accounts have amassed significant followings, raising the question of whether Instagram suggesting them as top results to users simply looking for vaccine information has helped them grow. The “cv19vaccinereactions” account, devoted to documenting alleged adverse reactions to the vaccine, has more than 77,000 followers. The account often shares unsubstantiated reports and hints at unproven links between receiving the Covid-19 vaccine and major health events, including stroke and miscarriage.

The fact that some of this anti-vaxx content continues to hide in plain sight on its platforms highlights a controversial distinction in Facebook’s approach: a company spokesperson says Facebook distinguishes between misinformation about vaccines specifically, which it cracks down on, and posts that express more general anti-vaccine sentiment, which it allows on the platform.

In December, Facebook said it would remove claims about the coronavirus vaccines that have been debunked by public health officials, including baseless conspiracy theories that they contain microchips. Previously, Facebook policies prohibited disinformation about Covid-19 that “contributes to the risk of imminent violence or physical injury.”

Public health experts have said they fear misinformation about Covid-19 vaccines and general anti-vaccination content on social media could lead people to refuse to be vaccinated. “If they are scared of the lies perpetuated by social media, we will have a real problem coming out of this pandemic,” said Dr. LJ Tan, chief strategy officer of the Immunization Action Coalition (IAC).

Joe Osborne, a Facebook spokesperson, said the company is working to “reduce the number of people who see false information” about vaccines and that it is trying to do “more to address other misleading vaccine content that does not fall under these policies.”

Osborne added that the company is removing claims about the Covid-19 vaccine that have been debunked by public health experts and adding labels and reducing the distribution of other misinformation deemed false by its third-party fact-checking partners.

When a measles outbreak swept across the United States nearly two years ago, Facebook pledged to tackle vaccine misinformation by limiting the reach of such content on its platforms, but refrained from banning it completely. In March 2019, Facebook said it would “reduce the ranking of groups and pages that spread misinformation about vaccinations” by not including them in recommendations or predictions when users type into the search bar. But two months later, CNN Business found that Instagram was still serving anti-vaccination accounts and anti-vaccination hashtags to anyone who searched for the word “vaccines.”
While Facebook removed a large private group dedicated to anti-vaccine content in November 2020, CNN Business found that more than 20 anti-vaxxer groups remain on the platform, with memberships ranging from a few hundred to tens of thousands of users. (The company said the group it deleted in November was removed for violating its recidivism policies, which prevent group admins from creating a new group similar to one the company has deleted, as well as its policies against the QAnon conspiracy theory.)

When searching for the word “vaccine” in Facebook groups last week, three of the platform’s top 20 results led to groups promoting anti-vaccine content, including groups called “Say No Covid-19 Vaccine,” “COVID-19 Vaccine Injury Stories,” and “Vaccine Talk: A Forum for both Pro and Anti Vaxxers,” which has over 50,000 members. The list fluctuates. A few days later, none of those groups made the top 20, but results 18 through 20 pointed to groups discussing vaccine side effects or adverse reactions. Scrolling further down, it was easy to find other anti-vaxxer groups in the search results, including one titled “Unvaccinated and Thriving,” whose description links vaccines to autism and other disorders and diseases, claims that have been widely and consistently debunked. It is unclear what drives Facebook’s search recommendations and why the results change from day to day. Facebook did not provide a clear explanation despite repeated requests for comment.

Dr Wafaa El-Sadr, professor of epidemiology and medicine at Columbia University’s Mailman School of Public Health, called vaccine misinformation on social media “very dangerous” and said it could have “disastrous consequences”.

“We are in a race against the virus,” she said. “We need everyone who is eligible for vaccines to get vaccinated as soon as possible.”

A public Facebook group with more than 58,000 members is devoted to posts about alleged vaccine “injuries and reactions.” Several recent posts on the group’s page include links that have been flagged as false by Facebook’s independent fact-checkers or carry a label saying “Missing context. Independent fact-checkers say this information could mislead people.” One shared link, labeled false by the fact-checkers, claimed that 53 people in Gibraltar had died from the Covid-19 vaccine. Despite the warning labels, members of the group continue to engage with these links, voice doubts about Facebook’s fact-checkers, and share unsubstantiated stories and theories about the dangers of vaccines.

“A story doesn’t have to be accurate to change your mind. That’s what we’re up against right now,” said Tan, of the IAC. “In the age of the Internet, science is not the most compelling story.”

Columbia’s El-Sadr warned people to be wary of any anecdotes or individual stories they read in these Facebook groups, which may or may not be true and may have nothing to do with the vaccine.

“The vast majority of people so far have been vaccinated without incident,” she said. “We have to keep reminding people of this. These vaccines have a very good safety profile and are incredibly effective.”
