Facebook is doing its best to counter the anti-vaccination damage Facebook is doing




On Monday, Facebook revealed a plan to help get 50 million people vaccinated, the latest in a series of efforts by the social media company to tackle the Covid-19 pandemic and the misinformation that has thrived on its platform. The campaign follows years of criticism directed at Facebook for not doing enough to address the dangers of the anti-vaccination movement.

First announced in a post from CEO Mark Zuckerberg, Facebook’s plans include launching a tool to help people find and schedule appointments at local vaccination sites, amplifying credible vaccine information from health officials, and adding labels to posts about the coronavirus that direct people to information from the World Health Organization. The company is also expanding its official WhatsApp chatbots to help people sign up for vaccines, and has added new stickers on Instagram “so people can inspire others to get vaccinated.” (WhatsApp and Instagram are owned by Facebook.)

On top of all of this, and perhaps most critically, Facebook is doing something it hates: limiting the spread of information. The company announced that it will temporarily reduce the distribution of content from users who have violated its misinformation policies on Covid-19 and vaccines, or who repeatedly share content that its fact-checking partners have debunked. Determining what is and is not misinformation is a tricky business, and it can be hard to tell the difference between people who are willfully deceiving others and people who are asking legitimate questions.

These efforts build on Facebook’s existing promises. In February, Facebook announced that it would remove anti-vaccination misinformation and use its platform for what it called the world’s largest Covid-19 vaccine information campaign, which debuted this week. The social media company has also partnered with public health researchers to study the reasons for vaccine hesitancy — and how to fight it — through surveys on the platform.

Critics say Facebook’s efforts are not enough to counter the enormity of a problem the platform itself helped create.

Anti-vaccination rhetoric has flourished for years on the platform, which provided a safe space for vaccine misinformation groups and even recommended such groups to users. And much of the content that drives vaccine hesitancy would not be classified as misinformation but rather as opinion, so Facebook’s guidelines wouldn’t ban it, according to David Broniatowski, a professor at George Washington University who studies anti-vaccination communities.

“People who oppose vaccinations don’t primarily make arguments based on science or fact, but on values such as freedom of choice or civil liberties,” Broniatowski told Recode. “These are opinions, but very corrosive opinions.”

For example, a post saying “I don’t think vaccines are safe” probably wouldn’t be considered misinformation, but its tone can still be insidious.

Facebook is aware that such posts, which don’t violate its rules, contribute to vaccine hesitancy, according to a new report from the Washington Post. “Although the research is very early, we’re concerned that the harm from non-violating content may be substantial,” the story quotes from an internal Facebook document.

While Broniatowski praised Facebook’s initiatives to partner with health organizations and promote vaccine facts, he believes the company could do something more effective: allow public health officials to target vaccine-hesitant groups with arguments as convincing as those put forward by vaccine detractors. He noted that vaccine hesitancy is encouraged by a relatively small slice of Facebook users with disproportionate influence, and that, similarly, a small group of public health experts could be mobilized to combat it.

“You have very sophisticated actors who are making a range of arguments, whatever sticks, to prevent people from getting vaccinated,” he said. “We need a more nuanced response that better addresses people’s real concerns.”

Facebook did not immediately respond to a request for comment.

People who are reluctant to be vaccinated have a wide range of reasons, according to data released this week by Carnegie Mellon University’s Delphi Group in partnership with Facebook. Of those polled, 45 percent said they would avoid getting the vaccine out of fear of side effects, and 40 percent reported concerns about the vaccine’s safety. Smaller percentages of respondents cited mistrust of vaccines and of the government. Responding directly to these concerns could have a significant impact on people’s willingness to get vaccinated.

Facebook should also ensure that its efforts to curb misinformation about Covid-19 extend beyond its latest public relations campaign, Imran Ahmed, CEO of the Center for Countering Digital Hate, told Recode in a statement.

“Since Facebook’s last announcement of its intention to ‘crack down’ on anti-vaccine misinformation over a month ago, almost no progress has been made,” Ahmed said.

“Facebook and Instagram are still not removing the vast majority of posts flagged to them for containing dangerous misinformation about vaccines,” he said. “The main super-spreaders of anti-vaccine lies are still on Instagram or Facebook, despite promises to remove them.”

Since announcing its ban on vaccine misinformation in February, the company says it has removed an additional 2 million pieces of content from Facebook and Instagram. It remains to be seen whether this and the new measures will actually help get 50 million more people vaccinated.
