Facebook plans to remove or demote anti-vaccination recommendations




As public pressure intensifies over how Facebook promotes misinformation about vaccines, the social media giant plans to remove anti-vaccination content from its recommendation systems, Bloomberg reported.

Facebook has become a haven for a small community of parents who reject the medical consensus on immunization, often citing junk science or conspiracy theories, and refuse to vaccinate their children.

This week, Facebook came under fire for promoting anti-vaccination material, particularly advertisements targeting women in areas where measles case counts are high, according to the Daily Beast. The outcry intensified after Rep. Adam Schiff, D-Calif., wrote a letter to founder and chief executive Mark Zuckerberg asking how Facebook planned to protect users from misleading information about vaccination. Schiff sent a similar letter to Google chief executive Sundar Pichai, whose company is also facing scrutiny over how its search engine and its YouTube subsidiary promote potentially dangerous misinformation.

"The algorithms that power these services are not designed to distinguish quality information from misleading or deceptive information, and the consequences are particularly troubling for public health issues," Schiff wrote to the two executives.

In a response to Bloomberg about the issues raised by Schiff, Facebook said it was "exploring additional measures to tackle the problem," including "reducing or removing this type of content" from recommendations, including groups suggested to users. The company also said it plans to demote such content in search results and to ensure that "better and more reliable" information is available.

These tensions arise as the United States faces a worrying resurgence of measles, a disease the Centers for Disease Control and Prevention declared eliminated in 2000, following the widespread adoption of the measles, mumps and rubella vaccine. More than 100 confirmed cases have been reported in 10 states this year, according to the CDC, already exceeding the total number of confirmed cases in 2016. Last month, Washington Gov. Jay Inslee, a Democrat, declared a state of emergency after 25 cases of measles were reported in a single county, where nearly a quarter of children attend school without being vaccinated against measles, mumps and rubella.

Facebook has maintained that most anti-vaccine content does not violate its community guidelines prohibiting "real-world harm." The company told The Washington Post earlier this week that it did not believe removing this type of material would help raise awareness of the facts about immunization. Facebook said it thought targeted counter-speech was a more productive protection against misinformation.

"As we work hard to remove content that violates our rules, we also give our community tools to control what they see, as well as to use Facebook to speak up and share their views with the community around them," Facebook said in a statement sent to The Post on Wednesday.

Facebook did not immediately respond Friday to a request for comment on whether its position on these issues has changed.

The platform has an uneven track record on the quality of the popular health content its users see. A recent study by the Credibility Coalition and Health Feedback, a group of scientists who assess the accuracy of media coverage of health, found that the majority of the most-clicked health stories on Facebook in 2018 were false or contained a significant amount of misleading information. The study examined the 100 health articles with the highest number of social media engagements and had their credibility assessed by a network of experts. It found that less than half were "highly credible." Vaccines were among the three most popular story topics.

Health-related content can be reviewed by Facebook's fact-checking partners, meaning that content deemed misleading or false is demoted in users' feeds and appears alongside related articles from fact-checkers. But this does not apply in Facebook groups, where the bulk of anti-vaccination material circulates.

The World Health Organization recently designated "vaccine hesitancy" as one of the top global health threats of 2019. But a recent Guardian investigation found that Facebook search results for vaccines were "dominated by anti-vaccination propaganda." Facebook did not respond to the Guardian's questions about its plans for dealing with the problem. Another Guardian investigation revealed that Facebook had accepted advertising revenue from Vax Truther, Anti-Vaxxer, Vaccines Revealed and Michigan for Vaccine Choice, among others.

In his letter to Zuckerberg, Schiff asked how the company planned to address the fact that frightened and confused parents could make decisions based on vaccine misinformation found on its platform, making the population more vulnerable to a deadly disease.

"I recognize that it may not always be easy to determine when information is medically accurate, and we are not asking your platform to practice medicine, but if a concerned parent consistently sees information in their news feed that casts doubt on the safety or efficacy of vaccines, it could cause them to disregard the advice of their children's physicians and public health experts and to fail to follow the recommended vaccination schedule," Schiff wrote.

Google's YouTube has already begun modifying its algorithms to try to curb the spread of misinformation. Last month, YouTube announced it would stop recommending videos with "borderline content" that "misinform users in harmful ways."

"We believe this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users," the company said in a blog post.
