Tech takes heat as anti-vaxxers go viral




Lawmakers and public health advocates are pressuring technology companies to crack down on the proliferation of anti-vaccine content online, fearing it will fuel a growing measles outbreak in the United States.

Experts attribute the outbreak to the growing ranks of "anti-vaxxers," people who do not vaccinate their children. And they warn that the movement relies heavily on social media to promote its views, for example through YouTube videos and Facebook groups.

Technology giants say they take the issue seriously even as they navigate competing pressures to promote public health and protect freedom of expression.

Facebook said it may stop recommending anti-vaccination content and groups to users, while YouTube is looking at changing its algorithms to stop promoting videos containing misinformation.

A YouTube spokesperson said in a statement to The Hill that the video-sharing platform had begun surfacing accurate medical content "for people searching for vaccination-related topics" and had "begun reducing recommendations" of anti-vaccination videos.

"Like many algorithmic changes, these efforts will be gradual and more and more accurate over time," the spokesman said.

Over the weekend, YouTube announced that it would demonetize channels promoting anti-vaccine content and link to Wikipedia's entry on "vaccine hesitancy" ahead of videos espousing such views.

Pinterest has taken the strongest position so far. This week, it announced that it would block search results related to vaccinations.

The image-sharing network is one of the only major platforms with a specific "health misinformation" policy, put in place in 2017 after it found users posting fake cures for diseases such as cancer, along with incorrect information about vaccines.

The debate also highlights a crucial question for tech companies: how much responsibility they should bear for the information on their platforms.

Technology platforms regularly invoke First Amendment concerns when they are pressed to better police content.

Section 230 of the Communications Decency Act also shields platforms from legal liability for content posted by their users, leaving regulators with little leverage over the technology companies.

Liz Woolery, deputy director of the Center for Democracy and Technology's Free Expression Project, told The Hill she believes each company should work out how to handle anti-vaccine content based on its own guidelines.

But the recent measles outbreak adds new urgency to the debate and underscores the power of social media.

The outbreak comes 18 years after measles was officially declared eliminated in the United States. It has sickened nearly 350 people since last fall and prompted Washington State to declare a public health emergency last month.

The World Health Organization (WHO) this year listed "vaccine hesitancy" among the top 10 threats to global health.

Health experts who spoke to The Hill said there is a direct link between anti-vaccine content online and the rise in the number of people going unvaccinated.

"Measles is an excellent case study – a computer virus with real effects," said Dr. Haider Warraich, a specialist in heart failure and transplantation at the Duke University Medical Center, at The Hill. Warraich has studied and criticized the presence of medical misinformation online.

"It began as Internet rumors that merged into these social media groups and now have real effects in real communities," said Warraich. "So I think that the Internet has a role to play in that it could be the first of many other examples in the future."

A recent Guardian investigation found that YouTube's and Facebook's algorithms steered users toward anti-vaccination content and away from authoritative medical sources.

Facebook and YouTube said they were taking steps to address the problem, but soon after, BuzzFeed News reported that YouTube's algorithms were still recommending anti-vaccine content.

These reports drew the attention of legislators.

"It's inconceivable that YouTube's algorithms will continue to send users conspiracy videos that spread misinformation and end up hurting public health," said Frank Pallone, chairman of the House Energy and Commerce Commission in a statement to The Hill. .

Pallone noted that his committee, which handles both technology and health issues, is holding a hearing on the measles outbreak on February 27 and is committed to "discussing it with the public health experts who testify."

Among the witnesses will be Nancy Messonnier, director of the Centers for Disease Control and Prevention's (CDC) National Center for Immunization and Respiratory Diseases, and Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases at the National Institutes of Health (NIH).

House Intelligence Committee Chairman Adam Schiff (D-Calif.) earlier this month also pressed Google CEO Sundar Pichai and Facebook CEO Mark Zuckerberg on the issue.

Schiff said he feared that YouTube, Facebook and Instagram "surface and recommend messages that discourage parents from vaccinating their children, posing a direct threat to public health and reversing progress made in combating vaccine-preventable diseases."

While lawmakers push for action, health advocates worry that federal health agencies have been slow to act.

The CDC and the Department of Health and Human Services (HHS) have not launched any new campaigns in recent years to counter the growth of anti-vaccine content online. Neither agency responded to The Hill's requests for comment on the role of social media in spreading anti-vaccine information.

The WHO also did not respond to The Hill's repeated requests for comment on the role social media could play in deterring people from getting vaccinated.

Peter Hotez, dean of the National School of Tropical Medicine at Baylor College of Medicine in Houston and co-director of the Texas Children's Hospital Center for Vaccine Development, said the CDC has largely ignored the anti-vaccine movement since it began gaining traction several years ago.

"[One] The potential argument was, "Well, it's a marginal group and by naming it and giving it attention, you're only giving it oxygen," said Hotez, who studies the related hesitation. vaccination at The Hill. "I think it was probably a good strategy in the early 2000s, but I think it's little recognized that this has now become a media empire that now needs to be dismantled."

"They dominate the Internet," Hotez told The Hill. "Not only social media – they also have nearly 500 anti-vaccine websites according to some accounts. They use social media to amplify these sites. "

Hotez also referred to another technology company: Amazon.

Most of the best-selling and most heavily promoted books in Amazon's "vaccinations" category are skeptical of or outright opposed to vaccines. The fifth most popular book promoted by the online retail giant advances the theory that vaccines cause autism, a claim categorically rejected by scientists.

Amazon declined to comment for this article, pointing The Hill to its guidelines for book sales, which allow the company to "provide … its customers with a variety of views, including books that some customers might find objectionable."

"We reserve the right not to sell certain content, such as pornography or other inappropriate content," Amazon says in the instructions, without going into details.

Dr. Arthur Caplan, founding director of the division of medical ethics at the New York University School of Medicine, told The Hill that much anti-vaccine content spreads through "small and medium-sized groups ... retweeting each other."

"Twitter [and] Facebook tend to be the biggest, "said Caplan. "There are robots … that promote or reinforce misinformation. Many things are tweeted, retweeted, retweeted, retweeted. "

Twitter does not have a specific policy on medical misinformation.

"Twitter's open and real-time nature is a powerful antidote to spreading all kinds of false information," a Twitter spokesperson said in a statement posted to The Hill. "As a business, we should not be the arbiter of the truth."

For technology companies, these difficult issues will not go away.

"Working together is the solution," said Warraich, who encouraged technology companies to contact health officials.

"Technology companies need to be humble and understand that there is a public health crisis for which they can be a solution."
