The growing stream of reports and data on fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you may have missed.
The World Health Organization has identified "vaccine hesitancy" – the "reluctance or refusal to vaccinate despite the availability of vaccines" – as one of the top 10 threats to global health in 2019.
A 2018 study found that nonmedical, "philosophical-belief" vaccine exemptions have risen in 12 of the 18 states that allow them, and the authors noted:
While NMEs continue to rise in most of the 18 US states that allow them, several European countries, including France and Italy, as well as Australia, have taken steps to make vaccines mandatory or even to penalize parents who refuse to vaccinate their children. Romania, which has experienced severe measles outbreaks, may also tighten its immunization legislation. Our concern is that the rise of NMEs linked to the antivaccine movement in the United States will encourage other countries to follow the same path. It would be particularly worrisome if very large low-income and middle-income countries – such as Brazil, Russia, India and China (the BRIC countries), or Bangladesh, Indonesia, Nigeria and Pakistan – were to reduce their immunization coverage. In such a case, we could face massive epidemics of childhood infections that might threaten the achievement of the United Nations' global goals.
The number of measles cases in Europe reached its highest level in 20 years, The Guardian reported last month, surpassing 60,000 in 2018, according to WHO – "more than double that of 2017 and the highest this century. There were 72 deaths, twice as many as in 2017." New York, meanwhile, is facing its most serious measles outbreak in decades, concentrated almost exclusively among ultra-Orthodox Jewish communities.
Elsevier Atlas this week presented research aimed at explaining anti-vaccine attitudes. The study examined the Dunning-Kruger effect – a form of cognitive bias in which people assume they know more about a problem than they actually do, or "their ignorance of their own ignorance" – in attitudes surrounding vaccines. Matthew Motta, a postdoctoral fellow at the Annenberg Public Policy Center at the University of Pennsylvania and lead author of the study, explained:
We gave people a knowledge test on the causes of autism, and then asked respondents in a national survey: How much do you feel you know about the causes of autism? We asked the same question about medical experts like doctors and scientists. We compared people's perceptions of their own knowledge with their perceptions of experts' knowledge, and we compared that with how well they actually did on the knowledge test. We show that there is a link between knowledge, misinformation, and what we call overconfidence – the belief that you know more than the experts. As we show, the least knowledgeable and the most misinformed were the most likely to be overconfident. Once we did that, we looked at the political implications of overconfidence. We examined its correlation with attitudes – for example, about whether schoolchildren should be required to be vaccinated. Those who were most overconfident were the least likely to support it.
In a survey of 1,300 US adults, the researchers found that "more than a third of study participants thought they knew as much as or more than doctors and scientists about the causes of autism." Rather than deferring to expert information, these respondents also placed a high level of confidence in information from non-experts (42 percent) and believed that non-experts should play a major role in policy development (38 percent).
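To make the comparison Motta describes concrete, here is a minimal sketch in Python – with invented column names and toy data, not the study's actual survey instrument or analysis – of how such an overconfidence measure could be built: score respondents on a factual knowledge test, flag those who rate their own knowledge at or above that of experts, and check how that flag relates to knowledge scores and to support for mandatory vaccination.

```python
# Illustrative sketch only: column names and values are hypothetical,
# not the actual data or analysis from the study described above.
import pandas as pd

# Toy respondent-level data:
#   knowledge_score:  share of autism-knowledge items answered correctly (0-1)
#   self_knowledge:   "How much do you feel you know?" (1 = nothing, 5 = a great deal)
#   expert_knowledge: the same question asked about doctors and scientists (1-5)
#   supports_mandate: supports mandatory vaccination of schoolchildren (1 = yes, 0 = no)
df = pd.DataFrame({
    "knowledge_score":  [0.2, 0.9, 0.4, 0.8, 0.3, 0.7],
    "self_knowledge":   [5,   3,   4,   3,   5,   2],
    "expert_knowledge": [3,   5,   3,   5,   2,   5],
    "supports_mandate": [0,   1,   0,   1,   0,   1],
})

# "Overconfident" here means rating one's own knowledge at or above the experts'.
df["overconfident"] = (df["self_knowledge"] >= df["expert_knowledge"]).astype(int)

# Are the least knowledgeable the most overconfident?
print(df.groupby("overconfident")["knowledge_score"].mean())

# Is overconfidence associated with lower support for mandatory vaccination?
print(df["overconfident"].corr(df["supports_mandate"]))
```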
"We need efforts to inform people, but we must also dispel false information. Hitting people in the head with facts is probably not going to do that, "said Motta. "What it might look like is the subject of follow-up studies. This is the key question: how to combat misinformation about vaccines? "
In Science Magazine, author and science journalist Laura Spinney explains how rumors and hoaxes make it more difficult to fight the Ebola epidemic in the Democratic Republic of Congo – and how public health officials have launched an unprecedented effort to combat misinformation.
For the first time in an Ebola outbreak, UNICEF and other agencies have come together in a single response team, which reports to the DRC's Ministry of Health and includes dozens of social scientists who use the airwaves, social media, and meetings with religious leaders to combat misinformation. Responders are also building trust by making their work more transparent – literally, in some cases. A new biosecure tent, called the Biosecure Emergency Care Unit for Epidemics (CUBE), allows relatives to visit Ebola patients during treatment.
Here are some examples of what social scientists do:
Part of their role is to map the social networks through which the virus spreads, but they also collect information about community perceptions, which feeds into an online "dashboard" created by the International Federation of Red Cross and Red Crescent Societies (IFRC) in Geneva. The government has also recruited young people to report false information circulating on WhatsApp, an important information channel in the DRC, says Jessica Ilunga, a spokesperson for the DRC's Ministry of Health in Kinshasa.
When rumors surface, communications experts refute them with specific information via WhatsApp or local radio. They take care not to repeat the misinformation itself; research has shown that this is the best way to help the public "forget" the false information and reinforce the truth. The vocal support of Ebola survivors has also helped. Grateful for their care, some have become volunteers in Ebola treatment centers.
Our colleague @Mbaggio talks to @ScienceMagazine about how community trust and feedback are essential to an #Ebola response in #DRC. Collecting data on rumors, beliefs and community questions helps us find the right approaches to community engagement https://t.co/yhXwOkH1Yt. pic.twitter.com/E2ucJK10NJ
– IFRC Intl. Federation #RedCross #RedCrescent (@Federation) January 15, 2019
Yvonne McPherson, director of BBC Media Action USA, wrote in December about her work on BBC Media Action's Ebola programming. She explains that there is a difference between "acute and chronic misinformation problems." An example of an acute misinformation problem was a 2014 rumor that one could avoid Ebola by bathing in salt water.
The salt-water Ebola example was a real case of acute misinformation in West Africa. Reports traced the rumor to an SMS sent by a student in Nigeria. It immediately spread to social media, with hundreds of tweets repeating the rumor over the next two days. Within days, the Nigerian Ministry of Health, the World Health Organization and other agencies had corrected the record in traditional and social media, and the rumor died down. Unfortunately, the misinformation was blamed for at least two deaths, and many people were hospitalized after drinking excessive amounts of salt water.
This is a relatively tidy scenario: the misinformation spreads quickly, it is corrected by multiple trusted sources, and it dies down.
But chronic misinformation is trickier:
An example of chronic misinformation would be the belief or suspicion that vaccines are harmful. It is chronic because this misinformation persists over the years, despite the factual evidence to the contrary.
Algorithms, along with the underlying market forces, are designed to capture attention and, in turn, create fertile ground for spreading misinformation. Refining algorithms to steer people away from non-credible sources, or annotating articles with credibility warnings, can be part of the solution; however, these efforts do not address the long-held beliefs that people may already have about a health issue.