A skull in sugar grains, in Paris on December 19, 2017 / AFP
A few years ago, two researchers selected the 50 most commonly used ingredients in a cookbook and looked at how many had been associated with a cancer risk or benefit in various studies published in scientific journals.
The answer: 40 out of 50, a list including salt, flour, parsley and even sugar. "Is everything we eat related to cancer?" they then asked, ironically, in their article published in 2013.
Their question points to a well-known but persistent problem in the world of research: too many studies rely on samples that are too small to yield generalizable conclusions.
But the pressure on researchers, the competition between journals and the media's insatiable appetite for studies announcing revolutions or major discoveries mean that such articles continue to be published.
"The majority of published articles, even in serious journals, are bad," says one of the authors, John Ioannidis, a professor of medicine at Stanford who has become a de facto specialist in the study of studies.
This scourge of poor scientific research made his case in 2005 in a resounding article, "Why Most Published Research Findings Are False."
Since then, he says, only some progress has been made.
Some journals now require authors to provide their raw data and to publish their protocol in advance. This transparency keeps researchers from reworking their methods and data until they find a result, whatever it may be, and it allows others to verify or "replicate" the study.
A customer eats "pulpo a feira", an octopus specialty from Galicia, in a restaurant in Palas de Rei, northwestern Spain, on November 22, 2014 / AFP
Because when experiments are redone, they rarely lead to the same results. In an analysis published in 2015, researchers managed to reproduce only a third of 100 studies published in the three most prestigious psychology journals.
Medicine, epidemiology, drug and clinical trials... nutrition studies do not fare much better, John Ioannidis insists, especially when it comes to replication.
"In the biomedical sciences and elsewhere, scientists have only a superficial training in statistics and methodology," adds John Ioannidis. Too many studies concern only a few individuals, preventing generalization to a whole population, because the selected participants are unlikely to be representative.
– Coffee and red wine –
Grapes and wine at the Bolgheri wine museum in Italy, October 3, 2017 / AFP
"The diet is one of the most appalling areas ", continues Professor Ioannidis, and not only because of conflicts of interest with the agri-food industry. Researchers often go in search of correlations in huge databases, without any hypothesis.
In addition, "measuring a diet is extremely difficult," he says. How to quantify exactly what people eat?
Even when the method is sound, as in a randomized trial, where participants are randomly assigned to groups, the execution sometimes leaves something to be desired.
A famous 2013 study on the benefits of the Mediterranean diet for heart disease had to be retracted in June by the most prestigious medical journal, the New England Journal of Medicine, because not all participants had been recruited at random; the results were revised downwards.
How, then, to navigate the flood of studies published every day?
John Ioannidis recommends asking the following questions: Is the study an isolated result, or does it reinforce existing work? Is the sample small or large? Is it a randomized experiment? Who funded it? Are the researchers transparent?
The placement of stents (mini prostheses) in an artery of the brain to reduce the risk of stroke was long recommended by studies before being invalidated by further research / AFP / Archives
These precautions are fundamental in medicine, where bad studies contribute to the adoption of treatments that are at best ineffective, and at worst harmful.
In their book "Ending Medical Reversal", Vinayak Prasad and Adam Cifu offer terrifying examples of practices adopted on the basis of studies that were invalidated years later, such as stenting (mini prostheses) in a brain artery to reduce the risk of stroke. It was not until after ten years that a rigorous study showed that the practice … actually increased the risk of stroke.
The solution requires a collective tightening of standards by everyone involved in research, not just journals: universities, public funding agencies, laboratories … But these institutions all face the same competitive pressures.
"The system does not encourage people to go in the right direction ", told AFP Ivan Oransky, co-founder of Retraction Watch website, which covers withdrawals of scientific articles. "We want to develop a culture where we reward transparency."
The problem also comes from the media, which, he says, must do a better job of explaining to readers the uncertainties inherent in scientific research, and resist sensationalism.
"The problem is the endless succession of studies on coffee, chocolate and red wine," he complains. "We must stop."
AFP