Earlier this fall, researchers at Dartmouth College published a study claiming to link violent video games to aggression in children. The logic of a meta-analytic study like this is as follows: by combining many individual studies, scientists can look for trends or common effects across previous work. But as a long-time researcher in psychology, I argue that this meta-analysis did not accomplish that. In fact, the magnitude of the effect it found is about the same as that of potato consumption on teenage suicide. If anything, this suggests that video games do not predict aggression in young people.
This study, and others like it, are symptomatic of a major problem in the social sciences: the overhyping of shaky, unreliable research results that have little application in the real world. Often such results shape public perceptions of the human condition and guide public policy, despite being, to put it mildly, close to worthless. Here is how it happens.
In recent years, psychology in particular has been caught up in what some call a replication crisis. Many cherished social science findings have proved difficult to reproduce under rigorous conditions: when a study is rerun, the results do not match those originally published. The pressure to publish positive results and the tendency of researchers to inject their own biases into analyses make the situation worse. Much of this failure to replicate can be addressed with more transparent and rigorous methods in the social sciences.
But the overhyping of weak results is different. It cannot be corrected methodologically; a solution would have to come from a cultural change within the field. And incentives to call out deficiencies are rare, especially in a field such as psychology that is anxious about its public image.
One example is the Implicit Association Test (IAT), a technique best known from research on unconscious racial bias. Given the attention the test and its accompanying theories have received, a cottage industry has sprung up to train employees about their implicit biases and ways of overcoming them. Unfortunately, a number of studies suggest that the IAT is not reliable and does not predict real-world behavior. Fighting racial prejudice is laudable, but the considerable public investment in the IAT and the concept of implicit bias is probably less productive than advertised.
Part of the problem is what I call "death by press release." This occurs when researchers, their university, or a journal publisher such as the American Psychological Association issues a press release that oversells the results of a study without detailing its limitations.
For example, a well-known food laboratory at Cornell suffered multiple retractions after it emerged that researchers had tortured their data to produce easy-to-remember conclusions. Their research suggested that people ate more when served larger portions, that action-packed TV shows increased food consumption, and that children would eat more vegetables if the products were renamed with child-friendly themes such as "X-Ray Carrots." Brian Wansink, the lab's director, appears to have become an expert at marketing social science, even though many of the findings did not hold up.
Another concern is a process I call "science laundering": the cleaning up of dirty, messy, inconclusive science for public consumption. Dartmouth's video game meta-analysis is a good example. Evidence similar to that which fed the meta-analysis had been available for years, and it is the reason most scholars no longer link violent games to youth aggression.
Science magazine recently described how meta-analyses can be misused to try to end scientific debates prematurely. Meta-analyses can be useful when they highlight research practices that produce spurious effects, guiding future work. But they can also artificially resolve important disagreements between studies.
Suppose we hypothesize that eating blueberries cures depression, and we conduct 100 studies to test this. Imagine that about 25 of our experiments find small links between blueberries and reduced depression, while the remaining 75 show nothing. Most people would agree that this is a fairly poor showing for the blueberry hypothesis: most of our evidence found no improvement in depression after eating the berries. Yet through a quirk of meta-analysis, combining our 100 studies can yield what scientists call a "statistically significant" effect, meaning one unlikely to have occurred by chance, even though most of the individual studies found nothing at all.
Merging a few studies that show an effect into a larger pool of studies that do not can produce a meta-analytic result that appears statistically significant, even though the individual studies disagree widely. Findings of this sort are what some psychologists have called the "crud factor" of psychological research: statistically significant results that are noise, reflecting nothing in the real world. Put bluntly, meta-analyses are an excellent tool for being had.
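The blueberry thought experiment above can be checked with a short simulation. The sketch below is purely illustrative and assumes arbitrary numbers (150 participants per arm, a small true effect of d = 0.2 in a quarter of the studies, zero in the rest); it pools the studies with a standard fixed-effect, inverse-variance meta-analysis and shows how a "statistically significant" pooled effect can emerge even when most individual studies find nothing.

```python
import numpy as np

rng = np.random.default_rng(42)

n_per_group = 150                          # participants per arm (illustrative assumption)
true_effects = [0.2] * 25 + [0.0] * 75     # 25 studies with a small true effect, 75 nulls

ds, variances = [], []
for d_true in true_effects:
    treat = rng.normal(d_true, 1.0, n_per_group)
    control = rng.normal(0.0, 1.0, n_per_group)
    # Cohen's d with pooled standard deviation
    pooled_sd = np.sqrt((treat.var(ddof=1) + control.var(ddof=1)) / 2)
    d = (treat.mean() - control.mean()) / pooled_sd
    # large-sample variance of d for equal group sizes
    var_d = 2 / n_per_group + d**2 / (4 * n_per_group)
    ds.append(d)
    variances.append(var_d)

ds, variances = np.array(ds), np.array(variances)

# How many studies are significant on their own (|z| > 1.96)?
z_individual = ds / np.sqrt(variances)
n_sig = int(np.sum(np.abs(z_individual) > 1.96))

# Fixed-effect (inverse-variance weighted) meta-analysis
weights = 1 / variances
d_pooled = np.sum(weights * ds) / np.sum(weights)
z_pooled = d_pooled / np.sqrt(1 / np.sum(weights))

print(f"individually significant studies: {n_sig} / 100")
print(f"pooled d = {d_pooled:.3f}, pooled z = {z_pooled:.2f}")
```

The pooled z comfortably exceeds the 1.96 significance threshold while the pooled effect itself stays tiny, and only a minority of the 100 studies are significant on their own. The significance comes from the enormous combined sample, not from a practically meaningful effect.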
Professional guild organizations in fields such as psychology and pediatrics bear much of the responsibility for the proliferation of overhyped research. These organizations publish numerous, often deeply flawed, policy statements that showcase research results in their field. The public often does not realize that such organizations exist to market and promote a profession; they are not objective, neutral observers of scientific research, which is often published, for a fee, in their own journals.
Unfortunately, such science laundering can come back to haunt a field when overhyped claims turn out to be misleading. Overselling the social sciences may lead the public and the courts to grow more skeptical of them. Why should taxpayers fund research that is oversold? Why should news consumers trust what research says today when they were burned by what it said yesterday?
Scholars and the professional organizations that represent them can do much to fix these problems by revisiting lax standards of evidence, the overhyping of weak effects, and the current lack of candor about methodological limitations. In the meantime, the public would do well to keep applying a healthy dose of critical thinking to grandiose claims emanating from social science press releases. Ask whether the effect in question is meaningfully larger than that of potato consumption on suicide. If the answer is no, it's time to move on.