At first, Conway was quite skeptical of his results. “The word on the street is MEG has very crappy spatial resolution,” he says. Basically, the machine is good at detecting when there is brain activity, but not so good at showing where in the brain that activity is. But it turned out the patterns were there, and they were easy for the decoder to spot. “There you go, the pattern is different enough for the different colors that I can decode with over 90% accuracy the color you were seeing,” he says. “It’s like: holy shit!”
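The kind of decoding Conway describes can be illustrated with a minimal sketch: train a classifier on multi-sensor activity patterns labeled by color, then predict the color of held-out trials. Everything below is synthetic and hypothetical (the sensor counts, trial counts, and nearest-centroid decoder are assumptions for illustration, not the study’s actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 8 colors, 50 trials each, 20 MEG sensors.
# Each color evokes a distinct (synthetic) sensor pattern plus trial noise.
n_colors, n_trials, n_sensors = 8, 50, 20
templates = rng.normal(size=(n_colors, n_sensors))       # per-color pattern
X = np.repeat(templates, n_trials, axis=0) + rng.normal(
    scale=0.8, size=(n_colors * n_trials, n_sensors))    # noisy trials
y = np.repeat(np.arange(n_colors), n_trials)             # color labels

# Split trials into train/test halves.
idx = rng.permutation(len(y))
train, test = idx[: len(y) // 2], idx[len(y) // 2:]

# Nearest-centroid decoder: average the training trials for each color,
# then assign every test trial to the closest average pattern.
centroids = np.stack([X[train][y[train] == c].mean(axis=0)
                      for c in range(n_colors)])
dists = np.linalg.norm(X[test][:, None, :] - centroids[None], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y[test]).mean()
print(f"decoding accuracy: {accuracy:.2f}")
```

The key point is the logic, not the numbers: if the decoder performs well above the 1-in-8 chance level, the sensor patterns for different colors must differ reliably.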
Chatterjee says that Conway’s MEG approach allows neuroscientists to reverse traditional questions of perception. “Perception is generally considered to be the known quantity” – in this case the color of the spiral – “and then the researchers tried to understand the neural processes leading to it,” he writes. But in this experiment, Conway approached the issue from the opposite side: he measured neural processes and then drew conclusions about how these processes affect his subjects’ color perception.
The MEG also allowed Conway to watch perception unfold over time. In this experiment, about a second passed between when a volunteer saw the spiral and when they named its color out loud. The machine was able to reveal patterns of activation during this time, showing when the perception of color first appeared in the brain, and then follow that activation for about half a second or so as the perception moved to a semantic concept – the word the volunteer would use to name the color.
But this approach has certain limitations. Although Conway can identify that seeing different colors creates different patterns of brain responses, and his 18 subjects showed specific patterns for colors like yellow, brown, or light blue, he can’t say exactly where in the brain these patterns emerge. The paper also does not discuss the mechanisms that create these patterns. But, says Conway, understanding that there is a neural difference in the first place is huge. “That there is a difference is instructive, because it tells us that there is some sort of topographic color map in the human brain,” he says.
“That the relationships between colors as we perceive them (perceptual color space) can be derived from the relationships in the recorded activity (even if it’s MEG and can’t get you to the level of single neurons or small sets of neurons),” Chatterjee writes. “This makes for a creative and interesting study.”
Plus, says Conway, this research disproves all of those arguments that MEG isn’t precise enough to capture these patterns. “Now we can use [MEG] to decode all kinds of things related to the very fine spatial structure of neurons in the brain,” suggests Conway.
MEG data also showed that the brain processed these eight color spirals differently depending on whether they had warm or cool colors. Conway made sure to include pairs that had the same hue – meaning their wavelengths would be registered as the same color by photoreceptors in the eye – but different levels of luminance, or brightness, which changes the way people perceive them. For example, yellow and brown have the same hue but differ in luminance. Both are warm colors. And, for the cool colors, the blue and dark blue he chose also shared a hue with each other, and had the same difference in luminance as the warm yellow/brown pair.
MEG data showed that the patterns of brain activity corresponding to blue and dark blue were more similar to each other than the patterns for yellow and brown were. Even though both pairs of hues differed by the same amount of luminance, the brain treated the pair of warm colors as far more different from each other than the two blues.
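A comparison like this is typically quantified by measuring the similarity between activity patterns within each pair. Here is a minimal sketch using Pearson correlation as the similarity measure; the sensor patterns and noise levels are entirely made up to mimic the reported result, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sensors = 20

# Hypothetical sensor patterns: the blue/dark-blue pair is built to share
# most of its structure, while yellow/brown diverge more (synthetic values).
base_cool = rng.normal(size=n_sensors)
blue      = base_cool + rng.normal(scale=0.2, size=n_sensors)
dark_blue = base_cool + rng.normal(scale=0.2, size=n_sensors)

base_warm = rng.normal(size=n_sensors)
yellow = base_warm + rng.normal(scale=1.0, size=n_sensors)
brown  = base_warm + rng.normal(scale=1.0, size=n_sensors)

# Pearson correlation between two activity patterns.
def similarity(a, b):
    return np.corrcoef(a, b)[0, 1]

cool_sim = similarity(blue, dark_blue)
warm_sim = similarity(yellow, brown)
print(f"cool pair similarity: {cool_sim:.2f}, warm pair: {warm_sim:.2f}")
```

With data shaped like the study’s, a higher correlation for the cool pair than the warm pair would express exactly the asymmetry described above.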