Too busy or too lazy to read Melville's Moby-Dick or Tolstoy's Anna Karenina? That's fine. Whether you read the classics or listen to them instead, the same cognitive and emotional parts of the brain appear to be stimulated. And now there is a map to prove it.
Neuroscientists at the University of California, Berkeley, have created interactive maps that can predict where different categories of words activate the brain. Their latest map focuses on what happens in the brain when you read stories.
The findings, to appear Aug. 19 in the Journal of Neuroscience, provide further evidence that different people share similar semantic topography, that is, a similar layout of meaning in the brain, opening yet another door to our thoughts and inner narratives. They also have practical implications for learning and for speech disorders, from dyslexia to aphasia.
"At a time when more and more people are absorbing information through audio books, podcasts and even audio, our study shows that, whether they listen to or read the same content, they process the semantic information of the same way, "said Fatma Deniz, lead author of the study. , a postdoctoral researcher in neuroscience at the UC Berkeley Gallant Lab and a former data scientist at the Berkeley Institute for Data Science.
For the study, people listened to stories from "The Moth Radio Hour," a popular podcast series, and then read those same stories. Using functional MRI, the researchers scanned the subjects' brains in both the listening and reading conditions, compared the listening and reading brain-activity data, and found that the maps they created from the two data sets were virtually identical.
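The article does not say how "virtually identical" was quantified, but a common way to compare two voxelwise semantic maps is to correlate, voxel by voxel, the model weights fitted to each condition. Below is a minimal sketch of that comparison in Python with NumPy; the weight arrays are synthetic placeholders standing in for fitted model weights, not the study's actual data:

```python
import numpy as np

# Synthetic stand-ins for the fitted semantic weights: one weight vector per
# voxel, estimated separately from the listening data and the reading data.
# Shapes: (n_voxels, n_semantic_features). Real weights would come from an
# fMRI encoding model, not a random number generator.
rng = np.random.default_rng(0)
listening_weights = rng.standard_normal((5000, 985))
reading_weights = listening_weights + 0.1 * rng.standard_normal((5000, 985))

def voxelwise_correlation(a, b):
    """Pearson correlation between each voxel's weight vectors in a and b."""
    a = a - a.mean(axis=1, keepdims=True)
    b = b - b.mean(axis=1, keepdims=True)
    return (a * b).sum(axis=1) / (np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1))

r = voxelwise_correlation(listening_weights, reading_weights)
print(f"median across-modality correlation: {np.median(r):.2f}")
```

A high median correlation across voxels would correspond to the paper's finding that the listening and reading maps look alike.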
The results can be viewed on an interactive, 3D, color-coded map, where words, grouped in categories such as visual, tactile, numeric, locational, violent, mental, emotional and social, appear like vibrant butterflies on flattened cortices. The cortex is the outer layer of gray matter in the brain that coordinates sensory and motor information.
The interactive 3D brain visualizer is expected to go online this month.
As for clinical applications, the maps could be used to compare language processing in healthy people and in those with stroke, epilepsy and brain injuries that impair speech. Understanding such differences can aid recovery efforts, Deniz said.
The semantic maps could also inform interventions for dyslexia, a widespread neurodevelopmental language-processing disorder that impairs reading.
"If, in the future, we discover that the dyslexic brain presents a rich representation of semantic language when listening to an audiobook or other recording, this could bring more of audio material in the classroom, "Deniz said.
The same goes for auditory processing disorders, in which people cannot distinguish the sounds, or "phonemes," that make up words. "It would be very helpful to compare the listening and reading semantic maps of people with auditory processing disorders," she said.
Nine volunteers each spent two hours inside a functional MRI scanner, listening to and then reading stories from "The Moth Radio Hour" while the researchers measured their cerebral blood flow.
Their brain-activity data from both conditions were then matched against time-coded transcripts of the stories, and the results were fed into a computer program that scores words according to their relationships to one another.
Using statistical modeling, the researchers arranged thousands of words on the maps according to their semantic relationships. In the animal category, for example, one finds the words "bear," "cat" and "fish."
The maps, which covered at least a third of the cerebral cortex, enabled the researchers to predict with accuracy which words would activate which parts of the brain.
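The article does not spell out the modeling details, but the Gallant Lab's published work of this kind uses voxelwise encoding models: regularized linear regression from semantic stimulus features to each voxel's response, validated by prediction on held-out stories. A minimal sketch of that idea in Python with scikit-learn, using synthetic placeholder data rather than the study's features or scans, might look like this:

```python
import numpy as np
from sklearn.linear_model import Ridge

# Synthetic placeholders: a semantic feature matrix for the time-coded
# transcript (one row per fMRI time point, one column per semantic feature)
# and the measured BOLD response at each voxel.
rng = np.random.default_rng(0)
n_timepoints, n_features, n_voxels = 3000, 985, 5000
X_train = rng.standard_normal((n_timepoints, n_features))
true_weights = rng.standard_normal((n_features, n_voxels))
Y_train = X_train @ true_weights + rng.standard_normal((n_timepoints, n_voxels))

# Fit one regularized linear model per voxel; Ridge handles the multi-output
# case (all voxels at once) in a single call.
model = Ridge(alpha=10.0)
model.fit(X_train, Y_train)

# Score the model on a held-out story: correlate predicted and measured
# activity in each voxel.
X_test = rng.standard_normal((500, n_features))
Y_test = X_test @ true_weights + rng.standard_normal((500, n_voxels))
Y_pred = model.predict(X_test)

def column_correlation(a, b):
    """Pearson correlation of corresponding columns (voxels) of a and b."""
    a = a - a.mean(axis=0)
    b = b - b.mean(axis=0)
    return (a * b).sum(axis=0) / (np.linalg.norm(a, axis=0) * np.linalg.norm(b, axis=0))

scores = column_correlation(Y_pred, Y_test)
print(f"mean prediction correlation across voxels: {scores.mean():.2f}")
```

In a realistic pipeline, the features would be derived from the transcripts (for example, word co-occurrence statistics), delayed copies of the features would model the slow hemodynamic response, and the regularization strength would be chosen by cross-validation; the sketch above captures only the fit-and-predict core.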
The results of the reading experiment came as a surprise to Deniz, who had expected readers to process semantic information somewhat differently than listeners.
"We knew that some areas of the brain were activated the same way when you heard a word and read the same word, but I did not expect such similarities in the representation of meaning across a vast network of regions. of the brain in these two sensory modalities, "said Deniz. .
Her study builds on a 2016 Gallant Lab study that recorded the brain activity of seven subjects as they listened to stories from "The Moth Radio Hour."
Future mapping of semantic information will include experiments with people who speak languages other than English, as well as with people who have language-based learning disorders, Deniz said.
The co-authors of the study are Anwar Nunez-Elizalde, Alexander Huth and Jack Gallant, all of UC Berkeley.