Reading, Listening, Whatever

Whether a story's words are heard or read, the brain appears to activate the same areas to
represent their meaning, according to new research.
Using detailed brain scans, scientists at the University of California, Berkeley (UC Berkeley)
have created interactive 3D semantic maps that can accurately predict which parts of the brain
will respond to particular categories of words.
When they compared the semantic brain maps for listening and reading, the researchers found
that they were almost identical.
It appears that the brain’s representation of meaning does not depend on which sense acquires the
words that convey it.
A recent Journal of Neuroscience paper describes how the team came to this conclusion.
The findings yield fresh insights into the complex brain activity of comprehension. They should
also improve the understanding of language processing difficulties such as dyslexia.
"At a time when more people are absorbing information via audiobooks, podcasts, and even
audio texts," says lead study author Fatma Deniz, a postdoctoral researcher in neuroscience at
UC Berkeley, "our study shows that, whether they're listening to or reading the same materials,
they are processing semantic information similarly."
To create the 3D semantic brain maps, the team invited volunteers to listen to and read the same
stories while the researchers recorded detailed functional MRI (fMRI) scans of their brains.
The scans enabled the researchers to monitor brain activity by measuring blood flow in different
parts of the brain.
The researchers matched the brain activity with time-coded transcripts of the stories. That way,
they could tell which part of the brain responded to each word.
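The paper doesn't publish its analysis code, but the alignment step is easy to sketch. The snippet below (Python; the repetition time, transcript, and word timings are all invented for illustration) groups each word of a time-coded transcript into the fMRI scan during which it appeared, which is the pairing that lets brain activity be attributed to words.

```python
TR = 2.0  # assumed fMRI repetition time: one brain volume every 2 seconds

# Hypothetical time-coded transcript: (word, onset in seconds)
transcript = [("once", 0.3), ("upon", 0.7), ("a", 1.1), ("time", 1.4),
              ("a", 2.2), ("bear", 2.6), ("caught", 3.1), ("a", 3.5),
              ("fish", 3.9)]

n_scans = 3  # number of fMRI volumes covering this snippet

# Group words by the scan during which they were heard or read, so each
# brain volume can be paired with the words presented during it.
words_per_scan = [[] for _ in range(n_scans)]
for word, onset in transcript:
    scan = int(onset // TR)
    if scan < n_scans:
        words_per_scan[scan].append(word)

print(words_per_scan)
# [['once', 'upon', 'a', 'time'], ['a', 'bear', 'caught', 'a', 'fish'], []]
```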
They also used a computer program to allocate the thousands of words in the stories to semantic
categories. For example, the words cat, fish, and bear all belong to the category animal.
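As a rough illustration of that step, the sketch below assigns words to categories with a hand-built lookup table. A real pipeline would use a richer, data-driven representation of word meaning, so treat the category table and the categorize helper as hypothetical.

```python
# Minimal sketch: assign story words to semantic categories.
# The table and helper below are invented for illustration.
categories = {
    "animal": {"cat", "fish", "bear", "dog"},
    "place": {"house", "forest", "city"},
    "emotion": {"happy", "afraid", "angry"},
}

def categorize(word: str) -> str | None:
    """Return a word's semantic category, or None if it is unknown."""
    for category, members in categories.items():
        if word.lower() in members:
            return category
    return None

for w in ["bear", "forest", "afraid", "the"]:
    print(w, "->", categorize(w))
# bear -> animal, forest -> place, afraid -> emotion, the -> None
```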
Then, using a technique called voxelwise encoding, the team mapped the semantic categories onto
their associated areas of activation on the cerebral cortex. This is the outer layer of the brain,
which processes motor and sensory information.
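In practice, voxelwise encoding usually means fitting a separate regularized linear regression for each voxel, predicting its time course from the stimulus's semantic features and then reading the fitted weights to see which categories drive that voxel. The toy sketch below (Python with NumPy and scikit-learn; the dimensions, noise level, and data are all simulated) shows the shape of that computation.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Toy dimensions: 300 fMRI time points, 5 semantic features, 1000 voxels.
n_times, n_features, n_voxels = 300, 5, 1000

# Semantic feature matrix: one row per scan, one column per word category
# (e.g. how strongly "animal" words were present during that scan).
X = rng.normal(size=(n_times, n_features))

# Simulated voxel responses: each voxel is a noisy weighted mix of features.
true_weights = rng.normal(size=(n_features, n_voxels))
Y = X @ true_weights + 0.5 * rng.normal(size=(n_times, n_voxels))

# Fit one ridge regression per voxel; scikit-learn fits all voxels at once
# when Y has one column per voxel.
model = Ridge(alpha=1.0).fit(X[:200], Y[:200])

# Each voxel's weight vector says which semantic categories drive it, and
# held-out prediction accuracy says where on the cortex the model works.
pred = model.predict(X[200:])
r = [np.corrcoef(pred[:, v], Y[200:, v])[0, 1] for v in range(n_voxels)]
print(f"mean held-out correlation: {np.mean(r):.2f}")
```

Coloring each cortical location by the categories that best predict its activity is what produces the maps described next.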
The maps look like vibrant patches of color fluttering on the cerebral cortex. Different patches of
color represent different word categories.
The researchers were surprised to find that the maps for listening and reading were nearly
identical, especially as they involve so many brain regions. They had expected the brain to
process semantic information from reading and from listening differently.

The researchers foresee the study's findings helping to increase understanding of how the brain
processes language.
The semantic maps could also aid the study of healthy people and those with conditions that
affect brain function, such as stroke, epilepsy, and injuries that can impair speech.
Deniz suggests that the maps could also give fresh insights into dyslexia, a common neurological
condition that impairs the ability to read.
Dyslexia arises from a difference in brain wiring and does not affect intelligence. Most people
with dyslexia can learn to read with appropriate teaching.
According to the International Dyslexia Association, around 1 in 10 people have dyslexia,
although many have not received a diagnosis or any help.
"If, in the future," Deniz suggests, "we find that the dyslexic brain has rich semantic language
representation when listening to an audiobook or another recording, that could bring more audio
materials into the classroom."
She also sees the maps being useful in the understanding of auditory processing impairments.
People with these conditions cannot make out the phonemes, or subtle sound differences, in
words. For example, they may not be able to differentiate between cat and bat.
"It would be very helpful to be able to compare the listening and reading semantic maps for
people with auditory processing disorder," she adds.
