Neuroscience

Articles and news from the latest research reports.



How we “hear” with our eyes
In everyday life we rarely make a conscious effort to lip-read. In a noisy environment, however, it is often very helpful to be able to see the mouth of the person you are speaking to. Researcher Helen Blank at the MPI in Leipzig explains why: “When our brain is able to combine information from different sensory sources, for example during lip-reading, speech comprehension is improved.” In a recent study, researchers of the Max Planck Research Group “Neural Mechanisms of Human Communication” investigated this phenomenon in more detail to uncover how visual and auditory brain areas work together during lip-reading.

In the experiment, brain activity was measured using functional magnetic resonance imaging (fMRI) while participants listened to short sentences. The participants then watched a short, silent video of a person speaking and indicated with a button press whether the sentence they had heard matched the mouth movements in the video. When the sentence did not match the video, a part of the brain network that combines visual and auditory information showed greater activity, and connectivity increased between the auditory speech region and the superior temporal sulcus (STS).

“It is possible that auditory information received in advance generates an expectation about the lip movements that will be seen”, says Blank. “Any contradiction between the prediction of what will be seen and what is actually observed generates an error signal in the STS.”

How strong the activation was depended on participants’ lip-reading skill: the stronger the activation, the more responses were correct. “People who were the best lip-readers showed an especially strong error signal in the STS”, Blank explains. This effect appears to be specific to the content of speech: it did not occur when the subjects had to decide whether the identity of the voice and the face matched.

The results of this study matter for basic research, and a better understanding of how the brain combines auditory and visual information during speech processing could also be applied in clinical settings. “People with hearing impairment are often strongly dependent on lip-reading”, says Blank. The researchers suggest that future studies could examine what happens in the brain after lip-reading training, or during combined use of sign language and lip-reading.

Filed under: brain, superior temporal sulcus, lip reading, brain areas, brain activity, neuroscience, psychology, science
