Posts tagged auditory perception

‘Seeing is believing’, so the idiom goes, but new research suggests vision also involves a bit of hearing.

Scientists studying the brain processes involved in sight have found that the visual cortex uses information gleaned from the ears as well as the eyes when viewing the world.
They suggest this auditory input enables the visual system to predict incoming information and could confer a survival advantage.
Professor Lars Muckli, of the Institute of Neuroscience and Psychology at the University of Glasgow, who led the research, said: “Sounds create visual imagery, mental images, and automatic projections.
“So, for example, if you are in a street and you hear the sound of an approaching motorbike, you expect to see a motorbike coming around the corner. If it turned out to be a horse, you’d be very surprised.”
The study, published in the journal Current Biology, involved conducting five different experiments using functional Magnetic Resonance Imaging (fMRI) to examine the activity in the early visual cortex in 10 volunteer subjects.
In one experiment they asked the blindfolded volunteers to listen to three different sounds – birdsong, traffic noise and a talking crowd.
Using an algorithm that can identify unique patterns in brain activity, the researchers were able to discriminate between the three sounds based on activity in the early visual cortex alone.
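The "algorithm" here refers to multivariate pattern classification of fMRI data: a decoder is trained on activity patterns evoked by each sound category and then tested on new patterns. The sketch below is purely illustrative, using synthetic "voxel" data and a simple nearest-centroid decoder, and is not the authors' actual analysis pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 sound categories, each evoking a distinct
# (noisy) spatial pattern across 50 simulated voxels.
n_voxels, n_trials = 50, 20
prototypes = rng.normal(size=(3, n_voxels))  # one pattern per sound

def simulate_trials(label, noise=1.0):
    """Noisy fMRI-like activity patterns for one sound category."""
    return prototypes[label] + noise * rng.normal(size=(n_trials, n_voxels))

# "Training": average pattern (centroid) per sound category.
train = {k: simulate_trials(k).mean(axis=0) for k in range(3)}

def classify(pattern):
    """Nearest-centroid decoding: pick the closest training centroid."""
    dists = [np.linalg.norm(pattern - train[k]) for k in range(3)]
    return int(np.argmin(dists))

# Decode a fresh set of test trials and measure accuracy.
test_patterns = [(k, p) for k in range(3) for p in simulate_trials(k)]
acc = np.mean([classify(p) == k for k, p in test_patterns])
print(f"decoding accuracy: {acc:.2f}  (chance = 0.33)")
```

If the decoder performs above chance, the region's activity carries information about which sound was heard, which is the logic behind the study's claim about the early visual cortex.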
A second experiment revealed even imagined images, in the absence of both sight and sound, evoked activity in the early visual cortex.
Lars Muckli said: “This research enhances our basic understanding of how interconnected different regions of the brain are. The early visual cortex hasn’t previously been known to process auditory information, and while there is some anatomical evidence of interconnectedness in monkeys, our study is the first to clearly show a relationship in humans.
“In future we will test how this auditory information supports visual processing, but the assumption is it provides predictions to help the visual system to focus on surprising events which would confer a survival advantage.
“This might provide insights into mental health conditions such as schizophrenia or autism and help us understand how sensory perceptions differ in these individuals.”
(Source: gla.ac.uk)
New findings on the workings of the inner ear
The sensory cells of the inner ear have tiny hairs called stereocilia that play a critical part in hearing. It has long been known that these stereocilia move sideways back and forth in a wave-like motion when stimulated by a sound wave. After designing a microscope to observe these movements, a research team at Karolinska Institutet discovered that the hairs not only move sideways but also change in length.
The discovery, which was made in collaboration with scientists at Baylor College of Medicine in Texas, USA, provides new fundamental knowledge about the mechanisms of hearing. It is presented in the online scientific journal Nature Communications.
(Source: ki.se)
New research published in Psychological Science, a journal of the Association for Psychological Science, examines the nuanced relationship between language and different types of perception.
Bilingual Infants Can Tell Unfamiliar Languages Apart
Speaking more than one language can improve our ability to control our behavior and focus our attention, recent research has shown. But are there any advantages for bilingual children before they can speak in full sentences? We know that bilingual children can tell if a person is speaking one of their native languages or the other, even when there is no sound, by watching the speaker’s mouth for visual cues. But Núria Sebastián-Gallés of Universitat Pompeu Fabra and colleagues wanted to know whether bilingual infants could also do this with two unfamiliar languages. They studied 8-month-old infants, half of whom lived in either Spanish- or Catalan-speaking households and half of whom lived in Spanish-Catalan bilingual households. The researchers looked at whether the infants could discriminate between English and French, two unfamiliar languages, using only visual cues. They found that the bilingual infants could tell the difference between the two languages, while the infants who lived in single-language households could not. These findings suggest that infants who are immersed in bilingual environments are more sensitive to the differences in visual cues associated with the sounds of various languages.
Lead author: Núria Sebastián-Gallés
Skilled Deaf Readers Have an Enhanced Perceptual Span in Reading
Though people born deaf are better able to use information from peripheral vision than those who can hear, they have a harder time learning to read. Researchers have proposed that this extra peripheral information could distract from, rather than enhance, the process of reading, but no research had actually compared visual attention in reading between hearing and deaf readers. In a new study, Nathalie Bélanger of the University of California, San Diego and colleagues investigated this issue by measuring the perceptual span, or the number of letter spaces used when reading, of skilled deaf readers, less-skilled deaf readers, and hearing readers. The experimenters manipulated the number of letter spaces that the participants could see while reading text on a screen. They found that skilled deaf readers read fastest when given the largest number of letter spaces, indicating that they had the largest perceptual span of the three groups. Even so, they read just as fast as skilled hearing readers. Contrary to previous hypotheses, these findings suggest that enhanced visual attention and a wider perceptual span are not the cause of the reading difficulties common among deaf individuals.
Lead author: Nathalie N. Bélanger
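Perceptual span is typically measured with a gaze-contingent moving window: text outside a window around the reader's fixation is replaced with placeholder characters, and the window size is varied. As a rough illustration of the idea only (a toy function, not the study's actual display software), the masking step might look like:

```python
# Toy moving-window paradigm: everything outside a window of `span`
# characters around the fixated position is replaced with 'x'.
# Spaces are left intact so word boundaries stay visible.
def moving_window(text: str, fixation: int, span: int) -> str:
    lo, hi = fixation - span, fixation + span
    return "".join(
        ch if (lo <= i <= hi or ch == " ") else "x"
        for i, ch in enumerate(text)
    )

sentence = "Skilled readers take in letters well beyond fixation."
print(moving_window(sentence, fixation=10, span=4))
```

In the real paradigm an eye tracker updates the window on every fixation; if shrinking the window slows a reader down, the window has cut into that reader's perceptual span.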
Detection of Appearing and Disappearing Objects in Complex Acoustic Scenes
The ability to detect sudden changes in the environment is critical for survival. Hearing is hypothesized to play a major role in this process by serving as an “early warning device,” rapidly directing attention to new events. Here, we investigate listeners’ sensitivity to changes in complex acoustic scenes—what makes certain events “pop-out” and grab attention while others remain unnoticed? We use artificial “scenes” populated by multiple pure-tone components, each with a unique frequency and amplitude modulation rate. Importantly, these scenes lack semantic attributes, which may have confounded previous studies, thus allowing us to probe low-level processes involved in auditory change perception. Our results reveal a striking difference between “appear” and “disappear” events. Listeners are remarkably tuned to object appearance: change detection and identification performance are at ceiling; response times are short, with little effect of scene-size, suggesting a pop-out process. In contrast, listeners have difficulty detecting disappearing objects, even in small scenes: performance rapidly deteriorates with growing scene-size; response times are slow, and even when change is detected, the changed component is rarely successfully identified. We also measured change detection performance when a noise or silent gap was inserted at the time of change or when the scene was interrupted by a distractor that occurred at the time of change but did not mask any scene elements. Gaps adversely affected the processing of item appearance but not disappearance. However, distractors reduced both appearance and disappearance detection. Together, our results suggest a role for neural adaptation and sensitivity to transients in the process of auditory change detection, similar to what has been demonstrated for visual change detection. 
Importantly, listeners consistently performed better for item addition (relative to deletion) across all scene interruptions used, suggesting a robust perceptual representation of item appearance.
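Artificial scenes of this kind are straightforward to synthesize: each component is a pure tone with its own amplitude-modulation rate, and an "appear" event adds a new component partway through. The sketch below is an assumption-laden illustration (arbitrary frequencies, AM rates, and durations, not the study's actual stimuli):

```python
import numpy as np

SR = 16_000          # sample rate (Hz)
DUR = 2.0            # scene duration (s)
t = np.arange(int(SR * DUR)) / SR

def tone(freq_hz, am_rate_hz, am_depth=0.8):
    """One scene component: a pure tone with a unique AM rate."""
    envelope = 1.0 + am_depth * np.sin(2 * np.pi * am_rate_hz * t)
    return envelope * np.sin(2 * np.pi * freq_hz * t)

# A small "scene": each component has its own frequency and AM rate.
components = [tone(f, r) for f, r in [(440, 3), (880, 5), (1320, 7), (1760, 11)]]
scene = np.sum(components, axis=0)

# An "appear" event: a new component enters halfway through the scene.
new = tone(2200, 13)
new[: len(t) // 2] = 0.0          # silent until the change point
changed_scene = scene + new

# Normalize to avoid clipping if written out as audio.
changed_scene /= np.abs(changed_scene).max()
```

A "disappear" event is the mirror image (zero a component out after the change point); the study's finding is that listeners detect the former far more readily than the latter.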