Posts tagged auditory cortex

Auditory test predicts coma awakening
A coma patient’s chances of surviving and waking up could be predicted by changes in the brain’s ability to discriminate sounds, new research suggests.
Recovery from coma has been linked to auditory function before, but it wasn’t clear whether that link depended on when patients were assessed. Whereas previous studies tested patients several days or weeks after the coma set in, a new study looks at the critical phase during the first 48 hours. Even at this early stage, comatose brains can still distinguish between different sound patterns, and how this ability progresses over time can predict whether a coma patient will survive and ultimately awaken, researchers report.
“It’s a very promising tool for prognosis,” says neurologist Mélanie Boly of the Belgian National Fund for Scientific Research, who was not involved with the study. “For the family, it’s very important to know if someone will recover or not.”
A team led by neuroscientist Marzia De Lucia of the University of Lausanne in Switzerland studied 30 coma patients who had experienced heart attacks that deprived their brains of oxygen. All the patients underwent therapeutic hypothermia, a standard treatment to minimize brain damage, in which their bodies were cooled to 33° Celsius for 24 hours.
De Lucia and colleagues played sounds for the patients and recorded their brain activity using scalp electrodes — once in hypothermic conditions during the first 24 hours of coma, and again a day later at normal body temperature. The sounds were a series of pure tones interspersed with sounds of different pitch, duration or location. The brain signals revealed how well patients could discriminate the sounds, compared with five healthy subjects.
After three months, the coma patients had either died or awoken. All the patients whose discrimination improved by the second day of testing survived and awoke from their comas. By contrast, many of those whose sound discrimination deteriorated by the second day did not survive. The results were reported online November 12 in Brain.
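The prognosis rests on a simple idea: quantify how differently the brain responds to the standard tones versus the deviant sounds, and track whether that gap widens or narrows between the two recording sessions. The sketch below is a toy illustration of such a discrimination index on synthetic evoked responses; it is not the authors' actual analysis pipeline, and the signal shapes and function names are invented for illustration:

```python
import numpy as np

def discrimination_index(standard: np.ndarray, deviant: np.ndarray) -> float:
    """Toy index: mean absolute difference between the averaged
    evoked responses to standard and deviant sounds."""
    return float(np.mean(np.abs(standard - deviant)))

# Synthetic averaged evoked responses (100 time samples over a 500 ms epoch).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.5, 100)
standard = np.sin(2 * np.pi * 4 * t)  # baseline evoked shape

# Day 1: deviant response barely differs from the standard (noise only).
deviant_day1 = standard + 0.05 * rng.standard_normal(100)
# Day 2: a clear mismatch deflection appears around 150 ms.
deviant_day2 = standard + np.exp(-((t - 0.15) ** 2) / 0.002)

day1 = discrimination_index(standard, deviant_day1)
day2 = discrimination_index(standard, deviant_day2)
print(f"day 1 index: {day1:.3f}, day 2 index: {day2:.3f}")
print("improved" if day2 > day1 else "deteriorated")
```

In the study's terms, a patient whose index rose between the hypothermic session and the normothermic session a day later would fall into the group that survived and awoke.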
Music to Your Brain by Dwayne Godwin and Jorge Cham
Is this the most unpleasant sound in the world?
The ear-splitting screech of a knife on a glass bottle has been identified as the worst sound to the human ear by scientists who studied the brain’s response to unpleasant noises.
People who listened to a series of 74 recordings while having their brain activity measured by an MRI scanner rated the sound of a fork on a glass as the second worst noise, followed by chalk on a blackboard.
The scans revealed that unpleasant sounds provoked a stronger response in the brain than pleasant ones such as the sound of bubbling water. While sounds are processed in the brain’s auditory cortex, uncomfortable noises also activate the amygdala, a separate brain region that processes emotions.
The researchers studied a group of 13 volunteers and found that sounds with a frequency of between 2,000 and 5,000 Hz, the range at which our ears are the most sensitive, were the hardest to bear.
Although it remains unclear why our ears are most sensitive to this type of sound, researchers noted that screams, which we naturally find uncomfortable, fall within the same range.
Dr Sukhbinder Kumar of Newcastle University, author of the study, which was published in the Journal of Neuroscience, said: “It appears there is something very primitive kicking in. It’s a possible distress signal from the amygdala to the auditory cortex.”
His colleague Prof Tim Griffiths added: “This might be a new inroad into emotional disorders and disorders like tinnitus and migraine, in which there seems to be heightened perception of the unpleasant aspects of sounds.”
Stem Cells Turn Hearing Back On
Scientists have enabled deaf gerbils to hear again—with the help of transplanted cells that develop into nerves that can transmit auditory information from the ears to the brain. The advance, reported in Nature, could be the basis for a therapy to treat various kinds of hearing loss.
In humans, deafness is most often caused by damage to inner ear hair cells—so named because they sport hairlike cilia that bend when they encounter vibrations from sound waves—or by damage to the neurons that transmit that information to the brain. When the hair cells are damaged, those associated spiral ganglion neurons often begin to degenerate from lack of use. Implants can work in place of the hair cells, but if the sensory neurons are damaged, hearing is still limited.
"Obviously the ultimate aim is to replace both cell types," says Marcelo Rivolta of the University of Sheffield in the United Kingdom, who led the new work. "But we already have cochlear implants to replace hair cells, so we decided the first priority was to start by targeting the neurons."
In the past, scientists have tried to isolate so-called auditory stem cells from embryoid bodies—aggregates of stem cells that have begun to differentiate into different types. But such stem cells can only divide about 25 times, making it impossible to produce them in the quantity needed for a neuron transplant.
Rivolta and his colleagues knew that during embryonic development, a handful of proteins, including fibroblast growth factor (FGF) 3 and 10, are required for ears to form. So they exposed human embryonic stem cells to FGF3 and FGF10. Multiple cell types formed, including inner-ear hair cell precursors, but the researchers were also able to identify and isolate cells beginning to differentiate into the desired spiral ganglion neurons. They then implanted these neuron precursor cells into the ears of gerbils with damaged ear neurons and followed the animals for 10 weeks. The function of the neurons was restored.
"We’ve only followed the animals for a very limited time," Rivolta says. "We want to follow them long-term now"—both to assess the possibility of increased cancer risk and to observe the long-term function of the new neurons, he adds.
"It’s very exciting," says neuroscientist Mark Maconochie of Sussex University in the United Kingdom, who was not involved in the new work. "In the past, there has been work where someone makes a single hair cell or something that looks like one neuron [from stem cells], and even that gets the field excited. This is a real step change."
The question now, he says, is whether the procedure can be fine-tuned to allow more efficient production of the relay neurons—currently, fewer than 20% of the stem cells treated develop into those ear neurons. By combining growth factors other than FGF3 and FGF10 with the stem cell mix, researchers could harvest even more ear progenitor cells, he hypothesizes.
"The next big challenge will be to do something as effective as this for the hair cells," Maconochie adds.
The world continues to be a noisy place, and Purdue University researchers have found that all that background chatter causes the ears of those with hearing impairments to work differently.
"When immersed in the noise, the neurons of the inner ear must work harder because they are spread too thin," said Kenneth S. Henry, a postdoctoral researcher in Purdue’s Department of Speech, Language and Hearing Sciences. "It’s comparable to turning on a dozen television screens and asking someone to focus on one program. The result can be fuzzy because these neurons get distracted by other information."
The findings, by Henry and Michael G. Heinz, an associate professor of speech, language and hearing sciences, are published as a Brief Communication in Nature Neuroscience. The work was funded by the National Institutes of Health and the National Institute on Deafness and Other Communication Disorders.
Tuning a piano also tunes the brain, say researchers who have seen structural changes within the brains of professional piano tuners.
Researchers at University College London and Newcastle University found that listening to two notes played simultaneously makes the brain adapt. Brain scans revealed highly specific changes in the hippocampus, which governs memory and navigation. These correlated with the number of years tuners had been doing the job.
The Wellcome Trust researchers used magnetic resonance imaging to compare the brains of 19 professional piano tuners - who play two notes simultaneously to make them pitch-perfect - and 19 other people. What they saw was highly specific changes in both the grey matter - the nerve cells where information processing takes place - and the white matter - the nerve connections - within the brains of the piano tuners.
Investigator Sundeep Teki said: “We already know that musical training can correlate with structural changes, but our group of professionals offered a rare opportunity to examine the ability of the brain to adapt over time to a very specialised form of listening.”
Other researchers have noted similar hippocampal changes in taxi drivers as they build up the detailed information needed to find their way around London’s labyrinth of streets. Prof Tim Griffiths, who led the latest study, published in Neuroscience, said: “There has been little work on the role of the hippocampus in auditory analysis. Our study is consistent with a form of navigation in pitch space as opposed to the more accepted role in spatial navigation.”
Learning to play a musical instrument could help to improve children’s reading and their ability to listen in noisy classrooms, according to new research.
Neuroscientists have found that musicians benefit from heightened brain activity that allows them to process information from their eyes and ears more efficiently than non-musicians.
They found that the part of the brain that interprets sound, known as the auditory cortex, responds faster in people with musical training and is better primed to pick out subtle patterns from the huge volumes of information that flood into the brain from our senses.
Professor Nina Kraus, a neuroscientist and amateur musician at Northwestern University in Evanston, Illinois, has also found that this part of the brain plays a crucial role in reading.
ScienceDaily (July 10, 2012) — People who are born deaf process the sense of touch differently than people who are born with normal hearing, according to research funded by the National Institutes of Health. The finding reveals how the early loss of a sense — in this case hearing — affects brain development. It adds to a growing list of discoveries that confirm the impact of experiences and outside influences in molding the developing brain.

The study is published in the July 11 online issue of The Journal of Neuroscience.
The researchers, Christina M. Karns, Ph.D., a postdoctoral research associate in the Brain Development Lab at the University of Oregon, Eugene, and her colleagues, show that deaf people use the auditory cortex to process touch stimuli and visual stimuli to a much greater degree than occurs in hearing people. The finding suggests that since the developing auditory cortex of profoundly deaf people is not exposed to sound stimuli, it adapts and takes on additional sensory processing tasks.
"This research shows how the brain is capable of rewiring in dramatic ways," said James F. Battey, Jr., M.D., Ph.D., director of the NIDCD. "This will be of great interest to other researchers who are studying multisensory processing in the brain."
Previous research, including studies performed by the lab director, Helen Neville, Ph.D., has shown that people who are born deaf are better at processing peripheral vision and motion. Deaf people may process vision using many different brain regions, especially auditory areas, including the primary auditory cortex. However, no one had tackled whether vision and touch together are processed differently in deaf people, primarily because in experimental settings it is more difficult to produce the kind of precise tactile stimuli needed to answer this question.
Dr. Karns and her colleagues developed a unique apparatus that could be worn like headphones while subjects were in a magnetic resonance imaging (MRI) scanner. Flexible tubing, connected to a compressor in another room, delivered soundless puffs of air above the right eyebrow and to the cheek below the right eye. Visual stimuli — brief pulses of light — were delivered through fiber optic cables mounted directly below the air-puff nozzle. Functional MRI was used to measure reactions to the stimuli in Heschl’s gyrus, the site of the primary auditory cortex in the human brain’s temporal lobe, as well as in other brain areas.
The researchers took advantage of an already known perceptual illusion in hearing people known as the auditory induced double flash, in which a single flash of light paired with two or more brief auditory events is perceived as multiple flashes of light. In their experiment, the researchers used a double puff of air as a tactile stimulus to replace the auditory stimulus, but kept the single flash of light. Subjects were also exposed to tactile stimuli and light stimuli separately and time-periods without stimuli to establish a baseline for brain activity.
Hearing people exposed to two puffs of air and one flash of light claimed only to see a single flash. However, when exposed to the same mix of stimuli, the subjects who were deaf saw two flashes. Looking at the brain scans of those who saw the double flash, the scientists observed much greater activity in Heschl’s gyrus, although not all deaf brains responded to the same degree. The deaf individuals with the highest levels of activity in the primary auditory cortex in response to touch also had the strongest response to the illusion.
"We designed this study because we thought that touch and vision might have stronger interactions in the auditory cortices of deaf people," said Dr. Karns." As it turns out, the primary auditory cortex in people who are profoundly deaf focuses on touch, even more than vision, in our experiment."
There are several ways the finding may help deaf people. For example, if touch and vision interact more in the deaf, touch could be used to help deaf students learn math or reading. The finding also has the potential to help clinicians improve the quality of hearing after cochlear implants, especially among congenitally deaf children who are implanted after the ages of 3 or 4. These children, who have lacked auditory input since birth, may struggle with comprehension and speech because their auditory cortex has taken on the processing of other senses, such as touch and vision. These changes may make it more challenging for the auditory cortex to recover auditory processing function after cochlear implantation. Being able to measure how much the auditory cortex has been taken over by other sensory processing could offer doctors insights into the kinds of intervention programs that would help the brain retrain and devote more capacity to auditory processing.
Source: Science Daily