Neuroscience

Articles and news from the latest research reports.

Posts tagged cochlear implant

173 notes


Infant Cooing, Babbling Linked to Hearing Ability

Infants’ vocalizations throughout the first year follow a set of predictable steps from crying and cooing to forming syllables and first words. However, previous research had not addressed how the amount of vocalizations may differ between hearing and deaf infants. Now, University of Missouri research shows that infant vocalizations are primarily motivated by infants’ ability to hear their own babbling. Additionally, infants with profound hearing loss who received cochlear implants to help correct their hearing soon reached the vocalization levels of their hearing peers, putting them on track for language development.

“Hearing is a critical aspect of infants’ motivation to make early sounds,” said Mary Fagan, an assistant professor in the Department of Communication Science and Disorders in the MU School of Health Professions. “This study shows babies are interested in speech-like sounds and that they increase their babbling when they can hear.”

Fagan studied the vocalizations of 27 hearing infants and 16 infants with profound hearing loss who were candidates for cochlear implants, which are small electronic devices embedded into the bone behind the ear that replace some functions of the damaged inner ear. She found that infants with profound hearing loss vocalized significantly less than hearing infants. However, when the infants with profound hearing loss received cochlear implants, the infants’ vocalizations increased to the same levels as their hearing peers within four months of receiving the implants.

“After the infants received their cochlear implants, the significant difference in overall vocalization quantity was no longer evident,” Fagan said. “These findings support the importance of early hearing screenings and early cochlear implantation.”

Fagan found that non-speech-like sounds, such as crying, laughing and raspberries, were not affected by infants’ hearing ability. She says this finding highlights that babies are more interested in speech-like sounds, since they increase their production of sounds such as babbling when they can hear.

“Babies learn so much through sound in the first year of their lives,” Fagan said. “We know learning from others is important to infants’ development, but hearing allows infants to explore their own vocalizations and learn through their own capacity to produce sounds.”

In future research, Fagan hopes to study whether infants explore the sounds of objects such as musical toys to the same degree they explore vocalization.

Fagan’s research, “Frequency of vocalization before and after cochlear implantation: Dynamic effect of auditory feedback on infant behavior,” was published in the Journal of Experimental Child Psychology.

Filed under hearing cochlear implant vocalizations language development psychology neuroscience science

106 notes

Researchers identify pattern of cognitive risks in some children with cochlear implants

Children with profound deafness who received a cochlear implant had as much as five times the risk of delays in working memory, controlled attention, planning and conceptual learning compared with children with normal hearing, according to Indiana University research published May 22 in JAMA Otolaryngology-Head & Neck Surgery.


The authors evaluated 73 children implanted before age 7 and 78 children with normal hearing to determine the risk of deficits in executive functioning behaviors in everyday life.

Executive functioning, a set of mental processes involved in regulating and directing thinking and behavior, is important for focusing and attaining goals in daily life. All children in the study had average to above-average IQ scores. The results, reported in “Neurocognitive Risk in Children With Cochlear Implants,” are the first from a large-scale study to compare real-world executive functioning behavior in children with cochlear implants and those with normal hearing.

A cochlear implant device consists of an external component that processes sound into electrical signals that are sent to an internal receiver and electrodes that stimulate the auditory nerve. Although the device restores the ability to perceive many sounds to children who are born deaf, some details and nuances of hearing are lost in the process.
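The processing chain described above can be sketched in simplified form: the external processor splits incoming sound into frequency bands, extracts each band’s amplitude envelope, and uses the envelopes to set stimulation strength on the corresponding electrodes. The band edges, band count and crude envelope extraction below are illustrative assumptions for the sketch, not the specification of any real device, which uses many more channels and proper filter design.

```python
import numpy as np

def bandpass_envelopes(signal, fs, band_edges):
    """Split a sound into frequency bands via FFT masking and
    return each band's rectified amplitude envelope."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    envelopes = []
    for lo, hi in band_edges:
        mask = (freqs >= lo) & (freqs < hi)          # keep only this band
        band = np.fft.irfft(spectrum * mask, n=len(signal))
        envelopes.append(np.abs(band))               # crude envelope: rectification
    return np.array(envelopes)

# Illustrative parameters: three bands standing in for the implant's
# electrode channels.
fs = 16000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 1000 * t)                  # 1 kHz test tone
bands = [(100, 500), (500, 2000), (2000, 8000)]
env = bandpass_envelopes(tone, fs, bands)

# The middle band (500-2000 Hz) captures the tone, so in this sketch its
# electrode would receive the strongest stimulation.
print(env.mean(axis=1))
```

Because only band envelopes reach the electrodes, fine spectral and temporal detail is discarded, which is one concrete way "details and nuances of hearing are lost in the process."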

First author William Kronenberger, Ph.D., professor of clinical psychology in psychiatry at the IU School of Medicine and a specialist in neurocognitive and executive function testing, said that delays in executive functioning have been commonly reported by parents and others who work with children with cochlear implants. Based on these observations, his group sought to evaluate whether elevated risks of delays in executive functioning in children with cochlear implants exist, and what components of executive functioning were affected.

"In this study, about one-third to one-half of children with cochlear implants were found to be at-risk for delays in areas of parent-rated executive functioning such as concept formation, memory, controlled attention and planning. This rate was 2 to 5 times greater than that seen in normal-hearing children," reported Dr. Kronenberger, who also is co-chief of the ADHD-Disruptive Behavior Disorders Clinic and directs the psychology testing clinic at Riley Hospital for Children at IU Health.

"This is really innovative work," said co-author David B. Pisoni, Ph.D., director of the Speech Research Laboratory in the IU Department of Psychological and Brain Sciences. "Almost no one has looked at these issues in these children. Most audiologists, neuro-otologists, surgeons and speech-language pathologists — the people who work in this field — focus on the hearing deficit as a medical condition and have been less focused on the important discoveries in developmental science and cognitive neuroscience." Dr. Pisoni also is a Chancellors’ Professor of Psychological and Brain Sciences at IU Bloomington.

Richard Miyamoto, M.D., chair of the IU School of Medicine Department of Otolaryngology-Head and Neck Surgery and a pioneer in the field of cochlear implantation in children and adults, said this finding augments other research on interventions to help children with cochlear implants perform at a level similar to children without hearing deficits.

"The ultimate goal of our department’s research with cochlear implants has always been to influence higher-level neurocognitive functioning," Dr. Miyamoto said. "Much of the success we have seen to date clearly relates to the brain’s ability to process an incomplete signal. The current research will further assist in identifying gaps in our knowledge."

One possible answer may lie in earlier implantation, Dr. Miyamoto said. The age at which children are implanted has been steadily decreasing, which has produced significant improvement in spoken language outcomes. Research shows that earlier implantation is related to better outcomes in speech and understanding, and it is reasonable to believe that there may be less of a deficit in executive functioning with earlier implantation, said Dr. Miyamoto, who is the Arilla Spence DeVault Professor of Otolaryngology-Head and Neck Surgery and medical director of audiology and speech language pathology at the IU School of Medicine.

Preschoolers in the IU study were implanted at an average age of 18 months, and they had fewer executive function delays than school-age children who were implanted 10 months later, at an average age of 28 months. 

Children in the study were divided into two age groups: preschool (3 to 5 years) and school-age (7 to 17 years). Using an established rating scale, parents rated executive function in everyday life for children with cochlear implants and for the control group with normal hearing.

"We compared parent ratings and looked at the percentage of children in each group who scored above a cut-off value that indicates at least a mild delay in executive functioning," Dr. Kronenberger said. "In the critical areas of controlled attention, working memory, planning and solving new problems, about 30 to 45 percent of the children with cochlear implants scored above the cut-off value, compared to about 15 percent or less of the children in the normal-hearing sample."
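The "2 to 5 times" figure quoted earlier follows from simple division of these at-risk proportions. A minimal check of the arithmetic, using the endpoint percentages quoted above (the function name is ours, for illustration only):

```python
def relative_risk(p_exposed, p_control):
    """Risk in the exposed group divided by risk in the control group."""
    return p_exposed / p_control

# Endpoint percentages reported in the article: 30-45% of implanted
# children scored above the cut-off, versus about 15% (or less) of the
# normal-hearing sample.
low = relative_risk(0.30, 0.15)
high = relative_risk(0.45, 0.15)
print(round(low, 2), round(high, 2))
# Ratios of 2-3 at a 15% control rate; the quoted upper bound of 5
# implies control rates below 15% on some individual scales.
```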

Dr. Kronenberger said the research also shows that many children develop average or better executive functioning skills after cochlear implantation.

"These results show that half or more of our group with cochlear implants did not have significant delays in executive functioning," Dr. Kronenberger said. "Cochlear implants produce remarkable gains in spoken language and other neurocognitive skills, but there is a certain amount of learning and catch-up that needs to take place with children who have experienced a hearing loss prior to cochlear implantation. So far, most of the interventions to help with this learning have focused on speech and language. Our findings show a need to identify and help some children in certain domains of executive functioning as well."

"We are now looking for early markers in children who are at risk before they get implants," Dr. Pisoni said. "It will be beneficial to identify as early as possible which children might be at risk for poor outcomes, and we need to understand the variability in the outcome and what can be done about it."

(Source: news.medicine.iu.edu)

Filed under cochlear implant deafness hearing loss working memory cognition children psychology neuroscience science

73 notes

From Mouse Ears to Man’s?

TAU researcher uses DNA therapy in lab mice to improve cochlear implant functionality

One in a thousand children in the United States is deaf, and one in three adults will experience significant hearing loss after the age of 65. Whether the result of genetic or environmental factors, hearing loss costs billions of dollars in healthcare expenses every year, making the search for a cure critical.


Now a team of researchers led by Karen B. Avraham of the Department of Human Molecular Genetics and Biochemistry at Tel Aviv University’s Sackler Faculty of Medicine and Yehoash Raphael of the Department of Otolaryngology–Head and Neck Surgery at University of Michigan’s Kresge Hearing Research Institute have discovered that using DNA as a drug — commonly called gene therapy — in laboratory mice may protect the inner ear nerve cells of humans suffering from certain types of progressive hearing loss.

In the study, doctoral student Shaked Shivatzki created a mouse population possessing the mutated connexin 26 gene, which produces the most prevalent form of genetic hearing loss in humans. Some 30 percent of American children born deaf have this form of the gene. Because of its prevalence and the inexpensive tests available to identify it, there is a great desire to find a cure or therapy to treat it.

"Regenerating" neurons

Prof. Avraham’s team set out to prove that gene therapy could be used to preserve the inner ear nerve cells of the mice. Mice with the mutated connexin 26 gene exhibit deterioration of the nerve cells that send a sound signal to the brain. The researchers found that a protein growth factor used to protect and maintain neurons, otherwise known as brain-derived neurotrophic factor (BDNF), could be used to block this degeneration. They then engineered a virus that could be tolerated by the body without causing disease, and inserted the growth factor into the virus. Finally, they surgically injected the virus into the ears of the mice. This factor was able to “rescue” the neurons in the inner ear by blocking their degeneration.

"A wide spectrum of people are affected by hearing loss, and the way each person deals with it is highly variable," said Prof. Avraham. "That said, there is an almost unanimous interest in finding the genes responsible for hearing loss. We tried to figure out why the mouse was losing cells that enable it to hear. Why did it lose its hearing? The collaborative work allowed us to provide gene therapy to reverse the loss of nerve cells in the ears of these deaf mice."

Although this approach stops short of improving hearing in these mice, it has important implications for the enhancement of sound perception with a cochlear implant, used by many people whose connexin 26 mutation has led to impaired hearing.

Embryonic hearing?

Inner ear nerve cells facilitate the optimal functioning of cochlear implants. Prof. Avraham’s research suggests a possible new strategy for improving implant function, particularly in people whose hearing loss gets progressively worse with time, such as those with profound hearing loss as well as those with the connexin gene mutation. Combining gene therapy with the implant could help to protect vital nerve cells, thus preserving and improving the performance of the implant.

More research remains. “Safety is the main question. And what about timing? Although over 80 percent of human and mouse genes are similar, which makes mice the perfect lab model for human hearing, there’s still a big difference. Humans start hearing as embryos, but mice don’t start to hear until two weeks after birth. So we wondered, do we need to start the corrective process in utero, in infants, or later in life?” said Prof. Avraham.

"Practically speaking, we are a long way off from treating hearing loss during embryogenesis. But we proved what we set out to do: that we can help preserve nerve cells in the inner ears of the mouse," Prof. Avraham continued. "This already looks very promising."

(Source: aftau.org)

Filed under cochlear implant hearing loss hearing nerve cells brain-derived neurotrophic factor gene therapy neuroscience science

175 notes


Diary of becoming an NHS-funded cyborg

From the day I was born, my brain developed according to the stimuli it received. My senses of vision, touch, taste, smell were all slightly heightened in compensation for the lack of input from my ears, helping me to create a world I could understand.

My mother worked full time with me, playing a set of activities she called “the game”. I was a child, and didn’t understand the real reason for playing the game — but it taught me to read, write, lipread, and speak, if not to hear in the traditional sense of the word. What I do hear is filtered through digital hearing aids that amplify what little sound I can hear.

A month ago, for the first time, I made the change from external technology to internal technology. I became a full time cyborg, free of charge on the NHS.

They cut away a flap of skin behind my left ear, drilled a tiny hole into my skull between the two main nerves that control taste and facial movement, and inserted an electrode into my cochlea, connected to a small magnet and circuit board under the skin.

They’re going to switch me on in a few days — and if it’s all working as it should, my auditory cortex will be bombarded by a range of electronic noises. Over time, I may come to understand these sounds as consonants, music, even the spoken word.

This is what it will sound like, apparently.

Even if I can make sense of those sounds, it won’t be “hearing” in the normal sense of the word. My ears have had the same level of input for the last 30 years of my life — and now I’ve physically rewired one of them to receive a completely different signal.

In all the recent blue sky thinking on Wired.co.uk and elsewhere about the future of the human race — coprocessors for the brain, enhanced spectrum bionic eyes, artificial legs, even the possibility of interfacing with computers directly — people forget one thing. What it feels like, what it’s like to live with it every day, whether it makes you feel more, or less, yourself.

I’m also wary of augmentation and body enhancement becoming the norm. We have a fluid definition of what a disability is, and what isn’t. If certain people with access to this technology start engineering themselves to have greater physical or mental abilities, then where does that leave ordinary people? Differently abled? Or Disabled? Or in fact more abled? In giving up perfectly usable eyes, the end result of millions of years of evolution, to install digital eyes that can project images onto the retina, are we really putting ourselves at an advantage?

If I’d been born into a deaf family, all of us signing, my brain developing to become fluent in sign language and developing a deaf identity so strong and complete that I saw deafness as “normal” and hearing as “abnormal” — I wouldn’t have had this implant.

The cochlear implant, in crossing the line from external wearable technology to permanent fixture, becomes a technology that is potentially in conflict with human values, rather than a testament to them. Many deaf people see the cochlear implant as a symbol of medical intervention, to oppress and ultimately eradicate the deaf community and deaf culture, by fixing them one implant at a time — this includes implanting children at an early age so that they’ll be able to acquire spoken language rather than sign.

Filed under auditory cortex cochlear implant hearing loss deafness neuroscience science
