Neuroscience

Articles and news from the latest research reports.

Posts tagged neuroscience

129 notes

Technique moves practical Alzheimer diagnosis one step closer to reality

Researchers at the University of Wisconsin-Madison School of Medicine and Public Health are moving closer to a significant milestone in the battle against Alzheimer’s disease: identifying the first signs of decline in the brain.

After years of frustrating failure to stop late-stage Alzheimer’s, it’s essential to find and treat the mild stages, says Sterling Johnson, professor of geriatrics. “We need to identify Alzheimer’s as early as possible, before the really destructive changes take place. Typically, by the time we diagnose Alzheimer’s disease, patients have already lost much of their brain capacity, and it’s difficult or impossible for them to recover.”

The earlier phases, before large numbers of brain cells have been killed, should be more amenable to treatment, Johnson says. Alzheimer’s disease is the largest single cause of dementia. Early symptoms include memory decline, eventually progressing to widespread cognitive and behavioral changes.

In a study published in the journal Cerebral Cortex in December, Johnson, Ozioma Okonkwo in the Department of Geriatrics, and colleagues reported on measurements of brain blood flow in 327 adults. The researchers used an advanced form of MRI to compare blood flow in people with Alzheimer’s, people with a preliminary stage called mild cognitive impairment, and people who had no symptoms but had a family history of Alzheimer’s.

Reduced blood flow signifies reduced activity in particular parts of the brain, often due to the atrophy of nerve cells. One affected structure, called the hippocampus, is necessary for making new memories. In mild to moderate cases of Alzheimer’s, 40 percent or more of the hippocampus has disappeared.

As expected, the Alzheimer’s patients had lower blood flow in several brain regions linked to memory. People with mild cognitive impairment had a milder version of the same deficits. And people whose mother (but not father) had Alzheimer’s had clear signs of reduced blood flow, even though they lacked symptoms.

Other techniques that can measure blood flow are more costly and require the use of radiation and injecting a drug tracer during the scan, Johnson says. If this non-invasive MRI technique continues to prove itself, it could be a key to detecting Alzheimer’s disease in its early, and hopefully more treatable, phases.
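The group contrast at the heart of the study can be sketched in a few lines. The blood-flow values below are invented stand-ins (a real arterial-spin-labeling analysis derives per-subject values from the perfusion images, with far larger groups), so only the shape of the comparison is meant to carry over:

```python
from statistics import mean, stdev
import math

# Hypothetical resting cerebral blood flow (mL/100 g/min) in a memory-related
# region of interest, one value per subject. Illustrative numbers only, not
# data from the Cerebral Cortex study.
controls = [52.1, 55.4, 49.8, 53.6, 51.2, 54.0]
mci      = [46.3, 44.9, 48.1, 45.5, 47.2, 43.8]
alz      = [38.7, 41.2, 36.9, 40.1, 37.5, 39.4]

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / math.sqrt(va + vb)

for label, group in [("MCI", mci), ("Alzheimer's", alz)]:
    drop = 100 * (mean(controls) - mean(group)) / mean(controls)
    print(f"{label}: mean CBF {mean(group):.1f}, "
          f"{drop:.0f}% below controls, t = {welch_t(controls, group):.1f}")
```

In the study's pattern, the mild cognitive impairment group would fall between controls and Alzheimer's patients, with asymptomatic maternal-history subjects showing a milder version of the same deficit.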

"In the new paper, we showed that the same areas that show up with more established scanning techniques also are identified with this MRI blood flow technique, in people with Alzheimer’s and mild cognitive impairment," says Johnson. "So this method is valid and reliable, and is now ready to begin deployment in treatment research with people at risk."

Filed under dementia alzheimer's disease blood flow MRI blood flow technique neuroscience science

86 notes

Teaching the brain to speak again

Cynthia Thompson, a world-renowned researcher on stroke and brain damage, will discuss her groundbreaking research on aphasia and the neurolinguistic systems it affects Feb. 16 at the annual meeting of the American Association for the Advancement of Science (AAAS). An estimated one million Americans suffer from aphasia, which impairs the ability to understand and/or produce spoken and/or written language.

For three decades, Thompson has played a crucial role in demonstrating the brain’s plasticity, or ability to change. “Not long ago, the conventional wisdom was that people only could recover language within three months to a year after the onset of stroke,” she says. “Today we know that, with appropriate training, patients can make gains as much as 10 years or more after a stroke.”

Thompson has probably contributed more findings on the effects of brain damage on language processing and the ways the brain and language recover from stroke than any other single researcher. Her particular interest is agrammatic aphasia, which impairs abstract knowledge of grammatical sentence structure and makes sentence production and understanding difficult.

Among the first researchers to use functional magnetic resonance imaging to study recovery from stroke, Thompson found that behavioral treatment focused on improving impaired language processing affects not only the ability to understand and produce language but also brain activity.

She found recovery-related shifts in neural activity in both cerebral hemispheres, with the greatest recovery seen in undamaged regions within the language network engaged by healthy people, albeit regions normally recruited for other language activities.

"It’s a matter of ‘use it or lose it,’" Thompson says. "The brain has the capacity to learn and relearn throughout life, and it is directly affected by the activities we engage in. Language training that focuses on principles of normal language processing stimulates the recovery of neural networks that support language."

Thompson will discuss research she will conduct as principal investigator of a $12 million National Institutes of Health Clinical Research Center award to study biomarkers of recovery in aphasia.

Working with investigators from a number of universities, Thompson will explore the role blood flow plays in language recovery in chronic stroke patients. In addition, she will conduct cutting-edge, exploratory research using eye tracking to understand how people compute language as they hear it in real time. Eye-tracking techniques have been found to discern subtle problems underlying language deficits in acquired aphasia.

In a landmark 2010 study, she and colleagues discovered two critical variables related to understanding brain damage recovery. They found that stroke not only results in cell death in certain regions of the brain but that it also decreases blood flow (perfusion) to living cells that are adjacent (and sometimes even distant) to the lesion.

Until that study, hypoperfusion (diminished blood flow) was thought only to be associated with acute stroke. Her team also found that greater hypoperfusion led to poorer recovery.

(Source: eurekalert.org)

Filed under language aphasia brain damage stroke neural activity language processing neuroscience science

68 notes

Training speech networks to treat aphasia

About 80,000 people develop aphasia each year in the United States alone. Nearly all of these individuals have difficulty speaking. For example, some patients (nonfluent aphasics) have trouble producing sounds clearly, making it frustrating for them to speak and difficult for them to be understood. Other patients (fluent aphasics) may select the wrong sound in a word or mix up the order of the sounds. In the latter case, “kitchen” can become “chicken.” Sheila Blumstein, a Brown University cognitive scientist, proposes using guided speech to help people who have suffered stroke-related brain damage rebuild their neural speech infrastructure.

Blumstein has been studying aphasia and the neural basis of language her whole career. She uses brain imaging, acoustic analysis, and other lab-based techniques to study how the brain maps sound to meaning and meaning to sound.

What Blumstein and other scientists believe is that the brain organizes words into networks, linked both by similarity of meaning and similarity of sound. To say “pear,” a speaker will also activate other competing words like “apple” (which competes in meaning) and “bear” (which competes in sound). Despite this competition, normal speakers are able to select the correct word.

In a study published in the Journal of Cognitive Neuroscience in 2010, for example, she and her co-authors used functional magnetic resonance imaging to track neural activation patterns in the brains of 18 healthy volunteers as they spoke English words that had similar sounding “competitors” (“cape” and “gape” differ subtly in the first consonant by voicing, i.e. the timing of the onset of vocal cord vibration). Volunteers also spoke words without similar sounding competitors (“cake” has no voiced competitor in English; gake is not a word). What the researchers found is that neural activation within a network of brain regions was modulated differently when subjects said words that had competitors versus words that did not.

One way this competition-mediated difference is apparent in speech production is that words with competitors are produced differently from words that do not have competitors. For example, the voicing of the “t” in “tot” (with a voiced competitor ‘dot’) is produced with more voicing than the “t” in “top” (there is no ‘dop’ in English). Through acoustic analysis of the speech of people with aphasia, Blumstein has shown that this difference persists, suggesting that their word networks are still largely intact.
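The acoustic comparison described here boils down to measuring voice onset time (VOT) per token and contrasting the two word types. The millisecond values below are invented for illustration, and the direction of the difference is an assumption of this sketch, not a measurement from Blumstein's work:

```python
from statistics import mean

# Hypothetical voice onset times (ms) for word-initial /t/, one value per
# spoken token. Words with a voiced competitor ("tot" vs. "dot") are compared
# against words without one ("top"; "dop" is not an English word).
vot_competitor    = [82.0, 79.5, 85.1, 80.3, 83.7]   # e.g. "tot"
vot_no_competitor = [74.2, 76.8, 72.9, 75.5, 73.1]   # e.g. "top"

# A systematic VOT difference between the two word types would indicate that
# the speaker's lexical network still registers the competitor.
diff = mean(vot_competitor) - mean(vot_no_competitor)
print(f"mean VOT difference: {diff:.1f} ms")
```

Finding that aphasic speakers preserve such a difference is what supports the conclusion that their word networks remain largely intact.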

Filed under aphasia brain damage language speech production neuroimaging neuroscience science

273 notes

How Neuroscience Will Fight Five Age-Old Afflictions

SEIZURES

A device delivers targeted drugs to calm overactive neurons

For years, large clinical trials have treated people with epilepsy using so-called deep-brain stimulation: surgically implanted electrodes that can detect a seizure and stop it with an electrical jolt. The technology leads to a 69 percent reduction in seizures after five years, according to the latest results.

Tracy Cui, a biomedical engineer at the University of Pittsburgh, hopes to improve upon that statistic. Her group has designed an electrode that would deliver both an electrical pulse and antiseizure medication. “We know where we want to apply the drug,” Cui says, “so you would not need a lot of it.”

To build the device, Cui’s team immersed a metal electrode in a solution containing two key ingredients: a molecule called a monomer and the drug CNQX. Zapping the solution with electricity causes the monomers to link together and form a long chain called a polymer. Because the polymer is positively charged, it attracts the negatively charged CNQX, leaving the engineers with their target product: an electrode coated in a film that’s infused with the drug.

The researchers then placed the electrodes in a petri dish with rat neurons. Another zap of electricity disrupted the electrostatic attraction in the film, causing the polymer to release its pharmacological payload—and nearby cells to quiet their erratic firing patterns. Cui says her team has successfully repeated the experiment in living rats. Next, she’d like to test the electrodes in epileptic rats and then begin the long process of regulatory approval for human use.

The body’s blood-brain barrier protects the organ from everything but the smallest molecules, rendering most drugs ineffective. As a result, this drug-​delivery mechanism could treat other brain disorders, Cui says. The electrodes can be loaded with any kind of small drug—like dopamine or painkillers—making it useful for treating Parkinson’s disease, chronic pain, or even drug addiction.

DEMENTIA

Electrode arrays stimulate mental processing

Dementia is one of the best-known and most frustrating brain afflictions. It damages many of the fundamental cognitive functions that make us human: working memory, decision-making, language, and logical reasoning. Alzheimer’s, Huntington’s, and Parkinson’s diseases all lead to dementia, and it’s also sometimes associated with multiple sclerosis, AIDS, and the normal process of aging.

Theodore Berger, a biomedical engineer at the University of Southern California, hopes to help people stave off the symptoms of dementia with a device implanted in the brain’s prefrontal cortex, a region crucial for sophisticated cognition. He and colleagues at Wake Forest Baptist Medical Center tested the device in a study involving five monkeys and a memory game.

First the team implanted an electrode array so that it could record from layers 2/3 and 5 of the prefrontal cortex and stimulate layer 5. The neural signals that jet back and forth between these areas relate to attention and decision-making. The team then trained the monkeys to play a computer game in which they saw a cartoon picture—such as a truck, lion, or paint palette—and had to select the same image from a panel of pictures 90 seconds later.

The scientists initially analyzed the electrical signals sent between the two cortical layers when the monkeys made a correct match. In later experiments, the team caused the array to emit the same signal just before the monkey made its decision. The animals’ accuracy improved by about 10 percent. That effect may be even more profound in an impaired brain. When the monkeys played the same game after receiving a hit of cocaine, their performance dropped by about 20 percent. But electrical stimulation restored their accuracy to normal levels.
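Taken at face value, the reported percentages work out as follows. The baseline accuracy is an assumed figure; only the relative changes come from the study:

```python
# Back-of-the-envelope illustration of the reported effects in the
# delayed-match-to-sample game. Baseline is assumed, not from the study.
baseline = 0.75                  # assumed fraction correct, no stimulation
with_stim = baseline * 1.10      # ~10% relative improvement with stimulation
on_cocaine = baseline * 0.80     # ~20% relative drop after cocaine
stim_on_cocaine = baseline       # stimulation restored accuracy to normal

print(f"stimulated: {with_stim:.2f}, impaired: {on_cocaine:.2f}, "
      f"impaired + stimulation: {stim_on_cocaine:.2f}")
```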

Dementia involves far more complicated circuitry than these two layers of the brain. But once scientists better understand exactly how dementia works, it may be possible to combine several implants to each target a specific region.

BLINDNESS

Gene therapy converts cells into photoreceptors, restoring eyesight

Millions of people lose their eyesight when disease damages the photoreceptor cells in their retinas. These cells, called rods and cones, play a pivotal role in vision: They convert incoming light into electrical impulses that the brain interprets as an image.

In recent years, a handful of companies have developed electrode-array implants that bypass the damaged cells. A microprocessor translates information from a video camera into electric pulses that stimulate the retina; as a result, blind subjects in clinical trials have been able to distinguish objects and even read very large type. But the implanted arrays have one big drawback: They stimulate only a small number of retinal cells—about 60 out of 100,000—which ultimately limits a person’s visual resolution.

A gene therapy being developed by Michigan-based RetroSense could replace thousands of damaged retinal cells. The company’s technology targets the layer of the retina containing ganglion cells. Normally, ganglion cells transmit the electric signal from the rods and cones to the brain. But RetroSense inserts a gene that makes the ganglion cells sensitive to light; they take over the job of the photoreceptors. So far, scientists have successfully tested the technology on rodents and monkeys. In rat studies, the gene therapy allowed the animals to see well enough to detect the edge of a platform as they neared it.

The company plans to launch the first clinical trial of the technology next year, with nine subjects blinded by a disease called retinitis pigmentosa. Unlike the surgeries to implant electrode arrays, the procedure to inject gene therapy will take just minutes and requires only local anesthesia. “The visual signal that comes from the ganglion cells may not be encoded in exactly the fashion that they’re used to,” says Peter Francis, chief medical officer of RetroSense. “But what is likely to happen is that their brain is going to adapt.”

PARALYSIS

A brain-machine interface controls limbs while sensing what they touch

Last year, clinical trials involving brain implants gave great hope to people with severe spinal cord injuries. Two paralyzed subjects imagined picking up a cup of coffee. Electrode arrays decoded those neural instructions in real time and sent them to a robotic arm, which brought the coffee to their lips.

But to move limbs with any real precision, the brain also requires tactile feedback. Miguel Nicolelis, a biomedical engineer at Duke University, has now demonstrated that brain-machine interfaces can simultaneously control motion and relay a sense of touch—at least in virtual reality.

For the experiment, Nicolelis’s team inserted electrodes in two brain areas in monkeys: the motor cortex, which controls movement, and the nearby somatosensory cortex, which interprets touch signals from the outside world. Then the monkeys played a computer game in which they controlled a virtual arm—first by using a joystick and eventually by simply imagining the movement. The arm could touch three identical-looking gray circles. But each circle had a different virtual “texture” that sent a distinct electrical pattern to the monkeys’ somatosensory cortex. The monkeys learned to select the texture that produced a treat, proving that the implant was both sending and receiving neural messages.
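The bidirectional loop can be sketched as: decode intended movement, move the virtual arm, and deliver the touched target's "texture" as a stimulation pattern. Everything below is a stand-in for the real decoding and microstimulation, not Nicolelis's actual pipeline:

```python
import random

# Each target circle has a distinct stimulation pattern; one is rewarded.
TEXTURES = {"A": [1, 0, 1], "B": [1, 1, 0], "C": [0, 1, 1]}
REWARDED = "B"

def decode_intent():
    """Stand-in for decoding motor-cortex activity into a chosen target."""
    return random.choice(list(TEXTURES))

def stimulate(pattern):
    """Stand-in for patterned microstimulation of somatosensory cortex."""
    return pattern

random.seed(0)
rewards = 0
for trial in range(100):
    target = decode_intent()            # arm moves to a target circle
    felt = stimulate(TEXTURES[target])  # feedback identifies its texture
    if felt == TEXTURES[REWARDED]:      # subject learns to seek this one
        rewards += 1
print(f"rewarded on {rewards}/100 trials")
```

The point of the design is that the same implant both reads out intent and writes in feedback, closing the sensorimotor loop.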

This year, a study in Brazil will test the ability of 10 to 20 patients with spinal cord injuries to control an exoskeleton using the implant. Nicolelis, an ardent fan of Brazilian soccer, has set a strict timetable for his team: A nonprofit consortium he created, the Walk Again Project, plans to outfit a paraplegic man with a robotic exoskeleton and take him to the 2014 World Cup in São Paulo, where he will deliver the opening kick.

DEAFNESS

Stem cells repair a damaged auditory nerve, improving hearing

Over the past 25 years, more than 30,000 people with hearing loss have received an electronic implant that takes over the function of the cochlea, the snail-shaped organ in the inner ear whose cells transform sound waves into electrical signals. The device acts as a microphone, picking up sounds from the environment and transmitting them to the auditory nerve, which carries them on to the brain.

But a cochlear implant won’t help the 10 percent of people whose profound hearing loss is caused by damage to the auditory nerve. Fortunately for this group, a team of British scientists has found a way to restore that nerve using stem cells.

The researchers exposed human embryonic stem cells to growth factors, substances that cause them to differentiate into the precursors of auditory neurons. Then they injected some 50,000 of these cells into the cochleas of gerbils whose auditory nerves had been damaged. (Gerbils are often used as models of deafness because their range of hearing is similar to that of people.) Three months after the transplant, about one third of the original number of auditory neurons had been restored; some appeared to form projections that connected to the brain stem. The animals’ hearing improved, on average, by 46 percent.

It will be years before the technique is tested in humans. Once it is, researchers say, it has the potential to help not only those with nerve damage but also people with more widespread impairment whose auditory nerve must be repaired in order to receive a cochlear implant.

Filed under seizures dementia blindness paralysis deafness neuroscience medicine science

478 notes

Chimpanzees have faster working memory than humans
Chimpanzees have a faster working memory than humans, according to a remarkable study showing that they can memorise in a fraction of a second what would take humans several seconds.
A Japanese scientist has demonstrated the prowess of chimps in remembering in less than half a second the precise position and correct sequence of up to nine numbers on a computer screen.
The numbers are shown simultaneously, randomly distributed across a computer screen, and as soon as the chimps press the number “one”, the rest of the numerals are masked. However, they can almost invariably remember where each number was.
It is impossible for people to do the same cognitive task that quickly, said Tetsuro Matsuzawa, a primatologist at Kyoto University. “They have a better working memory than us,” he told the American Association for the Advancement of Science meeting in Boston.
Professor Matsuzawa carried out the memory experiments on a female chimp called Ai, whose name means “love” in Japanese, and on her son Ayumu, who was born in 2000 and has shown even better memory skills, he said.
Professor Matsuzawa suggested that chimps have developed this part of their memory because they live in the “here and now” whereas humans are thinking more about the past and planning for the future.
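The structure of the masking task is simple enough to sketch in Python. This is an illustrative toy, assuming a 5×8 grid and nine digits; the actual display parameters are not given above.

```python
import random

def make_display(n_digits=9, grid=(5, 8)):
    """Scatter the digits 1..n_digits over random, distinct grid cells,
    as in the randomly distributed on-screen numerals."""
    cells = [(r, c) for r in range(grid[0]) for c in range(grid[1])]
    positions = random.sample(cells, n_digits)
    return dict(zip(range(1, n_digits + 1), positions))

def play(display, recalled_positions):
    """Score a trial: once '1' is pressed the remaining digits are masked,
    so the subject must touch remembered positions in ascending order."""
    correct_order = [display[d] for d in sorted(display)]
    return recalled_positions == correct_order

display = make_display()
# A perfect subject (like Ayumu on most trials) reproduces the sequence:
perfect = [display[d] for d in sorted(display)]
print(play(display, perfect))  # True
```

The striking finding is not the task itself but that chimps solve it after a glance lasting a fraction of a second, while humans given the same display fail at that exposure time.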

Filed under primates memory working memory cognitive tasks psychology neuroscience science

61 notes

Hypothalamic control of energy balance: insights into the role of synaptic plasticity
The past 20 years witnessed an enormous leap in understanding of the central regulation of whole-body energy metabolism. Genetic tools have enabled identification of the region-specific expression of peripheral metabolic hormone receptors and have identified neuronal circuits that mediate the action of these hormones on behavior and peripheral tissue functions. One of the surprising findings of recent years is the observation that brain circuits involved in metabolism regulation remain plastic through adulthood. In this review, we discuss these findings and focus on the role of neurons and glial cells in the dynamic process of plasticity, which is fundamental to the regulation of physiological and pathological metabolic events.

Filed under energy metabolism neuronal circuits plasticity neurons glial cells neuroscience science

56 notes

Low-protein diet slows Alzheimer’s in mice

Mice with many of the pathologies of Alzheimer’s Disease showed fewer signs of the disease when given a protein-restricted diet supplemented with specific amino acids every other week for four months.

Mice at advanced stages of the disease were put on the new diet. They showed improved cognitive abilities over their non-dieting peers when their memory was tested using mazes. In addition, fewer of their neurons contained abnormal levels of a damaged protein, called “tau,” which accumulates in the brains of Alzheimer’s patients.

Dietary protein is the major dietary regulator of a growth hormone known as IGF-1, which has been associated with aging and diseases in mice and several diseases in older adults.

Upcoming studies by USC Professor Valter Longo, the study’s corresponding author, will attempt to determine whether humans respond similarly – while simultaneously examining the effects of dietary restrictions on cancer, diabetes and cardiac disease.

"We had previously shown that humans deficient in Growth Hormone receptor and IGF-I displayed reduced incidence of cancer and diabetes. Although the new study is in mice, it raises the possibility that low protein intake and low IGF-I may also protect from age-dependent neurodegeneration," said Longo, who directs the Longevity Institute of the USC Davis School of Gerontology and holds a joint appointment at the USC Dornsife College of Letters, Arts and Sciences.

Longo worked with Pinchas Cohen, dean of the USC Davis School, as well as USC graduate students Edoardo Parrella, Tom Maxim, Lu Zhang, Junxiang Wan and Min Wei; Francesca Maialetti of the Istituto Superiore di Sanità in Rome; and Luigi Fontana of Washington University in St. Louis.

"Alzheimer’s Disease and other forms of neurodegeneration are a major burden on society, and it is a rising priority for this nation to develop new approaches for preventing and treating these conditions, since the frequencies of these disorders will be rising as the population ages over the next several decades," said Cohen, who became dean of the School of Gerontology in summer 2012. "New strategies to address this, particularly non-invasive, non-pharmacological approaches such as tested in Dr. Longo’s study are particularly exciting."

The results of their study were published online by Aging Cell last month.

The team found that a protein-restricted diet reduced levels of IGF-1 circulating through the body by 30 to 70 percent, and caused an eight-fold increase in a protein that blocks IGF-1’s effects by binding to it.

IGF-1 helps the body grow during youth but is also associated with several diseases later in life in both mice and humans. Exploring dietary solutions to those diseases as opposed to generating pharmaceuticals to manipulate IGF-1 directly allows Longo’s team to make strides that could help sufferers today or in the next few years.

"We always try to do things for people who have the problem now," Longo said. "Developing a drug can take 15 years of trials and a billion dollars.

"Although only clinical trials can determine whether the protein-restricted diet is effective and safe in humans with cognitive impairment, a doctor could read this study today and, if his or her patient did not have any other viable options, could consider introducing the protein restriction cycles in the treatment – understanding that effective interventions in mice may not translate into effective human therapies," he said.

Many elderly individuals may already be frail, may have lost weight, or may not be healthy enough to eat a protein-restricted diet every other week. Longo strongly insisted that any dieting be monitored by a doctor or registered dietitian to make sure that patients do not become amino acid deficient, lose additional weight or develop other side effects.

(Source: eurekalert.org)

Filed under dietary protein aging neurodegeneration alzheimer's disease tau protein neuroscience science

53 notes

Threat bias interacts with combat, gene to boost PTSD risk
Soldiers preoccupied with threat at the time of enlistment, or with avoiding it just before deployment, were more likely to develop post-traumatic stress disorder (PTSD) in a study of Israeli infantrymen. Such pre-deployment threat vigilance and avoidance, interacting with combat experience and an emotion-related gene, accounted for more than a third of PTSD symptoms that emerged later, say National Institutes of Health scientists, who conducted the study in collaboration with American and Israeli colleagues.
“Since biased attention predicted future risk for PTSD, computerized training that helps modify such attention biases might help protect soldiers from the disorder,” said Daniel Pine, M.D., of the NIH’s National Institute of Mental Health (NIMH).
Pine, Yair Bar-Haim, Ph.D., of Tel Aviv University, and colleagues reported their findings Feb. 13, 2013, in the journal JAMA Psychiatry.

Filed under PTSD anxiety attention serotonin genes threat bias neuroscience science

44 notes

Parkinson’s patients advised to seek Deep Brain Stimulation treatment in early stages
People with Parkinson’s disease who receive Deep Brain Stimulation (DBS) therapy in the early stages of the condition benefit from a significant increase in quality of life, a study published in The New England Journal of Medicine has found.
World-leading neurologist and lead clinician Professor Peter Silburn from the Asia-Pacific Centre for Neuromodulation (APCN), a joint initiative of The University of Queensland (UQ) and St Andrew’s Hospital, said the results published today in the medical journal would transform the way we treat people with Parkinson’s disease.
“Before the release of this study, a typical patient with Parkinson’s disease would need to wait around 10 years or until their motor complications could no longer be treated successfully with medicine alone, before DBS surgery was considered an option,” Professor Silburn said.
“This study has confirmed the best medical practice for a person with Parkinson’s disease is to perform DBS surgery around 4 to 7 years into the condition, as opposed to waiting until the medications stop working.”

Filed under neuromodulation deep brain stimulation parkinson's disease neuroscience science

52 notes

Limits on Brain’s Ability to Perceive Multifeatured Objects

New research sheds light on how the brain encodes objects with multiple features, a fundamental task for the perceptual system. The study, published in Psychological Science, a journal of the Association for Psychological Science, suggests that we have limited ability to perceive mixed color-shape associations among objects that exist in several locations.

Research suggests that neurons that encode a certain feature — shape or color, for example — fire in synchrony with neurons that encode other features of the same object. Psychological scientists Liat Goldfarb of the University of Haifa and Anne Treisman of Princeton University hypothesized that if this neural-synchrony explanation were true, then synchrony would be impossible in situations in which the same features are paired differently in different objects.

Say, for example, a person sees a string of letters, “XOOX,” and the letters are printed in alternating colors, red and green. Both letter shape and letter color need to be encoded, but the associations between letter shape and letter color are mixed (i.e., the first X is red, while the second X is green), which should make neural synchrony impossible.

“The perceptual system can either know how many Xs there are or how many reds there are, but it cannot know both at the same time,” Goldfarb and Treisman explain.

The researchers investigated their hypothesis in two experiments, in which they presented participants with strings of green and red Xs and Os and asked them to compare the number of Xs with the number of red letters (i.e., more Xs, more reds, or the same).
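The stimulus logic of these experiments can be sketched in Python. This is an illustrative reconstruction, not the published procedure: the display length, response coding, and helper names below are assumptions for the example.

```python
import random

SHAPES, COLORS = "XO", ("red", "green")

def make_display(length=4):
    """Generate a random string of red/green Xs and Os."""
    return [(random.choice(SHAPES), random.choice(COLORS))
            for _ in range(length)]

def association(display):
    """'unique' if each shape appears in only one color, else 'mixed'.
    Mixed pairings are the ones predicted to defeat neural synchrony."""
    colors_per_shape = {s: {c for sh, c in display if sh == s}
                        for s in SHAPES}
    ok = all(len(cs) <= 1 for cs in colors_per_shape.values())
    return "unique" if ok else "mixed"

def correct_response(display):
    """The participants' task: compare the number of Xs with the
    number of red letters."""
    n_x = sum(1 for s, _ in display if s == "X")
    n_red = sum(1 for _, c in display if c == "red")
    if n_x > n_red:
        return "more Xs"
    return "more reds" if n_red > n_x else "same"

# The mixed display from the example above: X O O X in red/green/red/green
xoox = [("X", "red"), ("O", "green"), ("O", "red"), ("X", "green")]
print(association(xoox), correct_response(xoox))  # mixed same
```

The finding was that displays classified here as "mixed" drew slower, less accurate responses than "unique" ones, even though the counting task is identical in both cases.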

Participants’ responses to unique color-shape associations were significantly faster and more accurate than were their responses to displays with mixed color-shape associations.

The results show that relevant color and shape dimensions could be synchronized when the pairings between color and shape were unique, but not when the pairings were mixed.

These findings demonstrate a new behavioral principle that governs object representation. When shapes are repeated in several locations and have mixed color-shape associations, they are hard to perceive.

This research expands on Anne Treisman’s groundbreaking research on feature integration in visual perception, which shows that humans can encode characteristics such as color, form, and orientation, even in the absence of spatial attention.

Treisman is one of 12 scientists who received the National Medal of Science at the White House on February 1, 2013. The National Medal of Science, along with the National Medal of Technology and Innovation, is the highest honor that the US government grants to scientists, engineers, and inventors.

Filed under visual perception neural synchrony neurons brain psychology neuroscience science
