Neuroscience

Articles and news from the latest research reports.

Mapping the Brain
Freiburg Researchers Use Signals from Natural Movements to Identify Brain Regions
Whether we run to catch a bus or reach for a pen, activities that involve the use of muscles are tied to very specific areas of the brain. Traditionally, their exact location could only be determined through electrical stimulation or unnatural experimental tasks. A team of scientists in Freiburg has now succeeded for the first time in mapping the brain’s surface using measurements of everyday movements.
Attributing abilities to specific brain regions and identifying pathological areas is especially important in the treatment of epilepsy, as severe cases require the removal of neural tissue. Until now, such “mapping” involved stimulating individual regions of the brain’s surface with electric currents and observing the resulting reaction or sensation. Alternatively, patients were asked to perform the same movements again and again until physicians isolated the corresponding patterns of brain activity. However, these methods required the patient to cooperate and to provide detailed answers to the physicians’ questions, a prerequisite that small children or patients with impaired mental abilities can hardly meet. Hence the need for other strategies.
Scientists from the group of Dr. Tonio Ball at the Cluster of Excellence “BrainLinks-BrainTools” and the Bernstein Center Freiburg report in the current issue of NeuroImage that the brain’s natural activity during everyday movements can also be used to reliably identify the regions responsible for arm and leg movements.
The researchers examined data from epilepsy patients who had electrodes implanted under their skull prior to surgery. Using video recordings, the team captured the spontaneous movements of their patients, searching for concurrent signals of a certain frequency in the data gathered on the surface of the brain. They succeeded in creating a map of the brain’s surface for arm and leg movements that is as accurate as those created through established experimental methods.
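The frequency-based search described above can be sketched in miniature: for each electrode, compare signal power in a movement-related frequency band during movement epochs versus rest epochs. This is a simplified illustration on synthetic data, not the authors’ actual pipeline; the sampling rate, band limits (60–90 Hz is a common choice for such intracranial signals), and the threshold ratio are all assumptions.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Power of `signal` in the [lo, hi] Hz band, via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].sum()

def movement_related(move_epochs, rest_epochs, fs=500, lo=60, hi=90, ratio=1.5):
    """Flag an electrode as movement-related if its mean band power
    during movement exceeds its rest-time band power by `ratio`."""
    p_move = np.mean([band_power(e, fs, lo, hi) for e in move_epochs])
    p_rest = np.mean([band_power(e, fs, lo, hi) for e in rest_epochs])
    return bool(p_move > ratio * p_rest)

# Synthetic example: one electrode with extra 70 Hz power during movement.
rng = np.random.default_rng(0)
t = np.arange(0, 1, 1 / 500)                       # 1 s epochs at 500 Hz
rest = [rng.normal(size=t.size) for _ in range(20)]
move = [rng.normal(size=t.size) + 2 * np.sin(2 * np.pi * 70 * t) for _ in range(20)]
print(movement_related(move, rest))  # True: the 70 Hz component dominates the band
```

Repeating this comparison across all implanted electrodes yields exactly the kind of surface map the article describes, with movement-related electrodes clustering over motor areas.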
The researchers also hope to gain new insights into how the brain controls movement, as their method allows them to explore all manner of behaviors and is no longer limited to experimental conditions. Last but not least, the scientists expect this new way of analyzing brain signals to contribute to the development of brain-machine interfaces suitable for daily use.

Filed under brain mapping brain regions motor cortex electrocortical stimulation mapping epilepsy neuroscience science

Weird: Nuclear Bomb Tests Reveal Adults Grow New Brain Cells
Aboveground nuclear bomb testing in the 1950s and 1960s inadvertently gave modern scientists a way to prove the adult brain regularly creates new neurons, research reveals.
Researchers used to believe that the brain changed little once it finished maturing. That view is now considered out of date, as studies have revealed how changeable — or plastic — the adult brain can be.
Much of this plasticity is related to the brain’s organization; brain cells can alter their connections and communications with other brain cells. What has been less clear is whether, and to what extent, the human brain grows brand-new neurons in adulthood.
"There was a lot in the literature showing there was neurogenesis in rodents and every animal studied," said study researcher Kirsty Spalding, a biologist at the Karolinska Institute in Sweden. "But there was very little evidence of whether this happens in humans."
Tantalizing clues
Scientists had reason to believe it does. In adult mice, the hippocampus, a structure deep in the brain involved in memory and navigation, turns over cells all the time. Some of the biological markers linked to this turnover are seen in the human hippocampus. But the only direct evidence of new brain cells forming in the region came from a 1998 study in which researchers looked at the brains of five people who had been injected with a compound called BrdU that cells take up into their DNA. (The compound was once used in experimental cancer studies, but is not used anymore for safety reasons.)
The BrdU study revealed that neurons in the hippocampuses of the participants contained the compound in their DNA, indicating these brain cells had formed after the injections. The oldest person in the study was 72, suggesting new neuron creation, known as neurogenesis, continues well into old age.
The 1998 study was the only direct evidence of such neurogenesis in the human hippocampus, however. Spalding and her colleagues wanted to change that. Ten years ago, they began a project to track the age of neurons in the human brain using an unusual tool: spare molecules left over from Cold War-era nuclear bomb tests.
Learning to love the bomb
Between 1945 and 1962, the United States conducted hundreds of aboveground nuclear bomb tests. These tests largely stopped with the Limited Test Ban Treaty of 1963, but their effects remained in the atmosphere. The neutrons sent flying by the bombs reacted with nitrogen in the atmosphere, creating a spike in carbon 14, an isotope (or variation) of carbon.
This carbon 14, in turn, did what carbon in the atmosphere does. It combined with oxygen to form carbon dioxide, and was then taken in by plants, which use carbon dioxide in photosynthesis. Humans ate some of these plants, along with some of the animals that also ate these plants, and the carbon 14 inside ended up in their bodies.
When a cell divides, it incorporates this carbon 14 into the DNA of the newly forming cells. Atmospheric carbon 14 levels have fallen at a known rate since the test ban, so scientists can match the carbon 14 concentration in a cell’s DNA against the atmospheric record and pinpoint when the cell was born.
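In practice, bomb-pulse dating matches the carbon 14 level in a cell’s DNA against the declining post-1963 arm of the atmospheric curve. The sketch below shows the idea; the curve values are invented placeholders, not real atmospheric measurements, which the actual method relies on.

```python
import numpy as np

# Hypothetical atmospheric carbon-14 levels relative to the pre-bomb baseline.
# These points are illustrative only; real analyses use measured records.
years  = np.array([1955, 1963, 1970, 1980, 1990, 2000, 2010])
levels = np.array([1.00, 1.90, 1.55, 1.25, 1.15, 1.08, 1.04])

def birth_year(dna_c14_level):
    """Date a cell by interpolating its DNA carbon-14 level on the
    declining (monotonic) post-1963 arm of the atmospheric curve."""
    post = slice(1, None)  # 1963 onward, where levels fall monotonically
    # np.interp requires increasing x values, so reverse both arrays
    return float(np.interp(dna_c14_level, levels[post][::-1], years[post][::-1]))

print(birth_year(1.25))  # 1980.0: this level sits exactly on the 1980 point
```

Because the bomb spike makes the curve steep and unambiguous on its falling side, even small differences in DNA carbon 14 translate into usable birth-date estimates.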
Over the past decade, Spalding and her colleagues have used the technique in a variety of cells, including fat cells, refining it along the way until it became sensitive enough to measure tiny amounts of carbon 14 in small hippocampus samples. The researchers collected samples, with family permission, from autopsies in Sweden.
They found the tantalizing 1998 evidence was correct: Human hippocampuses do grow new neurons. In fact, about a third of the brain region is subject to cell turnover, with about 700 new neurons being formed each day in each hippocampus (humans have two, a mirror-image set on either side of the brain). Hippocampus neurons die each day, too, keeping the overall number more or less in balance, with some slow loss of cells with aging, Spalding said.
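Taking the article’s figures at face value, the daily rate implies a sizeable annual turnover; a quick back-of-the-envelope multiplication:

```python
new_per_day_per_hippocampus = 700  # daily rate reported in the study
days_per_year = 365

per_year_one_side = new_per_day_per_hippocampus * days_per_year
per_year_both_sides = 2 * per_year_one_side  # humans have two hippocampi

print(per_year_one_side)    # 255500 new neurons per hippocampus per year
print(per_year_both_sides)  # 511000 across both hippocampi
```

Roughly a quarter of a million new neurons per hippocampus per year, offset by a comparable rate of cell death, is what keeps the overall count approximately stable.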
This turnover occurs at a ridge in the hippocampus known as the dentate gyrus, a spot known to contribute to the formation of new memories. Researchers aren’t sure what the function of this constant renewal is, but it could relate to allowing the brain to cope with novel situations, Spalding told LiveScience.
"Neurogenesis gives a particular kind of plasticity to the brain, a cognitive flexibility," she said.
Spalding and her colleagues had used the same techniques in other regions of the brain, including the cortex, the cerebellum and the olfactory bulb, and found no evidence of newborn neurons being integrated into those areas. The researchers now plan to study whether there are any links between neurogenesis and psychiatric conditions such as depression.
The new findings are detailed in the journal Cell.

Filed under adult brain neurogenesis cognitive function neurons nuclear bomb hippocampus memory neuroscience science

Rapid, Irregular Heartbeat May Be Linked to Problems with Memory and Thinking 
People who develop a type of irregular heartbeat common in old age called atrial fibrillation may also be more likely to develop problems with memory and thinking, according to new research published in the June 5, 2013, online issue of Neurology®, the medical journal of the American Academy of Neurology.
“Problems with memory and thinking are common for people as they get older. Our study shows that on average, problems with memory and thinking may start earlier or get worse more quickly in people who have atrial fibrillation,” said study author Evan L. Thacker, PhD, of the University of Alabama at Birmingham. “This means that heart health is an important factor related to brain health.”
The study involved people age 65 and older from four communities in the United States who were enrolled in the Cardiovascular Health Study. Participants did not have a history of atrial fibrillation or stroke at the start of the study. They were followed for an average of seven years, and received a 100-point memory and thinking test every year. People who had a stroke were not included in this analysis after the stroke. Of the 5,150 participants, 552, or about 11 percent, developed atrial fibrillation during the study.
The study found that people with atrial fibrillation were more likely to experience lower memory and thinking scores at earlier ages than people with no history of atrial fibrillation. For example, from age 80 to age 85 the average score on the 100-point test went down by about 6 points for people without atrial fibrillation, but it went down by about 10 points for people with atrial fibrillation.
For participants ages 75 and older, the average rate of decline was about three to four points faster per five years of aging with atrial fibrillation compared to those without the condition.
“This suggests that on average, people with atrial fibrillation may be more likely to develop cognitive impairment or dementia at earlier ages than people with no history of atrial fibrillation,” Thacker said.
Thacker noted that scores below 78 points on the 100-point test are suggestive of dementia. People without atrial fibrillation in the study were predicted on average to score below 78 points at age 87, while people with atrial fibrillation were predicted to score below 78 points at age 85, two years earlier.
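The two-year gap follows from simple linear extrapolation. In the sketch below, the starting score at age 80 is a hypothetical value chosen to make the arithmetic concrete; only the slopes (roughly 6 versus 10 points lost per 5 years) come from the article.

```python
def crossing_age(score_at_80, points_lost_per_year, threshold=78.0):
    """Age at which a linearly declining score falls to `threshold`,
    starting from `score_at_80` at age 80."""
    return 80 + (score_at_80 - threshold) / points_lost_per_year

# Slopes from the article: ~6 vs ~10 points lost per 5 years of aging.
no_af = crossing_age(score_at_80=86.5, points_lost_per_year=6 / 5)
af    = crossing_age(score_at_80=86.5, points_lost_per_year=10 / 5)
print(round(no_af, 1), round(af, 1))  # with these hypothetical inputs,
                                      # atrial fibrillation crosses ~2.8 years earlier
```

With these made-up starting conditions the model reproduces a gap of a few years, the same order as the two-year difference the study reports.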
“If there is indeed a link between atrial fibrillation and memory and thinking decline, the next steps are to learn why that decline happens and how we can prevent that decline,” said Thacker.

Filed under atrial fibrillation cognitive decline cognition irregular heartbeat medicine neuroscience science

Targeting an aspect of Down syndrome
University of Michigan researchers have determined how a gene that is known to be defective in Down syndrome is regulated and how its dysregulation may lead to neurological defects, providing insights into potential therapeutic approaches to an aspect of the syndrome.
Normally, nerve cells called neurons undergo an intense period of extending and branching of neuronal protrusions around the time of birth. During this period, the neurons produce high levels of the protein encoded by the gene Down syndrome cell-adhesion molecule, or Dscam. After this phase, both the growth and the protein levels taper off.
However, in the brains of patients with Down syndrome, epilepsy and several other neurological disorders, the amount of Dscam remains high. The impact of elevated Dscam levels on how neurons develop was unknown.
Bing Ye, a faculty member at U-M’s Life Sciences Institute, found that in the fruit fly Drosophila, the amount of Dscam proteins in a neuron determines the size to which a neuron extends its protrusions before it forms connections with other nerve cells. An overproduction of Dscam proteins leads to abnormally large neuronal protrusions.
Ye also identified two molecular pathways that converge to regulate the abundance of Dscam. One, dual leucine zipper kinase (DLK), which is involved in nerve regeneration, promotes the synthesis of Dscam proteins. Another, fragile X mental retardation protein (FMRP), which causes fragile X syndrome when defective, represses Dscam protein synthesis. Because humans share these genes with Drosophila, the DLK-FMRP-Dscam relationship presents a possible target for therapeutic intervention, Ye said.
Many genes are involved in neurological disorders like Down syndrome, and how molecular defects cause the disease is complex.
"But because of the important roles of Dscam in the development of neurons, its related defect is very likely to be an aspect of Down syndrome and it may be an aspect of the syndrome that can be treated," said Ye, an assistant professor in the Department of Cell and Developmental Biology at the U-M Medical School.
Ye’s next step is to test the effects of overexpression of Dscam in mice to see how it changes the development of the nervous system and the behavior of the animal.
Down syndrome occurs in about one in 830 newborns; an estimated 250,000 people in the U.S. have the condition, according to the National Library of Medicine’s Genetics Home Reference.

Filed under fruit flies nerve cells nerve regeneration down syndrome dscam proteins fragile X syndrome neuroscience science

These scientists are ‘itching’ to help you stop scratching
Itch and scratch, itch and scratch. It’s not the most serious physical problem in our lives, but it is common and very annoying. Now, researchers at the Hebrew University of Jerusalem and in Boston have come up with new findings that can stop the itching by silencing the neurons that transmit itch-generating stimuli.
The research was a collaborative effort by a group led by Dr. Alex Binshtok at the Hebrew University’s Department of Medical Neurobiology at the Institute of Medical Research Israel-Canada, and the Edmond & Lily Safra Center for Brain Sciences; along with Dr. Clifford Woolf’s group in the Boston Children’s Hospital and Harvard Medical School.
The study demonstrated the presence of functionally distinct sets of neurons that detect and transmit itch-generating stimuli. The researchers further demonstrated that they could selectively target and silence those itch-generating neurons while the neurons are active. These results provide a basis for developing novel therapeutic approaches for the selective treatment of previously untreatable itch not induced by histamine (non-histaminergic itch), such as dry skin itch and allergic dermatitis.
(Histaminergic itch is brought on when histamine triggers an inflammatory immune response to foreign agents, such as occurs, for example, in hay fever.)
The findings of the Israeli-US researchers were published in the journal Nature Neuroscience. In addition to the senior researchers, student major contributors to the project were Sagi Gudes and Felix Blasl from the Hebrew University; and David Roberson and Jared Sprague from Harvard Medical School.
Itch is a complex, unpleasant, cutaneous sensation that in some respects resembles pain, yet is different in terms of its intrinsic sensory quality and the urge to scratch. Although some types of itch like urticaria (hives) could be effectively treated with anti-histaminergic agents, itch accompanying most chronic itch-inducing diseases, including atopic dermatitis (eczema), allergic itch and dry skin itch, is not predominantly induced by histamine. An understanding of the molecular and cellular mechanisms underlying the sensation of itch, therefore, is essential for the development of effective and selective treatment of itch, which in some cases could become a devastating condition, say the researchers.
The researchers’ findings suggest that the primary itch-generating neurons that carry messages toward the central nervous system encode functionally distinct histaminergic and non-histaminergic itch pathways that can be selectively blocked. This is the first time this has been demonstrated, and it means that it is possible to block itch signals in the neurons that mediate non-histamine itch.
These findings have great clinical importance, since they could be translated into novel, selective and effective therapies for previously largely untreated dry skin itch and allergic dermatitis itch.

Filed under itch sensory neurons histamine neuroscience science

Study Expands Concerns About Anesthesia’s Impact on the Brain
As pediatric specialists become increasingly aware that surgical anesthesia may have lasting effects on the developing brains of young children, new research suggests the threat may also apply to adult brains.
Researchers from Cincinnati Children’s Hospital Medical Center report June 5 in the Annals of Neurology that testing in laboratory mice shows anesthesia’s neurotoxic effects depend on the age of brain neurons – not the age of the animal undergoing anesthesia, as once thought.
Although more research is needed to confirm the study’s relevance to humans, the study suggests possible health implications for millions of children and adults who undergo surgical anesthesia annually, according to Andreas Loepke, MD, PhD, a physician and researcher in the Department of Anesthesiology.
“We demonstrate that anesthesia-induced cell death in neurons is not limited to the immature brain, as previously believed,” said Loepke. “Instead, vulnerability seems to target neurons of a certain age and maturational stage. This finding brings us a step closer to understanding the phenomenon’s underlying mechanism.”
New neurons are generated abundantly in most regions of the very young brain, explaining why previous research has focused on that developmental stage. In a mature brain, neuron formation slows considerably, but it extends into later life in the dentate gyrus and olfactory bulb.
The dentate gyrus, which helps control learning and memory, is the region Loepke and his research colleagues paid particular attention to in their study. Also collaborating were researchers from the University of Cincinnati College of Medicine and the Children’s Hospital of Fudan University, Shanghai, China.
Researchers exposed newborn, juvenile and young adult mice to a widely used anesthetic called isoflurane in doses approximating those used in surgical practice. Newborn mice exhibited widespread neuronal loss in forebrain structures – confirming previous research – with no significant impact on the dentate gyrus. However, the effect in juvenile mice was reversed, with minimal neuronal impact in the forebrain regions and significant cell death in the dentate gyrus.
The team then performed extensive studies to discover that age and maturational stage of the affected neurons were the defining characteristics for vulnerability to anesthesia-induced neuronal cell death. The researchers observed similar results in young adult mice as well.
Research over the past 10 years has made it increasingly clear that commonly used anesthetics increase brain cell death in developing animals, raising concerns from the Food and Drug Administration, clinicians, neuroscientists and the public. In addition, several follow-up studies in children and adults who have undergone surgical anesthesia show a link to learning and memory impairment.
Cautioning against immediate application of the current study’s findings to children and adults undergoing anesthesia, Loepke said his research team is trying to learn enough about anesthesia’s impact on brain chemistry to develop protective therapeutic strategies, in case they are needed. To this end, their next step is to identify specific molecular processes triggered by anesthesia that lead to brain cell death.
“Surgery is often vital to save lives or maintain quality of life and usually cannot be performed without general anesthesia,” Loepke said. “Physicians should carefully discuss with patients, parents and caretakers the risks and benefits of procedures requiring anesthetics, as well as the known risks of not treating certain conditions.”
Loepke is also collaborating with researchers from the Pediatric Neuroimaging Research Consortium at Cincinnati Children’s Hospital Medical Center to examine anesthesia’s impact on children’s brains using non-invasive magnetic resonance imaging (MRI) technology.

Filed under anesthesia neurons cell death apoptosis dentate gyrus neurology neuroscience science

115 notes

Neurochemical Traffic Signals May Open New Avenues for the Treatment of Schizophrenia
Researchers at Boston University School of Medicine (BUSM) have uncovered important clues about a biochemical pathway in the brain that may one day expand treatment options for schizophrenia. The study, published online in the journal Molecular Pharmacology, was led by faculty within the department of pharmacology and experimental therapeutics at BUSM.
Patients with schizophrenia suffer from a life-long condition that can produce delusions, disordered thinking, and breaks with reality. A number of treatments are available for schizophrenia, but many patients do not respond to these therapies or experience side effects that limit their use.
This research focused on key components of the brain known as NMDA receptors. These receptors are located on nerve cells in the brain and serve as biochemical gates that allow calcium ions (electrical charges) to enter the cell when a neurotransmitter, such as glutamate, binds to the receptor. Proper activation of these receptors is critical for sensory perception, memory and learning, including the transfer of short-term memory into long-term storage. Patients with schizophrenia have poorly functioning or “hypoactive” NMDA receptors, suggesting the possibility of treatment with drugs that positively affect these receptors. Currently the only way to enhance NMDA receptor function is through the use of agents called agonists that directly bind to the receptor on the outer surface of the cell, opening the gates to calcium ions outside the cell.
In this study, the researchers discovered a novel “non-canonical” pathway in which NMDA receptors residing inside the cell are stimulated by a neuroactive steroid to migrate to the cell surface (a process known as trafficking), thus increasing the number of receptors available for glutamate activation. The researchers treated neural cells from the cerebral cortex with the novel steroid pregnenolone sulfate (PregS) and found that the number of working NMDA receptors on the cell surface increased by 60 to 100 percent within 10 minutes. The exact mechanism by which this occurs is not completely clear, but it appears that PregS increases calcium ions within the cell, which in turn produces a green light signal for more frequent trafficking of NMDA receptors to the cell surface.
Although still in the early stages, further research in this area may be instrumental in the development of treatments not only for schizophrenia, but also for other conditions associated with malfunctioning NMDA receptors, such as age-related decreases in memory and learning ability.

Filed under schizophrenia NMDA receptors nerve cells calcium ions glutamate trafficking neuroscience science

156 notes

Pioneering Study Demonstrates Benefit of Imaging Technique in Identifying Mental Illness
MRI may be an effective way to diagnose mental illnesses such as bipolar disorder, according to experts from the Icahn School of Medicine at Mount Sinai. In a landmark study using advanced techniques, the researchers were able to correctly distinguish bipolar patients from healthy individuals based on their brain scans alone. The data are published in the journal Psychological Medicine.
Currently, most mental illnesses are diagnosed based on symptoms only, creating an urgent need for new approaches to diagnosis. In bipolar disorder, there may be a significant delay in diagnosis due to the complex clinical presentation of the illness. In this study, Sophia Frangou, MD, Professor of Psychiatry and Chief of the Psychosis Research Program at the Icahn School of Medicine at Mount Sinai, teamed up with Andy Simmons, MD, of King’s College London, and Janaina Mourao-Miranda, MD, of University College London, to explore whether brain imaging could help correctly identify patients with bipolar disorder.
“Bipolar disorder affects patients’ ability to regulate their emotions successfully, which puts them at great disadvantage in their lives,” said Dr. Frangou. “The situation is made worse by unacceptably long delays, sometimes of up to 10 years, in making the correct diagnosis. Bipolar disorder may easily be misdiagnosed as other disorders, such as depression or schizophrenia. This is why bipolar disorder ranks among the top ten disorders causing significant disability worldwide.”
Dr. Frangou and her team used MRI to scan the brains of people with bipolar disorder and of healthy individuals. Using advanced computational models, they correctly distinguished people with bipolar disorder from healthy individuals with 73 percent accuracy using brain imaging scans alone. They replicated the finding in a separate group of patients and healthy individuals, with 72 percent accuracy.
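To illustrate the kind of train-then-replicate analysis described above (not the authors' actual pipeline), one can fit a simple classifier on imaging-derived feature vectors and measure accuracy on held-out subjects; the data, group sizes, and nearest-centroid model below are purely synthetic stand-ins:

```python
# Hypothetical sketch: classifying synthetic "scan" feature vectors into
# patient vs. control groups, with accuracy measured on held-out subjects.
# This is NOT the study's method, only an illustration of the design.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic features: 100 voxel-like values per subject, with a small
# mean shift distinguishing the two (invented) groups of 40 subjects.
controls = rng.normal(0.0, 1.0, size=(40, 100))
patients = rng.normal(0.3, 1.0, size=(40, 100))
X = np.vstack([controls, patients])
y = np.array([0] * 40 + [1] * 40)

# Split into a training set and a separate held-out set, mirroring the
# train-then-replicate design described in the article.
idx = rng.permutation(80)
train, test = idx[:60], idx[60:]

# Nearest-centroid classifier: assign each held-out subject to the
# closer of the two group means learned from the training data.
mu0 = X[train][y[train] == 0].mean(axis=0)
mu1 = X[train][y[train] == 1].mean(axis=0)
d0 = np.linalg.norm(X[test] - mu0, axis=1)
d1 = np.linalg.norm(X[test] - mu1, axis=1)
pred = (d1 < d0).astype(int)

accuracy = (pred == y[test]).mean()
print(f"held-out accuracy: {accuracy:.2f}")
```

The key point the sketch captures is that accuracy is always reported on subjects the model never saw during training, which is what makes figures like 73 percent meaningful.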
Dr. Simmons added, “The level of accuracy we achieved is comparable to that of many other tests used in medicine. Additionally, brain scanning is very acceptable to patients as most people consider it a routine diagnostic test.”
“This approach does not undermine the importance of rigorous clinical assessment and the importance of building relationships with patients but provides biological justification for the type of diagnosis made,” said Dr. Frangou. “However, diagnostic imaging for psychiatry is still under investigation and not ready for widespread use. Nonetheless, our results together with those from other labs are a harbinger of a major shift in the way we approach diagnosis in psychiatry.”

Filed under bipolar depression bipolar disorder neuroimaging MRI mental health psychology neuroscience science

119 notes

New technique for deep brain stimulation surgery proves accurate and safe
Surgery has been used for Parkinson’s disease and familial tremors, and also shows promise for other disorders
The surgeon who more than two decades ago pioneered deep brain stimulation surgery in the United States to treat people with Parkinson’s disease and other movement disorders has now developed a new way to perform the surgery — which allows for more accurate placement of the brain electrodes and likely is safer for patients.
The success and safety of the new surgical technique could have broad implications for deep brain stimulation, or DBS, surgery into the future, as it may increasingly be used to help with a wide range of medical issues beyond Parkinson’s disease and familial tremors.
The new surgery also offers another distinct advantage: patients are asleep during the surgery, rather than being awake under local anesthesia to help surgeons determine placement of the electrodes as happens with the traditional DBS surgery.
A study detailing the new surgical technique is being published in the June 2013 edition of the Journal of Neurosurgery, and has been published online at the journal’s website.
“I think this will be how DBS surgery will be done in most cases going forward,” said Kim Burchiel, M.D., F.A.C.S., chair of neurological surgery at Oregon Health & Science University and the lead author of the Journal of Neurosurgery article. “This surgery allows for extremely accurate placement of the electrodes and it’s safer. Plus patients don’t need to be awake during this surgery — which will mean many more patients who can be helped by this surgery will now be willing to consider it.”
DBS surgery was first developed in France in 1987. Burchiel was the first surgeon in North America to perform the surgery, as part of a Food and Drug Administration-approved clinical trial in 1991.
The FDA approved the surgery for “essential tremor” in 1997 and for tremors associated with Parkinson’s disease in 2002. The surgery has been performed tens of thousands of times over the last decade or so in the United States, most often for familial tremor and Parkinson’s disease. Burchiel and his team at OHSU have performed the surgery more than 750 times.
The surgery involves implanting very thin wire electrodes in the brain, connected to something like a pacemaker implanted in the chest. The system then stimulates the brain to often significantly reduce the tremors.
For most of the last two decades, the DBS patient was required to be awake during surgery, to allow surgeons to determine through monitoring the patient’s symptoms and getting other conscious patient feedback whether the electrodes were placed in the right spots in the brain.
But the traditional form of the surgery had drawbacks. Many patients who might have benefited weren’t willing to undergo the sometimes 4- to 6-hour surgery while awake. There is also a small chance of hemorrhaging in the brain as the surgeon places or moves the electrodes to the right spot.
The new technique uses advances in brain imaging in recent years to place the electrodes more safely, and more accurately, than in traditional DBS surgery. The surgical team uses CT scanning during the surgery itself, along with an MRI of the patient’s brain before the surgery, to precisely place the electrodes in the brain, while better ensuring no hemorrhaging or complications from the insertion of the electrode.
The Journal of Neurosurgery article reported on 60 patients who had the surgery at OHSU over an 18-month period beginning in early 2011.
“What our results say is that it’s safe, that we had no hemorrhaging or complications at all — and the accuracy of the electrode placement is the best ever reported,” Burchiel said.
Burchiel and his team have done another 140 or so surgeries with the new procedure since enrollment in the study ended. OHSU was the first center to pioneer the new DBS procedure, but other surgical teams across the U.S. are learning the technique at OHSU, and bringing it back to their own centers.
The positive results with the new DBS technique could have ramifications as medical researchers nationwide continue to explore possible new uses for DBS surgery. DBS surgery has shown promising results in clinical trials with some Alzheimer’s patients, with some forms of depression and even with obesity.
If the early promising results for these conditions are confirmed, the number of people who might be candidates for DBS surgery could expand greatly, Burchiel said.
The length of the new surgery for the 60 patients in the study was slightly longer than traditional DBS surgery. But as Burchiel and his team have refined the technique, the new DBS surgeries have become much shorter, often taking half the time of the traditional approach. Given that, and that the electrodes are placed more accurately and the surgery is cheaper to perform, the new DBS surgery will likely become the technique most surgeons use in coming years, Burchiel said.
DBS surgery often helps significantly reduce tremors in patients with familial tremor and tremors and other symptoms in Parkinson’s disease. A parallel study is ongoing at OHSU to assess how symptoms of the patients have improved since their DBS surgery using this new method.
(Image: Dr Frank Gaillard)

Filed under deep brain stimulation parkinson's disease neuroimaging medicine neuroscience science

114 notes

Helicopter takes to the skies with the power of thought

A remote controlled helicopter has been flown through a series of hoops around a college gymnasium in Minnesota.

It sounds like your everyday student project; however, there is one caveat: the helicopter was controlled using just the power of thought.

The experiments were performed by researchers hoping to develop future robots that can help restore autonomy to people who are paralysed or suffering from neurodegenerative disorders.

Their study has been published today, 4 June 2013, in IOP Publishing’s Journal of Neural Engineering and is accompanied by a video of the helicopter control in action. 

Five subjects (three female, two male) took part in the study, and each was able to control the four-rotor helicopter, known as a quadcopter, quickly and accurately for a sustained period.

Lead author of the study Professor Bin He, from the University of Minnesota College of Science and Engineering, said: “Our study shows that for the first time, humans are able to control the flight of flying robots using just their thoughts, sensed from noninvasive brain waves.”

The noninvasive technique used was electroencephalography (EEG), which recorded the electrical activity of the subjects’ brain through a cap fitted with 64 electrodes.

Facing away from the quadcopter, the subjects were asked to imagine using their right hand, left hand, or both hands together, which instructed the quadcopter to turn right, turn left, or rise, respectively; relaxing allowed it to descend. The quadcopter was driven at a pre-set forward velocity and steered through the sky with the subject’s thoughts.
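At its core, the control scheme amounts to a small lookup from decoded motor-imagery classes to flight commands. The class labels and command names below are invented for illustration and are not from the published system; in particular, mapping a rest state to descent is an assumption:

```python
# Hypothetical sketch of the imagery-to-command mapping described above.
# Class labels and command names are illustrative only, not from the paper.

COMMANDS = {
    "imagine_right_hand": "turn_right",
    "imagine_left_hand": "turn_left",
    "imagine_both_hands": "ascend",
    "imagine_rest": "descend",  # assumption: relaxing lets the craft fall
}

def decode_to_command(imagery_class: str) -> str:
    """Translate a decoded EEG imagery class into a flight command.

    The quadcopter flies forward at a fixed preset velocity; the decoded
    class only steers it or changes its altitude.
    """
    return COMMANDS.get(imagery_class, "hold")  # unknown class: no change

print(decode_to_command("imagine_both_hands"))  # ascend
```

Keeping the forward velocity fixed means the decoder only has to resolve a handful of discrete classes, which is what makes noninvasive EEG, with its limited signal-to-noise ratio, workable for real-time flight.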

The subjects were positioned in front of a screen which relayed images of the quadcopter’s flight through an on-board camera, allowing them to see which direction it was travelling in. Brain signals were recorded by the cap and sent to the quadcopter over WiFi.

“In previous work we showed that humans could control a virtual helicopter using just their thoughts. I initially intended to use a small helicopter for this real-life study; however, the quadcopter is more stable, smooth and has fewer safety concerns,” continued Professor He.

After several different training sessions, the subjects were required to fly the quadcopter through two foam rings suspended from the gymnasium ceiling and were scored on three aspects: the number of times they sent the quadcopter through the rings; the number of times the quadcopter collided with the rings; and the number of times they went outside the experiment boundary.
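The three performance measures above amount to tallying discrete flight events. A minimal event counter, with event names invented for illustration, might look like:

```python
# Hypothetical sketch: tallying the three performance measures named above
# from a log of discrete flight events. Event names are illustrative only.
from collections import Counter

def score_flight(events):
    """Count ring passes, ring collisions, and boundary violations."""
    tally = Counter(events)
    return {
        "ring_passes": tally["pass_ring"],
        "ring_collisions": tally["hit_ring"],
        "boundary_exits": tally["out_of_bounds"],
    }

log = ["pass_ring", "hit_ring", "pass_ring", "out_of_bounds", "pass_ring"]
print(score_flight(log))
```

For the log above this reports three ring passes, one collision, and one boundary exit; the study's statistical tests would then compare such per-subject tallies across conditions.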

A number of statistical tests were used to calculate how each subject performed.

A group of subjects also directed the quadcopter with a keyboard in a control experiment, allowing for a comparison between a standardised method and brain control.

This process is just one example of a brain–computer interface where a direct pathway between the brain and an external device is created to help assist, augment or repair human cognitive or sensory-motor functions; researchers are currently looking at ways to restore hearing, sight and movement using this approach.

“Our next goal is to control robotic arms using noninvasive brain wave signals, with the eventual goal of developing brain–computer interfaces that aid patients with disabilities or neurodegenerative disorders,” continued Professor He.

Filed under neurodegenerative diseases quadcopter brainwaves EEG BCI robotics neuroscience science
