Neuroscience

Articles and news from the latest research reports.

169 notes

Action video games boost reading skills
Much to the chagrin of parents who think their kids should spend less time playing video games and more time studying, time spent playing action video games can actually make dyslexic children read better. In fact, 12 hours of video game play did more for reading skills than is normally achieved with a year of spontaneous reading development or demanding traditional reading treatments.
The evidence, appearing in the Cell Press journal Current Biology on February 28, follows from earlier work by the same team linking dyslexia to early problems with visual attention rather than language skills.
"Action video games enhance many aspects of visual attention, mainly improving the extraction of information from the environment," said Andrea Facoetti of the University of Padua and the Scientific Institute Medea of Bosisio Parini in Italy. "Dyslexic children learned to orient and focus their attention more efficiently to extract the relevant information of a written word more rapidly."
The findings come as further support for the notion that visual attention deficits are at the root of dyslexia, a condition that makes reading extremely difficult for one out of every ten children, Facoetti added. He emphasized that there is, as of now, no approved treatment for dyslexia that includes video games.
Facoetti’s team, including Sandro Franceschini, Simone Gori, Milena Ruffino, Simona Viola, and Massimo Molteni, tested the reading, phonological, and attentional skills of two groups of children with dyslexia before and after they played action or non-action video games for nine 80-minute sessions. The action video gamers were able to read faster without losing accuracy. They also showed gains in other tests of attention.
"These results are very important in order to understand the brain mechanisms underlying dyslexia, but they don’t put us in a position to recommend playing video games without any control or supervision," Facoetti said.
Still, there is great hope for early interventions that could be applied in low-resource settings. “Our study paves the way for new remediation programs, based on scientific results, that can reduce the dyslexia symptoms and even prevent dyslexia when applied to children at risk for dyslexia before they learn to read.”
And, guess what? Those kids will also be having fun.

Filed under reading reading development dyslexia visual attention video games neuroscience psychology science

113 notes

Authors: Develop digital games to improve brain function and well-being
Neuroscientists should help to develop compelling digital games that boost brain function and improve well-being, say two professors specializing in the field in a commentary article published in the science journal Nature.
In the Feb. 28 issue, the two — Daphne Bavelier of the University of Rochester and Richard J. Davidson of the University of Wisconsin-Madison — urge game designers and brain scientists to work together to design new games that train the brain, producing positive effects on behavior, such as decreasing anxiety, sharpening attention and improving empathy. Already, some video games are designed to treat depression and to encourage cancer patients to stick with treatment, the authors note.
Davidson is founder and chair of the Center for Investigating Healthy Minds at the UW’s Waisman Center. Bavelier is a professor in the Department of Brain and Cognitive Sciences at Rochester.
Video game usage, which continues to rise among American children, has been associated with a number of negative outcomes, such as obesity, aggressiveness, antisocial behavior and, in extreme cases, addiction. “At the same time, evidence is mounting that playing games can have beneficial effects on the brain,” the authors write.
Last year, Bavelier and Davidson presided over a meeting at the White House in which neuroscientists met with entertainment media experts to discuss ways of using interactive technology such as video games to further understanding of brain functions, as well as to provide new, engaging tools for boosting attention and well-being.
Bavelier’s work is focused on how humans learn and how the brain adapts to changes in experience, either by nature (as in deafness) or by training (such as playing video games). Her lab investigates how new media, including video games, can be leveraged to foster learning and brain plasticity.
Davidson, who studies emotion and the brain, is leading a project in collaboration with UW-Madison’s Games + Learning + Society to develop two video games designed to help middle school students develop social and emotional skills, such as empathy, cooperation, mental focus and self-regulation.
"Gradually, this work will begin to document the burning social question of how technology is having an impact on our brains and our lives, and enable us to make evidence-based choices about the technologies of the future, to produce a new set of tools to cultivate positive habits of mind," the authors conclude.

Filed under brain brain function gaming digital games anxiety empathy neuroscience science

98 notes

Ectopic Eyes Function Without Connection to Brain
For the first time, scientists have shown that transplanted eyes located far outside the head in a vertebrate animal model can confer vision without a direct neural connection to the brain.
Biologists at Tufts University School of Arts and Sciences used a frog model to shed new light – literally – on one of the major questions in regenerative medicine, bioengineering, and sensory augmentation research.
"One of the big challenges is to understand how the brain and body adapt to large changes in organization," says Douglas J. Blackiston, Ph.D., first author of the paper "Ectopic Eyes Outside the Head in Xenopus Tadpoles Provide Sensory Data For Light-Mediated Learning," in the February 27 issue of the Journal of Experimental Biology. “Here, our research reveals the brain’s remarkable ability, or plasticity, to process visual data coming from misplaced eyes, even when they are located far from the head.”
Blackiston is a post-doctoral associate in the laboratory of co-author Michael Levin, Ph.D., professor of biology and director of the Center for Regenerative and Developmental Biology at Tufts University.
Levin notes, “A primary goal in medicine is to one day be able to restore the function of damaged or missing sensory structures through the use of biological or artificial replacement components. There are many implications of this study, but the primary one from a medical standpoint is that we may not need to make specific connections to the brain when treating sensory disorders such as blindness.”
In this experiment, the team surgically removed donor embryo eye primordia, marked with fluorescent proteins, and grafted them into the posterior region of recipient embryos. This induced the growth of ectopic eyes. The recipients’ natural eyes were removed, leaving only the ectopic eyes.
Fluorescence microscopy revealed various innervation patterns but none of the animals developed nerves that connected the ectopic eyes to the brain or cranial region.
To determine if the ectopic eyes conveyed visual information, the team developed a computer-controlled visual training system in which quadrants of water were illuminated by either red or blue LED lights. The system could administer a mild electric shock to tadpoles swimming in a particular quadrant. A motion tracking system outfitted with a camera and a computer program allowed the scientists to monitor and record the tadpoles’ motion and speed.
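The conditioning protocol described above can be sketched as a simple control loop: poll each animal's position, and pair a mild shock with whichever light color is being trained against. The quadrant layout, polling scheme, and function names below are illustrative assumptions, not details of the published apparatus.

```python
import random

RED, BLUE = "red", "blue"

def assign_lights():
    """Randomly light two quadrants red and two blue (illustrative)."""
    colors = [RED, RED, BLUE, BLUE]
    random.shuffle(colors)
    return colors  # index = quadrant 0..3

def training_step(quadrant_of, tadpoles, lights):
    """One polling cycle: shock any tadpole sitting in a red-lit quadrant."""
    shocked = []
    for t in tadpoles:
        q = quadrant_of(t)       # position from the motion tracker
        if lights[q] == RED:
            shocked.append(t)    # mild shock paired with the red light
    return shocked
```

A learned response would show up as animals spending measurably less time in red-lit quadrants across sessions, which is what the tracking system was built to quantify.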
Eyes See Without Wiring to Brain
The team made exciting discoveries: Just over 19 percent of the animals with optic nerves that connected to the spine demonstrated learned responses to the lights. They swam away from the red light while the blue light stimulated natural movement.
Their response to the lights elicited during the experiments was no different from that of a control group of tadpoles with natural eyes intact. Furthermore, this response was not demonstrated by eyeless tadpoles or tadpoles that did not receive any electrical shock.
"This has never been shown before," says Levin. "No one would have guessed that eyes on the flank of a tadpole could see, especially when wired only to the spinal cord and not the brain."The findings suggest a remarkable plasticity in the brain’s ability to incorporate signals from various body regions into behavioral programs that had evolved with a specific and different body plan.
"Ectopic eyes performed visual function," says Blackiston. "The brain recognized visual data from eyes that impinged on the spinal cord. We still need to determine if this plasticity in vertebrate brains extends to different ectopic organs or organs appropriate in different species."
One of the most fascinating areas for future investigation, according to Blackiston and Levin, is the question of exactly how the brain recognizes that the electrical signals coming from tissue near the gut are to be interpreted as visual data.
In computer engineering, notes Levin, who majored in computer science and biology as a Tufts undergraduate, this problem is usually solved by a “header”—a piece of metadata attached to a packet of information that indicates its source and type. Whether electric signals from eyes impinging on the spinal cord carry such an identifier of their origin remains a hypothesis to be tested.
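Levin's "header" analogy can be made concrete with a toy packet format: a few bytes of metadata tag the payload with its source and data type, so the receiver knows how to interpret what follows. The field layout and constants here are invented for illustration.

```python
import struct

# Invented identifiers for the sake of the example
SOURCE_ECTOPIC_EYE = 1
TYPE_VISUAL = 2

def pack(source, dtype, payload: bytes) -> bytes:
    """Prepend a 4-byte header: 1-byte source id, 1-byte data type,
    2-byte payload length (big-endian)."""
    return struct.pack(">BBH", source, dtype, len(payload)) + payload

def unpack(packet: bytes):
    """Read the header back, then slice out exactly the declared payload."""
    source, dtype, length = struct.unpack(">BBH", packet[:4])
    return source, dtype, packet[4:4 + length]
```

The open biological question is whether neural signals carry anything playing this role, or whether the brain infers the "type" of an input purely from its statistics.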

Filed under animal model visual system brain plasticity ectopic eyes regenerative medicine neuroscience science

88 notes

Researchers identify brain pathway triggering impulsive eating
New research from the University of Georgia has identified the neural pathways in an insect brain tied to eating for pleasure, a discovery that sheds light on the parallel impulsive-eating pathways in the human brain.
"We know when insects are hungry, they eat more, become aggressive and are willing to do more work to get the food," said Ping Shen, a UGA associate professor of cellular biology in the Franklin College of Arts and Sciences. "Little is known about the other half, the reward-driven feeding behavior, when the animal is not so hungry but they still get excited about food when they smell something great.
"The fact that a relatively lower animal, a fly larva, actually does this impulsive feeding based on a rewarding cue was a surprise."
The research team led by Shen, who also is a member of the Biomedical and Health Sciences Institute, found that presenting fed fruit fly larvae with appetizing odors caused impulsive feeding of sugar-rich foods. The findings, published Feb. 28 in a Cell Press journal, suggest eating for pleasure is an ancient behavior and that fly larvae can be used in studying neurobiology and the evolution of olfactory reward-driven impulses.
To test reward-driven behaviors in flies, Shen introduced appetizing odors to groups of well-fed larvae. In every case, the fed larvae consumed about 30 percent more food when surrounded by the attractive odors.
But when the insects were offered a substandard meal, they refused to eat it.
"They have expectations," he said. "If we reduce the concentration of sugar below a threshold, they do not respond anymore. Similar to what you see in humans, if you approach a beautiful piece of cake and you taste it and determine it is old and horrible, you are no longer interested."
Shen’s team also tried to further define this phenomenon: the connection between excitement and expectation. He found that when the larvae were presented with a brief odor, they were willing to act on the impulse for about 15 minutes.
"After 15 minutes, they revert back to normal. You get excited, but you can’t stay excited forever, so there is a mechanism to shut it down," he said.
His work also suggests the neuropeptides, or brain chemicals acting as signaling molecules triggering impulsive eating, are consistent between flies and humans. Neurons receive and convert stimuli into thoughts that are then relayed to the downstream mechanism telling the animals to act. These signaling molecules are required for this impulse, suggesting the molecular details of these functions are evolutionarily tied between flies and humans.
"There are hyper-rewarding cues that humans and flies have evolved to perceive, and they connect this perception with behavior performance," Shen said. "As long as this is activated, the animal will eat food. In this way, the brain is stupid: It does not know how it gets activated. In this case, the fly says ‘I smell something, I want to do this.’ This kind of connection has been established very early on, probably before the divergence of fly and human. That is why we both have it."
Impulsive and reward-driven behaviors are still poorly understood, partly because of the complex systems at work in human brains. Fly larvae's nervous systems are, in scheme and organization, very similar to those of adult flies and of mammals, but with fewer neurons and less complex wiring.
"A particular function in the brain of mammals may require a large cluster of neurons," he said. "In flies, it may be only one or four. They are simpler in number but not principle."
In the fly model, four neurons are responsible for relaying signals from the olfactory center to the brain to stimulate action. Each odor and receptor translates the response slightly differently. Human triggers are obviously more diverse, but Shen thinks the mechanism to appreciate the combination is likely the same. He is now working with Tianming Liu, assistant professor of computer science at UGA and member of the Bioimaging Research Center and Institute of Bioinformatics, on a computer model to determine how these odors are interpreted as stimuli.
"Dieting is difficult, especially in the environment of these beautiful foods," Shen said. "It is very hard to control this impulsive urge. So, if we understand how this compulsive eating behavior comes about, we maybe can devise a way, at least for the behavioral aspect, to prevent it. We can modulate our behaviors better or use chemical interventions to calm down these cues."

Filed under brain fly larva impulsive eating insects neuropeptides evolution neuroscience science

124 notes

‘Rain Man’-like Brains Mapped with Network Analysis

Innovative Technique Sheds Light on Abnormal Brain Connectivity Responsible for Common Genetic Cause of Autism

A group of researchers at UC San Francisco and UC Berkeley has mapped the three-dimensional global connections within the brains of seven adults who have genetic malformations that leave them without the corpus callosum, which connects the left and right sides of the brain.

These “structural connectome” maps, which combine hospital MRIs with the mathematical tool known as network analysis, are described in the upcoming April 15 issue of the journal Neuroimage. They reveal new details about the condition known as agenesis of the corpus callosum, which is one of the top genetic causes of autism. The condition was part of the mysterious brain physiology of Laurence Kim Peek, the remarkable savant portrayed by Dustin Hoffman in the 1987 movie “Rain Man.”

While some people born with agenesis of the corpus callosum are of normal intelligence and do not have any obvious signs of neurologic disease, approximately 40 percent of people with the condition are at high risk for autism. Given this, the work is a step toward finding better ways to image the brains of people with the condition, said Pratik Mukherjee, MD, PhD, a professor of radiology and biomedical imaging at UCSF who was the co-senior author of the research.

Understanding how brain connectivity varies from person to person may help researchers identify imaging biomarkers for autism to help diagnose it and manage care for individuals. Currently autism is diagnosed and assessed based on cognitive tests, such as those involving stacking blocks and looking at pictures on flip cards.

While the new work falls short of a quantitative measure doctors could use instead of cognitive testing, it does offer a proof-of-principle that this novel technique may shed light on neurodevelopment disorders.

“Because you are looking at the whole brain at the network level, you can do new types of analysis to find what’s abnormal,” Mukherjee said.

The Connection between the Brain Hemispheres and Autism

Agenesis of the corpus callosum can arise if individuals are born missing DNA from chromosome 16 and often leads to autism.

Scientists have long puzzled over what the link is between this disorder and the autistic brain, said co-senior author of the paper Elliott Sherr, MD, PhD, professor of neurology and genetics, especially since not all people with this malformation develop autism.

Doctors believe this is because the brain has a rich capacity for rewiring in alternative ways.

Pursuing this question, Mukherjee and Sherr turned to MRI and the mathematical technique of network analysis, which has long supported fields like civil engineering, helping urban planners optimize the timing of traffic lights to speed traffic. This is the first time network analysis has been applied to brain mapping for a genetic cause of autism.

The brain offers a far more complicated challenge for analysis because, unlike the streets of a given city, the brain has tens of billions of neurons, many of which make tens of thousands of connections to one another, making its level of connectivity highly complex.
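In the connectome setting, network analysis treats brain regions as nodes and fiber tracts as edges, then summarizes the graph with metrics such as node degree and shortest path length. A minimal sketch of this idea, using a made-up five-region graph rather than any real imaging data:

```python
from collections import deque

# Toy "structural connectome": regions as nodes, tracts as undirected edges.
adj = {
    "A": {"B", "C"},
    "B": {"A", "C", "D"},
    "C": {"A", "B", "E"},
    "D": {"B"},
    "E": {"C"},
}

def degree(g):
    """Number of tracts touching each region."""
    return {n: len(nb) for n, nb in g.items()}

def shortest_path_len(g, src, dst):
    """Hop count between two regions, via breadth-first search."""
    seen, frontier = {src}, deque([(src, 0)])
    while frontier:
        node, d = frontier.popleft()
        if node == dst:
            return d
        for nb in g[node]:
            if nb not in seen:
                seen.add(nb)
                frontier.append((nb, d + 1))
    return None  # regions are disconnected
```

Comparing such metrics between groups — for example, patients versus controls — is the kind of analysis the study performed, at much larger scale and with tract strengths estimated from diffusion MRI.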

By comparing the seven “Rain Man”-like brains to those of 11 people without this malformation, the scientists found that particular structures called the cingulate bundles were smaller and that the neurons within these bundles were less connected to others in the brain. They also found that the network topology of the brain was more variable in people with agenesis of the corpus callosum than in people without the malformation.

Filed under brain AgCC corpus callosum connectome autism Kim Peek network analysis neuroscience science

46 notes

Novel wireless brain sensor

A team of neuroengineers based at Brown University has developed a fully implantable and rechargeable wireless brain sensor capable of relaying real-time broadband signals from up to 100 neurons in freely moving subjects. Several copies of the novel low-power device, described in the Journal of Neural Engineering, have been performing well in animal models for more than a year, a first in the brain-computer interface field. Brain-computer interfaces could help people with severe paralysis control devices with their thoughts.

Arto Nurmikko, professor of engineering at Brown University who oversaw the device’s invention, is presenting it this week at the 2013 International Workshop on Clinical Brain-Machine Interface Systems in Houston.

“This has features that are somewhat akin to a cell phone, except the conversation that is being sent out is the brain talking wirelessly,” Nurmikko said.

Neuroscientists can use such a device to observe, record, and analyze the signals emitted by scores of neurons in particular parts of the animal model’s brain.

Meanwhile, wired systems using similar implantable sensing electrodes are being investigated in brain-computer interface research to assess the feasibility of people with severe paralysis moving assistive devices like robotic arms or computer cursors by thinking about moving their arms and hands.

“This wireless system addresses a major need for the next step in providing a practical brain-computer interface,” said neuroscientist John Donoghue, the Wriston Professor of Neuroscience at Brown University and director of the Brown Institute for Brain Science.

Tightly packed technology

In the device, a pill-sized chip of electrodes implanted on the cortex sends signals through uniquely designed electrical connections into the device’s laser-welded, hermetically sealed titanium “can.” The can measures 2.2 inches (56 mm) long, 1.65 inches (42 mm) wide, and 0.35 inches (9 mm) thick. That small volume houses an entire signal processing system: a lithium ion battery, ultralow-power integrated circuits designed at Brown for signal processing and conversion, wireless radio and infrared transmitters, and a copper coil for recharging — a “brain radio.” All the wireless and charging signals pass through an electromagnetically transparent sapphire window.

In all, the device looks like a miniature sardine can with a porthole.

But what the team has packed inside makes it a major advance among brain-machine interfaces, said lead author David Borton, a former Brown graduate student and postdoctoral research associate who is now at the Ecole Polytechnique Federale de Lausanne (EPFL) in Switzerland.

“What makes the achievement discussed in this paper unique is how it integrated many individual innovations into a complete system with potential for neuroscientific gain greater than the sum of its parts,” Borton said. “Most importantly, we show the first fully implanted microsystem operated wirelessly for more than 12 months in large animal models — a milestone for potential [human] clinical translation.”

The device transmits data at 24 Mbps via 3.2 and 3.8 GHz microwave frequencies to an external receiver. After a two-hour charge, delivered wirelessly through the scalp via induction, it can operate for more than six hours.

“The device uses less than 100 milliwatts of power, a key figure of merit,” Nurmikko said.
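For a rough sense of what those figures imply, here is a back-of-envelope calculation (an illustration that assumes the device draws its full 100 mW budget for the full six hours of streaming; the derived numbers are not from the paper):

```python
# Back-of-envelope figures implied by the reported specs (assumption:
# the device draws its full 100 mW budget while streaming at 24 Mbps).
POWER_W = 0.100          # "less than 100 milliwatts," per Nurmikko
DATA_RATE_BPS = 24e6     # 24 Mbps microwave link
RUNTIME_S = 6 * 3600     # "more than six hours" per charge

energy_per_charge_j = POWER_W * RUNTIME_S           # joules drawn per session
data_per_charge_bits = DATA_RATE_BPS * RUNTIME_S    # bits streamed per session
energy_per_bit_nj = POWER_W / DATA_RATE_BPS * 1e9   # nanojoules per bit

print(f"{energy_per_charge_j:.0f} J per charge")              # 2160 J
print(f"{data_per_charge_bits / 8 / 1e9:.1f} GB per charge")  # ~64.8 GB
print(f"{energy_per_bit_nj:.2f} nJ/bit")                      # ~4.17 nJ/bit
```

At roughly 4 nJ per transmitted bit, the power figure of merit Nurmikko cites is what makes a six-hour session on a small implanted battery plausible.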

Co-author Ming Yin, a Brown postdoctoral scholar and electrical engineer, said one of the major challenges the team overcame in building the device was optimizing its performance under the requirements that the implanted device be small, low-power, and leak-proof, potentially for decades.

“We tried to make the best tradeoff between the critical specifications of the device, such as power consumption, noise performance, wireless bandwidth and operational range,” Yin said. “Another major challenge we encountered was to integrate and assemble all the electronics of the device into a miniaturized package that provides long-term hermeticity (water-proofing) and biocompatibility as well as transparency to the wireless data, power, and on-off switch signals.”

With early contributions by electrical engineer William Patterson at Brown, Yin helped to design the custom chips for converting neural signals into digital data. The conversion has to be done within the device, because brain signals are not produced in the ones and zeros of computer data.
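That on-device conversion step can be sketched with a toy quantizer (illustrative only; the full-scale voltage and bit depth below are invented for the example, not the Brown chip's specifications):

```python
def quantize(voltage_v, full_scale_v=0.001, bits=12):
    """Toy analog-to-digital conversion: map a voltage in
    [-full_scale, +full_scale] volts to a signed integer code.
    Out-of-range inputs are clipped, as in a real ADC."""
    levels = 2 ** (bits - 1) - 1          # 2047 for 12 bits
    clipped = max(-full_scale_v, min(full_scale_v, voltage_v))
    return round(clipped / full_scale_v * levels)

# A simulated 200-microvolt spike peak becomes a 12-bit integer code:
print(quantize(200e-6))  # -> 409
```

Each electrode's analog voltage trace is sampled and reduced to integer codes like this before the radio ever sees it, which is why the conversion circuitry must live inside the sealed can.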

Ample applications

The team worked closely with neurosurgeons to implant the device in three pigs and three rhesus macaque monkeys. The research in these six animals has been helping scientists better observe complex neural signals for as long as 16 months so far. In the new paper, the team shows some of the rich neural signals they have been able to record in the lab. Ultimately this could translate to significant advances that can also inform human neuroscience.

Current wired systems constrain the actions of research subjects, Nurmikko said. The value of wireless transmission is that it frees subjects to move however they intend, allowing them to produce a wider variety of more realistic behaviors. With a cabled sensor, for instance, neuroscientists cannot study how neural circuits plan, execute, and strategize during behaviors like running or foraging.

In the experiments in the new paper, the device is connected to one array of 100 cortical electrodes, the microscale individual neural listening posts, but the new device design allows for multiple arrays to be connected, Nurmikko said. That would allow scientists to observe ensembles of neurons in multiple related areas of a brain network.

The new wireless device is not approved for use in humans and is not used in clinical trials of brain-computer interfaces. It was designed, however, with that translational motivation.

“This was conceived very much in concert with the larger BrainGate* team, including neurosurgeons and neurologists giving us advice as to what were appropriate strategies for eventual clinical applications,” said Nurmikko, who is also affiliated with the Brown Institute for Brain Science.

Borton is now spearheading the development of a collaboration between EPFL and Brown to use a version of the device to study the role of the motor cortex in an animal model of Parkinson’s disease.

Meanwhile the Brown team is continuing work on advancing the device for even larger amounts of neural data transmission, reducing its size even further, and improving other aspects of the device’s safety and reliability so that it can someday be considered for clinical application in people with movement disabilities.

(Source: news.brown.edu)

Filed under brain brain-computer interface BCI electrodes wireless brain sensor movement disabilities implants neuroscience science

152 notes

Brain-to-brain interface allows transmission of tactile and motor information between rats
Researchers have electronically linked the brains of pairs of rats for the first time, enabling them to communicate directly to solve simple behavioral puzzles. A further test of this work successfully linked the brains of two animals thousands of miles apart—one in Durham, N.C., and one in Natal, Brazil.
The results of these projects suggest the future potential for linking multiple brains to form what the research team is calling an “organic computer,” which could allow sharing of motor and sensory information among groups of animals. The study was published Feb. 28, 2013, in the journal Scientific Reports.
"Our previous studies with brain-machine interfaces had convinced us that the rat brain was much more plastic than we had previously thought," said Miguel Nicolelis, M.D., PhD, lead author of the publication and professor of neurobiology at Duke University School of Medicine. "In those experiments, the rat brain was able to adapt easily to accept input from devices outside the body and even learn how to process invisible infrared light generated by an artificial sensor. So, the question we asked was, ‘if the brain could assimilate signals from artificial sensors, could it also assimilate information input from sensors from a different body?’"
To test this hypothesis, the researchers first trained pairs of rats to solve a simple problem: to press the correct lever when an indicator light above the lever switched on, which rewarded the rats with a sip of water. They next connected the two animals’ brains via arrays of microelectrodes inserted into the area of the cortex that processes motor information.

Filed under brain activity electrical stimulation cortex behavioral decision neuroscience science

102 notes

Researchers Identify Possible Treatment Window for Memory Problems
Researchers have identified a possible treatment window for plaques in the brain that are thought to cause memory loss in diseases such as Alzheimer’s, according to a new study published in the February 27, 2013, online issue of Neurology®, the medical journal of the American Academy of Neurology.
“Our study suggests that plaques in the brain that are linked to a decline in memory and thinking abilities, called beta amyloid, take about 15 years to build up and then plateau,” said Clifford R. Jack, Jr., MD, with the Mayo Clinic in Rochester, Minn.
For the study, 260 people between the ages of 70 and 92 underwent two or more brain scans over an average of 1.3 years that measured plaque buildup in the brain. Of the participants, 78 percent did not have impaired thinking abilities or memory at the start of the study.
The study found that the rate of buildup accelerates initially, then slows before plateauing at high levels. For example, lower rates of plaque buildup were found in people who had either low or high levels of the plaques at the start of the study, while the rate of plaque accumulation was highest in participants with mid-range levels at the start.
The study also found that the rate of buildup of plaques was more closely tied to the total amount of amyloid plaques in the brain than other risk factors, such as the level of cognitive impairment, age and the presence of the APOE gene, a gene linked to Alzheimer’s disease.
“Our results suggest that there is a long treatment window where medications may be able to help slow buildup of the amyloid plaques that are linked to cognitive decline,” said Jack. “On the other hand, trying to treat the plaque buildup after the amyloid plaque load has plateaued may not do much good.”
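The trajectory the study describes, slow growth at low plaque levels, fastest accumulation at mid-range levels, and a plateau at high levels, is the signature of logistic growth. Here is a minimal sketch (an illustrative model consistent with the reported pattern, not the study's fitted curve; the rate and plateau parameters are invented):

```python
def logistic_rate(amyloid, growth=1.0, plateau=100.0):
    """Rate of plaque accumulation under a simple logistic model:
    dA/dt = r * A * (1 - A / K), where K is the plateau level.
    The rate is maximal at the mid-range load A = K / 2."""
    return growth * amyloid * (1 - amyloid / plateau)

# Accumulation rate at low, mid-range, and high plaque loads:
rates = {level: logistic_rate(level) for level in (10, 50, 90)}
print(rates)  # mid-range load (50) accumulates fastest
```

Under such a model, the treatment-window argument follows directly: an anti-amyloid drug has the most buildup to slow during the steep mid-range phase, and little left to prevent once the curve has flattened.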

Filed under alzheimer disease amyloid plaques memory memory loss cognitive decline neuroscience science

96 notes

Infant brains imply adult ills: Researchers study traits in babies as young as two weeks
Brain images from newborns are giving scientists a glimpse of the future: not just into the lives of their tiny subjects, but also toward paths to treatment for adult patients with schizophrenia and Alzheimer’s disease.
Researchers from the University of North Carolina-Chapel Hill found degeneration in the brains of 2-week-old infants, a result considered a “game changer” for the field of brain research, said Jay Giedd, a brain imaging specialist for the National Institute of Mental Health.
"Our original model was that the brain was fine until someone got the illness," Giedd said. "This work shows that these changes are there probably from conception. It also suggests that while these traits don’t cause brain damage, they set up the brain to be slightly different."
The researchers examined scans of 272 newborns. About 15 percent were found to have smaller medial temporal lobe sections. “The medial temporal lobe plays an important role in memory,” said Rebecca Knickmeyer, a UNC assistant professor of psychiatry and co-author of the research, published last month in Cerebral Cortex, an online journal.
"The idea is that this is an anatomical vulnerability. If you start out with less, you might hit active symptoms earlier in life."
The researchers also found specific gene traits associated with Alzheimer’s in babies with the smaller medial temporal lobes.
"We were interested because it was generally known that people’s genes contribute to psychiatric conditions later in life, but pretty much all the existing studies were in adults," Knickmeyer said. "Our question was ‘When were these genes exerting their effect?’ Now we know it’s much earlier than previously thought, perhaps before birth."
Research such as this would benefit from the Brain Activity Map under development through the National Institutes of Health. The project’s 10-year goal is to create a map of the brain’s nearly 30,000 genes as well as the circuitry system that transmits information via brain waves.
President Obama mentioned the project in his State of the Union address and is expected to include funding for the project in the upcoming federal budget. Foundations and some private companies have also expressed interest in assisting in the project, which is expected to push brain research to a higher level.
"As brain scientists, we were giddy to hear this," Giedd said. "Motivation is sky high. If they fund this, we believe our work will really take off." Giedd, who is familiar with but did not participate in the infant brain study, said the search for treatments or cures for diseases such as Alzheimer’s, autism, schizophrenia and Parkinson’s disease have been stymied by the many mysteries that remain regarding how the brain functions.
"If we understood more about the mechanisms that cause these diseases, we could step in and do something about it," Giedd said. "The brain is so complicated. Most diseases don’t just involve one or two or even three genes. It might be 60 or 100 genes, along with upbringing, diet and environment. There are so many parameters to the equation."
Knickmeyer said her research team plans to follow up with the newborns as they grow into adulthood to see whether the traits displayed by infants change over time or remain stable throughout their lives.
Daniel Kaufer, cognitive neurology and memory disorders chief for UNC’s Department of Neurology, said he thinks the time is right for great advances in brain research.
"We are at the crossroads of two important events: the realization that brain disorders may occur long before symptoms begin, and the development of brain imaging technology to record brain processes," Kaufer said.
Learning more about the brain’s functions through gene mapping may be the third piece of the puzzle. “Right now, there is no map of the human brain,” said Murali Doraiswamy, professor of psychiatry and behavioral sciences at Duke University School of Medicine.
Doraiswamy said the brain carries thousands of genes that influence thought, perception, emotion, memory and other mental activities. “We want to find out how much is nature and how much is nurture,” he added. “I think we are at the forefront of something very insightful, but also a little frightening.”
MAPPING A NEW WORLD
The Brain Activity Map is being planned as a decade-long research effort to create a comprehensive outline of the structure of the human brain and its neurons.
Funding is expected to come from a variety of sources, including the federal government, private industry and research foundations.
Details of the project have not yet been made public. But it is being compared to the DNA sequencing effort known as the Human Genome Project, which ran from 1990 to 2003 and cost $3.8 billion.

Filed under infants neurodegeneration medial temporal lobe memory alzheimer's disease neuroscience science

85 notes

Research reveals Huntington’s hope
Researchers in Scotland and Germany have discovered a molecular mechanism that shows promise for developing a cure for Huntington’s Disease (HD).
Scientists from the University of Dundee, the German Center for Neurodegenerative Diseases (DZNE) in Bonn, the Max-Planck Institute for Molecular Genetics in Berlin and the Johannes Gutenberg-Universität Mainz have found a mechanism that specifically drives the synthesis of the disease-causing protein in HD patients.
Their data lead to the conclusion that a selective overproduction of the aberrant huntingtin protein in patients is a key step in the establishment of the disease, which affects 1 in 10,000 people in Western countries and is so far incurable.
"This is a very promising strategy to develop a small molecule drug therapy that is able to inhibit the production of disease-making protein," said Professor Susann Schweiger of the University of Dundee and Johannes Gutenberg-Universität Mainz.
"Theoretically, if you don’t have the disease-making protein then you don’t have the disease. Obviously we still have work to do to develop a drug to target these mechanisms and inhibit the production of this protein but we think this research is attractive to drug discovery and ongoing work in this area is being carried out."
The gene responsible for causing Huntington’s Disease was first identified in 1993, leading to hopes that a specific therapy for HD would soon be on the market. However, the cell biology and brain pathology of HD proved more complicated than originally anticipated, and only symptomatic treatments that slightly relieve individual components of the disease are currently available.
The new discovery once again raises hopes that a curative therapy can be established. The scientists found that it was mainly three proteins - the mammalian target of rapamycin (mTOR), protein phosphatase 2A (PP2A) and Midline 1 (MID1) - that specifically drive the production of disease-making protein in HD patients.
As a result, more and more aberrant protein is produced over time, leading to a protein overload in the cell. By interfering with the function of these three proteins, it is possible to disrupt this cycle and prevent the synthesis of the aberrant protein in HD patients.
The Dundee-Germany research is published in the latest edition of the Nature Communications journal.

Filed under neurodegenerative diseases proteins huntington's disease drug discovery neuroscience science
