Neuroscience

Articles and news from the latest research reports.

Posts tagged neuroscience

Robot-Delivered Speech and Physical Therapy
In one of the earliest experiments using a humanoid robot to deliver speech and physical therapy to a stroke patient, researchers at the University of Massachusetts Amherst saw notable speech and physical therapy gains and significant improvement in quality of life.
Regarding the overall outcome, speech language pathologist and study leader Yu-kyong Choe says, “It’s clear from our study of a 72-year-old male stroke client that a personal humanoid robot can help people recover by delivering therapy such as word-retrieval games and arm movement tasks in an enjoyable and engaging way.”
A major focus of this case study was to assess how therapy interventions in one domain, speech, affected interventions in another, physical therapy, in two different delivery scenarios. Despite the importance of working with other professionals, the authors point out, until now it has been “largely unknown how interventions by one type of therapy affects progress in others.”
The client, who had aphasia and physical disability on one side, completed two robot-mediated programs. In the sole condition, he received speech therapy alone for five weeks, followed by physical therapy alone for five weeks; in the sequential condition, he attended back-to-back speech and physical therapy sessions for five weeks.
Over the course of the experiment, the client made “notable gains in the frequency and range of the upper-limb movements,” the authors say. He also made positive gains in verbal expression. Interestingly, his improvements in speech and physical function were much greater when he engaged in only one therapy than when the two therapies were paired in sessions immediately following each other. The authors summarize that in such a sequential schedule “speech and physical functions seemed to compete for limited resources” in the brain. Their work is described in the current issue of the journal Aphasiology.
Choe and computer science researcher and robot expert Rod Grupen, director of the Laboratory for Perceptual Robotics at UMass Amherst, are in the second year of a $109,251 grant from the American Heart Association to investigate the effect of stroke rehabilitation delivered by a humanoid robot, uBot-5. It is a child-sized unit with arms and a computer screen through which therapists interact with the client.
Choe, Grupen and colleagues are seeking ways to bring more and longer-term therapy and social contact to people recovering from stroke. It’s estimated that 3 million Americans daily experience the debilitating effects of stroke. But even after years, they can recover significant function with intensive rehabilitation, says Choe. The bad news is that this is rarely available or accessible due to a shortage of therapists and lack of coverage for long-term treatment. Many people are left with chronic low function, which can lead to social isolation and depression.
While some may object to robots delivering therapy, the need is great and definitely not being met now, especially in rural areas, Grupen and Choe point out. Their aim is to supplement, not replace, human-to-human interaction, with a robot temporarily taking the therapist’s place when one is unavailable. Grupen says, “In addition to improving quality of life, if we can support a client in the home so they can delay institutionalization, we can improve outcomes and make a huge impact on the cost of elder care. There are 70 million baby boomers beginning to retire now.”
“Stroke rehabilitation is such a monumental financial problem everywhere in the world, that’s where it can pay for itself,” he adds. “A personal robot could save billions of dollars in elder care while letting people stay in their own homes and communities. We’re hoping for a win-win where our elders live better, more independent and productive lives and our overtaxed healthcare resources are used more effectively.”

Filed under robots robotics humanoids stroke speech therapy aphasia neuroscience science

Humanoid robot helps train children with autism
“Aiden, look!” piped NAO, a two-foot tall humanoid robot, as it pointed to a flat-panel display on a far wall. As the cartoon dog Scooby Doo flashed on the screen, Aiden, a young boy with an unruly thatch of straw-colored hair, looked in the direction the robot was pointing.
Aiden, who is three and a half years old, has been diagnosed with autism spectrum disorder (ASD). NAO (pronounced “now”) is the diminutive “front man” for an elaborate system of cameras, sensors and computers designed specifically to help children like Aiden learn how to coordinate their attention with other people and objects in their environment. This basic social skill is called joint attention. Typically developing children learn it naturally. Children with autism, however, have difficulty mastering it and that inability can compound into a variety of learning difficulties as they age.
An interdisciplinary team of mechanical engineers and autism experts at Vanderbilt University have developed the system and used it to demonstrate that robotic systems may be powerful tools for enhancing the basic social learning skills of children with ASD. Writing in the March issue of the IEEE Transactions on Neural Systems and Rehabilitation Engineering, the researchers report that children with ASD paid more attention to the robot and followed its instructions almost as well as they did those of a human therapist in standard exercises used to develop joint attention skill.
The finding indicates that robots could play a crucial role in responding to the “public health emergency” that has been created by the rapid growth in the number of children being diagnosed with ASD. Today, one in 88 children (one in 54 boys) is diagnosed with ASD, a 78 percent increase in just four years. The trend has major implications for the nation’s healthcare budget because estimates of the lifetime cost of treating a patient with ASD range from four to six times greater than for a patient without autism.
“This is the first real world test of whether intelligent adaptive systems can make an impact on autism,” said team member Zachary Warren, who directs the Treatment and Research Institute for Autism Spectrum Disorders (TRIAD) at Vanderbilt’s Kennedy Center.

Filed under robots robotics humanoids ASD autism NAO joint attention neuroscience science

How two brain areas interact to trigger divergent emotional behaviors
New research from the University of North Carolina School of Medicine for the first time explains exactly how two brain regions interact to promote emotionally motivated behaviors associated with anxiety and reward.
The findings could lead to new mental health therapies for disorders such as addiction, anxiety, and depression. A report of the research was published online by the journal Nature on March 20, 2013.
Located deep in the brain’s temporal lobe are tightly packed clusters of brain cells in the almond shaped amygdala that are important for processing memory and emotion. When animals or people are in stressful situations, neurons in an extended portion of the amygdala called the bed nucleus of the stria terminalis, or BNST, become hyperactive.
But, almost paradoxically, neurons in the BNST, which modulate fear and anxiety, reach into a portion of the midbrain that’s involved in behavioral responses to reward, the ventral tegmental area, or VTA.
“For many years it’s been known that dopamine neurons in the VTA are involved in reward processing and motivation. For example, they’re activated during exposure to drugs of abuse and naturally rewarding experiences,” says study senior author Garret Stuber, PhD, assistant professor in the departments of Psychiatry and Cell Biology and Physiology, and the UNC Neuroscience Center.  “On the one hand, you have this area of the brain – the BNST – that’s associated with aversion and anxiety, but it’s in direct communication with a brain reward center. We wanted to figure out exactly how these two brain regions interact to promote different types of behavioral responses related to anxiety and reward.”
In the past, researchers have tried to get a glimpse into the inner workings of the brain using electrical stimulation or drugs, but those techniques couldn’t quickly and specifically change only one type of cell or one type of connection. But optogenetics, a technique that emerged about seven years ago, can.
In the technique, scientists transfer light-sensitive proteins called “opsins” – derived from algae or bacteria that need light to grow – into the mammalian brain cells they wish to study. Then they shine laser beams onto the genetically manipulated brain cells, either exciting or blocking their activity with millisecond precision.

Filed under brain brain cells ventral tegmental area temporal lobe amygdala behavioral responses neuroscience science

Brain Mapping Reveals Neurological Basis of Decision-Making in Rats
Scientists at UC San Francisco have discovered how memory recall is linked to decision-making in rats, showing that measurable activity in one part of the brain occurs when rats in a maze are playing out memories that help them decide which way to turn. The more they play out these memories, the more likely they are to find their way correctly to the end of the maze.
In their study, reported this week in the journal Neuron, the UCSF researchers implanted electrodes directly on a region of the rat brain known as the hippocampus, which is already known to play a key role in the formation and recall of memory. This same region is active when animals are learning, and it is damaged in people who have Alzheimer’s and post-traumatic stress disorder.
The study showed that when the rats paused before an upcoming choice, sometimes the hippocampus was more active and sometimes it was less active. When it was more active it did a better job of recalling memories of places the animal could go next, and the animal was more likely to go to the right place.
“We know that considering possibilities is important for decision-making, but we haven’t really known how this happens in the brain,” said neuroscientist Loren Frank, PhD, who led the research. Frank is an associate professor of physiology and a member of the UCSF Center for Integrative Neuroscience at UCSF.
The work builds upon several years of investigations in Frank’s laboratory that have shown how activity in the hippocampus is a fundamental constituent of memory retrieval. Their recent work shows that this activity is not just about remembering the past – it is also important for thinking about the future. When the brain does a better job of thinking about future possibilities, it makes better decisions.
Next, the team wants to tease out why sometimes the hippocampus does not do a good job of playing out future options. Problems with memory and decision-making are central to age-related cognitive decline, and a deeper understanding of how this works could pave the way for interventions that make the brain work better.

Filed under brain memory cognitive decline hippocampus decision-making neuroscience science

Sleep consolidates memories for competing tasks
Sleep plays an important role in the brain’s ability to consolidate learning when two new potentially competing tasks are learned in the same day, research at the University of Chicago demonstrates.
Other studies have shown that sleep consolidates learning for a new task. The new study, which measured starlings’ ability to recognize new songs, shows that learning a second task can undermine the performance of a previously learned task. But this study is the first to show that a good night’s sleep helps the brain retain both new memories.
Starlings provide an excellent model for studying memory because of fundamental biological similarities between avian and mammalian brains, scholars wrote in the paper, “Sleep Consolidation of Interfering Auditory Memories in Starlings,” published in the current online edition of Psychological Science.
“These observations demonstrate that sleep consolidation enhances retention of interfering experiences, facilitating daytime learning and the subsequent formation of stable memories,” the authors wrote.
The paper was written by Timothy Brawn, a graduate researcher in psychology at UChicago; Howard Nusbaum, professor of psychology; and Daniel Margoliash, professor of psychology, organismal biology and anatomy. Nusbaum is a leading expert on learning, and Margoliash is a pioneer in the research of brain function and its development in birds.

Filed under starlings birds consolidation sleep learning memory neuroscience science

‘Brain waves’ challenge area-specific view of brain activity
Our understanding of brain activity has traditionally been linked to brain areas – when we speak, the speech area of the brain is active. New research by an international team of psychologists led by David Alexander and Cees van Leeuwen (Laboratory for Perceptual Dynamics) shows that this view may be overly rigid. The entire cortex, not just the area responsible for a certain function, is activated when a given task is initiated. Furthermore, activity occurs in a pattern: waves of activity roll from one side of the brain to the other.
The brain can be studied on various scales, researcher David Alexander explains: “You have the neurons, the circuits between the neurons, the Brodmann areas – brain areas that correspond to a certain function – and the entire cortex. Traditionally, scientists looked at local activity when studying brain activity, for example, activity in the Brodmann areas. To do this, you take EEGs (electroencephalograms) to measure the brain’s electrical activity while a subject performs a task and then you try to trace that activity back to one or more brain areas.”
Activity waves
In this study, the psychologists explore uncharted territory: “We are examining the activity in the cerebral cortex as a whole. The brain is a non-stop, always-active system. When we perceive something, the information does not end up in a specific part of our brain. Rather, it is added to the brain’s existing activity. If we measure the electrochemical activity of the whole cortex, we find wave-like patterns. This shows that brain activity is not local but rather that activity constantly moves from one part of the brain to another. The local activity in the Brodmann areas only appears when you average over many such waves.”
Each activity wave in the cerebral cortex is unique. “When someone repeats the same action, such as drumming their fingers, the motor centre in the brain is stimulated. But with each individual action, you still get a different wave across the cortex as a whole. Perhaps the person was more engaged in the action the first time than he was the second time, or perhaps he had something else on his mind or had a different intention for the action. The direction of the waves is also meaningful. It is already clear, for example, that activity waves related to orienting move differently in children – more prominently from back to front – than in adults. With further research, we hope to unravel what these different wave trajectories mean.”
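The idea of reading a wave’s direction of travel from timing differences across sensors can be sketched in a few lines. This is a toy illustration only, not the authors’ method: the sensor layout, oscillation frequency, and inter-sensor lag below are all invented for demonstration.

```python
import numpy as np

# Toy traveling wave: a 10 Hz oscillation sweeping across 8 sensors
# laid out along one axis (all values here are invented).
fs = 250.0                          # sampling rate in Hz
t = np.arange(0, 2.0, 1.0 / fs)    # 2 seconds of data
n_sensors = 8
lag_step = 0.004                    # 4 ms delay between neighboring sensors
signals = np.array([np.sin(2 * np.pi * 10.0 * (t - i * lag_step))
                    for i in range(n_sensors)])

def neighbor_lag(a, b, fs):
    """Delay (in seconds) of signal b relative to signal a,
    taken from the peak of their cross-correlation."""
    xc = np.correlate(b - b.mean(), a - a.mean(), mode="full")
    shift = int(np.argmax(xc)) - (len(a) - 1)
    return shift / fs

# A consistent positive lag across neighbors means the wave moves
# from sensor 0 toward sensor 7; a negative lag means the reverse.
lags = [neighbor_lag(signals[i], signals[i + 1], fs)
        for i in range(n_sensors - 1)]
mean_lag = float(np.mean(lags))
```

On this synthetic wave the recovered mean lag matches the 4 ms step that was built in; on real EEG the same sign-of-lag logic is what distinguishes, say, a back-to-front from a front-to-back trajectory.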

Filed under brain brain activity activity waves EEG cerebral cortex neuroscience psychology science

Researchers image most of a vertebrate brain at the single-cell level
Misha Ahrens and Philipp Keller, researchers with the Howard Hughes Medical Institute, have succeeded in making a near real-time video of most of a zebrafish’s brain showing individual neurons firing. To create the video, as the team reports in their paper published in the journal Nature Methods, the two developed a type of modified light-sheet microscopy and used it on genetically modified fish.
To create the video, the researchers turned to zebrafish in their larval state—their brains are transparent and small. To make firing neurons visible, they genetically altered the fish’s brains, giving them a protein that glows in response to changes in calcium ion levels, which occur when nerve cells fire. Next, they used a microscope able to project a sheet of light through the fish’s brain, allowing the firing neurons to be detected. The system recorded images every 1.3 seconds. The final step was stitching the images together to create a video. The result is nothing short of breathtaking—looking like something out of a science fiction movie’s special effects department.
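As a toy sketch of the stitching step described above (array sizes invented for illustration; this is not the authors’ actual pipeline), each captured image becomes one timepoint in a movie array:

```python
import numpy as np

# Stand-ins for the captured light-sheet images: one 2-D intensity
# frame per timepoint (the 64x64 size is invented for illustration).
n_timepoints = 10
rng = np.random.default_rng(0)
frames = [rng.random((64, 64), dtype=np.float32)
          for _ in range(n_timepoints)]

# "Stitching": stack the frames into a (time, y, x) movie array,
# with timestamps at the reported 1.3-second capture interval.
movie = np.stack(frames, axis=0)
timestamps = np.arange(n_timepoints) * 1.3

# A crude whole-brain activity trace: mean fluorescence per timepoint.
trace = movie.mean(axis=(1, 2))
```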
The video marks the first visual capture of most of a living vertebrate brain at the neuron level as it works in near real-time, and it offers striking evidence of the complexity of the brain—even one as small as 100,000 neurons. The researchers say their video shows approximately 80 percent of the zebrafish’s brain as it operates—though what all those firing neurons represent in particular is still unknown.
The researchers are careful to point out that what they’ve accomplished does not portend the creation of a video of a human brain in action—our brains are much larger, have billions more neurons and perhaps more importantly, are not transparent and are covered by a thick skull. Instead they suggest that studying a simpler brain in action might help to explain how biological neural networks actually work, perhaps leading to theories that can be generalized over larger animals.
But before that can happen, the procedure the team has developed needs to be improved—neurons can fire at hundreds of times per second, which means a lot of firing in the video has been missed. Capturing at a faster rate would mean generating nearly unmanageable amounts of data—at the current rate, just one hour of capture creates a terabyte of data. Thus a new way to store and process the data must be developed.

Researchers image most of a vertebrate brain at single-cell level

Misha Ahrens and Philipp Keller, researchers with the Howard Hughes Medical Institute, have succeeded in making a near-real-time video of most of a zebrafish’s brain that shows individual neurons firing. As the team reports in their paper published in the journal Nature Methods, the two developed a modified form of light-sheet microscopy and used it on genetically modified fish.

To create the video, the researchers turned to zebrafish in their larval state, when their brains are transparent and small. To make firing neurons visible, they genetically altered the fish’s brains, giving them a protein that glows in response to changes in calcium ion levels, which occur when nerve cells fire. Next, they used a microscope able to project a sheet of light through the fish’s brain, allowing the firing neurons to be detected. The system recorded images every 1.3 seconds. The final step was stitching the images together into a video. The result is nothing short of breathtaking, looking like something out of a science fiction movie’s special-effects department.

The video marks the first visual capture of most of a living vertebrate brain at the level of individual neurons as it works in near real-time, and it offers striking evidence of the complexity of the brain, even one as small as the zebrafish’s, with roughly 100,000 neurons. The researchers say their video shows approximately 80 percent of the zebrafish’s brain as it operates, though what all those firing neurons represent in particular is still unknown.

The researchers are careful to point out that what they’ve accomplished does not portend the creation of a video of a human brain in action: our brains are much larger, have billions more neurons and, perhaps more importantly, are not transparent and are covered by a thick skull. Instead, they suggest that studying a simpler brain in action might help explain how biological neural networks actually work, perhaps leading to theories that can be generalized to larger animals.

But before that can happen, the procedure the team has developed needs to be improved: neurons can fire hundreds of times per second, which means a lot of the firing has been missed in the video. Capturing at a faster rate, however, would generate nearly unmanageable amounts of data; at the current rate, just one hour of capture creates a terabyte of data. A new way to store and process the data must therefore be developed.
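The two figures quoted in the article, one brain volume every 1.3 seconds and roughly a terabyte per hour, imply a per-volume size that the article does not state. A quick back-of-envelope check (the per-volume estimate below is derived from those two figures, not taken from the paper):

```python
# Back-of-envelope check of the imaging data rates quoted above.
# Assumed inputs (from the article): one whole-brain volume every 1.3 s,
# ~1 TB of data per hour of recording. The per-volume size is derived.

FRAME_INTERVAL_S = 1.3   # one brain volume every 1.3 seconds
HOUR_S = 3600
TB_PER_HOUR = 1.0        # quoted data rate

frames_per_hour = HOUR_S / FRAME_INTERVAL_S            # ~2769 volumes/hour
gb_per_frame = TB_PER_HOUR * 1000 / frames_per_hour    # ~0.36 GB per volume

print(f"{frames_per_hour:.0f} volumes/hour, ~{gb_per_frame * 1000:.0f} MB each")
```

At hundreds of volumes per second instead of one every 1.3 seconds, the same arithmetic lands in the hundreds of terabytes per hour, which is why the authors flag storage and processing as the bottleneck.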

Filed under zebrafish neuronal activity nerve cells neurons brain function neuroscience science

51 notes

Fetal exposure to antiepileptic drug valproate impairs cognitive development

The effects of antiepileptic drugs during pregnancy have long been a concern of clinicians and women of childbearing age whose seizures can only be controlled by medications. In 1999, a study called the Neurodevelopmental Effects of Antiepileptic Drugs (NEAD) began following the children of women who were taking a single antiepileptic agent during pregnancy. The drugs included carbamazepine, lamotrigine, phenytoin or valproate.

Recently released final data from NEAD show that at age 6, IQ is 7-10 points lower in children exposed in utero to the antiepileptic drug valproate (Depakote) than in children exposed to the other medications. The children exposed to valproate also did poorly on measures of verbal and memory abilities and of non-verbal and executive functions. The results were reported online in Lancet Neurology on January 23, 2013.

"Data published at ages 3 and 4.5 showed similar results in cognitive impairment," says lead study author Kimford Meador, MD, professor of neurology at Emory University School of Medicine. "Age 6 IQ was our primary outcome goal because it is standardized and predictive of school performance."

The NEAD study is the largest prospective study examining the cognitive effects of fetal antiepileptic drug exposure. The researchers monitored women through pregnancy and followed their children, performing cognitive testing at ages 2, 3, 4.5 and, finally, 6. In addition to the effect on cognitive function, earlier data from NEAD showed an increase in the risk of anatomical birth defects.

Valproate is an anticonvulsant used in the treatment of epilepsy, migraines and bipolar disorder, and is particularly effective against primary generalized seizures. Except for a small number of women who respond only to valproate, alternative medications are available.

"These findings consistently show a substantial loss of developmental abilities for these children," says Meador. "Women of childbearing age who have epilepsy should talk with their doctors about their options, and possibly test the safer medications prior to pregnancy to find out if they work."

In order to avoid seizures with potentially serious consequences, Meador emphasizes that women who are already pregnant and taking valproate should not stop without consulting their physicians.

"For a woman who has significant seizures, the risk from the seizure itself is worse than the risk of taking the drugs," he points out.  "The number one reason for miscarriage late in pregnancy for women with epilepsy is trauma resulting from a seizure."

Meador will co-lead a follow-up study with Page Pennell, MD, from Harvard. The new study funded by the National Institutes of Health is called Maternal Outcomes and Neurodevelopmental Effects of Antiepileptic Drugs (MONEAD), and will investigate the risks of these same drugs to both the mother and the child. The study will be conducted at 19 sites, enrolling 350 women with epilepsy during pregnancy. An additional 100 women with epilepsy who are not pregnant, and 100 healthy pregnant women will serve as controls.

(Source: news.emory.edu)

Filed under antiepileptic drugs cognitive impairment drug exposure pregnancy neuroscience science

34 notes

Transistor in the fly antenna

Highly developed antennae containing different types of olfactory receptors allow insects to use minute amounts of odour for orientation towards resources such as food, oviposition sites or mates. Scientists at the Max Planck Institute for Chemical Ecology in Jena, Germany, have now used mutant flies to provide the first experimental proof that the extremely sensitive olfactory system of fruit flies, which can detect a few thousand odour molecules per millilitre of air where humans need hundreds of millions, is based on self-regulation of the odorant receptors. Even a number of molecules below the response threshold is sufficient to amplify the sensitivity of the receptors, and binding of further molecules shortly afterwards triggers the opening of an ion channel that controls the fly’s reaction and flight behaviour. In other words, a below-threshold odour stimulus increases the sensitivity of the receptor, and if a second odour pulse arrives within a certain time span, a neural response is elicited.

It is amazing how many fruit flies (Drosophila melanogaster) find their way to a rotting apple. It is known that insects are able to detect the slightest concentrations of odour molecules, especially pheromones, but also “food signals”.

Dieter Wicher, Shannon Olsson, Bill Hansson and their colleagues at the Max Planck Institute for Chemical Ecology were looking for answers to the question of why insects can trace odour molecules so easily and at such low concentrations compared to other animals. They focused their attention on the odorant receptor proteins in the antenna, the insect’s nose. These proteins are evolutionarily young, and their molecular constituents may be the basis of the insects’ highly sensitive sense of smell.

Insect odorant receptors form a receptor system that consists of the actual receptor protein and an ion channel. After binding of an odour molecule, receptor protein and ion channel trigger the neural electrical response. This mechanism was recently described in the receptor system Or22a-Orco. Apart from functioning as so-called ionotropic receptors, which enable ion flow through membranes after binding of odour molecules, odorant receptors also elicit intracellular signals. These stimulate the formation of cyclic adenosine monophosphate (cyclic AMP or cAMP), which activates an ion flow through the co-receptor Orco. The role and relevance of this weak and slow electrical current, however, was until now unclear.

Merid N. Getahun, a PhD student from Ethiopia, and his colleagues have conducted numerous experiments on Drosophila olfactory neurons. They injected tiny amounts of compounds that stimulate, inhibit or imitate cAMP formation directly into the sensory hairs housing olfactory sensory neurons on the fly antenna. The researchers tested the flies’ responses to ethyl butyrate, which has a fruity odour similar to pineapple, and measured activity in the sensory neurons by using glass microelectrodes. As a control, they used genetically modified fruit flies in which the co-receptor Orco had been inactivated. “The fact that these mutants are no longer able to respond to cAMP or to inhibition or activation of the key enzymes involved, such as protein kinase C and phospholipase C, shows that the highly sensitive olfactory system of insects is regulated intracellularly by their own odorant receptors,” says Dieter Wicher, the leader of the research group.

The combination of odorant receptor and co-receptor Orco can be compared to a transistor, Wicher continues: a weak base current is sufficient to release the main electric current that activates the neuron. The process can also be seen as a short-term memory situated in the insect nose. A very weak stimulus does not elicit a response when it first occurs, but if it recurs within a certain time span it will release the electrical response, according to the principle “one time is no time, but two is a bunch.”
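The "one time is no time, but two is a bunch" behaviour can be caricatured as a toy model: a subthreshold pulse transiently raises the receptor's gain, and a second pulse arriving while the gain is still elevated crosses the firing threshold. This sketch is purely illustrative; the time window, gains and threshold below are invented placeholders, not measurements from the study.

```python
# Toy model of the receptor/Orco "transistor" behaviour described above.
# All numeric values are illustrative placeholders, not data from the paper.

SENSITIZATION_WINDOW = 10.0  # seconds a subthreshold pulse keeps the gain boosted
BASE_GAIN = 1.0
SENSITIZED_GAIN = 3.0
THRESHOLD = 2.0              # response threshold, arbitrary units

def responds(pulse_times, pulse_strength=1.0):
    """Return the times at which odour pulses elicit a neural response."""
    last_subthreshold = None
    fired = []
    for t in pulse_times:
        sensitized = (last_subthreshold is not None
                      and t - last_subthreshold <= SENSITIZATION_WINDOW)
        gain = SENSITIZED_GAIN if sensitized else BASE_GAIN
        if pulse_strength * gain >= THRESHOLD:
            fired.append(t)
        else:
            last_subthreshold = t  # prime the receptor for the next pulse
    return fired

print(responds([0.0]))         # lone weak pulse is silent: []
print(responds([0.0, 5.0]))    # second pulse within the window fires: [5.0]
print(responds([0.0, 30.0]))   # too late, sensitization has decayed: []
```

The point of the analogy is that the weak, slow cAMP-driven current never fires the neuron by itself; it only biases the system so that a subsequent stimulus does.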

Filed under fruit flies olfactory system ion channels odor stimulation receptors neuroscience science

103 notes

Stem Cell Research Could Expand Clinical Use of Regenerative Human Cells 
Research led by a biology professor in the School of Science at IUPUI has uncovered a method to produce retinal cells from regenerative human stem cells without the use of animal products, proteins or other foreign substances, which historically have limited the application of stem cells to treat disease and other human developmental disorders.
The study of human induced pluripotent stem cells (hiPSCs) has been pursued vigorously since they were first described in 2007, owing to their ability to be manipulated into specific cell types. Scientists believe these cells hold considerable potential for cell replacement, disease modeling and pharmacological testing. However, clinical applications have been hindered by the fact that, to date, the cells have required animal products and proteins to grow and differentiate.
A research team led by Jason S. Meyer, Ph.D., assistant professor of biology, successfully differentiated hiPSCs in a lab environment—completely through chemical methods—to form neural retinal cell types (including photoreceptors and retinal ganglion cells). Tests have shown the cells function and grow just as efficiently as those cells produced through traditional methods.
“Not only were we able to develop these (hiPSC) cells into retinal cells, but we were able to do so in a system devoid of any animal cells and proteins,” Meyer said. “Since these kinds of stem cells can be generated from a patient’s own cells, there will be nothing the body will recognize as foreign.”
In addition, this research should allow scientists to better reproduce these cells because they know exactly what components were included to spur growth and minimize or eliminate any variations, Meyer said. Furthermore, the cells function in a very similar fashion to human embryonic stem cells, but without controversial or immune rejection issues because they are derived from individual patients.
“This method could have a considerable impact on the treatment of retinal diseases such as age-related macular degeneration and forms of blindness with hereditary factors,” Meyer said. “We hope this will help us understand what goes wrong when diseases arise and that we can use this method as a platform for the development of new treatments or drug therapies.”
“We’re talking about bringing stem cells a significant step closer to clinical use,” Meyer added.
The research will be published in the April edition of Stem Cells Translational Medicine.


Filed under embryonic stem cells stem cells retinal ganglion cells hiPSCs retinal diseases medicine neuroscience science
