Neuroscience

Articles and news from the latest research reports.

Hereditary trauma
The phenomenon has long been known in psychology: traumatic experiences can induce behavioural disorders that are passed down from one generation to the next. Only recently have scientists begun to understand the physiological processes underlying hereditary trauma. “There are diseases, such as bipolar disorder, that run in families but can’t be traced back to a particular gene,” explains Isabelle Mansuy, professor at ETH Zurich and the University of Zurich. With her research group at the Brain Research Institute of the University of Zurich, she has been studying the molecular processes involved in the non-genetic inheritance of behavioural symptoms induced by traumatic experiences in early life.
Mansuy and her team have succeeded in identifying a key component of these processes: short RNA molecules. These RNAs are synthesised from genetic information (DNA) by enzymes that read specific sections of the DNA (genes) and use them as a template to produce the corresponding RNAs. Other enzymes then trim these RNAs into their mature forms. Cells naturally contain a large number of different short RNA molecules called microRNAs, which have regulatory functions such as controlling how many copies of a particular protein are made.
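The regulatory role described above can be pictured with a toy model: a microRNA that promotes degradation of its target mRNA lowers the steady-state amount of the protein made from that mRNA. This is only an illustrative sketch with invented rate constants, not the actual biochemistry measured in the study.

```python
# Toy model: a microRNA lowers the steady-state level of its target protein
# by promoting degradation of the target mRNA. All parameter values are
# illustrative, not measured.

def steady_state_protein(mirna, k_tx=10.0, k_tl=5.0,
                         d_mrna=1.0, d_prot=0.5, k_mirna=2.0):
    """Steady state of the two-step system:
         dm/dt = k_tx - (d_mrna + k_mirna * mirna) * m   (mRNA)
         dp/dt = k_tl * m - d_prot * p                   (protein)
    """
    m = k_tx / (d_mrna + k_mirna * mirna)   # mRNA steady state
    return k_tl * m / d_prot                # protein steady state

for level in (0.0, 1.0, 5.0):
    print(f"miRNA level {level}: protein ~ {steady_state_protein(level):.1f}")
```

More microRNA means less target protein at steady state, which is why shifts in microRNA amounts can misregulate the processes those proteins carry out.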
Small RNAs with a huge impact
The researchers studied the number and kind of microRNAs expressed by adult mice exposed to traumatic conditions in early life and compared them with non-traumatized mice. They discovered that traumatic stress alters the amounts of several microRNAs in the blood, brain and sperm: some microRNAs were produced in excess, while others were present at lower levels than in the corresponding tissues or cells of control animals. These alterations resulted in the misregulation of cellular processes normally controlled by these microRNAs.
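The kind of comparison behind "some microRNAs were produced in excess, others were lower" can be sketched as a log2 fold change between group means. The microRNA names and read counts below are hypothetical, invented purely for illustration.

```python
# Sketch of a traumatized-vs-control expression comparison via log2 fold
# change of group means. All counts and microRNA names are made up.
import math

def log2_fold_change(trauma_counts, control_counts):
    mean_t = sum(trauma_counts) / len(trauma_counts)
    mean_c = sum(control_counts) / len(control_counts)
    return math.log2(mean_t / mean_c)

# hypothetical normalized read counts per animal
mirnas = {
    "miR-A": ([220, 240, 260], [100, 110, 90]),   # elevated after trauma
    "miR-B": ([40, 55, 45],    [120, 130, 110]),  # reduced after trauma
}
for name, (trauma, control) in mirnas.items():
    print(f"{name}: log2FC = {log2_fold_change(trauma, control):+.2f}")
```

A positive log2 fold change marks overproduction, a negative one marks reduction; real analyses would add replication and statistical testing on top of this.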
After traumatic experiences, the mice behaved markedly differently: they partly lost their natural aversion to open spaces and bright light and had depressive-like behaviours. These behavioural symptoms were also transferred to the next generation via sperm, even though the offspring were not exposed to any traumatic stress themselves. 
Even passed on to the third generation
The metabolism of the offspring of stressed mice was also impaired: their insulin and blood-sugar levels were lower than in the offspring of non-traumatized parents. “We were able to demonstrate for the first time that traumatic experiences affect metabolism in the long term and that these changes are hereditary,” says Mansuy. The effects on metabolism and behaviour even persisted in the third generation.
“With the imbalance in microRNAs in sperm, we have discovered a key factor through which trauma can be passed on,” explains Mansuy. However, certain questions remain open, such as how the dysregulation in short RNAs comes about. “Most likely, it is part of a chain of events that begins with the body producing too much stress hormones.”
Importantly, acquired traits other than those induced by trauma could also be inherited through similar mechanisms, the researcher suspects. “The environment leaves traces on the brain, on organs and also on gametes. Through gametes, these traces can be passed to the next generation.”
Mansuy and her team are currently studying the role of short RNAs in trauma inheritance in humans. As they were also able to demonstrate the microRNA imbalance in the blood of traumatized mice and their offspring, the scientists hope that their results may be useful for developing a diagnostic blood test.

Filed under traumatic stress traumatic experiences microRNA stress genetics neuroscience science

New mouse model could revolutionize research in Alzheimer’s disease
In a study published today in Nature Neuroscience, a group of researchers led by Takaomi Saido of the RIKEN Brain Science Institute in Japan have reported the creation of two new mouse models of Alzheimer’s disease that may potentially revolutionize research into this disease. 

Alzheimer’s disease, the primary cause of dementia in the elderly, imposes a tremendous social and economic burden on modern society. In Japan, the burden of the disease in 2050 is estimated at half a trillion US dollars, a figure equivalent to the government’s annual revenues.
Unfortunately, it has proven very difficult to develop drugs capable of ameliorating the disease. After a tremendous burst of progress in the 1990s, the pace of discoveries has slowed. Dr. Saido believes that part of the difficulty is the inadequacy of current mouse models to replicate the real conditions of Alzheimer’s disease and allow an understanding of the underlying mechanisms that lead to neurodegeneration. In fact, much of the research in Alzheimer’s disease over the past decade may be flawed, as it was based on unrealistic models.
The problem with older mouse models is that they overexpress a protein called amyloid precursor protein, or APP, which gives rise to the amyloid-beta (Abeta) peptides that accumulate in the brain, eventually leading to the neurodegeneration that characterizes Alzheimer’s disease. However, in mice the overexpression of APP gives rise to effects which are not seen in human Alzheimer’s disease.
For example, the APP mutant mice often die of unknown causes at a young age, and the group believes this may be related to the generation of toxic fragments of APP, such as CTF-beta. In addition, some fragments of APP could be neuroprotective, making it difficult to judge whether a drug works through its effect on Abeta peptides, which are known to be involved in human AD, or through other effects that would not be seen in human disease. Furthermore, the transgene expressing APP is inserted at different places in the genome and may knock out other genes, creating artifacts that are not seen in humans.
With this awareness, more than a decade ago Dr. Saido launched a project to develop a new mouse model that would allow more accurate evaluation of therapies for the disease. One of the major hurdles involved a part of the gene, intron 16, which they discovered was necessary for creating more specific models.
The first mouse model they developed (NL-F/NL-F) was knocked in with two mutations found in human familial Alzheimer’s disease. The mice showed early accumulation of Abeta peptides and, importantly, were found to undergo cognitive dysfunction similar to the progression of AD seen in human patients. A second model, with the addition of a further mutation that had been discovered in a family in Sweden, showed even faster initiation of memory loss.
These new models could help in two major areas. The first model, which expresses high levels of the Abeta peptides, seems to realistically model the human form of AD, and could be used for elucidating the mechanism of Abeta deposition. The second model, which demonstrates AD pathology very early on, could be used to examine factors downstream of Abeta-40 and Abeta-42 deposition, such as tauopathy, which are believed to be involved in the neurodegeneration. These results may eventually contribute to drug development and to the discovery of new biomarkers for Alzheimer’s disease. The group is currently looking at several proteins, using the new models, which have potential to be biomarkers.
According to Dr. Saido, “We have a social responsibility to make Alzheimer’s disease preventable and curable. The generation of appropriate mouse models will be a major breakthrough for understanding the mechanism of the disease, which will lead to the establishment of presymptomatic diagnosis, prevention and treatment of the disease.”

Filed under alzheimer's disease dementia amyloid precursor protein tauopathy neurodegeneration animal model neuroscience science

Sleep-dependent memory consolidation and accelerated forgetting
Accelerated long-term forgetting (ALF) is a form of memory impairment in which learning and initial retention of information appear normal but subsequent forgetting is excessively rapid. ALF is most commonly associated with epilepsy and, in particular, a form of late-onset epilepsy called transient epileptic amnesia (TEA). ALF provides a novel opportunity to investigate post-encoding memory processes, such as consolidation. Sleep is implicated in the consolidation of memory in healthy people and a deficit in sleep-dependent memory consolidation has been proposed as an explanation for ALF. If this proposal were correct, then sleep would not benefit memory retention in people with ALF as much as in healthy people, and ALF might only be apparent when the retention interval contains sleep. To test this theory, we compared performance on a sleep-sensitive memory task over a night of sleep and a day of wakefulness. We found, contrary to the hypothesis, that sleep benefits memory retention in TEA patients with ALF and that this benefit is no smaller in magnitude than that seen in healthy controls. Indeed, the patients performed significantly more poorly than the controls only in the wake condition and not the sleep condition. Patients were matched to controls on learning rate, initial retention, and the effect of time of day on cognitive performance. These results indicate that ALF is not caused by a disruption of sleep-dependent memory consolidation. Instead, ALF may be due to an encoding abnormality that goes undetected on behavioural assessments of learning, or by a deficit in memory consolidation processes that are not sleep-dependent.
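The key contrast in this design can be stated numerically: the "sleep benefit" is retention after a night of sleep minus retention after a day awake. If ALF reflected a sleep-consolidation deficit, patients' benefit should shrink relative to controls'. The retention proportions below are invented solely to illustrate the reported pattern.

```python
# Illustration of the study's key contrast. A sleep-consolidation deficit
# would predict a SMALLER sleep benefit in patients than in controls; the
# reported pattern was the opposite of that prediction. Numbers are invented.

def sleep_benefit(retained_after_sleep, retained_after_wake):
    return retained_after_sleep - retained_after_wake

controls = sleep_benefit(0.80, 0.70)  # modest benefit of sleep
patients = sleep_benefit(0.78, 0.55)  # wake retention suffers most

print(f"control sleep benefit: {controls:.2f}")
print(f"patient sleep benefit: {patients:.2f}")
```

As in the study, the patients here underperform mainly in the wake condition, so their sleep benefit is no smaller than the controls', arguing against a sleep-dependent consolidation deficit.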
Full Article
(Image: Courtney Icenhour)

Filed under memory memory consolidation epilepsy forgetting sleep psychology neuroscience science

(Image caption: In this image, marking shows the axons in retinal neurons (in red) that innervate the superior colliculus (in blue) in a “normal” mouse. Credit: © Michael Reber / Institut des Neurosciences Cellulaires et Intégratives)
Confirmation of the neurobiological origin of attention-deficit disorder
A study, carried out on mice, has just confirmed the neurobiological origin of attention-deficit disorder (ADD), a syndrome whose causes are poorly understood. Researchers from the CNRS, the University of Strasbourg and INSERM have identified a cerebral structure, the superior colliculus, in which hyperstimulation causes behavior modifications similar to those of some patients suffering from ADD. Their work also shows an accumulation of noradrenaline in the affected area, pointing to a role for this chemical messenger in attention disorders. These results are published in the journal Brain Structure and Function.
Attention-deficit disorder affects 4 to 8% of children. It manifests mainly through disturbed attention and through verbal and motor impulsiveness, sometimes accompanied by hyperactivity. About 60% of these children still show symptoms in adulthood. No cure currently exists. The only effective treatment is to administer psychostimulants, but these have substantial side effects, such as dependence. Persistent controversy surrounding the neurobiological origin of the disorder has hindered the development of new treatments.
The study in Strasbourg investigated the behavior of transgenic mice with developmental defects in the superior colliculus. This structure, located in the midbrain, is a sensory hub involved in controlling attention and visual and spatial orientation. The mice studied were characterized by duplicated neuron projections between the retina and the superior colliculus. This anomaly causes visual hyperstimulation and an excess of noradrenaline in the superior colliculus. The effects of the neurotransmitter noradrenaline, which vary from species to species, are still poorly understood. However, we do know that this noradrenaline imbalance is associated with significant behavioral changes in mice carrying the genetic mutation. By studying them, the researchers observed a loss of inhibition: for example, the mice hesitate less to enter a hostile environment. They have difficulty picking out relevant information and demonstrate a form of impulsiveness. These symptoms are reminiscent of adult patients suffering from one of the forms of ADD.
Currently, the fundamental work on ADD uses mainly animal models obtained by mutations that disturb dopamine production and transmission pathways. In mice with a malformed superior colliculus, these pathways are intact. The changes occur elsewhere in the neural networks of the midbrain. By broadening the classic boundary used to research its causes, using these new models would allow a more global approach to ADD to be developed. Characterizing the effects of noradrenaline on the superior colliculus more precisely could open the way to innovative therapeutic strategies.

Filed under attention-deficit disorder ADD superior colliculus noradrenaline retinotopy neuroscience science

(Fig. 1: Humans have the ability to accurately estimate the speed of moving objects under good light conditions, such as a bird on a clear day (left). On a cloudy day (right), however, the sensory information may be more ambiguous and invokes a specific cognitive mechanism—perceptual bias—that is hardwired into the visual cortex. Image credit: Justin Gardner, RIKEN Brain Science Institute)
An early link to motion perception
When viewing a scene with low contrast, such as in cloudy or low-light situations, humans tend to perceive objects to be moving slower or flickering faster than in reality. This less-than-faithful interpretation of the sensory environment is known as perceptual bias and is thought to be a mechanism that can help humans interpret vague motion information. Brett Vintch and Justin Gardner from the Laboratory for Human Systems Neuroscience at the RIKEN Brain Science Institute have now shown that perceptual bias is encoded within the visual cortex—the region of the brain where visual stimuli first arrive and begin to be processed.
Although humans have the ability to estimate the speed of easily visible, high-contrast stimuli quite accurately, the speed of less-visible, low-contrast stimuli is harder to judge and is invariably underestimated. Speed perception is thought to be closely associated with the middle temporal zone of the visual cortex, but measurements have so far been unable to confirm this link.
Vintch and Gardner set out to resolve the link between cortical response and perception by conducting functional magnetic resonance imaging experiments on test subjects exposed to a series of low- and high-contrast images either moving across the screen at different speeds or flickering at different rates.
The researchers found that different speeds of motion in the visual stimulus evoked different patterns of activity in the visual cortex. So systematic was the observed pattern of activity that Vintch and Gardner were able to predict the motion speed or flicker frequency of what the observer was viewing simply by examining the measured brain responses. Using these predictions, they found that when the test subjects viewed scenes with low contrast, the patterns of activity shifted to match what the observer was perceiving rather than what was actually physically present.
The findings indicate that human perceptual bias about the movement of low-contrast stimuli originates from a shift in the response of neuronal populations in the parts of the brain that first start to process images. This early visual processing, which is hardwired into the visual cortex, may help humans make sense of ambiguous or vague visual information, such as moving or flickering scenes under low-contrast conditions (Fig. 1).
“Multiple aspects of human thought, such as sensory inference, language, cognition and reasoning, involve cognitive guesswork. We hope that our study of this very simple form of guessing by the nervous system will have implications for other high-level processes in the human brain,” explains Gardner.

Filed under perceptual bias visual cortex vision motion perception neuroscience science

Fruit flies, fighter jets use similar nimble tactics when under attack
When startled by predators, tiny fruit flies respond like fighter jets – employing screaming-fast banked turns to evade attacks.
Researchers at the University of Washington used an array of high-speed video cameras operating at 7,500 frames a second to capture the wing and body motion of flies after they encountered a looming image of an approaching predator.
“Although they have been described as swimming through the air, tiny flies actually roll their bodies just like aircraft in a banked turn to maneuver away from impending threats,” said Michael Dickinson, UW professor of biology and co-author of a paper on the findings in the April 11 issue of Science. “We discovered that fruit flies alter course in less than one one-hundredth of a second, 50 times faster than we blink our eyes, and which is faster than we ever imagined.”
In the midst of a banked turn, the flies can roll on their sides 90 degrees or more, almost flying upside down at times, said Florian Muijres, a UW postdoctoral researcher and lead author of the paper.
“These flies normally flap their wings 200 times a second and, in almost a single wing beat, the animal can reorient its body to generate a force away from the threatening stimulus and then continues to accelerate,” he said.
The fruit flies, a species called Drosophila hydei that are about the size of a sesame seed, rely on a fast visual system to detect approaching predators.
“The brain of the fly performs a very sophisticated calculation, in a very short amount of time, to determine where the danger lies and exactly how to bank for the best escape, doing something different if the threat is to the side, straight ahead or behind,” Dickinson said.
“How can such a small brain generate so many remarkable behaviors? A fly with a brain the size of a salt grain has the behavioral repertoire nearly as complex as a much larger animal such as a mouse. That’s a super interesting problem from an engineering perspective,” Dickinson said.
The researchers synchronized three high-speed cameras each able to capture 7,500 frames per second, or 40 frames per wing beat. The cameras were focused on a small region in the middle of a cylindrical flight arena where 40 to 50 fruit flies flitted about. When a fly passed through the intersection of two laser beams at the exact center of the arena, it triggered an expanding shadow that caused the fly to take evasive action to avoid a collision or being eaten.
With the camera shutters opening and closing every one thirty-thousandth of a second, the researchers needed to flood the space with very bright light, Muijres said. Because flies rely on their vision and would be blinded by regular light, the arena was ringed with very bright infrared lights to overcome the problem. Neither humans nor fruit flies register infrared light.
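The reported numbers can be checked with a little arithmetic (taking the article's figures at face value: 7,500 frames/s cameras, roughly 200 wing beats/s, and a turn completed in under one one-hundredth of a second):

```python
frame_rate = 7500      # camera frames per second
wingbeat_rate = 200    # approximate wing beats per second

# Frames captured per wing beat — close to the "40 frames per wing beat"
# quoted in the article (the 40 figure implies a beat rate nearer 187 Hz).
frames_per_wingbeat = frame_rate / wingbeat_rate
print(frames_per_wingbeat)        # 37.5

# A turn finished in under 10 ms spans about:
turn_duration = 0.01              # seconds
print(turn_duration * wingbeat_rate)   # 2.0 wing beats
print(turn_duration * frame_rate)      # 75.0 frames of video
```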
How the fly’s brain and muscles control these remarkably fast and accurate evasive maneuvers is the next thing researchers would like to investigate, Dickinson said.

Filed under fruit flies vision visual system robotics robots flying sensorimotor control science

114 notes

Getting To The Root Of Parkinson’s Disease

Working with human neurons and fruit flies, researchers at Johns Hopkins have identified and then shut down a biological process that appears to trigger a particular form of Parkinson’s disease present in a large number of patients. A report on the study, in the April 10 issue of the journal Cell, could lead to new treatments for this disorder.

“Drugs such as L-dopa can, for a time, manage symptoms of Parkinson’s disease, but as the disease worsens, tremors give way to immobility and, in some cases, to dementia. Even with good treatment, the disease marches on,” says Ted Dawson, M.D., Ph.D., professor of neurology and director of the Johns Hopkins Institute for Cell Engineering. Dawson says the new research builds on a growing body of knowledge about the origins of Parkinson’s disease, whose symptoms appear when dopamine-producing nerve cells in the brain degenerate. Further evidence for a role of genetics in Parkinson’s disease appeared a decade ago, when researchers identified key mutations in an enzyme known as leucine-rich repeat kinase 2, or LRRK2 (pronounced “lark2”). When that enzyme was cloned, Dawson, together with his wife and longtime collaborator Valina Dawson, Ph.D., professor of neurology and member of the Institute for Cell Engineering, discovered that LRRK2 was a kinase, a type of enzyme that transfers phosphate groups to proteins, turning them on or off to change their activity.

Over the years, it was found that blocking kinase activity in mutated LRRK2 halted degeneration, while enhancing it made things worse. But nobody knew what proteins LRRK2 was acting on.

"For nearly a decade, scientists have been trying to figure out how mutations in LRRK2 cause Parkinson’s disease," said Margaret Sutherland, Ph.D., a program director at the National Institute of Neurological Disorders and Stroke. "This study represents a clear link between LRRK2 and a pathogenic mechanism linked to Parkinson’s disease."

Dawson went fishing for the right proteins using LRRK2 as bait. When his team began to identify those proteins, Dawson says they were surprised to discover that many were linked to the cellular machinery, like ribosomes, that make proteins. Nobody, says Dawson, suspected that LRRK2 might be involved at such a basic level as protein manufacture.

Unsure if they were right, the team then tested the proteins they identified to see which of them, if any, LRRK2 could add phosphate groups to. They came up with three ribosomal protein candidates — s11, s15 and s27. They then altered each ribosomal protein to see what would happen. It turned out that mutating s15 in a manner that blocked LRRK2 phosphorylation protected nerve cells taken from rats, humans and fruit flies from death. In other words, s15 appeared to be the much sought-after target of LRRK2, Dawson says.

"When you go fishing, you want to catch fish. We just happened to catch a big one,” Dawson says.

With the protein now identified, Dawson’s team is tackling further experiments to find out how excess protein production causes dopamine neurons to degenerate. And they want to see what happens when they block LRRK2 from phosphorylating the s15 protein in mice, to build on their findings from fruit flies and nerve cells grown in a dish.

“There’s a big chasm between animal disease models and human treatments,” says Ian Martin, Ph.D., a neuroscientist in Dawson’s lab and the lead author on the paper. “But it’s exciting. I think it definitely could turn into something real, hopefully in my lifetime.”

(Source: hopkinsmedicine.org)

Filed under parkinson's disease LRRK2 neurodegeneration s15 protein neuroscience science

103 notes

Researchers examine metabolism in defective cells
UAlberta researchers are taking a closer look at how two metabolic pathways interact to increase the lifespan of cells with mitochondrial defects. Magnus Friis (PhD ’10) is the lead author of the study, which was published online on April 10 and will be published in the April 24 issue of Cell Reports.
Mitochondria produce energy for cells through oxidative metabolism, but the process produces toxic byproducts that can accumulate and cause defects in the cell’s mitochondria. These defects, in turn, affect the cell’s ability to generate energy and can potentially lead to cell death and are associated with aging and various neurological diseases.
Friis, a postdoctoral fellow in Mike Schultz’s biochemistry lab, examined how dietary changes at the cell level can affect cell health. He exposed normal and defective yeast cells to two different energy sources: glucose, the preferred sugar of cells, and raffinose, a natural sugar found in vegetables and whole grains.
“[The dietary intervention] is a general shift in what we’re feeding the cells to get them to do something different with their whole nutrient metabolism,” Friis noted. “There are signaling pathways that allow a cell to sense its environment and co-ordinate events to allow the cell to adapt to what’s going on. In this case, [cells are responding to] which nutrients are available.”
Friis and Schultz examined two nutrient signaling pathways called the AMPK pathway and the retrograde response. AMPK responds to energy deficits in the cell by down-regulating energy consuming processes, which are often associated with cell growth, and up-regulating energy producing processes. The retrograde response pathway is specific to the yeast used in the study and supplies key amino acids to the cell by changing the metabolic process of the mitochondria.  
When activated individually, neither the AMPK pathway nor the retrograde response provided substantial benefits to cells with damaged mitochondria. When activated simultaneously, clear benefits became evident.
“We looked at the effect activating both pathways had on maintenance of cellular viability in what’s called a chronological aging experiment,” Friis said. “Even when they had defective mitochondria, the cells with the retrograde response and AMPK simultaneously activated during growth were able to live as long as cells with normal mitochondrial function.”
Working in collaboration with John Paul Glaves, a postdoctoral fellow in Bryan Sykes’ lab, and Tao Huan, a PhD student in Liang Li’s lab, Friis measured the molecules produced during the metabolic process. They found that the defective cells had higher levels of branched-chain amino acids and trehalose, a carbohydrate found in yeast that can serve as an energy source, similar to glycogen in human cells.
“By activating AMPK, we’ve removed certain blocks in metabolism. With the retrograde response, we’ve changed the amino acid metabolism in a way that allowed the cells to accumulate storage carbohydrates, which stabilize their function,” Friis said.
Activated AMPK and retrograde response pathways allow the cell to accumulate a storage carbohydrate that can be metabolized normally despite mitochondrial defects that affect the cell’s metabolism. The additional energy stabilizes cell function and prevents the premature cell death often caused by defects in mitochondria.
“No matter how many people are working on the problem in humans, mitochondrial disorders are too complicated to figure out the nuts and bolts without the work that Magnus is doing,” Schultz said. “This research opens the concept, a new concept on how to deal with these metabolic problems.”

Filed under mitochondria mitochondrial disorders metabolism cell function medicine science

203 notes

Brain cell find points to new therapies
Insights into how brain cells are produced could lead to treatments for brain cancer and other brain-related disorders.
Scientists have gained new understanding of the role played by a key molecule that controls how and when nerve and brain cells are formed - a process that allows the brain to develop and keeps it healthy.
Their findings could help explain what happens when cell production goes out of control, which is a fundamental characteristic of many diseases including cancer.
Regulatory systems
Researchers have focused on an RNA molecule, known as miR-9, which is linked to the development of brain cells, namely neurons and glial cells.
They have shown that a protein called Lin28a regulates the production of miR-9, which in turn controls the genes involved in brain cell development and function.
Scientists carried out lab studies of embryonic cells, which can develop into neurons, to determine how Lin28a controls the amount of miR-9 that is produced.
Complex pathways
They found that in embryonic cells, Lin28a prevents production of miR-9 by triggering the degradation of its precursor molecule.
In developed brain cells, Lin28a is no longer produced, which enables miR-9 to accumulate and function.
In cancer cells, Lin28a production is re-established, and as a result this natural process is disrupted.
Lab experiments
Researchers used a series of lab tests to unravel the complex processes that are directed by the Lin28a protein.
They say further studies could help explain fully the role of Lin28a and miR-9 in brain development, and pave the way to the development of novel therapies.
The study, published in Nature Communications, was supported by the Wellcome Trust and the Medical Research Council.

Understanding more of the complex science behind the fundamental processes of cell development will help us learn more about what happens when this goes wrong – and what might be done to prevent it. -Dr Gracjan Michlewski (School of Biological Sciences)

(Image: iStock)

Filed under lin28a brain cells cancer cells brain cancer glial cells cell differentiation neuroscience science
