Neuroscience

Articles and news from the latest research reports.



Upfront and personal: Scientists model human reasoning in the brain’s prefrontal cortex
Located at the forward end of the brain’s frontal lobe, the mammalian prefrontal cortex (PFC) is the seat of many of our most distinctive cognitive abilities – collectively referred to as executive function – including planning, decision-making, and coordinating thoughts and actions with internal goals. Perhaps its most important attribute – one that is apparently unique to H. sapiens – is reasoning, which uses Bayesian (that is, probabilistic) inference to mitigate uncertainty and inform adaptive behavior.

While the structural details of this remarkable process have historically remained elusive, scientists at the Institut National de la Santé et de la Recherche Médicale, the Ecole Normale Supérieure, and the Université Pierre et Marie Curie, all in Paris, have recently employed computational modeling and neuroimaging to show that the human prefrontal cortex contains two interactive reasoning pathways that embody hypothesis testing for evaluating, accepting, and rejecting behavioral strategies. More specifically, their model describes reason-guided behavior as an online algorithm combining Bayesian inference over multiple stored strategies with hypothesis testing that can update those strategies.

In addition – as proposed in a previous work – the scientists conclude that since the frontopolar cortex (FPC), located in the anterior-most portion of the frontal lobes, is human-specific and a key component in executive decision-making, the ability to make inferences about concurrent strategies and switch directly to one of these alternatives is unique to humans as well.
Prof. Etienne Koechlin discussed the paper that he, Dr. Maël Donoso and Dr. Anne G. E. Collins published in Science.
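The online algorithm described above can be caricatured in a few lines. The sketch below is a toy illustration under our own assumptions (three hypothetical strategies with made-up outcome likelihoods), not the published model: it maintains a Bayesian posterior over stored strategies and "rejects" the current one when its posterior falls below a confidence threshold, switching directly to the most probable alternative.

```python
import numpy as np

def update_posterior(posterior, likelihoods):
    """One step of Bayesian inference over concurrent strategies."""
    posterior = posterior * likelihoods        # Bayes rule (unnormalized)
    return posterior / posterior.sum()         # renormalize

rng = np.random.default_rng(0)
n_strategies = 3
posterior = np.full(n_strategies, 1.0 / n_strategies)  # uniform prior
current = 0                                    # strategy currently in use
threshold = 0.2                                # rejection criterion

for trial in range(50):
    # Hypothetical per-strategy likelihood of the observed outcome;
    # here strategy 2 actually fits the environment best.
    likelihoods = np.array([0.3, 0.3, 0.7]) + 0.05 * rng.random(3)
    posterior = update_posterior(posterior, likelihoods)
    # Hypothesis test: abandon the current strategy once it has become
    # unreliable, and switch to the most probable alternative.
    if posterior[current] < threshold:
        current = int(np.argmax(posterior))

print("final posterior:", np.round(posterior, 3))
print("selected strategy:", current)
```

Evidence accumulates trial by trial, so the posterior concentrates on the best-fitting strategy and the switch happens as soon as the current one is statistically rejected.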


Filed under prefrontal cortex executive function decision making reasoning neuroscience science


Too little or poor sleep may be associated with worse brain function in aging
Research published today in PLOS ONE by researchers at the University of Warwick indicates that sleep problems are associated with worse memory and executive function in older people.
In a study funded by the Economic and Social Research Council (ESRC), the researchers analysed sleep and cognitive (brain function) data from 3,968 men and 4,821 women who took part in the English Longitudinal Study of Ageing (ELSA). Respondents reported on the quality and quantity of their sleep over the period of a month.
The study showed that there is an association between both the quality and the duration of sleep and brain function, and that this association changes with age.
In adults aged 50 to 64, short sleep (<6 hrs per night) and long sleep (>8 hrs per night) were associated with lower brain function scores. By contrast, in older adults (65-89 years), lower brain function scores were observed only in long sleepers.
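To make the grouping concrete, here is a minimal sketch – using synthetic data, not the ELSA dataset – of the kind of stratified comparison described above: self-reported sleep duration is binned into the study's short (<6 h) / normal (6-8 h) / long (>8 h) categories and mean cognitive scores are compared per bin.

```python
import random
from statistics import mean

def sleep_category(hours):
    """Bin sleep duration into the study's three categories."""
    if hours < 6:
        return "short"
    if hours > 8:
        return "long"
    return "normal"

random.seed(1)
respondents = []
for _ in range(1000):
    hours = random.uniform(4, 10)
    # Hypothetical effect for illustration: scores are lower
    # outside the 6-8 hour band.
    penalty = 0 if 6 <= hours <= 8 else 5
    score = random.gauss(50 - penalty, 3)
    respondents.append((sleep_category(hours), score))

for cat in ("short", "normal", "long"):
    scores = [s for c, s in respondents if c == cat]
    print(f"{cat:7s} n={len(scores):4d} mean score={mean(scores):.1f}")
```

In this toy data the normal-sleep group scores highest by construction; the real analysis, of course, also adjusted for age and tested the association separately in the two age bands.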
Dr Michelle A Miller says: “6-8 hours of sleep per night is particularly important for optimum brain function in younger adults. These results are consistent with our previous research, which showed that 6-8 hours of sleep per night was optimal for physical health, including the lowest risk of developing obesity, hypertension, diabetes, heart disease and stroke.”
Interestingly, in the younger pre-retirement aged adults, sleep quality did not have any significant association with brain function scores, whereas in the older adults (>65 years), there was a significant relationship between sleep quality and the observed scores.
“Sleep is important for good health and mental wellbeing,” says Professor Francesco Cappuccio. “Optimising sleep at an older age may help to delay the decline in brain function seen with age, or indeed may slow or prevent the rapid decline that leads to dementia.”
Dr Miller concludes that “if poor sleep is causative of future cognitive decline, non-pharmacological improvements in sleep may provide an alternative low-cost and more accessible Public Health intervention, to delay or slow the rate of cognitive decline”.


Filed under brain function cognitive impairment memory sleep aging psychology neuroscience science


Brain fills gaps to produce a likely picture
Researchers at Radboud University use visual illusions to demonstrate to what extent the brain interprets visual signals. They were surprised to discover that active interpretation occurs early on in signal processing. In other words, we see not only with our eyes, but with our brain, too. Current Biology is publishing these results in the July issue.
The results obtained by the Radboud University researchers are illustrated, for example, by the visual illusion on the left: we see a triangle that in fact is not there. The triangle is only suggested because of the way the ‘Pac-Man’ shapes are positioned; there appears to be a light-grey triangle on top of three black circles.
Seen in the fMRI
How does the brain do that? That was the question Peter Kok and Floris de Lange, from the Donders Institute at Radboud University in Nijmegen, asked themselves. Using fMRI, they discovered that the triangle – although non-existent – activates the primary visual brain cortex. This is the first area in the cortex to deal with a signal from the eyes.
The primary visual brain cortex is normally regarded as the area where eye signals are merely processed, but that has now been refuted by the results Kok and De Lange obtained.
Active interpretation
Recent theories assume that the brain does not simply process or filter external information, but actively interprets it. In the example described above, the brain decides it is more likely that a triangle would be on top of black circles than that three such circles, each with a bite taken out, would by coincidence point in a particular direction. After all, when we look around, we see triangles and circles more often than Pac-Man shapes.
Furthermore, objects very often lie on top of other things; just think of the books and piles of paper on your desk. The imaginary triangle is a feasible explanation for the bites taken out of the circles; the brain ‘understands’ they are ‘merely’ partly covered black circles.
The unexpected requires more processing
Kok and De Lange also noticed that whenever the Pac-Man shapes do not form a triangle, more brain activity is required. In the above image on the right, we see that the three Pac-Man shapes ‘underneath’ the triangle cause little brain activity (coloured blue), but the separate Pac-Man on the right causes more activity. This also fits in with the theory that perception is a question of interpretation: if something is easy to explain, less brain activity is needed to process that information, compared to when something is unexpected or difficult to account for – as in the adjacent diagram.
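The "less activity for expected input" idea can be illustrated with a toy prediction-error computation. Everything below – the feature vectors for the three Pac-Man shapes and the error measure – is an assumption for illustration, not the study's analysis: the response of an early visual area is modelled as the mismatch between the input and the brain's prediction, so an expected configuration (bites forming an illusory triangle) yields a smaller error than an unexpected one.

```python
def prediction_error(stimulus, prediction):
    """Summed absolute mismatch between input and prediction."""
    return sum(abs(s - p) for s, p in zip(stimulus, prediction))

# Hypothetical feature vectors: orientation of each Pac-Man's
# "bite", in degrees.
prediction = [0, 120, 240]   # bites all pointing inward: a triangle
expected   = [2, 118, 241]   # nearly matches the prediction
unexpected = [90, 120, 240]  # one rotated Pac-Man breaks the figure

print(prediction_error(expected, prediction))    # small: easy to explain
print(prediction_error(unexpected, prediction))  # large: more processing
```

In predictive-coding terms, only the residual error propagates, which is why the well-explained configuration would evoke less measured activity.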


Filed under visual illusions visual cortex brain activity neuroimaging shape perception neuroscience science


Blocking key enzyme minimizes stroke injury

A drug that blocks the action of the enzyme Cdk5 could substantially reduce brain damage if administered shortly after a stroke, UT Southwestern Medical Center research suggests.

The findings, reported in the June 11 issue of the Journal of Neuroscience, determined in rodent models that aberrant Cdk5 activity causes nerve cell death during stroke.

“If you inhibit Cdk5, then the vast majority of brain tissue stays alive without oxygen for up to one hour,” said Dr. James Bibb, Associate Professor of Psychiatry and Neurology and Neurotherapeutics at UT Southwestern and senior author of the study. “This result tells us that Cdk5 is a central player in nerve cell death.”

More importantly, development of a Cdk5 inhibitor as an acute neuroprotective therapy has the potential to reduce stroke injury.

“If we could block Cdk5 in patients who have just suffered a stroke, we may be able to reduce the number of patients in our hospitals who become disabled or die from stroke. Doing so would have a major impact on health care,” Dr. Bibb said.

While several pharmaceutical companies worked to develop Cdk5 inhibitors years ago, these efforts were largely abandoned since research indicated blocking Cdk5 long-term could have detrimental effects. At the time, many scientists thought aberrant Cdk5 activity played a major role in the development of Alzheimer’s disease and that Cdk5 inhibition might be beneficial as a treatment.

Based on Dr. Bibb’s research and that of others, Cdk5 has both good and bad effects. When working normally, Cdk5 adds phosphates to other proteins that are important to healthy brain function. On the flip side, researchers have found that aberrant Cdk5 activity contributes to nerve cell death following brain injury and can lead to cancer.

“Cdk5 regulates communication between nerve cells and is essential for proper brain function. Therefore, blocking Cdk5 long-term may not be beneficial,” Dr. Bibb said. “Until now, the connection between Cdk5 and stroke injury was unknown, as was the potential benefit of acute Cdk5 inhibition as a therapy.”

In this study, researchers administered a Cdk5 inhibitor directly to brain slices dissected from adult rodents after a stroke, and also measured post-stroke effects in Cdk5 knockout mice.

“We are not yet at a point where this new treatment can be given for stroke. Nevertheless, this research brings us a step closer to developing the right kinds of drugs,” Dr. Bibb said. “We first need to know what mechanisms underlie the disease before targeted treatments can be developed that will be effective. As no Cdk5 blocker exists that works in a pill form, the next step will be to develop a systemic drug that could be used to confirm the study’s results and lead to a clinical trial at later stages.”

Currently, there is only one FDA-approved drug for acute treatment of stroke, the clot-busting drug tPA. Other treatment options include neurosurgical procedures to help minimize brain damage.

(Source: utsouthwestern.edu)

Filed under stroke nerve cells cdk5 brain function tPA cell death neuroscience science


Potential Alzheimer’s drug prevents abnormal blood clots in the brain

Without a steady supply of blood, neurons can’t work. That’s why one of the culprits behind Alzheimer’s disease is believed to be the persistent blood clots that often form in the brains of Alzheimer’s patients, contributing to the condition’s hallmark memory loss, confusion and cognitive decline.


New experiments in Sidney Strickland’s Laboratory of Neurobiology and Genetics at Rockefeller University have identified a compound that might halt the progression of Alzheimer’s by interfering with the role amyloid-β, a small protein that forms plaques in Alzheimer’s brains, plays in the formation of blood clots. This work is highlighted in the July issue of Nature Reviews Drug Discovery.

For more than a decade, potential Alzheimer’s drugs have targeted amyloid-β, but, in clinical trials, they have either failed to slow the progression of the disease or caused serious side effects. However, by targeting the protein’s ability to bind to a clotting agent in blood, the work in the Strickland lab offers a promising new strategy, according to the highlight published in print on July 1.

This latest study builds on previous work in Strickland’s lab showing amyloid-β can interact with fibrinogen, the clotting agent, to form difficult-to-break-down clots that alter blood flow, cause inflammation and choke neurons.

“Our experiments in test tubes and in mouse models of Alzheimer’s showed the compound, known as RU-505, helped restore normal clotting and cerebral blood flow. But the big pay-off came with behavioral tests in which the Alzheimer’s mice treated with RU-505 exhibited better memories than their untreated counterparts,” Strickland says. “These results suggest we have found a new strategy with which to treat Alzheimer’s disease.”

RU-505 emerged from a pack of 93,716 candidates selected from libraries of compounds, the researchers write in the June issue of the Journal of Experimental Medicine. Hyung Jin Ahn, a research associate in the lab, examined these candidates with a specific goal in mind: Find one that interferes with the interaction between fibrinogen and amyloid-β. In a series of tests that began with a massive, automated screening effort at Rockefeller’s High Throughput Resource Center, Ahn and colleagues winnowed the 93,000 contenders to five. Then, test tube experiments whittled the list down to one contender: RU-505, a small, synthetic compound. Because RU-505 binds to amyloid-β and only prevents abnormal blood clot formation, it does not interfere with normal clotting. It is also capable of passing through the blood-brain barrier.

“We tested RU-505 in mouse models of Alzheimer’s disease that over-express amyloid-β and have a relatively early onset of disease. Because Alzheimer’s disease is a long-term, progressive disease, these treatments lasted for three months,” Ahn says. “Afterward, we found evidence of improvement both at the cellular and the behavioral levels.”

The brains of the treated mice had less of the chronic and harmful inflammation associated with the disease, and blood flow in their brains was closer to normal than that of untreated Alzheimer’s mice. The RU-505-treated mice also did better when placed in a maze. Mice naturally want to escape the maze, and are trained to recognize visual cues to find the exit quickly. Even after training, Alzheimer’s mice have difficulty in exiting the maze. After these mice were treated with RU-505, they performed much better.

“While the behavior and the brains of the Alzheimer’s mice did not fully recover, the three-month treatment with RU-505 prevented much of the decline associated with the disease,” Strickland says.

The researchers have begun the next steps toward developing a human treatment. Refinements to the compound are being supported by the Robertson Therapeutic Development Fund and the Tri-Institutional Therapeutic Discovery Institute. As part of a goal to help bridge critical gaps in drug discovery, these initiatives support the early stages of drug development, as is being done with RU-505.

“At very high doses, RU-505 is toxic to mice and even at lower doses it caused some inflammation at the injection site, so we are hoping to find ways to reduce this toxicity, while also increasing RU-505’s efficacy so smaller doses can accomplish similar results,” Ahn says.

(Source: newswire.rockefeller.edu)

Filed under alzheimer's disease cognitive decline beta amyloid fibrinogen blood clots neuroscience science


The Social Psychology of Nerve Cells
The functional organization of the central nervous system depends upon a precise architecture and connectivity of distinct types of neurons. Multiple cell types are present within any brain structure, but the rules governing their positioning, and the molecular mechanisms mediating those rules, have been relatively unexplored.
A new study by UC Santa Barbara researchers demonstrates that a particular neuron, the cholinergic amacrine cell, creates a “personal space” in much the same way that people distance themselves from one another in an elevator. In addition, the study, published in the Proceedings of the National Academy of Sciences, shows that this feature is heritable and identifies a genetic contributor to it, pituitary tumor-transforming gene 1 (Pttg1).
Patrick Keeley, a postdoctoral scholar in Benjamin Reese’s laboratory at UCSB’s Neuroscience Research Institute, has been using the retina as a model system for exploring such principles of developmental neurobiology. The retina is ideal because this portion of the central nervous system lends itself to such spatial analysis. 
“Populations of neurons in the retina are laid out in single strata within this layered structure, lending themselves to accurate quantitation and statistical analysis,” explained Keeley. “Rather than being distributed as regular lattices of nerve cells, populations in the retina appear to abide by a simple rule, that of minimizing proximity to other cells of the same type. We would like to understand how such populations create and maintain such spacing behavior.”
To address this, Keeley and colleagues quantified the regularity in the population of a particular type of amacrine cell in the mouse retina. They did so in 26 genetically distinct strains of mice and found that every strain exhibited this same self-spacing behavior but that some strains did so more efficiently than others. Amacrine cells are retinal interneurons that form connections between other neurons and regulate bipolar cell output.
“The regularity in the patterning of these amacrine cells showed little variation within each strain, while showing conspicuous variation between the strains, indicating a heritable component to this trait,” said Keeley.
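A standard way to quantify such regularity in retinal mosaics is the nearest-neighbor regularity index – the mean nearest-neighbor distance divided by its standard deviation – though the study's exact pipeline may differ. A minimal sketch, with two synthetic cell fields for comparison:

```python
import math
import random

def regularity_index(points):
    """Mean nearest-neighbor distance / its standard deviation.
    Higher values indicate a more regular, self-spaced mosaic."""
    nnds = []
    for i, p in enumerate(points):
        d = min(math.dist(p, q) for j, q in enumerate(points) if j != i)
        nnds.append(d)
    m = sum(nnds) / len(nnds)
    sd = math.sqrt(sum((d - m) ** 2 for d in nnds) / (len(nnds) - 1))
    return m / sd

random.seed(0)
# A random (Poisson-like) field: no self-spacing at all.
random_field = [(random.random(), random.random()) for _ in range(100)]
# A jittered grid: cells keep a minimum distance from neighbors,
# as the amacrine cells described above appear to do.
spaced_field = [(x / 10 + random.gauss(0, 0.01), y / 10 + random.gauss(0, 0.01))
                for x in range(10) for y in range(10)]

print(f"random field:  RI = {regularity_index(random_field):.2f}")
print(f"spaced mosaic: RI = {regularity_index(spaced_field):.2f}")
```

A purely random field scores close to the theoretical Poisson value of roughly 1.9, while a self-spaced mosaic scores considerably higher – the kind of between-condition difference the strain comparison exploits.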
“This itself was something of a surprise, given that the patterning in such populations has an apparently stochastic quality to it,” said Reese, a professor in the Department of Psychological and Brain Sciences. Stochastic systems are random and are analyzed, at least in part, using probability theory.
This strain variation in the regularity of this cellular patterning showed a significant linkage to a location in the genome on chromosome 11, where the researchers identified Pttg1, previously unknown to play any role in the retina.
Working in collaboration with colleagues at the University of Tennessee Health Science Center in Memphis, Keeley’s team demonstrated that the expression of this gene varies across the 26 strains of mice and that there was a positive correlation between gene expression and regularity. They then identified a mutation in this gene that itself correlated with expression levels and with regularity. Working with colleagues at Cedars-Sinai Medical Center in Los Angeles, the team also demonstrated directly that this mutation controlled gene expression.   
“Pttg1 has diverse functions, being an oncogene for pituitary tumors, and is known to have regulatory functions orchestrating gene expression elsewhere in the body,” explained Keeley. “Within this class of retinal neurons, it should be regulating the way in which cells integrate signals from their immediate neighbors, translating that information to position the cell farthest from those neighbors.” Future studies should decipher the genetic network controlled by Pttg1 that mediates such nerve-cell spacing.


Filed under nerve cells amacrine cells gene expression Pttg1 retina interneurons neuroscience science


Running, Combined with Visual Experience, Restores Brain Function
In a new study by UC San Francisco scientists, running, when accompanied by visual stimuli, restored brain function to normal levels in mice that had been deprived of visual experience in early life.
In addition to suggesting a novel therapeutic strategy for humans with blindness in one eye caused by a congenital cataract, droopy eyelid, or misaligned eye, the new research—the latest in a series of UCSF studies exploring effects of locomotion on brain function—suggests that the adult brain may be far more capable of rewiring and repairing itself than previously thought.
In 2010, Michael P. Stryker, PhD, the W.F. Ganong Professor of Physiology, and postdoctoral fellow Cris Niell, PhD, now at the University of Oregon, made the surprising discovery that neurons in the visual area of the mouse brain fired much more robustly whenever the mice walked or ran.

Running, Combined with Visual Experience, Restores Brain Function

In a new study by UC San Francisco scientists, running, when accompanied by visual stimuli, restored brain function to normal levels in mice that had been deprived of visual experience in early life.

In addition to suggesting a novel therapeutic strategy for humans with blindness in one eye caused by a congenital cataract, droopy eyelid, or misaligned eye, the new research—the latest in a series of UCSF studies exploring effects of locomotion on brain function—suggests that the adult brain may be far more capable of rewiring and repairing itself than previously thought.

In 2010, Michael P. Stryker, PhD, the W.F. Ganong Professor of Physiology, and postdoctoral fellow Cris Niell, PhD, now at the University of Oregon, made the surprising discovery that neurons in the visual area of the mouse brain fired much more robustly whenever the mice walked or ran.

Earlier this year, postdoctoral fellow Yu Fu, PhD, Stryker and a number of colleagues built on these findings, identifying and describing the neural circuit responsible for this locomotion-induced “high-gain state” in the visual cortex of the mouse brain.

Neither of these studies made clear, however, whether this circuit might have broader functional or clinical significance.

It has been known since the 1960s that visual areas of the brain do not develop normally if deprived of visual input during a “critical period” of brain development early in life. For example, in humans, if amblyopia (“lazy eye”) or other major eye problems are not surgically corrected in infancy, vision will never be normal in the affected eye—if such individuals lose sight in their “good” eye in later life, they are blind.

In the new research, published June 26, 2014 in the online journal eLife, Stryker and UCSF postdoctoral fellow Megumi Kaneko, MD, PhD, closed one eyelid of mouse pups at about 20 days after birth, and that eye was kept closed until the mice reached about five months of age.

As expected, the mice in which one eye had been closed during the critical developmental period showed sharply reduced neural activity in the part of the brain responsible for vision in that eye.

As in the previous UCSF experiments in this area, some mice were allowed to run freely on Styrofoam balls suspended on a cushion of air while recordings were made from their brains.

Little improvement was seen in the visually deprived mice when they were simply allowed to run, or when they received visual training through the deprived eye without accompanying walking or running.

But when the mice were exposed to the visual stimuli while they were running or walking, the results were dramatic: within a week, the brain responses to those stimuli from the deprived eye were nearly identical to those from the normal eye. This indicates that the circuits in the visual area of the brain representing the deprived eye had undergone a rapid reorganization, known in neuroscience as “plasticity.”
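A toy model can make this logic concrete. The sketch below is purely our illustration, not the authors' model: treat the deprived-eye response as a weight pulled toward the normal-eye level by an error-correcting update during visual training, and let the locomotion-induced "high-gain state" simply multiply the learning rate. All parameter values are made up.

```python
# Toy illustration (not the study's model): deprived-eye responses recover
# via error-correcting updates driven by visual training, and locomotion
# multiplies cortical gain, so the same training recovers faster while running.

def simulate_recovery(gain, days=7, rate=0.15, target=1.0, w0=0.2):
    """Return the deprived-eye response strength after `days` of training.

    gain: multiplicative cortical gain (hypothetical; >1 during locomotion)
    target: normal-eye response level the circuit is pulled toward
    """
    w = w0
    for _ in range(days):
        w += rate * gain * (target - w)  # update scaled by cortical gain
    return w

still = simulate_recovery(gain=1.0)    # visual training at rest
running = simulate_recovery(gain=3.0)  # training in the high-gain state
print(f"still: {still:.2f}, running: {running:.2f} (normal eye ~ 1.0)")
```

With these invented parameters, a week of training while "running" brings the response close to the normal eye's level, while the same training at rest does not, mirroring the qualitative pattern reported in the study.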

Interestingly, this recovery was stimulus-specific: if the brain activity of the mice was tested using a stimulus other than that they had seen while running, little or no recovery of function was apparent.

“We have no idea yet whether running puts the human cortex into a high-gain state that enhances plasticity, as it does the visual cortex of the mouse,” Stryker said, “but we are designing experiments to find out.”

Filed under visual cortex brain function brain activity amblyopia plasticity locomotion neuroscience science

359 notes

Early life stress can leave lasting impacts on the brain

For children, stress can go a long way. A little bit provides a platform for learning, adapting and coping. But a lot of it — chronic, toxic stress like poverty, neglect and physical abuse — can have lasting negative impacts.

A team of University of Wisconsin-Madison researchers recently showed these kinds of stressors, experienced in early life, might be changing the parts of developing children’s brains responsible for learning, memory and the processing of stress and emotion. These changes may be tied to negative impacts on behavior, health, employment and even the choice of romantic partners later in life.

The study, published in the journal Biological Psychiatry, could be important for public policy leaders, economists and epidemiologists, among others, says study lead author and recent UW Ph.D. graduate Jamie Hanson.

"We haven’t really understood why things that happen when you’re 2, 3, 4 years old stay with you and have a lasting impact," says Seth Pollak, co-leader of the study and UW-Madison professor of psychology.

Yet, early life stress has been tied before to depression, anxiety, heart disease, cancer, and a lack of educational and employment success, says Pollak, who is also director of the UW Waisman Center’s Child Emotion Research Laboratory.

"Given how costly these early stressful experiences are for society … unless we understand what part of the brain is affected, we won’t be able to tailor something to do about it," he says.

For the study, the team recruited 128 children around age 12 who early in life had experienced physical abuse or neglect, or who came from low socioeconomic status households.

Researchers conducted extensive interviews with the children and their caregivers, documenting behavioral problems and their cumulative life stress. They also took images of the children’s brains, focusing on the hippocampus and amygdala, which are involved in emotion and stress processing. The children were compared to similar children from middle-class households who had not been maltreated.

Hanson and the team outlined by hand each child’s hippocampus and amygdala and calculated their volumes. Both structures are very small, especially in children (the word amygdala is Greek for almond, reflecting its size and shape in adults), and Hanson and Pollak say the automated software measurements from other studies may be prone to error.
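The arithmetic behind such volumetry is simple. In this hedged sketch (our illustration; the study's actual tracing protocol and imaging parameters are not reproduced here), the volume of a hand-traced structure is the count of labeled voxels times the physical volume of a single voxel:

```python
# Illustrative sketch (not the study's pipeline): once a structure has been
# traced slice by slice as a binary mask, its volume is the number of
# labeled voxels times the volume of one voxel.

def structure_volume_mm3(mask, voxel_dims_mm):
    """mask: nested lists (slice -> row -> 0/1); voxel_dims_mm: (dx, dy, dz)."""
    dx, dy, dz = voxel_dims_mm
    voxels = sum(val for slc in mask for row in slc for val in row)
    return voxels * dx * dy * dz

# A hypothetical two-slice tracing with 1 x 1 x 1.2 mm voxels:
mask = [[[0, 1, 1], [0, 1, 0]],
        [[1, 1, 0], [0, 0, 0]]]
print(structure_volume_mm3(mask, (1.0, 1.0, 1.2)))  # 5 voxels -> 6.0 mm^3
```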

Indeed, their hand measurements found that children who experienced any of the three types of early life stress had smaller amygdalas than children who had not. Children from low socioeconomic status households and children who had been physically abused also had smaller hippocampal volumes. Putting the same images through automated software showed no effects.

Behavioral problems and increased cumulative life stress were also linked to smaller hippocampus and amygdala volumes.

Why early life stress may lead to smaller brain structures is unknown, says Hanson, now a postdoctoral researcher at Duke University’s Laboratory for NeuroGenetics, but a smaller hippocampus is a demonstrated risk factor for negative outcomes. The amygdala is much less understood and future work will focus on the significance of these volume changes.

"For me, it’s an important reminder that as a society we need to attend to the types of experiences children are having," Pollak says. "We are shaping the people these individuals will become."

But the findings, Hanson and Pollak say, are just markers of neurobiological change: a display of the robustness of the human brain and the flexibility of human biology. They aren’t a crystal ball for seeing the future.

"Just because it’s in the brain doesn’t mean it’s destiny," says Hanson.

Filed under stress amygdala neuroimaging hippocampus child development plasticity neuroscience science

176 notes

Hearing with the skull

The ear is an important organ that allows us to perceive the world around us. However, few of us are aware that not only the outer ear but also the skull bone can receive and conduct sound. Tatjana Tchumatchenko from the Max Planck Institute for Brain Research in Frankfurt and Tobias Reichenbach from Imperial College London have now developed a new model explaining how the vibrations of the surrounding bone and of the basilar membrane are coupled. These results could inform the development of new headphones and hearing devices.

Our sense of hearing arises exclusively in the inner ear. When sound waves travel through the air and reach the ear canal, they cause different regions of the basilar membrane in the inner ear to vibrate; which regions vibrate depends on the frequency of the sound. It is these microscopic vibrations of the membrane that we perceive as sound. However, the inner ear is surrounded by bone that can also vibrate.
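This frequency-to-place mapping can be illustrated with the classic Greenwood frequency–position function for the human cochlea. The parameters below are standard literature values for humans, not part of the new model, and the sketch is purely illustrative:

```python
# Toy illustration of cochlear tonotopy: the Greenwood map predicts the
# characteristic frequency at each position along the basilar membrane.
# Human parameters A=165.4, a=2.1, k=0.88 are standard literature values.

def greenwood_frequency(x, A=165.4, a=2.1, k=0.88):
    """Characteristic frequency (Hz) at relative position x (0 = apex, 1 = base)."""
    return A * (10 ** (a * x) - k)

for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"position {x:.2f} -> {greenwood_frequency(x):8.0f} Hz")
```

Low frequencies peak near the apex and high frequencies near the base, spanning roughly 20 Hz to 20 kHz across the human cochlea.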

With the help of fluid dynamics calculations Tchumatchenko and Reichenbach have now discovered that the vibrations of the bone and basilar membrane are coupled. In other words, they can also mutually excite each other.

This gives rise to fascinating phenomena which, thanks to the new model, can now be understood: For example, two sounds with slightly different frequencies that arrive in the inner ear at the same time can overlap and excite the same regions on the basilar membrane. In this case, combination tones, or so-called otoacoustic emissions, are produced in the inner ear through the nonlinearity of the membrane. Precisely how these sounds leave the inner ear and how they spread inside the cochlea is currently a matter of scientific debate. “In our study we have shown that the combination tones can leave the inner ear in the form of a fast wave along the bone surface, and not, as previously assumed, by a wave along the basilar membrane,” explains Tatjana Tchumatchenko from the Max Planck Institute for Brain Research.
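How a nonlinearity creates combination tones can be demonstrated numerically. In this toy sketch (our illustration, not the authors' model), a weak cubic distortion applied to two pure tones f1 and f2 produces energy at 2*f1 - f2, the most prominent combination tone, which is entirely absent from the linear signal:

```python
import math
import cmath

# Toy demonstration: a cubic nonlinearity acting on two pure tones f1 and f2
# generates the cubic distortion product 2*f1 - f2 (here 800 Hz), the most
# prominent otoacoustic combination tone. Frequencies are arbitrary choices.

fs = 8000                 # sample rate (Hz); 1 s of signal -> 1 Hz resolution
f1, f2 = 1000.0, 1200.0
n = fs

def tone_pair(i):
    t = i / fs
    return math.sin(2 * math.pi * f1 * t) + math.sin(2 * math.pi * f2 * t)

def amplitude_at(signal, f):
    """Single-bin DFT amplitude of `signal` at frequency f (Hz)."""
    s = sum(signal(i) * cmath.exp(-2j * math.pi * f * i / fs) for i in range(n))
    return 2 * abs(s) / n

dp = 2 * f1 - f2  # 800 Hz distortion product
linear = amplitude_at(tone_pair, dp)                                   # ~0
nonlin = amplitude_at(lambda i: tone_pair(i) + 0.1 * tone_pair(i) ** 3, dp)
print(f"{dp:.0f} Hz component: linear {linear:.4f}, nonlinear {nonlin:.4f}")
```

The linear signal has no 800 Hz component at all; the cubic term alone creates it (analytically, with amplitude 0.1 * 3/2 * 1/2 = 0.075 here).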

Moreover, the new model shows that the travelling waves along the basilar membrane can be generated by both the vibrations of the cochlear bone and the vibrations of the air inside the ear canal. “Our results provide an elegant explanation for this long-known but poorly understood observation,” says Tobias Reichenbach from Imperial College London.

These results will help advance our understanding of the complex interaction between fluid dynamics and bone mechanics. That understanding could prove essential for future clinical and commercial applications of bone conduction, such as new-generation hearing aids and combinations of headphones and glasses.

Filed under hearing basilar membrane otoacoustic emissions inner ear neuroscience science

248 notes

Bad learning

University of Iowa researchers have discovered a new form of neurotransmission that influences the long-lasting memory created by addictive drugs, like cocaine and opioids, and the subsequent craving for these drugs of abuse. Loss of this type of neurotransmission creates changes in brain cells that resemble the changes caused by drug addiction.

The findings, published June 22 in the journal Nature Neuroscience, suggest that targeting this type of neurotransmission might lead to new therapies for treating drug addiction.

“Molecular therapies for drug addiction are pretty much non-existent,” says Collin Kreple, UI graduate student and co-first author of the study. “I think this finding at least provides the possibility of a new molecular target.”

The new form of neurotransmission involves proteins called acid-sensing ion channels (ASICs), which have previously been shown to promote learning and memory, and which are abundant in a part of the brain that is involved in drug addiction. The researchers, led by John Wemmie, professor of psychiatry in the UI Carver College of Medicine, reasoned that disrupting ASIC activity in this brain region (the nucleus accumbens) should reduce learned addiction-related behaviors. However, their experiments showed that loss of ASIC signaling actually increases learned drug-seeking in mice.

When mice learned to associate one side of a chamber with receiving cocaine, animals that lacked the ASIC protein developed an even stronger preference for the “cocaine side” than control mice, suggesting that loss of ASIC had increased addiction behavior. The same result was seen for morphine, another drug of abuse, which has a different mechanism of action than cocaine.
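For readers unfamiliar with the paradigm, conditioned place preference is typically scored as the change in time spent on the drug-paired side before versus after conditioning. The numbers below are entirely hypothetical and only illustrate the scoring, not the paper's data:

```python
# Hedged sketch of a common conditioned place preference (CPP) score
# (the paper's exact scoring may differ): time spent on the drug-paired
# side after conditioning minus time spent there before.

def cpp_score(pre_s, post_s):
    """Positive score = the animal now prefers the drug-paired side (seconds)."""
    return post_s - pre_s

# Hypothetical session times (seconds on the paired side):
wildtype = cpp_score(pre_s=420.0, post_s=610.0)
asic_ko = cpp_score(pre_s=415.0, post_s=760.0)
print(wildtype, asic_ko)  # the knockout shows the larger preference shift
```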

"Always before, the data suggested that when you get rid of ASICs, learning and memory are impaired," Wemmie says. "So we expected the same trend when we studied reward-related learning and behavior and we were surprised to find the opposite."

In a second experiment, rats learned to press a lever to self-administer cocaine. Blocking or removing ASIC in the rat brains caused the animals to self-administer more cocaine than control animals. Conversely, increasing the amount of ASIC by over-expressing the protein seemed to decrease the animals’ craving for cocaine.

"There are many forms of addiction," says Wemmie, who also holds appointments in the UI Departments of Molecular Physiology and Biophysics and Neurosurgery, and with the Iowa City VA Medical Center. "We’d like to see if these mechanisms also apply to other addictions besides cocaine and morphine. And, we want to move forward to see if this pathway can be used to target addiction."

Novel neurotransmission

As the name suggests, acid-sensing ion channels are activated by acid, in the form of protons. This research and a second UI study recently published in PNAS show that protons and ASICs form a previously unrecognized neurotransmitter pair that helps neurons communicate in a novel way and appears to influence several forms of learning and memory, including fear, as well as addiction.
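Because proton concentration depends exponentially on pH, even a small acidification is a large relative signal for a proton-gated channel. A quick back-of-envelope calculation (ours, not the study's; the pH values are illustrative):

```python
# Back-of-envelope illustration: [H+] = 10**(-pH) mol/L, so a modest pH drop
# translates into a large fold-change in proton concentration, the signal
# that gates ASICs.

def proton_conc(pH):
    """Proton concentration in mol/L at a given pH."""
    return 10 ** (-pH)

rest, acidified = 7.4, 7.0  # illustrative resting and acidified pH values
fold = proton_conc(acidified) / proton_conc(rest)
print(f"pH {rest} -> {acidified}: [H+] rises {fold:.1f}-fold")
```

A drop of just 0.4 pH units raises proton concentration roughly 2.5-fold, which is why ASICs can respond sensitively to small local acid transients.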

Manipulating the activity of ASICs or the level of protons (acidity) may provide a new way to treat addiction.

“We are still a long way from using these findings to create a therapy,” notes Yuan Lu, co-first author and UI postdoctoral scholar. “The key significance of this study is that we have found new, different targets [that might allow us to inhibit the addiction behavior].”

Drugs change the brain

Previous research has shown that drug abuse and addiction physically alter the connections between neurons (synapses) that are important for the creation and storage of memories. Although normal learning requires synapses to be dynamic and plastic, exposure to addictive drugs abnormally increases synaptic plasticity in a way that is thought to underlie drug-related learning and addiction behaviors. The UI study found that absence of ASIC-proton mediated neurotransmission also increased synaptic plasticity in a way that resembled the changes created by addiction and drug withdrawal.

"It seemed like everything we looked at (physiology and structural changes) really paralleled what you would see in an animal undergoing drug withdrawal, even though these animals missing ASIC had never been exposed to drugs," Kreple says.

Overall the study findings suggest that ASIC-related neurotransmission in the nucleus accumbens may play a role in reducing synaptic plasticity and appropriately stabilizing synapses.

Filed under drug addiction neurotransmission nucleus accumbens ion channels cocaine synaptic plasticity neuroscience science
