Neuroscience

Articles and news from the latest research reports.

Increased risk of stroke in people with cognitive impairment
People with cognitive impairment are significantly more likely to have a stroke than people with normal cognitive function, with a 39% increased risk, according to a new study published in CMAJ (Canadian Medical Association Journal).
"Given the projected substantial rise in the number of older people around the world, prevalence rates of cognitive impairment and stroke are expected to soar over the next several decades, especially in high-income countries," writes Dr. Bruce Ovbiagele, Chair of the Department of Neurology, Medical University of South Carolina, Charleston, South Carolina, with coauthors.
Cognitive impairment and stroke are major contributors to disability, and stroke is the second leading cause of death worldwide. Although stroke is linked to the development and worsening of cognitive impairment, it is not known whether the reverse is true. Previous studies of the link between cognitive impairment and subsequent stroke have been inconsistent in their findings.
The study in CMAJ, by researchers in the United States, Taiwan and South Korea, analyzed data from 18 studies of 121,879 people with cognitive impairment, of whom 7,799 later had strokes. Most of the included studies were conducted in North America or Europe.
The researchers observed a significantly higher rate of stroke in people with cognitive impairment than in people with normal cognitive function.
"We found that the risk of future stroke was 39% higher among patients with cognitive impairment at baseline than among those with normal cognitive function at baseline," write the authors. "This risk increased to 64% when a broadly adopted definition of cognitive impairment was used."
Blockage of blood vessels in the brain (brain infarcts), atherosclerosis, inflammation and other vascular conditions are associated with a higher risk of stroke and cognitive impairment and may contribute to the increased risk.
"Cognitive impairment should be more broadly recognized as a possible early clinical manifestation of cerebral infarction, so that timely management of vascular risk factors can be instituted to potentially prevent future stroke events and to avoid further deterioration of cognitive health," conclude the authors.

Filed under stroke cognitive impairment cognitive function neuroscience science

Changes in the eye can predict changes in the brain

Researchers at the Gladstone Institutes and University of California, San Francisco have shown that a loss of cells in the retina is one of the earliest signs of frontotemporal dementia (FTD) in people with a genetic risk for the disorder—even before any changes appear in their behavior.

In a study published today in the Journal of Experimental Medicine, researchers led by Gladstone investigator Li Gan, PhD, and UCSF associate professor of neurology Ari Green, MD, examined a group of individuals who had a certain genetic mutation that is known to result in FTD. They discovered that before any cognitive signs of dementia were present, these individuals showed significant thinning of the retina compared with people who did not have the gene mutation.

“This finding suggests that the retina acts as a type of ‘window to the brain,’” said Dr. Gan. “Retinal degeneration was detectable in mutation carriers prior to the onset of cognitive symptoms, establishing retinal thinning as one of the earliest observable signs of familial FTD. This means that retinal thinning could be an easily measured outcome for clinical trials.”

Although it is located in the eye, the retina is made up of neurons with direct connections to the brain. This means that studying the retina is one of the easiest and most accessible ways to examine and track changes in neurons.

Lead author Michael Ward, MD, PhD, a postdoctoral fellow at the Gladstone Institutes and assistant professor of neurology at UCSF, explained, “The retina may be used as a model to study the development of FTD in neurons. If we follow these patients over time, we may be able to correlate a decline in retinal thickness with disease progression. In addition, we may be able to track the effectiveness of a treatment through a simple eye examination.”

The researchers also discovered new mechanisms by which cell death occurs in FTD. As with most complex neurological disorders, there are several changes in the brain that contribute to the development of FTD. In the inherited form researched in the current study, this includes a deficiency of the protein progranulin, which is tied to the mislocalization of another crucial protein, TDP-43, from the nucleus of the cell out to the cytoplasm.

However, the relationship between neurodegeneration, progranulin, and TDP-43 was previously unclear. In follow-up studies using a genetic mouse model of FTD, the scientists were able to investigate this connection for the first time in neurons from the retina. They identified a depletion of TDP-43 from the cell nuclei before any signs of neurodegeneration occurred, signifying that this loss may be a direct cause of the cell death associated with FTD.

TDP-43 levels were shown to be regulated by a third cellular protein called Ran. By increasing expression of Ran, the researchers were able to elevate TDP-43 levels in the nucleus of progranulin-deficient neurons and prevent their death.

“With these findings,” said Dr. Gan, “we now not only know that retinal thinning can act as a pre-symptomatic marker of dementia, but we’ve also gained an understanding into the underlying mechanisms of frontotemporal dementia that could potentially lead to novel therapeutic targets.”

(Source: gladstoneinstitutes.org)

Filed under frontotemporal dementia retina genetic mutation neurodegeneration TDP-43 neurons neuroscience science

Train your heart to protect your mind
Exercising to improve our cardiovascular strength may protect us from cognitive impairment as we age, according to a new study by researchers at the University of Montreal and its affiliated Institut universitaire de gériatrie de Montréal Research Centre. “Our body’s arteries stiffen with age, and the vessel hardening is believed to begin in the aorta, the main vessel coming out of the heart, before reaching the brain. Indeed, the hardening may contribute to cognitive changes that occur during a similar time frame,” explained Claudine Gauthier, first author of the study. “We found that older adults whose aortas were in a better condition and who had greater aerobic fitness performed better on a cognitive test. We therefore think that the preservation of vessel elasticity may be one of the mechanisms that enables exercise to slow cognitive aging.”
The researchers worked with 31 young people between the ages of 18 and 30 and 54 older participants aged between 55 and 75. This enabled the team to compare the older participants within their peer group and against the younger group, who have not yet begun the aging processes in question. None of the participants had physical or mental health issues that might influence the study outcome. Their fitness was tested by exhausting the participants on a workout machine and determining their maximum oxygen intake over a 30-second period. Their cognitive abilities were assessed with the Stroop task, a scientifically validated test that involves asking someone to identify the ink colour of a colour word that is printed in a different colour (e.g. the word red could be printed in blue ink, and the correct answer would be blue). A person who is able to correctly name the colour of the word without being distracted by the reflex to read it has greater cognitive agility.
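For illustration, here is a minimal sketch of how Stroop trials can be generated and scored. The colour set and trial structure are assumptions made for the example, not the study's actual task parameters.

```python
import random

COLOURS = ["red", "blue", "green"]

def make_trial(congruent, rng=random):
    """Return (word, ink_colour). Incongruent trials print a colour word in a different ink."""
    word = rng.choice(COLOURS)
    if congruent:
        ink = word
    else:
        ink = rng.choice([c for c in COLOURS if c != word])
    return word, ink

def correct_answer(word, ink):
    # The correct response is always the ink colour, never the word itself.
    return ink

word, ink = make_trial(congruent=False)
print(f'The word "{word}" printed in {ink} ink -> answer: {correct_answer(word, ink)}')
```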
The participants undertook three MRI scans: one to evaluate the blood flow to the brain, one to measure their brain activity as they performed the Stroop task, and one to actually look at the physical state of their aorta. The researchers were interested in the brain’s blood flow, as poorer cardiovascular health is associated with a faster pulse wave at each heartbeat, which in turn could cause damage to the brain’s smaller blood vessels. “This is the first study to use MRI to examine participants in this way,” Gauthier said. “It enabled us to find even subtle effects in this healthy population, which suggests that other researchers could adapt our test to study vascular-cognitive associations within less healthy and clinical populations.”
The results demonstrated age-related declines in executive function, aortic elasticity and cardiorespiratory fitness, a link between vascular health and brain function, and a positive association between aerobic fitness and brain function. “The link between fitness and brain function may be mediated through preserved cerebrovascular reactivity in periventricular watershed areas that are also associated with cardiorespiratory fitness,” Gauthier said. “Although the impact of fitness on cerebral vasculature may however involve other, more complex mechanisms, overall these results support the hypothesis that lifestyle helps maintain the elasticity of arteries, thereby preventing downstream cerebrovascular damage and resulting in preserved cognitive abilities in later life.”

Filed under aging cognition cardiorespiratory fitness executive function brain function neuroscience science

'Haven't my neurons seen this before?'
The world grows increasingly chaotic year after year, and our brains are constantly bombarded with images. A new study from the Center for the Neural Basis of Cognition (CNBC), a joint project between Carnegie Mellon University and the University of Pittsburgh, reveals how neurons in the part of the brain responsible for recognizing objects respond to being shown a barrage of images. The study is published online by Nature Neuroscience.
The CNBC researchers showed animal subjects a rapid succession of images, some that were new, and some that the subjects had seen more than 100 times. The researchers measured the electrical response of individual neurons in the inferotemporal cortex, an essential part of the visual system and the part of the brain responsible for object recognition.
In previous studies, researchers found that when subjects were shown a single, familiar image, their neurons responded less strongly than when they were shown an unfamiliar image. However, in the current study, the CNBC researchers found that when subjects were exposed to familiar and unfamiliar images in a rapid succession, their neurons — especially the inhibitory neurons — fired much more strongly and selectively to images the subject had seen many times before.
"It was such a dramatic effect, it leapt out at us," said Carl Olson, a professor at Carnegie Mellon. "You wouldn’t expect there to be such deep changes in the brain from simply making things familiar. We think this may be a mechanism the brain uses to track a rapidly changing visual environment."
The researchers then ran a similar experiment in which they used themselves as subjects, recording their brain activity using EEG. They found that the humans’ brains responded similarly to the animal subjects’ brains when presented with familiar or unfamiliar images in rapid succession. In future studies, they hope to link these changes in the brain to improvements in perception and cognition.

Filed under inferotemporal cortex object recognition brain activity neurons neuroscience science

Neuroscience and big data: How to find simplicity in the brain
Scientists can now monitor and record the activity of hundreds of neurons concurrently in the brain, and ongoing technology developments promise to increase this number manyfold. However, simply recording the neural activity does not automatically lead to a clearer understanding of how the brain works.
In a new review paper published in Nature Neuroscience, Carnegie Mellon University’s Byron M. Yu and Columbia University’s John P. Cunningham describe the scientific motivations for studying the activity of many neurons together, along with a class of machine learning algorithms — dimensionality reduction — for interpreting the activity.
In recent years, dimensionality reduction has provided insight into how the brain distinguishes between different odors, makes decisions in the face of uncertainty and is able to think about moving a limb without actually moving. Yu and Cunningham contend that using dimensionality reduction as a standard analytical method will make it easier to compare activity patterns in healthy and abnormal brains, ultimately leading to improved treatments and interventions for brain injuries and disorders.
"One of the central tenets of neuroscience is that large numbers of neurons work together to give rise to brain function. However, most standard analytical methods are appropriate for analyzing only one or two neurons at a time. To understand how large numbers of neurons interact, advanced statistical methods, such as dimensionality reduction, are needed to interpret these large-scale neural recordings," said Yu, an assistant professor of electrical and computer engineering and biomedical engineering at CMU and a faculty member in the Center for the Neural Basis of Cognition (CNBC).
The idea behind dimensionality reduction is to summarize the activity of a large number of neurons using a smaller number of latent (or hidden) variables. Dimensionality reduction methods are particularly useful for uncovering inner workings of the brain, such as when we ruminate or solve a mental math problem, where all the action is going on inside the brain and not in the outside world. These latent variables can be used to trace out the path of one's thoughts.
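As a concrete illustration, the sketch below uses PCA, one of the simplest dimensionality-reduction methods, on synthetic data: 100 "neurons" whose activity is secretly driven by only 3 latent variables. The data, dimensions, and noise level are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic population activity: 100 neurons, 500 time points,
# driven by only 3 underlying latent variables plus a little noise.
n_neurons, n_timepoints, n_latent = 100, 500, 3
latents = rng.standard_normal((n_timepoints, n_latent))
weights = rng.standard_normal((n_latent, n_neurons))
activity = latents @ weights + 0.1 * rng.standard_normal((n_timepoints, n_neurons))

# PCA via SVD of the mean-centred data.
centred = activity - activity.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
explained = S**2 / np.sum(S**2)

# The first 3 components capture nearly all the variance, recovering
# the low-dimensional structure hidden across 100 neurons.
print(f"variance explained by first 3 PCs: {explained[:3].sum():.1%}")
low_dim = centred @ Vt[:3].T  # a 500 x 3 summary of the whole population
```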
"One of the major goals of science is to explain complex phenomena in simple terms. Traditionally, neuroscientists have sought to find simplicity with individual neurons. However, it is becoming increasingly recognized that neurons show varied features in their activity patterns that are difficult to explain by examining one neuron at a time. Dimensionality reduction provides us with a way to embrace single-neuron heterogeneity and seek simple explanations in terms of how neurons interact with each other," said Cunningham, assistant professor of statistics at Columbia.
Although dimensionality reduction is relatively new to neuroscience compared to existing analytical methods, it has already shown great promise. With Big Data getting ever bigger thanks to the continued development of neural recording technologies and the federal BRAIN Initiative, the use of dimensionality reduction and related methods will likely become increasingly essential.

Filed under neurons neural activity neural recordings neuroscience science

Driving brain rhythm makes mice more sensitive to touch
By striking up the right rhythm in the right brain region at the right time, Brown University neuroscientists report in Nature Neuroscience that they managed to endow mice with greater touch sensitivity than other mice, making hard-to-perceive vibrations suddenly more vivid to them.
The findings offer the first direct evidence that “gamma” brainwaves in the cortex affect perception and attention. With only correlations and associations as evidence before, neuroscientists have argued for years about whether gamma has an important role or whether it’s merely a byproduct — an “exhaust fume” in the words of one — of such brain activity.
“There’s a lot of excitement about the importance of gamma rhythms in behavior, as well as a lot of skepticism,” said co-lead author Joshua Siegle, a former graduate student at Brown University and MIT, who is now at the Allen Institute for Brain Science. “Rather than try to correlate changes in gamma rhythms with changes in behavior, which is what researchers have done in the past, we chose to directly control the cells that produce gamma.”
The result was a mouse with whiskers that were about 20 percent more sensitive.
“There were a lot of ways this experiment could have failed but instead to our surprise it was pretty decisive from the very first subject we looked at — that under certain conditions we can make a super-perceiving mouse,” said Christopher Moore, associate professor of neuroscience at Brown and senior author of the study. “We’re making a mouse do better than a mouse could have done otherwise.”
Specifically, Moore and co-first authors Siegle and Dominique Pritchett performed their experiments by using optogenetics — a technique of using light to control the firing patterns of neurons — to generate a gamma rhythm by manipulating inhibitory interneurons in the primary sensory neocortex of mice. That part of the brain controls a mouse’s ability to detect faint sensations via its whiskers.
A different part of the brain handles stronger, more imposing sensations, Moore said. The primary sensory neocortex, a particular feature of mammals, has the distinction of allowing an animal to purposely pay attention to more subtle sensations. It’s the difference between the feeling of gently brushing a fingertip along a wood board to assess if it needs a bit more sanding and the feeling of dropping the wood board on a foot.
Before anything else in the paper, the researchers confirmed that mice sometimes naturally produce a 40-hertz gamma rhythm in their sensory neocortex. Then they optogenetically generated that gamma rhythm with precise pulses of blue light. Mice with this rhythm could more often detect the fainter vibrations the researchers supplied to their whiskers than could mice that did not have the rhythm going in their brains.
Control and optogenetically stimulated mice alike had been conditioned to indicate their detection of a supplied stimulus by licking a water bottle. The vibrations presented to the mice spanned 17 different levels of detectability.
The team’s hypothesis was that the gamma rhythm of the stimulated neurons, because they inhibit the transmission of sensation messages by pyramidal neurons in the neocortex with a structured periodicity, actually orders the pyramidal messages into a more coherent and therefore stronger train.
“It’s not surprising that these synchronized bursts of activity can benefit signal transmission, in the same way that synchronized clapping in a crowd of people is louder than random clapping,” Siegle said.
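Siegle's clapping analogy can be sketched numerically: the same number of spikes drives a downstream target much harder when they arrive together. The neuron count, time base, and smoothing window below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_bins = 50, 1000  # 1 ms bins; numbers chosen for illustration only

# Asynchronous ("random clapping"): each neuron spikes once, anywhere in the window.
async_spikes = np.zeros((n_neurons, n_bins))
async_spikes[np.arange(n_neurons), rng.integers(0, n_bins, n_neurons)] = 1

# Synchronized: every neuron spikes within the same 5 ms window.
sync_spikes = np.zeros((n_neurons, n_bins))
sync_spikes[np.arange(n_neurons), rng.integers(500, 505, n_neurons)] = 1

# Downstream "loudness": peak of the population signal summed over a 5 ms window.
kernel = np.ones(5)
peak_async = np.convolve(async_spikes.sum(axis=0), kernel, mode="same").max()
peak_sync = np.convolve(sync_spikes.sum(axis=0), kernel, mode="same").max()

print(f"peak drive, asynchronous: {peak_async:.0f}")
print(f"peak drive, synchronized: {peak_sync:.0f}")
```

With identical total spike counts, the synchronized train produces a far larger momentary peak, which is the sense in which structured timing can "amplify" a signal without any increase in firing rate.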
This idea suggested that the timing of the rhythm matters.
So in another experiment, Siegle, Pritchett, and Moore varied the onset of the gamma rhythm by increments of 5 milliseconds to see whether it made a difference to perception. It did. The mice showed their increased sensitivity only so long as the gamma rhythms were underway 20-25 milliseconds before the subtle sensations were presented. If they weren’t, the mice experienced on average no impact on sensitivity.
One of the key implications from the findings for neuroscience, Moore said, is that the way gamma rhythms appear to structure the processing of perception is more important than the mere firing rate of neurons in the sensory neocortex. Mice became better able to feel not because neurons became more active (they didn’t), but because they were entrained by a precisely timed rhythm.
Although the study provides causal evidence of a functional importance for gamma rhythms, Moore acknowledged, it still leaves open important questions. The exact mechanisms by which gamma rhythms affect sensation processing and attention are not proved, only hypothesized.
And in one experiment, optogenetically stimulated mice appeared less able to detect the most obvious and imposing of the sensations, even as they became more sensitive to the more subtle ones. In other experiments, however, their detection of major sensations was not compromised.
But the possible loss of sensitivity to stimuli that are easier to feel could be consistent with a shifting of attention to fainter ones, said Pritchett, also a former Brown and MIT student now at the Champalimaud Centre for the Unknown in Lisbon, Portugal.
“What we are showing is that, paradoxically, the rhythmic inhibitory input works to amplify threshold stimuli, possibly at the expense of salient stimuli,” he said. “This is precisely what you would expect from a mechanism that might be responsible for selective attention in the brain.”
Therefore, Siegle, Pritchett, and Moore say they do have a better feel now for what’s going on in the brain.

Driving brain rhythm makes mice more sensitive to touch

By striking up the right rhythm in the right brain region at the right time, Brown University neuroscientists report in Nature Neuroscience that they managed to endow mice with greater touch sensitivity than other mice, making hard-to-perceive vibrations suddenly more vivid to them.

The findings offer the first direct evidence that “gamma” brainwaves in the cortex affect perception and attention. With only correlations and associations as evidence before, neuroscientists have argued for years about whether gamma has an important role or whether it’s merely a byproduct — an “exhaust fume” in the words of one — of such brain activity.

“There’s a lot of excitement about the importance of gamma rhythms in behavior, as well as a lot of skepticism,” said co-lead author Joshua Siegle, a former graduate student at Brown University and MIT, who is now at the Allen Institute for Brain Science. “Rather than try to correlate changes in gamma rhythms with changes in behavior, which is what researchers have done in the past, we chose to directly control the cells that produce gamma.”

The result was a mouse with whiskers that were about 20 percent more sensitive.

“There were a lot of ways this experiment could have failed but instead to our surprise it was pretty decisive from the very first subject we looked at — that under certain conditions we can make a super-perceiving mouse,” said Christopher Moore, associate professor of neuroscience at Brown and senior author of the study. “We’re making a mouse do better than a mouse could have done otherwise.”

Specifically, Moore and co-first authors Siegle and Dominique Pritchett performed their experiments by using optogenetics — a technique of using light to control the firing patterns of neurons — to generate a gamma rhythm by manipulating inhibitory interneurons in the primary sensory neocortex of mice. That part of the brain controls a mouse’s ability to detect faint sensations via its whiskers.

A different part of the brain handles stronger, more imposing sensations, Moore said. The primary sensory neocortex, a particular feature of mammals, has the distinction of allowing an animal to purposely pay attention to more subtle sensations. It’s the difference between the feeling of gently brushing a fingertip along a wood board to assess if it needs a bit more sanding and the feeling of dropping the wood board on a foot.

The researchers first confirmed that mice naturally produce a 40-hertz gamma rhythm in their sensory neocortex at times. They then generated that gamma rhythm optogenetically with precise pulses of blue light. Mice with this induced rhythm detected the faint vibrations supplied to their whiskers more often than mice without it.

Control and optogenetically stimulated mice alike had been conditioned to indicate their detection of a supplied stimulus by licking a water bottle. The vibrations presented to the mice spanned 17 different levels of detectability.

The team’s hypothesis was that the gamma rhythm of the stimulated interneurons, which inhibit the transmission of sensory messages by pyramidal neurons in the neocortex with a structured periodicity, actually orders the pyramidal messages into a more coherent and therefore stronger train.

“It’s not surprising that these synchronized bursts of activity can benefit signal transmission, in the same way that synchronized clapping in a crowd of people is louder than random clapping,” Siegle said.
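The clapping analogy can be made concrete with a toy calculation, assuming spikes that are either phase-locked to a 25-millisecond gamma cycle or spread at random. This is only an illustration of coherent summation, not the authors' model; all numbers are invented.

```python
import random

random.seed(0)

N_NEURONS = 100    # neurons each firing once per gamma cycle
CYCLE_MS = 25.0    # one cycle of a 40 Hz gamma rhythm
BIN_MS = 1.0
JITTER_MS = 2.0    # phase-locked spikes cluster tightly around one phase

def population_peak(spike_times):
    """Bin spikes at 1 ms resolution and return the tallest bin."""
    bins = [0] * int(CYCLE_MS / BIN_MS)
    for t in spike_times:
        bins[int(t % CYCLE_MS / BIN_MS)] += 1
    return max(bins)

# Phase-locked: every neuron fires near the same phase of the cycle.
locked = [random.gauss(CYCLE_MS / 2, JITTER_MS) % CYCLE_MS for _ in range(N_NEURONS)]

# Unsynchronized: the same number of spikes, spread uniformly over the cycle.
scattered = [random.uniform(0, CYCLE_MS) for _ in range(N_NEURONS)]

peak_locked = population_peak(locked)
peak_scattered = population_peak(scattered)
print(peak_locked, peak_scattered)
# The synchronized train produces a far taller population peak from the
# same total spike count — a louder "clap" arriving at downstream neurons.
```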

This idea suggested that the timing of the rhythm matters.

So in another experiment, Siegle, Pritchett, and Moore varied the onset of the gamma rhythm in increments of 5 milliseconds to see whether it made a difference to perception. It did. The mice showed their increased sensitivity only when the gamma rhythm was underway 20-25 milliseconds before the subtle sensations were presented. Outside that window, the stimulation had no impact on sensitivity on average.
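Put in timing terms, a 40-hertz cycle lasts 25 milliseconds, so the effective 20-25 millisecond lead corresponds to roughly one full gamma cycle before the stimulus arrives. A toy sketch of that arithmetic (the window and the 5-millisecond increments come from the article; the code itself is purely illustrative):

```python
PERIOD_MS = 1000 / 40            # one cycle of a 40 Hz gamma rhythm = 25 ms
EFFECTIVE_WINDOW_MS = (20, 25)   # rhythm had to lead the stimulus by 20-25 ms

# Candidate lead times, varied in 5 ms increments as in the experiment.
lead_times_ms = [0, 5, 10, 15, 20, 25, 30]

effective = [t for t in lead_times_ms
             if EFFECTIVE_WINDOW_MS[0] <= t <= EFFECTIVE_WINDOW_MS[1]]
print(effective)  # → [20, 25] — about one full cycle of lead time
```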

One of the key implications from the findings for neuroscience, Moore said, is that the way gamma rhythms appear to structure the processing of perception is more important than the mere firing rate of neurons in the sensory neocortex. Mice became better able to feel not because neurons became more active (they didn’t), but because they were entrained by a precisely timed rhythm.

Although the study provides causal evidence of a functional role for gamma rhythms, Moore acknowledged, it still leaves important questions open. The exact mechanism by which gamma rhythms affect sensory processing and attention is not proven, only hypothesized.

And in one experiment, optogenetically stimulated mice appeared less able to detect the most obvious and imposing of the sensations, even as they became more sensitive to the more subtle ones. In other experiments, however, their detection of major sensations was not compromised.

But the possible loss of sensitivity to stimuli that are easier to feel could be consistent with a shifting of attention to fainter ones, said Pritchett, also a former Brown and MIT student now at the Champalimaud Centre for the Unknown in Lisbon, Portugal.

“What we are showing is that, paradoxically, the rhythmic inhibitory input works to amplify threshold stimuli, possibly at the expense of salient stimuli,” he said. “This is precisely what you would expect from a mechanism that might be responsible for selective attention in the brain.”

Therefore, Siegle, Pritchett, and Moore say they do have a better feel now for what’s going on in the brain.

Filed under gamma oscillations interneurons optogenetics tactile stimulation neuroscience science

294 notes

Fed Up with Waiting? Timely Activation of Serotonin Enhances Patience

Lining up in a long queue for a popular restaurant or waiting for the arrival of a date requires a great deal of patience. Our lives are full of decisions involving patience, yet it needs to be exercised at the appropriate times. To examine the brain mechanism for controlling patience to obtain a reward, Drs. Kayoko Miyazaki and Katsuhiko Miyazaki and Prof. Kenji Doya of the Neural Computation Unit at the Okinawa Institute of Science and Technology Graduate University used a technique called optogenetics, in which light is used to stimulate specific neurons with precise timing. Their most recent research shows that activating serotonin neurons specifically during waiting promotes patience for delayed rewards. The research was published in the online version of Current Biology on August 21, 2014.

In this study, the researchers used genetically engineered mice that produce light-activated molecules only in neurons that produce serotonin. They implanted an optical fiber in a small part of the brain called the dorsal raphe, from which serotonin-releasing neural fibers extend throughout the cerebrum, the largest and most highly developed part of the brain. The researchers trained five of these mice on a delayed reward task: if a mouse waited at a hole, it would receive a food pellet as a reward. To show that it was waiting, each mouse had to hold its nose inside the hole where the food pellet would appear, a posture the researchers call a nose poke. The waiting durations were randomly chosen from 3, 6, or 9 seconds, or infinity, meaning no reward was given no matter how long the mouse waited; no cue indicated in advance how long the wait would be. In half of the trials, the researchers stimulated serotonin neurons by shining light through the optical fiber while the mice were waiting. The mice consistently waited 3 and 6 seconds to receive the food, but when they had to wait 9 seconds, they struggled and often withdrew their noses from the food hole. Light stimulation of serotonin neurons during the nose poke significantly decreased the number of failures to wait the full 9 seconds for the food.

In 25% of the trials, the food pellet was not delivered regardless of how long the mouse waited. In these trials, mice waited 12.0 seconds on average without serotonin stimulation, and 17.5 seconds on average with it. As a control, the researchers activated serotonin neurons at other times, when the mouse did not have its nose in the food hole, and observed that the mice behaved the same as in unstimulated trials, with no evidence of simple motor inhibition. The results showed, for the first time, that timed activation of serotonin neurons promotes animals’ patience for delayed rewards.
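The headline comparison is a simple split-and-average over the no-reward trials. A sketch of that computation on hypothetical trial records (the data structure and values are invented, chosen only so that the means match the reported 12.0 s and 17.5 s):

```python
# Each record: waiting duration (s) in one no-reward trial, and whether
# serotonin neurons were optogenetically stimulated during the nose poke.
trials = [
    {"wait_s": 11.0, "stim": False},
    {"wait_s": 13.0, "stim": False},
    {"wait_s": 17.0, "stim": True},
    {"wait_s": 18.0, "stim": True},
]

def mean_wait(trials, stim):
    """Average waiting time across trials of one stimulation condition."""
    waits = [t["wait_s"] for t in trials if t["stim"] == stim]
    return sum(waits) / len(waits)

print(mean_wait(trials, stim=False))  # → 12.0  (baseline patience)
print(mean_wait(trials, stim=True))   # → 17.5  (patience under stimulation)
```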

Serotonin is a neuromodulator released diffusely throughout the brain. It is involved in behavioral, cognitive, and mental functions. Classically, serotonin was believed to signal punishment and inhibit behaviors. However, serotonin-boosting drugs, known as SSRIs, are effective treatments for depression, which is hard to reconcile with the classic view. Another recent study of optogenetic stimulation of serotonin neurons even reported a rewarding effect, further complicating the story. On the other hand, another line of research, including previous work by the OIST researchers, showed that a lack of serotonin causes impulsive behaviors. “Our previous studies have shown that serotonin levels increase when waiting for delayed rewards. We have also shown that inhibiting serotonin neurons leads to an inability to wait for a long time,” explained Kayoko and Katsuhiko Miyazaki. “By using light to stimulate neurons at specific times, this study has proven serotonin’s role in patience during delayed reward waiting, underlining serotonin’s much greater role than previously thought.” By further exploring the effects of serotonin, the researchers hope to decipher the neuronal networks behind mental disorders and behaviors involving serotonin. Such studies can promote a better understanding of human emotions and could inform the development of software and robots that think and act like humans.

Filed under serotonin optogenetics dorsal raphe serotonergic neurons neuroscience science

83 notes

Mouse model for epilepsy, Alzheimer’s gives window into the working brain

University of Utah scientists have developed a genetically engineered line of mice that is expected to open the door to new research on epilepsy, Alzheimer’s and other diseases.

The mice carry a protein marker whose degree of fluorescence changes in response to different calcium levels. This will allow many cell types, including glial cells called astrocytes and microglia, to be studied in new ways.

"This is opening up the possibility to decipher how the brain works," said Petr Tvrdik, Ph.D., a research fellow in human genetics and a senior author on the study.

The research was published Aug. 14, 2014, in Neuron, a world-leading neuroscience journal. The work is the result of a three-year study involving multiple labs connected with The Brain Institute at the University of Utah. The lead author is J. Michael Gee, who is pursuing both a medical degree and a graduate degree in bioengineering at the university.

"We’re really in the era of team science," said John White, Ph.D., professor of bioengineering, executive director of the Brain Institute and the study’s corresponding author.

With the new mouse line, scientists can use a laser-based fluorescence microscope to study the calcium indicator in the glial cells of the living mouse, either when the mouse is anesthetized or awake. Calcium is studied because it is an important signaling molecule in the body and it can reveal how well the brain is functioning.
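A common way to quantify such fluorescence changes is ΔF/F: the deviation of the recorded trace from its baseline, expressed as a fraction of that baseline. A minimal sketch of this standard computation (the trace values are invented for illustration; this is not the study's analysis pipeline):

```python
# Raw fluorescence trace from one cell (arbitrary units, invented values):
# a quiet baseline followed by a calcium transient.
trace = [100.0, 101.0, 99.0, 100.0, 140.0, 160.0, 130.0, 105.0, 100.0]

# Baseline F0: mean fluorescence over a quiet pre-event window.
f0 = sum(trace[:4]) / 4

# dF/F: fractional change in fluorescence, a proxy for calcium level.
dff = [(f - f0) / f0 for f in trace]

peak = max(dff)
print(round(peak, 2))  # → 0.6  (a 60% fluorescence increase at the transient)
```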

Using this method, the scientists are essentially creating a window into the working brain to study the interactions between neurons, astrocytes and microglia.

"We believe this will give us new insights for treatments of epilepsy and for new views of how the immune system of the brain works," White said.

About one-third of the 3 million Americans estimated to have epilepsy lack adequate treatment to manage the disease.

Describing a long-standing collaboration with fellow university researcher and professor of pharmacology and toxicology Karen Wilcox, Ph.D., White said, “We believe the glial cells are malfunctioning in epilepsy. What we’re trying to do is find out in what ways astrocytes participate in the disease.”

This research is expected to lead to new classes of drugs.

The ability to track calcium changes in microglial cells will also open up the possibility of studying inflammatory diseases of the brain. Every neurological disease, including multiple sclerosis and Alzheimer’s, appears to include components of inflammation, the scientists said.

Live imaging and monitoring of microglial activity and responses to inflammation, particularly in living animals, "was not possible before," said Tvrdik. In the past, researchers studied post-mortem tissue or relied on invasive approaches using synthetic dyes.

(Source: eurekalert.org)

Filed under epilepsy alzheimer's disease glial cells neurons animal model calcium neuroscience science

78 notes

Study of self-awareness in MS has implications for rehabilitation

A new study of self-awareness by Kessler Foundation researchers shows that persons with multiple sclerosis (MS) may be able to improve their self-awareness through task-oriented cognitive rehabilitation. The study was published online ahead of print on July 2 in NeuroRehabilitation. Self-awareness is one’s ability to recognize cognitive problems caused by brain injury. This is the first study of self-awareness in MS to include assessment of online awareness as well as metacognitive awareness.

Yael Goverover, PhD, OT, is a visiting scientist at Kessler Foundation. She is an associate professor at New York University. Dr. Goverover is a recipient of the National Institute on Disability and Rehabilitation Research Fellowship award (Mary Switzer Award). Drs. Genova, Chiaravalloti and DeLuca are MS researchers at Kessler Foundation.

The researchers assessed 18 people with MS and 16 healthy controls for two types of self-awareness: metacognitive knowledge of disabilities (intellectual awareness) and online awareness (emergent or anticipatory awareness). They also looked at the relationships among self-awareness, functional performance, and quality of life (QoL). Assessment involved the Functional Behavior Profile, questionnaires administered before and after functional tasks (purchasing cookies and airline tickets via the Internet), and the Functional Assessment of Multiple Sclerosis measure.
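Assessments of this kind commonly quantify awareness as the discrepancy between self-rated and actual performance, with scores near zero indicating accurate self-awareness. A hypothetical sketch of such a discrepancy score (the scale, field names, and values are invented, not the study's scoring):

```python
# Ratings on a 1-10 scale (invented example values): performance predicted
# before the task, self-assessed after it, and examiner-scored actual performance.
participants = [
    {"predicted": 9, "self_after": 7, "actual": 6},
    {"predicted": 5, "self_after": 5, "actual": 5},
]

def online_discrepancy(p):
    """Positive = overestimates own performance; near 0 = accurate awareness."""
    return p["self_after"] - p["actual"]

scores = [online_discrepancy(p) for p in participants]
print(scores)  # → [1, 0]
```

Comparing such scores before and after repeated task exposure is one simple way to ask whether awareness improves with experience, as the study suggests.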

“Results showed that compared with controls, people with MS assessed their actual performance more realistically following completion of a task. This suggests that individuals may be able to improve their self-awareness through more experience with tasks,” noted Nancy Chiaravalloti, PhD, director of Neuropsychology & Neuroscience Research at Kessler Foundation.

"Research that leads to better understanding of types of self-awareness, functional outcomes and QOL will aid the development of effective assessments and rehabilitation interventions,” said Dr. Chiaravalloti. “The association between online awareness and task performance in this study, for example, may have implications for cognitive rehabilitation strategies in the MS population.”

(Source: kesslerfoundation.org)

Filed under MS self-awareness cognition psychology neuroscience science

133 notes

New enzyme targets for selective cancer therapies

Thanks to important discoveries in basic and clinical research and technological advances, the fight against cancer has mobilized into a complex offensive spanning multiple fronts.

Work happening in a University of Alberta chemistry lab could help find new and more selective therapies for cancer. Researchers have developed a compound that targets a specific enzyme overexpressed in certain cancers—and they have tested its activity in cells from brain tumours.

Chemistry professor Christopher Cairo and his team synthesized a first-of-its-kind inhibitor that prevents the activity of an enzyme called neuraminidase. Although flu viruses use enzymes with the same mechanism as part of the process of infection, human cells use their own forms of the enzyme in many biological processes.

Cairo’s group collaborated with a group in Milan, Italy, that has shown that neuraminidases are found in excess amounts in glioblastoma cells, a form of brain cancer.

In a new study, a team from the University of Milan tested Cairo’s enzyme inhibitor and found that it turned glioblastoma cancer stem cells—found within a tumour and believed to drive cancer growth—into normal cells. The compound also caused the cells to stop growing, suggesting that this mechanism could be important for therapeutics. Results of their efforts were published Aug. 22 in the Nature journal Cell Death & Disease.

Cairo said these findings establish that an inhibitor of this enzyme could work therapeutically and should open the door for future research.

“This is the first proof-of-concept showing a selective neuraminidase inhibitor can have a real effect in human cancer cells,” he said. “It isn’t a drug yet, but it establishes a new target that we think can be used for creating new, more selective drugs.”

Long road from proof of concept to drug

Proving the compound can successfully inhibit the neuraminidase enzyme in cancer cells is just the first step in determining its potential as a therapy.

In its current form, the compound could not be used as a drug, Cairo explained, largely because it wasn’t designed to breach the blood-brain barrier, making it difficult for it to reach the target cells. The team in Milan had to use the compound in very high concentrations, he added.

The research advances our understanding of how important carbohydrates are to the function of cells. Although most of us think of glucose (blood sugar) as the only important sugar in biology, there is an entire area of research known as glycobiology that seeks to understand the function of complex carbohydrate structures in cells. Carbohydrate structures cover the surface of cells, and affect how cells interact with each other and with pathogens.

Scientists have known for decades that the carbohydrates found on cancer cells are very different from those on normal cells. For example, many cancers have different amounts of specific residues like sialic acid, or may have different arrangements of the same residues.

“The carbohydrates on the cell surface determine how it interacts with other cells, which makes them important in cancer and other diseases. So, if we can design compounds that change these structures in a defined way, we can affect those interactions,” Cairo explained. “Finding new enzyme targets is essential to that process, and our work shows that we can selectively target this neuraminidase enzyme.”

Although there has been a lot of work on targeting viral neuraminidase enzymes, Cairo’s team has found inhibitors of the human enzymes. “The challenge in human cells is that there are four different isoenzymes. While we might want to target one for its role in cancer, hitting the wrong one could have harmful side-effects,” he said.

The U of A team reached out to their colleagues in Milan who were studying the role of a specific neuraminidase isoenzyme in cancer cells isolated from patients. Cairo approached them about testing a compound his team identified last year, which was selective for the same isoenzyme.

“I expected it would do something, but I didn’t know it would be that striking. It came out beautifully,” Cairo said.

The U of A team is already working on improving the compound, and developing and testing new and existing inhibitors using a panel of in vitro assays they developed.

“We’ve been working on these enzymes for about five years. Validation of our strategy—design of a selective neuraminidase inhibitor and application in a cell that overexpresses that enzyme—is an achievement for us.”

Filed under brain tumors neuraminidase glioblastoma tumor cells neuroscience science
