Neuroscience

Articles and news from the latest research reports.

127 notes

SA’s Taung Child’s skull and brain not human-like in expansion

The Taung Child, South Africa’s premier hominin discovered 90 years ago by Wits University Professor Raymond Dart, never ceases to transform and evolve the search for our collective origins.

By subjecting the skull of the first australopith discovered to the latest technologies in the Wits University Microfocus X-ray Computed Tomography (CT) facility, researchers are now casting doubt on theories that Australopithecus africanus shows the same cranial adaptations found in modern human infants and toddlers – in effect disproving current support for the idea that this early hominin shows infant brain development in the prefrontal region similar to that of modern humans.

The results were published online in the prestigious journal Proceedings of the National Academy of Sciences (PNAS) on Monday, 25 August 2014 at 21:00 SAST (15:00 EST), in an article titled: New high resolution CT data of the Taung partial cranium and endocast and their bearing on metopism and hominin brain evolution.

The Taung Child has historical and scientific importance in the fossil record as the first and best example of early hominin brain evolution, and theories have been put forward that it exhibits key cranial adaptations found in modern human infants and toddlers.

To test the antiquity of this evolutionary adaptation, Dr Kristian J. Carlson, Senior Researcher from the Evolutionary Studies Institute at the University of the Witwatersrand, and colleagues, Professor Ralph L. Holloway from Columbia University and Douglas C. Broadfield from Florida Atlantic University, performed an in silico dissection of the Taung fossil using high-resolution computed tomography.

"A recent study has described the roughly 3 million-year-old fossil, thought to have belonged to a 3 to 4-year-old, as having a persistent metopic suture and open anterior fontanelle, two features that facilitate post-natal brain growth in human infants when their disappearance is delayed," said Carlson.

Comparisons with the existing hominin fossil record and chimpanzee variation do not support this evolutionary scenario.

Citing deficiencies in how the Taung fossil material has been recently assessed, the researchers suggest physical evidence does not incontrovertibly link features of the Taung skull, or its endocast, to early prefrontal lobe expansion, a brain region implicated in many human behaviors.

The authors also debate the previously offered theoretical basis for this adaptation in A. africanus. By refuting the presence of these features in the Taung Child, the researchers dispute whether these structures were selectively advantageous in hominin evolution, particularly in australopiths.

Thus, results of the new study show that there is still no evidence that this kind of skull adaptation evolved before Homo, nor is there evidence for a link between such skull characteristics and the proposed accompanying early prefrontal lobe expansion, Carlson said.

Filed under taung child hominin evolution prefrontal cortex brain development neuroscience science

113 notes

Scientists Uncover Navigation System Used by Cancer, Nerve Cells

Duke University researchers have found a “roving detection system” on the surface of cells that may point to new ways of treating diseases like cancer, Parkinson’s disease and amyotrophic lateral sclerosis (ALS).

The cells, which were studied in nematode worms, are able to break through normal tissue boundaries and burrow into other tissues and organs — a crucial step in many normal developmental processes, ranging from embryonic development and wound-healing to the formation of new blood vessels.

But sometimes the process goes awry. Such is the case with metastatic cancer, in which cancer cells spread unchecked from where they originated and form tumors in other parts of the body.

“Cell invasion is one of the most clinically relevant yet least understood aspects of cancer progression,” said David Sherwood, an associate professor of biology at Duke.

Sherwood is leading a team that is investigating the molecular mechanisms that control cell invasion in both normal development and cancer, using a one-millimeter worm known as C. elegans.

At one point in C. elegans development, a specialized cell called the anchor cell breaches the dense, sheet-like membrane that separates the worm’s uterus from its vulva, opening up the worm’s reproductive tract.

Anchor cells can’t see, so they need some kind of signal to tell them where to break through. In a 2009 study, Sherwood and colleagues discovered that an extracellular cue called netrin orients the anchor cell so that it invades in the right direction.

In a new study appearing Aug. 25 in the Journal of Cell Biology, the team shows how receptors on the invasive cells essentially rove around the cell membrane “hunting” for the missing netrin signal that will guide the cell to the correct location.

The researchers used a video camera attached to a powerful microscope to take time-lapse movies of the slow movement of the C. elegans anchor cell during its invasion (Figure 1, Figure 2).

Their time-lapse analyses reveal that when netrin production is blocked, netrin receptors on the surface of the anchor cell periodically cluster, disperse and reassemble in a different region of the cell membrane. The receptors cluster alongside patches of actin filaments — thin flexible fibers that help cells change shape and form invasive protrusions — that pop up in each new spot.

“It’s kind of like a missile detection system,” Sherwood said.

Rather than the whole cell having to move around, its receptors move around on the outside of the cell until they get a signal. Once the receptors locate the netrin signal, they stabilize in the region of the cell membrane that is closest to the source of the signal.
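The search behaviour Sherwood describes can be caricatured in a few lines of code. This is a purely illustrative toy, not the study's model; the one-dimensional membrane, the capture radius and the random repositioning of clusters are all assumptions made for the sketch:

```python
import random

# Toy sketch: receptor clusters repeatedly form at random positions on a
# 1-D "membrane", disperse, and reform elsewhere, stabilizing only once a
# cluster lands close enough to the netrin source. All numbers are invented.
random.seed(1)

membrane_length = 100.0
netrin_source = 72.0     # position of the netrin signal (assumed)
capture_radius = 5.0     # distance at which a cluster locks on (assumed)

steps = 0
while True:
    steps += 1
    position = random.uniform(0.0, membrane_length)  # cluster reassembles at a new spot
    if abs(position - netrin_source) <= capture_radius:
        break  # receptors stabilize in the membrane region nearest the signal

print(f"cluster stabilized at {position:.1f} after {steps} cycles")
```

In the netrin-blocked condition described above, the loop would simply never terminate: clusters keep forming, dispersing and reassembling without ever locking on.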

The findings redefine decades-old ideas about how the cell’s navigation system works. “Cells don’t just passively respond to the netrin signal — they’re actively searching for it,” Sherwood said.

Given that netrin has been found to promote cell invasion in some of the most lethal cancers, the findings could lead to new treatment strategies. Disrupting the cell’s netrin detection system, for example, could prevent cancer cells from finding their way to the bloodstream or the lymphatic system and stop them from metastasizing, or becoming invasive and spreading throughout the body.

“One of the things we’re gearing up to do next are drug screens with our collaborators to see if we can block this detection system during invasion,” Sherwood said.

Scientists have also known for years that netrin plays a key role in wiring the brain and nervous system by guiding developing nerve cells as they grow and form connections.

This means the results could also point to new ways of treating neurological disorders like Parkinson’s and ALS and recovering from spinal cord injuries.

Tinkering with the cell’s netrin detection machinery, for example, may make it possible to encourage damaged cells in the central nervous system — which normally have limited ability to regenerate — to regrow.

(Source: today.duke.edu)

Filed under C. elegans netrin cancer cells nerve cells neuroscience science

62 notes

Increased risk of stroke in people with cognitive impairment

People with cognitive impairment are significantly more likely to have a stroke than people with normal cognitive function, with a 39% increased risk, according to a new study published in CMAJ (Canadian Medical Association Journal).

"Given the projected substantial rise in the number of older people around the world, prevalence rates of cognitive impairment and stroke are expected to soar over the next several decades, especially in high-income countries," writes Dr. Bruce Ovbiagele, Chair of the Department of Neurology, Medical University of South Carolina, Charleston, South Carolina, with coauthors.

Cognitive impairment and stroke are major contributors to disability, and stroke is the second leading cause of death worldwide. Although stroke is linked to the development and worsening of cognitive impairment, it is not known whether the reverse is true. Previous studies that have looked at the link between cognitive impairment and subsequent stroke have been inconsistent in their findings.

The study in CMAJ, by researchers in the United States, Taiwan and South Korea, analyzed data from 18 studies of 121 879 people with cognitive impairment, of whom 7799 later had strokes. Most of the included studies were conducted in North America or Europe.

The researchers observed a significantly higher rate of stroke in people with cognitive impairment than in people with normal cognitive function.

"We found that the risk of future stroke was 39% higher among patients with cognitive impairment at baseline than among those with normal cognitive function at baseline," write the authors. "This risk increased to 64% when a broadly adopted definition of cognitive impairment was used."
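To make those percentages concrete: a 39% higher risk means the stroke rate among the cognitively impaired is 1.39 times the rate among the cognitively normal. The baseline rate below is an assumed number used only for illustration, not a figure from the study:

```python
# Illustrative arithmetic: the risk ratios are reported by the study; the
# baseline rate is an assumed figure chosen only for this example.
baseline_rate = 0.04    # assumed stroke rate in the normal-cognition group
narrow_rr = 1.39        # 39% higher risk (primary definition of impairment)
broad_rr = 1.64         # 64% higher risk (broad definition of impairment)

print(f"narrow definition: {baseline_rate * narrow_rr:.4f}")  # 0.0556
print(f"broad definition:  {baseline_rate * broad_rr:.4f}")   # 0.0656
```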
Blockage of blood vessels in the brain (brain infarcts), atherosclerosis, inflammation and other vascular conditions are associated with a higher risk of stroke and cognitive impairment and may contribute to the increased risk.

"Cognitive impairment should be more broadly recognized as a possible early clinical manifestation of cerebral infarction, so that timely management of vascular risk factors can be instituted to potentially prevent future stroke events and to avoid further deterioration of cognitive health," conclude the authors.

Filed under stroke cognitive impairment cognitive function neuroscience science

124 notes

Changes in the eye can predict changes in the brain

Researchers at the Gladstone Institutes and University of California, San Francisco have shown that a loss of cells in the retina is one of the earliest signs of frontotemporal dementia (FTD) in people with a genetic risk for the disorder—even before any changes appear in their behavior.

In a study published today in the Journal of Experimental Medicine, researchers led by Gladstone investigator Li Gan, PhD, and UCSF associate professor of neurology Ari Green, MD, examined a group of individuals who had a genetic mutation known to result in FTD. They discovered that before any cognitive signs of dementia were present, these individuals showed a significant thinning of the retina compared with people who did not have the gene mutation.

“This finding suggests that the retina acts as a type of ‘window to the brain,’” said Dr. Gan. “Retinal degeneration was detectable in mutation carriers prior to the onset of cognitive symptoms, establishing retinal thinning as one of the earliest observable signs of familial FTD. This means that retinal thinning could be an easily measured outcome for clinical trials.”

Although it is located in the eye, the retina is made up of neurons with direct connections to the brain. This means that studying the retina is one of the easiest and most accessible ways to examine and track changes in neurons.

Lead author Michael Ward, MD, PhD, a postdoctoral fellow at the Gladstone Institutes and assistant professor of neurology at UCSF, explained, “The retina may be used as a model to study the development of FTD in neurons. If we follow these patients over time, we may be able to correlate a decline in retinal thickness with disease progression. In addition, we may be able to track the effectiveness of a treatment through a simple eye examination.”

The researchers also discovered new mechanisms by which cell death occurs in FTD. As with most complex neurological disorders, there are several changes in the brain that contribute to the development of FTD. In the inherited form researched in the current study, this includes a deficiency of the protein progranulin, which is tied to the mislocalization of another crucial protein, TDP-43, from the nucleus of the cell out to the cytoplasm.

However, the relationship between neurodegeneration, progranulin, and TDP-43 was previously unclear. In follow-up studies using a genetic mouse model of FTD, the scientists were able to investigate this connection for the first time in neurons from the retina. They identified a depletion of TDP-43 from the cell nuclei before any signs of neurodegeneration occurred, signifying that this loss may be a direct cause of the cell death associated with FTD.

TDP-43 levels were shown to be regulated by a third cellular protein called Ran. By increasing expression of Ran, the researchers were able to elevate TDP-43 levels in the nucleus of progranulin-deficient neurons and prevent their death.

“With these findings,” said Dr. Gan, “we now not only know that retinal thinning can act as a pre-symptomatic marker of dementia, but we’ve also gained an understanding into the underlying mechanisms of frontotemporal dementia that could potentially lead to novel therapeutic targets.”

(Source: gladstoneinstitutes.org)

Filed under frontotemporal dementia retina genetic mutation neurodegeneration TDP-43 neurons neuroscience science

322 notes

Train your heart to protect your mind

Exercising to improve our cardiovascular strength may protect us from cognitive impairment as we age, according to a new study by researchers at the University of Montreal and its affiliated Institut universitaire de gériatrie de Montréal Research Centre. “Our body’s arteries stiffen with age, and the vessel hardening is believed to begin in the aorta, the main vessel coming out of the heart, before reaching the brain. Indeed, the hardening may contribute to cognitive changes that occur during a similar time frame,” explained Claudine Gauthier, first author of the study. “We found that older adults whose aortas were in a better condition and who had greater aerobic fitness performed better on a cognitive test. We therefore think that the preservation of vessel elasticity may be one of the mechanisms that enables exercise to slow cognitive aging.”

The researchers worked with 31 young people between the ages of 18 and 30 and 54 older participants aged between 55 and 75. This enabled the team to compare the older participants within their peer group and against the younger group who obviously have not begun the aging processes in question. None of the participants had physical or mental health issues that might influence the study outcome. Their fitness was tested by exhausting the participants on a workout machine and determining their maximum oxygen intake over a 30-second period. Their cognitive abilities were assessed with the Stroop task. The Stroop task is a scientifically validated test that involves asking someone to identify the ink colour of a colour word that is printed in a different colour (e.g. the word red could be printed in blue ink and the correct answer would be blue). A person who is able to correctly name the colour of the word without being distracted by the reflex to read it has greater cognitive agility.

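The task's scoring rule is simple enough to put in code. A minimal sketch (function and variable names are illustrative, not from the study):

```python
# Minimal sketch of Stroop scoring: a response is correct when it names the
# ink colour, regardless of what the word itself says.
def is_congruent(word, ink):
    """A trial is congruent when the word and its ink colour match."""
    return word == ink

def is_correct(word, ink, response):
    """Correct responses name the ink colour, resisting the reflex to read."""
    return response == ink

# The example from the text: the word "red" printed in blue ink.
print(is_congruent("red", "blue"))        # False: an incongruent trial
print(is_correct("red", "blue", "blue"))  # True: named the ink colour
print(is_correct("red", "blue", "red"))   # False: read the word instead
```

Incongruent trials like this one are the interesting case: slower or less accurate responses there index weaker cognitive control.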
The participants undertook three MRI scans: one to evaluate blood flow to the brain, one to measure their brain activity as they performed the Stroop task, and one to look at the physical state of their aorta. The researchers were interested in the brain’s blood flow, as poorer cardiovascular health is associated with a faster pulse wave at each heartbeat, which in turn could cause damage to the brain’s smaller blood vessels. “This is the first study to use MRI to examine participants in this way,” Gauthier said. “It enabled us to find even subtle effects in this healthy population, which suggests that other researchers could adapt our test to study vascular-cognitive associations within less healthy and clinical populations.”

The results demonstrated age-related declines in executive function, aortic elasticity and cardiorespiratory fitness, a link between vascular health and brain function, and a positive association between aerobic fitness and brain function. “The link between fitness and brain function may be mediated through preserved cerebrovascular reactivity in periventricular watershed areas that are also associated with cardiorespiratory fitness,” Gauthier said. “Although the impact of fitness on cerebral vasculature may however involve other, more complex mechanisms, overall these results support the hypothesis that lifestyle helps maintain the elasticity of arteries, thereby preventing downstream cerebrovascular damage and resulting in preserved cognitive abilities in later life.”

Filed under aging cognition cardiorespiratory fitness executive function brain function neuroscience science

89 notes

'Haven't my neurons seen this before?'

The world grows increasingly chaotic year after year, and our brains are constantly bombarded with images. A new study from the Center for the Neural Basis of Cognition (CNBC), a joint project between Carnegie Mellon University and the University of Pittsburgh, reveals how neurons in the part of the brain responsible for recognizing objects respond to being shown a barrage of images. The study is published online by Nature Neuroscience.

The CNBC researchers showed animal subjects a rapid succession of images, some that were new and some that the subjects had seen more than 100 times. The researchers measured the electrical response of individual neurons in the inferotemporal cortex, an essential part of the visual system and the part of the brain responsible for object recognition.

In previous studies, researchers found that when subjects were shown a single, familiar image, their neurons responded less strongly than when they were shown an unfamiliar image. However, in the current study, the CNBC researchers found that when subjects were exposed to familiar and unfamiliar images in rapid succession, their neurons — especially the inhibitory neurons — fired much more strongly and selectively to images the subject had seen many times before.

"It was such a dramatic effect, it leapt out at us," said Carl Olson, a professor at Carnegie Mellon. "You wouldn’t expect there to be such deep changes in the brain from simply making things familiar. We think this may be a mechanism the brain uses to track a rapidly changing visual environment."

The researchers then ran a similar experiment in which they used themselves as subjects, recording their brain activity using EEG. They found that the humans’ brains responded similarly to the animal subjects’ brains when presented with familiar or unfamiliar images in rapid succession. In future studies, they hope to link these changes in the brain to improvements in perception and cognition.

Filed under inferotemporal cortex object recognition brain activity neurons neuroscience science

158 notes

Neuroscience and big data: How to find simplicity in the brain

Scientists can now monitor and record the activity of hundreds of neurons concurrently in the brain, and ongoing technology developments promise to increase this number manyfold. However, simply recording the neural activity does not automatically lead to a clearer understanding of how the brain works.

In a new review paper published in Nature Neuroscience, Carnegie Mellon University’s Byron M. Yu and Columbia University’s John P. Cunningham describe the scientific motivations for studying the activity of many neurons together, along with a class of machine learning algorithms — dimensionality reduction — for interpreting the activity.

In recent years, dimensionality reduction has provided insight into how the brain distinguishes between different odors, makes decisions in the face of uncertainty and is able to think about moving a limb without actually moving it. Yu and Cunningham contend that using dimensionality reduction as a standard analytical method will make it easier to compare activity patterns in healthy and abnormal brains, ultimately leading to improved treatments and interventions for brain injuries and disorders.

"One of the central tenets of neuroscience is that large numbers of neurons work together to give rise to brain function. However, most standard analytical methods are appropriate for analyzing only one or two neurons at a time. To understand how large numbers of neurons interact, advanced statistical methods, such as dimensionality reduction, are needed to interpret these large-scale neural recordings," said Yu, an assistant professor of electrical and computer engineering and biomedical engineering at CMU and a faculty member in the Center for the Neural Basis of Cognition (CNBC).

The idea behind dimensionality reduction is to summarize the activity of a large number of neurons using a smaller number of latent (or hidden) variables. Dimensionality reduction methods are particularly useful for uncovering the inner workings of the brain, such as when we ruminate or solve a mental math problem, where all the action is going on inside the brain and not in the outside world. These latent variables can be used to trace out the path of one’s thoughts.

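As a minimal sketch of the idea, using simulated data rather than the recordings discussed in the review: when a few latent variables drive the activity of many neurons, even plain PCA (the simplest dimensionality reduction method) recovers a compact low-dimensional summary.

```python
import numpy as np

# Toy sketch: 3 latent variables drive 100 simulated "neurons"; PCA on the
# population activity recovers a low-dimensional summary. All numbers here
# are illustrative, not from the review.
rng = np.random.default_rng(0)
n_neurons, n_timepoints, n_latents = 100, 500, 3

latents = rng.standard_normal((n_latents, n_timepoints))  # hidden signals
mixing = rng.standard_normal((n_neurons, n_latents))      # how each neuron weights them
activity = mixing @ latents + 0.1 * rng.standard_normal((n_neurons, n_timepoints))

# PCA via SVD of the mean-centred activity matrix.
centred = activity - activity.mean(axis=1, keepdims=True)
_, S, _ = np.linalg.svd(centred, full_matrices=False)
var_explained = (S[:n_latents] ** 2).sum() / (S ** 2).sum()

# Nearly all the variance lives in 3 dimensions, because only 3 latent
# variables generated the 100-neuron data.
print(f"variance explained by 3 components: {var_explained:.3f}")
```

Methods surveyed in the review go beyond PCA (e.g. factor analysis and its extensions), but the payoff is the same: a handful of latent variables standing in for hundreds of recorded neurons.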
"One of the major goals of science is to explain complex phenomena in simple terms. Traditionally, neuroscientists have sought to find simplicity with individual neurons. However, it is becoming increasingly recognized that neurons show varied features in their activity patterns that are difficult to explain by examining one neuron at a time. Dimensionality reduction provides us with a way to embrace single-neuron heterogeneity and seek simple explanations in terms of how neurons interact with each other," said Cunningham, assistant professor of statistics at Columbia.

Although dimensionality reduction is relatively new to neuroscience compared to existing analytical methods, it has already shown great promise. With Big Data getting ever bigger thanks to the continued development of neural recording technologies and the federal BRAIN Initiative, the use of dimensionality reduction and related methods will likely become increasingly essential.

Neuroscience and big data: How to find simplicity in the brain

Scientists can now monitor and record the activity of hundreds of neurons concurrently in the brain, and ongoing technology developments promise to increase this number manyfold. However, simply recording the neural activity does not automatically lead to a clearer understanding of how the brain works.

In a new review paper published in Nature Neuroscience, Carnegie Mellon University’s Byron M. Yu and Columbia University’s John P. Cunningham describe the scientific motivations for studying the activity of many neurons together, along with a class of machine learning algorithms — dimensionality reduction — for interpreting the activity.

In recent years, dimensionality reduction has provided insight into how the brain distinguishes between different odors, makes decisions in the face of uncertainty and is able to think about moving a limb without actually moving. Yu and Cunningham contend that using dimensionality reduction as a standard analytical method will make it easier to compare activity patterns in healthy and abnormal brains, ultimately leading to improved treatments and interventions for brain injuries and disorders.

"One of the central tenets of neuroscience is that large numbers of neurons work together to give rise to brain function. However, most standard analytical methods are appropriate for analyzing only one or two neurons at a time. To understand how large numbers of neurons interact, advanced statistical methods, such as dimensionality reduction, are needed to interpret these large-scale neural recordings," said Yu, an assistant professor of electrical and computer engineering and biomedical engineering at CMU and a faculty member in the Center for the Neural Basis of Cognition (CNBC).

The idea behind dimensionality reduction is to summarize the activity of a large number of neurons using a smaller number of latent (or hidden) variables. Dimensionality reduction methods are particularly useful for uncovering inner workings of the brain, such as when we ruminate or solve a mental math problem, where all the action is going on inside the brain and not in the outside world. These latent variables can be used to trace out the path of one's thoughts.
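The general idea can be illustrated with a minimal sketch (not code from the review): simulated activity of 100 neurons is generated from just 3 hidden variables, and principal component analysis recovers that low-dimensional structure. All numbers here are made up for illustration.

```python
import numpy as np

# Hypothetical illustration: reduce simulated activity of 100 neurons
# driven by 3 shared latent variables down to a low-dimensional summary.
rng = np.random.default_rng(0)
n_neurons, n_timepoints, n_latents = 100, 500, 3

latents = rng.standard_normal((n_latents, n_timepoints))  # hidden variables
mixing = rng.standard_normal((n_neurons, n_latents))      # each neuron's readout of them
activity = mixing @ latents + 0.1 * rng.standard_normal((n_neurons, n_timepoints))

# PCA via SVD of the mean-centered neurons-by-time activity matrix
centered = activity - activity.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
variance_explained = S**2 / np.sum(S**2)

# The first 3 components capture nearly all the variance, recovering the
# fact that 3 latent variables generated the high-dimensional recording.
print(variance_explained[:5].round(3))
```

The rows of `Vt[:3]` are then the latent time courses that "trace out the path" of the simulated population's state.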

"One of the major goals of science is to explain complex phenomena in simple terms. Traditionally, neuroscientists have sought to find simplicity with individual neurons. However, it is becoming increasingly recognized that neurons show varied features in their activity patterns that are difficult to explain by examining one neuron at a time. Dimensionality reduction provides us with a way to embrace single-neuron heterogeneity and seek simple explanations in terms of how neurons interact with each other," said Cunningham, assistant professor of statistics at Columbia.

Although dimensionality reduction is relatively new to neuroscience compared to existing analytical methods, it has already shown great promise. With Big Data getting ever bigger thanks to the continued development of neural recording technologies and the federal BRAIN Initiative, the use of dimensionality reduction and related methods will likely become increasingly essential.

Filed under neurons neural activity neural recordings neuroscience science

80 notes

Driving brain rhythm makes mice more sensitive to touch

By striking up the right rhythm in the right brain region at the right time, Brown University neuroscientists report in Nature Neuroscience that they managed to endow mice with greater touch sensitivity than other mice, making hard-to-perceive vibrations suddenly more vivid to them.

The findings offer the first direct evidence that “gamma” brainwaves in the cortex affect perception and attention. With only correlations and associations as evidence before, neuroscientists have argued for years about whether gamma has an important role or whether it’s merely a byproduct — an “exhaust fume” in the words of one — of such brain activity.

“There’s a lot of excitement about the importance of gamma rhythms in behavior, as well as a lot of skepticism,” said co-lead author Joshua Siegle, a former graduate student at Brown University and MIT, who is now at the Allen Institute for Neuroscience. “Rather than try to correlate changes in gamma rhythms with changes in behavior, which is what researchers have done in the past, we chose to directly control the cells that produce gamma.”

The result was a mouse with whiskers that were about 20 percent more sensitive.

“There were a lot of ways this experiment could have failed but instead to our surprise it was pretty decisive from the very first subject we looked at — that under certain conditions we can make a super-perceiving mouse,” said Christopher Moore, associate professor of neuroscience at Brown and senior author of the study. “We’re making a mouse do better than a mouse could have done otherwise.”

Specifically, Moore and co-first authors Siegle and Dominique Pritchett performed their experiments by using optogenetics — a technique of using light to control the firing patterns of neurons — to generate a gamma rhythm by manipulating inhibitory interneurons in the primary sensory neocortex of mice. That part of the brain controls a mouse’s ability to detect faint sensations via its whiskers.

A different part of the brain handles stronger, more imposing sensations, Moore said. The primary sensory neocortex, a particular feature of mammals, has the distinction of allowing an animal to purposely pay attention to more subtle sensations. It’s the difference between the feeling of gently brushing a fingertip along a wood board to assess if it needs a bit more sanding and the feeling of dropping the wood board on a foot.

Before anything else in the paper, the researchers confirmed that mice naturally produce a 40-hertz gamma rhythm in their sensory neocortex at times. Then they optogenetically generated that gamma rhythm with precise pulses of blue light. Mice with this rhythm could more often detect the fainter vibrations the researchers supplied to their whiskers than could mice that did not have the rhythm going in their brains.

Control and optogenetically stimulated mice alike had been conditioned to indicate their detection of a supplied stimulus by licking a water bottle. The vibrations presented to the mice spanned 17 different levels of detectability.

The team’s hypothesis was that the gamma rhythm of the stimulated neurons, because they inhibit the transmission of sensation messages by pyramidal neurons in the neocortex with a structured periodicity, actually orders the pyramidal messages into a more coherent and therefore stronger train.

“It’s not surprising that these synchronized bursts of activity can benefit signal transmission, in the same way that synchronized clapping in a crowd of people is louder than random clapping,” Siegle said.
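The clapping analogy can be made concrete with a short sketch (illustrative only, not from the paper): fifty inputs arriving at the same instant sum to a much larger peak than the same fifty spread randomly over a gamma half-cycle, when each input produces a brief decaying postsynaptic current. The kernel time constant and spike counts are assumptions for illustration.

```python
import numpy as np

# Illustrative sketch: summed input from 50 presynaptic spikes arriving
# synchronized versus spread randomly over 25 ms, each producing a brief
# exponentially decaying postsynaptic current.
rng = np.random.default_rng(1)
dt, window = 0.1, 50.0               # time step and window, in ms
t = np.arange(0, window, dt)
kernel = np.exp(-t / 2.0)            # postsynaptic current with 2 ms decay

def summed_response(spike_times_ms):
    """Sum one decaying kernel per spike into a single trace."""
    trace = np.zeros_like(t)
    for s in spike_times_ms:
        idx = int(s / dt)
        trace[idx:] += kernel[: len(t) - idx]
    return trace

sync = summed_response(np.full(50, 10.0))                  # all arrive at t = 10 ms
jittered = summed_response(10.0 + 25.0 * rng.random(50))   # spread over 25 ms

print(f"peak (synchronized): {sync.max():.1f}")
print(f"peak (jittered):     {jittered.max():.1f}")
```

The synchronized peak is the full 50-spike sum, while the jittered peak stays several times smaller: louder clapping from the same number of hands.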

This idea suggested that the timing of the rhythm matters.

So in another experiment, Siegle, Pritchett, and Moore varied the onset of the gamma rhythm in increments of 5 milliseconds to see whether it made a difference to perception. It did. The mice showed their increased sensitivity only when the gamma rhythms were underway 20-25 milliseconds before the subtle sensations were presented. When they weren't, sensitivity was, on average, unchanged.
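A purely illustrative model of that timing sweep: assume (the probabilities here are hypothetical, only the 20-25 ms window and 5 ms step come from the article) a baseline detection rate that is boosted only when the gamma rhythm has been underway 20-25 ms before the faint stimulus arrives.

```python
# Hypothetical model of the lead-time sweep; baseline and boost values
# are invented for illustration, not taken from the study.
def detection_rate(gamma_lead_ms, baseline=0.5, boost=0.2):
    """Modeled probability of detecting a faint whisker stimulus."""
    if 20 <= gamma_lead_ms <= 25:
        return baseline + boost
    return baseline

# Sweep gamma onset in 5 ms increments, mirroring the experiment's design
for lead in range(0, 45, 5):
    print(f"lead {lead:2d} ms -> detection {detection_rate(lead):.2f}")
```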

One of the key implications from the findings for neuroscience, Moore said, is that the way gamma rhythms appear to structure the processing of perception is more important than the mere firing rate of neurons in the sensory neocortex. Mice became better able to feel not because neurons became more active (they didn’t), but because they were entrained by a precisely timed rhythm.

Although the study provides causal evidence of a functional importance for gamma rhythms, Moore acknowledged, it still leaves open important questions. The exact mechanism by which gamma rhythms affect sensation processing and attention is not proved, only hypothesized.

And in one experiment, optogenetically stimulated mice appeared less able to detect the most obvious and imposing of the sensations, even as they became more sensitive to the more subtle ones. In other experiments, however, their detection of major sensations was not compromised.

But the possible loss of sensitivity to stimuli that are easier to feel could be consistent with a shifting of attention to fainter ones, said Pritchett, also a former Brown and MIT student now at the Champalimaud Centre for the Unknown in Lisbon, Portugal.

“What we are showing is that, paradoxically, the rhythmic inhibitory input works to amplify threshold stimuli, possibly at the expense of salient stimuli,” he said. “This is precisely what you would expect from a mechanism that might be responsible for selective attention in the brain.”

Therefore, Siegle, Pritchett, and Moore say they do have a better feel now for what’s going on in the brain.

Filed under gamma oscillations interneurons optogenetics tactile stimulation neuroscience science

294 notes

Fed Up with Waiting? Timely Activation of Serotonin Enhances Patience

Lining up in a long queue for a popular restaurant or waiting for the arrival of a date requires a great deal of patience. Our lives are full of decisions involving patience, yet it needs to be exercised at the appropriate times. In order to examine the brain mechanism for controlling patience to obtain a reward, Drs. Kayoko Miyazaki and Katsuhiko Miyazaki and Prof. Kenji Doya of the Neural Computation Unit at the Okinawa Institute of Science and Technology Graduate University used a new technique called optogenetics, in which light is used to stimulate specific neurons with precise timing. Their most recent research shows that activating serotonin neurons specifically during waiting promotes patience for delayed rewards. This research was published in the online version of Current Biology on August 21, 2014.

In this study, the researchers used genetically engineered mice that produce light-activated molecules only in neurons that produce serotonin. They implanted an optical fiber in a small part of the brain called the dorsal raphe, from which neural fibers releasing serotonin extend throughout the cerebrum, the largest and most highly developed part of the brain. The researchers trained five of those mice to perform a delayed reward task, meaning that if they waited at a hole, they would receive a food pellet as a reward. To show that they were waiting, each mouse needed to hold its nose inside the hole where the food pellet would appear, a posture that the researchers call a nose poke. The durations of waiting were randomly chosen from 3, 6, or 9 seconds, or infinity, meaning no reward was given no matter how long the mice waited. In half of those trials, researchers stimulated serotonin neurons by shining a light through the optical fiber while the mice were waiting. No prior signal was given to notify how long the waiting would be. The mice consistently waited for 3 and 6 seconds to receive the food. But when the mice needed to wait for 9 seconds, the mice showed difficulty and often removed their nose from the food hole. When the researchers shone a light on serotonin neurons during the nose poke position, the light stimulation significantly decreased the number of failures to wait for 9 seconds to obtain the food.
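The trial structure described above can be sketched in code. The delays (3, 6, or 9 seconds, or infinity) and the 50/50 stimulation split come from the article; the mice's "patience limits," which make stimulated mice hold the nose poke longer, are hypothetical numbers chosen only to illustrate the design.

```python
import random

# Sketch of the delayed-reward task; patience_off/patience_on are
# hypothetical parameters, not measurements from the study.
random.seed(42)
DELAYS = [3.0, 6.0, 9.0, float("inf")]  # seconds; inf means no reward ever

def run_trial(delay, patience_limit):
    """The mouse succeeds if it holds its nose poke for the full delay."""
    return delay <= patience_limit

def simulate(n_trials=1000, patience_off=7.0, patience_on=10.0):
    successes = {True: 0, False: 0}
    counts = {True: 0, False: 0}
    for _ in range(n_trials):
        delay = random.choice(DELAYS)       # no prior cue of the delay
        stimulated = random.random() < 0.5  # light on in half of trials
        limit = patience_on if stimulated else patience_off
        counts[stimulated] += 1
        successes[stimulated] += run_trial(delay, limit)
    return {k: successes[k] / counts[k] for k in counts}

rates = simulate()
print(f"success rate with stimulation:    {rates[True]:.2f}")
print(f"success rate without stimulation: {rates[False]:.2f}")
```

Under these assumed limits, an unstimulated mouse completes only the 3 s and 6 s trials while a stimulated one also completes the 9 s trials, reproducing the qualitative pattern the researchers observed.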

In 25 percent of trials, the food pellet reward was not delivered regardless of how long the mouse waited. In these trials, without their serotonin neurons stimulated, mice waited 12.0 seconds on average. With their serotonin neurons stimulated, the mice waited 17.5 seconds on average. As control experiments, the researchers activated serotonin neurons at times when the mouse did not have its nose poked into the food hole, and observed that these mice behaved the same as in unstimulated cases, with no evidence of simple motor inhibition. The results showed, for the first time, that the timed activation of serotonin neurons promotes animals' patience for delayed rewards.

Serotonin is a neuromodulator that is released diffusely throughout the brain. It is involved in behavioral, cognitive, and mental functions. Classically, serotonin was believed to signal punishment and inhibit behaviors. However, serotonin-enhancing drugs, known as SSRIs, are effective therapies for depression, which is hard to reconcile with the classic view. Another recent study of optogenetic stimulation of serotonin neurons even reported a rewarding effect, further complicating the picture. On the other hand, another line of research, including previous work by the OIST researchers, showed that a lack of serotonin causes impulsive behaviors. "Our previous studies have shown that serotonin levels increase when waiting for delayed rewards. We have also shown that inhibiting serotonin neurons leads to an inability to wait for a long time," explained Kayoko and Katsuhiko Miyazaki. "By using light to stimulate neurons at specific times, this study has proven serotonin's role in patience during delayed reward waiting, underlining serotonin's much greater role than previously thought." By further exploring the effects of serotonin, the researchers hope to decipher the neuronal networks behind mental disorders and behaviors involving serotonin. Such studies can promote a better understanding of human emotions, including the development of software and robots that think and act like humans.

Filed under serotonin optogenetics dorsal raphe serotonergic neurons neuroscience science

83 notes

Mouse model for epilepsy, Alzheimer’s gives window into the working brain

University of Utah scientists have developed a genetically engineered line of mice that is expected to open the door to new research on epilepsy, Alzheimer’s and other diseases.

The mice carry a protein marker, which changes in degree of fluorescence in response to different calcium levels. This will allow many cell types, including cells called astrocytes and microglia, to be studied in a new way.

"This is opening up the possibility to decipher how the brain works," said Petr Tvrdik, Ph.D., a research fellow in human genetics and a senior author on the study.

The research was published Aug. 14, 2014, in Neuron, a world-leading neuroscience journal. The work is the result of a three-year study involving multiple labs connected with The Brain Institute at the University of Utah. The lead author is J. Michael Gee, who is pursuing both a medical degree and a graduate degree in bioengineering at the university.

"We’re really in the era of team science," said John White, Ph.D., professor of bioengineering, executive director of the Brain Institute and the study’s corresponding author.

With the new mouse line, scientists can use a laser-based fluorescence microscope to study the calcium indicator in the glial cells of the living mouse, either when the mouse is anesthetized or awake. Calcium is studied because it is an important signaling molecule in the body and it can reveal how well the brain is functioning.
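Fluorescence changes from a genetically encoded calcium indicator are conventionally quantified as ΔF/F, the change relative to a baseline fluorescence F0. A generic sketch of that computation on a synthetic trace (this is the standard analysis, not code from the study; the trace and its numbers are invented):

```python
import numpy as np

# Generic dF/F computation on a synthetic fluorescence trace with one
# calcium transient; all values are arbitrary illustration units.
rng = np.random.default_rng(7)
baseline = 100.0                                # resting fluorescence (a.u.)
trace = baseline + rng.normal(0, 1.0, 600)      # 600 frames of noisy baseline
trace[200:260] += 40.0 * np.exp(-np.arange(60) / 20.0)  # a calcium transient

# Estimate F0 as a low percentile of the trace, robust to the transient
f0 = np.percentile(trace, 20)
dff = (trace - f0) / f0

print(f"peak dF/F: {dff.max():.2f}")
```

The percentile-based F0 keeps the transient itself from inflating the baseline estimate, so the event stands out cleanly as a ~40% fluorescence increase.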

Using this method, the scientists are essentially creating a window into the working brain to study the interactions between neurons, astrocytes and microglia.

"We believe this will give us new insights for treatments of epilepsy and for new views of how the immune system of the brain works," White said.

About one-third of the 3 million Americans estimated to have epilepsy lack adequate treatment to manage the disease.

Describing a long-standing collaboration with fellow university researcher and professor of pharmacology and toxicology Karen Wilcox, Ph.D., White said, “We believe the glial cells are malfunctioning in epilepsy. What we’re trying to do is find out in what ways astrocytes participate in the disease.”

This research is expected to lead to new classes of drugs.

The ability to track calcium changes in microglial cells will also open up the possibility of studying inflammatory diseases of the brain. Every neurological disease, including multiple sclerosis and Alzheimer's, appears to include components of inflammation, the scientists said.

"Live imaging and monitoring microglial activity and responses to inflammation was not possible before," said Tvrdik, particularly in living animals. In the past, researchers studied post-mortem tissue or relied on invasive approaches using synthetic dyes.

(Source: eurekalert.org)

Filed under epilepsy alzheimer's disease glial cells neurons animal model calcium neuroscience science

free counters