Neuroscience

Articles and news from the latest research reports.



Glucose ‘control switch’ in the brain key to both types of diabetes

Researchers at Yale School of Medicine have pinpointed a mechanism in part of the brain that is key to sensing glucose levels in the blood, linking it to both type 1 and type 2 diabetes. The findings are published in the July 28 issue of the Proceedings of the National Academy of Sciences.

“We’ve discovered that the prolyl endopeptidase enzyme — located in a part of the hypothalamus known as the ventromedial nucleus — sets a series of steps in motion that control glucose levels in the blood,” said lead author Sabrina Diano, professor in the Departments of Obstetrics, Gynecology & Reproductive Sciences, Comparative Medicine, and Neurobiology at Yale School of Medicine. “Our findings could eventually lead to new treatments for diabetes.”

The ventromedial nucleus contains cells that are glucose sensors. To understand the role of prolyl endopeptidase in this part of the brain, the team used mice genetically engineered to have low levels of this enzyme. They found that in the absence of this enzyme, mice had high levels of glucose in the blood and became diabetic.

Diano and her team discovered that this enzyme is important because it makes the neurons in this part of the brain sensitive to glucose. The neurons sense the increase in glucose levels and then tell the pancreas to release insulin, which is the hormone that maintains a steady level of glucose in the blood, preventing diabetes.

“Because of the low levels of endopeptidase, the neurons were no longer sensitive to increased glucose levels and could not control the release of insulin from the pancreas, and the mice developed diabetes,” said Diano, who is also a member of the Yale Program in Integrative Cell Signaling and Neurobiology of Metabolism.

Diano said the next step in this research is to identify the targets of this enzyme by understanding how the enzyme makes the neurons sense changes in glucose levels. “If we succeed in doing this, we could be able to regulate the secretion of insulin, and be able to prevent and treat type 2 diabetes,” she said.


Filed under glucose diabetes ventromedial nucleus endopeptidase insulin medicine science


The bit of your brain that signals how bad things could be

An evolutionarily ancient and tiny part of the brain tracks expectations about nasty events, finds new UCL research.

The study, published in Proceedings of the National Academy of Sciences, demonstrates for the first time that the human habenula, half the size of a pea, tracks predictions about negative events, like painful electric shocks, suggesting a role in learning from bad experiences.


Brain scans from 23 healthy volunteers showed that the habenula activates in response to pictures associated with painful electric shocks, with the opposite occurring for pictures that predicted winning money.

Previous studies in animals have found that habenula activity leads to avoidance as it suppresses dopamine, a brain chemical that drives motivation. In animals, habenula cells have been found to fire when bad things happen or are anticipated.

"The habenula tracks our experiences, responding more the worse something is expected to be," says senior author Dr Jonathan Roiser of the UCL Institute of Cognitive Neuroscience. "For example, the habenula responds much more strongly when an electric shock is almost certain than when it is unlikely. In this study we showed that the habenula doesn’t just express whether something leads to negative events or not; it signals quite how much bad outcomes are expected."

During the experiment, healthy volunteers were placed inside a functional magnetic resonance imaging (fMRI) scanner, and brain images were collected at high resolution because the habenula is so small. Volunteers were shown a random sequence of pictures each followed by a set chance of a good or bad outcome, occasionally pressing a button simply to show they were paying attention. Habenula activation tracked the changing expectation of bad and good events.
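The quantity the habenula is described as tracking here, a cue-by-cue expectation of how bad the outcome will be, amounts to simple expected-value bookkeeping. A toy sketch of that idea (the cue names, probabilities, and values below are invented for illustration, not taken from the study):

```python
# Toy model of the signal the habenula is described as tracking: the
# expected negative value of each cue. All numbers here are hypothetical.
cues = {
    "shock_likely":   (0.9, -1.0),  # 90% chance of a shock (value -1)
    "shock_unlikely": (0.1, -1.0),  # 10% chance of a shock
    "money_likely":   (0.9, +1.0),  # 90% chance of winning money (value +1)
}

# Expected value of each cue: probability times outcome value.
expected_value = {cue: p * v for cue, (p, v) in cues.items()}

# The habenula is described as responding more strongly the worse the
# expectation, so model its response as the expected loss, floored at zero.
habenula_response = {cue: max(0.0, -ev) for cue, ev in expected_value.items()}
# shock_likely -> 0.9, shock_unlikely -> 0.1, money_likely -> 0.0
```

On this toy account, a near-certain shock (0.9) drives a much larger response than an unlikely one (0.1), while a rewarding cue drives none, mirroring the graded signal the study reports.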

"Fascinatingly, people were slower to press the button when the picture was associated with getting shocked, even though their response had no bearing on the outcome," says lead author Dr Rebecca Lawson, also at the UCL Institute of Cognitive Neuroscience. "Furthermore, the slower people responded, the more reliably their habenula tracked associations with shocks. This demonstrates a crucial link between the habenula and motivated behaviour, which may be the result of dopamine suppression."

The habenula has previously been linked to depression, and this study shows how it could be involved in causing symptoms such as low motivation, pessimism and a focus on negative experiences. A hyperactive habenula could cause people to make disproportionately negative predictions.

"Other work shows that ketamine, which has profound and immediate benefits in patients who failed to respond to standard antidepressant medication, specifically dampens down habenula activity," says Dr Roiser. "Therefore, understanding the habenula could help us to develop better treatments for treatment-resistant depression."

(Source: eurekalert.org)

Filed under habenula negative events dopamine ketamine experiences neuroscience science


Memory relies on astrocytes, the brain’s lesser known cells

When you’re expecting something—like the meal you’ve ordered at a restaurant—or when something captures your interest, unique electrical rhythms sweep through your brain.

These waves are called gamma oscillations, and they reflect a symphony of cells—both excitatory and inhibitory—playing together in an orchestrated way. Though their role has been debated, gamma waves have been associated with higher-level brain function, and disturbances in the patterns have been tied to schizophrenia, Alzheimer’s disease, autism, epilepsy and other disorders.

Now, new research from the Salk Institute shows that little-known supportive cells in the brain known as astrocytes may in fact be major players that control these waves.

In a study published July 28 in the Proceedings of the National Academy of Sciences, Salk researchers report a new, unexpected strategy to turn down gamma oscillations: disabling not neurons but astrocytes, a cell type traditionally thought to play more of a support role in the brain. In the process, the team showed that astrocytes, and the gamma oscillations they help shape, are critical for some forms of memory.

"This is what could be called a smoking gun," says co-author Terrence Sejnowski, head of the Computational Neurobiology Laboratory at the Salk Institute for Biological Sciences and a Howard Hughes Medical Institute investigator. "There are hundreds of papers linking gamma oscillations with attention and memory, but they are all correlational. This is the first time we have been able to do a causal experiment, where we selectively block gamma oscillations and show that it has a highly specific impact on how the brain interacts with the world."

A collaboration among the labs of Salk professors Sejnowski, Inder Verma and Stephen Heinemann found that activity in the form of calcium signaling in astrocytes immediately preceded gamma oscillations in the brains of mice. This suggested that astrocytes, which use many of the same chemical signals as neurons, could be influencing these oscillations.

To test their theory, the group used a virus carrying tetanus toxin to selectively disable the release of chemicals from astrocytes, effectively eliminating the cells’ ability to communicate with neighboring cells. Neurons were unaffected by the toxin.

After adding a chemical to trigger gamma waves in the animals’ brains, the researchers found that brain tissue with disabled astrocytes produced shorter gamma waves than tissue containing healthy cells. And after adding three genes that allowed the researchers to turn the tetanus toxin in astrocytes on and off at will, they found that gamma waves were dampened in mice whose astrocytes were blocked from signaling. Turning off the toxin reversed this effect.

The mice with the modified astrocytes seemed perfectly healthy. But after several cognitive tests, the researchers found that they failed in one major area: novel object recognition. A healthy mouse spent more time with a new item placed in its environment than it did with familiar items, as expected.

In contrast, the group’s new mutant mouse treated all objects the same. “That turned out to be a spectacular result in the sense that novel object recognition memory was not just impaired, it was gone—as if we were deleting this one form of memory, leaving others intact,” Sejnowski says.

The results were surprising, in part because astrocytes operate on a seconds-or-longer timescale whereas neurons signal far faster, on the millisecond scale. Because of that slower speed, no one suspected astrocytes were involved in the high-speed brain activity needed to make quick decisions.

"What I thought quite unique was the idea that astrocytes, traditionally considered only guardians and supporters of neurons and other cells, are also involved in the processing of information and in other cognitive behavior," says Verma, a professor in the Laboratory of Genetics and an American Cancer Society Professor.

It’s not that astrocytes are quick—they’re still slower than neurons. But the new evidence suggests that astrocytes are actively supplying the right environment for gamma waves to occur, which in turn makes the brain more likely to learn and change the strength of its neuronal connections.

Sejnowski says that the behavioral result is just the tip of the iceberg. “The recognition system is hugely important,” he says, adding that it includes recognizing other people, places, facts and things that happened in the past. With this new discovery, scientists can begin to better understand the role of gamma waves in recognition memory, he adds.


Filed under astrocytes memory gamma oscillations neuroscience science


Scientists find 6 new genetic risk factors for Parkinson’s

Using data from over 18,000 patients, scientists have identified more than two dozen genetic risk factors involved in Parkinson’s disease, including six that had not been previously reported. The study, published in Nature Genetics, was partially funded by the National Institutes of Health (NIH) and led by scientists working in NIH laboratories.

"Unraveling the genetic underpinnings of Parkinson’s is vital to understanding the multiple mechanisms involved in this complex disease, and hopefully, may one day lead to effective therapies," said Andrew Singleton, Ph.D., a scientist at the NIH’s National Institute on Aging (NIA) and senior author of the study.

Dr. Singleton and his colleagues collected and combined data from existing genome-wide association studies (GWAS), which allow scientists to find common variants, or subtle differences, in the genetic codes of large groups of individuals. The combined data included approximately 13,708 Parkinson’s disease cases and 95,282 controls, all of European ancestry.

The investigators identified potential genetic risk variants, which increase the chances that a person may develop Parkinson’s disease. Their results suggested that the more variants a person has, the greater the risk of developing the disorder, in some cases up to three times higher.
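That dose-dependent picture, where risk climbs with each additional variant carried, can be illustrated with a toy tally that multiplies per-variant odds ratios. The variant names and numbers below are hypothetical placeholders, not values reported in the study:

```python
# Hypothetical per-variant odds ratios (illustration only; these identifiers
# and numbers are made up, not taken from the Nature Genetics study).
variant_odds_ratios = {"variant_a": 1.3, "variant_b": 1.2, "variant_c": 1.15}

def relative_odds(carried):
    """Combine the odds ratios of every risk variant a person carries."""
    odds = 1.0
    for variant in carried:
        odds *= variant_odds_ratios[variant]
    return odds

one_variant = relative_odds(["variant_a"])                           # 1.3
all_three = relative_odds(["variant_a", "variant_b", "variant_c"])   # ~1.79
```

Carrying more variants compounds the odds, which is the shape of the relationship the study describes: no single variant dominates, but together they can multiply risk severalfold.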
"The study brought together a large international group of investigators from both public and private institutions who were interested in sharing data to accelerate the discovery of genetic risk factors for Parkinson’s disease," said Margaret Sutherland, Ph.D., a program director at the National Institute of Neurological Disorders and Stroke (NINDS), part of NIH. "The advantage of this collaborative approach is highlighted in the identification of pathways and gene networks that may significantly increase our understanding of Parkinson’s disease."

To obtain the data, the researchers collaborated with multiple public and private organizations, including the U.S. Department of Defense, the Michael J. Fox Foundation, 23andMe and many international investigators.

Affecting millions of people worldwide, Parkinson’s disease is a degenerative disorder that causes movement problems, including trembling of the hands, arms, or legs, stiffness of the limbs and trunk, slowed movements and problems with posture. Over time, patients may have difficulty walking, talking, or completing other simple tasks. Although nine genes have been shown to cause rare forms of Parkinson’s disease, scientists continue to search for genetic risk factors to provide a complete genetic picture of the disorder.

The researchers confirmed the results in another sample of subjects, including 5,353 patients and 5,551 controls. By comparing the genetic regions to sequences on a state-of-the-art gene chip called NeuroX, the researchers confirmed that 24 variants represent genetic risk factors for Parkinson’s disease, including six variants that had not been previously identified. The NeuroX gene chip contains the codes of approximately 24,000 common genetic variants thought to be associated with a broad spectrum of neurodegenerative disorders.

"The replication phase of the study demonstrates the utility of the NeuroX chip for unlocking the secrets of neurodegenerative disorders," said Dr. Sutherland. "The power of these high tech, data-driven genomic methods allows scientists to find the needle in the haystack that may ultimately lead to new treatments."

Some of the newly identified genetic risk factors are thought to play roles in Gaucher’s disease, in regulating inflammation, in signaling by the nerve cell chemical messenger dopamine, and in the biology of alpha-synuclein, a protein that has been shown to accumulate in the brains of some patients with Parkinson’s disease. Further research is needed to determine the roles of the variants identified in this study.


Filed under parkinson's disease GWAS NeuroX genetics neuroscience science


(Image caption: Techniques known as dimensionality reduction can help find patterns in the recorded activity of thousands of neurons. Rather than look at all responses at once, these methods find a smaller set of dimensions — in this case three — that capture as much structure in the data as possible. Each trace in these graphics represents the activity of the whole brain during a single presentation of a moving stimulus, and different versions of the analysis capture structure related either to the passage of time (left) or the direction of the motion (right). The raw data is the same in both cases, but the analyses find different patterns. Credit: Jeremy Freeman, Nikita Vladimirov, Takashi Kawashima, Yu Mu, Nicholas Sofroniew, Davis Bennett, Joshua Rosen, Chao-Tsung Yang, Loren Looger, Philipp Keller, Misha Ahrens)

New Tools Help Neuroscientists Analyze Big Data

In an age of “big data,” a single computer cannot always find the solution a user wants. Computational tasks must instead be distributed across a cluster of computers that analyze a massive data set together. It’s how Facebook and Google mine your web history to present you with targeted ads, and how Amazon and Netflix recommend your next favorite book or movie. But big data is about more than just marketing.

New technologies for monitoring brain activity are generating unprecedented quantities of information. That data may hold new insights into how the brain works – but only if researchers can interpret it. To help make sense of the data, neuroscientists can now harness the power of distributed computing with Thunder, a library of tools developed at the Howard Hughes Medical Institute’s Janelia Research Campus.

Thunder speeds the analysis of data sets that are so large and complex they would take days or weeks to analyze on a single workstation – if a single workstation could do it at all. Janelia group leaders Jeremy Freeman, Misha Ahrens, and other colleagues at Janelia and the University of California, Berkeley, report in the July 27, 2014, issue of the journal Nature Methods that they have used Thunder to quickly find patterns in high-resolution images collected from the brains of active zebrafish and mice with multiple imaging techniques.

Importantly, they have used Thunder to analyze imaging data from a new microscope that Ahrens and colleagues developed to monitor the activity of nearly every individual cell in the brain of a zebrafish as it behaves in response to visual stimuli. That technology is described in a companion paper published in the same issue of Nature Methods.

Thunder can run on a private cluster or on Amazon’s cloud computing services. Researchers can find everything they need to begin using the open source library of tools at http://freeman-lab.github.io/thunder

New microscopes are capturing images of the brain faster, with better spatial resolution, and across wider regions of the brain than ever before. Yet all that detail comes encoded in gigabytes or even terabytes of data. On a single workstation, simple calculations can take hours. “For a lot of these data sets, a single machine is just not going to cut it,” Freeman says.

It’s not just the sheer volume of data that exceeds the limits of a single computer, Freeman and Ahrens say, but also its complexity. “When you record information from the brain, you don’t know the best way to get the information that you need out of it. Every data set is different. You have ideas, but whether or not they generate insights is an open question until you actually apply them,” says Ahrens.

Neuroscientists rarely arrive at new insights about the brain the first time they consider their data, he explains. Instead, an initial analysis may hint at a more promising approach, and with a few adjustments and a new computational analysis, the data may begin to look more meaningful. “Being able to apply these analyses quickly — one after the other — is important. Speed gives a researcher more flexibility to explore and get new ideas.”

That’s why trying to analyze neuroscience data with slow computational tools can be so frustrating. “For some analyses, you can load the data, start it running, and then come back the next day,” Freeman says. “But if you need to tweak the analysis and run it again, then you have to wait another night.” For larger data sets, the lag time might be weeks or months.

Distributed computing was an obvious solution to accelerate analysis while exploring the full richness of a data set, but many alternatives are available. Freeman chose to build on a new platform called Spark. Developed at the University of California, Berkeley’s AMPLab, Spark is rapidly becoming a favored tool for large-scale computing across industry, Freeman says. Spark’s data-caching capability eliminates the bottleneck of loading a complete data set for all but the initial step, making it well-suited for interactive, exploratory analysis, and for complex algorithms requiring repeated operations on the same data. And Spark’s elegant and versatile application programming interfaces (APIs) help simplify development. Thunder uses the Python API, which Freeman hopes will make it particularly easy for others to adopt, given Python’s increasing use in neuroscience and data science.
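The load-once, analyze-repeatedly pattern that makes caching so valuable can be sketched in plain Python (no cluster involved; the record layout and synthetic data below are invented for illustration, not Thunder's actual API):

```python
import statistics

def load_recordings():
    # Stand-in for the expensive step: reading huge imaging data off disk
    # as (cell_id, time_series) records. Synthetic data for illustration.
    return [(i, [float((i + t) % 5) for t in range(100)]) for i in range(1000)]

# In Spark this is the role of rdd.cache(): the load cost is paid once,
# and every subsequent analysis reuses the in-memory copy.
cached = load_recordings()

# Analysis 1: per-cell mean activity.
means = {cell: statistics.fmean(series) for cell, series in cached}

# Analysis 2, tweaked and rerun without reloading: per-cell variability.
stdevs = {cell: statistics.pstdev(series) for cell, series in cached}
```

Each follow-up analysis touches only the in-memory records, which is what turns an overnight rerun into an interactive tweak.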
To make Spark suitable for analyzing a broad range of neuroscience data – information about connectivity and activity collected from different organisms and with different techniques – Freeman first developed standardized representations of data that were amenable to distributed computing. He then worked to express typical neuroscience workflows in the computational language of Spark.

From there, he says, the biological questions that he and his colleagues were curious about drove development. “We started with our questions about the biology, then came up with the analyses and developed the tools,” he says.

The result is a modular set of tools that will expand as the Janelia team — and the neuroscience community — add new components. “The analyses we developed are building blocks,” says Ahrens. “The development of new analyses for interpreting large-scale recording is an active field and goes hand-in-hand with the development of resources for large-scale computing and imaging. The algorithms in our paper are a starting point.”

Using Thunder, Freeman, Ahrens, and their colleagues analyzed images of the brain in minutes, interacting with and revising analyses without the lengthy delays associated with previous methods. In images taken of a mouse brain with a two-photon microscope, for example, the team found cells in the brain whose activity varied with running speed.
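Finding cells whose activity varies with running speed boils down to correlating each cell's time series against the speed trace. A minimal sketch on simulated data (the "speed" signal and both "cells" below are fabricated for illustration):

```python
import math
import random

random.seed(1)
speed = [abs(math.sin(t / 10)) for t in range(300)]  # simulated running speed

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# One simulated cell tracks speed (plus noise); the other is pure noise.
speed_cell = [s + random.gauss(0, 0.1) for s in speed]
noise_cell = [random.gauss(0, 1.0) for _ in speed]

# The speed-tracking cell correlates strongly; the noise cell does not.
```

Run over thousands of cells at once, this per-cell screen is exactly the kind of embarrassingly parallel computation that distributes well across a cluster.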
For analyzing much larger data sets, tools such as Thunder are not just helpful, they are essential, the scientists say. This is true for the information collected by the new microscope that Ahrens and colleagues developed for monitoring whole-brain activity in response to visual stimuli.

Last year, Ahrens and Janelia group leader Philipp Keller used high-speed light-sheet imaging to engineer a microscope that captures neuronal activity cell by cell across nearly the entire brain of a larval zebrafish. That microscope produced stunning images of neurons in the zebrafish brain firing while the fish was inactive. But Ahrens wanted to use the technology to study the brain’s activity in more complex situations. Now, the team has combined their original technology with a virtual-reality swim simulator that Ahrens previously developed to provide fish with visual feedback that simulates movement.

In a light-sheet microscope, a sheet of laser light scans across a sample, illuminating a thin section at a time. To enable a fish in the microscope to see and respond to its virtual-reality environment, Ahrens’ team needed to protect its eyes. So they programmed the laser to quickly shut off when its light sheet approaches the eye and restart once the area is cleared. Then they introduced a second laser that scans the sample from a different angle to ensure that the region of the brain behind the eyes is imaged. Together, the two lasers image the brain with nearly complete coverage without interfering with the animal’s vision.

Combining these two technologies lets Ahrens monitor activity throughout the brain as a fish adjusts its behavior based on the sensory information it receives. The technique generates about a terabyte of data in an hour – presenting a data analysis challenge that helped motivate the development of Thunder. When Freeman and Ahrens applied their new tools to the data, patterns quickly emerged. As examples, they identified cells whose activity was associated with movement in particular directions and cells that fired specifically when the fish was at rest, and were able to characterize the dynamics of those cells’ activities. Example analyses like these, and example data sets, are available at the website http://research.janelia.org/zebrafish/.

Ahrens now plans to explore more complex questions using the new technology, and both he and Freeman foresee expansion of Thunder. “At every level, this is really just the beginning,” Freeman says.

(Image caption: Techniques known as dimensionality reduction can help find patterns in the recorded activity of thousands of neurons. Rather than look at all responses at once, these methods find a smaller set of dimensions — in this case three — that capture as much structure in the data as possible. Each trace in these graphics represents the activity of the whole brain during a single presentation of a moving stimulus, and different versions of the analysis capture structure related either to the passage of time (left) or the direction of the motion (right). The raw data is the same in both cases, but the analyses finds different patterns. Credit: Jeremy Freeman, Nikita Vladimirov, Takashi Kawashima, Yu Mu, Nicholas Sofroniew, Davis Bennett, Joshua Rosen, Chao-Tsung Yang, Loren Looger, Philipp Keller, Misha Ahrens)

New Tools Help Neuroscientists Analyze Big Data

In an age of “big data,” a single computer cannot always find the solution a user wants. Computational tasks must instead be distributed across a cluster of computers that analyze a massive data set together. It’s how Facebook and Google mine your web history to present you with targeted ads, and how Amazon and Netflix recommend your next favorite book or movie. But big data is about more than just marketing.

New technologies for monitoring brain activity are generating unprecedented quantities of information. That data may hold new insights into how the brain works – but only if researchers can interpret it. To help make sense of the data, neuroscientists can now harness the power of distributed computing with Thunder, a library of tools developed at the Howard Hughes Medical Institute’s Janelia Research Campus.

Thunder speeds the analysis of data sets that are so large and complex they would take days or weeks to analyze on a single workstation – if a single workstation could do it at all. Janelia group leaders Jeremy Freeman, Misha Ahrens, and other colleagues at Janelia and the University of California, Berkeley, report in the July 27, 2014, issue of the journal Nature Methods that they have used Thunder to quickly find patterns in high-resolution images collected from the brains of active zebrafish and mice with multiple imaging techniques.

Importantly, they have used Thunder to analyze imaging data from a new microscope that Ahrens and colleagues developed to monitor the activity of nearly every individual cell in the brain of a zebrafish as it behaves in response to visual stimuli. That technology is described in a companion paper published in the same issue of Nature Methods.

Thunder can run on a private cluster or on Amazon’s cloud computing services. Researchers can find everything they need to begin using the open source library of tools at http://freeman-lab.github.io/thunder

New microscopes are capturing images of the brain faster, with better spatial resolution, and across wider regions of the brain than ever before. Yet all that detail comes encoded in gigabytes or even terabytes of data. On a single workstation, simple calculations can take hours. “For a lot of these data sets, a single machine is just not going to cut it,” Freeman says.

It’s not just the sheer volume of data that exceeds the limits of a single computer, Freeman and Ahrens say, but also its complexity. “When you record information from the brain, you don’t know the best way to get the information that you need out of it. Every data set is different. You have ideas, but whether or not they generate insights is an open question until you actually apply them,” says Ahrens.

Neuroscientists rarely arrive at new insights about the brain the first time they consider their data, he explains. Instead, an initial analysis may hint at a more promising approach, and with a few adjustments and a new computational analysis, the data may begin to look more meaningful. “Being able to apply these analyses quickly — one after the other — is important. Speed gives a researcher more flexibility to explore and get new ideas.”

That’s why trying to analyze neuroscience data with slow computational tools can be so frustrating. “For some analyses, you can load the data, start it running, and then come back the next day,” Freeman says. “But if you need to tweak the analysis and run it again, then you have to wait another night.” For larger data sets, the lag time might be weeks or months.

Distributed computing was an obvious solution to accelerate analysis while exploring the full richness of a data set, but many alternatives are available. Freeman chose to build on a new platform called Spark. Developed at the University of California, Berkeley’s AMPLab, Spark is rapidly becoming a favored tool for large-scale computing across industry, Freeman says. Spark’s data-caching capabilities eliminate the bottleneck of loading a complete data set for all but the initial step, making it well-suited for interactive, exploratory analysis and for complex algorithms that require repeated operations on the same data. And Spark’s elegant and versatile application programming interfaces (APIs) help simplify development. Thunder uses the Python API, which Freeman hopes will make it particularly easy for others to adopt, given Python’s increasing use in neuroscience and data science.

To make Spark suitable for analyzing a broad range of neuroscience data – information about connectivity and activity collected from different organisms and with different techniques – Freeman first developed standardized representations of data that were amenable to distributed computing. He then worked to express typical neuroscience workflows into the computational language of Spark.
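A single-machine sketch suggests what such a standardized representation might look like: voxel-coordinate keys paired with time-series values, loosely modeled on this idea (the details are illustrative, not Thunder’s actual API):

```python
import numpy as np

# A toy 4-D recording: (x, y, z, time) -- a 4x4x2 volume over 50 frames.
movie = np.random.rand(4, 4, 2, 50)

# Flatten into key-value records: key = voxel coordinate, value = that
# voxel's time series. Records like these distribute naturally across a
# cluster, since each trace can be analyzed independently on any worker.
records = [((x, y, z), movie[x, y, z, :])
           for x in range(4) for y in range(4) for z in range(2)]

key, series = records[0]
print(len(records), key, series.shape)   # 32 (0, 0, 0) (50,)
```

The same layout works whether the values are activity traces, connectivity vectors, or pixels from different imaging techniques, which is what makes it a common currency for distributed analysis.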

From there, he says, the biological questions that he and his colleagues were curious about drove development. “We started with our questions about the biology, then came up with the analyses and developed the tools,” he says.

The result is a modular set of tools that will expand as the Janelia team — and the neuroscience community — add new components. “The analyses we developed are building blocks,” says Ahrens. “The development of new analyses for interpreting large-scale recording is an active field and goes hand-in-hand with the development of resources for large-scale computing and imaging. The algorithms in our paper are a starting point.”

Using Thunder, Freeman, Ahrens, and their colleagues analyzed images of the brain in minutes, interacting with and revising analyses without the lengthy delays associated with previous methods. In images taken of a mouse brain with a two-photon microscope, for example, the team found cells in the brain whose activity varied with running speed.

For analyzing much larger data sets, tools such as Thunder are not just helpful, they are essential, the scientists say. This is true for the information collected by the new microscope that Ahrens and colleagues developed for monitoring whole-brain activity in response to visual stimuli.

Last year, Ahrens and Janelia group leader Philipp Keller engineered a high-speed light-sheet microscope that captures neuronal activity cell by cell across nearly the entire brain of a larval zebrafish. That microscope produced stunning images of neurons in the zebrafish brain firing while the fish was inactive. But Ahrens wanted to use the technology to study the brain’s activity in more complex situations. Now, the team has combined their original technology with a virtual-reality swim simulator that Ahrens previously developed to provide fish with visual feedback that simulates movement.

In a light sheet microscope, a sheet of laser light scans across a sample, illuminating a thin section at a time. To enable a fish in the microscope to see and respond to its virtual-reality environment, Ahrens’ team needed to protect its eyes. So they programmed the laser to quickly shut off when its light sheet approaches the eye and restart once the area is cleared. Then they introduced a second laser that scans the sample from a different angle to ensure that the region of the brain behind the eyes is imaged. Together, the two lasers image the brain with nearly complete coverage without interfering with the animal’s vision.
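As a toy sketch of that gating idea (the function, coordinates, and the assumption that the second laser takes over only while the first is blanked are all hypothetical; the actual control scheme is not detailed here):

```python
def lasers_active(sheet_y, eye_zone=(40.0, 60.0)):
    """Toy gating rule: blank the primary laser while its light sheet
    would sweep across the protected eye region, and let a second laser,
    entering from a different angle, cover the tissue behind the eyes.
    Positions are in arbitrary units; purely illustrative."""
    lo, hi = eye_zone
    primary_on = not (lo <= sheet_y <= hi)
    secondary_on = not primary_on   # fills in while the primary is blanked
    return primary_on, secondary_on

print(lasers_active(10.0))   # (True, False): sheet far from the eyes
print(lasers_active(50.0))   # (False, True): sheet over the eyes
```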

Combining these two technologies lets Ahrens monitor activity throughout the brain as a fish adjusts its behavior based on the sensory information it receives. The technique generates about a terabyte of data in an hour – presenting a data analysis challenge that helped motivate the development of Thunder. When Freeman and Ahrens applied their new tools to the data, patterns quickly emerged. As examples, they identified cells whose activity was associated with movement in particular directions and cells that fired specifically when the fish was at rest, and were able to characterize the dynamics of those cells’ activities. Example analyses like these, and example data sets, are available at the website http://research.janelia.org/zebrafish/.
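Analyses like these often begin with dimensionality reduction of the kind shown in the figure caption: projecting the activity of thousands of neurons onto a handful of dimensions. A minimal single-machine sketch with scikit-learn’s PCA on simulated data (everything here is illustrative; the paper’s analyses ran on Thunder at much larger scale):

```python
import numpy as np
from sklearn.decomposition import PCA

# Simulated recording: 500 "neurons" x 200 time points, all weighting
# one shared slow signal, plus noise.
rng = np.random.default_rng(0)
latent = np.sin(np.linspace(0, 4 * np.pi, 200))
activity = np.outer(rng.normal(size=500), latent)
activity += 0.1 * rng.normal(size=activity.shape)

# Reduce the 500-dimensional population activity to 3 dimensions,
# treating each time point as one sample.
pca = PCA(n_components=3)
traces = pca.fit_transform(activity.T)

print(traces.shape)   # (200, 3): a 3-D trajectory through time
```

Each row of `traces` is the whole population’s state at one moment, so plotting the three columns against each other gives low-dimensional trajectories like those in the figure.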

Ahrens now plans to explore more complex questions using the new technology, and both he and Freeman foresee expansion of Thunder. “At every level, this is really just the beginning,” Freeman says.

Filed under brain activity zebrafish Thunder computational analysis neuroscience science


What sign language teaches us about the brain
The world’s leading humanoid robot, ASIMO, has recently learnt sign language. The news of this breakthrough came just as I completed Level 1 of British Sign Language (I dare say it took me longer to master signing than it did the robot!). As a neuroscientist, the experience of learning to sign made me think about how the brain perceives this means of communicating.
For instance, during my training, I found that mnemonics greatly simplified my learning process. To sign the colour blue you use the fingers of your right hand to rub the back of your left hand, my simple mnemonic for this sign being that the veins on the back of our hand appear blue. I was therefore forming an association between the word blue (English), the sign for blue (BSL), and the visual aid that links the two. However, the two languages differ markedly in that one relies on sounds and the other on visual signs.
Do our brains process these languages differently? It seems that for the most part, they don’t. And it turns out that brain studies of sign language users have helped bust a few myths.
Read more


Filed under sign language neuroimaging communication lesion studies neuroscience science


Slow Walking Speed and Memory Complaints Can Predict Dementia

A study involving nearly 27,000 older adults on five continents found that nearly 1 in 10 met criteria for pre-dementia based on a simple test that measures how fast people walk and whether they have cognitive complaints. People who tested positive for pre-dementia were twice as likely as others to develop dementia within 12 years. The study, led by scientists at Albert Einstein College of Medicine of Yeshiva University and Montefiore Medical Center, was published online on July 16, 2014 in Neurology®, the medical journal of the American Academy of Neurology.


The new test diagnoses motoric cognitive risk syndrome (MCR). Testing for the newly described syndrome relies on measuring gait speed (how fast a person walks) and asking a few simple questions about a patient’s cognitive abilities, both of which take just seconds. The test does not rely on the latest medical technology and can be done in a clinical setting, diagnosing people in the early stages of the dementia process. Early diagnosis is critical because it allows time to identify and possibly treat the underlying causes of the disease, which may delay or even prevent the onset of dementia in some cases.

“In many clinical and community settings, people don’t have access to the sophisticated tests—biomarker assays, cognitive tests or neuroimaging studies—used to diagnose people at risk for developing dementia,” said Joe Verghese, M.B.B.S., professor in the Saul R. Korey Department of Neurology and of medicine at Einstein, chief of geriatrics at Einstein and Montefiore, and senior author of the Neurology paper. “Our assessment method could enable many more people to learn if they’re at risk for dementia, since it avoids the need for complex testing and doesn’t require that the test be administered by a neurologist. The potential payoff could be tremendous—not only for individuals and their families, but also in terms of healthcare savings for society. All that’s needed to assess MCR is a stopwatch and a few questions, so primary care physicians could easily incorporate it into examinations of their older patients.”

The U.S. Centers for Disease Control and Prevention estimates that up to 5.3 million Americans—about 1 in 9 people age 65 and over—have Alzheimer’s disease, the most common type of dementia. That number is expected to more than double by 2050 due to population aging.

“As a young researcher, I examined hundreds of patients and noticed that if an older person was walking slowly, there was a good chance that his cognitive tests were also abnormal,” said Dr. Verghese, who is also the Murray D. Gross Memorial Faculty Scholar in Gerontology at Einstein. “This gave me the idea that perhaps we could use this simple clinical sign—how fast someone walks—to predict who would develop dementia. In a 2002 New England Journal of Medicine study, we reported that abnormal gait patterns accurately predict whether people will go on to develop dementia. MCR improves on the slow gait concept by evaluating not only patients’ gait speed but also whether they have cognitive complaints.”

The Neurology paper reported on the prevalence of MCR among 26,802 adults without dementia or disability aged 60 years and older enrolled in 22 studies in 17 countries. A significant number of adults—9.7 percent—met the criteria for MCR (i.e., abnormally slow gait and cognitive complaints). While the syndrome was equally common in men and women, highly educated people were less likely to test positive for MCR compared with less-educated individuals. A slow gait, said Dr. Verghese, is a walking speed slower than about one meter per second, which is about 2.2 miles per hour (m.p.h.). Less than 0.6 meters per second (or 1.3 m.p.h.) is “clearly abnormal.”

To test whether MCR predicts future dementia, the researchers focused on four of the 22 studies that tested a total of 4,812 people for MCR and then evaluated them annually over an average follow-up period of 12 years to see which ones developed dementia. Those who met the criteria for MCR were nearly twice as likely to develop dementia over the following 12 years compared with people who did not.

Dr. Verghese emphasized that a slow gait alone is not sufficient for a diagnosis of MCR. “Walking slowly could be due to conditions such as arthritis or an inner ear problem that affects balance, which would not increase risk for dementia. To meet the criteria for MCR requires having a slow gait and cognitive problems. An example would be answering ‘yes’ to the question, ‘Do you think you have more memory problems than other people?’”
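Put as a toy decision rule, the screen looks roughly like this (thresholds taken from the article; purely illustrative, not a clinical instrument):

```python
def meets_mcr_criteria(gait_speed_m_per_s, has_cognitive_complaint):
    """Toy MCR screen: slow gait (< ~1 m/s, per the article) PLUS a
    cognitive complaint. Both are required -- slow walking alone can
    have many non-cognitive causes, such as arthritis."""
    slow_gait = gait_speed_m_per_s < 1.0
    return slow_gait and has_cognitive_complaint

print(meets_mcr_criteria(0.8, True))    # True: slow gait + complaint
print(meets_mcr_criteria(0.8, False))   # False: slow gait alone
print(meets_mcr_criteria(1.3, True))    # False: normal walking speed
```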

For patients meeting MCR criteria, said Dr. Verghese, the next step is to look for the causes of their slow gait and cognitive complaints. The search may reveal underlying—and controllable—problems. “Evidence increasingly suggests that brain health is closely tied to cardiovascular health—meaning that treatable conditions such as hypertension, smoking, high cholesterol, obesity and diabetes can interfere with blood flow to the brain and thereby increase a person’s risk for developing Alzheimer’s and other dementias,” said Dr. Verghese.

What about people who meet MCR criteria but no treatable underlying problems can be found?

“Even in the absence of a specific cause, we know that most healthy lifestyle factors, such as exercising and eating healthier, have been shown to reduce the rate of cognitive decline,” said Dr. Verghese. “In addition, our group has shown that cognitively stimulating activities—playing board games, card games, reading, writing and also dancing—can delay dementia’s onset. Knowing they’re at high risk for dementia can also help people and their families make arrangements for the future, which is an aspect of MCR testing that I’ve found is very important in my own clinical practice.”

Filed under dementia motoric cognitive risk syndrome gait speed cognitive decline neuroscience science


Researchers discover that Klotho is neuroprotective against Alzheimer’s disease

Boston University School of Medicine researchers may have found a way to delay or even prevent Alzheimer’s disease (AD). They discovered that pre-treatment of neurons with the anti-aging protein Klotho can prevent neuron death in the presence of the toxic amyloid protein and glutamate. These findings currently appear in the Journal of Biological Chemistry.

Alzheimer’s disease is the most frequent age-related dementia, affecting 5.4 million Americans, including 13 percent of people age 65 and older and more than 40 percent of people over the age of 85. In AD, the cognitive decline and dementia result from the death of nerve cells involved in learning and memory. The amyloid protein and an excess of the neurotransmitter glutamate are partly responsible for this neuronal death.

Nerve cells were grown in petri dishes and treated with or without Klotho for four hours. Amyloid or glutamate was then added to the dishes for 24 hours. In the dishes where Klotho was added, a much higher percentage of neurons survived than in the dishes without Klotho.

"Finding a neuroprotective agent that will protect nerve cells from amyloid that accumulates as a function of age in the brain is novel and of major importance," explained corresponding author Carmela R. Abraham, PhD, professor of biochemistry and pharmacology at BUSM. "We now have evidence that if more Klotho is present in the brain, it will protect the neurons from the oxidative stress induced by amyloid and glutamate."

According to the researchers, Klotho is a large protein that cannot penetrate the blood brain barrier, so it can’t be administered by mouth or injection. However, in a separate study, the researchers have identified small molecules that can enter the brain and increase the levels of Klotho. “We believe that increasing Klotho levels with such compounds would improve the outcome for Alzheimer’s patients, and if started early enough would prevent further deterioration. This potential treatment has implications for other neurodegenerative diseases such as Parkinson’s, Huntington’s, ALS and brain trauma, as well,” added Abraham.

(Source: eurekalert.org)

Filed under klotho alzheimer's disease neuroprotection glutamate oxidative stress neuroscience science


Anti-inflammatory drug can prevent neuron loss in Parkinson’s model

An experimental anti-inflammatory drug can protect vulnerable neurons and reduce motor deficits in a rat model of Parkinson’s disease, researchers at Emory University School of Medicine have shown.

The results were published Thursday, July 24 in the Journal of Parkinson’s Disease.


The findings demonstrate that the drug, called XPro1595, can reach the brain at sufficient levels and have beneficial effects when administered by subcutaneous injection, like an insulin shot. Previous studies of XPro1595 in animals tested more invasive modes of delivery, such as direct injection into the brain.

“This is an important step forward for anti-inflammatory therapies for Parkinson’s disease,” says Malu Tansey, PhD, associate professor of physiology at Emory University School of Medicine. “Our results provide a compelling rationale for moving toward a clinical trial in early Parkinson’s disease patients.”

The new research on subcutaneous administration of XPro1595 was funded by the Michael J. Fox Foundation for Parkinson’s Research (MJFF). XPro1595 is licensed to FPRT Bio, which is seeking funding for a clinical trial to test the drug’s efficacy in the early stages of Parkinson’s disease.

“We are proud to have supported this work and glad to see positive pre-clinical results,” said Marco Baptista, PhD, MJFF associate director of research programs. “A therapy that could slow Parkinson’s progression would be a game changer for the millions living with this disease, and this study is a step in that direction.”

In addition, Tansey and Yoland Smith, PhD, from Yerkes National Primate Research Center, were awarded a grant this week from the Parkinson’s Disease Foundation to test XPro1595 in a non-human primate model of Parkinson’s.

Evidence has been piling up that inflammation is an important mechanism driving the progression of Parkinson’s disease. XPro1595 targets tumor necrosis factor (TNF), a critical inflammatory signaling molecule, and is specific to the soluble form of TNF. This specificity would avoid compromising immunity to infections, a known side effect of existing anti-TNF drugs used to treat disorders such as rheumatoid arthritis.

“Inflammation is probably not the initiating event in Parkinson’s disease, but it is important for the neurodegeneration that follows,” Tansey says. “That’s why we believe that an anti-inflammatory agent, such as one that counteracts soluble TNF, could substantially slow the progression of the disease.”

Postdoctoral fellow Christopher Barnum, PhD and colleagues used a model of Parkinson’s disease in rats in which the neurotoxin 6-hydroxydopamine (6-OHDA) is injected into only one side of the brain. This reproduces some aspects of Parkinson’s disease: neurons that produce dopamine in the injected side of the brain die, leading to impaired movement on the opposite side of the body.

When XPro1595 was given to the animals three days after 6-OHDA injection, just 15 percent of the dopamine-producing neurons were lost five weeks later, compared with controls, in which 55 percent of the same neurons were lost. By reducing dopamine neuron loss with XPro1595, the researchers were also able to reduce motor impairment. In fact, the degree of dopamine cell loss was highly correlated with both the degree of motor impairment and the extent of immune cell activation.

When XPro1595 was given two weeks after injection, 44 percent of the vulnerable neurons were still lost, suggesting that there is a limited window of opportunity to intervene.

“Recent clinical studies indicate there is a four- or five-year window between diagnosis of Parkinson’s disease and the time when the maximum number of vulnerable neurons are lost,” Dr. Tansey says. “If this is true, and if inflammation is playing a key role during this window, then we might be able to slow or halt the progression of Parkinson’s with a treatment like XPro1595.”

(Source: news.emory.edu)

Filed under parkinson's disease substantia nigra inflammation microglia astrocytes neuroscience science


Experiences at every stage of life contribute to cognitive abilities in old age

Early life experiences, such as childhood socioeconomic status and literacy, may have greater influence on the risk of cognitive impairment late in life than such demographic characteristics as race and ethnicity, a large study by researchers with the UC Davis Alzheimer’s Disease Center and the University of Victoria, Canada, has found.


“Declining cognitive function in older adults is a major personal and public health concern,” said Bruce Reed, professor of neurology and associate director of the UC Davis Alzheimer’s Disease Center.

“But not all people lose cognitive function, and understanding the remarkable variability in cognitive trajectories as people age is of critical importance for prevention, treatment and planning to promote successful cognitive aging and minimize problems associated with cognitive decline.”

The study, “Life Experiences and Demographic Influences on Cognitive Function in Older Adults,” is published online in Neuropsychology, a journal of the American Psychological Association. It is one of the first comprehensive examinations of the multiple influences of varied demographic factors early in life and their relationship to cognitive aging.

The research was conducted in a group of over 300 diverse men and women who spoke either English or Spanish. They were recruited from senior citizen social, recreational and residential centers, as well as churches and health-care settings. At the time of recruitment, all study participants were 60 or older and had no major psychiatric or life-threatening medical illnesses. Participants were Caucasian, African-American or Hispanic.

The extensive testing included multidisciplinary diagnostic evaluations through the UC Davis Alzheimer’s Disease Center in either English or Spanish, which permitted comparisons across a diverse cohort of participants.

Consistent with previous research, the study found that non-Latino Caucasians scored 20 to 25 percent higher on tests of semantic memory (general knowledge) and 13 to 15 percent higher on tests of executive functioning compared to the other ethnic groups. However, ethnic differences in executive functioning disappeared and differences in semantic memory were reduced by 20 to 30 percent when group differences in childhood socioeconomic status, adult literacy and extent of physical activity during adulthood were considered. 

“This study is unusual in that it examines how many different life experiences affect cognitive decline in late life,” said Dan Mungas, professor of neurology and associate director of the UC Davis Alzheimer’s Disease Research Center. 

“It shows that variables like ethnicity and years of education that influence cognitive test scores in a single evaluation are not associated with rate of cognitive decline, but that specific life experiences like level of reading attainment and intellectually stimulating activities are predictive of the rate of late-life cognitive decline. This suggests that intellectual stimulation throughout the life span can reduce cognitive decline in old age.”

Regardless of ethnicity, advanced age and the apolipoprotein E (APOE) genotype were associated with increased cognitive decline over the average of four years that participants were followed. APOE is the largest known genetic risk factor for late-onset Alzheimer’s. Less decline was experienced by persons who reported more engagement in recreational activities in late life and who maintained their levels of activity engagement from middle age to old age. Single-word reading — the ability to decode a word on sight, which often is considered an indication of the quality of educational experience — was also associated with less cognitive decline, a finding that was true for both English and Spanish readers, irrespective of their race or ethnicity. These findings suggest that early life experiences affect late-life cognition indirectly, through literacy and late-life recreational pursuits, the authors said.

“These findings are important,” explained Paul Brewster, lead author of the study, a doctoral student at the University of Victoria, Canada, and a pre-doctoral psychology intern at the UC San Diego Department of Psychiatry, “because they challenge earlier research that suggests associations between race and ethnicity, particularly among Latinos, and an increased risk of late-life cognitive impairment and dementia.

“Our findings suggest that the influences of demographic factors on late-life cognition may be reflective of broader socioeconomic factors, such as educational opportunity and related differences in physical and mental activity across the life span.”

(Source: ucdmc.ucdavis.edu)

Filed under alzheimer's disease cognitive impairment life experience apoE4 psychology neuroscience science
