Neuroscience

Articles and news from the latest research reports.

89 notes

Researchers find gene critical for development of brain motor centre

In a report published today in Nature Communications, an Ottawa-led team of researchers describes the role of a specific gene, called Snf2h, in the development of the cerebellum. Snf2h is required for the proper development of a healthy cerebellum, a master control centre in the brain for balance, fine motor control and complex physical movements.

Athletes and artists perform their extraordinary feats relying on the cerebellum. As well, the cerebellum is critical for the everyday tasks and activities that we perform, such as walking, eating and driving a car. By removing Snf2h, researchers found that the cerebellum was smaller than normal, and balance and refined movements were compromised.

Led by Dr. David Picketts, a senior scientist at the Ottawa Hospital Research Institute and professor in the Faculty of Medicine at the University of Ottawa, the team describes the Snf2h gene, which is found in our brain’s neural stem cells and functions as a master regulator. When they removed this gene early on in a mouse’s development, its cerebellum only grew to one-third the normal size. It also had difficulty walking, balancing and coordinating its movements, a condition called cerebellar ataxia, which is a component of many neurodegenerative diseases.

"As these cerebellar stem cells divide, on their journey toward becoming specialized neurons, this master gene is responsible for deciding which genes are turned on and which genes are packed tightly away," said Dr. Picketts. "Without Snf2h there to keep things organized, genes that should be packed away are left turned on, while other genes are not properly activated. This disorganization within the cell’s nucleus results in a neuron that doesn’t perform very well—like a car running on five cylinders instead of six."

The cerebellum contains roughly half the neurons found in the brain. It also develops in response to external stimuli. So, as we practice tasks, certain genes or groups of genes are turned on and off, which strengthens these circuits and helps to stabilize or perfect the task being undertaken. The researchers found that the Snf2h gene orchestrates this complex and ongoing process. These master genes, which adapt to external cues to adjust the genes they turn on and off, are known as epigenetic regulators.

"These epigenetic regulators are known to affect memory, behaviour and learning," said Dr. Picketts. "Without Snf2h, not enough cerebellar neurons are produced, and the ones that are produced do not respond and adapt as well to external signals. They also show a progressively disorganized gene expression profile that results in cerebellar ataxia and the premature death of the animal."

There are no studies showing a direct link between Snf2h mutations and diseases with cerebellar ataxia, but Dr. Picketts added that it “is certainly possible and an interesting avenue to explore.”

In 2012, Developmental Cell published a paper by Dr. Picketts’ team showing that mice lacking the sister gene Snf2l were otherwise normal but had larger brains, more cells in all areas of the brain and more actively dividing brain stem cells. The balance between Snf2l and Snf2h gene activity is necessary for controlling brain size and for establishing the proper gene expression profiles that underlie the function of neurons in different regions, including the cerebellum.

Filed under cerebellum Snf2h motor control cerebellar ataxia stem cells gene expression neuroscience science

995 notes

Tiny Molecule Could Help Diagnose and Treat Mental Disorders

Scientists “fingerprint” a culprit in depression, anxiety and other mood disorders

According to the World Health Organization, mood disorders such as depression affect some 10% of the world’s population and are associated with a heavy burden of disease. That is why numerous scientists around the world have invested a great deal of effort in understanding these diseases. Yet the molecular and cellular mechanisms that underlie these problems are still only partly understood.

The existing antidepressants are not good enough: Some 60-70% of patients get no relief from them. For the other 30-40%, that relief is often incomplete, and they must take the drugs for a long period before feeling any effects. In addition, there are many side effects associated with the drugs. New and better drugs are clearly needed, an undertaking that requires, first and foremost, a better understanding of the processes and causes underlying the disorders.

The Weizmann Institute’s Prof. Alon Chen, together with his then PhD student Dr. Orna Issler, investigated the molecular mechanisms of the brain’s serotonin system, which, when misregulated, is involved in depression and anxiety disorders. Chen and his colleagues researched the role of microRNA molecules (small, non-coding RNA molecules that regulate various cellular activities) in the nerve cells that produce serotonin. They succeeded in identifying, for the first time, the unique “fingerprints” of a microRNA molecule that acts on the serotonin-producing nerve cells. Combining bioinformatics methods with experiments, the researchers found a connection between this particular microRNA (miR135) and two proteins that play a key role in serotonin production and the regulation of its activities. The findings appeared today in Neuron.

The scientists noted that in the area of the brain containing the serotonin-producing nerve cells, miR135 levels increased when antidepressant compounds were introduced. Mice that were genetically engineered to produce higher-than-average amounts of the microRNA were more resistant to constant stress: They did not develop any of the behaviors associated with chronic stress, such as anxiety or depression, which would normally appear. In contrast, mice that expressed low levels of miR135 exhibited more of these behaviors; in addition, their response to antidepressants was weaker. In other words, the brain needs the proper miR135 levels – low enough to enable a healthy stress response and high enough to avoid depression or anxiety disorders and to respond to serotonin-boosting antidepressants. When this idea was tested on human blood samples, the researchers found that subjects who suffered from depression had unusually low miR135 levels in their blood. On closer inspection, the scientists discovered that the three genes involved in producing miR135 are located in areas of the genome that are known to be associated with risk factors for bipolar mood disorders.

These findings suggest that miR135 could be a useful therapeutic molecule – both as a blood test for depression and related disorders, and as a target whose levels might be raised in patients. Yeda Research and Development Co. Ltd., the technology transfer arm of the Weizmann Institute, has applied for a patent connected to these findings and recently licensed the rights to miCure Therapeutics to develop a drug and diagnostic method. After completing preclinical trials, the company hopes to begin clinical trials in humans.

Filed under depression mood disorders serotonin microRNA miR135 antidepressants neuroscience science

132 notes

Progesterone could become tool versus brain cancer

The hormone progesterone could become part of therapy against the most aggressive form of brain cancer. High concentrations of progesterone kill glioblastoma cells and inhibit tumor growth when the tumors are implanted in mice, researchers have found.

The results were recently published in the Journal of Steroid Biochemistry and Molecular Biology.

Glioblastoma is the most common and the most aggressive form of brain cancer in adults, with average survival after diagnosis of around 15 months. Surgery, radiation and chemotherapy do prolong survival by several months, but targeted therapies, which have been effective with other forms of cancer, have not lengthened survival in patients fighting glioblastoma.

The lead author of the current paper is Fahim Atif, PhD, Assistant Professor of Emergency Medicine at Emory University. The findings with glioblastoma came out of Emory researchers’ work on progesterone as therapy for traumatic brain injury and more recently, stroke. Atif, Donald Stein and their colleagues have been studying progesterone for the treatment of traumatic brain injury for more than two decades, prompted by Stein’s initial observation that females recover from brain injury more readily than males. There is a similar tilt in glioblastoma as well: primary glioblastoma develops three times more frequently in males compared to females.

These results could pave the way for the use of progesterone against glioblastoma in a human clinical trial, perhaps in combination with standard-of-care therapeutic agents such as temozolomide. However, Stein says that more experiments are necessary with grafts of human tumor cells into animal brains first. His team identified a factor that may be important for clinical trial design: progesterone was not toxic to all glioblastoma cell lines, and its toxicity may depend on whether the tumor suppressor gene p53 is mutated.

Atif, Stein, and colleague Seema Yousuf found that low, physiological doses of progesterone stimulate the growth of glioblastoma tumor cells, but higher doses kill the tumor cells while remaining nontoxic for healthy cells. Similar effects have been seen with the progesterone antagonist RU486, but the authors cite evidence that progesterone is less toxic to healthy cells. Progesterone has also been found to inhibit growth of neuroblastoma cells (neuroblastoma is the most common cancer in infants), as well as breast, ovarian and colon cancers in cell culture and animal models.

(Source: news.emory.edu)

Filed under glioblastoma brain cancer progesterone temozolomide neuroscience science

484 notes

Finding thoughts in speech

For the first time, neuroscientists were able to find out how different thoughts are reflected in neuronal activity during natural conversations. Johanna Derix, Olga Iljina and the interdisciplinary team of Dr. Tonio Ball from the Cluster of Excellence BrainLinks-BrainTools at the University of Freiburg and the Epilepsy Center of the University Medical Center Freiburg (Freiburg, Germany) report on the link between speech, thoughts and brain responses in a special issue of Frontiers in Human Neuroscience.

"Thoughts are difficult to investigate, as one cannot observe in a direct manner what the person is thinking about. Language, however, reflects the underlying mental processes, so we can perform linguistic analyses of the subjects’ speech and use such information as a ‘bridge’ between the neuronal processes and the subject’s thoughts," explains neuroscientist Johanna Derix.

The novelty of the authors’ approach is that the participants were not instructed to think and talk about a given topic in an experimental setting. Instead, the researchers analysed everyday conversations and the underlying brain activity, which was recorded directly from the cortical surface. This study was possible owing to the help of epilepsy patients in whom recordings of neural activity had to be obtained over several days for the purpose of pre-neurosurgical diagnostics.

For a start, borders between individual thoughts in continuous conversations had to be identified. Earlier psycholinguistic research indicates that a simple sentence is a suitable unit to contain a single thought, so the researchers opted for linguistic segmentation into simple sentences. The resulting “idea” units were classified into different categories. These included, for example, whether or not a sentence expressed memory- or self-related content. Then, the researchers analysed content-specific neural responses and observed clearly visible patterns of brain activity.

Thus, the neuroscientists from Freiburg have demonstrated the feasibility of their innovative approach to investigating, via speech, how the human brain processes thoughts under real-life conditions.

Filed under speech production neural activity thinking prefrontal cortex communication autobiographical memory neuroscience science

178 notes

Limb regeneration: do salamanders hold the key?

For the first time, researchers have found that the ‘ERK pathway’ must be constantly active for salamander cells to be reprogrammed, and hence able to contribute to the regeneration of different body parts.

The team identified a key difference between the activity of this pathway in salamanders and mammals, which helps us to understand why humans can’t regrow limbs and sheds light on how regeneration of human cells can be improved.

The study, published in Stem Cell Reports, demonstrates that the ERK pathway is not fully active in mammalian cells, but when forced to be constantly active, gives the cells more potential for reprogramming and regeneration. This could help researchers better understand diseases and design new therapies.

Lead researcher on the study, Dr Max Yun (UCL Institute of Structural & Molecular Biology) said: “While humans have limited regenerative abilities, other organisms, such as the salamander, are able to regenerate an impressive repertoire of complex structures including parts of their hearts, eyes, spinal cord, tails, and they are the only adult vertebrates able to regenerate full limbs.

“We’re thrilled to have found a critical molecular pathway, the ERK pathway, that determines whether an adult cell is able to be reprogrammed and help the regeneration processes. Manipulating this mechanism could contribute to therapies directed at enhancing regenerative potential of human cells.”

The ERK pathway is a way for proteins to communicate a signal from the surface of a cell to the nucleus which contains the cell’s genetic material. Further research will focus on understanding how this important pathway is regulated during limb regeneration, and which other molecules are involved in the process.

(Source: ucl.ac.uk)

Filed under regeneration salamanders regenerative medicine science

149 notes

Seeing the inner workings of the brain made easier by new technique

Last year Karl Deisseroth, a Stanford professor of bioengineering and of psychiatry and behavioral sciences, announced a new way of peering into a brain – removed from the body – that provided spectacular fly-through views of its inner connections. Since then laboratories around the world have begun using the technique, called CLARITY, with some success, to better understand the brain’s wiring.

However, Deisseroth said that with two technological fixes CLARITY could be even more broadly adopted. The first problem was that laboratories were not set up to reliably carry out the CLARITY process. Second, the most commonly available microscopy methods were not designed to image the whole transparent brain. “There have been a number of remarkable results described using CLARITY,” Deisseroth said, “but we needed to address these two distinct challenges to make the technology easier to use.”

In a Nature Protocols paper published June 19, Deisseroth presented solutions to both of those bottlenecks. “These transform CLARITY, making the overall process much easier and the data collection much faster,” he said. He and his co-authors, including postdoctoral fellows Raju Tomer and Li Ye and graduate student Brian Hsueh, anticipate that even more scientists will now be able to take advantage of the technique to better understand the brain at a fundamental level, and also to probe the origins of brain diseases.

This paper may be the first to be published with support of the White House BRAIN Initiative, announced last year with the ambitious goal of mapping the brain’s trillions of nerve connections and understanding how signals zip through those interconnected cells to control our thoughts, memories, movement and everything else that makes us us.

"This work shares the spirit of the BRAIN Initiative goal of building new technologies to understand the brain – including the human brain," said Deisseroth, who is also a Stanford Bio-X affiliated faculty member.

Eliminating fat

When you look at the brain, what you see is the fatty outer covering of the nerve cells within, which blocks microscopes from taking images of the intricate connections between deep brain cells. The idea behind CLARITY was to eliminate that fatty covering while keeping the brain intact, complete with all its intricate inner wiring.

The way Deisseroth and his team eliminated the fat was to build a gel within the intact brain that held all the structures and proteins in place. They then used an electric field to pull out the fat layer that had been dissolved in an electrically charged detergent, leaving behind all the brain’s structures embedded in the firm water-based gel, or hydrogel. This is called electrophoretic CLARITY.

The electric field aspect was a challenge for some labs. “About half the people who tried it got it working right away,” Deisseroth said, “but others had problems with the voltage damaging tissue.” Deisseroth said that this kind of challenge is normal when introducing new technologies. When he first introduced optogenetics, which allows scientists to control individual nerves using light, a similar proportion of labs were not initially set up to easily implement the new technology, and ran into challenges.

To help expand the use of CLARITY, the team devised an alternate way of pulling out the fat from the hydrogel-embedded brain – a technique they call passive CLARITY. It takes a little longer, but still removes all the fat, is much easier and does not pose a risk to the tissue. “Electrophoretic CLARITY is important for cases where speed is critical, and for some tissues,” said Deisseroth, who is also the D.H. Chen Professor. “But passive CLARITY is a crucial advance for the community, especially for neuroscience.” Passive CLARITY requires nothing more than some chemicals, a warm bath and time.

Many groups have begun to apply CLARITY to probe brains donated from people who had diseases like epilepsy or autism, which might have left clues in the brain to help scientists understand and eventually treat the disease. But scientists, including Deisseroth, had been wary of trying electrophoretic CLARITY on these valuable clinical samples with even a very low risk of damage. “It’s a rare and precious donated sample, you don’t want to have a chance of damage or error,” Deisseroth said. “Now the risk issue is addressed, and on top of that you can get the data very rapidly.”

Fast CLARITY imaging in color

The second advance had to do with the rapidity of data collection. In studying any cells, scientists often make use of probes that will go into the cell or tissue, latch onto a particular molecule, then glow green, blue, yellow or other colors in response to particular wavelengths of light. This is what produces the colorful cellular images that are so common in biology research. Using CLARITY, these colorful structures become visible throughout the entire brain, since no fat remains to block the light.

But here’s the hitch. Those probes stop working, or get bleached, after they’ve been exposed to too much light. That’s fine if a scientist is just taking a picture of a small cellular structure, which takes little time. But to get a high-resolution image of an entire brain, the whole tissue is bathed in light throughout the time it takes to image it point by point. This approach bleaches out the probes before the entire brain can be imaged at high resolution.

The second advance of the new paper addresses this issue, making it easier to image the entire brain without bleaching the probes. “We can now scan an entire plane at one time instead of a point,” Deisseroth said. “That buys you a couple orders of magnitude of time, and also efficiently delivers light only to where the imaging is happening.” The technique is called light sheet microscopy and has been around for a while, but previously didn’t have high enough resolution to see the fine details of cellular structures. “We advanced traditional light sheet microscopy for CLARITY, and can now see fine wiring structures deep within an intact adult brain,” Deisseroth said. His lab built their own microscope, but the procedures are described in the paper, and the key components are commercially available. Additionally, Deisseroth’s lab provides free training courses in CLARITY, modeled after his optogenetics courses, to help disseminate the techniques.

Brain imaging to help soldiers

The BRAIN Initiative is being funded through several government agencies including the Defense Advanced Research Projects Agency (DARPA), which funded Deisseroth’s work through its new Neuro-FAST program. Deisseroth said that like the National Institute of Mental Health (NIMH, another major funder of the new paper), DARPA “is interested in deepening our understanding of brain circuits in intact and injured brains to inform the development of better therapies.” The new methods Deisseroth and his team developed will accelerate both human- and animal-model CLARITY; as CLARITY becomes more widely used, it will continue to help reveal how those inner circuits are structured in normal and diseased brains, and perhaps point to possible therapies.

Other arms of the BRAIN Initiative are funded through the National Science Foundation (NSF) and the National Institutes of Health (NIH). A working group for the NIH arm was co-led by William Newsome, professor of neurobiology and director of the Stanford Neurosciences Institute, and also included Deisseroth and Mark Schnitzer, associate professor of biology and of applied physics. That group recently recommended a $4.5 billion investment in the BRAIN Initiative over the next 12 years, which NIH Director Francis Collins approved earlier this month.

In addition to funding by DARPA and NIMH, the work was funded by the NSF, the National Institute on Drug Abuse, the Simons Foundation and the Wiegers Family Fund.

Seeing the inner workings of the brain made easier by new technique

Last year Karl Deisseroth, a Stanford professor of bioengineering and of psychiatry and behavioral sciences, announced a new way of peering into a brain – removed from the body – that provided spectacular fly-through views of its inner connections. Since then laboratories around the world have begun using the technique, called CLARITY, with some success, to better understand the brain’s wiring.

However, Deisseroth said that with two technological fixes CLARITY could be even more broadly adopted. The first problem was that laboratories were not set up to reliably carry out the CLARITY process. Second, the most commonly available microscopy methods were not designed to image the whole transparent brain. “There have been a number of remarkable results described using CLARITY,” Deisseroth said, “but we needed to address these two distinct challenges to make the technology easier to use.”

In a Nature Protocols paper published June 19, Deisseroth presented solutions to both of those bottlenecks. “These transform CLARITY, making the overall process much easier and the data collection much faster,” he said. He and his co-authors, including postdoctoral fellows Raju Tomer and Li Ye and graduate student Brian Hsueh, anticipate that even more scientists will now be able to take advantage of the technique to better understand the brain at a fundamental level, and also to probe the origins of brain diseases.

This paper may be the first to be published with support of the White House BRAIN Initiative, announced last year with the ambitious goal of mapping the brain’s trillions of nerve connections and understanding how signals zip through those interconnected cells to control our thoughts, memories, movement and everything else that makes us us.

"This work shares the spirit of the BRAIN Initiative goal of building new technologies to understand the brain – including the human brain," said Deisseroth, who is also a Stanford Bio-X affiliated faculty member.

Eliminating fat

When you look at the brain, what you see is the fatty outer covering of the nerve cells within, which blocks microscopes from taking images of the intricate connections between deep brain cells. The idea behind CLARITY was to eliminate that fatty covering while keeping the brain intact, complete with all its intricate inner wiring.

The way Deisseroth and his team eliminated the fat was to build a gel within the intact brain that held all the structures and proteins in place. They then used an electric field to pull out the fat layer that had been dissolved in an electrically charged detergent, leaving behind all the brain’s structures embedded in the firm water-based gel, or hydrogel. This is called electrophoretic CLARITY.

The electric field aspect was a challenge for some labs. “About half the people who tried it got it working right away,” Deisseroth said, “but others had problems with the voltage damaging tissue.” Deisseroth said that this kind of challenge is normal when introducing new technologies. When he first introduced optogenetics, which allows scientists to control individual nerves using light, a similar proportion of labs were not initially set up to easily implement the new technology, and ran into challenges.

To help expand the use of CLARITY, the team devised an alternate way of pulling out the fat from the hydrogel-embedded brain – a technique they call passive CLARITY. It takes a little longer, but still removes all the fat, is much easier and does not pose a risk to the tissue. “Electrophoretic CLARITY is important for cases where speed is critical, and for some tissues,” said Deisseroth, who is also the D.H. Chen Professor. “But passive CLARITY is a crucial advance for the community, especially for neuroscience.” Passive CLARITY requires nothing more than some chemicals, a warm bath and time.

Many groups have begun to apply CLARITY to probe brains donated from people who had diseases like epilepsy or autism, which might have left clues in the brain to help scientists understand and eventually treat the disease. But scientists, including Deisseroth, had been wary of trying electrophoretic CLARTY on these valuable clinical samples with even a very low risk of damage. “It’s a rare and precious donated sample, you don’t want to have a chance of damage or error,” Deisseroth said. “Now the risk issue is addressed, and on top of that you can get the data very rapidly.”

Fast CLARITY imaging in color

The second advance has to do with this rapidity of data collection. In studying any cells, scientists often make use of probes that will go into the cell or tissue, latch onto a particular molecule, then glow green, blue, yellow or other colors in response to particular wavelengths of light. This is what produces the colorful cellular images that are so common in biology research. Using CLARITY, these colorful structures become visible throughout the entire brain, since no fat remains to block the light.

But here’s the hitch. Those probes stop working, or get bleached, after they’ve been exposed to too much light. That’s fine if a scientist is just taking a picture of a small cellular structure, which takes little time. But to get a high-resolution image of an entire brain, the whole tissue is bathed in light throughout the time it takes to image it point by point. This approach bleaches out the probes before the entire brain can be imaged at high resolution.

The second advance of the new paper addresses this issue, making it easier to image the entire brain without bleaching the probes. “We can now scan an entire plane at one time instead of a point,” Deisseroth said. “That buys you a couple orders of magnitude of time, and also efficiently delivers light only to where the imaging is happening.” The technique is called light sheet microscopy and has been around for a while, but previously didn’t have high enough resolution to see the fine details of cellular structures. “We advanced traditional light sheet microscopy for CLARITY, and can now see fine wiring structures deep within an intact adult brain,” Deisseroth said. His lab built their own microscope, but the procedures are described in the paper, and the key components are commercially available. Additionally, Deisseroth’s lab provides free training courses in CLARITY, modeled after his optogenetics courses, to help disseminate the techniques.
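A rough back-of-envelope calculation illustrates why capturing a whole plane at once "buys a couple orders of magnitude of time" over point scanning. All the acquisition parameters below are illustrative assumptions, not figures from the paper:

```python
# Back-of-envelope comparison of point scanning vs. light sheet imaging.
# Every number here is an assumed, illustrative value.

plane_pixels = 2048 * 2048      # pixels in one imaging plane (assumed)
point_dwell_s = 2e-6            # assumed dwell time per point for a point scanner
sheet_exposure_s = 0.05         # assumed camera exposure for one full plane
n_planes = 1000                 # assumed number of planes to cover the brain

point_scan_s = plane_pixels * point_dwell_s * n_planes
light_sheet_s = sheet_exposure_s * n_planes

print(f"point scanning: {point_scan_s / 3600:.1f} h")
print(f"light sheet:    {light_sheet_s / 60:.1f} min")
print(f"speedup: ~{point_scan_s / light_sheet_s:.0f}x")
```

Under these assumptions the plane-at-a-time approach is faster by a factor in the hundreds, consistent with the "couple orders of magnitude" quoted above.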

Brain imaging to help soldiers

The BRAIN Initiative is being funded through several government agencies including the Defense Advanced Research Projects Agency (DARPA), which funded Deisseroth’s work through its new Neuro-FAST program. Deisseroth said that like the National Institute of Mental Health (NIMH, another major funder of the new paper), DARPA “is interested in deepening our understanding of brain circuits in intact and injured brains to inform the development of better therapies.” The new methods Deisseroth and his team developed will accelerate both human- and animal-model CLARITY; as CLARITY becomes more widely used, it will continue to help reveal how those inner circuits are structured in normal and diseased brains, and perhaps point to possible therapies.

Other arms of the BRAIN Initiative are funded through the National Science Foundation (NSF) and the National Institutes of Health (NIH). A working group for the NIH arm was co-led by William Newsome, professor of neurobiology and director of the Stanford Neurosciences Institute, and also included Deisseroth and Mark Schnitzer, associate professor of biology and of applied physics. That group recently recommended a $4.5 billion investment in the BRAIN Initiative over the next 12 years, which NIH Director Francis Collins approved earlier this month.

In addition to funding by DARPA and NIMH, the work was funded by the NSF, the National Institute on Drug Abuse, the Simons Foundation and the Wiegers Family Fund.

Filed under CLARITY BRAIN Initiative brain imaging light sheet microscopy neuroscience science

148 notes

Scientists tie social behavior to activity in specific brain circuit

A team of Stanford University investigators has linked a particular brain circuit to mammals’ tendency to interact socially. Stimulating this circuit — one among millions in the brain — instantly increases a mouse’s appetite for getting to know a strange mouse, while inhibiting it shuts down the animal’s drive to socialize with the stranger.

The new findings, published June 19 in Cell, may throw light on psychiatric disorders marked by impaired social interaction such as autism, social anxiety, schizophrenia and depression, said the study’s senior author, Karl Deisseroth, MD, PhD, a professor of bioengineering and of psychiatry and behavioral sciences. The findings are also significant in that they highlight not merely the role of one or another brain chemical, as pharmacological studies tend to do, but rather the specific components of brain circuits involved in a complex behavior. A combination of cutting-edge techniques developed in Deisseroth’s laboratory permitted unprecedented analysis of how brain activity controls behavior.

Deisseroth, the D.H. Chen Professor and a member of the interdisciplinary Stanford Bio-X institute, is a practicing psychiatrist who sees patients with severe social deficits. “People with autism, for example, often have an outright aversion to social interaction,” he said. They can find socializing — even mere eye contact — painful.

Deisseroth pioneered a brain-exploration technique, optogenetics, that involves selectively introducing light-receptor molecules to the surfaces of particular nerve cells in a living animal’s brain. The tip of a long, ultra-thin optical fiber (connected to a laser diode at the other end) is then carefully positioned near the circuit in question, so that the photosensitive cells and the circuits they compose can be remotely stimulated or inhibited at the flick of a light switch while the animal remains free to move around in its cage.

Monitoring activity in real time

Using optogenetics and other methods invented in his laboratory, Deisseroth and his associates were able to both manipulate and monitor activity in specific nerve-cell clusters, and in the fiber tracts connecting them, in mice’s brains in real time while the animals were exposed to either murine newcomers or inanimate objects in various laboratory environments. The mice’s behavioral responses were captured on video and compared with simultaneously recorded brain-circuit activity.

In some cases, the researchers observed activity in various brain centers and nerve-fiber tracts connecting them as the mice variously examined or ignored one another. Other experiments involved stimulating or inhibiting impulses within those circuits to see how these manipulations affected the mice’s social behavior.

To avoid confusing simple social interactions with mating- and aggression-related behaviors, the researchers restricted their experiments to female mouse pairs.

The scientists first examined the relationship between the mice’s social interactions and a region in the brain stem called the ventral tegmental area. The VTA is a key node in the brain’s reward circuitry, which produces sensations of pleasure in response to success in such survival-improving activities as eating, mating or finding a warm shelter in a cold environment.

The VTA transmits signals to other centers throughout the brain via tracts of fibers that secrete chemicals, including one called dopamine, at contact points abutting nerve cells within these faraway centers. When dopamine lands on receptors on those nerve cells, it can set off signaling activity within them.

Abnormal activity in the VTA has been linked to drug abuse and depression, for example. But much less is known about this brain center’s role in social behavior, and it had not previously been possible to observe or control activity along its connections during social behavior.

Deisseroth and his colleagues used mice whose dopamine-secreting, or dopaminergic, VTA nerve cells had been bioengineered to express optogenetic control proteins that could set off or inhibit signaling in the cells in response to light. They observed that enhancing activity in these cells increased a mouse’s penchant for social interaction. When a newcomer was introduced into its cage, it came, it saw, it sniffed. Inhibiting the dopaminergic VTA cells had the opposite effect: The host lost much of its interest in the guest.

Only social interaction affected

On the other hand, such manipulations of the VTA’s dopaminergic cells had no effect on the mice’s penchant for exploring novel objects (a golf ball, for example) placed in their cages. Nor did it change their overall propensity to move around. The effect appeared to be specific for social interaction.

Finding out exactly which dopaminergic projections from the VTA, traveling to which remote brain structures, were carrying the signals that generate exploratory social behavior required designing a new monitoring methodology. The signals traveling along such projections are extremely weak and confounded by background noise, especially when located deep within the brains of ambulatory animals. Deisseroth’s group overcame this by developing a highly sensitive technology capable of plucking these tiny signals out of the surrounding noise. The new technique, called fiber photometry, is a sophisticated way of measuring calcium flux, which invariably accompanies signaling activity along the fibers projecting from nerve cells.
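The core computation in photometry-style analyses — recovering a small calcium-related fluorescence change (ΔF/F) from a noisy trace — can be sketched in a few lines. This is a toy illustration on synthetic data, not the authors' actual fiber photometry pipeline; the signal model, window sizes and filter choices are all assumptions:

```python
import numpy as np

# Toy sketch: recover a small calcium transient (dF/F) from a noisy
# fluorescence trace. Signal model and parameters are illustrative only.

rng = np.random.default_rng(0)
fs = 100.0                                  # sampling rate in Hz (assumed)
t = np.arange(0, 30, 1 / fs)

baseline = 100.0 + 2.0 * np.sin(2 * np.pi * 0.02 * t)   # slow drift
transient = 3.0 * np.exp(-((t - 15.0) ** 2) / 0.5)      # small Ca2+ event at t=15 s
noise = rng.normal(0, 1.5, t.size)
raw = baseline + transient + noise

# dF/F relative to a running baseline (median over a trailing 5 s window)
win = int(5 * fs)
f0 = np.array([np.median(raw[max(0, i - win):i + 1]) for i in range(raw.size)])
dff = (raw - f0) / f0

# simple moving-average smoothing to suppress high-frequency noise
k = int(0.2 * fs)
smooth = np.convolve(dff, np.ones(k) / k, mode="same")

print(f"peak dF/F near t = {t[np.argmax(smooth)]:.1f} s")
```

The embedded transient is only ~3% of the baseline fluorescence and smaller than the per-sample noise, yet the baseline-normalized, smoothed trace makes it stand out — the same basic problem, at much harder scale, that fiber photometry addresses in vivo.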

Using a combination of optogenetics and fiber photometry, the investigators were able to demonstrate that a particular tract projecting from the VTA to a forebrain structure called the nucleus accumbens (also strongly implicated in the reward system) was the relevant conduit carrying the impetus to social interaction in the mice.

A third technological trick helped determine which recipient nerve cells within the nucleus accumbens were involved in the social-behavior circuitry. That structure’s two types of dopamine-responsive cells are differentiated by the types of dopamine receptors, referred to as D1 and D2, on their surfaces. The team performed experiments in animals bioengineered so that the normally D1-containing cells instead expressed a modified, light-inducible version of that receptor. These experiments, along with complementary experiments blocking the D1 receptors with specific drug antagonists, showed that the D1 nucleus-accumbens nerve cells were mediating the changes in social behavior. Tripping off those receptors, either by optogenetically inducing incoming tracts to deliver dopamine to these receptors, or by directly stimulating light-activated forms of these receptors on the target cells, enhanced mice’s social exploration.

Helping to see how social behavior can go wrong

“Every behavior presumably arises from a pattern of activity in the brain, and every behavioral malfunction arises from malfunctioning circuitry,” said Deisseroth, who is also co-director of Stanford’s Cracking the Neural Code Program. “The ability, for the first time, to pinpoint a particular nerve-cell projection involved in the social behavior of a living, moving animal will greatly enhance our ability to understand how social behavior operates, and how it can go wrong.”

(Source: med.stanford.edu)

Filed under social interaction brain activity autism schizophrenia optogenetics fiber photometry neuroscience science

54 notes

Scientists Pinpoint How Genetic Mutation Causes Early Brain Damage

Scientists from the Florida campus of The Scripps Research Institute (TSRI) have shed light on how a specific kind of genetic mutation can cause damage during early brain development that results in lifelong learning and behavioral disabilities. The work suggests new possibilities for therapeutic intervention.

The study, which focuses on the role of a gene known as Syngap1, was published June 18, 2014, online ahead of print by the journal Neuron. In humans, mutations in Syngap1 are known to cause devastating forms of intellectual disability and epilepsy.

“We found a sensitive cell type that is both necessary and sufficient to account for the bulk of the behavioral problems resulting from this mutation,” said TSRI Associate Professor Gavin Rumbaugh, who led the study. “Because we found the root biological cause of this genetic brain disorder, we can now shift our research toward developing tailor-made therapies for people affected by Syngap1 mutations.”

In the study, Rumbaugh and his colleagues used a mouse model to show that mutations in Syngap1 disrupt the development of neurons known as glutamatergic neurons in the young forebrain, leading to intellectual disability. Higher cognitive processes, such as language, reasoning and memory, arise in children as the forebrain develops.

Repairing damaging Syngap1 mutations in these specific neurons during development prevented cognitive abnormalities, while repairing the gene in other kinds of neurons and in other locations had no effect.

Rumbaugh noted that prenatal diagnosis of some infant genetic disorders is on the horizon. Technological advances in genetic sequencing allow individual genomes to be scanned for damaging mutations; it is possible to scan the entire genome of a child still in the womb. “Our research suggests that if Syngap1 function can be fixed very early in development, this should protect the brain from damage and permanently improve cognitive function,” said TSRI Research Associate Emin Ozkan, co-first author of the study with TSRI Research Associate Thomas Creson. “In theory, patients then wouldn’t have to be subjected to a lifetime of therapies and worry that the drugs might stop working or have side effects from chronic use.”

Mutations to Syngap1 are a leading cause of “sporadic intellectual disability,” which results from new, random mutations arising spontaneously in genes rather than from faulty genes inherited from parents. Intellectual disability affects approximately one to three percent of the population worldwide.

Rumbaugh and his colleagues are continuing to investigate. “Our findings have also identified exciting potential brain biomarkers of cognitive failure, allowing us to test new therapeutic strategies in our Syngap1 animal model,” said Creson.

(Source: newswise.com)

Filed under syngap1 genetic mutation glutamatergic neurons genetics brain damage neuroscience science

100 notes

Exploring How the Nervous System Develops



The circuitry of the central nervous system is immensely complex and, as a result, sometimes confounding. When scientists conduct research to unravel the inner workings at a cellular level, they are sometimes surprised by what they find.

Patrick Keeley, a postdoctoral scholar in Benjamin Reese’s laboratory at UC Santa Barbara’s Neuroscience Research Institute, had such an experience. He spent years analyzing different cell types in the retina, the light-sensitive layer of tissue lining the inner surface of the eye that mediates the first stages of visual processing. The results of his research are published today in the journal Developmental Cell.

Using a rodent model, Keeley and his colleagues quantified the number of cells present in each retina for 12 different retinal cell types across 30 genetically distinct lines of mice. For every cell type the team investigated, the researchers found a remarkable degree of variation in cell number across the strains. More surprising, the variation in the number of different cell types was largely independent of one another across the strains. This has substantial implications for retinal wiring during cellular development.

“These cells are connected to each other, and their convergence ratios are believed to underlie various aspects of visual processing,” Keeley explained, “so it was expected that the numbers of these cell types might be correlated. But that was not the case at all. We found very few significant correlations and even the ones we did find were modest.”

Using quantitative trait locus (QTL) analysis — a statistical method that links two types of information, in this case cell number and genetic markers — Keeley’s team compared not only the covariance between different types of cells but also the genetic co-regulation of their number. When they mapped the variation in cell number to locations within the genome, the locations were rarely the same for different types of cells. The result was entirely unexpected.

“Current views of retinal development propose that molecular switches control the alternate fates a newborn neuron should adopt, leading one to expect negative correlations between certain cell types,” said Reese, who is also a professor in UCSB’s Department of Psychological and Brain Sciences. “Still others have proposed that synaptically connected nerve cells ‘match’ their pre- and post-synaptic numbers through a process of naturally occurring cell death, leading one to expect positive correlations between connected cell types. Neither expectation was borne out.”

“If the cell types are not correlated, then some mice will have retinas with a lot of one cell type — say, photoreceptors — but not a lot of another cell type to connect to, in this case bipolar cells, or vice versa,” Keeley added. “So how does the developing retina accommodate this variation?”

The authors posit that since the ratios of pre- to post-synaptic cell number are not precisely controlled, the rules for connecting them should offer a degree of plasticity as they wire their connections during development.

Take bipolar cells as an example. To test this assumption, the scientists looked at the morphology of their dendrites, the threadlike extensions of a neuron that gather synaptic input. Keeley and coworkers examined their size, their branching pattern and the number of contacts they formed as a function of the number of surrounding bipolar cells and the number of photoreceptors across these different strains.

“We found that the extent of dendritic growth was proportional to the local density of bipolar cells,” Keeley explained. “If there are more, they grow smaller dendrites. If there are fewer, they grow larger dendrites.

“Photoreceptor number, on the other hand, had no effect upon the size of the dendritic field of the bipolar cells but determined the frequency of branching made by those very dendrites,” he added. “This plasticity in neural circuit assembly ensures that the nervous system modulates its connectivity to accommodate the independent variation in cell number.”

This research gives scientists an idea of how individual cell types are generated, how they differentiate and how they form appropriate connections with one another. Researchers in the Reese lab are trying to understand the genes that control these processes.

“I think that’s important when we discuss cellular therapeutics such as transplanting stem cells to replace cells that are lost,” Keeley said. “We’re going to need this sort of fundamental knowledge about neural development to promote the differentiation and integration of transplanted stem cells. This focus on genetic and cellular mechanisms is going to be important for developing new therapies to treat developmental disorders affecting the eye.”
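The central statistical question in the study — whether the numbers of two cell types covary across genetically distinct strains — reduces to a simple correlation test. Here is a toy version with entirely synthetic cell counts (the means, spreads and strain count are assumptions for illustration, not the paper's data):

```python
import numpy as np

# Toy version of the cross-strain covariance question: do counts of two
# retinal cell types correlate across inbred mouse strains?
# All counts below are synthetic, illustrative values.

rng = np.random.default_rng(1)
n_strains = 30

# simulate independent strain-to-strain variation in two cell types
photoreceptors = rng.normal(6_400_000, 400_000, n_strains)
bipolar_cells = rng.normal(900_000, 80_000, n_strains)

r = np.corrcoef(photoreceptors, bipolar_cells)[0, 1]
print(f"Pearson r across {n_strains} strains: {r:.2f}")
# because the draws are independent, |r| is typically small,
# echoing the paper's finding of largely uncorrelated cell numbers
```

The real analysis went further, using QTL mapping to ask whether the same genomic loci drive the variation in different cell types, but the correlation step above captures the basic logic.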

Filed under nervous system retina bipolar cells neural circuits neuroscience science

150 notes

Exposure to TV Violence Related to Irregular Attention and Brain Structure

Young adult men who watched more violence on television showed indications of less mature brain development and poorer executive functioning, according to the results of an Indiana University School of Medicine study published online in the journal Brain and Cognition.

The researchers used psychological testing and MRI scans to measure mental abilities and the volume of brain regions in 65 healthy males with normal IQs between the ages of 18 and 29, specifically chosen because they were not frequent video game players.

Lead author Tom A. Hummer, Ph.D., assistant research professor in the IU Department of Psychiatry, said the young men provided estimates of their television viewing over the past year and then kept a detailed diary of their TV viewing for a week. Participants also completed a series of psychological tests measuring inhibitory control, attention and memory. At the conclusion, MRI scans were used to measure brain structure.

Executive function is the broad ability to formulate plans, make decisions, reason and problem-solve, regulate attention, and inhibit behavior in order to achieve goals.

"We found that the more violent TV viewing a participant reported, the worse they performed on tasks of attention and cognitive control," Dr. Hummer said. "On the other hand, the overall amount of TV watched was not related to performance on any executive function tests."

Dr. Hummer noted that these executive functioning abilities can be important for controlling impulsive behaviors, including aggression. “The worry is that more impulsivity does not mix well with the behaviors modeled in violent programming.”

Tests that measured working memory, another subtype of executive functioning, were not found to be related to overall or violent TV viewing.

Comparing TV habits to brain images also produced results that Dr. Hummer and colleagues believe are significant.

"When we looked at the brain scans of young men with higher violent television exposure, there was less volume of white matter connecting the frontal and parietal lobes, which can be a sign of less maturity in brain development," he said.

White matter is tissue in the brain that insulates nerve fibers connecting different brain regions, making functioning more efficient. In typical development, the amount or volume of white matter increases as the brain makes more connections until about age 30, improving communication between regions of the brain. Connections between the frontal and parietal lobes are thought to be especially important for executive functioning.

"The take-home message from this study is the finding of a relationship between how much violent television we watch and important aspects of brain functioning like controlled attention and inhibition," Dr. Hummer said.

Dr. Hummer cautions that more research is needed to better understand the study findings.

"With this study we could not isolate whether people with poor executive function are drawn to programs with more violence or if the content of the TV viewing is responsible for affecting the brain’s development over a period of time," Dr. Hummer said. "Additional longitudinal work is necessary to resolve whether individuals with poor executive function and slower white matter growth are more drawn to violent programming or if exposure to media violence modifies development of cognitive control," Dr. Hummer said.

(Source: newswise.com)

Filed under executive function television media violence white matter brain structure psychology neuroscience science
