Neuroscience

Articles and news from the latest research reports.

Posts tagged neuroscience

130 notes

The claustrum’s proposed role in consciousness is supported by the effect and target localization of Salvia divinorum

This article brings together three findings and ideas relevant for the understanding of human consciousness: (I) Crick’s and Koch’s theory that the claustrum is a “conductor of consciousness” crucial for subjective conscious experience. (II) Subjective reports of the consciousness-altering effects of the plant Salvia divinorum, whose primary active ingredient is salvinorin A, a κ-opioid receptor agonist. (III) The high density of κ-opioid receptors in the claustrum. Fact III suggests that the consciousness-altering effects of S. divinorum/salvinorin A (II) are due to a κ-opioid-receptor-mediated inhibition primarily of the claustrum and, additionally, of the deep layers of the cortex, mainly in prefrontal areas. Consistent with Crick and Koch’s theory that the claustrum plays a key role in consciousness (I), the subjective effects of S. divinorum indicate that salvia disrupts certain facets of consciousness much more than the largely serotonergic hallucinogen lysergic acid diethylamide (LSD). Based on these data and on the relevant literature, we suggest that the claustrum does indeed serve as a conductor for certain aspects of higher-order integration of brain activity, while integration of auditory and visual signals relies more on coordination by other areas, including parietal cortex and the pulvinar.

Full Article


Filed under consciousness claustrum salvinorin A brain activity neuroscience science

274 notes

Uncovering Clues to the Genetic Cause of Schizophrenia

The overall number and nature of mutations—rather than the presence of any single mutation—influences an individual’s risk of developing schizophrenia, as well as its severity, according to a discovery by Columbia University Medical Center researchers published in the latest issue of Neuron. The findings could have important implications for the early detection and treatment of schizophrenia.

Maria Karayiorgou, MD, professor of psychiatry, and Joseph Gogos, MD, PhD, professor of physiology and cellular biophysics and of neuroscience, and their team sequenced the “exome”—the region of the human genome that codes for proteins—of 231 schizophrenia patients and their unaffected parents. Using these data, they demonstrated that schizophrenia arises from collective damage across several genes.

“This study helps define a specific genetic mechanism that explains some of schizophrenia’s heritability and clinical manifestation,” said Dr. Karayiorgou, who is acting chief of the Division of Psychiatric and Medical Genetics at the New York State Psychiatric Institute. “Accumulation of damaged genes inherited from healthy parents leads to higher risk not only to develop schizophrenia but also to develop more severe forms of the disease.”

Schizophrenia is a severe psychiatric disorder in which patients experience hallucinations, delusions, apathy, and cognitive difficulties. The disorder is relatively common, affecting around 1 in every 100 people, and the risk of developing schizophrenia is strongly increased if a family member has the disease. Previous research has focused on the search for individual genes that might trigger schizophrenia. The availability of new high-throughput DNA sequencing technology has contributed to a more holistic approach to the disease.

The researchers compared sequencing data to look for genetic differences and identify new loss-of-function mutations—which are rarer, but have a more severe effect on ordinary gene function—in cases of schizophrenia that had not been inherited from the patients’ parents. They found an excess of such mutations in a variety of genes across different chromosomes.

Using the same sequencing data, the researchers also looked at what types of mutations are commonly passed on to schizophrenia patients from their parents. It turns out that many of these are “loss-of-function” types. These mutations were also found to occur more frequently in genes with a low tolerance for genetic variation.

“These mutations are important signposts toward identifying the genes involved in schizophrenia,” said Dr. Karayiorgou.

The researchers then looked more deeply into the sequencing data to try to determine the biological functions of the disrupted genes involved in schizophrenia. They were able to verify two key damaging mutations in a gene called SETD1A, suggesting that this gene contributes significantly to the disease.

SETD1A is involved in a process called chromatin modification. Chromatin is the molecular apparatus that packages DNA into a smaller volume so it can fit into the cell and physically regulates how genes are expressed. Chromatin modification is therefore a crucial cellular activity.

The finding fits with accumulating evidence that damage to chromatin regulatory genes is a common feature of various psychiatric and neurodevelopmental disorders. By combining the mutational data from this and related studies on schizophrenia, the authors found that “chromatin regulation” was the most common description for genes that had damaging mutations.

“A clinical implication of this finding is the possibility of using the number and severity of mutations involved in chromatin regulation as a way to identify children at risk of developing schizophrenia and other neurodevelopmental disorders,” said Dr. Gogos. “Exploring ways to reverse alterations in chromatin modification and restore gene expression may be an effective path toward treatment.”

In further sequencing studies, the researchers hope to identify and characterize more genes that might play a role in schizophrenia and to elucidate common biological functions of the genes.

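The core filtering step described above — keeping only variants present in the patient but in neither parent, and restricting to loss-of-function effect classes — can be sketched in a few lines. This is a hypothetical toy illustration, not the study’s actual pipeline; the variant keys, effect-class labels, and data are all invented for the example.

```python
# Toy sketch of de novo loss-of-function (LoF) variant filtering.
# Hypothetical data and labels, not the study's real pipeline.

LOF_CLASSES = {"nonsense", "frameshift", "splice_site"}

def de_novo_lof(child, mother, father):
    """Return child variants absent from both parents and annotated as LoF."""
    inherited = mother | father  # variant keys seen in either parent
    return {v: cls for v, cls in child.items()
            if v not in inherited and cls in LOF_CLASSES}

# Variants keyed by (chromosome, position, alt allele), valued by effect class.
child  = {("16", 30977, "T"): "nonsense",    # illustrative; SETD1A lies on chr16
          ("2", 1204, "G"): "missense",
          ("7", 5521, "A"): "frameshift"}
mother = {("2", 1204, "G")}                  # inherited missense
father = {("7", 5521, "A")}                  # inherited frameshift

print(de_novo_lof(child, mother, father))
# → {('16', 30977, 'T'): 'nonsense'}
```

Only the chromosome 16 variant survives: the missense call is not loss-of-function, and the frameshift call, though LoF, was inherited from the father rather than arising de novo.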

Filed under schizophrenia genetics genomics neuroscience science

297 notes

Learning Early in Life May Help Keep Brain Cells Alive

Using your brain – particularly during adolescence – may help brain cells survive and could impact how the brain functions after puberty.

In a recently published study in Frontiers in Neuroscience, Rutgers behavioral and systems neuroscientist Tracey Shors, who co-authored the study, found that the newborn brain cells in young rats that were successful at learning survived, while the same brain cells in animals that didn’t master the task died quickly.

“In those that didn’t learn, three weeks after the new brain cells were made, nearly one-half of them were no longer there,” said Shors, professor in the Department of Psychology and Center for Collaborative Neuroscience at Rutgers. “But in those that learned, it was hard to count. There were so many that were still alive.”

The study is important, Shors says, because it suggests that the massive proliferation of new brain cells most likely helps young animals leave the protectiveness of their mothers and face the dangers, challenges and opportunities of adulthood.

Scientists have known for years that new neurons in adult rats, produced in significant but smaller numbers than during puberty, could be saved with learning, but they did not know whether this would be the case for young rats, which produce two to four times more neurons than adult animals.

By examining the hippocampus – a portion of the brain associated with the process of learning – after the rats learned to associate a sound with a motor response, scientists found that the new brain cells labeled with dye a few weeks earlier were still alive in the rats that had learned the task, while the cells in those that had failed did not survive.

“It’s not that learning makes more cells,” says Shors. “It’s that the process of learning keeps new cells alive that are already present at the time of the learning experience.”

Since the process of producing new brain cells on a cellular level is similar across animals, including humans, Shors says ensuring that adolescent children learn at optimal levels is critical.

“What it has shown me, especially as an educator, is how difficult it is to achieve optimal learning for our students. You don’t want the material to be too easy to learn and yet still have it too difficult where the student doesn’t learn and gives up,” Shors says.

So, what does this mean for the 12-year-old adolescent boy or girl?

While scientists can’t measure individual brain cells in humans, Shors says this study, on the cellular level, provides a look at what is happening in the adolescent brain and provides a window into the amazing ability the brain has to reorganize itself and form new neural connections at such a transformational time in our lives.

“Adolescents are trying to figure out who they are now, who they want to be when they grow up and are at school in a learning environment all day long,” says Shors. “The brain has to have a lot of strength to respond to all those experiences.”


Filed under brain cells puberty adolescence hippocampus dentate gyrus neuroscience science

117 notes

New epilepsy treatment offers ‘on demand’ seizure suppression

A new treatment for drug-resistant epilepsy with the potential to suppress seizures ‘on demand’ with a pill, similar to how you might take painkillers when you feel a headache coming on, has been developed by UCL researchers funded by the Wellcome Trust.

The treatment, described in Nature Communications, combines genetic and chemical approaches to suppress seizures without disrupting normal brain function. The technique was demonstrated in rodents, but in future it could allow people to control seizures on demand with a simple pill.

Epilepsy affects around 50 million people worldwide including 600,000 in the UK and around a quarter of cases are resistant to conventional treatments. Many of these cases could be addressed by the new treatment method, which relies on genetic modification of brain cells to make them sensitive to a normally inactive compound.

“First, we inject a modified virus into the area of the brain where seizures arise,” explains Professor Dimitri Kullmann of the UCL Institute of Neurology, senior author of the research. “This virus instructs the brain cells to make a protein that is activated by CNO (clozapine-N-oxide), a compound that can be taken as a pill. The activated protein then suppresses the over-excitable brain cells that trigger seizures, but only in the presence of CNO.

“At the moment, severe seizures are treated with drugs that suppress the excitability of all brain cells, and patients therefore experience side effects. Sometimes the dose required to stop seizures is so high that patients need to be sedated and taken to intensive care. If we can take our new method into the clinic, which we hope to do within the next decade, we could treat patients who are susceptible to severe seizures with a one-off injection of the modified virus, and then use CNO only when needed.

“CNO would be given as a pill in the event that patients could predict when seizures were likely to occur. For example, many people with treatment-resistant epilepsy experience clusters of seizures, where severe seizures are preceded by smaller ones. Seizure risk is also high when people are ill, sleep deprived, or at certain times of the menstrual cycle, so these would all be good times to take the pill as a preventative measure. In urgent situations, the compound could be given as an injection. We could even consider a fully automatic delivery system, where CNO was given by a pump, as is done for insulin in some people with diabetes.”

As CNO has a half-life of a few hours and only affects the pre-treated epileptic parts of the brain, the new method avoids the need to permanently alter the brain or treat the whole brain with seizure-suppressing drugs. It builds on similar work by Professor Kullmann’s group using gene therapy to ‘calm down’ brain cells, or using light pulses to activate seizure-suppressing receptors in the brain. The new technique works in a similar way but is reversible and avoids the need for invasive devices to deliver light to the brain.

“After the one-off injection into affected areas of the brain, our new technique would require nothing beyond CNO, administered as an injection or a pill, to suppress seizures when required,” says Professor Kullmann. “This makes it more attractive than alternative forms of targeted therapy such as surgery to remove the brain region where seizures arise, or gene therapy that permanently alters the excitability of brain cells.

“Although there is currently no evidence that permanently suppressing excitability in a small area affects brain function, we cannot be sure that it would have no impact long-term. Our new method is completely reversible, so if there were any side-effects then people could simply stop taking the CNO pill.”
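The reversibility argument rests on CNO washing out on a timescale of hours. A toy first-order decay calculation makes this concrete; the 2-hour half-life used here is a placeholder figure for illustration, not a value reported in the article.

```python
# Illustrative only: exponential decay of a compound with a given half-life,
# showing why a short-lived drug makes the suppression effectively reversible.
# The 2-hour half-life is an assumed placeholder, not a measured value.

def fraction_remaining(hours, half_life_h=2.0):
    """Fraction of the initial dose still present after `hours`."""
    return 0.5 ** (hours / half_life_h)

print(fraction_remaining(2.0))  # → 0.5     (one half-life)
print(fraction_remaining(8.0))  # → 0.0625  (four half-lives)
```

After four half-lives, only about 6% of the dose remains, so stopping the pill removes the seizure-suppressing effect within a day, which is the practical sense in which the method is reversible.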

(Source: ucl.ac.uk)

Filed under epilepsy seizure suppression brain cells gene therapy optogenetics neuroscience science

102 notes

Drug used to treat multiple sclerosis may have beneficial effects on memory

Virginia Commonwealth University School of Medicine researchers have uncovered a new mechanism of action of fingolimod, a drug widely used to treat multiple sclerosis: elimination of adverse or traumatic memories.

The findings shed light on how the drug works on the molecular level – something that has not been well understood until now.

Fingolimod, or FTY720, which is the first orally available drug for treatment of multiple sclerosis, works by suppressing the immune system. Fingolimod is a prodrug that is phosphorylated in the body to its active form, FTY720-phosphate.

In a study published in the journal Nature Neuroscience on May 25 as an advance online publication, researchers used a mouse model to show that fingolimod accumulates in the brain and inhibits histone deacetylases, enzymes important for regulating gene expression. The team observed an increased expression of a limited number of genes important for certain memory processes. Fingolimod acted similarly to the natural signaling lipid sphingosine-1-phosphate, which it closely resembles.

“Our work suggests that some of the beneficial effects of FTY720/fingolimod that are not well understood might be mediated by this new activity that we have discovered,” said first author Sarah Spiegel, Ph.D., an internationally renowned researcher and professor and chair of the Department of Biochemistry and Molecular Biology in the VCU School of Medicine.

“It will be important in the future to determine whether this prodrug can reduce loss of cognitive functions and can erase adverse memories,” she said.

Spiegel added that other histone deacetylase inhibitors have long been used for treatment of psychiatric and neurological disorders, yet the mechanism of their effectiveness is not fully understood.

“FTY720/fingolimod may be a useful adjuvant therapy to help stop aversive memories such as in post-traumatic stress disorder and other anxiety disorders,” Spiegel said.

“The work has not been extended to show effectiveness in humans at this time. We are still working to fully understand the molecular underpinnings of the drug and its link to memory,” she said.

The work is based on previous findings by Spiegel’s group that were published in Science in 2009. They had reported that sphingosine-1-phosphate formed in the nucleus of cells is a natural inhibitor of histone deacetylases and a regulator of gene expression.

(Source: spectrum.vcu.edu)

Filed under MS fingolimod memory histone deacetylase gene expression neuroscience science

138 notes

Timing is everything: scientists control rapid re-wiring of brain circuits using patterned visual stimulation

In a new study, published in this week’s issue of the journal Science, researchers show for the first time how the brain re-wires and fine-tunes its connections differently depending on the relative timing of sensory stimuli. Most neuroscience textbooks today present a widely held model that explains how nerve circuits might refine their connectivity based on patterned firing of brain cells, but it had not previously been directly observed in real time. This “Hebbian theory”, named after the McGill University psychologist Donald Olding Hebb, who first proposed it in 1949, has been summarized as:

“Cells that fire together, wire together. Cells that fire out of sync, lose their link.”

In other words, a nerve cell that fires at the same time as its nerve cell neighbors will cooperatively form strong, stable connections onto its partner cells. On the other hand, a nerve cell that fires out of synchrony with its neighbors will end up destabilizing and withdrawing its connections. “For the first time, we have direct, real-time evidence from watching brain cells in an intact animal to support Hebb’s model, but we also provide surprising, new details, fundamentally updating the model for the 21st century,” says Dr. Edward Ruthazer, senior investigator on the study at the Montreal Neurological Institute and Hospital – The Neuro at McGill University and the McGill University Health Centre.

The study, which used multiphoton laser-scanning microscopy to observe cells in the brains of intact animals, discovered that asynchronous firing, or “firing out of sync”, not only caused brain cells to lose their ability to make other cells fire but, unexpectedly, also caused them to dramatically increase their elaboration of new branches in search of better-matched partners. “The surprising and entirely unexpected finding is that even though nerve circuit remodeling from asynchronous stimulation actively weakens connections, there is a 60% increase in axon branches that are exploring the environment, but these exploratory branches are not long-lived,” said Dr. Ruthazer.

IMAGES of nerves in action in transparent Xenopus tadpoles: http://bit.ly/1lNuux0

Dr. Ruthazer’s lab charts the formation of brain circuitry during development in the hopes of better understanding the rules that control healthy brain wiring and of advancing treatments for injuries to the nervous system and therapies for neurodevelopmental disorders such as autism and schizophrenia. Astoundingly, nearly one out of every 100 Canadians suffers from one of these disorders, estimated to cost the Canadian economy over $10 billion annually in addition to inflicting a devastating impact on patients and their families.

In the developing brain, initially imprecise connections between nerve cells are gradually pruned away, leaving connections that are stronger and more specific. This refinement occurs in response to patterned stimulation from the environment. “The way we perceive the world as adults is directly impacted by what we saw when we were younger,” says Dr. Ruthazer.

Dr. Ruthazer’s team studies brain development in Xenopus tadpoles, which have the distinct advantage of being transparent, enabling the team to clearly see the nervous system inside. They have developed a model that allows them to watch nerve cell remodeling in vivo, in real time, and to measure the efficacy of connections between cells. Optic fibers were used to stimulate the eyes of the tadpoles with different light patterns, while imaging and recording nerve cell branch formation. Asynchronous stimulation involved light flashes presented to each eye at different times, while synchronous stimulation involved simultaneous stimulation of both eyes.

Importantly, Dr. Ruthazer’s group also has begun to identify the molecular mechanisms underlying these changes in the nervous system. They show that the stabilization of the retinal nerve cell branches caused by synchronous firing involves signaling downstream of the synaptic activation of a neurotransmitter receptor called the N-methyl-D-aspartate receptor. In contrast, the enhanced exploratory growth that occurs with asynchronous activity does not appear to require the activation of this receptor.

Timing is everything: scientists control rapid re-wiring of brain circuits using patterned visual stimulation

In a new study, published in this week’s issue of the journal Science, researchers show for the first time how the brain re-wires and fine-tunes its connections differently depending on the relative timing of sensory stimuli. In most neuroscience textbooks today, there is a widely held model that explains how nerve circuits might refine their connectivity based on patterned firing of brain cells, but it has not previously been directly observed in real time. This “Hebbian Theory”, named after the McGill University psychologist Donald Olding Hebb who first proposed it in 1949 has been summarized as:

“Cells that fire together, wire together. Cells that fire out of sync, lose their link”

In other words, a nerve cell that fires at the same time as its nerve cell neighbors will cooperatively form strong, stable connections onto its partner cells. On the other hand, a nerve cell that fires out of synchrony with its neighbours, will end up destabilizing and withdrawing its connections. “For the first time, we have direct, real-time evidence from watching brain cells in an intact animal to support Hebb’s model, but, we also provide surprising, new details, fundamentally updating the model for the 21st century,” says Dr. Edward Ruthazer, senior investigator on the study at the Montreal Neurological Institute and Hospital –The Neuro at McGill University and the McGill University Health Centre. 

The study, which used multiphoton laser-scanning microscopy to observe cells in the brains of intact animals, discovered that asynchronous firing, or “firing out of sync” not only caused brain cells to lose their ability to make other cells fire, but unexpectedly, also caused them to dramatically increase their elaboration of new branches in search of better matched partners. “The surprising and entirely unexpected finding is that even though nerve circuit remodeling from asynchronous stimulation actively weakens connections, there is a 60% increase in axon branches that are exploring the environment but these exploratory branches are not long-lived,” said Dr. Ruthazer.

IMAGES of nerves in action in transparent xenopus tadpoles: http://bit.ly/1lNuux0

Dr. Ruthazer’s lab charts the formation of brain circuitry during development in the hopes of better understanding the rules that control healthy brain wiring and of advancing treatments for injuries to the nervous system and therapies for neurodevelopmental disorders such as autism and schizophrenia. Astoundingly, nearly one out of every 100 Canadians suffers from one of these disorders, estimated to cost the Canadian economy over $10 billion annually in addition to inflicting a devastating impact on patients and their families.

In the developing brain, initially imprecise connections between nerve cells are gradually pruned away, leaving connections that are stronger and more specific. This refinement occurs in response to patterned stimulation from the environment. “The way we perceive the world as adults is directly impacted by what we saw when we were younger,” says Dr. Ruthazer.

Dr. Ruthazer’s team studies brain development in Xenopus tadpoles, which have the distinct advantage of being transparent, enabling the team to clearly see the nervous system inside. They have developed a model that allows them to watch nerve cell remodeling in vivo, in real time, and to measure the efficacy of connections between cells. Optic fibers were used to stimulate the eyes of the tadpoles with different light patterns, while imaging and recording nerve cell branch formation.  Asynchronous stimulation involved light flashes presented to each eye at different times, while synchronous stimulation involved simultaneous stimulation of both eyes.

Importantly, Dr. Ruthazer’s group also has begun to identify the molecular mechanisms underlying these changes in the nervous system. They show that the stabilization of the retinal nerve cell branches caused by synchronous firing involves signaling downstream of the synaptic activation of a neurotransmitter receptor called the N-methyl-D-aspartate receptor. In contrast, the enhanced exploratory growth that occurs with asynchronous activity does not appear to require the activation of this receptor.
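The dependence structure reported here can be summarized as a small decision table (a paraphrase of the findings above, not a mechanistic model; the function and argument names are my own):

```python
def branch_outcome(firing, nmda_active):
    """Toy summary of the reported results.

    firing:      'sync' or 'async' retinal input
    nmda_active: whether NMDA-receptor signaling is intact
    Returns (branches_stabilized, extra_exploratory_growth).
    """
    if firing == "sync":
        # Stabilization of retinal branches requires NMDA-receptor signaling.
        return (nmda_active, False)
    # Asynchronous input weakens connections but boosts short-lived
    # exploratory branching, with or without NMDA activation.
    return (False, True)
```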

Filed under nerve cells visual stimulation brain wiring brain circuitry neuroscience science

197 notes

(Image caption: Researchers at Cold Spring Harbor Laboratory have identified the neurons in the brain that determine if a mouse will learn to cope with stress or become depressed. These neurons, located in a region of the brain known as the medial prefrontal cortex (green, left image), become hyperactive in depressed mice (right panel is close-up of left, yellow indicates activation). The team showed that this enhanced activity in fact causes depression.)
Dealing with stress – to cope or to quit?
We all deal with stress differently. For many of us, stress is a great motivator, spurring a renewed sense of vigor to solve life’s problems. But for others, stress triggers depression. We become overwhelmed, paralyzed by hopelessness and defeat. Up to 20% of us will struggle with depression at some point in life, and researchers are actively working to understand how and why this debilitating mental disease develops.
Today, a team of researchers at Cold Spring Harbor Laboratory (CSHL) led by Associate Professor Bo Li reveals a major insight into the neuronal basis of depression. They have identified the group of neurons in the brain that determines how a mouse responds to stress — whether with resilience or defeat.
For years, scientists have relied on brain imaging to look for neuronal changes during depression. They found that a region of the brain known as the medial prefrontal cortex (mPFC) becomes hyperactive in depressed people. This area of the brain is well known to play a role in the control of emotions and behavior, linking our feelings with our actions. But brain scans aren’t able to determine if increased activity in the mPFC causes depression, or if it is simply a byproduct of other neuronal changes. 
Dr. Li set out to identify the neuronal changes that underlie depression. In work published today in The Journal of Neuroscience, Li and his team, including Minghui Wang, Ph.D., and Zinaida Perova, Ph.D., used a mouse model for depression known as “learned helplessness.” They combined this with a genetic trick to mark specific neurons that respond to stress. They discovered that neurons in the mPFC become highly excited in mice that are depressed. These same neurons are weakened in mice that aren’t deterred by stress – what scientists call resilient mice.
But the team still couldn’t be sure that enhanced signaling in the mPFC actually caused depression. To test this, they engineered mice to mimic the neuronal conditions they found in depressed mice. “We artificially enhanced the activity of these neurons using a powerful method known as chemical genetics,” says Li. “The results were remarkable: once-strong and resilient mice became helpless, showing all of the classic signs of depression.”
These results help explain how one promising new treatment for depression works and may lead to improvements in the treatment.
Doctors have had some success with deep brain stimulation (DBS), which suppresses the activity of neurons in a very specific portion of the brain. “We hope that our work will make DBS even more targeted and powerful,” says Li, “and we are working to develop additional strategies based upon the activity of the mPFC to treat depression.”
Next, Li is looking forward to exploring how the neurons in the mPFC become hyperactive in helpless mice. “These active neurons are surrounded by inhibitory neurons,” says Li. “Are the inhibitory neurons failing? Or are the active neurons somehow able to bypass their controls? These are some of the many open questions we are pursuing to understand how depression develops.”

Filed under stress prefrontal cortex depression deep brain stimulation animal model learned helplessness psychology neuroscience science

367 notes

Dad’s Brain Becomes More ‘Maternal’ When He’s Primary Caregiver

Fathers who spend more time taking care of their newborn child undergo changes in brain activity that make them more apt to fret about their baby’s safety, a new study shows.

(Image: Shutterstock)

In particular, fathers who are the primary caregiver experience an increase in activity in their amygdala and other emotional-processing systems, causing them to experience parental emotions similar to those typically experienced by mothers, the researchers noted.

The findings suggest there is a neural network in the brain dedicated to parenting, and that the network responds to changes in parental roles, said study senior author Ruth Feldman, a researcher in the department of psychology and the Gonda Brain Sciences Center at Bar-Ilan University in Israel.

"Pregnancy, childbirth and lactation are very powerful primers in women to worry about their child’s survival," said Feldman, who also serves as an adjunct professor at the Yale Child Study Center at Yale University. "Fathers have the capacity to do it as well as mothers, but they need daily caregiving activities to ignite that mothering network."

Filed under parenting amygdala brain activity emotions psychology neuroscience science

303 notes

Did standing up change our brains?
Although lots of animals are smart, humans are even smarter. How and why do we think and act so differently from other species?
A young boy’s efforts while learning to walk have suggested a new explanation, in a new journal paper jointly authored by his father and grandfather, both academics at the University of Sydney.
In the latest issue of the scientific journal Frontiers in Neuroscience, the son-and-father team Mac and Rick Shine suggest that the big difference between humans and other species may lie in how we use our brains for routine tasks.
They advance the idea that the key to exploiting the awesome processing power of our brain’s most distinctive feature - the cortex - may have been to liberate it from the drudgery of controlling routine activities.
And that’s where young Tyler Shine, now two years old, comes into the story. When Tyler was first learning to walk, his doting father and grandfather noticed that every step took Tyler’s full attention.
But before too long, walking became routine, and Tyler was able to start noticing other things around him. He was better at maintaining his balance, which freed up his attention to focus on more interesting tasks, like trying to get into mischief.
How did Tyler improve? His father and grandfather suggest that he did so by transferring the control of his balance to ‘lower’ parts of the brain, freeing up the powerful cortex to focus on unpredictable challenges, such as a bumpy floor covered in stray toys.
"Any complicated task - like driving a car or playing a musical instrument - starts out consuming all our attention, but eventually becomes routine," Mac Shine says.
"Studies of brain function suggest that we shift the control of these routine tasks down to ‘lower’ areas of the brain, such as the basal ganglia and the cerebellum.
"So, humans are smart because we have automated the routine tasks; and thus, can devote our most potent mental faculties to deal with new, unpredictable challenges.
"What event in the early history of humans made us change the way we use our brains?
Watching Tyler learn to walk suggested that it was the evolutionary shift from walking on all fours to walking on two legs.
"Suddenly our brains were overwhelmed with the complicated challenge of keeping our balance - and the best kind of brain to have, was one that didn’t waste its most powerful functions on controlling routine tasks."
So, the Shines believe, those first pre-humans who began to stand upright faced a new evolutionary pressure not just on their bodies, but on their brains as well.
"New technologies are allowing us to look inside the brain while it works, and we are learning an enormous amount," Mac Shine says.
"But in order to interpret those results, we need new ideas as well. I’m delighted that my son has played a role in suggesting one of those ideas."
"Hopefully, by the time he is watching his own son learn to walk, we will be much closer to truly understanding the greatest mystery of human existence: how our brains work."

Filed under basal ganglia cerebellum automaticity delegation evolution neuroscience science

183 notes

Sex-specific changes in cerebral blood flow begin at puberty

Puberty is the defining process of adolescent development, beginning a cascade of changes throughout the body, including the brain. Penn Medicine researchers have discovered that cerebral blood flow (CBF) declines similarly in males and females before puberty but diverges sharply during puberty, increasing in females while continuing to decrease in males. This divergence could offer hints about developing behavioral differences between men and women and about sex-specific predispositions to certain psychiatric disorders. The findings are available in the Proceedings of the National Academy of Sciences (PNAS).

"These findings help us understand normal neurodevelopment and could be a step towards creating normal ‘growth charts’ for brain development in kids. These results also show what every parent knows: boys and girls grow differently. This applies to the brain as well," says Theodore D. Satterthwaite, MD, MA, assistant professor in the Department of Psychiatry in the Perelman School of Medicine at the University of Pennsylvania. "Hopefully, one day such growth charts might allow us to identify abnormal brain development much earlier before it leads to major mental illness."

Studies on structural brain development have shown that puberty is an important source of sex differences. Previous work has shown that CBF declines throughout childhood, but the effects of puberty on properties of brain physiology such as CBF, also known as cerebral perfusion, are not well understood. “We know that adult women have higher blood flow than men, but it was not clear when that difference began, so we hypothesized that the gap between women and men would begin in adolescence and coincide with puberty,” Satterthwaite says.

The Penn team imaged the brains of 922 youth ages 8 through 22 using arterial spin labeled (ASL) MRI. The youth were all members of the Philadelphia Neurodevelopmental Cohort, a National Institute of Mental Health-funded collaboration between the University of Pennsylvania Brain Behavior Laboratory and the Center for Applied Genomics at the Children’s Hospital of Philadelphia.

They found support for their hypothesis.

Age-related differences were observed in the amount and location of blood flow in males versus females, with blood flow declining at a similar rate in both sexes before puberty and diverging markedly in mid-puberty. From around age 16, male CBF values continued to decline with age while female CBF values actually increased, leaving females with notably higher CBF than males by the end of adolescence. The difference between males and females was most notable in parts of the brain that are critical for social behavior and emotion regulation, such as the orbitofrontal cortex. The researchers speculate that such differences could be related to females’ well-established superior performance on social cognition tasks. Potentially, these effects could also be related to women’s higher risk of depression and anxiety disorders and men’s higher risk of flat affect and schizophrenia.
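The age trajectories described here can be sketched as a toy piecewise-linear model (the baseline, slopes, and breakpoint below are invented for illustration; only the qualitative shape, a shared childhood decline followed by divergence around age 16, comes from the study):

```python
def cbf_trend(age, sex):
    """Toy piecewise-linear sketch of the reported pattern (illustrative
    numbers, not the study's measurements): CBF declines with age in both
    sexes before mid-puberty (~age 16 here), then keeps declining in
    males ('M') but rises in females ('F')."""
    breakpoint_age = 16.0
    baseline = 70.0                      # arbitrary units at age 8
    pre_slope = -1.5                     # shared childhood decline per year
    post_slope = {"M": -0.8, "F": 0.6}   # diverging post-pubertal slopes
    if age <= breakpoint_age:
        return baseline + pre_slope * (age - 8.0)
    at_break = baseline + pre_slope * (breakpoint_age - 8.0)
    return at_break + post_slope[sex] * (age - breakpoint_age)
```

In this sketch the two curves are identical through childhood and, by age 22, the female curve sits well above the male one, mirroring the end-of-adolescence difference the researchers report.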

(Source: eurekalert.org)

Filed under cerebral blood flow puberty brain development orbitofrontal cortex neuroscience science
