Neuroscience

Articles and news from the latest research reports.

Schizophrenia symptoms linked to faulty ‘switch’ in brain
Scientists at The University of Nottingham have shown that psychotic symptoms experienced by people with schizophrenia could be caused by a faulty ‘switch’ within the brain.
In a study published today in the leading journal Neuron, they demonstrated that the severity of symptoms such as delusions and hallucinations, which are typical of the psychiatric disorder, is linked to a disconnection between two important regions in the brain — the insula and the lateral frontal cortex.
The breakthrough, say the academics, could form the basis for better, more targeted treatments for schizophrenia with fewer side effects.
The four-year study, led by Professor Peter Liddle and Dr Lena Palaniyappan in the University’s Division of Psychiatry and based in the Institute of Mental Health, centred on the insula, a segregated ‘island’ buried deep within the brain that is responsible for seamlessly switching between the inner and outer worlds.
"Powerful explanation" 

Dr Lena Palaniyappan, a Wellcome Trust Research Fellow, said: “In our daily life, we constantly switch between our inner, private world and the outer, objective world. This switching action is enabled by the connections between the insula and frontal cortex. This switch process appears to be disrupted in patients with schizophrenia. This could explain why internal thoughts sometimes appear as external objective reality, experienced as voices or hallucinations in this condition. This could also explain the difficulties in ‘internalising’ external material pleasures (e.g. enjoying a musical tune or social events) that result in emotional blunting in patients with psychosis. Our observation offers a powerful mechanistic explanation for the formation of psychotic symptoms.”
Several brain regions are engaged when we are lost in thought or, for example, remembering a past event. However, when interrupted by a loud noise or another person speaking, we are able to switch to using the frontal cortex, which processes this external information. With a disruption in the connections from the insula, such switching may not be possible.
Compromised brain function 

The Nottingham scientists used functional MRI (fMRI) to compare the brains of 35 healthy volunteers with those of 38 patients with schizophrenia. The results showed that whereas most of the healthy volunteers were able to make this switch between regions, the patients with schizophrenia were less likely to shift to using their frontal cortex.
The insula and the frontal cortex form a sensitive ‘salience’ loop within the brain — the insula should stimulate the frontal cortex while, in turn, the frontal cortex should inhibit the insula — but in patients with schizophrenia this system was found to be seriously compromised.
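This push-pull loop can be caricatured as a two-node rate model. Everything below (weights, tonic drive, time step) is an illustrative assumption, not data from the study; the sketch only shows how weakening the insula-to-frontal connection leaves the frontal node underactive:

```python
# Toy rate model of the salience loop described above: the insula excites
# the frontal cortex, and the frontal cortex inhibits the insula.
# All parameters are made up for illustration.

def simulate(w_excite, w_inhibit=1.0, steps=200, dt=0.05):
    insula, frontal = 1.0, 0.0  # start with the insula active
    trace = []
    for _ in range(steps):
        d_insula = -insula - w_inhibit * frontal + 1.0  # tonic drive minus frontal inhibition
        d_frontal = -frontal + w_excite * insula        # driven by the insula
        insula += dt * d_insula
        frontal += dt * d_frontal
        trace.append((insula, frontal))
    return trace

healthy = simulate(w_excite=1.2)   # intact insula -> frontal drive
impaired = simulate(w_excite=0.1)  # weakened drive, as reported in patients

# With the excitatory link weakened, the frontal node settles at a much
# lower activity level: the 'switch' to frontal processing never happens.
print(healthy[-1], impaired[-1])
```

In the healthy run the frontal node settles at about 0.55 in these arbitrary units; in the impaired run it barely rises above 0.09.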
The results suggest that detecting the lack of a positive influence from the insula to the frontal cortex using fMRI could have a high degree of predictive value in identifying patients with schizophrenia.
The results of the study offer vital information for the development of more effective treatments for the condition.
Schizophrenia is one of the most common serious mental health conditions, affecting around 1 in 100 people. Its onset occurs most commonly in a patient’s late teens or early 20s, which can have devastating consequences for their future.
Genetic and environmental triggers 


Scientists remain unsure what causes schizophrenia but believe it could be a combination of a genetic predisposition and environmental factors. Drug use is known to be a key trigger: people who use cannabis or stimulant drugs are three to four times more likely to go on to develop recurrent psychotic symptoms.
It is also believed that underdevelopment of the brain in the womb caused by complications in the mother’s pregnancy and in early childhood linked to issues such as malnutrition could play a key part. Previous observations from this research group have also uncovered the presence of unusually smooth folding patterns of the brain over the insula region in patients, suggesting an impairment in the normal development of this structure in schizophrenia.
At present, treatment involves a combination of antipsychotic medications, psychological therapies and social interventions. Currently, only one in five patients with schizophrenia achieves complete recovery, and many who develop the condition struggle in the long term to find a treatment that fully manages it.
Antipsychotic drugs, though effective in many patients, have poor acceptance rates because of their side-effect burden, meaning that many patients stop taking them in the longer run, leading to the recurrence of disabling symptoms.
Researchers in Nottingham are also looking at transcranial magnetic stimulation (TMS), a technique that uses a powerful magnetic pulse to stimulate the brain regions that are malfunctioning.
Compassion-based therapy 
Although the insula is buried so deeply within the brain that TMS would usually be ineffective, the results of the Nottingham study suggest that the loop between the insula and the frontal cortex could be exploited: if a pulse is delivered to the frontal lobe, it could stimulate the insula and reset the ‘switch’.
Other future treatment options could include compassion-based meditation therapies such as mindfulness, which may have the potential to ‘reset’ the switching function of the insula and can promote physical changes within the brain. Meditation over a long period has been shown to increase the folding patterns within the insula. These ideas are at an early stage, but may deliver more focused treatment approaches in the longer term.

Filed under insula frontal cortex schizophrenia neuroimaging neuroscience psychology science

Unique Epigenomic Code Identified During Human Brain Development 
Changes in the epigenome, including chemical modifications of DNA, can act as an extra layer of information in the genome and are thought to play a role in learning and memory, as well as in age-related cognitive decline. A new study by scientists at the Salk Institute for Biological Studies shows that the landscape of DNA methylation, a particular type of epigenomic modification, is highly dynamic in brain cells during the transition from birth to adulthood, helping to explain how information in the genomes of brain cells is controlled from fetal development to adulthood. The brain is far more complex than any other organ in the body, and this discovery opens the door to a deeper understanding of how its intricate patterns of connectivity are formed.
“These results extend our knowledge of the unique role of DNA methylation in brain development and function,” says senior author Joseph R. Ecker, professor and director of Salk’s Genomic Analysis Laboratory and holder of the Salk International Council Chair in Genetics. “They offer a new framework for testing the role of the epigenome in healthy function and in pathological disruptions of neural circuits.”
A healthy brain is the product of a long process of development. The front-most part of the brain, the frontal cortex, plays a key role in our ability to think, decide and act. The brain accomplishes all of this through the interaction of specialised cells such as neurons and glia. We know that these cells have distinct functions, but what gives them their individual identities? The answer lies in how each cell expresses the information contained in its DNA. Epigenomic modifications such as DNA methylation can control which genes are turned on or off without changing the letters of the DNA alphabet (A, T, C, G), and thus help distinguish different cell types.
In this new study, published July 4 in Science, the scientists found that the patterns of DNA methylation undergo widespread reconfiguration in the frontal cortex of mouse and human brains during a time of development when synapses, or connections between nerve cells, are growing rapidly. The researchers identified the exact sites of DNA methylation throughout the genome in brains from infants through adults. They found that one form of DNA methylation is present in neurons and glia from birth. Strikingly, a second form of “non-CG” DNA methylation that is almost exclusive to neurons accumulates as the brain matures, becoming the dominant form of methylation in the genome of human neurons. These results help us to understand how the intricate DNA landscape of brain cells develops during the key stages of childhood.
The genetic code in DNA is made up of four chemical bases: adenine (A), guanine (G), cytosine (C), and thymine (T). DNA methylation typically occurs at so-called CpG sites, where C (cytosine) sits next to G (guanine) in the DNA sequence. About 80 to 90 percent of CpG sites are methylated in human DNA. Salk researchers previously discovered that in human embryonic stem cells and induced pluripotent stem cells, a type of artificially derived stem cell, DNA methylation can also occur when G does not follow C, hence “non-CG methylation.” Originally, they thought that this type of methylation disappeared when stem cells differentiated into specific tissue types, such as lung or fat cells. The current study finds this is not the case in the brain, where non-CG methylation appears after cells differentiate, mostly during childhood and adolescence as the brain matures.
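The CG versus non-CG distinction is purely positional, so it can be sketched in a few lines of code; the example sequence below is invented:

```python
# A cytosine is a CpG site when the next base is G; otherwise it is a
# 'non-CG' site (often written CH, where H stands for A, C or T).

def cytosine_contexts(seq):
    """Classify every cytosine in a DNA string by its dinucleotide context."""
    counts = {"CG": 0, "CH": 0}
    for i, base in enumerate(seq[:-1]):  # the last base has no next neighbour
        if base == "C":
            counts["CG" if seq[i + 1] == "G" else "CH"] += 1
    return counts

print(cytosine_contexts("ACGTCACGGCCT"))  # → {'CG': 2, 'CH': 3}
```

Real methylome analyses work from whole-genome bisulfite sequencing calls rather than raw sequence, but the context classification is the same idea.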
By sequencing the genomes of mouse and human brain tissue as well as neurons and glia (from the frontal cortex of the brain) during early postnatal, juvenile, adolescent and adult stages, the Salk team found that non-CG methylation accumulates in neurons through early childhood and adolescence, and becomes the dominant form of DNA methylation in mature human neurons. “This shows that the period during which the neural circuits of the brain mature is accompanied by a parallel process of large-scale reconfiguration of the neural epigenome,” says Ecker, who is a Howard Hughes Medical Institute and Gordon and Betty Moore Foundation investigator.
The study provides the first comprehensive maps of how DNA methylation patterns change in the mouse and human brain during development, forming a critical foundation for exploring whether changes in methylation patterns are linked to human diseases, including psychiatric disorders. Recent studies have suggested a possible role for DNA methylation in schizophrenia, depression, suicide and bipolar disorder. “Our work will let us begin to ask more detailed questions about how changes in the epigenome sculpt the complex identities of brain cells through life,” says co-first author Eran Mukamel, of Salk’s Computational Neurobiology Laboratory.
“The human brain has been called the most complex system that we know of in the universe,” says Ryan Lister, co-corresponding author on the new paper, previously a postdoctoral fellow in Ecker’s laboratory at Salk and now a group leader at The University of Western Australia. “So perhaps we shouldn’t be so surprised that this complexity extends to the level of the brain epigenome. These unique features of DNA methylation that emerge during critical phases of brain development suggest the presence of previously unrecognized regulatory processes that may be critically involved in normal brain function and brain disorders.”
At present, there is consensus among neuroscientists that many mental disorders have a neurodevelopmental origin and arise from an interaction between genetic predisposition and environmental influences (for example, early-life stress or drug abuse), the outcome of which is altered activity of brain networks. The building and shaping of these brain networks requires a long maturation process in which central nervous system cell types (neurons and glia) need to fine-tune the way they express their genetic code.
“DNA methylation fulfills this role,” says study co-author Terrence J. Sejnowski, a Howard Hughes Medical Institute Investigator, holder of the Francis Crick Chair and head of Salk’s Computational Neurobiology Laboratory. “We found that patterns of methylation are dynamic during brain development, in particular for non-CG methylation during early childhood and adolescence, which changes the way that we think about normal brain function and dysfunction.”
By disrupting the transcriptional expression of neurons, adds co-corresponding author M. Margarita Behrens, a staff scientist in the Computational Neurobiology Laboratory, “the alterations of these methylation patterns will change the way in which networks are formed, which could, in turn, lead to the appearance of mental disorders later in life.”

Filed under brain cells dna methylation brain development cognitive function frontal cortex epigenetics neuroscience science

A look inside children’s minds
University of Iowa study shows how 3- and 4-year-olds retain what they see around them
When young children gaze intently at something or furrow their brows in concentration, you know their minds are busily at work. But you’re never entirely sure what they’re thinking.
Now you can get an inside look. Psychologists led by the University of Iowa for the first time have peered inside the brain with optical neuroimaging to quantify how much 3- and 4-year-old children are grasping when they survey what’s around them and to learn what areas of the brain are in play. The study looks at “visual working memory,” a core cognitive function in which we stitch together what we see at any given point in time to help focus attention. In a series of object-matching tests, the researchers found that 3-year-olds can hold a maximum of 1.3 objects in visual working memory, while 4-year-olds reach capacity at 1.8 objects. By comparison, adults max out at 3 to 4 objects, according to prior studies.
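Capacity figures like the 1.3 and 1.8 objects above are conventionally estimated from change-detection accuracy with a formula such as Cowan’s K (capacity = set size × (hit rate − false-alarm rate)). The article does not say which estimator the authors used, so the numbers below are invented for illustration:

```python
def cowans_k(set_size, hit_rate, false_alarm_rate):
    """Cowan's K: items held in visual working memory, estimated from a
    change-detection task showing `set_size` items per display."""
    return set_size * (hit_rate - false_alarm_rate)

# Hypothetical accuracy on 3-item displays (made-up numbers, not the
# study's data): 80% hits on changed displays, 35% false alarms.
print(cowans_k(3, 0.80, 0.35))  # about 1.35 items, near the 3-year-old estimate
```

The formula corrects raw accuracy for guessing: a child who says “match” at random scores 50% but earns a K near zero.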
“This is literally the first look into a 3- and 4-year-old’s brain in action in this particular working memory task,” says John Spencer, psychology professor at the UI and corresponding author of the paper, which appears in the journal NeuroImage.
The research is important because visual working memory performance has been linked to a variety of childhood disorders, including attention-deficit/hyperactivity disorder (ADHD), autism and developmental coordination disorder, and is also affected in children born prematurely. The goal is to use the new brain-imaging technique to detect these disorders before they manifest themselves in children’s behavior later on.
“At a young age, children may behave the same,” notes Spencer, who’s also affiliated with the Delta Center and whose department is part of the College of Liberal Arts and Sciences, “but if you can distinguish these problems in the brain, then it’s possible to intervene early and get children on a more standard trajectory.”
Plenty of research has gone into better understanding visual working memory in children and adults. Those prior studies divined neural networks in action using functional magnetic resonance imaging (fMRI). That worked well for adults, but not so much with children, especially young ones, whose jerky movements threw the machine’s readings off kilter. So Spencer and his team turned to functional near-infrared spectroscopy (fNIRS), which has been around since the 1960s but had never been used to look at working memory in children as young as three years of age.
“It’s not a scary environment,” says Spencer of the fNIRS. “No tube, no loud noises. You just have to wear a cap.”
Like fMRI, fNIRS records neural activity by measuring differences in oxygenated blood concentration in the brain. You’ve likely seen similar technology when a nurse puts your finger in a clip to check your circulation. In the brain, when a region is activated, neurons fire like mad, gobbling up the oxygen supplied by the blood. Those neurons need another shipment of oxygen-rich blood to keep going. fNIRS measures the contrast between oxygen-rich and oxygen-deprived blood to gauge which area of the brain is going full tilt at any point in time.
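Under the hood, fNIRS typically converts light-attenuation changes at two near-infrared wavelengths into oxy- and deoxy-haemoglobin concentration changes via the modified Beer-Lambert law. The extinction coefficients and attenuation values below are placeholder numbers, not calibrated values from any instrument:

```python
# Modified Beer-Lambert sketch: the attenuation change dA at each wavelength
# is a linear mix of oxy- (HbO) and deoxy-haemoglobin (HbR) concentration
# changes, so measuring at two wavelengths lets us solve for both.

def unmix(dA1, dA2, e1_hbo, e1_hbr, e2_hbo, e2_hbr, path=1.0):
    """Solve the 2x2 system dA_i = (e_i_hbo*dHbO + e_i_hbr*dHbR) * path."""
    det = (e1_hbo * e2_hbr - e1_hbr * e2_hbo) * path
    d_hbo = (dA1 * e2_hbr - dA2 * e1_hbr) / det
    d_hbr = (dA2 * e1_hbo - dA1 * e2_hbo) / det
    return d_hbo, d_hbr

# Placeholder coefficients (wavelength 1 weighted toward HbO, wavelength 2
# toward HbR) and made-up attenuation changes:
d_hbo, d_hbr = unmix(0.02, -0.0025, e1_hbo=2.5, e1_hbr=1.0, e2_hbo=1.0, e2_hbr=2.5)
print(d_hbo, d_hbr)  # HbO rises while HbR falls: the signature of an active region
```

Real pipelines also fold in a differential pathlength factor and motion correction, but the two-wavelength unmixing is the core of the oxy/deoxy contrast described above.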
The researchers outfitted the youngsters with colorful, comfortable ski hats in which fiber optic wires had been woven. The children played a computer game in which they were shown a card with one to three objects of different shapes for two seconds. After a pause of a second, the children were shown a card with either the same or different shapes. They responded whether they had seen a match.
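The matching game can be sketched as a simple trial generator. The shape names and the 50/50 change rate are invented details; the article’s timings (two-second display, one-second gap) would live in the presentation code:

```python
import random

# Toy generator for a change-detection trial: a sample display, then a test
# display that is either identical or has exactly one changed shape.

SHAPES = ["circle", "square", "triangle", "star", "cross"]

def make_trial(set_size, change_probability=0.5):
    sample = random.sample(SHAPES, set_size)       # shapes for the first card
    if random.random() < change_probability:
        test = sample.copy()
        idx = random.randrange(set_size)           # swap one shape for a new one
        test[idx] = random.choice([s for s in SHAPES if s not in sample])
        return sample, test, "no match"
    return sample, list(sample), "match"           # identical second card

sample, test, answer = make_trial(set_size=3)
print(sample, test, answer)
```

Scoring a child’s responses against the `answer` field over many such trials yields the hit and false-alarm rates from which capacity is estimated.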
The tests revealed novel insights. First, neural activity in the right frontal cortex was an important barometer of higher visual working memory capacity in both age groups. This could help clinicians evaluate children’s visual working memory at a younger age than before, and work with those whose capacity falls below the norm, the researchers say.
Second, 4-year-olds showed greater use than 3-year-olds of the parietal cortex, which is located in both hemispheres below the crown of the head and is believed to guide spatial attention.
"This suggests that improvements in performance are accompanied by increases in the neural response," adds Aaron Buss, a UI graduate student in psychology and the first author on the paper. "Further work will be needed to explain exactly how the neural response increases—either through changes in local tuning, or through changes in long range connectivity, or some combination."

A look inside children’s minds

University of Iowa study shows how 3- and 4-year-olds retain what they see around them

When young children gaze intently at something or furrow their brows in concentration, you know their minds are busily at work. But you’re never entirely sure what they’re thinking.

Now you can get an inside look. Psychologists led by the University of Iowa for the first time have peered inside the brain with optical neuroimaging to quantify how much 3- and 4-year-old children are grasping when they survey what’s around them and to learn what areas of the brain are in play. The study looks at “visual working memory,” a core cognitive function in which we stitch together what we see at any given point in time to help focus attention. In a series of object-matching tests, the researchers found that 3-year-olds can hold a maximum of 1.3 objects in visual working memory, while 4-year-olds reach capacity at 1.8 objects. By comparison, adults max out at 3 to 4 objects, according to prior studies.

“This is literally the first look into a 3 and 4-year-old’s brain in action in this particular working memory task,” says John Spencer, psychology professor at the UI and corresponding author of the paper, which appears in the journal NeuroImage.

The research is important, because visual working memory performance has been linked to a variety of childhood disorders, including attention-deficit/hyperactivity disorder (ADHD), autism, developmental coordination disorder as well as affecting children born prematurely. The goal is to use the new brain imaging technique to detect these disorders before they manifest themselves in children’s behavior later on.

“At a young age, children may behave the same,” notes Spencer, who’s also affiliated with the Delta Center and whose department is part of the College of Liberal Arts and Sciences, “but if you can distinguish these problems in the brain, then it’s possible to intervene early and get children on a more standard trajectory.”

Plenty of research has gone into better understanding visual working memory in children and adults. Those prior studies divined neural networks in action using functional magnetic resonance imaging (fMRI). That worked great for adults, but not so well for children, especially young ones, whose jerky movements threw the machine’s readings off kilter. So Spencer and his team turned to functional near-infrared spectroscopy (fNIRS), which has been around since the 1960s but had never been used to look at working memory in children as young as three years of age.

“It’s not a scary environment,” says Spencer of the fNIRS. “No tube, no loud noises. You just have to wear a cap.”

Like fMRI, fNIRS records neural activity by measuring the difference in oxygenated blood concentrations anywhere in the brain. You’ve likely seen similar technology when a nurse puts your finger in a clip to check your circulation. In the brain, when a region is activated, neurons fire like mad, gobbling up oxygen provided in the blood. Those neurons need another shipment of oxygen-rich blood to arrive to keep going. The fNIRS measures the contrast between oxygen-rich and oxygen-deprived blood to gauge which area of the brain is going full tilt at a point in time.
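That oxy/deoxy contrast is recovered quantitatively via the modified Beer-Lambert law: measuring light-attenuation changes at two wavelengths gives two equations in the two unknown concentration changes (oxygenated HbO and deoxygenated HbR). A sketch of that inversion, with hypothetical extinction coefficients and pathlength values (real analyses use published, wavelength-specific tables):

```python
def mbll_two_wavelengths(d_od, eps, pathlength_cm, dpf):
    """Modified Beer-Lambert law: recover (dHbO, dHbR) concentration
    changes from optical-density changes at two wavelengths.
    d_od: {wavelength: delta-OD}; eps: {wavelength: (eps_HbO, eps_HbR)}."""
    w1, w2 = sorted(d_od)
    a, b = eps[w1]           # extinction coefficients at wavelength 1
    c, d = eps[w2]           # extinction coefficients at wavelength 2
    L = pathlength_cm * dpf  # effective optical pathlength
    det = (a * d - b * c) * L
    d_hbo = (d * d_od[w1] - b * d_od[w2]) / det
    d_hbr = (a * d_od[w2] - c * d_od[w1]) / det
    return d_hbo, d_hbr

# Purely illustrative coefficients and geometry:
eps = {760: (1.0, 2.0), 850: (2.5, 1.0)}
d_hbo, d_hbr = mbll_two_wavelengths({760: 1.8, 850: 18.9}, eps, 3.0, 6.0)
print(round(d_hbo, 3), round(d_hbr, 3))  # 0.5 -0.2: HbO up, HbR down -- activation
```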

The researchers outfitted the youngsters with colorful, comfortable ski hats in which fiber optic wires had been woven. The children played a computer game in which they were shown a card with one to three objects of different shapes for two seconds. After a pause of a second, the children were shown a card with either the same or different shapes. They responded whether they had seen a match.

The tests revealed novel insights. First, neural activity in the right frontal cortex was an important barometer of higher visual working memory capacity in both age groups. This could help clinicians evaluate children’s visual working memory at a younger age than before, and work with those whose capacity falls below the norm, the researchers say.

Secondly, 4-year-olds showed greater use than 3-year-olds of the parietal cortex, located in both hemispheres below the crown of the head, which is believed to guide spatial attention.

"This suggests that improvements in performance are accompanied by increases in the neural response," adds Aaron Buss, a UI graduate student in psychology and the first author on the paper. "Further work will be needed to explain exactly how the neural response increases—either through changes in local tuning, or through changes in long range connectivity, or some combination."

Filed under memory working memory learning parietal cortex neuroimaging frontal cortex neuroscience science

122 notes

Common Brain Processes of Anesthetic-Induced Unconsciousness Identified

A study from the June issue of Anesthesiology found feedback from the front region of the brain is a crucial building block for consciousness and that its disruption is associated with unconsciousness when the anesthetics ketamine, propofol or sevoflurane are administered.

Brain centers and mechanisms of consciousness have not been well understood, resulting in a need for better monitors of consciousness during anesthesia. In addition, how anesthetics with different structures and pharmacological properties can generate unconsciousness has been a persistent question in anesthesiology since the beginning of the field in the mid-19th century.

A team of researchers from the University of Michigan, Ann Arbor, Mich., and Asan Medical Center, Seoul, South Korea, conducted a brain wave (electroencephalographic, or EEG) study of the front and back regions of the brain in 30 surgical patients who received intravenous ketamine. They compared the results of this study to the EEG data collected from 18 surgical patients who received either intravenous propofol or inhaled sevoflurane in a previous study. These three anesthetics, known to act on different parts of the brain and produce different EEG patterns, had the same effect of disrupting communication in the brain.
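The "feedback" at issue is directed influence from frontal toward posterior channels; analyses of this kind have used measures such as symbolic transfer entropy. The core idea — asymmetry in which region's signal leads the other — can be illustrated with something as simple as lagged cross-correlation (a toy stand-in, not the study's actual measure):

```python
def lagged_corr(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag]."""
    n = len(x) - lag
    xs, ys = x[:n], y[lag:lag + n]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / (vx * vy) ** 0.5

def directionality(frontal, parietal, lag=1):
    """Positive: the frontal signal leads the parietal one (feedback-
    dominant); negative: the parietal signal leads (feedforward)."""
    return (lagged_corr(frontal, parietal, lag)
            - lagged_corr(parietal, frontal, lag))
```

Under this toy index, a drop in front-to-back dominance after drug administration would correspond to the disrupted frontal feedback the study reports.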

“Understanding a commonality among the actions of these diverse drugs could lead to a more comprehensive theory of how general anesthetics induce unconsciousness,” said study author George Mashour, M.D., Ph.D., assistant professor and associate chair for faculty affairs, Department of Anesthesiology, University of Michigan. “Our research shows that studying general anesthesia from the perspective of consciousness may be a fruitful approach and create new avenues for further investigation of anesthetic mechanisms and monitoring.”

An accompanying editorial by Jamie W. Sleigh, M.D., professor of anaesthesiology and intensive care, Department of Anaesthesia, University of Auckland, Hamilton, New Zealand, supported the study’s ability to better understand the neurobiology of consciousness.

“If the study’s findings are confirmed by subsequent work, the paper will achieve landmark status,” said Dr. Sleigh. “The study not only sheds light on the phenomenon of general anesthesia, but also how it is necessary for certain regions of the brain to communicate accurately with one another for consciousness to emerge.”

In addition, Dr. Sleigh recognized the study’s potential to lead to the development of better depth-of-anesthesia monitors that work for all general anesthetics.

(Image: Shutterstock)

Filed under anesthetics consciousness anesthesia brain frontal cortex cortical feedback neuroscience science

104 notes

Anything you can do I can do better: Neuromolecular foundations of the superiority illusion

The existential psychologist Rollo May wrote that “depression is the inability to construct a future”, while Lionel Tiger stated that “optimism has been central to the process of human evolution”. These deceptively simple phrases are remarkable in their depth and in the connections they form between philosophy, psychology and neuroscience. Both capture something essential about human nature: our ability to imagine and plan for the future is one of the most striking aspects of our species, and the inability to exercise this faculty is profoundly damaging to our happiness and sense of self.

Two concepts relate to these observations: depressive realism – the assertion that people with depression actually perceive reality more accurately – and its counterpoint, the superiority illusion. The superiority illusion is a cognitive bias by which individuals, relative to others, overestimate their positive qualities and abilities (such as intelligence, cognitive ability, and desirable traits) and underestimate their negative qualities. (Other cognitive biases include the optimism bias and the illusion of control.) While mathematically untenable – in any population, at most half of us can be above the median – the superiority illusion is a positive belief that promotes mental health.

Recently, scientists at the National Institute of Radiological Sciences (Chiba, Japan), the Japan Science and Technology Agency (Saitama), and Stanford University School of Medicine used resting-state functional magnetic resonance imaging (fMRI) and positron emission tomography (PET) to study the default states of the neural and molecular systems that generate the superiority illusion. They showed that resting-state functional connectivity between the frontal cortex and the striatum, regulated by inhibitory dopaminergic neurotransmission, determines individual levels of the superiority illusion.
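The mathematical point is easy to make concrete: whatever the distribution of a trait, at most half of a population can sit above its median, so surveys in which a large majority rate themselves "above average" cannot all be right. A toy check:

```python
import random
import statistics

random.seed(42)
# Hypothetical "true" ability scores for a population.
abilities = [random.gauss(100, 15) for _ in range(10_000)]
median = statistics.median(abilities)
above = sum(a > median for a in abilities) / len(abilities)
print(f"fraction truly above the median: {above:.3f}")  # 0.500 by definition
# Yet classic self-report surveys find far more than 50% claiming to be
# "above average" -- the superiority illusion.
```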
The scientists state that their findings help clarify how the superiority illusion is biologically determined and identify potential molecular and neural targets for treating depressive realism.

Read more

Filed under superiority illusion cognitive bias frontal cortex evolution psychology neuroscience science

213 notes

New research uncovers the neural mechanism underlying drug cravings

Addiction may result from abnormal brain circuitry in the frontal cortex, the part of the brain that controls decision-making. Researchers from the RIKEN Center for Molecular Imaging Science in Japan collaborating with colleagues from the Montreal Neurological Institute of McGill University in Canada report today that the lateral and orbital regions of the frontal cortex interact during the response to a drug-related cue and that aberrant interaction between the two frontal regions may underlie addiction. Their results are published today in the journal Proceedings of the National Academy of Sciences of the USA.

Cues such as the sight of drugs can induce cravings and lead to drug-seeking behaviors and drug use. But cravings are also influenced by other factors, such as drug availability and self-control. To investigate the neural mechanisms involved in cue-induced cravings the researchers studied the brain activity of a group of 10 smokers, following exposure to cigarette cues under two different conditions of cigarette availability. In one experiment cigarettes were available immediately and in the other they were not. The researchers combined a technique called transcranial magnetic stimulation (TMS) with functional magnetic resonance imaging (fMRI).

The results demonstrate that in smokers the orbitofrontal cortex (OFC) tracks the level of craving, while the dorsolateral prefrontal cortex (DLPFC) integrates drug cues with drug availability. Moreover, the DLPFC can suppress activity in the OFC when cigarettes are unavailable. When the DLPFC was inactivated using TMS, both craving and craving-related signals in the OFC became independent of drug availability.
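The reported interaction can be summarized schematically: the DLPFC combines the cue with availability information and gates the OFC craving signal, and TMS inactivation of the DLPFC removes the gate. A deliberately simplified sketch (the gain values are arbitrary illustrations, not measured quantities):

```python
def ofc_craving(cue_strength, cigarette_available, dlpfc_intact=True):
    """Schematic gating model: with the DLPFC intact, the OFC craving
    signal is suppressed when the drug is unavailable; with the DLPFC
    inactivated (e.g. by TMS), craving tracks the cue regardless
    of availability."""
    if dlpfc_intact:
        gate = 1.0 if cigarette_available else 0.2  # suppression (arbitrary gain)
        return cue_strength * gate
    return cue_strength  # gate removed: availability-independent

print(ofc_craving(1.0, cigarette_available=False))                      # 0.2
print(ofc_craving(1.0, cigarette_available=False, dlpfc_intact=False))  # 1.0
```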

The authors of the study conclude that the DLPFC incorporates drug cues and knowledge on drug availability to modulate the value signals it transmits to the OFC, where this information is transformed into drug-seeking action.

"We demonstrate that in smokers, cravings build up in the OFC upon processing of cigarette cues and availability by the DFPC. What is surprising is that this is a neural circuit involved in decision making and self-control, that normally guides individuals to optimal behaviors in daily life." Explains Dr. Hayashi, from RIKEN, who designed and conducted the fMRI and TMS experiments.

"This research uncovers the brain circuitry responsible for self-control during reward-seeking choices. It is also consistent with the view that drug addiction is a pathology of decision making." According to Dr. Alain Dagher, a neurologist at the Montreal Neurological Institute.

These findings will help us understand the neural basis of addiction and may contribute to therapeutic approaches for addiction.

(Image: New Jersey Addiction Assistance)

Filed under frontal cortex orbitofrontal cortex brain activity addiction decision-making neuroimaging neuroscience science

166 notes

My mistake or yours? How the brain decides

Humans and other animals learn by making mistakes. They can also learn from observing the mistakes of others. The brain processes self-generated errors in a region called the medial frontal cortex (MFC) but little is known about how it processes the observed errors of others. A Japanese research team led by Masaki Isoda and Atsushi Iriki of the RIKEN Brain Science Institute has now demonstrated that the MFC is also involved in processing observed errors.

The team studied the brains of monkeys while the animals performed the same task. Two monkeys sat opposite each other and took turns to choose between a yellow and green button, one of which resulted in a liquid reward for both. Each monkey’s turn consisted of two choices.

After blocks of between 5 and 17 choices, the button that resulted in reward was switched unpredictably, usually causing an error on the next choice. The choices made by each monkey immediately after such errors, or errors that were random, showed that they used both their own errors and their partner’s to guide their subsequent choices. While the monkeys performed this task, the researchers recorded activity of single neurons in their brains.
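The design is a two-agent reversal-learning task: the currently rewarded button should be abandoned after an observed error, whether one's own or the partner's. A minimal win-stay/lose-shift sketch of that strategy (an illustration of the task logic, not a model fitted to the monkeys):

```python
import random

def play_block(correct, n_turns, observe_partner=True):
    """Two win-stay/lose-shift agents alternate choices between two
    buttons; each agent switches after any error it observes --
    its own, and (optionally) its partner's."""
    choice = {0: random.choice(["yellow", "green"]),
              1: random.choice(["yellow", "green"])}
    errors = 0
    for turn in range(n_turns):
        agent = turn % 2
        if choice[agent] != correct:           # unrewarded choice
            errors += 1
            other = "green" if choice[agent] == "yellow" else "yellow"
            choice[agent] = other              # lose-shift
            if observe_partner:
                choice[1 - agent] = other      # learn from the observed error
        # rewarded choices are simply repeated (win-stay)
    return errors

random.seed(0)
# With observational learning, the pair makes at most one error per block,
# because a single observed error is enough to correct both agents:
print(max(play_block("yellow", 12) for _ in range(100)))  # 1
```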

In this way they were able to determine which behavioural aspect was most closely associated with each neuron’s activity, explains Isoda. “We found that many neurons in the medial frontal cortex were not activated when the monkey made an error itself, but they became active when their partner made an error.” This brain activity shows that it is the MFC which processes observations of another’s error, and the corresponding behavior shows that observing and processing such errors guides subsequent actions.

“Such error identification and subsequent error correction are of crucial importance for developing and maintaining successful social communities,” says Isoda. “Humans are tuned into other people’s mistakes not only for competitive success, but also for cooperative group living. If non-invasive techniques become available in humans, then we should be able to identify medial frontal neurons that behave similarly.”

Having identified the MFC as being involved, Isoda now wants to delve deeper into the process. “The next steps will be to clarify whether the inactivation of medial frontal cortex neurons reduces the ability to identify others’ errors, and to determine whether other brain regions are also involved in the processing of others’ errors.”

Filed under brain brain activity neuron error correction primates frontal cortex neuroscience science

92 notes

The evolution of human intellect: Human-specific regulation of neuronal genes

A new study published November 20 in the open-access journal PLOS Biology has identified hundreds of small regions of the genome that appear to be uniquely regulated in human neurons. These regulatory differences distinguish us from other primates, including monkeys and apes, and as neurons are at the core of our unique cognitive abilities, these features may ultimately hold the key to our intellectual prowess (and also to our potential vulnerability to a wide range of ‘human-specific’ diseases from autism to Alzheimer’s).

Exploring which features in the genome separate human neurons from their non-human counterparts has been a challenging task until recently; primate genomes comprise billions of base pairs (the basic building blocks of DNA), and comparisons between the human and chimpanzee genomes alone reveal close to 40 million differences. Most of these are thought to merely reflect random ‘genetic drift’ during the course of evolution, so the challenge was to identify the small set of changes that have functionally important consequences, as these might help to explain the genomic basis of the emergence of human-specific neuronal function.

The key to the present study, led by Dr Schahram Akbarian of the University of Massachusetts and the Mount Sinai School of Medicine, was not to focus on the “letters” of the DNA code, but rather on what might be called its “font” or “typeface”—the DNA strands of the genome are wrapped in protein to make a chromatin fiber, and the way in which they are wrapped, the “chromatin state”, in turn reflects the regulatory state of that region of the genome (e.g. whether a given gene is turned on or off). This is the field that biologists call “epigenetics”—the study of the “epigenome”.

Dr Akbarian and colleagues set out to isolate small snippets of chromatin fibers from the frontal cortex, a brain region involved in complex cognitive operations. They were then able to analyze these snippets for the chemical signals (histone methylation) that define the regulatory state (on/off) of the chromatin. The results of their analysis identified hundreds of regions throughout the genome which showed a markedly different chromatin structure in neurons from human children and adults, compared to chimpanzees and macaques.
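In spirit, the comparison reduces to scanning for loci whose neuronal chromatin state in humans differs from both non-human primates. A toy version with binary open/closed states (the data here are invented placeholders, not the study's measurements):

```python
# Chromatin state per genomic region: 1 = active histone-methylation
# mark (region "on"), 0 = inactive. Values below are invented.
human   = [1, 0, 1, 1, 0, 1]
chimp   = [1, 0, 0, 1, 0, 0]
macaque = [1, 0, 0, 1, 1, 0]

# "Human-specific" regulatory regions: chromatin state in human neurons
# differs from BOTH chimpanzee and macaque.
human_specific = [i for i, (h, c, m) in enumerate(zip(human, chimp, macaque))
                  if h != c and h != m]
print(human_specific)  # [2, 5]
```

The real analysis works on genome-wide histone-methylation profiles rather than binary toy vectors, but the species-contrast logic is the same.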

This treasure trove of short genomic regions is now providing researchers with interesting new leads involving the evolution of the human brain. Although some of the regions have remained unchanged during primate evolution, some more tantalizing ones have recently changed, having a DNA sequence that is unique to humans and our close extinct relatives, the Neanderthals and the Denisovans.

The study also uncovered examples where several of these regulatory DNA regions appear to physically interact with each other inside the cell nucleus, despite being separated by hundreds of thousands of base pairs on the linear genome. This phenomenon of “chromatin looping” is implicated in controlling the expression of neighboring genes, including several with a critical role for human brain development. The study, from laboratories based in the United States, Switzerland and Russia, draws further attention to the role of epigenetics and the epigenome in our biology and our evolution. As Dr Akbarian notes, “Much about human biology and disease cannot be deduced by simply sequencing the genome. Mapping the epigenome of neurons and other cells will help us to better understand the inner workings of our brain, and where we are coming from.”

(Source: medicalxpress.com)

Filed under neuronal function genome chromatin fiber frontal cortex evolution neuroscience science
