Neuroscience

Articles and news from the latest research reports.

66 notes

You Don’t Walk Alone

A breakthrough in detecting the early onset of refractory epilepsy in children could enable timely, effective treatment with non-pharmacological therapies.

65 million people around the world today suffer from epilepsy, a condition of the brain that may trigger an uncontrollable seizure at any time, often for no known reason. A seizure is a disruption of the electrical communication between neurons, and someone is said to have epilepsy if they experience two or more unprovoked seizures separated by at least 24 hours.

Epilepsy is the most common chronic disease in pediatric neurology, with about 0.5-1% of children developing epilepsy during their lifetime. About 30-40% of these children develop refractory epilepsy, a form of epilepsy that cannot be managed with antiepileptic drugs (AEDs). Regardless of etiology, children with refractory epilepsy are invariably exposed to a variety of physical, psychological and social morbidities. Patients whose seizures are difficult to control could benefit from non-pharmacological therapies, including surgery, deep brain stimulation and ketogenic diets. Early identification of patients whose seizures are refractory to AEDs would therefore allow them to receive alternative therapies at an appropriate time.

Despite idiopathic etiology being a significant predictor of a lower risk of refractory epilepsy, a subset of patients with idiopathic epilepsy might still be refractory to medical treatment.

Using a new electroencephalography (EEG) analytical method, a team of medical doctors and scientists in Taiwan has successfully developed a tool to detect certain EEG features often present in children with idiopathic epilepsy.

The team developed an efficient, automated and quantitative approach towards the early prediction of refractory idiopathic epilepsy based on EEG classification analysis. EEG analysis is widely employed to investigate brain disorders and to study brain electrical activity. In the study, a set of artifact-free EEG segments was acquired from the EEG recordings of patients belonging to two classes of epilepsy: well-controlled and refractory. To search for significantly discriminative EEG features and to reduce computational costs, a statistical approach involving global parametric features was adopted across EEG channels as well as over time. A gain ratio-based feature selection was then performed.
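Gain ratio, the feature-selection criterion named above, normalizes a feature's information gain by the entropy of the split itself, which prevents the score from favoring arbitrary fine partitions. A minimal sketch of the idea for one continuous feature and binary class labels (the threshold-based split and all names are illustrative, not taken from the paper):

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array, in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def gain_ratio(feature, labels, threshold):
    """Gain ratio of splitting a continuous feature at `threshold`
    for binary labels (e.g. well-controlled vs. refractory)."""
    left = labels[feature <= threshold]
    right = labels[feature > threshold]
    n = len(labels)
    # Information gain: parent entropy minus weighted child entropy.
    gain = (entropy(labels)
            - (len(left) / n) * entropy(left)
            - (len(right) / n) * entropy(right))
    # Split information: entropy of the partition sizes themselves.
    split_info = entropy(np.array([0] * len(left) + [1] * len(right)))
    return gain / split_info if split_info > 0 else 0.0
```

A perfectly class-separating threshold yields a gain ratio of 1.0, while an uninformative split yields 0; ranking candidate EEG features by this score is one way to keep only the most discriminative ones.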

The study found that the averaged decorrelation-time and relative delta-power features (DecorrTime and RelPowDelta, averaged across channels and over time) were significantly higher in the well-controlled group than in the refractory group. This suggests that refractory patients have a higher risk of seizure attacks than well-controlled patients.

The main contributions of this study are as follows:

  1. the generalisation of 10 significant EEG features into a concept for the recognition and identification of potential refractory epilepsy in patients with idiopathic epilepsy, based on EEG classification analysis;
  2. the development of a diagnostic tool based conceptually on these 10 EEG features, using a support vector machine (SVM) classification model to discriminate between well-controlled idiopathic epilepsy and refractory idiopathic epilepsy, which will facilitate subsequent expert visual EEG interpretation.
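As a sketch of the kind of classifier described in point 2, here is a support vector machine fit to a small patient-by-feature table. The data are synthetic stand-ins (the study's actual 10 features, kernel, and hyperparameters are not reproduced here):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic stand-in: 40 patients x 10 EEG features per class, with a
# small mean shift between well-controlled (0) and refractory (1).
X = np.vstack([rng.normal(0.0, 1.0, (40, 10)),
               rng.normal(0.8, 1.0, (40, 10))])
y = np.array([0] * 40 + [1] * 40)

# Standardize the features, then fit an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
acc = clf.score(X, y)  # training accuracy on this synthetic sample
```

In practice such a model would be evaluated with cross-validation rather than training accuracy, and its per-patient scores could flag recordings for the "subsequent expert visual EEG interpretation" the authors describe.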

Further research with a more diverse sample (both pediatric and adult participants) is encouraged to establish the tool’s reliability and generalisability. This study was supported in part by a grant from Kaohsiung Medical University Hospital and by grants from the Ministry of Science and Technology, Taiwan.

The paper appears in the upcoming issue of the International Journal of Neural Systems (IJNS).

Filed under epilepsy EEG epileptic seizures neuroscience science

74 notes

Network measures predict neuropsychological outcome after brain injury

Cognitive neuroscience research has shown that certain brain regions are associated with specific cognitive abilities, such as language, naming, and decision-making.


How and where these specific abilities are integrated in the brain to support complex cognition is still under investigation. However, researchers at the University of Iowa and Washington University in St. Louis, Missouri, believe that several hub regions may be especially important for the brain to function as an integrated network.

In research published online Sept. 15 in the Early Edition of the Proceedings of the National Academy of Sciences, scientists studied neurological patients with focal brain damage, and found that damage to six hub locations—identified in a model developed at Washington University using resting state fMRI, functional connectivity analyses, and graph theory—produced much greater cognitive impairment than damage to other locations.

Doctors have long observed that despite having similar locations or extent of brain injury, patients often present with wide-ranging degrees of impairment and exhibit different recovery trajectories. A better understanding of brain networks and hubs may improve the understanding of outcomes of brain injuries (for example, stroke, resection, or trauma) and help inform prognosis and rehabilitation efforts.

“We were able to identify a set of brain hubs and show that damage to those locations unexpectedly causes widespread cognitive impairments,” says David Warren, cognitive neuroscientist at the University of Iowa and lead study author. “We hope that this framework will help neurologists with diagnosis and prognosis, and neurosurgeons with surgical planning.”

Two contrasting views of brain hubs exist. One view focuses on the sheer number of connections between brain regions, with those regions showing the most connections considered hubs.

Warren and his colleagues contend that the number of connections a given region makes may not reflect the importance of a region to network function because it can be strongly influenced by network size. Instead, their framework defines hubs as brain regions that show correlated activity with multiple brain systems (rather than regions). The authors predicted that because hubs should be critical for brain function and complex cognition, damage to true hubs should produce widespread cognitive impairment.
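The two hub definitions can be contrasted directly: degree simply counts a node's edges, while the authors' system-level view is close in spirit to the participation coefficient, which is high only when a node's edges are spread across many systems. A toy sketch (the graph and its partition into systems are invented for illustration):

```python
import numpy as np

def participation_coefficient(adj, membership):
    """P_i = 1 - sum_s (k_is / k_i)^2, where k_is counts node i's
    edges into system s and k_i is its total degree."""
    adj = np.asarray(adj, dtype=float)
    membership = np.asarray(membership)
    degree = adj.sum(axis=1)
    safe_degree = np.where(degree > 0, degree, 1)
    acc = np.zeros(len(degree))
    for s in np.unique(membership):
        k_is = adj[:, membership == s].sum(axis=1)
        acc += (k_is / safe_degree) ** 2
    return np.where(degree > 0, 1.0 - acc, 0.0)

# Toy network of 6 nodes in three systems: nodes 0-3 in system 0,
# node 4 in system 1, node 5 in system 2.
membership = [0, 0, 0, 0, 1, 2]
adj = np.array([
    [0, 1, 0, 0, 1, 1],  # node 0: one edge into each of the 3 systems
    [1, 0, 1, 1, 0, 0],  # node 1: three edges, all within system 0
    [0, 1, 0, 0, 0, 0],
    [0, 1, 0, 0, 0, 0],
    [1, 0, 0, 0, 0, 0],
    [1, 0, 0, 0, 0, 0],
])
pc = participation_coefficient(adj, membership)
```

Nodes 0 and 1 have identical degree (3), yet node 0's participation coefficient is 2/3 while node 1's is 0, which is exactly the authors' point: raw connection counts can miss the system-spanning hubs that matter for integrated function.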

This study evaluated long-term cognitive and behavioral data in 30 patients in the Iowa Neurological Patient Registry—19 with focal damage to one of the authors’ six target hub locations and 11 with damage to two control locations that fit the alternative hub definition.

On average, patients with lesions to target hubs had significant impairment in nine major cognitive domains—orientation/attention, perception, memory, language skills, motor performance, concept formation/reasoning, executive functions, emotional functions, and adaptive functions. In contrast, the group with lesions to control hubs was significantly impaired in just three of the nine domains (executive functions, emotional functions, and adaptive functions).

Additionally, the target group had significantly greater cognitive deficits than the control group in seven of nine domains (all except perception and emotional functions), again showing the widespread cognitive effects of target hub lesions.

“With a grant from the McDonnell Foundation, we’re planning to follow up by exploring the effects of damage to additional brain hubs, examining how damage to hubs alters brain activation, and studying neurosurgery patients prospectively before and after their surgeries,” says senior study author Daniel Tranel, professor of neurology in the UI Carver College of Medicine and psychology in the College of Liberal Arts and Sciences. “We think that this work could have a tremendous influence on clinical practice.”

(Source: now.uiowa.edu)

Filed under brain injury default mode network brain function cognitive impairment neuroscience science

190 notes

Neuroscientists identify key role of language gene

Neuroscientists have found that a gene mutation that arose more than half a million years ago may be key to humans’ unique ability to produce and understand speech.

Researchers from MIT and several European universities have shown that the human version of a gene called Foxp2 makes it easier to transform new experiences into routine procedures. When they engineered mice to express humanized Foxp2, the mice learned to run a maze much more quickly than normal mice.

The findings suggest that Foxp2 may help humans with a key component of learning language — transforming experiences, such as hearing the word “glass” when we are shown a glass of water, into a nearly automatic association of that word with objects that look and function like glasses, says Ann Graybiel, an MIT Institute Professor, member of MIT’s McGovern Institute for Brain Research, and a senior author of the study.

“This really is an important brick in the wall saying that the form of the gene that allowed us to speak may have something to do with a special kind of learning, which takes us from having to make conscious associations in order to act to a nearly automatic-pilot way of acting based on the cues around us,” Graybiel says.

Wolfgang Enard, a professor of anthropology and human genetics at Ludwig-Maximilians University in Germany, is also a senior author of the study, which appears in the Proceedings of the National Academy of Sciences this week. The paper’s lead authors are Christiane Schreiweis, a former visiting graduate student at MIT, and Ulrich Bornschein of the Max Planck Institute for Evolutionary Anthropology in Germany.

All animal species communicate with each other, but humans have a unique ability to generate and comprehend language. Foxp2 is one of several genes that scientists believe may have contributed to the development of these linguistic skills. The gene was first identified in a group of family members who had severe difficulties in speaking and understanding speech, and who were found to carry a mutated version of the Foxp2 gene.

In 2009, Svante Pääbo, director of the Max Planck Institute for Evolutionary Anthropology, and his team engineered mice to express the human form of the Foxp2 gene, which encodes a protein that differs from the mouse version by only two amino acids. His team found that these mice had longer dendrites — the slender extensions that neurons use to communicate with each other — in the striatum, a part of the brain implicated in habit formation. They were also better at forming new synapses, or connections between neurons.

Pääbo, who is also an author of the new PNAS paper, and Enard enlisted Graybiel, an expert in the striatum, to help study the behavioral effects of replacing Foxp2. They found that the mice with humanized Foxp2 were better at learning to run a T-shaped maze, in which the mice must decide whether to turn left or right at a T-shaped junction, based on the texture of the maze floor, to earn a food reward.

The first phase of this type of learning requires using declarative memory, or memory for events and places. Over time, these memory cues become embedded as habits and are encoded through procedural memory — the type of memory necessary for routine tasks, such as driving to work every day or hitting a tennis forehand after thousands of practice strokes.

Using another type of maze called a cross-maze, Schreiweis and her MIT colleagues were able to test the mice’s ability in each type of memory alone, as well as the interaction of the two types. They found that the mice with humanized Foxp2 performed the same as normal mice when just one type of memory was needed, but their performance was superior when the learning task required them to convert declarative memories into habitual routines. The key finding was therefore that the humanized Foxp2 gene makes it easier to turn mindful actions into behavioral routines.

The protein produced by Foxp2 is a transcription factor, meaning that it turns other genes on and off. In this study, the researchers found that Foxp2 appears to turn on genes involved in the regulation of synaptic connections between neurons. They also found enhanced dopamine activity in a part of the striatum that is involved in forming procedures. In addition, the neurons of some striatal regions could be turned off for longer periods in response to prolonged activation — a phenomenon known as long-term depression, which is necessary for learning new tasks and forming memories.

Together, these changes help to “tune” the brain differently to adapt it to speech and language acquisition, the researchers believe. They are now further investigating how Foxp2 may interact with other genes to produce its effects on learning and language.

This study “provides new ways to think about the evolution of Foxp2 function in the brain,” says Genevieve Konopka, an assistant professor of neuroscience at the University of Texas Southwestern Medical Center who was not involved in the research. “It suggests that human Foxp2 facilitates learning that has been conducive for the emergence of speech and language in humans. The observed differences in dopamine levels and long-term depression in a region-specific manner are also striking and begin to provide mechanistic details of how the molecular evolution of one gene might lead to alterations in behavior.”

Filed under Foxp2 gene mutation language language acquisition speech learning neuroscience science

64 notes

Study First to Use Brain Scans to Forecast Early Reading Difficulties

UC San Francisco researchers have used brain scans to predict how young children learn to read, giving clinicians a possible tool to spot children with dyslexia and other reading difficulties before they experience reading challenges.


In the United States, children usually learn to read for the first time in kindergarten and become proficient readers by third grade, according to the authors. In the study, researchers examined brain scans of 38 kindergarteners as they were learning to read formally at school and tracked their white matter development until third grade. The brain’s white matter is essential for perceiving, thinking and learning.

The researchers found that the developmental course of the children’s white matter volume predicted the kindergarteners’ abilities to read.

“We show that white matter development during a critical period in a child’s life, when they start school and learn to read for the very first time, predicts how well the child ends up reading,” said Fumiko Hoeft, MD, PhD, senior author and an associate professor of child and adolescent psychiatry at UCSF, and member of the UCSF Dyslexia Center.

The research is published online in Psychological Science.

Doctors commonly use behavioral measures of reading readiness to assess ability. Other common early risk factors include cognitive ability (i.e., IQ), early linguistic skills, environmental measures such as socio-economic status, and whether a family member has reading problems or dyslexia.

“What was intriguing in this study was that brain development in regions important to reading predicted above and beyond all of these measures,” said Hoeft.

The researchers removed the effects of these commonly used assessments in their statistical analyses in order to assess how white matter directly predicted future reading ability. They found that left hemisphere white matter in the temporo-parietal region just behind and above the left ear — thought to be important for language, reading and speech — was highly predictive of reading acquisition beyond the effects of genetic predisposition, cognitive abilities, and environment at the outset of kindergarten. Adding brain scans made prediction of reading difficulties 60 percent more accurate than traditional assessments alone.
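"Removing the effects" of baseline measures is typically done hierarchically: fit the behavioral and environmental covariates first, then test how much additional variance the brain measure explains. A schematic of that step with synthetic data (all variable names and coefficients are illustrative, not the study's):

```python
import numpy as np
from numpy.linalg import lstsq

rng = np.random.default_rng(1)
n = 38  # same order of magnitude as the study's sample
covariates = rng.normal(size=(n, 3))  # stand-ins for IQ, language, SES
wm_slope = rng.normal(size=n)         # stand-in white matter growth rate
# Outcome built so covariates matter AND the brain measure adds signal.
reading = (covariates @ np.array([0.5, 0.3, 0.2])
           + 0.8 * wm_slope
           + 0.3 * rng.normal(size=n))

def r_squared(X, y):
    """R^2 of an ordinary least squares fit of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

base = r_squared(covariates, reading)
full = r_squared(np.column_stack([covariates, wm_slope]), reading)
increment = full - base  # variance explained beyond the baseline measures
```

The increment is the quantity of interest: a brain measure is only useful as an early marker if it predicts outcome "above and beyond" the cheaper behavioral assessments, which is the comparison the study reports.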

“Early identification and interventions are extremely important in children with dyslexia as well as most neurodevelopmental disorders,” said Hoeft. “Accumulation of research evidence such as ours may one day help us identify kids who might be at risk for dyslexia, rather than waiting for children to become poor readers and experience failure.”

According to the National Institute of Child Health and Human Development, as many as 15 percent of Americans have major trouble reading.

“Examining developmental changes in the brain over a critical period of reading appears to be a unique sensitive measure of variation and may add insight to our understanding of reading development in ways that brain data from one time point, and behavioral and environmental measures, cannot,” said Chelsea Myers, BS, lead author and lab manager in UCSF’s Laboratory for Educational NeuroScience. “The hope is that understanding each child’s neurocognitive profiles will help educators provide targeted and personalized education and intervention, particularly in those with special needs.”

(Source: ucsf.edu)

Filed under reading difficulties dyslexia white matter brain development language psychology neuroscience science

121 notes

(Image caption: These figures show lagged maturation of connections in ADHD between the default mode network, involved in internally-directed thought (i.e., daydreaming) and shown on the left of each figure, and two brain networks involved in externally-focused attention, shown on the right of each figure. The width of each arc represents the number of lagged connections between two regions within each network. Connections that normally increase with age and that are hypoconnected in ADHD are shown in blue; connections that normally decrease with age and that are hyperconnected in ADHD are shown in red.)

Slow to mature, quick to distract: ADHD brain study finds slower development of key connections

A peek inside the brains of more than 750 children and teens reveals a key difference in brain architecture between those with attention deficit hyperactivity disorder and those without.

Kids and teens with ADHD, a new study finds, lag behind others of the same age in how quickly their brains form connections within, and between, key brain networks.

The result: less-mature connections between a brain network that controls internally-directed thought (such as daydreaming) and networks that allow a person to focus on externally-directed tasks. That lag in connection development may help explain why people with ADHD get easily distracted or struggle to stay focused.

What’s more, the new findings, and the methods used to make them, may one day allow doctors to use brain scans to diagnose ADHD — and track how well someone responds to treatment. This kind of neuroimaging “biomarker” doesn’t yet exist for ADHD, or any psychiatric condition for that matter.

The new findings come from a team in the University of Michigan Medical School’s Department of Psychiatry. They used highly advanced computing techniques to analyze a large pool of detailed brain scans that were publicly shared for scientists to study. Their results are published in the Proceedings of the National Academy of Sciences.

Lead author Chandra Sripada, M.D., Ph.D., and colleagues looked at the brain scans of 275 kids and teens with ADHD, and 481 others without it, using “connectomic” methods that can map interconnectivity between networks in the brain.

The scans, made using functional magnetic resonance imaging (fMRI) scanners, show brain activity during a resting state. This allowed researchers to see how a number of different brain networks, each specialized for certain types of functions, were “talking” within and amongst themselves.

The researchers found lags in development of connection within the internally-focused network, called the default mode network or DMN, and in development of connections between DMN and two networks that process externally-focused tasks, often called task-positive networks, or TPNs. They could even see that the lags in connection development with the two task-related networks — the frontoparietal and ventral attention networks — were located primarily in two specific areas of the brain.

The new findings mesh well with what other researchers have found by examining the physical structure of the brains of people with and without ADHD in other ways.

Such research has already shown alterations in regions within DMN and TPNs. So, the new findings build on that understanding and add to it. 

The findings are also relevant to thinking about the longitudinal course of ADHD from childhood to adulthood. For instance, some children and teens “grow out” of the disorder, while for others the disorder persists throughout adulthood. Future studies of brain network maturation in ADHD could shed light into the neural basis for this difference.

“We and others are interested in understanding the neural mechanisms of ADHD in hopes that we can contribute to better diagnosis and treatment,” says Sripada, an assistant professor and psychiatrist who holds a joint appointment in the U-M Philosophy department and is a member of the U-M Center for Computational Medicine and Bioinformatics. “But without the database of fMRI images, and the spirit of collaboration that allowed them to be compiled and shared, we would never have reached this point.”  

Sripada explains that in the last decade, functional medical imaging has revealed that the human brain is functionally organized into large-scale connectivity networks. These networks, and the connections between them, mature throughout early childhood all the way to young adulthood. “It is particularly noteworthy that the networks we found to have lagging maturation in ADHD are linked to the very behaviors that are the symptoms of ADHD,” he says. 

Studying the vast array of connections in the brain, a field called connectomics, requires scientists to be able to parse through not just the one-to-one communications between two specific brain regions, but the patterns of communication among thousands of nodes within the brain. This requires major computing power and access to massive amounts of data – which makes the open sharing of fMRI images so important.
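The core computation behind such connectivity maps can be sketched in a few lines. The example below is purely illustrative, with made-up time series standing in for fMRI signals: functional connectivity between two regions is commonly estimated as the Pearson correlation of their activity over time, and anticorrelation between the DMN and task-positive regions is one of the patterns such analyses look for.

```python
from math import sqrt

def pearson(x, y):
    # Pearson correlation between two equal-length time series.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical resting-state signals: two default mode network
# regions and one task-positive region (invented numbers).
dmn_a = [0.1, 0.5, 0.3, 0.9, 0.2, 0.7]
dmn_b = [0.2, 0.6, 0.2, 0.8, 0.3, 0.6]   # tracks dmn_a closely
tpn   = [0.9, 0.2, 0.8, 0.1, 0.7, 0.3]   # moves opposite to the DMN

within_dmn = pearson(dmn_a, dmn_b)  # strong positive connectivity
dmn_vs_tpn = pearson(dmn_a, tpn)    # negative (anticorrelated)
```

Real connectomic pipelines apply this idea across thousands of nodes at once, which is what drives the computing and data-sharing demands described above.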

“The results of this study set the stage for the next phase of this research, which is to examine individual components of the networks that have the maturational lag,” he says. “This study provides a coarse-grained understanding, and now we want to examine this phenomenon in a more fine-grained way that might lead us to a true biological marker, or neuromarker, for ADHD.”

Sripada also notes that connectomics could be used to examine other disorders with roots in brain connectivity – including autism, which some evidence has suggested stems from over-maturation of some brain networks, and schizophrenia, which may arise from abnormal connections. Pooling more fMRI data from people with these conditions, and depression, anxiety, bipolar disorder and more could boost connectomics studies in those fields.

Filed under ADHD default mode network connectomics fMRI brain activity neuroscience science

209 notes

EEG Study Findings Reveal How Fear is Processed in the Brain

An estimated 8% of Americans will suffer from post-traumatic stress disorder (PTSD) at some point during their lifetime. Brought on by an overwhelming or stressful event or events, PTSD is the result of altered chemistry and physiology of the brain. Understanding how threat is processed in a normal brain versus one altered by PTSD is essential to developing effective interventions.

New research from the Center for BrainHealth at The University of Texas at Dallas published online today in Brain and Cognition illustrates how fear arises in the brain when individuals are exposed to threatening images. This novel study is the first to separate emotion from threat by controlling for the dimension of arousal (the emotional reaction provoked, whether positive or negative, in response to stimuli). Building on previous animal and human research, the study identifies an electrophysiological marker for threat in the brain.

“We are trying to find where thought exists in the mind,” explained John Hart, Jr., M.D., Medical Science Director at the Center for BrainHealth. “We know that groups of neurons firing on and off create a frequency and pattern that tell other areas of the brain what to do. By identifying these rhythms, we can correlate them with a cognitive unit such as fear.”

Utilizing electroencephalography (EEG), Dr. Hart’s research team identified theta and beta wave activity that signifies the brain’s reaction to visually threatening images.

“We have known for a long time that the brain prioritizes threatening information over other cognitive processes,” explained Bambi DeLaRosa, study lead author. “These findings show us how this happens. Theta wave activity starts in the back of the brain, in its fear center – the amygdala – and then interacts with the brain’s memory center – the hippocampus – before traveling to the frontal lobe where thought processing areas are engaged. At the same time, beta wave activity indicates that the motor cortex is revving up in case the feet need to move to avoid the perceived threat.”

For the study, 26 adults (19 female, 7 male), ages 19-30, were shown 224 randomized images that were either unidentifiably scrambled or real pictures. Real pictures were separated into two categories: threatening (weapons, combat, nature or animals) and non-threatening (pleasant situations, food, nature or animals).

While wearing an EEG cap, participants were asked to push a button with their right index finger for real items and another button with their right middle finger for nonreal/scrambled items. Shorter response times were recorded for scrambled images than for real images. There was no difference in reaction time for threatening versus non-threatening images.

EEG results revealed that threatening images evoked an early increase in theta activity in the occipital lobe (the area of the brain where visual information is processed), followed by a later increase in theta power in the frontal lobe (where higher mental functions such as thinking, decision-making, and planning occur). A left-lateralized desynchronization of the beta band, the wave pattern associated with motor behavior (like the impulse to run), also consistently appeared in the threatening condition.

This study will serve as a foundation for future work that will explore normal versus abnormal fear associated with an object in other atypical populations, including individuals with PTSD.
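For readers curious how "theta power" and "beta power" are quantified, the sketch below estimates power in each frequency band from a sampled signal using a naive discrete Fourier transform. The signal, sampling rate, and band edges are hypothetical and chosen for illustration; real EEG pipelines use windowed, artifact-corrected data and optimized FFT routines.

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    # Power in the [f_lo, f_hi] Hz band via a naive discrete Fourier transform.
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n
    return power

fs = 128  # hypothetical sampling rate in Hz
t = [i / fs for i in range(fs)]  # one second of samples
# Synthetic trace: a strong 6 Hz (theta) component plus a weak 20 Hz (beta) one.
eeg = [math.sin(2 * math.pi * 6 * x) + 0.2 * math.sin(2 * math.pi * 20 * x) for x in t]

theta = band_power(eeg, fs, 4, 8)    # theta band: roughly 4-8 Hz
beta = band_power(eeg, fs, 13, 30)   # beta band: roughly 13-30 Hz
```

In this toy trace the theta band dominates, mirroring the kind of band-power contrast the study measured across scalp electrodes.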


Filed under fear PTSD emotions EEG brainwaves amygdala motor cortex hippocampus neuroscience science

153 notes

Brain Development in Schizophrenia Strays from the Normal Path

Schizophrenia is generally considered to be a disorder of brain development and it shares many risk factors, both genetic and environmental, with other neurodevelopmental disorders such as autism and intellectual disability.

The normal path for brain development is determined by the combined effects of a complex network of genes and a wide range of environmental factors.

However, longitudinal brain imaging studies in both healthy and patient populations are required in order to map the disturbances in brain structures as they emerge, i.e., the disturbed trajectories of brain development.

A new study by an international, collaborative group of researchers has measured neurodevelopment in schizophrenia, by studying brain development during childhood and adolescence in people with and without this disorder. With access to new statistical approaches and long-term follow-up with participants, in some cases over more than a decade, the researchers were able to describe brain development patterns associated with schizophrenia.

"Specifically, this paper shows that parts of the brain’s cortex develop differently in people with schizophrenia," said first author Dr. Aaron F. Alexander-Bloch, from the National Institute of Mental Health.

"The mapping of the path that the brain follows in deviating from normal development provides important clues to the underlying causes of the disorder," said Dr. John Krystal, Editor of Biological Psychiatry.

The findings were derived by investigating the trajectory of cortical thickness growth curves in 106 patients with childhood-onset schizophrenia and a comparison group of 102 healthy volunteers.

Each participant, ranging from 7–32 years of age, had repeated imaging scans over the course of several years. Then, using over 80,000 vertices across the cortex, the researchers modeled the effect of schizophrenia on the growth curve of cortical thickness.

This revealed differences that occur within a specific group of highly-connected brain regions that mature in synchrony during typical development, but follow altered trajectories of growth in schizophrenia.

"These findings show a relationship between the hypothesis that schizophrenia is a neurodevelopmental disorder and the longstanding hypothesis – first articulated by the German anatomist Karl Wernicke in the late 19th century – that it is a disease of altered connectivity between regions of the brain," added Alexander-Bloch.

This theoretical consistency is important, as it allows researchers to better focus future studies of brain connectivity in schizophrenia, by targeting the brain regions known to be affected.
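As a toy illustration of the growth-curve idea, one can fit a straight line to thickness-versus-age measurements at a single cortical vertex for each group and compare the slopes. The numbers below are invented; the actual study fit far richer longitudinal models at over 80,000 vertices.

```python
def fit_line(ages, thickness):
    # Ordinary least-squares slope and intercept for thickness ~ age.
    n = len(ages)
    ma = sum(ages) / n
    mt = sum(thickness) / n
    slope = sum((a - ma) * (t - mt) for a, t in zip(ages, thickness)) / \
            sum((a - ma) ** 2 for a in ages)
    return slope, mt - slope * ma

# Hypothetical cortical thickness (mm) at one vertex, ages 8 to 20.
ages = [8, 10, 12, 14, 16, 18, 20]
healthy = [3.10, 3.02, 2.95, 2.88, 2.82, 2.77, 2.73]   # gradual normal thinning
patient = [3.10, 2.98, 2.85, 2.72, 2.60, 2.49, 2.39]   # steeper thinning

slope_h, _ = fit_line(ages, healthy)  # modest negative slope
slope_p, _ = fit_line(ages, patient)  # steeper negative slope
```

A group difference in slope at a vertex is the kind of "altered trajectory" the study mapped across the cortex.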


Filed under schizophrenia brain development neuroimaging cortical thickness neuroscience science

84 notes

(Image caption: Shown are fMRI scans across all subjects in the study. The yellow and red areas in Section A represent parts of the brain that are activated while subjects are forming “gist memories” of pictures viewed. Section B represents areas of increased activation, shown in yellow and red, as detailed memories are being formed. Credit: Image courtesy of Jagust Lab)

Researchers find neural compensation in people with Alzheimer’s-related protein

The human brain is capable of a neural workaround that compensates for the buildup of beta-amyloid, a destructive protein associated with Alzheimer’s disease, according to a new study led by UC Berkeley researchers.

The findings, published today (Sunday, Sept. 14) in the journal Nature Neuroscience, could help explain how some older adults with beta-amyloid deposits in their brain retain normal cognitive function while others develop dementia.

“This study provides evidence that there is plasticity or compensation ability in the aging brain that appears to be beneficial, even in the face of beta-amyloid accumulation,” said study principal investigator Dr. William Jagust, a professor with joint appointments at UC Berkeley’s Helen Wills Neuroscience Institute, the School of Public Health and Lawrence Berkeley National Laboratory.

Previous studies have shown a link between increased brain activity and beta-amyloid deposits, but it was unclear whether the activity was tied to better mental performance.

The study included 22 healthy young adults and 49 older adults who had no signs of mental decline. Brain scans showed that 16 of the older subjects had beta-amyloid deposits, while the remaining 55 adults did not.

The researchers used functional magnetic resonance imaging (fMRI) to track the brain activity of subjects in the process of memorizing pictures of various scenes. Afterwards, the researchers tested the subjects’ “gist memory” by asking them to confirm whether a written description of a scene – such as a boy doing a handstand – corresponded to a picture previously viewed. Subjects were then asked to confirm whether specific written details of a scene – such as the color of the boy’s shirt – were true.

“Generally, the groups performed equally well in the tasks, but it turned out that for people with beta-amyloid deposits in the brain, the more detailed and complex their memory, the more brain activity there was,” said Jagust. “It seems that their brain has found a way to compensate for the presence of the proteins associated with Alzheimer’s.”

What remains unclear, said Jagust, is why some people with beta-amyloid deposits are better at using different parts of their brain than others. Previous studies suggest that people who engage in mentally stimulating activities throughout their lives have lower levels of beta-amyloid.

“I think it’s very possible that people who spend a lifetime involved in cognitively stimulating activity have brains that are better able to adapt to potential damage,” said Jagust.


Filed under beta amyloid brain activity cognitive function dementia alzheimer's disease neuroscience science

625 notes

Schizophrenia not a single disease but multiple genetically distinct disorders

New research shows that schizophrenia isn’t a single disease but a group of eight genetically distinct disorders, each with its own set of symptoms. The finding could be a first step toward improved diagnosis and treatment for the debilitating psychiatric illness.

The research at Washington University School of Medicine in St. Louis is reported online Sept. 15 in The American Journal of Psychiatry.

About 80 percent of the risk for schizophrenia is known to be inherited, but scientists have struggled to identify specific genes for the condition. Now, in a novel approach analyzing genetic influences on more than 4,000 people with schizophrenia, the research team has identified distinct gene clusters that contribute to eight different classes of schizophrenia.

“Genes don’t operate by themselves,” said C. Robert Cloninger, MD, PhD, one of the study’s senior investigators. “They function in concert much like an orchestra, and to understand how they’re working, you have to know not just who the members of the orchestra are but how they interact.”

Cloninger, the Wallace Renard Professor of Psychiatry and Genetics, and his colleagues matched precise DNA variations in people with and without schizophrenia to symptoms in individual patients. In all, the researchers analyzed nearly 700,000 sites within the genome where a single unit of DNA is changed, often referred to as a single nucleotide polymorphism (SNP). They looked at SNPs in 4,200 people with schizophrenia and 3,800 healthy controls, learning how individual genetic variations interacted with each other to produce the illness.

In some patients with hallucinations or delusions, for example, the researchers matched distinct genetic features to patients’ symptoms, demonstrating that specific genetic variations interacted to create a 95 percent certainty of schizophrenia. In another group, they found that disorganized speech and behavior were specifically associated with a set of DNA variations that carried a 100 percent risk of schizophrenia.

“What we’ve done here, after a decade of frustration in the field of psychiatric genetics, is identify the way genes interact with each other, how the ‘orchestra’ is either harmonious and leads to health, or disorganized in ways that lead to distinct classes of schizophrenia,” Cloninger said. 

Although individual genes have only weak and inconsistent associations with schizophrenia, groups of interacting gene clusters create an extremely high and consistent risk of illness, on the order of 70 to 100 percent. That makes it almost impossible for people with those genetic variations to avoid the condition. In all, the researchers identified 42 clusters of genetic variations that dramatically increased the risk of schizophrenia.
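The contrast between weak single-SNP associations and strong interacting clusters can be caricatured in a few lines. The SNP names, genotypes, and outcomes below are entirely made up for illustration; the actual analysis clustered roughly 700,000 SNPs with far more sophisticated statistics.

```python
# Hypothetical cohort: each person is (set of risk SNPs carried, has_schizophrenia).
people = [
    ({"rs1", "rs2", "rs3"}, True),
    ({"rs1", "rs2", "rs3"}, True),
    ({"rs1", "rs2"}, False),
    ({"rs2", "rs3"}, False),
    ({"rs1"}, False),
    ({"rs1", "rs2", "rs3"}, True),
]

def cluster_risk(cluster, people):
    # Share of carriers of the complete cluster who are affected.
    carriers = [ill for snps, ill in people if cluster <= snps]
    return sum(carriers) / len(carriers) if carriers else 0.0

risk_full = cluster_risk({"rs1", "rs2", "rs3"}, people)  # full interacting cluster
risk_single = cluster_risk({"rs1"}, people)              # one SNP alone
```

In this invented cohort, carrying any single SNP predicts illness only weakly, while carrying the complete cluster predicts it strongly, which is the pattern the researchers report at much larger scale.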

“In the past, scientists had been looking for associations between individual genes and schizophrenia,” explained Dragan Svrakic, PhD, MD, a co-investigator and a professor of psychiatry at Washington University. “When one study would identify an association, no one else could replicate it. What was missing was the idea that these genes don’t act independently. They work in concert to disrupt the brain’s structure and function, and that results in the illness.”

Svrakic said it was only when the research team was able to organize the genetic variations and the patients’ symptoms into groups that they could see that particular clusters of DNA variations acted together to cause specific types of symptoms.

Then they divided patients according to the type and severity of their symptoms, such as different types of hallucinations or delusions, and other symptoms, such as lack of initiative, problems organizing thoughts or a lack of connection between emotions and thoughts. The results indicated that those symptom profiles describe eight qualitatively distinct disorders based on underlying genetic conditions.

The investigators also replicated their findings in two additional DNA databases of people with schizophrenia, an indicator that identifying the gene variations that are working together is a valid avenue to explore for improving diagnosis and treatment.

By identifying groups of genetic variations and matching them to symptoms in individual patients, it soon may be possible to target treatments to specific pathways that cause problems, according to co-investigator Igor Zwir, PhD, research associate in psychiatry at Washington University and associate professor in the Department of Computer Science and Artificial Intelligence at the University of Granada, Spain.

And Cloninger added it may be possible to use the same approach to better understand how genes work together to cause other common but complex disorders.

“People have been looking at genes to get a better handle on heart disease, hypertension and diabetes, and it’s been a real disappointment,” he said. “Most of the variability in the severity of disease has not been explained, but we were able to find that different sets of genetic variations were leading to distinct clinical syndromes. So I think this really could change the way people approach understanding the causes of complex diseases.”

(Source: news.wustl.edu)

Filed under schizophrenia mental illness genes genetic variations genetics genomics neuroscience science

247 notes

Sometimes, adolescents just can’t resist

Don’t get mad the next time you catch your teenager texting when he promised to be studying.

He simply may not be able to resist.

A University of Iowa study found teenagers are far more sensitive than adults to the immediate effect or reward of their behaviors. The findings may help explain, for example, why the initial rush of texting may be more enticing for adolescents than the long-term payoff of studying.

“The rewards have a strong, perceptional draw and are more enticing to the teenager,” says Jatin Vaidya, a professor of psychiatry at the UI and corresponding author of the study, which appeared online this week in the journal Psychological Science. “Even when a behavior is no longer in a teenager’s best interest to continue, they will because the effect of the reward is still there and lasts much longer in adolescents than in adults.”

For parents, that means limiting distractions so teenagers can make better choices. Take the homework and social media dilemma: At 9 p.m., shut off everything except a computer that has no access to Facebook or Twitter, the researchers advise.

“I’m not saying they shouldn’t be allowed access to technology,” Vaidya says. “But they need help in regulating their attention so they can develop those impulse-control skills.”

In their study, “Value-Driven Attentional Capture in Adolescence,” Vaidya and co-authors Shaun Vecera, a professor of psychology, and Zachary Roper, a graduate student in psychology, note researchers generally believe teenagers are impulsive, make bad decisions, and engage in risky behavior because the frontal lobes of their brains are not fully developed.

But the UI researchers wondered whether something more fundamental was going on with adolescents to trigger behaviors independent of higher-level reasoning.

“We wanted to try to understand the brain’s reward system and how it changes from childhood to adulthood,” says Vaidya, who adds the reward trait in the human brain is much more primitive than decision-making. “We’ve been trying to understand the reward process in adolescence and whether there is more to adolescent behavior than an under-developed frontal lobe,” he adds.

For their study, the researchers recruited 40 adolescents, ages 13 to 16, and 40 adults, ages 20 to 35. First, participants were asked to find a red or green ring hidden within an array of rings on a computer screen. Once identified, they reported whether the white line inside the ring was vertical or horizontal. If they were right, they received a reward between 2 and 10 cents, depending on the color. For some participants, the red ring paid the highest reward; for others, it was the green. None was told which color would pay the most.

After 240 trials, the participants were asked whether they noticed anything about the colors. Most made no association between a color and reward, which researchers say proves the ring exercise didn’t involve high-level decision-making.

In the next stage, participants showed they had developed an intuitive association when they were asked to find a diamond-shaped target. This time, the red and green rings were used as decoys.

At first, the adolescents and adults selected the color ring that garnered them the highest monetary reward, the goal of the first task. But in short order, the adults adjusted and selected the diamond. The adolescents did not.

Even after 240 trials, the adolescents were still more apt to pick the colored rings.

“Even though you’ve told them, ‘You have a new target,’ the adolescents can’t get rid of the association they learned before,” Vecera says. “It’s as if that association is much more potent for the adolescent than for the adult.

“If you give the adolescent a reward, it will persist longer,” he adds. “The fact that the reward is gone doesn’t matter. They will act as if the reward is still there.”

Researchers say that inability to readily adjust behavior explains why, for example, a teenager may continue to make inappropriate comments in class long after friends stopped laughing.

In the future, researchers hope to delve into the psychological and neurological aspects of their results.

“Are there certain brain regions or circuits that continue to develop from adolescence to adulthood that play a role in directing attention away from reward stimuli that are not task relevant?” Vaidya asks. “Also, what sort of life experiences and skills help to improve performance on this task?”
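One simple way to picture the finding is as a difference in how quickly a learned reward association fades once the reward stops. The decay rates below are invented purely to illustrate the contrast the researchers describe, not parameters from the study.

```python
def residual_value(initial, decay, extinction_trials):
    # Strength of a reward association after a run of unrewarded
    # trials, assuming a fixed proportional decay per trial.
    value = initial
    for _ in range(extinction_trials):
        value *= (1 - decay)
    return value

adult_decay = 0.05       # hypothetical: adults unlearn the association quickly
adolescent_decay = 0.01  # hypothetical: the association persists much longer

v_adult = residual_value(1.0, adult_decay, 240)
v_teen = residual_value(1.0, adolescent_decay, 240)
```

After 240 unrewarded trials the adult's residual association is essentially gone while the adolescent's remains meaningfully above zero, matching the behavioral pattern reported above.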

Sometimes, adolescents just can’t resist

Don’t get mad the next time you catch your teenager texting when he promised to be studying.

He simply may not be able to resist.

A University of Iowa study found teenagers are far more sensitive than adults to the immediate effect or reward of their behaviors. The findings may help explain, for example, why the initial rush of texting may be more enticing for adolescents than the long-term payoff of studying.

“The rewards have a strong, perceptional draw and are more enticing to the teenager,” says Jatin Vaidya, a professor of psychiatry at the UI and corresponding author of the study, which appeared online this week in the journal Psychological Science. “Even when a behavior is no longer in a teenager’s best interest to continue, they will because the effect of the reward is still there and lasts much longer in adolescents than in adults.”

For parents, that means limiting distractions so teenagers can make better choices. Take the homework and social media dilemma: At 9 p.m., shut off everything except a computer that has no access to Facebook or Twitter, the researchers advise.

“I’m not saying they shouldn’t be allowed access to technology,” Vaidya says. “But they need help in regulating their attention so they can develop those impulse-control skills.”

In their study, “Value-Driven Attentional Capture in Adolescence,” Vaidya and co-authors Shaun Vecera, a professor of psychology, and Zachary Roper, a graduate student in psychology, note researchers generally believe teenagers are impulsive, make bad decisions, and engage in risky behavior because the frontal lobes of their brains are not fully developed.

But the UI researchers wondered whether something more fundamental was going on with adolescents to trigger behaviors independent of higher-level reasoning.

“We wanted to try to understand the brain’s reward system and how it changes from childhood to adulthood,” says Vaidya, who adds the reward trait in the human brain is much more primitive than decision-making. “We’ve been trying to understand the reward process in adolescence and whether there is more to adolescent behavior than an under-developed frontal lobe,” he adds.

For their study, the researchers recruited 40 adolescents, ages 13 to 16, and 40 adults, ages 20 to 35. First, participants were asked to find a red or green ring hidden within an array of rings on a computer screen. Once they found it, they reported whether the white line inside the ring was vertical or horizontal. If they were right, they received a reward of between 2 and 10 cents, depending on the color. For some participants, the red ring paid the highest reward; for others, it was the green. No one was told which color would pay the most.
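The training phase described above can be sketched as a small simulation. This is an illustrative reconstruction, not the study's actual code: the function name, the exact payoff values, and the accuracy parameter are assumptions beyond what the article states (the article specifies only 240 trials, rewards between 2 and 10 cents, and one color paying more).

```python
import random

# Hypothetical sketch of the training-phase trial structure: participants
# search for a red or green ring and earn 2-10 cents on correct trials,
# with one color (unknown to them) paying the higher amount.
HIGH_REWARD = 10  # cents for the high-value color (assumed endpoint of "2 to 10 cents")
LOW_REWARD = 2    # cents for the low-value color (assumed endpoint)

def run_training(high_color="red", n_trials=240, p_correct=1.0, seed=0):
    """Simulate total earnings (in cents) over the training trials."""
    rng = random.Random(seed)
    earnings = 0
    for _ in range(n_trials):
        target_color = rng.choice(["red", "green"])  # target color varies by trial
        if rng.random() < p_correct:                 # correct orientation report
            earnings += HIGH_REWARD if target_color == high_color else LOW_REWARD
    return earnings

total = run_training()
```

With perfect accuracy, total earnings over 240 trials necessarily fall between 240 × 2 and 240 × 10 cents; the key point of the paradigm is that the color-reward pairing is learned implicitly, without participants being told which color pays more.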

After 240 trials, the participants were asked whether they noticed anything about the colors. Most made no association between a color and reward, which the researchers say shows the ring exercise didn’t involve high-level decision-making.

In the next stage, participants showed they had developed an intuitive association when they were asked to find a diamond-shaped target. This time, the red and green rings were used as decoys.

At first, both the adolescents and the adults selected the colored ring that had garnered them the highest monetary reward, the goal of the first task. But in short order, the adults adjusted and selected the diamond. The adolescents did not.

Even after 240 trials, the adolescents were still more apt to pick the colored rings.

“Even though you’ve told them, ‘You have a new target,’ the adolescents can’t get rid of the association they learned before,” Vecera says. “It’s as if that association is much more potent for the adolescent than for the adult.

“If you give the adolescent a reward, it will persist longer,” he adds. “The fact that the reward is gone doesn’t matter. They will act as if the reward is still there.”

The researchers say that inability to readily adjust behavior explains why, for example, a teenager may continue to make inappropriate comments in class long after friends have stopped laughing.

In the future, researchers hope to delve into the psychological and neurological aspects of their results.

“Are there certain brain regions or circuits that continue to develop from adolescence to adulthood that play a role in directing attention away from reward stimuli that are not task relevant?” Vaidya asks. “Also, what sort of life experiences and skills help to improve performance on this task?”

