Posts tagged prefrontal cortex

Brain scans link concern for justice with reason, not emotion
People who care about justice are swayed more by reason than emotion, according to new brain scan research from the Department of Psychology and Center for Cognitive and Social Neuroscience.
Psychologists have found that some individuals react more strongly than others to situations that invoke a sense of justice—for example, seeing a person being treated unfairly or mercifully. The new study used brain scans to analyze the thought processes of people with high “justice sensitivity.”
“We were interested to examine how individual differences about justice and fairness are represented in the brain to better understand the contribution of emotion and cognition in moral judgment,” explained lead author Jean Decety, the Irving B. Harris Professor of Psychology and Psychiatry.
Using a functional magnetic resonance imaging (fMRI) brain-scanning device, the team studied what happened in the participants’ brains as they judged videos depicting behavior that was morally good or bad. For example, they saw a person put money in a beggar’s cup or kick the beggar’s cup away. The participants were asked to rate on a scale how much they would blame or praise the actor seen in the video. People in the study also completed questionnaires that assessed cognitive and emotional empathy, as well as their justice sensitivity.
As expected, study participants who scored high on the justice sensitivity questionnaire assigned significantly more blame when they were evaluating scenes of harm, Decety said. They also registered more praise for scenes showing a person helping another individual.
But the brain imaging also yielded surprises. During the behavior-evaluation exercise, people with high justice sensitivity showed more activity than average participants in parts of the brain associated with higher-order cognition. Brain areas commonly linked with emotional processing were not affected.
The conclusion was clear, Decety said: “Individuals who are sensitive to justice and fairness do not seem to be emotionally driven. Rather, they are cognitively driven.”
According to Decety, one implication is that the search for justice and the moral missions of human rights organizations and others do not come primarily from sentimental motivations, as they are often portrayed. Instead, that drive may have more to do with sophisticated analysis and mental calculation.
Decety adds that evaluating good actions elicited relatively high activity in the region of the brain involved in decision-making, motivation and rewards. This finding suggests that perhaps individuals make judgments about behavior based on how they process the reward value of good actions as compared to bad actions.
“Our results provide some of the first evidence for the role of justice sensitivity in enhancing neural processing of moral information in specific components of the brain network involved in moral judgment,” Decety said.
Childhood’s end: ADHD, autism and schizophrenia tied to stronger inhibitory interactions in adolescent prefrontal cortex
Key cognitive functions such as working memory (which combines temporary storage and manipulation of information) and executive function (a set of mental processes that helps connect past experience with present action) are associated with the brain’s prefrontal cortex. Unlike other brain regions, the prefrontal cortex does not mature until early adulthood, with the most pronounced changes being seen between its peripubertal (onset of puberty) and postpubertal developmental stages. Moreover, this maturation period is correlated with cognitive maturation – but the physical neuronal changes during this transition have remained for the most part unknown. Recently, however, scientists at the Wake Forest School of Medicine in Winston-Salem, NC recorded and compared prefrontal cortical activity in peripubertal and adult monkeys.
The researchers found that compared with adults, peripubertal monkeys showed lower connectivity due to stronger inhibitory interactions, suggesting that intrinsic (or resting state) inhibitory connections – that is, inhibitory neural connections that are active in the absence of any particular task – decline with maturation. The scientists then concluded that prefrontal intrinsic connectivity changes are a possible substrate for cognitive maturation.
Prof. Christos Constantinidis discusses the paper that he, Dr. Xin Zhou and their co-authors published in Proceedings of the National Academy of Sciences. The team compared the functional connectivity between pairs of neurons recorded from the prefrontal cortex of peripubertal and adult monkeys, evaluating the developmental stage of the peripubertal rhesus monkeys with a series of morphometric, hormonal, and radiographic measures. Constantinidis tells Medical Xpress that a major challenge was obtaining neural activity from the brains of monkeys around the time of puberty. “We needed to make ourselves experts in the developmental trajectories of monkeys and conduct experiments just at the right time relative to the onset of puberty,” he explains.
It may seem normal: As we age, we misplace car keys, or can’t remember a name we just learned or a meal we just ordered. But University of Florida researchers say memory trouble doesn’t have to be inevitable, and they’ve found a drug therapy that could potentially reverse this type of memory decline.
The drug can’t yet be used in humans, but the researchers are pursuing compounds that could someday help the population of aging adults who don’t have Alzheimer’s or other dementias but still have trouble remembering day-to-day items. Their findings will be published in today’s (March 5) issue of the Journal of Neuroscience.
The kind of memory responsible for holding information in the mind for short periods of time is called “working memory.” Working memory relies on a balance of chemicals in the brain. The UF study shows this chemical balance tips in older adults, and working memory declines. The reason? Their brains may be producing too much of a chemical that slows neural activity.
“Graduate student Cristina Banuelos’ work suggests that cells that normally provide the brake on neural activity are in overdrive in the aged prefrontal cortex,” said researcher Jennifer Bizon, Ph.D., an associate professor in the department of neuroscience and a member of UF’s Evelyn F. & William L. McKnight Brain Institute.
This chemical, an inhibitory brain neurotransmitter called GABA, is essential. Without it, brain cells can become too active, similar to what happens in the brains of people with schizophrenia and epilepsy. A normal level of GABA helps maintain the optimal levels of cell activation, said collaborator Barry Setlow, Ph.D., an associate professor in UF’s departments of psychiatry and neuroscience.
Working memory underlies many mental abilities and is sometimes referred to as the brain’s mental sketchpad, Bizon said. For example, Bizon said, you use your working memory in many everyday activities such as calculating your final bill at the end of dining at a restaurant. Most people can calculate a 15 percent tip and add it to the cost of their meal without pencil and paper. Central to this process is the ability to keep multiple pieces of information in mind for a short duration — such as remembering the cost of your dinner while calculating the amount needed for the tip.
“Almost all higher cognitive processes depend on this fundamental operation,” Bizon said.
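The restaurant example reduces to holding one quantity in mind while deriving a second from it. A minimal sketch of that arithmetic (the $40.00 meal cost is our own illustration, not a figure from the study), using integer cents to avoid floating-point rounding:

```python
# The tip calculation from Bizon's example: keep the meal cost "in mind"
# while deriving the tip from it, then combine the two quantities.
meal_cents = 4000                      # $40.00, held in working memory
tip_cents = meal_cents * 15 // 100     # 15 percent tip = $6.00
total_cents = meal_cents + tip_cents   # combine the two held values
assert total_cents == 4600             # $46.00
```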
To determine the culprit behind working memory decline, the researchers tested the memory of young and aged rats in a “Skinner box.” In the Skinner box, rats had to remember the location of a lever for short periods of up to 30 seconds. The scientists found that while both young and old rats could remember the location of the lever for brief periods of time, as those time periods lengthened, old rats had more difficulty remembering the location of the lever than young rats.
But not all older rats did poorly on the memory test, just as not all older adults have memory problems. The study suggests that the brains of some older individuals, rat or human, compensate for the overactive inhibitory system by producing fewer GABA receptors, and therefore binding less of the inhibitory chemical.
Older rats with memory problems had more GABA receptors. The drug the researchers tested blocked GABA receptors, mimicking the lower number of those receptors that some older rats had naturally and restoring working memory in aged rats to the level of younger rats.
“Modern medicine has done a terrific job of keeping us alive for longer, and now we have to keep up and determine how to maximize the quality of life for seniors,” Bizon said. “A key aspect of that is going to be developing strategies and therapies that can maintain and improve cognitive health.”
(Source: ufhealth.org)
Researchers report that one tiny variation in the sequence of a gene may cause some people to be more impaired by traumatic brain injury (TBI) than others with comparable wounds.
The study, described in the journal PLOS ONE, measured general intelligence in a group of 156 Vietnam War veterans who suffered penetrating head injuries during the war. All of the study subjects had damage to the prefrontal cortex, a brain region behind the forehead that is important to cognitive tasks such as planning, problem-solving, self-restraint and complex thought.
The researchers controlled for the size and location of subjects’ brain injuries and other factors, such as intelligence prior to injury, which might have contributed to differences in cognitive function. (Prior to combat, the veterans had completed the Armed Forces Qualifications Test, which included measures of intelligence that provided a baseline for the new analysis.)
“We administered a large, cognitive battery of tests to investigate how they performed after their injury,” said study leader Aron Barbey, a professor of speech and hearing science, of psychology and of neuroscience at the University of Illinois. “And we had a team of neurologists who helped characterize the nature and scope of the patients’ brain injuries.”
The researchers also collected blood for a genetic analysis, focusing on a gene known as BDNF (brain-derived neurotrophic factor).
The team found that a single polymorphism (a difference in one “letter” of the sequence) in the BDNF gene accounted for significant differences in intelligence among those with similar injuries and comparable intelligence before being injured.
“BDNF is a basic growth factor and it’s related to neurogenesis, the production of new neurons,” Barbey said. “What we found is that if people have a specific polymorphism in the BDNF gene, they recovered to a greater extent than those with a different variant of the gene.”
The change in the gene alters the BDNF protein: The amino acid methionine (Met) is incorporated at a specific site in the protein instead of valine (Val). Since people inherit two versions of each gene, one from each parent, they have either Val/Val, Val/Met or Met/Met variants of the gene.
“The effects of this difference were large – very large,” Barbey said. “If an individual had the Val/Val combination, then their performance on a battery of cognitive tests (conducted long after the injury occurred) was remarkably lower than that of individuals who had the Val/Met or Met/Met combination.”
On average, those with the Val/Val polymorphism scored about eight IQ points lower on tests of general intelligence than those with the Val/Met or Met/Met variants, Barbey said. Those with the Val/Val variant also were significantly more impaired in “specific competencies for intelligence like verbal comprehension, perceptual organization, working memory and processing speed,” he said.
To test these results, the researchers did the analysis over again “in a subset of individuals who had very similar (brain injuries) to the other group,” Barbey said. “We found the same kind of effects, suggesting that lesion location isn’t a factor influencing the difference between the groups.”
The finding opens a new avenue of exploration for treatments to aid the process of recovery from TBI, Barbey said.
(Source: news.illinois.edu)
Study reveals workings of working memory
Keep this in mind: Scientists say they’ve learned how your brain plucks information out of working memory when you decide to act.
Say you’re a busy mom trying to wrap up a work call now that you’ve arrived home. While you converse on your Bluetooth headset, one kid begs for an unspecified snack, another asks where his homework project has gone, and just then an urgent e-mail from your boss buzzes the phone in your purse. During the call’s last few minutes these urgent requests — snack, homework, boss — wait in your working memory. When you hang up, you’ll pick one and act.
When you do that, according to Brown University psychology researchers whose findings appear in the journal Neuron, you’ll employ brain circuitry that links a specific chunk of the striatum called the caudate and a chunk of the prefrontal cortex centered on the dorsal anterior premotor cortex. Selecting from working memory, it turns out, uses similar circuits to those involved in planning motion.
In lab experiments with 22 adult volunteers, the researchers used magnetic resonance imaging to track brain activity during a carefully designed working memory task. They also measured how quickly the subjects could choose from working memory — a phenomenon the scientists called “output gating.”
“In the immediacy of what we’re doing we have this small working memory capacity where we can hang on to a few things that are going to be useful in a few moments, and that’s where output gating is crucial,” said study senior author David Badre, professor of cognitive, linguistic, and psychological sciences at Brown.
From the perspective of cognition, said lead author and postdoctoral scholar Christopher Chatham, input gating — choosing what goes into working memory — and output gating allow people to maintain a course of action (e.g., finish that Bluetooth call) while being flexible enough to account for context in planning what’s next.
Of cognition and wingdings
In their experiments Badre, Chatham, and co-author Michael Frank, associate professor of cognitive, linguistic, and psychological sciences, provided their volunteers with four different versions of a similar working memory task. The versions distinguished output gating from input gating so that the anatomical action observed in the MRI could reliably associate with output gating behavior.
In each round, volunteers saw a sequence of characters — either letters of the alphabet or wingdings (typographical symbols like stars and snowflakes). Before or after the sequence, the volunteers were also given a context cue in the form of a numeral that told them which kind of character would be relevant at the end of the task (e.g., “1” might mean a wingding while “2” might mean a letter). The last step for volunteers was to select between groups of characters on the screen that included whichever contextually relevant character they had seen in the sequence (e.g., if the subject had seen a “1” and later a snowflake during the sequence, they should select the group that included a snowflake).
When the context numeral came first, say a “2,” volunteers would “input gate” only letters into their working memory. When it came time to make a selection, they’d simply “output gate” the correct letter from the letters in working memory. If the context cue came last, people had to input gate everything they saw into working memory, carrying a higher load of characters and making all the real thinking a matter of output gating. To address this disparity in load, the experimenters created two more conditions in which a global context indicator, “3,” required people to keep everything they saw in working memory whether it came before the sequence or after.
With this experimental design the researchers could measure performance and monitor brain activity with subjects who had distinct moments of input and output gating, regardless of the character load in working memory.
People accomplished the tasks with a range of speeds, which the researchers regarded as a proxy for the amount of cognitive work volunteers had to do. People were slowest in making a selection when they got the context cue last and then had to gate just one specific symbol out of memory (e.g., they saw the sequence, then saw a 1, and then had to choose the option with a wingding they had seen). People were fastest at making a selection when they were given the context first and then had to pick the one character of that kind that they saw (e.g., they saw a “2,” then the sequence in which only letters mattered, and then had to choose the option with a letter they had seen).
In analyzing the results, Chatham and his co-authors found that the caudate and the dorsal anterior premotor cortex contributed distinctly to the reaction times they saw. These separate roles in the partnership agree with computational models of how the brain works.
“The division of labor that’s specifically posited by these computational models is one in which there is basically a context being represented in the prefrontal cortex that determines the overall efficiency of going from stimulus to response – like a route,” Chatham said. “The striatum is involved in the actual gating of that flow of information,” he said, “like traffic lights along the route.”
So the cortex interprets the context, while the striatum implements the gating. When the context is unhelpfully general and the gating is very specific, for example, the task takes a lot of time.
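The division of labor can be illustrated with a toy sketch (our own simplification for illustration, not the Brown group's model). A context cue plays the role of the prefrontal “route,” and the gate functions play the role of the striatal “traffic lights” that let items into or out of working memory:

```python
# Toy sketch of input vs. output gating (illustrative only).
# Each stimulus is a (kind, symbol) pair; the context cue names a kind.

def input_gate(stimuli, context=None):
    """Admit stimuli into working memory; with no context yet, admit everything."""
    if context is None:
        return list(stimuli)                        # context comes last: store it all
    return [s for s in stimuli if s[0] == context]  # context first: filter on entry

def output_gate(memory, context):
    """Select only the contextually relevant items from working memory."""
    return [s for s in memory if s[0] == context]

sequence = [("letter", "A"), ("wingding", "snowflake"), ("letter", "B")]

# Context-first condition: only letters enter memory, so selection is easy.
memory = input_gate(sequence, context="letter")
assert memory == [("letter", "A"), ("letter", "B")]

# Context-last condition: everything enters memory (a higher load),
# and the real work shifts to output gating at selection time.
memory = input_gate(sequence)
selected = output_gate(memory, context="wingding")
assert selected == [("wingding", "snowflake")]
```

The context-last path carries three items instead of two and defers all filtering to the output gate, mirroring why subjects were slowest in that condition.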
The findings help advance studies of how cognition works in the brain and could help psychiatrists analyze behavior in people where those areas of the brain have been injured, the researchers said. They also highlight how similar brain circuits can execute different functions – motion and working memory gating.

Pinpointing the Brain’s Arbitrator
We tend to be creatures of habit. In fact, the human brain has a learning system that is devoted to guiding us through routine, or habitual, behaviors. At the same time, the brain has a separate goal-directed system for the actions we undertake only after careful consideration of the consequences. We switch between the two systems as needed. But how does the brain know which system to give control to at any given moment? Enter The Arbitrator.
Researchers at the California Institute of Technology (Caltech) have, for the first time, pinpointed areas of the brain—the inferior lateral prefrontal cortex and frontopolar cortex—that seem to serve as this “arbitrator” between the two decision-making systems, weighing the reliability of the predictions each makes and then allocating control accordingly. The results appear in the current issue of the journal Neuron.
According to John O’Doherty, the study’s principal investigator and director of the Caltech Brain Imaging Center, understanding where the arbitrator is located and how it works could eventually lead to better treatments for brain disorders, such as drug addiction, and psychiatric disorders, such as obsessive-compulsive disorder. These disorders, which involve repetitive behaviors, may be driven in part by malfunctions in the degree to which behavior is controlled by the habitual system versus the goal-directed system.
"Now that we have worked out where the arbitrator is located, if we can find a way of altering activity in this area, we might be able to push an individual back toward goal-directed control and away from habitual control," says O’Doherty, who is also a professor of psychology at Caltech. "We’re a long way from developing an actual treatment based on this for disorders that involve over-egging of the habit system, but this finding has opened up a highly promising avenue for further research."
In the study, participants played a decision-making game on a computer while connected to a functional magnetic resonance imaging (fMRI) scanner that monitored their brain activity. Participants were instructed to try to make optimal choices in order to gather coins of a certain color, which were redeemable for money.
During a pre-training period, the subjects familiarized themselves with the game—moving through a series of on-screen rooms, each of which held different numbers of red, yellow, or blue coins. During the actual game, the participants were told which coins would be redeemable each round and given a choice to navigate right or left at two stages, knowing that they would collect only the coins in their final room. Sometimes all of the coins were redeemable, making the task more habitual than goal-directed. By altering the probability of getting from one room to another, the researchers were able to further test the extent of participants’ habitual and goal-directed behavior while monitoring corresponding changes in their brain activity.
With the results from those tests in hand, the researchers were able to compare the fMRI data and choices made by the subjects against several computational models they constructed to account for behavior. The model that most accurately matched the experimental data involved the two brain systems making separate predictions about which action to take in a given situation. Receiving signals from those systems, the arbitrator kept track of the reliability of the predictions by measuring the difference between the predicted and actual outcomes for each system. It then used those reliability estimates to determine how much control each system should exert over the individual’s behavior. In this model, the arbitrator ensures that the system making the most reliable predictions at any moment exerts the greatest degree of control over behavior.
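The arbitration scheme in the winning model can be sketched in a few lines. This is a hedged illustration of the general idea, not the authors' published model: each system's reliability is estimated from a running average of its recent prediction errors, and control weights are allocated in proportion to those reliability estimates.

```python
# Illustrative sketch of reliability-based arbitration between a habitual
# and a goal-directed system (not the Caltech group's actual model).

class Arbitrator:
    def __init__(self, decay=0.9):
        self.decay = decay  # how quickly older prediction errors are forgotten
        self.error = {"habitual": 1.0, "goal_directed": 1.0}

    def update(self, system, predicted, actual):
        """Track a running average of each system's absolute prediction error."""
        e = abs(predicted - actual)
        self.error[system] = self.decay * self.error[system] + (1 - self.decay) * e

    def control_weights(self):
        """Reliability is the inverse of tracked error; weights sum to 1."""
        rel = {s: 1.0 / (err + 1e-6) for s, err in self.error.items()}
        total = sum(rel.values())
        return {s: r / total for s, r in rel.items()}

arb = Arbitrator()
# Suppose the habitual system's predictions keep matching outcomes,
# while the goal-directed system's predictions keep missing.
for _ in range(20):
    arb.update("habitual", predicted=1.0, actual=1.0)
    arb.update("goal_directed", predicted=1.0, actual=0.0)

w = arb.control_weights()
assert w["habitual"] > w["goal_directed"]  # the more reliable system dominates
```

Under this scheme the system making the most reliable predictions at any moment exerts the most control, which is the core behavior the fMRI data favored.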
"What we’re showing is the existence of higher-level control in the human brain," says Sang Wan Lee, lead author of the new study and a postdoctoral scholar in neuroscience at Caltech. "The arbitrator is basically making decisions about decisions."
In line with previous findings from the O’Doherty lab and elsewhere, the researchers saw in the brain scans that an area known as the posterior putamen was active at times when the model predicted that the habitual system should be calculating prediction values. Going a step further, they examined the connectivity between the posterior putamen and the arbitrator. What they found might explain how the arbitrator sets the weight for the two learning systems: the connection between the arbitrator area and the posterior putamen changed according to whether the goal-directed or habitual system was deemed to be more reliable. However, no such connection effects were found between the arbitrator and brain regions involved in goal-directed learning. This suggests that the arbitrator may work mainly by modulating the activity of the habitual system.
"One intriguing possibility arising from these findings, which we will need to test in future work, is that being in a habitual mode of behavior may be the default state," says O’Doherty. "So when the arbitrator determines you need to be more goal-directed in your behavior, it accomplishes this by inhibiting the activity of the habitual system, almost like pressing the brakes on your car when you are in drive."
What makes us human? Unique brain area linked to higher cognitive powers
Oxford University researchers have identified an area of the human brain that appears unlike anything in the brains of some of our closest relatives.
The brain area pinpointed is known to be intimately involved in some of the most advanced planning and decision-making processes that we think of as being especially human.
'We tend to think that being able to plan into the future, be flexible in our approach and learn from others are things that are particularly impressive about humans. We've identified an area of the brain that appears to be uniquely human and is likely to have something to do with these cognitive powers,' says senior researcher Professor Matthew Rushworth of Oxford University's Department of Experimental Psychology.
MRI imaging of 25 adult volunteers was used to identify key components in the ventrolateral frontal cortex area of the human brain, and how these components were connected up with other brain areas. The results were then compared to equivalent MRI data from 25 macaque monkeys.
The brain’s RAM: Rats, like humans, have a “working memory”
Thousands of times a day, the brain stores sensory information for very short periods in a working memory, so that it can be used later. A research study carried out with the collaboration of SISSA has shown, for the first time, that this function also exists in the brain of rodents, a finding that sheds light on the evolutionary origins of this cognitive mechanism.
In computers it’s called “RAM”, but the mechanism is conceptually similar to what scientists call a “working memory” in the brain of humans and primates: when we interact with the environment our senses gather information that a temporary memory system keeps fresh and readily accessible for a few minutes, so that the body can carry out operations (for example, an action). For the first time, a research team coordinated by Mathew Diamond of the International School for Advanced Studies (SISSA) in Trieste has shown that this memory system also exists in simpler mammals like rodents.
Working memory has been studied in detail in humans and primates, but little was known about its existence in other animals. “Knowing that a working memory also exists in the brain of evolutionarily simpler organisms helps us to understand the origins of this important cognitive mechanism”, explains Diamond. “Comparative psychology studies have historically helped scientists not only to trace the evolutionary roots of human brain functions but also to gain deeper insight into human cognitive processes themselves”.
The type of sensory memory studied by Diamond and co-workers in rats is tactile memory. The performance of rodents in tasks assessing recognition of vibratory stimuli was compared with that of humans performing similar tasks (rats used their whiskers and humans their fingertips). “Rats exhibited similar behaviour patterns to humans, demonstrating that these animals use a tactile working memory that enables them to recognise and interact with environmental stimuli”. The research paper has been published in The Proceedings of the National Academy of Sciences (PNAS).
In more detail…
“Working memory is an extraordinary cognitive mechanism”, comments Diamond. “It’s like a container where the brain stores little bits of very recent experience, to be able to assess the best course of action. Without this temporary memory, experience would slip away without any chance of being used”.
“Working memory can hold only a limited amount of information for a fairly short period of time. These limits are the result of a cost-benefit balance: the brain’s computational capacity is fixed and decisions as to what action to take often need to be quick and effective at the same time. Our working memory’s capacity is therefore the best we can achieve in terms of accuracy and speed with our brain”.
“The brain regions responsible for working memory have not yet been identified in rats. Some believe that rats don’t have the brain centres known as “prefrontal cortex” which are involved in this function in primates”, continues Diamond. “Our surprise was to discover that rodents realize memory in a manner similar to humans. Now we are continuing our studies to understand how these mechanisms work in detail”.
Study reveals how ecstasy acts on the brain and hints at therapeutic uses
Brain imaging experiments have revealed for the first time how ecstasy produces feelings of euphoria in users.
Results of the study at Imperial College London, parts of which were televised in Drugs Live on Channel 4 in 2012, have now been published in the journal Biological Psychiatry.
The findings hint at ways that ecstasy, or MDMA, might be useful in the treatment of anxiety and post-traumatic stress disorder (PTSD).
MDMA has been a popular recreational drug since the 1980s, but there has been little research on which areas of the brain it affects. The new study is the first to use functional magnetic resonance imaging (fMRI) on resting subjects under its influence.
Twenty-five volunteers underwent brain scans on two occasions, one after taking the drug and one after taking a placebo, without knowing which they had been given.
The results show that MDMA decreases activity in the limbic system – a set of structures involved in emotional responses. These effects were stronger in subjects who reported stronger subjective experiences, suggesting that they are related.
Communication between the medial temporal lobe and medial prefrontal cortex, which is involved in emotional control, was reduced. This effect, and the drop in activity in the limbic system, are opposite to patterns seen in patients who suffer from anxiety.
MDMA also increased communication between the amygdala and the hippocampus. Studies on patients with PTSD have found a reduction in communication between these areas.
The project was led by David Nutt, the Edmond J. Safra Professor of Neuropsychopharmacology at Imperial College London, and Professor Val Curran at UCL.
Dr Robin Carhart-Harris from the Department of Medicine at Imperial, who performed the research, said: “We found that MDMA caused reduced blood flow in regions of the brain linked to emotion and memory. These effects may be related to the feelings of euphoria that people experience on the drug.”
Professor Nutt added: “The findings suggest possible clinical uses of MDMA in treating anxiety and PTSD, but we need to be careful about drawing too many conclusions from a study in healthy volunteers. We would have to do studies in patients to see if we find the same effects.”
MDMA has been investigated as an adjunct to psychotherapy in the treatment of PTSD, with a recent pilot study in the US reporting positive preliminary results.
As part of the Imperial study, the volunteers were asked to recall their favourite and worst memories while inside the scanner. They rated their favourite memories as more vivid, emotionally intense and positive after MDMA than placebo, and they rated their worst memories less negatively. This was reflected in the way that parts of the brain were activated more or less strongly under MDMA. These results were published in the International Journal of Neuropsychopharmacology.
Dr Carhart-Harris said: “In healthy volunteers, MDMA seems to lessen the impact of painful memories. This fits with the idea that it could help patients with PTSD revisit their traumatic experiences in psychotherapy without being overwhelmed by negative emotions, but we need to do studies in PTSD patients to see if the drug affects them in the same way.”
In the Human Brain, Size Really Isn’t Everything
There are many things that make humans a unique species, but a couple stand out. One is our mind, the other our brain.
The human mind can carry out cognitive tasks that other animals cannot, like using language, envisioning the distant future and inferring what other people are thinking.
The human brain is exceptional, too. At three pounds, it is gigantic relative to our body size. Our closest living relatives, chimpanzees, have brains that are only a third as big.
Scientists have long suspected that our big brain and powerful mind are intimately connected. Starting about three million years ago, fossils of our ancient relatives record a huge increase in brain size. Once that cranial growth was underway, our forerunners started leaving behind signs of increasingly sophisticated minds, like stone tools and cave paintings.
But scientists have long struggled to understand how a simple increase in size could lead to the evolution of those faculties. Now, two Harvard neuroscientists, Randy L. Buckner and Fenna M. Krienen, have offered a powerful yet simple explanation.
In our smaller-brained ancestors, the researchers argue, neurons were tightly tethered in a relatively simple pattern of connections. When our ancestors’ brains expanded, those tethers ripped apart, enabling our neurons to form new circuits.
Dr. Buckner and Dr. Krienen call their idea the tether hypothesis, and present it in a paper in the December issue of the journal Trends in Cognitive Sciences.
“I think it presents some pretty exciting ideas,” said Chet C. Sherwood, an expert on human brain evolution at George Washington University who was not involved in the research.