Neuroscience

Articles and news from the latest research reports.

Posts tagged prefrontal cortex


Brain Size Didn’t Drive Evolution, Research Suggests
Brain organization, not overall size, may be the key evolutionary difference between primate brains, and the key to what gives humans their smarts, new research suggests.
In the study, researchers looked at 17 species spanning 40 million years of evolutionary time, finding that changes in the relative size of specific brain regions, rather than changes in overall brain size, accounted for three-quarters of brain evolution over that period. The study, published today (March 26) in the Proceedings of the Royal Society B, also revealed that massive increases in the brain’s prefrontal cortex played a critical role in great ape evolution.
"For the first time, we can really identify what is so special about great ape brain organization," said study co-author Jeroen Smaers, an evolutionary biologist at University College London.
Is bigger better?
Traditionally, scientists have thought humans’ superior intelligence derived mostly from the fact that our brains are three times bigger than those of our nearest living relatives, chimpanzees.
But bigger isn’t always better. Bigger brains take much more energy to power, so scientists have hypothesized that brain reorganization could be a smarter strategy to evolve mental abilities.
To see how brain organization evolved across primates, Smaers and his colleague Christophe Soligo analyzed post-mortem slices of brains from 17 different primates, then mapped changes in brain size onto an evolutionary tree.
Over evolutionary time, several key brain regions increased in size relative to other regions. Great apes (especially humans) saw a rise in white matter in the prefrontal cortex, which contributes to social cognition, moral judgments, introspection and goal-directed planning.
"The prefrontal cortex is a little bit like the CEO of the brain," Smaers told LiveScience. "It takes information from other brain areas and it synthesizes them."
When great apes diverged from Old World monkeys about 20 million years ago, brain regions tied to motor planning also increased in relative size. That could have helped them orchestrate the complex movements needed to manipulate tools — possibly to get at different food sources, Smaers said.
Gibbons and howler monkeys showed a different pattern. Even though their bodies and their brains got smaller over time, the hippocampus, which plays a role in spatial tasks, tended to increase in size in relation to the rest of the brain. That may have allowed these primates to be spatially adept and inhabit a more diverse range of environments.
Prefrontal cortex
The study shows that specific parts of the brain can selectively scale up to meet the demands of new environments, said Chet Sherwood, an anthropologist at George Washington University, who was not involved in the study.
The finding also drives home the importance of the prefrontal cortex, he said.
"It’s very suggestive that connectivity of prefrontal cortex has been a particularly strong driving force in ape and human brains," Sherwood told LiveScience.

Filed under brain evolution brain size prefrontal cortex cerebellum primates evolution neuroscience psychology science


Researchers Show that Suppressing the Brain’s “Filter” Can Improve Performance in Creative Tasks
The brain’s prefrontal cortex is thought to be the seat of cognitive control, working as a kind of filter that keeps irrelevant thoughts, perceptions and memories from interfering with a task at hand.
Now, researchers at the University of Pennsylvania have shown that inhibiting this filter can boost performance for tasks in which unfiltered, creative thoughts present an advantage.
The research was conducted by Sharon Thompson-Schill, the Christopher H. Browne Distinguished Professor of Psychology and director of the Center for Cognitive Neuroscience, and Evangelia Chrysikou, a member of her lab who is now an assistant professor at the University of Kansas. They collaborated with Roy Hamilton and H. Branch Coslett of the Department of Neurology at Penn’s Perelman School of Medicine and Abhishek Datta and Marom Bikson of the Department of Biomedical Engineering at the City College of New York.
Their work was published in the journal Cognitive Neuroscience.

Filed under brain memory perception prefrontal cortex cognitive control transcranial direct current stimulation creative task psychology neuroscience science


Human cognition depends upon slow-firing neurons
Good mental health and clear thinking depend upon our ability to store and manipulate thoughts on a sort of “mental sketch pad.” In a new study, Yale School of Medicine researchers describe the molecular basis of this ability — the hallmark of human cognition — and describe how a breakdown of the system contributes to diseases such as schizophrenia and Alzheimer’s disease.
“Insults to these highly evolved cortical circuits impair the ability to create and maintain our mental representations of the world, which is the basis of higher cognition,” said Amy Arnsten, professor of neurobiology and senior author of the paper published in the Feb. 20 issue of the journal Neuron.
Higher-order thinking depends upon our ability to generate mental representations in our brains without any sensory stimulation from the environment. These cognitive abilities arise from highly evolved circuits in the prefrontal cortex. Mathematical models by former Yale neurobiologist Xiao-Jing Wang, now of New York University, predicted that in order to maintain these visual representations the prefrontal cortex must rely on a family of receptors that allow for slow, steady firing of neurons. The Yale scientists show that NMDA-NR2B receptors involved in glutamate signaling regulate this neuronal firing. These receptors, studied at Yale for more than a decade, are responsible for the activity of highly evolved brain circuits found especially in primates.
Earlier studies have shown these types of NMDA receptors are often altered in patients with schizophrenia. The Neuron study suggests that those suffering from the disease may be unable to hold onto a stable view of the world. Also, these receptors seem to be altered in Alzheimer’s patients, which may contribute to the cognitive deficits of dementia.
The lab of Dr. John Krystal, chair of the department of psychiatry at Yale, has found that the anesthetic ketamine, abused as a street drug, blocks NMDA receptors and can mimic some of the symptoms of schizophrenia. The current study in Neuron shows that ketamine may reduce the firing of the same higher-order neural circuits that are decimated in schizophrenia. 
“Identifying the receptor needed for higher cognition may help us to understand why certain genetic insults lead to cognitive impairment and will help us to develop strategies for treating these debilitating disorders,” Arnsten said.

Filed under brain brain circuits cognition cognitive deficit prefrontal cortex mental representations receptors neuroscience science


UCSB Study of Cocaine Addiction Reveals Targets for Treatment
Scientists at UC Santa Barbara are researching cocaine addiction, a widespread problem that, along with other addictions, costs billions of dollars in damage to individuals, families, and society. Laboratory studies at UCSB have revealed that the diminished brain function and learning impairment that result from cocaine addiction can be treated – and that learning can be restored.
Karen Szumlinski, a professor in the Department of Psychological & Brain Sciences at UCSB, and her colleagues Osnat Ben-Shahar and Tod Kippin have worked in the field of addiction for many years. Senior author of a paper on this topic published recently in The Journal of Neuroscience, Szumlinski is particularly interested in the part of the brain called the prefrontal cortex, where the process of “executive function” – or decision-making – is located. This area is involved in directing one’s behavior in an appropriate manner, and in controlling behavior.
With her research team, Szumlinski discovered that a drug that stimulates a certain type of glutamate receptor – when aimed at the prefrontal cortex – could reverse the learning impairment in rats with simulated cocaine addiction.
"Needless to say, this (the prefrontal cortex) is one of the last parts of the brain to develop, and, of relevance to our students, continues to develop through about age 25 to 28," said Szumlinski.
Szumlinski explained that in the prefrontal cortex there seems to be “hypo-frontality,” or reduced functioning, in drug addicts, as well as in patients with a range of neuropsychiatric diseases, including schizophrenia, depression, and attention deficit disorder.
Szumlinski calls the prefrontal cortex a late-developing brain area that is critical for making proper decisions, and inhibiting behavior. “You damage this brain region and you lose the ability to self-regulate, you make impulsive decisions like engaging in risky sexual behavior or drug-taking, you basically go off the deep end in terms of function,” she said. “So we were very much interested in how drugs of abuse impact the prefrontal cortex, given that human drug addicts show deficits in this brain area when you put them into a scanner. They show hypo-activity.” She said this hypo-activity, or hypo-frontality, might relate to a neurotransmitter that scientists know is involved in exciting the brain.
A key question, according to Szumlinski, is this: “Was that hypo-frontality there in the first place, and that’s why they became an addict; or did the drugs change their prefrontal cortex, to cause it to become hypo-functioning and thus they’re not able to control their drug use? You can’t parse that out in humans. So that’s why we turn then to animal models of the disorder, and we do have this rat model that we use in the paper.”
Szumlinski pointed out a key difficulty in the development of treatments for addiction: There is little money targeted to the study of this disease. Hence, in addition to studying the brain mechanisms that are involved, she is joining forces with researchers who study other neurological diseases that are well-funded, to help find cures. She hopes that government approval of new drugs for these other diseases would eventually make the drugs available for clinical trials to study their effects on cocaine addiction.
(Image: iStock)

Filed under cocaine addiction brain function learning impairment prefrontal cortex neuroscience science


Even the brains of people with anxiety states can get used to fear
Fear is a protective mechanism against possible dangers, designed to save our lives. Where there are problems with this fear mechanism, its positive effects are cancelled out: patients who have a social phobia become afraid of perfectly normal, everyday social situations because they are worried about behaving inappropriately or being thought of as stupid by other people. Scientists from the Centre for Medical Physics and Biomedical Technology and the University Department of Psychiatry and Psychotherapy at the MedUni Vienna have now discovered that this fear circuit can be deactivated, at least in part.
In a study by Ronald Sladky, led by Christian Windischberger (Centre for Medical Physics and Biomedical Technology), which has recently been published in the journal PLOS ONE, functional magnetic resonance imaging was used to measure the changes in the brain activity of socially phobic patients and healthy test subjects while they were looking at faces. This experiment simulates social confrontation with other people without actually placing the individual in an intolerable situation of anxiety.
Permanent confrontation has a diminishing effect on anxiety
“The study demonstrated that people with social phobia initially exhibit greater activity in the amygdala and in the medial prefrontal cortex of the brain; however, after a few faces this activity recedes,” says Sladky. This contradicts the assumption made thus far that the emotional circuit of socially phobic individuals is unable to adapt adequately to this stress-inducing situation.
Permanent confrontation with the test task not only led the patients with anxiety to find a solution to the “problem” more quickly, but also meant that some brain areas which were otherwise over-stimulated – a characteristic typical of anxiety – were bypassed. Says Sladky: “We therefore concluded that there are functional control strategies even in the emotional circuits of people with social phobia, although the mechanisms take longer to take effect in these individuals. The misregulation of these parts of the brain can therefore be compensated to a degree.”
These findings could, according to Sladky, provide a starting point for the development of personalised training programmes that will help affected individuals to conquer unpleasant situations in their everyday lives more effectively. In Austria, around 200,000 people a year are affected by some form of social phobia. The number of people who suffer this condition without seeking help for it is likely to be very high, since many affected individuals fail to seek assistance or do so only too late as a result of their anxiety.

Filed under anxiety social phobia fear brain activity amygdala prefrontal cortex psychology neuroscience science


Poor sleep in old age prevents the brain from storing memories
The connection between poor sleep, memory loss and brain deterioration as we grow older has been elusive. But for the first time, scientists at the University of California, Berkeley, have found a link between these hallmark maladies of old age. Their discovery opens the door to boosting the quality of sleep in elderly people to improve memory.
UC Berkeley neuroscientists have found that the slow brain waves generated during the deep, restorative sleep we typically experience in youth play a key role in transporting memories from the hippocampus – which provides short-term storage for memories – to the prefrontal cortex’s longer-term “hard drive.”
However, in older adults, memories may be getting stuck in the hippocampus due to the poor quality of deep ‘slow wave’ sleep, and are then overwritten by new memories, the findings suggest.
“What we have discovered is a dysfunctional pathway that helps explain the relationship between brain deterioration, sleep disruption and memory loss as we get older – and with that, a potentially new treatment avenue,” said UC Berkeley sleep researcher Matthew Walker, an associate professor of psychology and neuroscience at UC Berkeley and senior author of the study published in the journal Nature Neuroscience.

Filed under brainwaves sleep memory prefrontal cortex frontal lobe aging neuroscience science


How the brain copes with multi-tasking alters with age
The pattern of blood flow in the brain’s prefrontal cortex during multi-tasking alters with age, finds a new study in BioMed Central’s open access journal BMC Neuroscience. Blood volume, measured using oxygenated haemoglobin (Oxy-Hb), increased at the start of multitasking in all age groups. But to perform the same tasks, healthy older people had a higher and more sustained increase in Oxy-Hb than younger people.
Age-related changes to the brain occur earliest in the prefrontal cortex, the area of the brain associated with memory, emotion, and higher decision-making functions. Changes to this area of the brain are also associated with dementia, depression, and other neuropsychiatric disorders. Some studies have shown that regular physical activity and cognitive training can prevent cognitive decline (use it or lose it!), but to establish what occurs in a healthy aging brain, researchers from Japan and the USA compared brain activity during single and dual tasks in young (aged 21 to 25) and older (over 65) people.
Near infrared spectroscopy (NIRS) measurements of Oxy-Hb showed that blood flow to the prefrontal cortex was not affected by the physical task for either age group, but was affected by the mental task. For both the young and the over-65s, the start of the calculation task coincided with an increase in blood volume, which returned to baseline once the task was completed.
The main difference between the groups was seen only when the physical and mental tasks were performed at the same time: older people had a higher prefrontal cortex response that lasted longer than the younger group’s.
Hironori Ohsugi, from Seirei Christopher University, and one of the team who performed this research, explained: “From our observations during the dual task it seems that the older people turn their attention to the calculation at the expense of the physical task, while younger people are able to maintain concentration on both. Since our subjects were all healthy it seems that this requirement for increased activation of the prefrontal cortex is part of normal decrease in brain function associated with aging. Further study will show whether or not dual task training can be used to maintain a more youthful brain.”
(Image: Photos.com)

How the brain copes with multi-tasking alters with age

The pattern of blood flow in the prefrontal cortex in the brains alters with age during multi-tasking, finds a new study in BioMed Central’s open access journal BMC Neuroscience. Increased blood volume, measured using oxygenated haemoglobin (Oxy-Hb) increased at the start of multitasking in all age groups. But to perform the same tasks, healthy older people had a higher and more sustained increase in Oxy-Hb than younger people.

Age related changes to the brain occur earliest in the prefrontal cortex, the area of the brain associated with memory, emotion, and higher decision making functions. It is changes to this area of the brain that are also associated with dementia, depression and other neuropsychiatric disorders. Some studies have shown that regular physical activity and cognitive training can prevent cognitive decline (use it or lose it!) but to establish what occurs in a healthy aging brain researchers from Japan and USA have compared brain activity during single and dual tasks for young (aged 21 to 25) and older (over 65) people.

Near infrared spectroscopy (NIRS) measurements of Oxy-Hb showed that blood flow to the prefrontal cortex was not affected by the physical task for either age group, but was affected by the mental task. For both the young and the over-65s, the start of the calculation task coincided with an increase in blood volume, which returned to baseline once the task was completed.

The main difference between the groups appeared only when the physical and mental tasks were performed at the same time: older people showed a higher prefrontal cortex response that lasted longer than the younger group's.

Hironori Ohsugi, from Seirei Christopher University and one of the team who performed the research, explained: “From our observations during the dual task it seems that the older people turn their attention to the calculation at the expense of the physical task, while younger people are able to maintain concentration on both. Since our subjects were all healthy it seems that this requirement for increased activation of the prefrontal cortex is part of normal decrease in brain function associated with aging. Further study will show whether or not dual task training can be used to maintain a more youthful brain.”
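The group comparison above boils down to a baseline-versus-task calculation on each Oxy-Hb trace. The sketch below illustrates that logic with invented traces and invented numbers; none of these values come from the study itself.

```python
import statistics

# Hypothetical Oxy-Hb traces (arbitrary units), one sample per second:
# a rest baseline followed by a dual-task block, for one young and one
# older participant. Values are illustrative, not study data.
young = {"rest": [0.00, 0.01, -0.01, 0.00], "task": [0.10, 0.14, 0.12, 0.03]}
older = {"rest": [0.01, 0.00, 0.00, 0.01], "task": [0.18, 0.24, 0.26, 0.22]}

def task_response(trace):
    """Mean Oxy-Hb increase during the task relative to the rest baseline."""
    baseline = statistics.mean(trace["rest"])
    return statistics.mean(v - baseline for v in trace["task"])

# The study's reported pattern: older adults show a larger, more
# sustained rise in prefrontal Oxy-Hb during the dual task.
print(task_response(young) < task_response(older))  # True for these values
```

In practice NIRS analyses work on much longer, noisier time series, but the comparison being made is the same: size and duration of the task-evoked rise over baseline, contrasted across age groups.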

(Image: Photos.com)

Filed under brain brain activity prefrontal cortex cognitive decline aging multi-tasking neuroscience science

82 notes

The Role of Medial Prefrontal Cortex in Memory and Decision Making
Some have claimed that the medial prefrontal cortex (mPFC) mediates decision making. Others suggest mPFC is selectively involved in the retrieval of remote long-term memory. Yet others suggest mPFC supports memory and consolidation on time scales ranging from seconds to days. How can all these roles be reconciled? We propose that the function of the mPFC is to learn associations between contexts, locations, events, and corresponding adaptive responses, particularly emotional responses. Thus, the ubiquitous involvement of mPFC in both memory and decision making may be due to the fact that almost all such tasks entail the ability to recall the best action or emotional response to specific events in a particular place and time. An interaction between multiple memory systems may explain the changing importance of mPFC to different types of memories over time. In particular, mPFC likely relies on the hippocampus to support rapid learning and memory consolidation.
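The proposal can be made concrete as a toy associative store mapping (context, event) pairs to learned responses. Everything below, the class, its entries, the fallback behaviour, is a hypothetical sketch of the idea, not a model taken from the paper.

```python
# Toy illustration of the proposed mPFC function: learned associations
# from (context, event) pairs to adaptive responses. All names and
# entries are hypothetical, chosen only to make the idea concrete.
class ContextEventMemory:
    def __init__(self):
        self.associations = {}  # (context, event) -> response

    def learn(self, context, event, response):
        self.associations[(context, event)] = response

    def recall(self, context, event):
        # Recall the best-known response for this event in this context;
        # with no stored association, fall back to a default behaviour.
        return self.associations.get((context, event), "explore")

memory = ContextEventMemory()
memory.learn("dark alley", "footsteps behind", "heightened vigilance")
memory.learn("office", "footsteps behind", "ignore")

print(memory.recall("dark alley", "footsteps behind"))  # heightened vigilance
print(memory.recall("office", "footsteps behind"))      # ignore
```

The point of the toy is that the same event maps to different responses depending on context, which is exactly why a structure doing this job would show up in both "memory" and "decision-making" tasks.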

Filed under mPFC prefrontal cortex decision making memory memory consolidation learning neuroscience science

483 notes

Study Shows Working Memory Is Driven By Prefrontal Cortex And Dopamine
One of the unique features of the human mind is its ability to re-prioritize its goals as situations change and new information arises. This happens when you cancel a planned cruise because you need the money to repair your broken-down car, or when you interrupt your morning jog because your cell phone is ringing in your pocket.
In a new study published in the Proceedings of the National Academy of Sciences (PNAS), researchers from Princeton University say that they have discovered the mechanisms that control how our brains use new information to modify our existing priorities.
The team of researchers at Princeton’s Neuroscience Institute (PNI) used functional magnetic resonance imaging (fMRI) to scan subjects and find out where and how the human brain reprioritizes goals. Unsurprisingly, they found that the shifting of goals takes place in the prefrontal cortex, a region of the brain which is known to be associated with a variety of higher-level behaviors. They also observed that the powerful neurotransmitter dopamine – also known as the “pleasure chemical” – appears to play a critical role in this process.
Using a harmless magnetic pulse, the scientists interrupted activity in the prefrontal cortex of the participants while they were playing games and found they were unable to switch to a different task in the game.
“We have found a fundamental mechanism that contributes to the brain’s ability to concentrate on one task and then flexibly switch to another task,” explained Jonathan Cohen, co-director of PNI and the university’s Robert Bendheim and Lynn Bendheim Thoman Professor in Neuroscience.
“Impairments in this system are central to many critical disorders of cognitive function such as those observed in schizophrenia and obsessive-compulsive disorder.”
Previous research had already demonstrated that when the brain uses new information to modify its goals or behaviors, this information is temporarily filed away in the brain's working memory, a type of short-term memory storage. Until now, however, scientists had not understood the mechanisms controlling how this information is updated.
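The gating idea behind this line of work can be caricatured in a few lines: working memory holds the current goal, and a dopamine-like gate signal decides whether incoming information replaces it or is held out. The threshold and the signal values below are hypothetical, chosen only to illustrate the mechanism.

```python
# Minimal sketch of gated working-memory updating. The gate threshold
# and signal strengths are hypothetical illustration values.
GATE_THRESHOLD = 0.5

def update_goal(current_goal, new_info, gate_signal):
    """Replace the held goal only when the gate signal is strong enough."""
    if gate_signal >= GATE_THRESHOLD:
        return new_info       # gate open: working memory updates
    return current_goal       # gate closed: current goal is maintained

goal = "finish morning jog"
goal = update_goal(goal, "answer ringing phone", gate_signal=0.8)
print(goal)  # answer ringing phone

goal = update_goal(goal, "watch passing cars", gate_signal=0.2)
print(goal)  # answer ringing phone (weak distraction is gated out)
```

A gate like this captures both halves of the reported behaviour: stability against distraction when the gate stays closed, and flexible task switching when a salient signal opens it.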

Filed under brain prefrontal cortex working memory OCD dopamine neuroscience science

190 notes

The ethical minefield of using neuroscience to prevent crime
On the evening of March 10, 2007, Abdelmalek Bayout, an Algerian citizen living in Italy, brutally stabbed to death Walter Perez, a fellow immigrant from Colombia. Bayout admitted to the crime, saying he was provoked by Perez, who ridiculed him for wearing eye makeup.
According to Nature magazine, Bayout’s defence argued that he was mentally ill at the time of the offence. The court accepted that argument and, although it found Bayout guilty of the crime, imposed on him a reduced prison sentence of nine years and two months.
Bayout nevertheless appealed the judgment, and the Court of Appeal ordered a new psychiatric report. That report showed, among other things, that Bayout had low levels of the neurotransmitter monoamine oxidase A (MAO-A) — an important development given that previous research discovered that men who had low MAO-A levels and who had been abused as children were more likely to be convicted of violent crimes as adults.
Ultimately, the Court of Appeal further reduced Bayout’s sentence by a year, with Judge Pier Valerio Reinotti describing the MAO-A evidence as “particularly compelling.”
Upon a brief review of the scientific evidence, certain glaring problems with the court’s judgment quickly become apparent. Most obviously, the research showing an association between low MAO-A levels and violence tells us nothing about Bayout’s — or any specific individual’s — propensity for violence. Indeed, while a significant percentage of men with low MAO-A levels commit violent offences, the majority do not.
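A quick base-rate calculation makes this point concrete. The numbers below are hypothetical, picked purely for illustration; they are not figures from the MAO-A research.

```python
# Base-rate sketch of why a group-level association says little about
# any one individual. All numbers are hypothetical illustration values,
# not figures from the MAO-A studies.
population = 10_000     # men with the low-MAO-A variant (assumed)
offence_rate = 0.12     # assumed offending rate within that group
baseline_rate = 0.05    # assumed rate in the general population

offenders = population * offence_rate
non_offenders = population - offenders
print(f"{offenders:.0f} of {population} offend")   # 1200 of 10000 offend
print(f"{non_offenders:.0f} do not")               # 8800 do not
# Even at more than twice the assumed baseline rate, the large majority
# never offend, so the variant alone cannot establish an individual's
# propensity for violence.
```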
Yet the fact that the court allowed such evidence to influence its verdict suggests that neuroscience, while not eliminating criminal responsibility, might lead courts to conclude that defendants with certain neurological deficits are less responsible than those with “normal” brains.
There is, in fact, a precedent for this, and it’s one that few people question. Adolescents in virtually every country are subject to differential sentencing, and in many cases to an entirely separate system of justice, because their neurobiology renders them less blameworthy, less responsible than adults.
Indeed, while the limbic system, or emotional centre of the brain, is typically mature by the age of 16, the prefrontal cortex (PFC), which is associated with one's capacity to control emotions, is not fully developed in most people until the early 20s. Hence, according to what's sometimes called the "two systems" theory, the imbalance in development between the limbic system and the PFC explains the risk-taking and emotional behaviour characteristic of adolescence, and it justifies our treating adolescents as less responsible than adults.
There are, of course, substantial differences between adolescents and adults with neurological deficits, the most obvious being that most adolescents will outgrow the developmental imbalance. But the basic principle — that people who suffer from neurological aberrations that render them less capable of controlling their behaviour should be held less blameworthy — seems to have swayed the Italian Court of Appeal.
But not just the Italian Court of Appeal. While the “MAO-A defence” has been tried and failed in many courts around the world, recent research led by University of Utah psychologist Lisa Aspinwall suggests that many judges, when presented with neurobiological evidence, are inclined to reduce defendants’ sentences.
Read more

Filed under brain neurotransmitters MAO-A neurological deficits crime prefrontal cortex neuroscience science
