Neuroscience

Articles and news from the latest research reports.

Posts tagged frontal lobe

247 notes

Sometimes, adolescents just can’t resist
Don’t get mad the next time you catch your teenager texting when he promised to be studying.
He simply may not be able to resist.
A University of Iowa study found teenagers are far more sensitive than adults to the immediate effect or reward of their behaviors. The findings may help explain, for example, why the initial rush of texting may be more enticing for adolescents than the long-term payoff of studying.
“The rewards have a strong, perceptual draw and are more enticing to the teenager,” says Jatin Vaidya, a professor of psychiatry at the UI and corresponding author of the study, which appeared online this week in the journal Psychological Science. “Even when a behavior is no longer in a teenager’s best interest to continue, they will because the effect of the reward is still there and lasts much longer in adolescents than in adults.”
For parents, that means limiting distractions so teenagers can make better choices. Take the homework and social media dilemma: At 9 p.m., shut off everything except a computer that has no access to Facebook or Twitter, the researchers advise.
“I’m not saying they shouldn’t be allowed access to technology,” Vaidya says. “But they need help in regulating their attention so they can develop those impulse-control skills.”
In their study, “Value-Driven Attentional Capture in Adolescence,” Vaidya and co-authors Shaun Vecera, a professor of psychology, and Zachary Roper, a graduate student in psychology, note researchers generally believe teenagers are impulsive, make bad decisions, and engage in risky behavior because the frontal lobes of their brains are not fully developed.
But the UI researchers wondered whether something more fundamental was going on with adolescents to trigger behaviors independent of higher-level reasoning.
“We wanted to try to understand the brain’s reward system and how it changes from childhood to adulthood,” says Vaidya, who adds the reward trait in the human brain is much more primitive than decision-making. “We’ve been trying to understand the reward process in adolescence and whether there is more to adolescent behavior than an under-developed frontal lobe,” he adds.
For their study, the researchers recruited 40 adolescents, ages 13 to 16, and 40 adults, ages 20 to 35. First, participants were asked to find a red or green ring hidden within an array of rings on a computer screen. Once they identified it, they reported whether the white line inside the ring was vertical or horizontal. If they were right, they received a reward of between 2 and 10 cents, depending on the color. For some participants, the red ring paid the highest reward; for others, it was the green. None was told which color would pay the most.
After 240 trials, the participants were asked whether they noticed anything about the colors. Most made no association between a color and a reward, which the researchers say shows the ring exercise didn’t involve high-level decision-making.
In the next stage, participants showed they had developed an intuitive association when they were asked to find a diamond-shaped target. This time, the red and green rings were used as decoys.
At first, both the adolescents and the adults selected the colored ring that had garnered them the highest monetary reward, the goal of the first task. But in short order, the adults adjusted and selected the diamond. The adolescents did not.
Even after 240 trials, the adolescents were still more apt to pick the colored rings.
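The gap between the two groups in the test phase can be sketched in a few lines of code. This is a toy simulation, not the authors' model or data: it simply treats the learned color-reward association as a value that fades with each distractor trial, with a slower (entirely hypothetical) fade rate standing in for the adolescents' more persistent reward association.

```python
def run_test_phase(decay, trials=240, value=1.0, threshold=0.5):
    """Count test trials on which the old color association still
    'captures' attention (i.e., the colored ring wins out over the
    new diamond target)."""
    captures = 0
    for _ in range(trials):
        if value > threshold:   # association still strong enough to win
            captures += 1
        value *= decay          # association fades a little each trial
    return captures

# Hypothetical fade rates: adult-like learners shed the association
# within a handful of trials; adolescent-like learners barely at all.
print(run_test_phase(decay=0.90))   # adult-like: capture on 7 trials
print(run_test_phase(decay=0.999))  # adolescent-like: capture on all 240
```

The numbers are illustrative only; the study measured attentional capture behaviorally, not with a decay model.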
“Even though you’ve told them, ‘You have a new target,’ the adolescents can’t get rid of the association they learned before,” Vecera says. “It’s as if that association is much more potent for the adolescent than for the adult.
“If you give the adolescent a reward, it will persist longer,” he adds. “The fact that the reward is gone doesn’t matter. They will act as if the reward is still there.”
Researchers say that inability to readily adjust behavior explains why, for example, a teenager may continue to make inappropriate comments in class long after friends stopped laughing.
In the future, researchers hope to delve into the psychological and neurological aspects of their results.
“Are there certain brain regions or circuits that continue to develop from adolescence to adulthood that play a role in directing attention away from reward stimuli that are not task relevant?” Vaidya asks. “Also, what sorts of life experiences and skills help to improve performance on this task?”

Filed under adolescence attentional capture reward frontal lobe learning psychology neuroscience science

264 notes

Social origins of intelligence in the brain

By studying the injuries and aptitudes of Vietnam War veterans who suffered penetrating head wounds during the war, scientists are tackling — and beginning to answer — longstanding questions about how the brain works.

The researchers found that brain regions that contribute to optimal social functioning also are vital to general intelligence and to emotional intelligence. This finding bolsters the view that general intelligence emerges from the emotional and social context of one’s life.

The findings are reported in the journal Brain.

“We are trying to understand the nature of general intelligence and to what extent our intellectual abilities are grounded in social cognitive abilities,” said Aron Barbey, a University of Illinois professor of neuroscience, of psychology, and of speech and hearing science. Barbey (bar-BAY), an affiliate of the Beckman Institute and of the Institute for Genomic Biology at the U. of I., led the new study with an international team of collaborators.

Studies in social psychology indicate that human intellectual functions originate from the social context of everyday life, Barbey said.

“We depend at an early stage of our development on social relationships — those who love us care for us when we would otherwise be helpless,” he said.

Social interdependence continues into adulthood and remains important throughout the lifespan, Barbey said.

“Our friends and family tell us when we could make bad mistakes and sometimes rescue us when we do,” he said. “And so the idea is that the ability to establish social relationships and to navigate the social world is not secondary to a more general cognitive capacity for intellectual function, but that it may be the other way around. Intelligence may originate from the central role of relationships in human life and therefore may be tied to social and emotional capacities.”

The study involved 144 Vietnam veterans injured by shrapnel or bullets that penetrated the skull, damaging distinct brain tissues while leaving neighboring tissues intact. Using CT scans, the scientists painstakingly mapped the affected brain regions of each participant, then pooled the data to build a collective map of the brain.

The researchers used a battery of carefully designed tests to assess participants’ intellectual, emotional and social capabilities. They then looked for patterns that tied damage to specific brain regions to deficits in the participants’ ability to navigate the intellectual, emotional or social realms. Social problem solving in this analysis primarily involved conflict resolution with friends, family and peers at work.

As in their earlier studies of general intelligence and emotional intelligence, the researchers found that regions of the frontal cortex (at the front of the brain), the parietal cortex (further back near the top of the head) and the temporal lobes (on the sides of the head behind the ears) are all implicated in social problem solving. The regions that contributed to social functioning in the parietal and temporal lobes were located only in the brain’s left hemisphere, while both left and right frontal lobes were involved.

The brain networks found to be important to social adeptness were not identical to those that contribute to general intelligence or emotional intelligence, but there was significant overlap, Barbey said.

“The evidence suggests that there’s an integrated information-processing architecture in the brain, that social problem solving depends upon mechanisms that are engaged for general intelligence and emotional intelligence,” he said. “This is consistent with the idea that intelligence depends to a large extent on social and emotional abilities, and we should think about intelligence in an integrated fashion rather than making a clear distinction between cognition and emotion and social processing. This makes sense because our lives are fundamentally social — we direct most of our efforts to understanding others and resolving social conflict. And our study suggests that the architecture of intelligence in the brain may be fundamentally social, too.”

(Source: news.illinois.edu)

Filed under intelligence social intelligence social interaction frontal lobe neuroscience science

107 notes

Study finds cognitive performance can be improved in teens months, years after traumatic brain injury

Traumatic brain injuries from sports, recreational activities, falls or car accidents are the leading cause of death and disability in children and adolescents. While previously it was believed that the window for brain recovery was at most one year after injury, new research from the Center for BrainHealth at The University of Texas at Dallas published online today in the open-access journal Frontiers in Neurology shows cognitive performance can be improved to significant degrees months, and even years, after injury, given targeted brain training.

"The after-effects of concussions and more severe brain injuries can be very different and more detrimental to a developing child or adolescent brain than an adult brain," said Dr. Lori Cook, study author and director of the Center for BrainHealth’s pediatric brain injury programs. "While the brain undergoes spontaneous recovery in the immediate days, weeks, and months following a brain injury, cognitive deficits may continue to evolve months to years after the initial brain insult when the brain is called upon to perform higher-order reasoning and critical thinking tasks."

Twenty adolescents, ages 12-20, who had experienced a traumatic brain injury at least six months before participating in the research and who demonstrated gist-reasoning deficits, or the inability to “get the essence” of dense information, were enrolled in the study. The participants were randomized into two different cognitive training groups: strategy-based gist-reasoning training versus fact-based memory training.

Participants completed eight 45-minute sessions over a one-month period. Researchers compared the effects of the two forms of training on the ability to abstract meaning and recall facts. Testing included pre- and post-training assessments in which adolescents were asked to read several texts and then craft a high-level summary, drawing upon inferences to transform ideas into novel, generalized statements, and to recall important facts.

After training, only the gist-reasoning group showed significant improvement in the ability to abstract meaning, a foundational cognitive skill for everyday life. Additionally, the gist-reasoning-trained group showed significant generalized gains in untrained areas, including the executive functions of working memory (i.e., holding information in mind for use, such as performing mental addition or subtraction) and inhibition (i.e., filtering out irrelevant information). The gist-reasoning training group also demonstrated increased memory for facts, even though this skill was not specifically targeted in training.
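The pre/post comparison described above can be illustrated with a short sketch. All scores below are invented for illustration (they are not the study's data); the point is simply how a per-participant gain is computed and then compared between the two training groups.

```python
import statistics

def mean_gain(pre, post):
    """Average post-minus-pre improvement across participants."""
    return statistics.mean(b - a for a, b in zip(pre, post))

# Hypothetical summary-quality scores, one per participant per group.
gist_pre, gist_post = [4, 5, 3, 6, 4], [7, 8, 6, 8, 7]
fact_pre, fact_post = [5, 4, 6, 3, 5], [5, 5, 6, 4, 5]

print(mean_gain(gist_pre, gist_post))  # larger mean gain for gist training
print(mean_gain(fact_pre, fact_post))
```

A real analysis would also test whether the difference in gains is statistically significant, as the study did.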

"These preliminary results are promising in that higher-order cognitive training that focuses on ‘big picture’ thinking improves cognitive performance in ways that matter to everyday life success," said Dr. Cook. "What we found was that training higher-order cognitive skills can have a positive impact on untrained key executive functions as well as lower-level, but also important, processes such as straightforward memory, which is used to remember details. While the study sample was small and a larger trial is needed, the real-life application of this training program is especially important for adolescents who are at a very challenging life-stage when they face major academic and social complexities. These cognitive challenges require reasoning, filtering, focusing, planning, self-regulation, activity management and combating ‘information overload,’ which is one of the chief complaints that teens with concussions express."

This research advances best practices by suggesting changes to common treatment schedules for traumatic brain injury and concussion. The ability to achieve cognitive gains through a brain training treatment regimen at chronic stages of brain injury (six months or longer) supports the need to monitor brain recovery annually and offer treatment when deficits persist or emerge later.

"Brain injuries require routine follow-up monitoring. We need to make sure that optimized brain recovery continues to support later cognitive milestones, and that is especially true in the case of adolescents," said Dr. Sandra Bond Chapman, study author, founder and chief director of the Center for BrainHealth and Dee Wyly Distinguished University Chair at The University of Texas at Dallas. "What’s promising is that no matter the severity of the injury or the amount of time since injury, brain performance improved when teens were taught how to strategically process incoming information in a meaningful way, instead of just focusing on rote memorization."

(Source: brainhealth.utdallas.edu)

Filed under TBI brain injury concussions cognitive performance frontal lobe neuroscience science

243 notes

Genetic factor contributes to forgetfulness

University of Bonn psychologists prove genetic variation is underlying factor in higher incidence of forgetfulness

Misplaced your keys? Can’t remember someone’s name? Didn’t notice the stop sign? Those who frequently experience such cognitive lapses now have an explanation. Psychologists from the University of Bonn have found a connection between such everyday lapses and the DRD2 gene. Those who have a certain variant of this gene are more easily distracted and experience a significantly higher incidence of lapses due to a lack of attention. The team reports its results in the May issue of “Neuroscience Letters,” which is already available online.

Most of us are familiar with such everyday lapses: you can’t find your keys, again! Or you walk into another room but forget what you actually went there for. Or you are on the phone with someone and cannot remember their name. “Such short-term memory lapses are very common, but some people experience them particularly often,” said Prof. Dr. Martin Reuter from the department for Differential and Biological Psychology at the University of Bonn. Mistakes occurring during such short-term lapses can become a hazard, for example when a person overlooks a stop sign at an intersection. And in the workplace, a lack of attention can also become a problem, for example when it results in forgetting to save essential data.

A gene “directing” your brain

"A familial clustering of such lapses suggests that they are subject to genetic effects," explained Dr. Sebastian Markett, the principal author and a member of Prof. Reuter’s team. In lab experiments, the group of scientists had already found indications earlier that the so-called dopamine D2 receptor gene (DRD2) plays a part in forgetfulness. DRD2 has an essential function in signal transmission within the frontal lobes. "This structure can be compared to a director coordinating the brain like an orchestra," Dr. Markett added. In this simile, the DRD2 gene would correspond to the baton, because it plays a part in dopamine transmission in the brain. If the baton skips a beat, the orchestra gets confused.

The psychologists from the University of Bonn tested a total of 500 women and men by taking a saliva sample and examining it using methods from molecular biology. All humans carry the DRD2 gene, which comes in two variants distinguished by only one letter within the genetic code: one variant has a C (cytosine) at a particular locus, which is replaced by a T (thymine) in the other. According to the research team’s analyses, about a quarter of the subjects carried only the cytosine variant, while three quarters had a genotype with at least one thymine base.

The scientists then wanted to find out whether this difference in the genetic code also had an effect on everyday behavior. By means of a self-assessment survey they asked the subjects to state how frequently they experience these lapses–how often they forgot names, misplaced their keys. The survey also included questions regarding certain impulsivity-related factors, such as how easily a subject was distracted from actual tasks at hand, and how long they were able to maintain their concentration.

Lapses can clearly be tied to the gene variant

The scientists used statistical methods to check whether it was possible to associate the forgetfulness symptoms elicited by means of the surveys to one of the DRD2 gene variants. The results showed that functions such as attention and memory are less clearly expressed in persons who carry the thymine variant of the gene than in the cytosine type. “The connection is obvious; such lapses can partially be attributed to this gene variant,” reported Dr. Markett. According to their own statements, the subjects with the thymine DRD2 variant more frequently “fall victim” to forgetfulness or attention deficits. And vice versa, the cytosine type seems to be protected from that. “This result matches the results of other studies very well,” added Dr. Markett.
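The kind of group comparison behind such an association can be sketched with a simple permutation test. All numbers below are invented for illustration; the Bonn team's actual statistical methods are not detailed here.

```python
import random
import statistics

def permutation_p(group_a, group_b, n_perm=10_000, seed=42):
    """Two-sided permutation p-value for the difference of group means."""
    observed = statistics.mean(group_a) - statistics.mean(group_b)
    pooled = list(group_a) + list(group_b)
    rng = random.Random(seed)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # randomly reassign subjects to the two groups
        diff = (statistics.mean(pooled[:len(group_a)])
                - statistics.mean(pooled[len(group_a):]))
        if abs(diff) >= abs(observed):
            extreme += 1
    return extreme / n_perm

# Invented lapse-questionnaire scores (higher = more everyday lapses).
thymine_carriers = [22, 25, 27, 24, 26, 28, 23, 27]
cytosine_only    = [18, 20, 17, 21, 19, 16, 20, 18]
p = permutation_p(thymine_carriers, cytosine_only)
print(p)  # a small p suggests the group difference is unlikely by chance
```

With a real sample of 500 subjects, the groups would of course be far larger and unevenly sized, as in the study.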

Carriers of the gene variant linked to forgetfulness may now find solace in the fact that they are not responsible for their genes and that this is simply their fate… but Dr. Markett doesn’t agree. “There are things you can do to compensate for forgetfulness: writing yourself notes, or making more of an effort to put your keys down in a specific location, not just anywhere.” Those who develop such strategies for the different areas of their lives are better able to handle their deficits.

(Source: www3.uni-bonn.de)

Filed under forgetfulness DRD2 dopamine memory frontal lobe neuroscience science

246 notes

Outside the body our memories fail us

New research from Karolinska Institutet and Umeå University in Sweden demonstrates for the first time that there is a close relationship between body perception and the ability to remember. For us to be able to store new memories from our lives, we need to feel that we are in our own body. According to researchers, the results could be of major importance in understanding the memory problems that psychiatric patients often exhibit.

The memories of what happened on the first day of school are an example of an episodic memory. How these memories are created, and what role the perception of one’s own body plays in storing them, has long been unclear. Swedish researchers can now demonstrate that volunteers who experience an exciting event whilst perceiving an illusion of being outside their own body exhibit a form of memory loss.

“It is already evident that people who have suffered psychiatric conditions in which they felt that they were not in their own body have fragmentary memories of what actually occurred”, says Loretxu Bergouignan, principal author of the current study. “We wanted to see how this manifests itself in healthy subjects.”

The study, which is published in the scientific journal PNAS, involved a total of 84 students who underwent four oral questioning sessions. To make these sessions extra memorable, an actor (Peter Bergared) took on the role of examiner, a (fictional) very eccentric professor at Karolinska Institutet. Two of the interrogations were perceived from a first-person perspective within the participants’ own bodies in the usual way, while in the other two sessions the participants experienced a created illusion of being outside their own body. In both cases, the participants wore virtual reality goggles and earphones. One week later, they either underwent memory testing, in which they had to recall the events and provide details about what had happened, in which order, and what they felt, or they had to try to remember the events while undergoing brain imaging with functional magnetic resonance imaging (fMRI).

It turned out that the participants remembered the ‘out-of-body’ interrogations significantly worse than those experienced from the normal ‘in-body’ perspective. This was the case despite the fact that they responded equally well to the questions from each situation and also indicated that they experienced the same level of emotion. The fMRI scans further revealed a crucial difference in activity in the portion of the temporal lobe – the hippocampus – that is known to be central for episodic memories.

“When they tried to remember what happened during the interrogations experienced out-of-body, activity in the hippocampus was eliminated, unlike when they remembered the other situations. However, we could see activity in the frontal lobe cortex, so they were really making an effort to remember”, says professor Henrik Ehrsson, the research group leader behind the study. 

The researchers’ interpretation of the results is that there is a close relationship between body experience and memory. Our brain constantly creates the experience of one’s own body in space by combining information from multiple senses: sight, hearing, touch, and more. When a memory is created, it is the task of the hippocampus to link all the information found in the cerebral cortex into a unified memory for further long-term storage. During the experience of being outside one’s body, this memory storage process is disturbed, whereupon the brain creates fragmentary memories instead.

“We believe that this new knowledge may be important for future research on memory disorders in a number of psychiatric conditions such as post-traumatic stress disorder, borderline personality disorder and certain psychoses where patients have dissociative experiences,” says Loretxu Bergouignan.

(Source: news.cision.com)

Filed under hippocampus frontal lobe body perception memory neuroimaging neuroscience science

267 notes

Honesty beats dishonesty for making you feel good
A University of Toronto report based on two neural imaging studies that monitored brain activity has found a reward given for telling the truth gives people greater satisfaction than the same reward given for deceit.
These studies were published recently in the neuroscience journals Neuropsychologia and NeuroImage.
“Our findings together show that people typically find truth-telling to be more rewarding than lying in different types of deceptive situations,” said Professor Kang Lee, whose research is funded in part by the Social Sciences and Humanities Research Council.
The findings are based on two studies of Chinese participants using a new neuroimaging method called near-infrared spectroscopy. The studies are among the first to address the question of whether lying makes people feel better or worse than telling the truth.
The studies explored two different types of deception. In first-order deception, the recipient does not know the deceiver is lying. In second-order deception, the deceivers are fully aware that the recipient knows their intention, such as bluffing in poker.
The researchers were surprised to find that a liar’s cortical reward system was more active when a reward was gained through truth-telling than lying. This was true in both types of deception.
Researchers also found that in both types of deception, telling a lie produced greater brain activations than telling the truth in the frontal lobe, suggesting lying is cognitively more taxing than truth-telling and uses more neural resources.
The researchers hope this study will advance understanding of the neural mechanisms underlying lying, a ubiquitous and frequent human behaviour, and help to diagnose pathological liars who may have different neural responses when lying or telling the truth.

Filed under neuroimaging brain activity lying deception frontal lobe psychology neuroscience science

174 notes

Sleep deprivation linked to junk food cravings
A sleepless night makes us more likely to reach for doughnuts or pizza than for whole grains and leafy green vegetables, suggests a new study from UC Berkeley that examines the brain regions that control food choices. The findings shed new light on the link between poor sleep and obesity.
Using functional magnetic resonance imaging (fMRI), UC Berkeley researchers scanned the brains of 23 healthy young adults, first after a normal night’s sleep and next, after a sleepless night. They found impaired activity in the sleep-deprived brain’s frontal lobe, which governs complex decision-making, but increased activity in deeper brain centers that respond to rewards. Moreover, the participants favored unhealthy snack and junk foods when they were sleep deprived.
“What we have discovered is that high-level brain regions required for complex judgments and decisions become blunted by a lack of sleep, while more primal brain structures that control motivation and desire are amplified,” said Matthew Walker, a UC Berkeley professor of psychology and neuroscience and senior author of the study published today (Tuesday, Aug. 6) in the journal Nature Communications.
Moreover, he added, “high-calorie foods also became significantly more desirable when participants were sleep-deprived. This combination of altered brain activity and decision-making may help explain why people who sleep less also tend to be overweight or obese.”
Previous studies have linked poor sleep to greater appetites, particularly for sweet and salty foods, but the latest findings provide a specific brain mechanism explaining why food choices change for the worse following a sleepless night, Walker said.
“These results shed light on how the brain becomes impaired by sleep deprivation, leading to the selection of more unhealthy foods and, ultimately, higher rates of obesity,” said Stephanie Greer, a doctoral student in Walker’s Sleep and Neuroimaging Laboratory and lead author of the paper. Another co-author of the study is Andrea Goldstein, also a doctoral student in Walker’s lab.
In this newest study, researchers measured brain activity as participants viewed a series of 80 food images, ranging from high- to low-calorie and from healthy to unhealthy, and rated their desire for each item. As an incentive, they were given the food they most craved after the MRI scan.
Food choices presented in the experiment ranged from fruits and vegetables, such as strawberries, apples and carrots, to high-calorie burgers, pizza and doughnuts. The latter are examples of the more popular choices following a sleepless night.
On a positive note, Walker said, the findings indicate that “getting enough sleep is one factor that can help promote weight control by priming the brain mechanisms governing appropriate food choices.”
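The design described here amounts to a within-subject comparison: the same images are rated in a rested state and in a sleep-deprived state, and the question is how desirability shifts between the two. A minimal sketch of that comparison, using entirely hypothetical ratings (this is not the study's data or analysis):

```python
# Illustrative sketch only: compare desirability ratings for the same
# high-calorie food images after a rested night versus a sleepless night.
# All numbers below are hypothetical.
from statistics import mean

# Hypothetical 1-4 desirability ratings for ten high-calorie items
rested   = [2, 3, 2, 2, 3, 2, 3, 2, 2, 3]
deprived = [3, 4, 3, 3, 4, 3, 4, 3, 3, 4]

# Per-item change in desirability after sleep deprivation
deltas = [d - r for r, d in zip(rested, deprived)]

shift = mean(deltas)
print(f"mean shift in desirability: {shift:+.2f}")  # positive = more craving
```

A positive mean shift concentrated on high-calorie items, paired with no comparable shift for low-calorie items, is the pattern the reported findings describe.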

Filed under sleep deprivation obesity brain activity fMRI decision making frontal lobe neuroscience science

289 notes

Brain frontal lobes not sole centre of human intelligence

Human intelligence cannot be explained by the size of the brain’s frontal lobes, say researchers.

Research into the comparative size of the frontal lobes in humans and other species has determined that they are not - as previously thought - disproportionately enlarged relative to other areas of the brain, according to the most accurate and conclusive study of this area of the brain.

It concludes that the size of our frontal lobes cannot solely account for humans’ superior cognitive abilities.

The study by Durham and Reading universities suggests that supposedly more ‘primitive’ areas, such as the cerebellum, were equally important in the expansion of the human brain. These areas may therefore play unexpectedly important roles in human cognition and its disorders, such as autism and dyslexia, say the researchers.

The study is published in the Proceedings of the National Academy of Sciences (PNAS) today.

The frontal lobes are an area in the brain of mammals located at the front of each cerebral hemisphere, and are thought to be critical for advanced intelligence.

Lead author Professor Robert Barton from the Department of Anthropology at Durham University, said: “Probably the most widespread assumption about how the human brain evolved is that size increase was concentrated in the frontal lobes.

“It has been thought that frontal lobe expansion was particularly crucial to the development of modern human behaviour, thought and language, and that it is our bulging frontal lobes that truly make us human. We show that this is untrue: human frontal lobes are exactly the size expected for a non-human brain scaled up to human size.

“This means that areas traditionally considered to be more primitive were just as important during our evolution. These other areas should now get more attention. In fact there is already some evidence that damage to the cerebellum, for example, is a factor in disorders such as autism and dyslexia.”

The scientists argue that many of our high-level abilities are carried out by more extensive brain networks linking many different areas of the brain. They suggest it may be the structure of these extended networks more than the size of any isolated brain region that is critical for cognitive functioning.

Previously, various studies have tried to establish whether humans’ frontal lobes are disproportionately enlarged compared with those of other primates, such as apes and monkeys. These produced a confused picture, with differing methods and measurements leading to inconsistent findings.

The Durham and Reading researchers, funded by The Leverhulme Trust, analysed data sets from previous animal and human studies using phylogenetic, or ‘evolutionary family tree’, methods, and found consistent results across all their data. They used a new method to look at the speed with which evolutionary change occurred, concluding that the frontal lobes did not evolve especially fast along the human lineage after it split from the chimpanzee lineage.
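The core of the argument is allometric scaling: fit the trend relating frontal lobe size to overall brain size across non-human species, then ask whether the human value sits on that trend line or above it. A toy sketch of that logic, with made-up volumes rather than the study's data or its phylogenetic methods:

```python
# Toy allometric-scaling sketch (hypothetical volumes in cm^3, not real data):
# regress log(frontal lobe volume) on log(total brain volume) across
# non-human primates, then see what the trend predicts at human brain size.
import math

# Hypothetical (brain volume, frontal volume) pairs for non-human primates
primates = [(80, 28), (160, 56), (320, 112), (400, 140)]

xs = [math.log(b) for b, _ in primates]
ys = [math.log(f) for _, f in primates]
n = len(primates)
x_bar, y_bar = sum(xs) / n, sum(ys) / n

# Ordinary least-squares slope and intercept on the log-log scale
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
        sum((x - x_bar) ** 2 for x in xs)
intercept = y_bar - slope * x_bar

def predicted_frontal(brain_volume):
    """Frontal volume expected at this brain size under the primate trend."""
    return math.exp(intercept + slope * math.log(brain_volume))

# For a human-sized brain (~1350 cm^3): if the observed frontal volume
# matches this prediction, the frontal lobes are not disproportionately
# enlarged -- which is the study's conclusion.
print(round(predicted_frontal(1350), 1))
```

The paper's claim, in these terms, is that the observed human frontal lobe volume falls on the non-human trend line rather than above it.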

(Source: eurekalert.org)

Filed under frontal lobe cognition intelligence cerebellum prefrontal cortex neuroscience psychology science

190 notes

Poor sleep in old age prevents the brain from storing memories
The connection between poor sleep, memory loss and brain deterioration as we grow older has been elusive. But for the first time, scientists at the University of California, Berkeley, have found a link between these hallmark maladies of old age. Their discovery opens the door to boosting the quality of sleep in elderly people to improve memory.
UC Berkeley neuroscientists have found that the slow brain waves generated during the deep, restorative sleep we typically experience in youth play a key role in transporting memories from the hippocampus – which provides short-term storage for memories – to the prefrontal cortex’s longer term “hard drive.”
However, in older adults, memories may be getting stuck in the hippocampus due to the poor quality of deep ‘slow wave’ sleep, and are then overwritten by new memories, the findings suggest.
“What we have discovered is a dysfunctional pathway that helps explain the relationship between brain deterioration, sleep disruption and memory loss as we get older – and with that, a potentially new treatment avenue,” said UC Berkeley sleep researcher Matthew Walker, an associate professor of psychology and neuroscience at UC Berkeley and senior author of the study published in the journal Nature Neuroscience.

Filed under brainwaves sleep memory prefrontal cortex frontal lobe aging neuroscience science

162 notes

Individuals with a low risk for cocaine dependence have a differently shaped brain from those with addiction
People who take cocaine over many years without becoming addicted have a brain structure which is significantly different from those individuals who developed cocaine-dependence, researchers have discovered. New research from the University of Cambridge has found that recreational drug users who have not developed a dependence have an abnormally large frontal lobe, the section of the brain implicated in self-control. Their research was published in the journal Biological Psychiatry.
For the study, led by Dr Karen Ersche, individuals who use cocaine on a regular basis underwent a brain scan and completed a series of personality tests. The majority of the cocaine users were addicted to the drug but some were not (despite having used it for several years).
The scientists discovered that a region in the frontal lobes of the brain, known to be critically implicated in decision-making and self-control, was abnormally large in the recreational cocaine users. The Cambridge researchers suggest that this increase in grey matter volume, which they believe predates drug use, might reflect resilience to the effects of cocaine, and may even help these recreational users exert self-control and make advantageous decisions that minimize their risk of becoming addicted.
They found that this same region in the frontal lobes of the brain was significantly reduced in size in people with cocaine dependence, confirming earlier research that had found similar results. They believe that at least some of these changes are the result of drug use, which causes drug users to lose grey matter.
They also found that people who use illicit drugs like cocaine exhibit high levels of sensation-seeking personality traits, but only those developing dependence show personality traits of impulsivity and compulsivity.
Dr Ersche, of the Behavioural and Clinical Neuroscience Institute (BCNI) at the University of Cambridge, said: “These findings are important because they show that the use of cocaine does not inevitably lead to addiction in people with good self-control and no familial risk.
“Our findings indicate that preventative strategies might be more effective if they were tailored more closely to those individuals at risk according to their personality profile and brain structure.”
The researchers will next explore the basis of the recreational users’ apparent resilience to drug dependence. Dr Ersche added: “Their high level of education, less troubled family background or the beginning of drug-taking only after puberty may all play a role.”
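The study's central contrast is grey-matter volume in one frontal region across groups: larger than normal in recreational users, smaller than normal in dependent users. A minimal sketch of that comparison, with entirely made-up volumes (not the study's data):

```python
# Illustrative sketch only: compare mean grey-matter volume in a frontal
# region across three groups. All volumes below are hypothetical.
from statistics import mean

# Hypothetical regional grey-matter volumes (arbitrary units) per group
volumes = {
    "controls":     [10.0, 10.2, 9.8, 10.1],
    "recreational": [11.1, 11.4, 10.9, 11.2],  # enlarged region
    "dependent":    [8.7, 8.9, 8.6, 8.8],      # reduced region
}

means = {group: mean(vs) for group, vs in volumes.items()}

# Direction of effect, relative to controls
for group in ("recreational", "dependent"):
    diff = means[group] - means["controls"]
    print(f"{group}: {diff:+.2f} vs controls")
```

The reported pattern corresponds to a positive difference for the recreational group and a negative one for the dependent group.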

Filed under cocaine cocaine dependence brain brain structure frontal lobe psychology neuroscience science
