Neuroscience

Articles and news from the latest research reports.

Blind lead the way in brave new world of tactile technology

Imagine feeling a slimy jellyfish or a prickly cactus, or tracing map directions, on your iPad mini Retina display, because that’s where tactile technology is headed. But you’ll need more than just an index finger to feel your way around.

New research at UC Berkeley has found that people are better and faster at navigating tactile technology when using both hands and several fingers. Moreover, blind people in the study outmaneuvered their sighted counterparts – especially when using both hands and several fingers – possibly because they’ve developed superior cognitive strategies for finding their way around.

Bottom line: Two hands are better than one in the brave new world of tactile or “haptic” technology, and the visually impaired can lead the way.

“Most sighted people will explore these types of displays with a single finger. But our research shows that this is a bad decision. No matter what the task, people perform better using multiple fingers and hands,” said Valerie Morash, a doctoral student in psychology at UC Berkeley and lead author of the study, just published in the online edition of the journal Perception.

“We can learn from blind people how to effectively use multiple fingers, and then teach these strategies to sighted individuals who have recently lost vision or are using tactile displays in high-stakes applications like controlling surgical robots,” she added.

For decades, scientists have studied how receptors on the fingertips relay information to the brain. Now, researchers at Disney and other media companies are implementing more tactile interfaces, which use vibration and electrostatic or magnetic feedback to help users find their way around or experience how something feels.

In this latest study, Morash and fellow researchers at UC Berkeley and the Smith-Kettlewell Eye Research Institute in San Francisco tested 14 blind adults and 14 blindfolded sighted adults on several tasks using a tactile map. Using various hand and finger combinations, they were tasked with such challenges as finding a landmark or figuring out if a road looped around.

Overall, both blind and sighted participants performed better when using both hands and several fingers, although blind participants were, on average, 50 percent faster at completing the tasks, and even faster when they used both hands and all their fingers.

“As we move forward with integrating tactile feedback into displays, these technologies absolutely need to support multiple fingers,” Morash said. “This will promote the best tactile performance in applications such as the remote control of robotics used in space and high-risk situations, among other things.”

(Source: newscenter.berkeley.edu)

Filed under blind tactile technology haptic sensing perception neuroscience science

Fruit fly research may reveal what happens in female brains during courtship and mating

What are the complex processes in the brain involved with choosing a mate, and are these processes different in females versus males? It’s difficult to study such questions in people, but researchers are finding clues in fruit flies that might be relevant to humans and other animals. Three different studies on the topic are being published in the Cell Press journals Neuron (1, 2) and Current Biology.

Work over the past 100 years has largely focused on the overt courtship behaviors that male flies direct toward females. However, the female ultimately decides whether to reject the male or copulate with him. How does the female make this decision? In one Neuron paper, researchers report that they have identified two small groups of neurons in the female brain that function to modulate whether she will mate or not with a male based on his distinct pheromones and courtship song. In this paper, a team led by Dr. Bruce Baker of the Howard Hughes Medical Institute’s Janelia Farm Research Campus in Virginia also reports that these neurons are genetically distinct from the previously identified neurons that function to drive the elaborate courtship ritual with which a male woos a female. “An understanding of the neural mechanisms underlying how sensory information elicits appropriate sexual behaviors can be used as a point of comparison for how similar sexual behavior circuits are structured and function in other species,” says Dr. Baker.

In the Current Biology study, Dr. Leslie Vosshall of The Rockefeller University in New York City and her team found that a small group of neurons in the abdominal nerve cord and reproductive tract—called Abdominal-B neurons—is necessary for the female to pause her movement and interact with a courting male. When the neurons are inactivated, the female ignores the male and keeps moving, but when the neurons are activated, the female spontaneously pauses. “Sexual courtship is a duet—the male and female send signals back and forth until they reach the point that copulation proceeds,” says Dr. Jennifer Bussell, the lead author of the study. “Pausing to interact with a male, rather than avoiding him, is a crucial step in any female’s behavior leading to copulation. Tying a group of neurons to this particular response to males will allow us to dissect in detail how female mating circuitry functions.”

In another Neuron paper, researchers studied the effects of a small protein called sex peptide that is transferred along with sperm from males to females and is detected by sensory neurons in the uterus. Sex peptide changes the female’s behavior so that she is reluctant to mate again for about 10 days. The investigators traced the neuronal pathway that is modulated when the uterus’s sensory neurons detect sex peptide. “Thanks to our work, we think the sex peptide signal goes to a region of the fly’s brain that is the homolog of the hypothalamus, which has been known for many years to be central in controlling sexual receptivity in vertebrates,” explains co-lead author Dr. Mark Palfreyman of the Research Institute of Molecular Pathology in Vienna, Austria. This region of the brain links the nervous system to the endocrine, or hormonal, system. “Of course, these models will still need to be tested and our work only provides an initial glimpse, but our study opens the possibility that analogous neuroendocrine systems control sexual receptivity from flies to vertebrates,” adds senior author Dr. Barry Dickson, who was also a co-author on the Current Biology paper published by Dr. Vosshall.

Filed under fruit flies neurons mating sex peptide sensory neurons neuroscience science

Only 25 Minutes of Mindfulness Meditation Alleviates Stress

Mindfulness meditation has become an increasingly popular way for people to improve their mental and physical health, yet most research supporting its benefits has focused on lengthy, weeks-long training programs.

New research from Carnegie Mellon University is the first to show that brief mindfulness meditation practice — 25 minutes for three consecutive days — alleviates psychological stress. Published in the journal Psychoneuroendocrinology, the study investigates how mindfulness meditation affects people’s ability to be resilient under stress.

"More and more people report using meditation practices for stress reduction, but we know very little about how much you need to do for stress reduction and health benefits," said lead author J. David Creswell, associate professor of psychology in the Dietrich College of Humanities and Social Sciences.

For the study, Creswell and his research team had 66 healthy individuals aged 18-30 years old participate in a three-day experiment. Some participants went through a brief mindfulness meditation training program; for 25 minutes for three consecutive days, the individuals were given breathing exercises to help them monitor their breath and pay attention to their present moment experiences. A second group of participants completed a matched three-day cognitive training program in which they were asked to critically analyze poetry in an effort to enhance problem-solving skills.

Following the final training activity, all participants were asked to complete stressful speech and math tasks in front of stern-faced evaluators. Each individual reported their stress levels in response to the speech and math performance tasks, and provided saliva samples for measurement of cortisol, commonly referred to as the stress hormone.

The participants who received the brief mindfulness meditation training reported reduced stress perceptions to the speech and math tasks, indicating that the mindfulness meditation fostered psychological stress resilience. More interestingly, on the biological side, the mindfulness meditation participants showed greater cortisol reactivity.

"When you initially learn mindfulness meditation practices, you have to cognitively work at it — especially during a stressful task," said Creswell. "And, these active cognitive efforts may result in the task feeling less stressful, but they may also have physiological costs with higher cortisol production."

Creswell’s group is now testing the possibility that mindfulness can become more automatic and easy to use with long-term mindfulness meditation training, which may result in reduced cortisol reactivity.

Filed under meditation mindfulness meditation stress psychology neuroscience science

Study Identifies Predictors for Teen Binge-Drinking

Neuroscientists leading the largest longitudinal adolescent brain imaging study to date have learned that predicting teenage binge-drinking is possible. In fact, say the researchers in the group’s latest publication, a number of factors – genetics, brain function and about 40 different variables – can help scientists predict with about 70 percent accuracy which teens will become binge drinkers. The study appears online July 3, 2014 as an Advance Online Publication in the journal Nature.

First author Robert Whelan, Ph.D., a former University of Vermont (UVM) postdoctoral fellow in psychiatry and current lecturer at University College Dublin, and senior author Hugh Garavan, Ph.D., UVM associate professor of psychiatry, and colleagues conducted 10 hours of comprehensive assessments – these included neuroimaging to assess brain activity and brain structure, along with other measures such as IQ, cognitive task performance, personality and blood tests – on each of 2,400 14-year-old adolescents at eight different sites across Europe.

“Our goal was to develop a model to better understand the relative roles of brain structure and function, personality, environmental influences and genetics in the development of adolescent abuse of alcohol,” says Whelan. “This multidimensional risk profile of genes, brain function and environmental influences can help in the prediction of binge drinking at age 16 years.”

A 2012 Nature Neuroscience paper by the same researchers identified brain networks that predisposed some teens to higher-risk behaviors like experimentation with drugs and alcohol. This new study builds on that earlier work by following those adolescents for years (the participants are now 19 years old) and identifying those who developed a pattern of binge-drinking. The 2014 Nature study aimed to predict who would go on to drink heavily at age 16 using only data collected at age 14. The researchers applied a broad range of measures, developing a unique analytic method to predict which individuals would become binge-drinkers. The reliability of the results was confirmed by showing the same accuracy when the model was tested on a new, separate group of teenagers. The result was a list of predictors that ranged from brain and genetics to personality and personal history factors.
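
The validation logic described above (fit a predictive model on one sample, then confirm its accuracy on a completely separate group) can be sketched as follows. This is an illustrative logistic regression on synthetic data, not the study's actual model, variables, or sample sizes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: 400 "teenagers", 40 mixed predictors
# (brain, personality, history), carrying a weak linear signal.
n, p = 400, 40
true_w = rng.normal(0.0, 0.3, size=p)
X = rng.normal(size=(n, p))
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-(X @ true_w)))).astype(float)

# Discovery sample versus a separate held-out validation sample,
# mirroring the study's test on a new group of teenagers.
X_train, X_val = X[:300], X[300:]
y_train, y_val = y[:300], y[300:]

# Plain logistic regression fitted by gradient descent.
w = np.zeros(p)
for _ in range(2000):
    p_hat = 1.0 / (1.0 + np.exp(-(X_train @ w)))
    w -= 0.5 * X_train.T @ (p_hat - y_train) / len(y_train)

# Accuracy is only meaningful on the group the model never saw.
acc = float(np.mean((1.0 / (1.0 + np.exp(-(X_val @ w))) > 0.5) == y_val))
print(f"held-out accuracy: {acc:.2f}")
```

The point of the held-out group is that a model with 40 variables can fit its own discovery sample arbitrarily well; only the separate sample shows whether the predictors generalise.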

“Notably, it’s not the case that there’s a single one or two or three variables that are critical,” says Garavan. “The final model was very broad – it suggests that a wide mixture of reasons underlie teenage drinking.”

Some of the best predictors, shares Garavan, include personality, sensation-seeking traits, lack of conscientiousness, and a family history of drug use. Having even a single drink at age 14 was also a powerful predictor. That type of risk-taking behavior – and the impulsivity that often accompanies it – was a critical predictor. In addition, teens who had experienced several stressful life events were at greater risk for binge-drinking.

One interesting finding, says Garavan, was that bigger brains were also predictive. Adolescents undergo significant brain changes, and alongside the formation of personalities and social networks, it is normal for their brains to shrink to a smaller, more efficient size.

“There’s refining and sculpting of the brain, and most of the gray matter – the neurons and the connections between them – is getting smaller and the white matter is getting larger,” he explains. “Kids with more immature brains – those that are still larger – are more likely to drink.”

Garavan, Whelan and colleagues believe that by better understanding the probable causal factors for binge-drinking, targeted interventions for those most at risk could be applied.

Gunter Schumann, M.D., professor of biological psychiatry and head of the section at the Social, Genetic and Developmental Psychiatry Centre, Institute of Psychiatry, King’s College London, is the principal investigator of the IMAGEN study, which is the source of this latest paper. “We aimed to develop a ‘gold standard’ model for predicting teenage behavior, which can be used as a benchmark for the development of simpler, widely applicable prediction models,” says Schumann. “This work will inform the development of specific early interventions in carriers of the risk profile to reduce the incidence of adolescent substance abuse. We now propose to extend analysis of the IMAGEN data in order to investigate the development of substance use patterns in the context of moderating environmental factors, such as exposure to nicotine or drugs as well as psychosocial stress.”

In the future, the researchers hope to perform more in-depth analyses of the brain factors involved and determine whether or not there are different predictors for abuse of other drugs. A similar analysis, which is using the same dataset to look at the predictors of cannabis use, is planned for the near future.

(Source: uvm.edu)

Filed under binge-drinking alcohol neuroimaging brain activity brain structure neuroscience science

New study discovers biological basis for magic mushroom ‘mind expansion’

Psychedelic drugs such as LSD and magic mushrooms can profoundly alter the way we experience the world but little is known about what physically happens in the brain. New research, published in Human Brain Mapping, has examined the brain effects of the psychedelic chemical in magic mushrooms, called psilocybin, using data from brain scans of volunteers who had been injected with the drug.

The study found that under psilocybin, activity in the more primitive brain network linked to emotional thinking became more pronounced, with several different areas in this network - such as the hippocampus and anterior cingulate cortex - active at the same time. This pattern of activity is similar to the pattern observed in people who are dreaming. Conversely, volunteers who had taken psilocybin had more disjointed and uncoordinated activity in the brain network that is linked to high-level thinking, including self-consciousness.

Psychedelic drugs are unique among psychoactive chemicals in that users often describe ‘expanded consciousness,’ including enhanced associations, vivid imagination and dream-like states. To explore the biological basis for this experience, researchers analysed brain imaging data from 15 volunteers who were given psilocybin intravenously while they lay in a functional magnetic resonance imaging (fMRI) scanner. Volunteers were scanned under the influence of psilocybin and when they had been injected with a placebo.

“What we have done in this research is begin to identify the biological basis of the reported mind expansion associated with psychedelic drugs,” said Dr Robin Carhart-Harris from the Department of Medicine, Imperial College London. “I was fascinated to see similarities between the pattern of brain activity in a psychedelic state and the pattern of brain activity during dream sleep, especially as both involve the primitive areas of the brain linked to emotions and memory. People often describe taking psilocybin as producing a dream-like state and our findings have, for the first time, provided a physical representation for the experience in the brain.”

The new study examined variation in the amplitude of fluctuations in what is called the blood-oxygen level dependent (BOLD) signal, which tracks activity levels in the brain. This revealed that activity in important brain networks linked to high-level thinking in humans becomes unsynchronised and disorganised under psilocybin. One particular network that was especially affected plays a central role in the brain, essentially ‘holding it all together’, and is linked to our sense of self.

In comparison, activity in the different areas of a more primitive brain network became more synchronised under the drug, indicating they were working in a more co-ordinated, ‘louder’ fashion. The network involves areas of the hippocampus, associated with memory and emotion, and the anterior cingulate cortex which is related to states of arousal.
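
One simple and common way to quantify the kind of synchrony described here is the mean pairwise correlation between regional time series: near 1 when regions rise and fall together, near 0 when they fluctuate independently. The sketch below uses made-up signals, not the study's fMRI data, and the helper function is illustrative only:

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_pairwise_correlation(ts):
    """Mean off-diagonal Pearson correlation across regions.

    ts: array of shape (n_regions, n_timepoints).
    """
    c = np.corrcoef(ts)
    mask = ~np.eye(c.shape[0], dtype=bool)  # drop the trivial diagonal
    return float(c[mask].mean())

# Made-up signals: three "regions" sharing a slow common driver
# (synchronised), versus three independent noise traces.
t = np.linspace(0.0, 10.0, 500)
driver = np.sin(t)
synced = np.stack([driver + 0.2 * rng.normal(size=t.size) for _ in range(3)])
independent = rng.normal(size=(3, t.size))

print(mean_pairwise_correlation(synced))       # high: regions share a driver
print(mean_pairwise_correlation(independent))  # near zero
```

In this toy picture, psilocybin's reported effect corresponds to the primitive network looking more like `synced` and the high-level network looking more like `independent`.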
Lead author Dr Enzo Tagliazucchi from Goethe University, Germany, said: “A good way to understand how the brain works is to perturb the system in a marked and novel way. Psychedelic drugs do precisely this and so are powerful tools for exploring what happens in the brain when consciousness is profoundly altered. It is the first time we have used these methods to look at brain imaging data and it has given some fascinating insight into how psychedelic drugs expand the mind. It really provides a window through which to study the doors of perception.”

Dr. Carhart-Harris added: “Learning about the mechanisms that underlie what happens under the influence of psychedelic drugs can also help to understand their possible uses. We are currently studying the effect of LSD on creative thinking and we will also be looking at the possibility that psilocybin may help alleviate symptoms of depression by allowing patients to change their rigidly pessimistic patterns of thinking. Psychedelics were used for therapeutic purposes in the 1950s and 1960s but now we are finally beginning to understand their action in the brain and how this can inform how to put them to good use.”

The data was originally collected at Imperial College London in 2012 by a research group led by Dr Carhart-Harris and Professor David Nutt from the Department of Medicine, Imperial College London. Initial results revealed a variety of changes in the brain associated with drug intake. To explore the data further Dr. Carhart-Harris recruited specialists in the mathematical modelling of brain networks, Professor Dante Chialvo and Dr Enzo Tagliazucchi, to investigate how psilocybin alters brain activity to produce its unusual psychological effects.

As part of the new study, the researchers applied a measure called entropy. This was originally developed by physicists to quantify lost energy in mechanical systems, such as a steam engine, but entropy can also be used to measure the range or randomness of a system. For the first time, researchers computed the level of entropy for different networks in the brain during the psychedelic state. This revealed a remarkable increase in entropy in the more primitive network, indicating there was an increased number of patterns of activity that were possible under the influence of psilocybin. It seemed the volunteers had a much larger range of potential brain states that were available to them, which may be the biophysical counterpart of ‘mind expansion’ reported by users of psychedelic drugs.

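
The intuition behind this entropy measure can be sketched with Shannon entropy over a repertoire of discrete brain states: the more distinct patterns the brain visits, and the more evenly it visits them, the higher the entropy. The example below uses made-up state sequences, not the paper's actual estimator or data:

```python
import numpy as np

def shannon_entropy(states):
    """Shannon entropy (in bits) of the empirical distribution of states.

    states: sequence of discrete brain-state labels, e.g. the activity
    pattern observed at each time point.
    """
    _, counts = np.unique(states, return_counts=True)
    probs = counts / counts.sum()
    return float(-(probs * np.log2(probs)).sum())

rng = np.random.default_rng(2)

# Made-up repertoires: a narrow repertoire of 4 recurring patterns
# versus a wide repertoire of 16, echoing the reported entropy increase
# in the primitive network under psilocybin.
placebo_states = rng.choice(4, size=1000)
psilocybin_states = rng.choice(16, size=1000)

print(shannon_entropy(placebo_states))     # close to 2 bits (4 states)
print(shannon_entropy(psilocybin_states))  # close to 4 bits (16 states)
```

Higher entropy here means exactly what the paragraph describes: a larger range of activity patterns available to the system.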
Previous research has suggested that there may be an optimal number of dynamic networks active in the brain, neither too many nor too few. This may provide evolutionary advantages in terms of optimising the balance between the stability and flexibility of consciousness. The mind works best at a critical point when there is a balance between order and disorder and the brain maintains this optimal number of networks. However, when the number goes above this point, the mind tips into a more chaotic regime where there are more networks available than normal. Collectively, the present results suggest that psilocybin can manipulate this critical operating point.

New study discovers biological basis for magic mushroom ‘mind expansion’

Psychedelic drugs such as LSD and magic mushrooms can profoundly alter the way we experience the world but little is known about what physically happens in the brain. New research, published in Human Brain Mapping, has examined the brain effects of the psychedelic chemical in magic mushrooms, called psilocybin, using data from brain scans of volunteers who had been injected with the drug.

The study found that under psilocybin, activity in the more primitive brain network linked to emotional thinking became more pronounced, with several different areas in this network - such as the hippocampus and anterior cingulate cortex - active at the same time. This pattern of activity is similar to the pattern observed in people who are dreaming. Conversely, volunteers who had taken psilocybin had more disjointed and uncoordinated activity in the brain network that is linked to high-level thinking, including self-consciousness.

Psychedelic drugs are unique among other psychoactive chemicals in that users often describe ‘expanded consciousness,’ including enhanced associations, vivid imagination and dream-like states. To explore the biological basis for this experience, researchers analysed brain imaging data from 15 volunteers who were given psilocybin intravenously while they lay in a functional magnetic resonance imaging (fMRI) scanner. Volunteers were scanned under the influence of psilocybin and when they had been injected with a placebo.

“What we have done in this research is begin to identify the biological basis of the reported mind expansion associated with psychedelic drugs,” said Dr Robin Carhart-Harris from the Department of Medicine, Imperial College London.  “I was fascinated to see similarities between the pattern of brain activity in a psychedelic state and the pattern of brain activity during dream sleep, especially as both involve the primitive areas of the brain linked to emotions and memory. People often describe taking psilocybin as producing a dream-like state and our findings have, for the first time, provided a physical representation for the experience in the brain.”    

The new study examined variation in the amplitude of fluctuations in what is called the blood-oxygen level dependent (BOLD) signal, which tracks activity levels in the brain. This revealed that activity in important brain networks linked to high-level thinking in humans becomes unsynchronised and disorganised under psilocybin. One particular network that was especially affected plays a central role in the brain, essentially ‘holding it all together’, and is linked to our sense of self.

In comparison, activity in the different areas of a more primitive brain network became more synchronised under the drug, indicating they were working in a more co-ordinated, ‘louder’ fashion. The network involves areas of the hippocampus, associated with memory and emotion, and the anterior cingulate cortex which is related to states of arousal.

Lead author Dr Enzo Tagliazucchi from Goethe University, Germany said: “A good way to understand how the brain works is to perturb the system in a marked and novel way. Psychedelic drugs do precisely this and so are powerful tools for exploring what happens in the brain when consciousness is profoundly altered. It is the first time we have used these methods to look at brain imaging data and it has given some fascinating insight into how psychedelic drugs expand the mind. It really provides a window through which to study the doors of perception.”

Dr. Carhart-Harris added: “Learning about the mechanisms that underlie what happens under the influence of psychedelic drugs can also help to understand their possible uses. We are currently studying the effect of LSD on creative thinking and we will also be looking at the possibility that psilocybin may help alleviate symptoms of depression by allowing patients to change their rigidly pessimistic patterns of thinking. Psychedelics were used for therapeutic purposes in the 1950s and 1960s but now we are finally beginning to understand their action in the brain and how this can inform how to put them to good use.”

The data was originally collected at Imperial College London in 2012 by a research group led by Dr Carhart-Harris and Professor David Nutt from the Department of Medicine, Imperial College London. Initial results revealed a variety of changes in the brain associated with drug intake. To explore the data further Dr. Carhart-Harris recruited specialists in the mathematical modelling of brain networks, Professor Dante Chialvo and Dr Enzo Tagliazucchi to investigate how psilocybin alters brain activity to produce its unusual psychological effects.

As part of the new study, the researchers applied a measure called entropy. This was originally developed by physicists to quantify lost energy in mechanical systems, such as a steam engine, but entropy can also be used to measure the range or randomness of a system. For the first time, researchers computed the level of entropy for different networks in the brain during the psychedelic state. This revealed a remarkable increase in entropy in the more primitive network, indicating there was an increased number of patterns of activity that were possible under the influence of psilocybin. It seemed the volunteers had a much larger range of potential brain states that were available to them, which may be the biophysical counterpart of ‘mind expansion’ reported by users of psychedelic drugs.
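The idea of entropy as a measure of a system's range of states can be illustrated with a toy calculation. This is only a sketch of the general Shannon-entropy formula, not the study's actual analysis pipeline; the binarized "activity patterns" below are invented for illustration:

```python
import math
from collections import Counter

def shannon_entropy(patterns):
    """Shannon entropy (in bits) of a sequence of observed activity patterns.

    A higher value means more distinct patterns occur, with more even
    probability: a larger repertoire of available states."""
    counts = Counter(patterns)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Hypothetical binarized activity of a 3-region network sampled over time:
# each string marks which regions were active at that moment.
baseline = ["110", "110", "110", "011", "110", "011", "110", "110"]
psychedelic = ["110", "011", "101", "111", "001", "010", "100", "110"]

print(shannon_entropy(baseline))     # fewer recurring patterns -> lower entropy
print(shannon_entropy(psychedelic))  # wider repertoire of patterns -> higher entropy
```

In this spirit, the "remarkable increase in entropy" reported for the primitive network corresponds to the second case: more patterns of activity become possible, and no single pattern dominates.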

Previous research has suggested that there may be an optimal number of dynamic networks active in the brain, neither too many nor too few. This may provide evolutionary advantages in terms of optimising the balance between the stability and flexibility of consciousness. The mind works best at a critical point when there is a balance between order and disorder and the brain maintains this optimal number of networks. However, when the number goes above this point, the mind tips into a more chaotic regime where there are more networks available than normal. Collectively, the present results suggest that psilocybin can manipulate this critical operating point.

Filed under psychedelic drugs psilocybin functional connectivity neuroimaging brain activity neuroscience science

224 notes

Short sleep, aging brain

Researchers at Duke-NUS Graduate Medical School Singapore (Duke-NUS) have found evidence that the less older adults sleep, the faster their brains age. These findings, relevant in the context of Singapore’s rapidly ageing society, pave the way for future work on sleep loss and its contribution to cognitive decline, including dementia.

image

Past research has examined the impact of sleep duration on cognitive functions in older adults. Though faster brain ventricle enlargement is a marker for cognitive decline and the development of neurodegenerative diseases such as Alzheimer’s, the effects of sleep on this marker have never been measured.

The Duke-NUS study examined the data of 66 older Chinese adults from the Singapore-Longitudinal Aging Brain Study. Participants underwent structural MRI brain scans measuring brain volume and neuropsychological assessments testing cognitive function every two years. Additionally, their sleep duration was recorded through a questionnaire. Those who slept fewer hours showed evidence of faster ventricle enlargement and decline in cognitive performance.

"Our findings relate short sleep to a marker of brain aging," said Dr June Lo, the lead author and a Duke-NUS Research Fellow. "Work done elsewhere suggests that seven hours a day for adults seems to be the sweet spot for optimal performance on computer-based cognitive tests. In coming years we hope to determine what’s good for cardio-metabolic and long-term brain health too," added Professor Michael Chee, senior author and Director of the Centre for Cognitive Neuroscience at Duke-NUS.

(Source: eurekalert.org)

Filed under sleep sleep duration cognitive decline dementia neuroimaging neuroscience science

140 notes

Insect diet helped early humans build bigger brains

Figuring out how to survive on a lean-season diet of hard-to-reach ants, slugs and other bugs may have spurred the development of bigger brains and higher-level cognitive functions in the ancestors of humans and other primates, suggests research from Washington University in St. Louis.

“Challenges associated with finding food have long been recognized as important in shaping evolution of the brain and cognition in primates, including humans,” said Amanda D. Melin, PhD, assistant professor of anthropology in Arts & Sciences and lead author of the study.

“Our work suggests that digging for insects when food was scarce may have contributed to hominid cognitive evolution and set the stage for advanced tool use.”

Based on a five-year study of capuchin monkeys in Costa Rica, the research provides support for an evolutionary theory that links the development of sensorimotor intelligence (SMI) skills, such as increased manual dexterity, tool use, and innovative problem solving, to the creative challenges of foraging for insects and other foods that are buried, embedded or otherwise hard to procure.

Published in the June 2014 Journal of Human Evolution, the study is the first to provide detailed evidence from the field on how seasonal changes in food supplies influence the foraging patterns of wild capuchin monkeys.

The study is co-authored by biologist Hilary C. Young and anthropologists Krisztina N. Mosdossy and Linda M. Fedigan, all from the University of Calgary, Canada.

It notes that many human populations also eat embedded insects on a seasonal basis and suggests that this practice played a key role in human evolution.

“We find that capuchin monkeys eat embedded insects year-round but intensify their feeding seasonally, during the time that their preferred food – ripe fruit – is less abundant,” Melin said. “These results suggest embedded insects are an important fallback food.”

Previous research has shown that fallback foods help shape the evolution of primate body forms, including the development of strong jaws, thick teeth and specialized digestive systems in primates whose fallback diets rely mainly on vegetation.

This study suggests that fallback foods can also play an important role in shaping brain evolution among primates that fall back on insect-based diets, and that this influence is most pronounced among primates that evolve in habitats with wide seasonal variations, such as the wet-dry cycles found in some South American forests.

“Capuchin monkeys are excellent models for examining evolution of brain size and intelligence: for their small body size, they have impressively large brains,” Melin said. “Accessing hidden and well-protected insects living in tree branches and under bark is a cognitively demanding task, but provides a high-quality reward: fat and protein, which is needed to fuel big brains.”

But when it comes to using tools, not all capuchin monkey strains and lineages are created equal, and Melin’s theories may explain why.

Perhaps the most notable difference between the robust (tufted, genus Sapajus) and gracile (untufted, genus Cebus) capuchin lineages is their variation in tool use. While Cebus monkeys are known for clever food-foraging tricks, such as banging snails or fruits against branches, they can’t hold a stick to their Sapajus cousins when it comes to the innovative use and modification of sophisticated tools.

One explanation, Melin said, is that Cebus capuchins have historically and consistently occupied tropical rainforests, whereas the Sapajus lineage spread from their origins in the Atlantic rainforest into drier, more temperate and seasonal habitat types.

“Primates who extract foods in the most seasonal environments are expected to experience the strongest selection in the ‘sensorimotor intelligence’ domain, which includes cognition related to object handling,” Melin said. “This may explain the occurrence of tool use in some capuchin lineages, but not in others.”

Genetic analysis of mitochondrial chromosomes suggests that the Sapajus-Cebus diversification occurred millions of years ago in the late Miocene epoch.

“We predict that the last common ancestor of Cebus and Sapajus had a level of SMI more closely resembling extant Cebus monkeys, and that further expansion of SMI evolved in the robust lineage to facilitate increased access to varied embedded fallback foods, necessitated by more intense periods of fruit shortage,” she said.

One of the more compelling modern examples of this behavior, said Melin, is the seasonal consumption of termites by chimpanzees, whose use of tools to extract this protein-rich food source is an important survival technique in harsh environments.

What does this all mean for hominids?

While it’s hard to decipher the extent of seasonal dietary variations from the fossil record, stable isotope analyses indicate seasonal variation in diet for at least one South African hominin, Paranthropus robustus. Other isotopic research suggests that early human diets may have included a range of extractable foods, such as termites, plant roots and tubers.

Modern humans frequently consume insects, which are seasonally important when other animal foods are limited.

This study suggests that the ingenuity required to survive on a diet of elusive insects has been a key factor in the development of uniquely human skills:

It may well have been bugs that helped build our brains.

Filed under primates sensorimotor intelligence evolution tool use problem solving neuroscience science

134 notes

The Biology of Addiction Risk Looks Like Addiction

Research suggests that people at increased risk for developing addiction share many of the same neurobiological signatures of people who have already developed addiction. This similarity is to be expected, as individuals with family members who have struggled with addiction are over-represented in the population of addicted people.

However, a generation of animal research supports the hypothesis that the addiction process changes the brain in ways that converge with the distinctive neurobiology of the heritable risk for addiction. In other words, the more one uses addictive substances, the more one’s brain acquires the profile of someone who has inherited a risk for addiction.

One such change is a reduction in striatal dopamine release. Dopamine is a key brain chemical messenger involved in reward-related behaviors. Disturbances in dopamine signaling appear to contribute to reward processing that biases people to seek drug-like rewards and to develop drug-taking habits.

In the current issue of Biological Psychiatry, researchers at McGill University report that individuals at high risk for addiction show the same reduced dopamine response often observed in addicted individuals, identifying a new link between addiction risk and addiction in humans.

Dr. Marco Leyton and his colleagues recruited young adults, aged 18 to 25, who were classified into three groups: 1) a high-risk group of occasional stimulant users with an extensive family history of substance abuse; 2) a comparison group of occasional stimulant users with no family history; and 3) a second comparison group of individuals with no history of stimulant use and no known risk factors for addiction. Volunteers underwent a positron emission tomography (PET) scan involving the administration of amphetamine, which enabled the researchers to measure their dopamine response.

The authors found that the high-risk group of non-dependent young adults with extensive family histories of addiction displayed markedly reduced dopamine responses in comparison with both stimulant-naïve subjects and non-dependent users with no family history.

“This interesting new parallel between addiction risk and addiction may help to focus our attention on reward-related processes that contribute to the development of addiction, perhaps informing prevention strategies,” said Dr. John Krystal, Editor of Biological Psychiatry.

Leyton, a Professor at McGill University, said, “Young adults at risk of addictions have a strikingly disturbed brain dopamine reward system response when they are administered amphetamine. Past drug use seemed to aggravate the dopamine response also, but this was not a sufficient explanation. Instead, the disturbance may be a heritable biological marker that could identify those at highest risk.”

This finding suggests that there are common brain mechanisms that promote the use of addictive substances in vulnerable people and in people who have long-standing habitual substance use.

Better understanding this biology may help to advance our understanding of how people develop addiction problems, as well as providing hints related to biological mechanisms that might be targeted for prevention and treatment.

(Source: elsevier.com)

Filed under addiction reward system dopamine neuroscience science

352 notes

Addiction starts with an overcorrection in the brain

The National Institutes of Health has turned to neuroscientists at the nation’s most “Stone Cold Sober” university for help finding ways to treat drug and alcohol addiction.

Brigham Young University professor Scott Steffensen and his collaborators have published three new scientific papers that detail the brain mechanisms involved with addictive substances. And the NIH thinks Steffensen’s on the right track, as evidenced by a $2-million grant that will help fund projects in his BYU lab for the next five years.

“Addiction is a brain disease that could be treated like any other disease,” Steffensen said. “I wouldn’t be as motivated to do this research, or as passionate about the work, if I didn’t think a cure was possible.” 

Steffensen’s research suggests that the process of a brain becoming addicted is similar to a driver overcorrecting a vehicle. When drugs and alcohol release unnaturally high levels of dopamine in the brain’s pleasure system, oxidative stress occurs in the brain.

Steffensen and his collaborators have found that the brain responds by generating a protein called BDNF (brain derived neurotrophic factor). This correction suppresses the brain’s normal production of dopamine long after someone comes down from a high. Not having enough dopamine is what causes the pains, distress and anxiety of withdrawal.

“The body attempts to compensate for unnatural levels of dopamine, but a pathological process occurs,” Steffensen said. “We think it all centers around a subset of neurons that ordinarily put the brakes on dopamine release.”

A group of undergraduate students work in Steffensen’s lab along with post-doctoral fellows and graduate students. Jennifer Blanchard Mabey, a graduate student in neuroscience, co-authored a paper about withdrawal that is in the current issue of The Journal of Neuroscience.

“It’s rewarding to see that your research efforts place another small piece in the enormous addiction puzzle,” said Mabey.

A separate study, co-authored by Steffensen and Ph.D. candidates Nathan Schilaty and David Hedges, explains how nicotine and alcohol interact in the brain.

“Addiction is a huge concern in our society and is very misunderstood,” Schilaty said. “Our research is helping us to formulate ideas on how we can better help these individuals through non-invasive and non-pharmacological means.”

Eun Young Jang, a post-doctoral fellow in Steffensen’s lab, authored a third paper for Addiction Biology describing the effects of cocaine addiction on the brain’s reward circuitry.

In these three research papers, dopamine is the common thread.

“I am optimistic that in the near future medical science will be able to reverse the brain changes in dopamine transmission that occur with drug dependence and return an ‘addict’ to a relatively normal state,” Steffensen said. “Then the addict will be in a better position to make rational decisions regarding their behavior and will be empowered to remain drug free.”

Filed under addiction brain-derived neurotrophic factor opiates dopamine neuroscience science

97 notes

Research Links Alzheimer’s Disease to Brain Hyperactivity

Patients with Alzheimer’s disease run a high risk of seizures. While the amyloid-beta protein involved in the development and progression of Alzheimer’s seems the most likely cause for this neuronal hyperactivity, how and why this elevated activity takes place hasn’t yet been explained — until now.

image

A new study by Tel Aviv University researchers, published in Cell Reports, pinpoints the precise molecular mechanism that may trigger an enhancement of neuronal activity in Alzheimer’s patients, which subsequently damages memory and learning functions. The research team, led by Dr. Inna Slutsky of TAU’s Sackler Faculty of Medicine and Sagol School of Neuroscience, discovered that the amyloid precursor protein (APP), in addition to its well-known role in producing amyloid-beta, also constitutes the receptor for amyloid-beta. According to the study, the binding of amyloid-beta to pairs of APP molecules triggers a signalling cascade, which causes elevated neuronal activity.

Elevated activity in the hippocampus — the area of the brain that controls learning and memory — has been observed in patients with mild cognitive impairment and early stages of Alzheimer’s disease. Hyperactive hippocampal neurons, which precede amyloid plaque formation, have also been observed in mouse models with early onset Alzheimer’s disease. “These are truly exciting results,” said Dr. Slutsky. “Our work suggests that APP molecules, like many other known cell surface receptors, may modulate the transfer of information between neurons.”

With the understanding of this mechanism, the potential for restoring memory and protecting the brain is greatly increased.

Building on earlier research

The research project was launched five years ago, following the researchers’ discovery of the physiological role played by amyloid-beta, previously known as an exclusively toxic molecule. The team found that amyloid-beta is essential for the normal day-to-day transfer of information through the nerve cell networks. If the level of amyloid-beta is even slightly increased, it causes neuronal hyperactivity and greatly impairs the effective transfer of information between neurons.

In the search for the underlying cause of neuronal hyperactivity, TAU doctoral student Hilla Fogel and postdoctoral fellow Samuel Frere found that while unaffected “normal” neurons became hyperactive following a rise in amyloid-beta concentration, neurons lacking APP did not respond to amyloid-beta. “This finding was the starting point of a long journey toward decoding the mechanism of APP-mediated hyperactivity,” said Dr. Slutsky.

The researchers, collaborating with Prof. Joel Hirsch of TAU’s Faculty of Life Sciences, Prof. Dominic Walsh of Harvard University, and Prof. Ehud Isacoff of University of California Berkeley, harnessed a combination of cutting-edge high-resolution optical imaging, biophysical methods and molecular biology to examine APP-dependent signalling in neural cultures, brain slices, and mouse models. Using highly sensitive biophysical techniques based on fluorescence resonance energy transfer (FRET) between fluorescent proteins in close proximity, they discovered that APP exists as a dimer at presynaptic contacts, and that the binding of amyloid-beta triggers a change in the APP-APP interactions, leading to an increase in calcium flux and higher glutamate release — in other words, brain hyperactivity.

A new approach to protecting the brain

“We have now identified the molecular players in hyperactivity,” said Dr. Slutsky. “TAU postdoctoral fellow Oshik Segev is now working to identify the exact spot where the amyloid-beta binds to APP and how it modifies the structure of the APP molecule. If we can change the APP structure and engineer molecules that interfere with the binding of amyloid-beta to APP, then we can break up the process leading to hippocampal hyperactivity. This may help to restore memory and protect the brain.”

Previous studies by Prof. Lennart Mucke’s laboratory strongly suggest that a reduction in the expression level of “tau” (microtubule-associated protein), another key player in Alzheimer’s pathogenesis, rescues synaptic deficits and decreases abnormal brain activity in animal models. “It will be crucial to understand the missing link between APP and ‘tau’-mediated signalling pathways leading to hyperactivity of hippocampal circuits. If we can find a way to disrupt the positive signalling loop between amyloid-beta and neuronal activity, it may rescue cognitive decline and the conversion to Alzheimer’s disease,” said Dr. Slutsky.

(Source: aftau.org)

Filed under alzheimer's disease brain activity beta amyloid hippocampus hyperactivity neuroscience science
