Neuroscience

Articles and news from the latest research reports.

Posts tagged science

51 notes

Dementia Risk Quadrupled in People with Mild Cognitive Impairment

In a long-term, large-scale, population-based study of individuals aged 55 years or older, researchers found that those diagnosed with mild cognitive impairment (MCI) had a four-fold increased risk of developing dementia or Alzheimer’s disease (AD) compared with cognitively healthy individuals. Several risk factors, including older age, positive APOE-ɛ4 status, low total cholesterol levels, and stroke, as well as specific MRI findings, were associated with an increased risk of developing MCI. The results are published in a supplement to the Journal of Alzheimer’s Disease.

“Mild cognitive impairment has been identified as the transitional stage between normal aging and dementia,” comments M. Arfan Ikram, MD, PhD, a neuroepidemiologist at Erasmus MC University Medical Center (Rotterdam). “Identifying persons at a higher risk of dementia could postpone or even prevent dementia by timely targeting modifiable risk factors.”

Unlike a clinical trial, the Rotterdam study is an observational cohort study focusing on the general population, instead of persons referred to a memory clinic. The Rotterdam study began in 1990, when almost 8,000 inhabitants of Rotterdam aged 55 years or older agreed to participate in the study. Ten years later, another 3,000 individuals were added. Participants undergo home interviews and examinations every four years.

“This important prospective study adds to the accumulating evidence that strokes, presumably related to so-called ‘vascular’ risk factors, also contribute to the appearance of dementia in Alzheimer’s disease. This leads to the conclusion that, starting at midlife, people should minimize those risk factors. The recent results of the Finnish FINGER study corroborate this idea. It should be remembered that delaying the onset of dementia by five years will reduce the prevalence of the disease by half. And of course, since there is no cure for AD, prevention is the best approach at present,” explains Professor Emeritus Amos D. Korczyn, Tel Aviv University, Ramat Aviv, Israel, and Guest Editor of the Supplement.

To be diagnosed with MCI in the study, individuals were required to meet three criteria: a self-reported awareness of having problems with memory or everyday functioning; deficits detected on a battery of cognitive tests; and no evidence of dementia. They were categorized into those with memory problems (amnestic MCI) and those with normal memory (non-amnestic MCI).

Of 4,198 persons found to be eligible for the study, almost 10% were diagnosed with MCI. Of these, 163 had amnestic MCI and 254 had non-amnestic MCI.
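As a quick sanity check, the subgroup counts quoted above can be combined to recover the overall prevalence (a trivial sketch; all numbers are taken directly from the figures as reported here):

```python
# MCI prevalence in the Rotterdam sub-cohort, using only the counts quoted above
eligible = 4198      # persons eligible for the MCI analysis
amnestic = 163       # MCI with memory problems
non_amnestic = 254   # MCI with normal memory

total_mci = amnestic + non_amnestic
prevalence = total_mci / eligible
print(f"{total_mci} MCI cases, prevalence = {prevalence:.1%}")  # 417 cases, 9.9%
```

This matches the “almost 10%” figure reported in the study.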

The risk of dementia was especially high for people with amnestic MCI. Similar results were observed regarding the risk for Alzheimer’s disease. Those with MCI also faced a somewhat higher risk of death. 

The research team investigated possible determinants of MCI, considering factors such as age, APOE-ɛ4 status, waist circumference, hypertension, diabetes mellitus, total and HDL-cholesterol levels, smoking, and stroke. Only older age, being an APOE-ɛ4 carrier, low total cholesterol levels, and stroke at baseline were associated with developing MCI. APOE-ɛ4 carriership and smoking were related only to amnestic MCI.

When the investigators analysed MRI studies of the brain, they found that participants with MCI, particularly those with non-amnestic MCI, had larger white matter lesion volumes and worse microstructural integrity of normal-appearing white matter compared with controls. They were also three times as likely as controls to have lacunes (cerebrospinal fluid (CSF)-filled cavities of 3 to 15 mm in the basal ganglia or white matter, frequently observed on imaging in older people). MCI was not associated with total brain volume, hippocampal volume, or cerebral microbleeds.

“Our results suggest that accumulating vascular damage plays a role in both amnestic and non-amnestic MCI,” says Dr. Ikram. “We propose that timely targeting of modifiable vascular risk factors might contribute to the prevention of MCI and dementia.”

Reference:

Determinants, MRI Correlates, and Prognosis of Mild Cognitive Impairment: The Rotterdam Study. Renée F.A.G. de Bruijn, Saloua Akoudad, Lotte G.M. Cremers, Albert Hofman, Wiro J. Niessen, Aad van der Lugt, Peter J. Koudstaal, Meike W. Vernooij, M. Arfan Ikram. Journal of Alzheimer’s Disease, Volume 42/Supplement 3 (August 2014): 2013 International Congress on Vascular Dementia (Guest Editor: Amos D. Korczyn)

(Source: iospress.nl)

Filed under cognitive impairment dementia alzheimer's disease memory brain structure neuroscience science

280 notes

Link between vitamin D and dementia risk confirmed

Vitamin D deficiency is associated with a substantially increased risk of dementia and Alzheimer’s disease in older people, according to the most robust study of its kind ever conducted.


An international team, led by Dr David Llewellyn at the University of Exeter Medical School, found that study participants who were severely Vitamin D deficient were more than twice as likely to develop dementia and Alzheimer’s disease.

The team studied elderly Americans who took part in the Cardiovascular Health Study. They discovered that adults in the study who were moderately deficient in vitamin D had a 53 per cent increased risk of developing dementia of any kind, and the risk increased to 125 per cent in those who were severely deficient.

Similar results were recorded for Alzheimer’s disease, with the moderately deficient group 69 per cent more likely to develop this type of dementia, jumping to a 122 per cent increased risk for those severely deficient.
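Expressed as ratios rather than percentage increases, the figures above correspond to risks roughly 1.5 to 2.25 times those of the non-deficient group. A minimal conversion sketch (the paper reports adjusted hazard ratios, which this simple arithmetic only approximates):

```python
def pct_increase_to_ratio(pct_increase):
    """Convert an 'X per cent increased risk' to a risk ratio vs. the reference group."""
    return 1 + pct_increase / 100

# Percentages quoted above for the Cardiovascular Health Study analysis
for label, pct in [("any dementia, moderately deficient", 53),
                   ("any dementia, severely deficient", 125),
                   ("Alzheimer's, moderately deficient", 69),
                   ("Alzheimer's, severely deficient", 122)]:
    print(f"{label}: ratio = {pct_increase_to_ratio(pct):.2f}")
```

A 125 per cent increase, for instance, corresponds to a ratio of 2.25, i.e. the “more than twice as likely” reported for severe deficiency.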

The study was part-funded by the Alzheimer’s Association and is published in the August 6, 2014, online issue of Neurology, the medical journal of the American Academy of Neurology. It looked at 1,658 adults aged 65 and over who were able to walk unaided and were free from dementia, cardiovascular disease and stroke at the start of the study. The participants were then followed for six years to investigate who went on to develop Alzheimer’s disease and other forms of dementia.

Dr Llewellyn said: “We expected to find an association between low Vitamin D levels and the risk of dementia and Alzheimer’s disease, but the results were surprising – we actually found that the association was twice as strong as we anticipated.

“Clinical trials are now needed to establish whether eating foods such as oily fish or taking vitamin D supplements can delay or even prevent the onset of Alzheimer’s disease and dementia. We need to be cautious at this early stage and our latest results do not demonstrate that low vitamin D levels cause dementia. That said, our findings are very encouraging, and even if a small number of people could benefit, this would have enormous public health implications given the devastating and costly nature of dementia.”

Research collaborators included experts from Angers University Hospital, Florida International University, Columbia University, the University of Washington, the University of Pittsburgh and the University of Michigan. The study was supported by the Alzheimer’s Association, the Mary Kinross Charitable Trust, the James Tudor Foundation, the Halpin Trust, the Age Related Diseases and Health Trust, the Norman Family Charitable Trust, and the National Institute for Health Research Collaboration for Leadership in Applied Research and Care South West Peninsula (NIHR PenCLAHRC).

Dementia is one of the greatest challenges of our time, with 44 million cases worldwide – a number expected to triple by 2050 as a result of rapid population ageing. A billion people worldwide are thought to have low vitamin D levels and many older adults may experience poorer health as a result.

The research is the first large study to investigate the relationship between vitamin D and dementia risk where the diagnosis was made by an expert multidisciplinary team, using a wide range of information including neuroimaging. Previous research established that people with low vitamin D levels are more likely to go on to experience cognitive problems, but this study confirms that this translates into a substantial increase in the risk of Alzheimer’s disease and dementia.

Vitamin D comes from three main sources – exposure of skin to sunlight, foods such as oily fish, and supplements. Older people’s skin can be less efficient at converting sunlight into Vitamin D, making them more likely to be deficient and reliant on other sources. In many countries the amount of UVB radiation in winter is too low to allow vitamin D production.

The study also found evidence that there is a threshold level of vitamin D circulating in the bloodstream below which the risk of developing dementia and Alzheimer’s disease increases. The team had previously hypothesized that this might lie in the region of 25-50 nmol/L, and their new findings confirm that vitamin D levels above 50 nmol/L are most strongly associated with good brain health.
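Using the thresholds described above (severe deficiency below 25 nmol/L, moderate deficiency from 25 up to 50 nmol/L, and levels above 50 nmol/L associated with good brain health), a serum reading could be bucketed as follows. This is an illustrative sketch; the boundary handling and category names are assumptions, not the study’s exact clinical definitions:

```python
def vitamin_d_status(nmol_per_litre):
    """Classify a serum 25-hydroxyvitamin D level (illustrative cut-offs from the text)."""
    if nmol_per_litre < 25:
        return "severely deficient"
    elif nmol_per_litre <= 50:
        return "moderately deficient"
    return "sufficient"

for level in (10, 30, 75):
    print(f"{level} nmol/L -> {vitamin_d_status(level)}")
```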

Commenting on the study, Dr Doug Brown, Director of Research and Development at Alzheimer’s Society said: “Shedding light on risk factors for dementia is one of the most important tasks facing today’s health researchers. While earlier studies have suggested that a lack of the sunshine vitamin is linked to an increased risk of Alzheimer’s disease, this study found that people with very low vitamin D levels were more than twice as likely to develop any kind of dementia.

“During this hottest of summers, hitting the beach for just 15 minutes of sunshine is enough to boost your vitamin D levels. However, we’re not quite ready to say that sunlight or vitamin D supplements will reduce your risk of dementia. Large-scale clinical trials are needed to determine whether increasing vitamin D levels in those with deficiencies can help prevent dementia from developing.”

(Source: exeter.ac.uk)

Filed under alzheimer's disease dementia vitamin deficiency vitamin d neuroscience science

110 notes

Patients with autism spectrum disorder are not sensitive to ‘being imitated’

A Japanese research group led by Prof. Norihiro Sadato of the National Institute for Physiological Sciences (NIPS), National Institutes of Natural Sciences (NINS), has found that people with autism spectrum disorders (ASD) show decreased activity in a brain area critical for recognizing that one’s own movements are being imitated by others. The results will be published in Neuroscience Research.


The research group of Norihiro Sadato, a professor at NIPS; Hirotaka Kosaka, a specially appointed associate professor at the University of Fukui; and Toshio Munesue, a professor at Kanazawa University, used functional magnetic resonance imaging (fMRI) to measure brain activity while a subject’s movements were being imitated by others. The group compared brain activity when a subject saw his or her finger movements imitated with activity when they were not imitated. Typically developing subjects showed increased activity in the extrastriate body area (EBA), a region of the visual cortex that responds strongly during the perception of human body parts, when they were being imitated compared to when they were not. In subjects with ASD, by contrast, this imitation-related increase in EBA activity was absent, suggesting that the EBA does not function properly in ASD when one’s movements are imitated.

Persons with ASD are known to have difficulty with interpersonal communication and trouble noticing that their movements are being imitated. Behavioral intervention research into ASD is ongoing and indicates that training that utilizes imitation is useful. The present results not only provide clues to the neural basis of ASD but could also be used to evaluate behavioral interventions for the disorder.

(Source: eurekalert.org)

Filed under autism extrastriate body area brain activity neuroimaging visual processing neuroscience science

105 notes

Blood-oxytocin levels in normal range in children with autism

Autism does not appear to be solely caused by a deficiency of oxytocin, but the hormone’s universal ability to boost social function may prove useful in treating a subset of children with the developmental disorder, according to new findings from the Stanford University School of Medicine and Lucile Packard Children’s Hospital Stanford.


Low levels of oxytocin, a hormone involved in social functioning, have for years been suspected of causing autism. Prior research seeking a link has produced mixed results. Now, in the largest-ever study to test the purported connection, the range of blood oxytocin levels has been shown to be the same in children with autism as that observed in two comparison groups: children with autistic siblings and children without autistic siblings. In other words, similar numbers of children with low, medium and high oxytocin levels were found in all three groups.

A paper describing the new findings was published online Aug. 4 in Proceedings of the National Academy of Sciences.

Although autism was not directly linked to oxytocin deficiency, the Stanford team found that higher oxytocin levels were linked to better social functioning in all groups. All children with autism have social deficits, but in the study these deficits were worst in those with the lowest blood oxytocin and mildest in those with the highest oxytocin. In the comparison groups, children’s social skills also fell across a range that correlated to their oxytocin levels.
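Statistically, the pattern described above is a positive within-group correlation between blood oxytocin and social-ability scores. A toy sketch with synthetic data (not the study’s data; the units and effect size are invented for illustration) shows how such a correlation might be computed:

```python
import random

random.seed(0)

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Synthetic group: social score loosely tracks oxytocin level plus noise
oxytocin = [random.uniform(100, 400) for _ in range(60)]    # pg/mL, hypothetical
social = [0.1 * o + random.gauss(0, 8) for o in oxytocin]   # higher = better functioning
print(f"r = {pearson_r(oxytocin, social):.2f}")             # positive, as in the study
```

The study’s finding is that a correlation of this kind held in all three groups, not just in children with autism.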

Regulator of social functioning

“Oxytocin appears to be a universal regulator of social functioning in humans,” said Karen Parker, PhD, assistant professor of psychiatry and behavioral sciences and the lead author of the study. “That encompasses both typically developing children as well as those with the severe social deficits we see in children with autism.”

Autism is a developmental disorder that affects 1 of every 68 children in the United States. It is characterized by social and communication deficits, repetitive behaviors and sensory problems. The new study included 79 children with autism, 52 of their unaffected siblings and 62 unrelated children without autism. All of the children were between the ages of 3 and 12.

“It didn’t matter if you were a typically developing child, a sibling or an individual with autism: Your social ability was related to a certain extent to your oxytocin levels, which is very different from what people have speculated,” said Antonio Hardan, MD, professor of psychiatry and behavioral sciences and the study’s senior author. Hardan is a child and adolescent psychiatrist who treats children with autism at the hospital.

“The previous hypotheses saying that low oxytocin was linked to autism were maybe a little bit simplistic,” he said. “It’s much more complex: Oxytocin is a vulnerability factor that has to be accounted for, but it’s not the only thing leading to the development of autism.”

The researchers caution, however, that blood oxytocin measurements may differ from oxytocin levels in the cerebrospinal fluid bathing the brain, which they did not measure.

In addition to examining blood oxytocin levels, the researchers assessed the importance of small variations in the gene coding for the oxytocin receptor. Certain receptor variants were correlated with higher scores on standard tests of social ability, the study found.

Inheriting social abilities

The team also discovered that blood levels of oxytocin are highly heritable: The levels are influenced by inheritance to about the same degree as adult height, which is often described as being strongly influenced by genetics.

"What our study hints at is that social function may be heritable in families," Parker said.

The study will help to guide future research to determine whether oxytocin is a useful autism treatment. The study’s findings suggest that some children with autism — such as the subset of kids with autism who have naturally low oxytocin levels, or those with oxytocin receptor gene variants associated with worse social functioning — might benefit most from oxytocin-like drugs.

“Autism is so heterogeneous,” Parker said. “If we can identify biomarkers that help us identify the patients most likely to benefit from a specific therapy, we expect that will be very useful.”

(Source: med.stanford.edu)

Filed under autism oxytocin social interaction social function genetics neuroscience science

113 notes

Common chemical in mothers may negatively affect the IQ of their unborn children

In some women, abnormally high levels of a common and pervasive chemical may lead to adverse effects in their offspring. The study, published recently in the Journal of Clinical Endocrinology & Metabolism, is the first of its kind to shed light on the possible harmful effects of perchlorate on mothers and their children.


Using data from the Controlled Antenatal Thyroid Study (CATS) cohort, researchers at Boston University School of Medicine (BUSM) and Cardiff University studied the effects of perchlorate, an environmental contaminant found in many foods and some drinking water supplies, on children born to mothers with above-average levels of the substance in their system. They examined 487 mother-child pairs in which the mothers had underactive thyroid glands; among the 50 women with the highest perchlorate levels, the offspring had below-average IQ scores compared with other children.

"The reason people really care about perchlorate is because it is ubiquitous. It’s everywhere," said Elizabeth Pearce, MD, MSc, associate professor of medicine at BUSM. "Prior studies have already shown perchlorate, at low levels, can be found in each and every one of us."

Perchlorate is a compound known to affect the thyroid gland, an organ needed to help regulate hormone levels in humans. According to Pearce, previous studies have attempted to implicate this anti-thyroid activity in pregnant mothers as a possible cause of hypothyroidism, or an underactive thyroid gland. Hypothyroidism in newborns and children can lead to an array of unwelcome side effects, including below-average intelligence.

(Source: eurekalert.org)

Filed under perchlorate intelligence pregnancy thyroid gland cognitive development neuroscience science

158 notes

New brain mechanism study could advance artificial intelligence

Research at the University of Reading has provided a new understanding of how our brain processes information to change how we see the world.


Using a simple computer game, akin to a 3D version of the classic arcade game Pong, the researchers examined how the brain recalibrates its perception of slant in order to bounce a moving ball through a target hoop.

They found that the brain uses an internal simulation of the laws of physics to change its perception of slant in order to ‘score’ consistently.

The findings provide a unique insight into why humans are such an adaptable and skillful species. With the development of effective autonomous robots, engineers are starting to look at how humans’ sensory systems effortlessly achieve what is currently impossible for robotic systems.

The study, funded by the Engineering and Physical Sciences Research Council and the Wellcome Trust, saw participants play a 3D game where they had to adjust the slant of a surface so that a moving ball bounced off it and through a target hoop.

Part way through the game, without telling the participants, researchers altered the bounce of the ball so that the surface behaved differently to the slant signalled by visual cues. 

When faced with the altered bounce, participants changed their behaviour to continue scoring points. At the same time, their brains recalibrated their perception of slant, simulating the laws of physics to actually change how the slant looked. In a separate group, making the ball spin eliminated this recalibration.

Dr. Peter Scarfe from the School of Psychology and Clinical Language Sciences, who conducted the study with colleague Prof. Andrew Glennerster, said: “We take for granted our amazing ‘adaptability’ which allows us to enjoy such pastimes as DIY or playing ball sports. However, little is known about the brain mechanisms that enable us to do these activities. Our research shows how our brains appear to have an intimate understanding of the laws of physics. In addition to aiding skillful action, this can change how we perceive the world around us.”

The researchers say understanding the basic mechanisms that allow the brain to calibrate sensory information will prove vital in the design of future autonomous robots.

Dr. Scarfe continued: “The human brain exhibits expert skill in making predictions about how the world behaves. For example, a child can bounce a ball off a wall and understand how spinning the ball alters its bounce. However, many of the fine motor skills of a young child are currently way beyond the capability of modern robots. Understanding how sensory systems adapt to feedback about the consequences of actions is likely to be key in solving this problem.”

“Humans Use Predictive Kinematic Models to Calibrate Visual Cues to Three-Dimensional Surface Slant” is published in the Journal of Neuroscience.

(Source: reading.ac.uk)

Filed under AI somatosensory system kinematics perception psychology neuroscience science

122 notes

In search for Alzheimer’s drug, a major STEP forward

Researchers at Yale School of Medicine have discovered a new drug compound that reverses the brain deficits of Alzheimer’s disease in an animal model. Their findings are published in the Aug. 5 issue of the journal PLoS Biology.

The compound, TC-2153, inhibits the negative effects of a protein called STriatal-Enriched protein tyrosine Phosphatase (STEP), which is key to regulating learning and memory. These cognitive functions are impaired in Alzheimer’s.

"Decreasing STEP levels reversed the effects of Alzheimer’s disease in mice," said lead author Paul Lombroso, M.D., professor in the Yale Child Study Center and in the Departments of Neurobiology and Psychiatry at Yale School of Medicine.

Lombroso and co-authors studied thousands of small molecules, searching for those that would inhibit STEP activity. Once identified, those STEP-inhibiting compounds were tested in brain cells to examine how effectively they could halt the effects of STEP. They examined the most promising compound in a mouse model of Alzheimer’s disease, and found a reversal of deficits in several cognitive exercises that gauged the animals’ ability to remember previously seen objects.

High levels of STEP proteins keep synapses in the brain from strengthening. Synaptic strengthening is a process that is required for people to turn short-term memories into long-term memories. When STEP is elevated in the brain, it depletes receptors from synaptic sites, and inactivates other proteins that are necessary for proper cognitive function. This disruption can result in Alzheimer’s disease or a number of neuropsychiatric and neurodegenerative disorders, all marked by cognitive deficits.

"The small molecule inhibitor is the result of a five-year collaborative effort to search for STEP inhibitors," said Lombroso. "A single dose of the drug results in improved cognitive function in mice. Animals treated with the TC compound were indistinguishable from a control group in several cognitive tasks."

The team is currently testing the TC compound in other animals with cognitive defects, including rats and non-human primates. “These studies will determine whether the compound can improve cognitive deficits in other animal models,” said Lombroso. “Successful results will bring us a step closer to testing a drug that improves cognition in humans.”

(Source: eurekalert.org)

Filed under alzheimer's disease STEP TC-2153 cognitive function animal model neuroscience science

125 notes

(Image caption: The presence of p45 (green staining) and p75 (red staining) indicates that motor neurons increase both p45 and p75 expression after sciatic nerve injury in an animal. Image credit: Courtesy of the Salk Institute for Biological Studies)

Scientists uncover new clues to repairing an injured spinal cord

Frogs, dogs, whales, and snails can all do it, but humans and other primates can’t: regrow nerves after an injury. New research from the Salk Institute, however, suggests that a small molecule may be able to convince damaged nerves to grow and effectively rewire circuits. Such a feat could eventually lead to therapies for the thousands of Americans with severe spinal cord injuries and paralysis.

"This research implies that we might be able to mimic neuronal repair processes that occur naturally in lower animals, which would be very exciting," says the study’s senior author and Salk professor Kuo-Fen Lee. The results were published today in PLOS Biology.

For a damaged nerve to regain function, its long, signal-transmitting extensions known as axons need to grow and establish new connections to other cells.

In a study published last summer in PLOS ONE, Lee and his colleagues found that the protein p45 promotes nerve regeneration by preventing the axon sheath (known as myelin) from inhibiting regrowth. However, humans, primates and some other more advanced vertebrates don’t have p45. Instead, the researchers discovered a different protein, p75, that binds to the axon’s myelin when nerve damage occurs in these animals. Instead of promoting nerve regeneration, p75 actually halts growth in damaged nerves.

"We don’t know why this nerve regeneration doesn’t occur in humans. We can speculate that the brain has so many neural connections that this regeneration is not absolutely necessary," Lee says.

In the study published today, the scientists looked at how two p75 proteins bind together and form a pair that latches onto the inhibitors released from damaged myelin.

By studying the configurations of the proteins in solutions using nuclear magnetic resonance (NMR) technology, the researchers found that the growth-promoting p45 could disrupt the p75 pairing.

"For reasons that are not understood, when p45 comes in, it breaks the pair apart," says Lee, holder of the Helen McLoraine Chair in Molecular Neurobiology.

What’s more, the p45 protein was able to bind to the specific region of the p75 protein that is critical for the formation of the p75 pair, thus decreasing the number of p75 pairs that bind to the inhibitors released from myelin. With fewer p75 pairs available to bind inhibitor signals, axons were able to regrow.

The findings suggest that an agent—either p45 or another disrupting molecule—that can effectively break the p75 pair could offer a possible therapy for spinal cord damage.

One method of therapy could be to introduce more p45 protein to injured neurons, but a smarter tactic might be to introduce a small molecule that jams the link between the two p75 proteins, Lee says. “Such an agent could possibly get through the blood-brain barrier and to the site of spinal cord injuries,” he says.

The next step will be to see if introducing p45 helps regenerate damaged human nerves. “That is what we hope to do in the future,” Lee says.

Filed under motor neurons spinal cord spinal cord injury nerve regeneration p45 neuroscience science

113 notes

Researchers boost insect aggression by altering brain metabolism

Scientists report they can crank up insect aggression simply by interfering with a basic metabolic pathway in the insect brain. Their study, of fruit flies and honey bees, shows a direct, causal link between brain metabolism (how the brain generates the energy it needs to function) and aggression.

The team reports its findings in the Proceedings of the National Academy of Sciences.

The new research follows up on previous work from the laboratory of University of Illinois entomology professor and Institute for Genomic Biology director Gene Robinson, who also led the new analysis. When he and his colleagues looked at brain gene activity in honey bees after they had faced down an intruder, the team found that some metabolic genes were suppressed. These genes play a key role in the most efficient type of energy generation in cells, a process called oxidative phosphorylation.

“It was a counterintuitive finding because these genes were down-regulated,” Robinson said. “You tend to think of aggression as requiring more energy, not less.”

In the new study, postdoctoral researcher Clare Rittschof used drugs to suppress key steps in oxidative phosphorylation in the bee brains. She saw that aggression increased in the drugged bees in a dose-responsive manner, Robinson said. But the drugs had no effect on chronically stressed bees – they were not able to increase their aggression in response to an intruder.

“Something about chronic stress changed their response to the drug, which is a fascinating finding in and of itself,” Robinson said. “We want to know just how this experience gets under their skin to affect their brain.”

In separate experiments, postdoctoral researcher Hongmei Li-Byarlay and undergraduate student Jonathan Massey found that reduced oxidative phosphorylation in fruit flies also increased aggression. Using advanced fly genetics, the team found this effect only when oxidative phosphorylation was reduced in neurons, but not in neighboring cells known as glia. This finding, too, was surprising, since “glia are metabolically very active, and are the energy storehouses of the brain,” Robinson said.

The findings offer insight into the immediate and longer-term changes that occur in response to threats, Robinson said.

“When an animal faces a threat, it has an immediate aggressive response, within seconds,” Robinson said. But changes in brain metabolism take much longer and cannot account for this immediate response, he said. Such changes likely make individuals more vigilant to subsequent threats.

“This makes good sense in an ecological sense,” Robinson said, “because threats often come in bunches.”

The fact that the researchers observed these effects in two species that diverged 300 million years ago makes the findings even more compelling, Robinson said.

“Because fruit flies and honey bees are separated by 300 million years of evolution, this is a very robust and well-conserved mechanism.”

Researchers boost insect aggression by altering brain metabolism

Scientists report they can crank up insect aggression simply by interfering with a basic metabolic pathway in the insect brain. Their study, of fruit flies and honey bees, shows a direct, causal link between brain metabolism (how the brain generates the energy it needs to function) and aggression.

The team reports its findings in the Proceedings of the National Academy of Sciences.

The new research follows up on previous work from the laboratory of University of Illinois entomology professor and Institute for Genomic Biology director Gene Robinson, who also led the new analysis. When he and his colleagues looked at brain gene activity in honey bees after they had faced down an intruder, the team found that some metabolic genes were suppressed. These genes play a key role in the most efficient type of energy generation in cells, a process called oxidative phosphorylation.

“It was a counterintuitive finding because these genes were down-regulated,” Robinson said. “You tend to think of aggression as requiring more energy, not less.”

In the new study, postdoctoral researcher Clare Rittschof used drugs to suppress key steps in oxidative phosphorylation in the bee brains. She saw that aggression increased in the drugged bees in a dose-responsive manner, Robinson said. But the drugs had no effect on chronically stressed bees – they were not able to increase their aggression in response to an intruder.
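A dose-responsive effect like the one described here simply means the response scales with the amount of drug given. As a minimal sketch, with entirely invented numbers (these are not the study's data), fitting a line to group means shows what a positive dose-response slope looks like:

```python
import numpy as np

# Hypothetical illustration of a dose-responsive effect: aggression scores
# (arbitrary units) rise with increasing drug dose. All numbers below are
# invented for illustration; they are not the study's measurements.
doses = np.array([0.0, 0.5, 1.0, 2.0])        # relative drug dose per group
aggression = np.array([1.0, 1.4, 1.9, 2.8])   # mean aggression score per group

# Fit a straight line; a positive slope indicates a dose-responsive increase.
slope, intercept = np.polyfit(doses, aggression, 1)
print(f"slope = {slope:.2f} aggression units per unit dose")
assert slope > 0  # consistent with a dose-responsive increase
```

In the chronically stressed bees, by contrast, the fitted slope would be near zero, since the drug produced no change in aggression.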

“Something about chronic stress changed their response to the drug, which is a fascinating finding in and of itself,” Robinson said. “We want to know just how this experience gets under their skin to affect their brain.”

In separate experiments, postdoctoral researcher Hongmei Li-Byarlay and undergraduate student Jonathan Massey found that reduced oxidative phosphorylation in fruit flies also increased aggression. Using advanced fly genetics, the team found this effect only when oxidative phosphorylation was reduced in neurons, but not in neighboring cells known as glia. This finding, too, was surprising, since “glia are metabolically very active, and are the energy storehouses of the brain,” Robinson said.

The findings offer insight into the immediate and longer-term changes that occur in response to threats, Robinson said.

“When an animal faces a threat, it has an immediate aggressive response, within seconds,” Robinson said. But changes in brain metabolism take much longer and cannot account for this immediate response, he said. Such changes likely make individuals more vigilant to subsequent threats.

“This makes good sense in an ecological sense,” Robinson said, “because threats often come in bunches.”

The fact that the researchers observed these effects in two species that diverged 300 million years ago makes the findings even more compelling, Robinson said.

“Because fruit flies and honey bees are separated by 300 million years of evolution, this is a very robust and well-conserved mechanism.”

Filed under aggression aerobic glycolysis oxidative phosphorylation bees glia cells neuroscience science

405 notes

Our brains judge a face’s trustworthiness - Even when we can’t see it

Our brains are able to judge the trustworthiness of a face even when we cannot consciously see it, a team of scientists has found. Their findings, which appear in the Journal of Neuroscience, shed new light on how we form snap judgments of others.

“Our findings suggest that the brain automatically responds to a face’s trustworthiness before it is even consciously perceived,” explains Jonathan Freeman, an assistant professor in New York University’s Department of Psychology and the study’s senior author.

“The results are consistent with an extensive body of research suggesting that we form spontaneous judgments of other people that can be largely outside awareness,” adds Freeman, who conducted the study as a faculty member at Dartmouth College.

The study’s other authors included Ryan Stolier, an NYU doctoral candidate, Zachary Ingbretsen, a research scientist who previously worked with Freeman and is now at Harvard University, and Eric Hehman, a post-doctoral researcher at NYU.

The researchers focused on the workings of the brain’s amygdala, a structure that is important for humans’ social and emotional behavior. Previous studies have shown this structure to be active in judging the trustworthiness of faces. However, it had not been known if the amygdala is capable of responding to a complex social signal like a face’s trustworthiness without that signal reaching perceptual awareness.

To gauge this part of the brain’s role in making such assessments, the study’s authors conducted a pair of experiments in which they monitored the activity of subjects’ amygdala while the subjects were exposed to a series of facial images.

These images included standardized photographs of actual strangers’ faces as well as artificially generated faces whose trustworthiness cues could be manipulated while all other facial cues were held constant. The artificially generated faces were computer-synthesized based on previous research showing that cues such as higher inner eyebrows and pronounced cheekbones are seen as trustworthy, while lower inner eyebrows and shallower cheekbones are seen as untrustworthy.

Prior to the start of these experiments, a separate group of subjects examined all the real and computer-generated faces and rated how trustworthy or untrustworthy they appeared. As previous studies have shown, subjects strongly agreed on the level of trustworthiness conveyed by each given face.

In the experiments, a new set of subjects viewed these same faces inside a brain scanner, but were exposed to the faces very briefly—for only a matter of milliseconds. This rapid exposure, together with a technique known as “backward masking,” prevented subjects from consciously seeing the faces. Backward masking works by presenting an irrelevant “mask” image immediately after an extremely brief exposure to a face, which is thought to terminate the brain’s ability to further process the face and prevent it from reaching awareness.

In the first experiment, the researchers examined amygdala activity in response to three levels of a face’s trustworthiness: low, medium, and high. In the second experiment, they assessed amygdala activity in response to a fully continuous spectrum of trustworthiness.
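To get a feel for how brief “a matter of milliseconds” is, note that stimulus durations in such experiments are bounded by the monitor’s refresh rate. The refresh rate and durations below are illustrative assumptions, not parameters reported in the study:

```python
# Back-of-the-envelope timing for a backward-masking trial. On a standard
# 60 Hz display each frame lasts ~16.7 ms, so an "extremely brief" face
# presentation occupies only a frame or two before the mask replaces it.
REFRESH_HZ = 60
FRAME_MS = 1000 / REFRESH_HZ  # ~16.67 ms per frame

def frames_for(duration_ms: float) -> int:
    """Whole display frames needed to show a stimulus for duration_ms."""
    return max(1, round(duration_ms / FRAME_MS))

face_ms, mask_ms = 33, 167  # hypothetical face and mask durations
print(frames_for(face_ms), frames_for(mask_ms))  # 2 frames face, 10 frames mask
```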

Across the two experiments, the researchers found that specific regions inside the amygdala exhibited activity tracking how untrustworthy a face appeared, and other regions inside the amygdala exhibited activity tracking the overall strength of the trustworthiness signal (whether untrustworthy or trustworthy)—even though subjects could not consciously see any of the faces.
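The two response profiles described above can be sketched as two simple regressors: one a linear function of the (centered) trustworthiness rating, the other its absolute value. All numbers here are illustrative, not the study’s data:

```python
import numpy as np

# One hypothetical amygdala profile tracks how UNtrustworthy a face looks
# (a linear, negative-going function of the trustworthiness rating); another
# tracks the overall STRENGTH of the signal in either direction (an
# absolute-value, "U-shaped" function). Ratings are centered so 0 = neutral.
ratings = np.linspace(-3, 3, 7)    # trustworthiness: negative = untrustworthy

untrustworthiness = -ratings       # linear tracking of untrustworthiness
signal_strength = np.abs(ratings)  # nonlinear tracking of extremity

# Over a symmetric rating range the two profiles are uncorrelated,
# which is what lets them dissociate across amygdala regions.
r = np.corrcoef(untrustworthiness, signal_strength)[0, 1]
print(round(r, 3))  # → 0.0
```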

“These findings provide evidence that the amygdala’s processing of social cues in the absence of awareness may be more extensive than previously understood,” observes Freeman. “The amygdala is able to assess how trustworthy another person’s face appears without it being consciously perceived.”

Filed under amygdala trustworthiness face perception brain activity psychology neuroscience science
