Neuroscience

Articles and news from the latest research reports.


Mouse Model Sheds Light on Mitochondria’s Role in Neurodegenerative Diseases

A new study by researchers at the University of Utah School of Medicine sheds light on a longstanding question about the role of mitochondria in debilitating and fatal motor neuron diseases, and has produced a new mouse model for studying such illnesses.

Researchers led by Janet Shaw, Ph.D., professor of biochemistry, found that when healthy, functioning mitochondria were prevented from moving along axons – nerve fibers that conduct electrical impulses away from neurons – mice developed symptoms of neurodegenerative diseases. In a study in the Proceedings of the National Academy of Sciences, Shaw and her colleagues said their findings indicate that motor neuron diseases might result from poor distribution of mitochondria along the spinal cord and axons. First author Tammy T. Nguyen is a student in the U medical school’s M.D./Ph.D. program, which aims to produce physicians with outstanding clinical skills and rigorous scientific training who bridge the worlds of clinical medicine and basic research to improve health care.

“We’ve known for a long time of the link between mitochondrial function and distribution and neural disease,” Shaw says. “But we haven’t been able to tell if the defect occurs because mitochondria aren’t getting to the right place or because they’re not functioning correctly.”

Mitochondria are organelles – compartments contained inside cells – that serve several functions, including making ATP, a nucleotide that cells break down for the chemical energy they need to stay alive. For this reason mitochondria are often called “cellular power plants.” They also play a critical role in preventing too much calcium from building up in cells, which can cause apoptosis, or cell death.

For mitochondria to perform their functions, they must be distributed to cells throughout the body, which is accomplished with the help of small protein “motors” that transport the organelles along axons. For the motors to transport mitochondria, enzymes known as Mitochondrial Rho (Miro1) GTPases attach the mitochondria to the motors. To study how the movement of mitochondria is related to motor neuron disease, Nguyen developed two mouse models in which the gene that makes Miro1 was knocked out. In one model, mice lacked Miro1 during the embryonic stage. A second model lacked the enzyme in the cerebral cortex, spinal cord and hippocampus.

The researchers observed that mice lacking Miro1 during the embryonic stage had motor neuron defects that prevented them from taking a single breath once born. After examining the mice, Nguyen, Shaw and their colleagues discovered that neurons required for breathing after birth were missing from the upper half of the mice’s brain stems. The phrenic nerve, also important for breathing, was not fully developed, either.

“We believe the physical difficulties in the mice indicated there were motor neuron defects,” Shaw says.

Conversely, the mice without Miro1 in their brain and spinal cord were fine at birth but soon developed signs of neurological problems – hunched spines, difficulty moving and a tendency to clasp their hind paws together – and died around 35 days after birth. Those symptoms appeared similar to motor neuron disease, according to Shaw.

“The mitochondrial function in the cells appeared to be fine, and calcium levels were normal,” she says. “This shows for the first time that restricting mitochondrial movement and distribution could cause neuronal disease.”

Stefan M. Pulst, M.D., Dr. med., professor and chair of the University’s neurology department and a co-author on the study, says the mitochondrial transport process is important not just for motor neurons but for other neurons as well. “The Miro1 proteins and the respective animal models represent a breakthrough for studying ALS (Lou Gehrig’s disease) and other neurodegenerative diseases.”

Although much more research must be done, the study opens the possibility of developing new drugs that partially correct the mitochondrial distribution defects to slow the progression of motor neuron diseases. First, Shaw wants to generate a model that knocks out the Miro1 gene in adult mice to see if the results mimic neurological diseases.


Filed under neurodegenerative diseases mitochondria miro1 ALS motor neuron disease neuroscience science


Researchers link gene to increased dendritic spines – a signpost of autism

Scientists at the UNC School of Medicine have discovered that knocking out the gene NrCAM leads to an increase of dendritic spines on excitatory pyramidal cells in the brains of mammals. Other studies have confirmed that the overabundance of dendritic spines on this type of brain cell allows for too many synaptic connections to form between neurons – a phenomenon strongly linked to autism.


(Image caption: A comparison of a dendrite with the protein NrCAM (top) and a dendrite without the protein (bottom), which has a greater density of spines that neurons use to form synaptic connections.)

The finding, published in The Journal of Neuroscience, adds evidence that NrCAM is a major player in neurological disorders. Previous UNC studies showed that knocking out the NrCAM gene caused mice to exhibit the same sorts of social behaviors associated with autism in humans.

“There are many genes involved in autism, but we’re now finding out exactly which ones and how they’re involved,” said Patricia Maness, PhD, professor of biochemistry and biophysics and senior author of the Journal of Neuroscience paper. “Knowing that NrCAM has this effect on dendrites allows us to test potential drugs, not only to observe a change in behaviors linked to autism but to see if we can improve dendritic spine abnormalities, which may underlie autism.”

Maness’s finding comes on the heels of a report from Columbia University researchers who found an overabundance of the protein mTOR in mice bred to develop a rare form of autism. By using a drug to limit mTOR in mice, the Columbia researchers were able to decrease the number of dendritic spines and thus prune the overabundance of synaptic connections during adolescence. As a result, the social behaviors associated with autism were decreased. However, the drug used to limit mTOR can cause serious side effects, and mTOR is located inside cells, making it a potentially difficult protein to target.

It is too early to tell if NrCAM and mTOR are linked, but Maness is now studying whether a decreased amount of the NrCAM protein could trigger activation of mTOR. If so, then NrCAM, which is an accessible membrane-bound protein, might be a preferred therapeutic target for certain autism-related conditions.

In their study, Maness and her colleagues found that the NrCAM protein forms a complex with two other molecules to create a receptor on the membrane of excitatory pyramidal neurons. Maness’s team found that this receptor allows dendritic spines to retract, allowing for proper neuron pruning during maturation of the cortex. As a result, excitatory and inhibitory synapses between neurons develop in a balanced ratio necessary for brain circuits to function properly. 

Maness, a member of the UNC Neuroscience Center and the Carolina Institute for Developmental Disabilities, also said that there are likely many other proteins downstream of NrCAM that depend on it to maintain the proper number of dendritic spines. Decreasing NrCAM could allow the levels of some of these proteins to rise, thus kick-starting the creation of dendritic spines.

“Basic science in autism is converging in really exciting ways,” Maness said. “Too many spines and too many excitatory connections that are not pruned between early childhood and adolescence could be one of the chief problems underlying autism. Our goal is to understand the molecular mechanisms involved in pruning and find promising targets for therapeutic agents.”

(Source: news.unchealthcare.org)

Filed under autism dendritic spines NrCAM neurons neuroscience science


Simple test can help detect Alzheimer’s before dementia signs show

York University researchers say a simple test that combines thinking and movement can help to detect heightened risk for developing Alzheimer’s disease, even before there are any telltale behavioural signs of dementia.

Faculty of Health Professor Lauren Sergio and PhD candidate Kara Hawkins, who led the study, asked participants to complete four increasingly demanding visual-spatial and cognitive-motor tasks on dual-screen laptop computers. The test aimed to detect a predisposition to Alzheimer’s in people who were having cognitive difficulty even though they were not showing outward signs of the disease.

“We included a task which involved moving a computer mouse in the opposite direction of a visual target on the screen, requiring the person’s brain to think before and during their hand movements,” says Sergio of the School of Kinesiology & Health Science. “This is where we found the most pronounced difference between the mild cognitive impairment (MCI) and family history group and the two control groups.”

Hawkins adds, “We know that really well-learned, stereotyped motor behaviours are preserved until very late in Alzheimer’s disease.” These include routine movements, such as walking. Disrupted brain communication becomes evident when movements require the person to think about what it is they are trying to do.

For the test, the participants were divided into three groups – those diagnosed with MCI or with a family history of Alzheimer’s disease, and two control groups of young adults and older adults without a family history of the disease.

The study, Visuomotor Impairments in Older Adults at Increased Alzheimer’s Disease Risk, published in the Journal of Alzheimer’s Disease, found that 81.8 per cent of the participants with MCI or a family history of Alzheimer’s disease displayed difficulties on the most cognitively demanding visual-motor task.

“The brain’s ability to take in visual and sensory information and transform that into physical movements requires communication between the parietal area at the back of the brain and the frontal regions,” explains Sergio. “The impairments observed in the participants at increased risk of Alzheimer’s disease may reflect inherent brain alteration or early neuropathology, which is disrupting reciprocal brain communication between hippocampal, parietal and frontal brain regions.”

“In terms of being able to categorize the low Alzheimer’s disease risk and the high Alzheimer’s disease risk, we were able to do that quite well using these kinematic measures,” says Hawkins. “This group had slower reaction time and movement time, as well as less accuracy and precision in their movements.”

Hawkins says the findings don’t predict who will develop Alzheimer’s disease, but they do show there is something different in the brains of most of the participants diagnosed with MCI or who had a family history of the disease.
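The kinematic measures Hawkins describes (reaction time, movement time, movement accuracy) are the kind of features a simple classifier can separate. As a rough illustration only – not the study’s actual analysis, and with invented numbers – a minimal nearest-centroid classifier over such features might look like this:

```python
import math

def centroid(rows):
    """Mean of each feature column across a group of participants."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def classify(sample, centroids):
    """Assign a sample to the label of the nearest centroid (Euclidean distance)."""
    return min(centroids, key=lambda lbl: math.dist(sample, centroids[lbl]))

# Hypothetical [reaction_time_ms, movement_time_ms, endpoint_error_mm] rows;
# the real study used similar kinematic measures but different values.
low_risk = [[250, 400, 3], [260, 420, 4], [240, 390, 3]]
high_risk = [[340, 560, 9], [360, 600, 11], [330, 550, 8]]

centroids = {"low": centroid(low_risk), "high": centroid(high_risk)}
label = classify([350, 580, 10], centroids)  # slower, less accurate movement
```

A new participant whose movements are slower and less precise falls nearer the high-risk centroid, which is the intuition behind categorizing risk groups "quite well using these kinematic measures."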


Filed under alzheimer's disease dementia cognitive impairment movement neuroscience science


Researchers make new discovery about brain’s 3-D shape processing

While previous studies of the brain suggest that processing of objects and places occurs in very different locations, a Johns Hopkins University research team has found that they are closely related.

In research funded by the National Institutes of Health and published today in the journal Neuron, a team led by Johns Hopkins researcher Charles E. Connor reports that a major pathway long associated with object shape also carries information about landscapes and other environments.

Siavash Vaziri, then a biomedical engineering graduate student and now a post-doctoral fellow in the Connor lab, studied how neurons in the ventral visual pathway of the monkey brain respond to 3-D images. In one channel of the ventral pathway, neurons responded to small, discrete objects as expected. But in a neighboring, parallel channel, the researchers were surprised by the overwhelming responsiveness of many neurons to large-scale environments that surround the viewer, extending beyond the field of view.

"We were entirely surprised ourselves," said Connor, senior author of the paper. "Based on decades of research, we expected that all neurons in the ventral pathway would be primarily concerned with objects."

The ventral pathway is one of the two major branches of high-level visual processing in humans and other primates. It is sometimes called the “what” pathway, based on its role in identifying objects based on their shapes and colors.

"Dr. Vaziri’s finding is exciting because it puts environmental shape information together with object shape information in two densely connected neighboring channels. This could be a site for integrating object information into environmental contexts in order to understand scenes," Connor said.

Vaziri used microelectrodes to study how individual neurons responded to a large variety of 3-D shapes projected onto a large screen. Depth structure was conveyed by shading, texture gradients, and stereopsis, the effect used in 3-D movies. The shape stimuli evolved during the experiment based on the neuron’s responses, sometimes in the direction of small objects near the viewer, sometimes in the direction of environments filling the screen and surrounding the viewer.

Connor, a professor of neuroscience and the director of the Zanvyl Krieger Mind/Brain Institute at Johns Hopkins, is a noted expert on the neural mechanisms of object vision. His research focuses on deciphering the algorithms that make object vision possible and explain the nature of visual experience.

"Many people would say that vision is our richest and most vivid experience," said Connor. "We want to understand the brain events that create that experience."

Connor said that the next step will be to understand how object and environment information are integrated between the two channels.

"We don’t typically experience objects in isolation," Connor said. "We experience scenes, that is, environments containing multiple objects. We now think that the ventral pathway may be where all that information gets put together to create scene understanding."
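The response-guided stimulus evolution described above is, in spirit, a hill-climbing search: mutate the stimulus, keep mutations that drive the neuron harder. A toy sketch of that idea follows – the one-dimensional parameter and the synthetic response function are assumptions for illustration; the actual experiments used 3-D shape stimuli and recorded spike rates:

```python
import random

def evolve_stimulus(response, steps=200, seed=0):
    """Hill-climb a one-dimensional stimulus parameter toward a higher response.

    `param` is a stand-in for stimulus scale: values near 0 represent small
    discrete objects, values near 1 viewer-surrounding environments
    (a hypothetical encoding, not the study's actual stimulus space).
    """
    rng = random.Random(seed)
    param = 0.5                      # start from a neutral stimulus
    best = response(param)
    for _ in range(steps):
        candidate = min(1.0, max(0.0, param + rng.gauss(0, 0.05)))
        r = response(candidate)
        if r > best:                 # keep only mutations the neuron "prefers"
            param, best = candidate, r
    return param

# A synthetic "environment-preferring" neuron whose response grows with scale.
tuned = evolve_stimulus(lambda p: p)
```

For such a response function the search drifts toward the large-scale end of the parameter range, mirroring how some stimuli in the experiment evolved toward environments filling the screen and surrounding the viewer.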


Filed under inferotemporal cortex neurons 3d shapes object processing object vision neuroscience science


Neurons express ‘gloss’ using three perceptual parameters

Japanese researchers showed monkeys a number of images representing various glosses and measured the responses of 39 neurons using microelectrodes. They found that a specific population of neurons changed the intensity of its responses linearly according to the contrast of highlights, the sharpness of highlights, or the brightness of the object. This shows that the brain uses these three perceptual parameters when it recognizes a variety of glosses. They also found that different parameters are represented by different populations of neurons. The work was published in the Journal of Neuroscience.

The gloss of an object’s surface provides information about the condition of that object – for instance, whether it is wet or dry, or whether food is fresh or old. Several gloss-related physical parameters, such as specular reflectance and diffuse reflectance, have been described and used in computer graphics. However, the parameters neurons use when responding to gloss had not been identified.

A Japanese research group led by Hidehiko Komatsu, professor at the National Institute for Physiological Sciences (NIPS), National Institutes of Natural Sciences (NINS), in collaboration with the Advanced Telecommunications Research Institute International (ATR), prepared 16 images representing various glosses and showed them to monkeys. In a circumscribed area in the inferior temporal cortex of the brain, neurons strengthened their responses proportionately as the contrast or sharpness of highlights increased. Neural responses also varied greatly depending on brightness – for instance, whether the object was black, gray, or white. Furthermore, the perceptual gloss parameters of the presented image could be predicted fairly precisely from the strengths of the population neural responses.

By applying these findings in artificial image-recognition systems, the researchers expect it will be possible to develop robots that recognize gloss as humans do.
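The decoding claim – gloss parameters read out from the strengths of population responses – can be sketched with ordinary linear regression. The toy population below is invented (linear gains, Gaussian noise) purely to illustrate the readout; only the neuron count, 39, comes from the article:

```python
import random

def fit_line(xs, ys):
    """Closed-form ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

rng = random.Random(1)
# 39 hypothetical neurons whose responses strengthen linearly with
# highlight contrast, echoing the reported proportional tuning.
gains = [rng.uniform(0.5, 2.0) for _ in range(39)]
contrasts = [i / 10 for i in range(11)]          # stimulus gloss levels 0.0-1.0
pop = [sum(g * c + rng.gauss(0, 0.05) for g in gains) for c in contrasts]

# Decode: map summed population response back to the contrast parameter.
a, b = fit_line(pop, contrasts)
predicted = a * pop[7] + b                       # recovers ~0.7, the true level
```

Because each simulated neuron's tuning is linear, the summed population response is itself almost linear in the gloss parameter, so a single fitted line decodes the stimulus quite precisely – a simplified analogue of predicting gloss parameters from population response strengths.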


Filed under neurons inferotemporal cortex perception gloss lightness neuroscience science


Brain Imaging Research Pinpoints Neurobiological Basis for Key Symptoms Associated with Post-Traumatic Stress Disorder, Like Listlessness and Emotional Detachment, in Trauma Victims

In a novel brain-imaging study among trauma victims, researchers at NYU Langone Medical Center have linked an opioid receptor in the brain – associated with emotions – to a narrow cluster of trauma symptoms, including sadness, emotional detachment and listlessness. The study, published online today in the journal JAMA Psychiatry, holds important implications for targeted, personalized treatment of post-traumatic stress disorder, or PTSD, a psychiatric condition affecting more than 8 million Americans that can cause a wide range of debilitating psychiatric symptoms.

“Our study points toward a more personalized treatment approach for people with a specific symptom profile that’s been linked to a particular neurobiological abnormality,” says lead author Alexander Neumeister, MD, director of the molecular imaging program in the Departments of Psychiatry and Radiology at NYU School of Medicine, and Co-Director of NYU Langone’s Steven and Alexandra Cohen Veterans Center for the Study of Post-Traumatic Stress Disorder and Traumatic Brain Injury. “Understanding more about where and how symptoms of PTSD manifest in the brain is a critical part of research efforts to develop more effective medications and treatment modalities.”

The new study confirms a growing body of evidence linking a particular set of symptoms to specific brain circuits and chemicals, and bolsters a shift within the field of psychiatry away from “one-size-fits-all treatments” and toward more individualized medication regimens that target highly specific neurobiological components. “We know from previous clinical trials that antidepressants, for example, do not work well for dysphoria and the numbing symptoms often found in PTSD,” Dr. Neumeister added. “Currently available antidepressants are just not linked specifically enough to the neurobiological basis of these symptoms in PTSD. Going forward, our study will help pave the way toward development of better options.”

“People with cancer have a variety of different treatment options available based on the type of cancer that they have,” adds Dr. Neumeister. “We aim to do the same thing in psychiatry. We’re deconstructing PTSD symptoms, linking them to different brain dysfunction, and then developing treatments that target those symptoms. It’s really a revolutionary step forward that has been supported by the National Institute of Mental Health (NIMH) over the past few years in their Research Domain Criteria Project.”

The study, funded by the National Institute of Mental Health (NIMH), compared the brain scans of healthy volunteers with those of clinically diagnosed trauma victims with PTSD, major depression, and generalized anxiety disorder whose symptoms ranged from emotional detachment to isolation. Participants received a harmless radioactive tracer that binds to and illuminates a class of opioid receptors, known as kappa, when exposed to high-resolution positron emission tomography (PET). Kappa opioid receptors bind a potent natural opioid known as dynorphin, which is released by the body during times of stress to help relieve dysphoria or numbing.

Chronic exposure to stress, as is the case with PTSD, taxes kappa opioid receptors, causing the receptors to retract inside cells and leaving dynorphin without a place to dock. As a result, patients can experience dysphoria, characterized by feelings of hopelessness, detachment and emotional unease.

Results showed that fewer available kappa opioid receptors in the brain regions believed to govern emotions were associated with more intense feelings of dysphoria, but not feelings of anxious arousal. The findings confirm previous studies in animals linking the opioid-receptor system expressed in these specific brain regions to symptoms of dysphoria. The study also found an association between lower levels of cortisol, a stress hormone, and unavailable kappa opioid receptors, suggesting a new role for cortisol as a biomarker for certain types of PTSD symptoms.

“This is the first brain-imaging study to explore any psychiatric condition using a protein that binds to the kappa opioid receptor system,” notes Dr. Neumeister, who says the data support clinical trials under way at NYU Langone and other institutions of new medications that target kappa opioid receptors and other brain systems that can be linked to specific symptoms in trauma survivors. Such medications could be widely available for the treatment of PTSD in the future if ongoing clinical trials yield encouraging results.
(Image: Alamy)

Brain Imaging Research Pinpoints Neurobiological Basis for Key PTSD Symptoms, Such as Listlessness and Emotional Detachment, in Trauma Victims

In a novel brain-imaging study among trauma victims, researchers at NYU Langone Medical Center have linked an opioid receptor in the brain — associated with emotions — to a narrow cluster of trauma symptoms, including sadness, emotional detachment and listlessness. The study, published online today in the journal JAMA Psychiatry, holds important implications for targeted, personalized treatment of post-traumatic stress disorder, or PTSD, a psychiatric condition affecting more than 8 million Americans that can cause a wide range of debilitating psychiatric symptoms.

“Our study points toward a more personalized treatment approach for people with a specific symptom profile that’s been linked to a particular neurobiological abnormality,” says lead author Alexander Neumeister, MD, director of the molecular imaging program in the Departments of Psychiatry and Radiology at NYU School of Medicine, and Co-Director of NYU Langone’s Steven and Alexandra Cohen Veterans Center for the Study of Post-Traumatic Stress Disorder and Traumatic Brain Injury. “Understanding more about where and how symptoms of PTSD manifest in the brain is a critical part of research efforts to develop more effective medications and treatment modalities.”

The new study confirms a growing body of evidence linking a particular set of symptoms to specific brain circuits and chemicals, and bolsters a shift within the field of psychiatry away from “one-size-fits-all treatments” and toward more individualized medication regimens that target highly specific neurobiological components. “We know from previous clinical trials that antidepressants, for example, do not work well for dysphoria and the numbing symptoms often found in PTSD,” Dr. Neumeister added. “Currently available antidepressants are just not linked specifically enough to the neurobiological basis of these symptoms in PTSD. Going forward, our study will help pave the way toward development of better options.”

“People with cancer have a variety of different treatment options available based on the type of cancer that they have,” adds Dr. Neumeister. “We aim to do the same thing in psychiatry. We’re deconstructing PTSD symptoms, linking them to different brain dysfunction, and then developing treatments that target those symptoms. It’s really a revolutionary step forward that has been supported by the National Institute of Mental Health (NIMH) over the past few years in their Research Domain Criteria Project.”

The study, funded by the National Institute of Mental Health (NIMH), compared the brain scans of healthy volunteers with those of clinically diagnosed trauma victims with PTSD, major depression, and generalized anxiety disorder whose symptoms ranged from emotional detachment to isolation. Participants received a harmless radioactive tracer that binds to and illuminates a class of opioid receptors, known as kappa, when imaged with high-resolution positron emission tomography (PET). Kappa opioid receptors bind a potent natural opioid known as dynorphin, which is released by the body during times of stress to help relieve dysphoria or numbing.

Chronic exposure to stress, as in the case of PTSD, taxes kappa opioid receptors, causing them to retract inside cells and leaving dynorphin without a place to dock. As a result, patients can experience dysphoria, characterized by feelings of hopelessness, detachment and emotional unease.

Results showed that fewer available kappa opioid receptors in the brain regions believed to govern emotions were associated with more intense feelings of dysphoria, but not feelings of anxious arousal. The findings confirm previous studies in animals linking the opioid-receptor system expressed in these specific brain regions to symptoms of dysphoria. The study also found an association between lower levels of cortisol, a stress hormone, and unavailable kappa opioid receptors, suggesting a new role for cortisol as a biomarker for certain types of PTSD symptoms.

“This is the first brain-imaging study to explore any psychiatric condition using a protein that binds to the kappa opioid receptor system,” notes Dr. Neumeister, who says the data support clinical trials under way at NYU Langone and other institutions of new medications that target kappa opioid receptors and other brain systems that can be linked to specific symptoms in trauma survivors. Such medications could be widely available for the treatment of PTSD in the future if ongoing clinical trials yield encouraging results.

(Image: Alamy)


Gambling with confidence: Are you sure about that?
Life is a series of decisions, ranging from the mundane to the monumental. And each decision is a gamble, carrying with it the chance to second-guess. Did I make the right turn at that light? Did I choose the right college? Was this the right job for me?
Our desire to persist along a chosen path is almost entirely determined by our confidence in the decision: when you are confident that your choice is correct, you are willing to stick it out for a lot longer.
Confidence determines much of our path through life, but what is it? Most people would describe it as an emotion or a feeling. In contrast, scientists at Cold Spring Harbor Laboratory (CSHL) have found that confidence is actually a measurable quantity, and not reserved just for humans. The team, led by CSHL Associate Professor Adam Kepecs, has identified a brain region in rats whose function is required for the animals to express confidence in their decisions.
How do we know when a rat is exhibiting confidence? The researchers devised a method to study decision making in these animals. The rats were offered an odor that they were trained to associate with one of two doors. When they chose the correct door, they were rewarded. This part was easy for the animals: their selections were almost always correct. Things got trickier when Kepecs and his team offered a mixture of the two scents, with one dominating over the other by only a very small percentage. The rats now needed to choose the door representing the dominant odor in order to get their reward – a choice that reflects their best guess.
In work published today in Neuron, the team describes how confidence can be measured simply by challenging a rat to wait for the reward to be revealed behind the door. The time they are willing to wait serves as a measure of their confidence in the original decision. “We found that the rats are willing to ‘gamble’ with their time,” Kepecs explains, sometimes waiting as much as 15 seconds, which is an eternity for these animals. “This is something that we can measure and create mathematical models to explain,” says Kepecs. “The time rats are willing to wait predicts the likelihood of correct decisions and provides an objective measure to track the feeling of confidence.”
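The logic behind the wait-time measure can be illustrated with a toy signal-detection simulation. This is a sketch, not the authors' model: the evidence scale, the confidence-to-wait mapping, and the 6-second split are all arbitrary assumptions. It shows why willingness to wait can predict accuracy — trials with stronger perceived evidence yield both more correct choices and longer waits.

```python
import random
import statistics

def simulate_trial(mixture_contrast=0.3, noise=1.0):
    """One trial: a simulated rat senses a noisy odor mixture and picks a door.
    mixture_contrast is the true dominance of one odor (arbitrary units)."""
    evidence = random.gauss(mixture_contrast, noise)
    correct = evidence > 0                 # the dominant odor really is "positive"
    confidence = abs(evidence)             # stronger percept -> more confidence
    wait_time_s = min(15.0, 2.0 + 4.0 * confidence)  # assumed mapping, capped
    return correct, wait_time_s

random.seed(1)
trials = [simulate_trial() for _ in range(20_000)]
acc_long = statistics.mean(c for c, w in trials if w > 6.0)    # long waits
acc_short = statistics.mean(c for c, w in trials if w <= 6.0)  # short waits
print(f"accuracy after long waits: {acc_long:.2f}, short waits: {acc_short:.2f}")
```

In this sketch, accuracy is reliably higher on long-wait trials; replacing the confidence-dependent wait with a constant would mimic the OFC-inactivation result, where waiting is no longer guided by the likelihood of being correct.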
The researchers hypothesized that a distinct region of the brain might control confidence. Previous work has suggested that the orbitofrontal cortex (OFC), a part of the brain involved in making predictions, might have a role in decision confidence. Kepecs and his team specifically shut off neurons in the OFC, inactivating it, and found that rats no longer exhibited appropriate levels of confidence in their decisions.
“With an inactive OFC, the rats retained the ability to make decisions – their accuracy did not change,” says Kepecs. “And they spent the same amount of time waiting for a reward on average. The only difference is that animals’ willingness to wait for a reward was no longer guided by confidence. They would often wait a long time even when they were wrong.”
The discovery offers a rare glimpse into the neuronal basis of a higher-level cognitive process, and is likely to have implications in human decision-making as well. As Kepecs describes, “we now know that the OFC is critical for making on-the-fly predictions in rats. The human OFC is just a more sophisticated version of the rodent counterpart.” The team is expanding their research to explore how the elusive feelings of confidence are based on objective predictions that influence human decisions as well.


Single dose of antidepressant changes the brain
A single dose of antidepressant is enough to produce dramatic changes in the functional architecture of the human brain. Brain scans taken of people before and after an acute dose of a commonly prescribed SSRI (selective serotonin reuptake inhibitor) reveal changes in connectivity within three hours, say researchers who report their observations in the Cell Press journal Current Biology on September 18.
"We were not expecting the SSRI to have such a prominent effect on such a short timescale or for the resulting signal to encompass the entire brain," says Julia Sacher of the Max Planck Institute for Human Cognitive and Brain Sciences.
While SSRIs are among the most widely studied and prescribed form of antidepressants worldwide, it’s still not entirely clear how they work. The drugs are believed to change brain connectivity in important ways, but those effects had generally been thought to take place over a period of weeks, not hours.
The new findings show that changes begin to take place right away. Sacher says what they are seeing in medication-free individuals who had never taken antidepressants before may be an early marker of brain reorganization.
Study participants let their minds wander for about 15 minutes in a brain scanner that measures the oxygenation of blood flow in the brain. The researchers characterized three-dimensional images of each individual’s brain by measuring the number of connections between small blocks known as voxels (comparable to the pixels in an image) and the change in those connections with a single dose of escitalopram (trade name Lexapro).
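The voxel-connectivity analysis described above can be sketched with a toy correlation computation. The data here are simulated, and the 50-voxel size and 0.25 correlation threshold are arbitrary assumptions rather than the study's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated resting-state data: 50 "voxels" x 200 time points.
n_voxels, n_timepoints = 50, 200
shared = rng.standard_normal(n_timepoints)              # a common slow fluctuation
series = 0.5 * shared + rng.standard_normal((n_voxels, n_timepoints))

corr = np.corrcoef(series)                              # voxel-by-voxel correlations
np.fill_diagonal(corr, 0.0)                             # ignore self-connections
degree = (np.abs(corr) > 0.25).sum(axis=1)              # connection count per voxel
print("mean connections per voxel:", degree.mean())
```

In terms of this sketch, a drug that "reduces intrinsic connectivity" would show up as a drop in these per-voxel connection counts between pre-dose and post-dose scans, while a regional increase (as reported for the cerebellum and thalamus) would raise the counts for the voxels in that region.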
Their whole-brain network analysis shows that one dose of the SSRI reduces the level of intrinsic connectivity in most parts of the brain. However, Sacher and her colleagues observed an increase in connectivity within two brain regions, specifically the cerebellum and thalamus.
The researchers say the new findings represent an essential first step toward clinical studies in patients suffering from depression. They also plan to compare the functional connectivity signature of brains in recovery and those of patients who fail to respond after weeks of SSRI treatment.
Understanding the differences between the brains of individuals who respond to SSRIs and those who don’t “could help to better predict who will benefit from this kind of antidepressant versus some other form of therapy,” Sacher says. “The hope that we have is that ultimately our work will help to guide better treatment decisions and tailor individualized therapy for patients suffering from depression.”


ucsdhealthsciences:

Scientists Discover “Dimmer Switch” For Mood Disorders
Researchers at University of California, San Diego School of Medicine have identified a control mechanism for an area of the brain that processes sensory and emotive information that humans experience as “disappointment.”
The discovery of what may effectively be a neurochemical antidote for feeling let-down is reported Sept. 18 in the online edition of Science.
“The idea that some people see the world as a glass half empty has a chemical basis in the brain,” said senior author Roberto Malinow, MD, PhD, professor in the Department of Neurosciences and neurobiology section of the Division of Biological Sciences. “What we have found is a process that may dampen the brain’s sensitivity to negative life events.”
Because people struggling with depression are believed to register negative experiences more strongly than others, the study’s findings have implications for understanding not just why some people have a brain chemistry that predisposes them to depression but also how to treat it.
Specifically, in experiments with rodents, UC San Diego researchers discovered that neurons feeding into a small region above the thalamus known as the lateral habenula (LHb) secrete both a common excitatory neurotransmitter, glutamate, and its opposite, the inhibitory neurotransmitter GABA.
Excitatory neurotransmitters promote neuronal firing while inhibitory ones suppress it, and although glutamate and GABA are two of the most common neurotransmitters in the mammalian brain, neurons are usually specialists, producing one but not both kinds of chemical messengers.
Indeed, prior to the study, there were only two other systems in the brain where neurons had been observed to co-release excitatory and inhibitory neurotransmitters – in a particular connection in the hippocampus and in the brainstem during development of the brain’s auditory map.
“Our study is one of the first to rigorously document that inhibition can co-exist with excitation in a brain pathway,” said lead author Steven Shabel, a postdoctoral researcher in the Department of Neurosciences and the neurobiology section of the Division of Biological Sciences. “In our case, that pathway is believed to signal disappointment.”
The LHb is a small node-like structure in the epithalamus region of the brain that is critical for processing a variety of inputs from the basal ganglia, hypothalamus and cerebral cortex and transmitting encoded responses (output) to the brainstem, an ancient part of the brain that mammals share with reptiles.
Experiments with primates have shown that activity in the LHb increases markedly when monkeys are expecting but don’t get a sip of fruit juice or other reward, hence the idea that this region is part of a so-called disappointment pathway.
Proper functioning of the LHb, however, is believed to be important in much more than just disappointment and has been implicated in regulating pain responses and a variety of motivational behaviors. It has also been linked to psychosis.
Depression, in particular, has been linked to hyperactivity of the LHb, but until this study, researchers had little empirical evidence as to how this overstimulation is prevented in healthy individuals given the apparent lack of inhibitory neurons in this region of the brain.
"The take-home of this study is that inhibition in this pathway is coming from an unusual co-release of neurotransmitters into the habenula," Shabel said. Researchers do not know why this region of the brain is controlled in this manner, but one hypothesis is that it allows for a more subtle control of signaling than having two neurons directly counter-acting each other.
Researchers were also able to show that neurons of rodents with aspects of human depression produced less GABA, relative to glutamate. When these animals were given an antidepressant to raise their brain’s serotonin levels, their relative GABA levels increased.
"Our study suggests that one of the ways in which serotonin alleviates depression is by rebalancing the brain’s processing of negative life events vis-à-vis the balance of glutamate and GABA in the habenula," Shabel said. "We may now have a precise neurochemical explanation for why antidepressants make some people more resilient to negative experiences."
Pictured: Basal ganglia neurons (green) feed into the brain and release glutamate (red) and GABA (blue) and sometimes a mix of both neurotransmitters (white).


World Alzheimer Report 2014: Evidence for dementia risk reduction
The World Alzheimer Report 2014 ‘Dementia and Risk Reduction: An analysis of protective and modifiable factors’, released today, calls for dementia to be integrated into both global and national public health programmes alongside other major non communicable diseases (NCDs). 
Alzheimer’s Disease International (ADI) commissioned a team of researchers, led by Professor Martin Prince from King’s College London, to produce the report. ADI is publishing this report, in conjunction with World Alzheimer’s Day (21 September) and as a part of World Alzheimer’s Month, an international campaign to raise awareness and challenge stigma.
The report reveals that control of diabetes and high blood pressure, as well as measures to encourage smoking cessation and reduce cardiovascular risk, have the potential to reduce the risk of dementia even in late life. The report found that diabetes can increase the risk of dementia by 50%. Obesity and lack of physical activity are important risk factors for diabetes and hypertension, and should therefore also be targeted.
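In relative-risk terms, "increase the risk by 50%" means multiplying the baseline risk by 1.5. A quick hypothetical calculation makes the arithmetic concrete — note the 10% baseline is an assumed figure for illustration, not a number from the report.

```python
baseline_risk = 0.10        # assumed lifetime dementia risk without diabetes
relative_risk = 1.5         # "diabetes can increase the risk of dementia by 50%"
risk_with_diabetes = baseline_risk * relative_risk
absolute_increase = risk_with_diabetes - baseline_risk
print(f"risk with diabetes: {risk_with_diabetes:.0%}, "
      f"absolute increase: {absolute_increase:.0%}")
# -> risk with diabetes: 15%, absolute increase: 5%
```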
While cardiovascular health is improving in many high income countries, many low and middle income countries show a recent pattern of increasing exposure to cardiovascular risk factors, with rising rates of diabetes, heart disease and stroke. 
Smoking cessation is strongly linked in the report with a reduction in dementia risk. For example, studies of dementia incidence among people aged 65 years and over show that ex-smokers have a similar risk to those who have never smoked, while those who continue to smoke are at much higher risk. 
Furthermore, the study revealed that those who have had better educational opportunities have a lower risk of dementia in late-life. Evidence suggests that education has no impact on the brain changes that lead to dementia, but reduces their impact on intellectual functioning.
The evidence in the report suggests that if we enter old age with better developed, healthier brains, we are likely to live longer, happier and more independent lives, with a much reduced chance of developing dementia. Brain health promotion is important across the lifespan, but particularly in mid-life, as changes in the brain can begin decades before symptoms appear.
The report also urges NCD programs to be more inclusive of older people, with the message that it’s never too late to make a change, as the future course of the global dementia epidemic is likely to depend crucially upon the success or failure of efforts to improve global public health, across the population. Combining efforts to tackle the increasing global burden of NCDs will be strategically important, efficient and cost effective. Leading a healthier lifestyle is a positive step towards preventing a range of long-term diseases, including cancer, heart disease, stroke and diabetes. 
However, survey data released by Bupa* has shown that many people are unclear about the causes of dementia and the actions they can take to potentially reduce their risk. Just over a sixth (17%) of people realised that social interaction with friends and family could affect the risk. Only a quarter (25%) identified being overweight as a possible factor, and fewer than one in four (23%) said physical activity could affect the risk of developing dementia and losing their memories. The survey also revealed that over two thirds (68%) of people surveyed around the world are concerned about getting dementia in later life.
Professor Martin Prince, from King’s College London’s Institute of Psychiatry, Psychology & Neuroscience (IoPPN) and author of the report, commented: “There is already evidence from several studies that the incidence of dementia may be falling in high income countries, linked to improvements in education and cardiovascular health. We need to do all we can to accentuate these trends. With a global cost of over US$ 600 billion, the stakes could hardly be higher.”
Marc Wortmann, Executive Director, Alzheimer’s Disease International said: “From a public health perspective, it is important to note that most of the risk factors for dementia overlap with those for the other major non communicable diseases (NCDs). In high income countries, there is an increased focus on healthier lifestyles, but this is not always the case with lower and middle income countries. By 2050, we estimate that 71% of people living with dementia will live in these regions, so implementing effective public health campaigns may help to reduce the global risk.”
Professor Graham Stokes, Global Director of Dementia Care, Bupa, said: “While age and genetics are part of the disease’s risk factors, not smoking, eating more healthily, getting some exercise, and having a good education, coupled with challenging your brain to ensure it is kept active, can all play a part in minimising your chances of developing dementia. People who already have dementia, or signs of it, can also do these things, which may help to slow the progression of the disease.”
 * These figures, unless otherwise stated, are from YouGov Plc. Total sample size was 8,513, from the UK (2,401), Australia (1,000), Chile (1,000), China (1,031), Poland (1,002), and Spain (1,077). Fieldwork was undertaken online, between 17–25 July 2014. The figures have been weighted and are representative of all adults (aged 18+) in each country. An even weighting was applied to each country to find a ‘Global Average’. 
(Image credit)

World Alzheimer Report 2014: Evidence for dementia risk reduction

The World Alzheimer Report 2014, ‘Dementia and Risk Reduction: An analysis of protective and modifiable factors’, released today, calls for dementia to be integrated into both global and national public health programmes alongside other major non-communicable diseases (NCDs).

Alzheimer’s Disease International (ADI) commissioned a team of researchers, led by Professor Martin Prince from King’s College London, to produce the report. ADI is publishing this report, in conjunction with World Alzheimer’s Day (21 September) and as a part of World Alzheimer’s Month, an international campaign to raise awareness and challenge stigma.

The report reveals that control of diabetes and high blood pressure, as well as measures to encourage smoking cessation and reduce cardiovascular risk, has the potential to reduce the risk of dementia, even in late life. The report found that diabetes can increase the risk of dementia by 50%. Obesity and lack of physical activity are important risk factors for diabetes and hypertension and should therefore also be targeted.

While cardiovascular health is improving in many high income countries, many low and middle income countries show a recent pattern of increasing exposure to cardiovascular risk factors, with rising rates of diabetes, heart disease and stroke.

Smoking cessation is strongly linked in the report with a reduction in dementia risk. For example, studies of dementia incidence among people aged 65 years and over show that ex-smokers have a similar risk to those who have never smoked, while those who continue to smoke are at much higher risk. 

Furthermore, the study revealed that those who have had better educational opportunities have a lower risk of dementia in late life. Evidence suggests that education has no impact on the brain changes that lead to dementia, but reduces their effect on intellectual functioning.

The evidence in the report suggests that if we enter old age with better developed, healthier brains, we are likely to live longer, happier and more independent lives, with a much reduced chance of developing dementia. Brain health promotion is important across the life span, but particularly in mid-life, as changes in the brain can begin decades before symptoms appear.

The report also urges NCD programmes to be more inclusive of older people, with the message that it is never too late to make a change: the future course of the global dementia epidemic is likely to depend crucially on the success or failure of efforts to improve public health across the population. Combining efforts to tackle the increasing global burden of NCDs will be strategically important, efficient and cost effective. Leading a healthier lifestyle is a positive step towards preventing a range of long-term diseases, including cancer, heart disease, stroke and diabetes.

However, survey data released by Bupa* show that many people are unclear about the causes of dementia and the actions they can take to potentially reduce their risk. Just over a sixth (17%) of people realised that social interaction with friends and family could affect the risk. Only a quarter (25%) identified being overweight as a possible factor, and fewer than a quarter (23%) said physical activity could affect the risk of developing dementia. The survey also revealed that over two thirds (68%) of people surveyed around the world are concerned about getting dementia in later life.

Professor Martin Prince, from King’s College London’s Institute of Psychiatry, Psychology & Neuroscience (IoPPN) and author of the report, commented: “There is already evidence from several studies that the incidence of dementia may be falling in high income countries, linked to improvements in education and cardiovascular health. We need to do all we can to accentuate these trends. With a global cost of over US$ 600 billion, the stakes could hardly be higher.”

Marc Wortmann, Executive Director of Alzheimer’s Disease International, said: “From a public health perspective, it is important to note that most of the risk factors for dementia overlap with those for the other major non communicable diseases (NCDs). In high income countries, there is an increased focus on healthier lifestyles, but this is not always the case with lower and middle income countries. By 2050, we estimate that 71% of people living with dementia will live in these regions, so implementing effective public health campaigns may help to reduce the global risk.”

Professor Graham Stokes, Global Director of Dementia Care at Bupa, said: “While age and genetics are part of the disease’s risk factors, not smoking, eating more healthily, getting some exercise, and having a good education, coupled with challenging your brain to ensure it is kept active, can all play a part in minimising your chances of developing dementia. People who already have dementia, or signs of it, can also do these things, which may help to slow the progression of the disease.”

* These figures, unless otherwise stated, are from YouGov Plc. Total sample size was 8,513, from the UK (2,401), Australia (1,000), Chile (1,000), China (1,031), Poland (1,002), and Spain (1,077). Fieldwork was undertaken online, between 17–25 July 2014. The figures have been weighted and are representative of all adults (aged 18+) in each country. An even weighting was applied to each country to find a ‘Global Average’.
