Neuroscience

Articles and news from the latest research reports.

77 notes




Diuretic Drug Offers Latest Hope for Autism Treatment

A drug used for decades to treat high blood pressure and other conditions has shown promise in a small clinical trial for autism. The drug, bumetanide, reduced the overall severity of behavioral symptoms after 3 months of daily treatment. The researchers say that many parents of children who received the drug reported that their children were more “present” and engaged in social interactions after taking it. The new findings are among several recent signs that treatments to address the social deficits at the core of autism may be on the horizon.

Several lines of evidence suggest that autism interferes with the neurotransmitter GABA, which typically puts a damper on neural activity. Bumetanide may enhance the inhibitory effects of GABA, and the drug has been used safely as a diuretic to treat a wide range of heart, lung, and kidney conditions. In the new study, researchers led by Yehezkel Ben-Ari at the Mediterranean Institute of Neurobiology in Marseille, France, recruited 60 autistic children between the ages of 3 and 11 and randomly assigned them to receive either a daily pill of bumetanide or a placebo. (Neither the children’s parents nor the researchers who assessed the children knew who received the actual drug.)

As a group, those who got bumetanide improved by 5.6 points on a 60-point scale that’s often used to assess behaviors related to autism, the researchers report today in Translational Psychiatry. That was enough to nudge the group average just under the cutoff for severe autism and into the mild to medium category. The study did not look directly at whether the drug improved all symptoms equally or some more than others. “We have some indications that the symptoms particularly ameliorated with bumetanide are the genuine core symptoms of autism, namely communication and social interactions,” Ben-Ari says. More work will be needed to verify that impression. Ben-Ari says his team is now preparing for a larger, multicenter trial in Europe.

Filed under autism treatment bumetanide neurotransmitters GABA science

138 notes

The image of mental fatigue

Functional magnetic resonance imaging offers insights into mental fatigue


We all perhaps know the feeling of mental exhaustion, but what does it mean physiologically to have mental fatigue? A new study carried out using brain scans could help scientists uncover the neurobiological mechanisms underlying mental fatigue.

According to Bui Ha Duc and Xiaoping Li of the National University of Singapore, writing in a forthcoming issue of the International Journal of Computer Applications in Technology, mental fatigue has become commonplace as many people face increasing mental demands from stressful jobs and longer working hours, with less time to relax and increasingly frequent sleep problems. Mental fatigue has received attention from those working in health and well-being generally, as well as from the military and the transport industry. After all, mental fatigue not only affects the health of individuals but can also have implications for road safety and international security.

The researchers used functional magnetic resonance imaging (fMRI) to monitor activity in the brains of ten student volunteers (male and female, aged 19 to 25 years) who were deprived of sleep for 25 hours and given a simple task repeatedly throughout that period. Scans were carried out at 9 am and 2 pm on the first day, then at 3 am and again at 9 am the following day. All volunteers had avoided alcohol and caffeine for the 24 hours prior to the experiment, all were physically and mentally fit prior to participation, and none had any sleep problems.

The activation of the left thalamus increases with sleep deprivation, following a trend exactly opposite to that of the inferior parietal cortex, which (tracking the circadian rhythm) decreases in activation from 9 am to 3 am the next day and then increases again. This finding makes sense: the inferior parietal cortex integrates information from different sensory modalities, and all of that information must pass through the thalamus. When activation in the inferior parietal cortex decreases, the thalamus must therefore increase its activation to get the information sent through.

The team explains that a gradual increase in mental fatigue led to decreased activity in the volunteers’ brains in specific regions: the anterior cingulate gyrus, right inferior frontal, left middle frontal and right superior temporal cortex. The anterior cingulate cortex has been described as an interface between motivation, cognition and action, and has been implicated in using reinforcement information to control behavior. The fMRI scans suggest that decreased activity in this part of the brain is therefore linked to those familiar feelings of mental fatigue including lethargy and slowness of thinking.

"The research provides a neurophysiologic basis for measuring the level of mental fatigue by EEG, as well as for the intervention by non-invasive neural stimulation to maintain wakefulness," the team says. "We have developed devices for both, which will be commercialized by our spinoff company, Newrocare Pte Ltd."

(Source: eurekalert.org)

Filed under brain mental fatigue health fMRI sleep deprivation neuroscience science

39 notes






Researcher Finds Gender Differences In Seasonal Auditory Changes

Auditory systems differ between sexes in sparrows depending on the season, a Georgia State University neuroscientist has found. The work adds to our knowledge of how the parts of the nervous system, including that of humans, are able to change.

Megan Gall, a post-doctoral researcher with Georgia State’s Neuroscience Institute, tested the peripheral auditory systems of male and female house sparrows, comparing the hearing of each gender during non-breeding seasons and breeding seasons.

Gall measured frequency selectivity – the ability to tell apart sounds that are close in frequency – and temporal resolution – the ability to tell apart sounds that are very close together in time.

“We found that males have the same frequency selectivity and temporal resolution across breeding seasons,” Gall said. “In the fall, males and females aren’t different. But in the breeding season, females had better frequency selectivity, but this came at the expense of worse temporal resolution.”

The study was published in the Proceedings of the Royal Society B, a British scientific journal.

The difference shows “plasticity,” the ability to change, she said. Plasticity is an important concept in neuroscience, as scientists have increasingly been able to show that neurological systems have the ability to change.

Gall said the work shows, for the first time, that there’s seasonal plasticity in these properties in the periphery of the auditory system, the ear and the auditory nerve, not just inside the parts of the brain that control auditory function.

Similar changes happen in humans, she said. Women show different auditory sensitivities during the course of a menstrual cycle. “I always like to say that if your husband says he can’t hear you, it may be that he can’t. His auditory system is different than yours,” Gall said.

The changes might have evolved over time for different reasons, she said, with one reason being that certain kinds of tissue involved in hearing are harder for the body to maintain.

Filed under sparrow songbird temporal resolution auditory system plasticity neuroscience science

267 notes





First facial reconstruction of the Indonesian ‘Hobbit’ unveiled

Scientists at this week’s Australian Archaeological Conference have unveiled the face of Homo floresiensis – more commonly referred to as the ‘Hobbit’ – for the first time. Specialist facial anthropologist Dr. Susan Hayes used forensic facial approximation techniques to build out a female skull specimen discovered in 2003 in Flores, Indonesia. Other bones have been found since, revealing that these Hobbits were only about three and a half feet tall – just like the creatures of J.R.R. Tolkien lore that will hit the big screen later this week. Homo floresiensis populated the island of Flores between 95,000 and 17,000 years ago, but it’s not yet clear where the species falls within the human evolutionary tree. Although she’s pleased with the final results, Hayes says that the reconstruction was far from easy – “she’s not what you’d call pretty, but she is definitely distinctive.”

Filed under Hobbit anthropology facial reconstruction homo floresiensis evolution neuroscience science

54 notes





EyeWire launches today with J Day!

It’s time to mobilize a global community of citizen neuroscientists to trace the 3D structure of J Cells and understand how retinal connectomes relate to visual perception.

A specific type of retinal neuron called the J Cell responds to stimuli that move downward on the retina (which corresponds to upward in the visual world). Neuroscientists do not currently understand how the neural circuits of the retina cause the J Cell to respond in this way. That’s one of the reasons we built EyeWire. By playing EyeWire, you map the 3D structure of retinal neurons and their connections, and collaborate with neuroscientists at MIT, the Max Planck Institute for Medical Research, and Harvard.

Over the past several months, members of Sebastian Seung’s lab at MIT have been hard at work making sure EyeWire allows users to accurately contribute to research. During our beta period, an average of 30 to 50 people played EyeWire each day. Collectively, EyeWirers have mapped over 160,000 individual cubes since the beta went live in spring. We hope to dwarf these numbers in the coming months.

Check out a short video from Sebastian Seung, who shares why we created EyeWire and how you can get involved.

Filed under EyeWire J cells visual perception retinal connectomes neuroscience science

70 notes








Mayo Clinic Researchers Uncover Toxic Interaction in Neurons that Leads to Dementia and ALS

Researchers at Mayo Clinic in Florida have uncovered a toxic cellular process by which a protein that maintains the health of neurons becomes deficient and can lead to dementia. The findings shed new light on the link between culprits implicated in two devastating neurological diseases: frontotemporal dementia and amyotrophic lateral sclerosis, also known as Lou Gehrig’s disease. The study is published Dec. 10 in the online issue of Proceedings of the National Academy of Sciences.

There is no cure for frontotemporal dementia, a disorder that affects personality, behavior and language and is second only to Alzheimer’s disease as the most common form of early-onset dementia. While much research is devoted to understanding the role of each defective protein in these diseases, the team at Mayo Clinic took a new approach to examine the interplay between TDP-43, a protein that regulates messenger ribonucleic acid (mRNA) — biological molecules that carry the information of genes and are used by cells to guide protein synthesis — and sortilin, which regulates the protein progranulin.

"We sought to investigate how TDP-43 regulates the levels of the protein progranulin, given that extreme progranulin levels at either end of the spectrum, too low or too high, can respectively lead to neurodegeneration or cancer," says the study’s lead investigator, Mercedes Prudencio, Ph.D., a neuroscientist at the Mayo Clinic campus in Florida.

The neuroscientists found that a lack of the protein TDP-43, long implicated in frontotemporal dementia and amyotrophic lateral sclerosis, leads to elevated levels of defective sortilin mRNA. The research team is the first to identify significantly elevated levels of the defective sortilin mRNA in autopsied human brain tissue of frontotemporal dementia/TDP cases, the most common subtype of the disease.

Filed under neurodegeneration neuron dementia protein synthesis protein neuroscience science

45 notes





Infants process faces long before they recognize other objects

New research from psychology Research Professor Anthony Norcia and postdoctoral fellow Faraz Farzin, both of the Stanford Vision and NeuroDevelopment Lab, suggests a physical basis for infants’ ogling. As early as four months, babies’ brains already process faces at nearly adult levels, even while other images are still being analyzed in lower levels of the visual system.

The results fit, Farzin pointed out, with the prominent role human faces play in a baby’s world.

"If anything’s going to develop earlier it’s going to be face recognition," she said.

The paper appeared in the online Journal of Vision.

The researchers noninvasively measured electrical activity generated in the infants’ brains with a net of sensors placed over the scalp – a sort of electroencephalographic skullcap.

The sensors were monitoring what are called steady-state visual evoked potentials – spikes in brain activity elicited by visual stimulation. By flashing photographs at infants and adults and measuring their brain activity at the same steady rhythm – a technique Norcia has pioneered for over three decades – the researchers were able to “ask” the participants’ brains what they perceived.

When the experiment is conducted on adults, faces and objects (like a telephone or an apple) light up similar areas of the temporal lobe – a region of the brain devoted to higher-level visual processing.

Infants’ neural responses to faces were similar to those of adults, showing activity over a part of the temporal lobe researchers think is devoted to face processing.

Filed under infants face recognition face processing object perception neuroscience psychology science

53 notes








Can Going Hungry As a Child Slow Down Cognitive Decline in Later Years?

People who sometimes went hungry as children had slower cognitive decline once they were elderly than people who always had enough food to eat, according to a new study published in the December 11, 2012, print issue of Neurology®, the medical journal of the American Academy of Neurology.

“These results were unexpected because other studies have shown that people who experience adversity as children are more likely to have problems such as heart disease, mental illness and even lower cognitive functioning than people whose childhoods are free of adversity,” said study author Lisa L. Barnes, PhD, of Rush University Medical Center in Chicago.

For the African American participants, the 5.8 percent who reported that they went without enough food to eat sometimes, often or always were more likely to have a slower rate of cognitive decline, or decline that was reduced by about one-third, than those who rarely or never went without enough food to eat. The 8.4 percent of African American participants who reported that they were much thinner at age 12 than other kids their age also were more likely to have a slower rate of cognitive decline, also by one-third, than those who said they were about the same size or heavier than other kids their age. For Caucasians, there was no relationship between any of the childhood adversity factors and cognitive decline.

Barnes said researchers aren’t sure why childhood hunger could have a possible protective effect on cognitive decline. One potential explanation for the finding could be found in research that has shown that calorie restriction can delay the onset of age-related changes in the body and increase the life span. Another explanation could be a selective survival effect. The older people in the study who experienced childhood adversity may be the hardiest and most resilient of their era; those with the most extreme adversity may have died before they reached old age.

Barnes noted that the results stayed the same after researchers adjusted for factors such as amount of education and health problems. The results also did not change after researchers repeated the analysis after excluding people with the lowest cognitive function at the beginning of the study to help rule out the possibility that people with mild, undiagnosed Alzheimer’s disease were included in the study.

Because relatively few Caucasians in the study reported childhood adversity, the study may not have been able to detect an effect of adversity on cognitive decline in Caucasians, Barnes said.

Filed under cognitive decline children hunger cognitive functioning childhood adversity neuroscience science

42 notes



Postpartum women less stressed by threats unrelated to the baby

Following the birth of a child, new mothers may have an altered perception of stresses around them, showing less interest in threats unrelated to the baby. This change to the neuroendocrine circuitry could help the mothers adapt to the additional stress often accompanying newborns, say researchers from Indiana University’s Kinsey Institute and the University of Zurich.

When viewing disturbing images during the study, postpartum women reported less distress and demonstrated less activity in their amygdala, the part of the brain that controls emotional response, than nulliparous, or childless, women, according to functional magnetic resonance imaging.

When the childless women were administered a nasal spray containing the hormone oxytocin, however, their brain images looked more similar to the postpartum women, and they also reported less subjective stress when viewing the images.

"Our findings extend previous work showing a lower stress response with motherhood that likely enhances her ability to cope with this dramatic new role," said lead author Heather Rupp, director of psychology and neuroscience at Brain Surgery Worldwide Inc. and a research fellow at The Kinsey Institute for Research in Sex, Gender and Reproduction.

The study, “Amygdala response to negative images in postpartum versus nulliparous women and intranasal oxytocin,” was published in the online journal Social Cognitive and Affective Neuroscience.

Filed under stress stress response oxytocin amygdala postpartum women neuroscience science

81 notes




Combination of imaging exams improves Alzheimer’s diagnosis

A combination of diagnostic tests, including imaging and cerebrospinal fluid biomarkers, can improve prediction of conversion from mild cognitive impairment (MCI) to Alzheimer’s disease, according to a new study published online in the journal Radiology.

"Because new treatments are likely to be most effective at the earliest stages of Alzheimer’s disease, there is great urgency to develop sensitive markers that facilitate detection and monitoring of early brain changes in individuals at risk," said Jeffrey R. Petrella, M.D., associate professor of radiology, division of neuroradiology, and director of the Alzheimer’s Disease Research Lab at Duke University Medical Center (DUMC) in Durham, N.C. "Our study looks at whether more sophisticated diagnostic tests such as magnetic resonance imaging (MRI), positron emission tomography (PET) and spinal fluid protein analysis might provide additional prognostic information, compared to more readily available cognitive and blood testing."

According to the World Health Organization, more than 35 million people worldwide are living with Alzheimer’s disease, which is incurable, and the prevalence is expected to double by 2030.

"Although there is no cure for Alzheimer’s disease, there are four symptomatic treatments that might provide some benefits," said coauthor P. Murali Doraiswamy, M.D., professor of psychiatry at DUMC. "So developing the right combination of diagnostic tests is critical to make sure we enable an accurate and early diagnosis in patients, so they can evaluate their care options."

Filed under mild cognitive impairment alzheimer's disease neuroimaging diagnostic test neuroscience science
