Neuroscience

Articles and news from the latest research reports.

91 notes

Stressed-Out Tadpoles Grow Larger Tails to Escape Predators 
When people or animals are thrust into threatening situations such as combat or attack by a predator, stress hormones are released to help prepare the organism to defend itself or to rapidly escape from danger—the so-called fight-or-flight response.
Now University of Michigan researchers have demonstrated for the first time that stress hormones are also responsible for altering the body shape of developing animals, in this case the humble tadpole, so they are better equipped to survive predator attacks.
Through a series of experiments conducted at field sites and in the laboratory, U-M researchers demonstrated that prolonged exposure to a stress hormone enabled tadpoles to increase the size of their tails, which improved their ability to avoid lethal predator attacks.
"This is the first clear demonstration that a stress hormone produced by the animal can actually cause a morphological change, a change in body shape, that improves their survival in the presence of lethal predators. It’s a survival response," said Robert Denver, a professor of molecular, cellular and developmental biology and of ecology and evolutionary biology.
The team’s surprising findings are detailed in a paper to be published online March 5 in the journal Proceedings of the Royal Society B. The paper’s first author is Jessica Middlemis Maher, a former U-M doctoral student now at Michigan State University, who conducted the work for her dissertation.
Scientists have long known that environmental changes can prompt animals and plants to alter their morphology and physiology, as well as the timing of developmental events. For example, tadpoles can accelerate metamorphosis into frogs in response to a drying pond, a high density of predators or a lack of food.
The term “phenotypic plasticity” is used to describe modifications by animals and plants in response to a changing environment.
"There’s been a lot of interest in phenotypic plasticity among developmental biologists and evolutionary ecologists for more than 70 years, but there’s been relatively little focus on the mechanisms by which the environmental signal is translated into a functional response," Denver said.
"We’ve known, for example, that tadpoles can change their body shape in response to predation risk. But until now, nobody knew the basic physiological mechanisms mediating that response. That’s what’s novel about this study."
(Image: Wikimedia Commons)

Filed under tadpoles stress stress hormones corticosterone fight-or-flight response evolution neuroscience science

30 notes

New Effort to Identify Parkinson’s Biomarkers

Last month, the National Institutes of Health announced a new collaborative initiative that aims to accelerate the search for biomarkers — changes in the body that can be used to predict, diagnose or monitor a disease — in Parkinson’s disease, in part by improving collaboration among researchers and helping patients get involved in clinical studies. As part of this program, launched by the National Institute of Neurological Disorders and Stroke (NINDS), part of the NIH, Clemens Scherzer, MD, a neurologist and researcher at Brigham and Women’s Hospital (BWH), was awarded $2.6 million over five years to develop biomarkers and to provide NINDS-wide access to one of the world’s largest Parkinson’s data and biospecimen banks, housed at BWH. The NINDS initiative is highlighted in an editorial in the March issue of Lancet Neurology.

"There is a critical gap in the research that leads to lack of treatment for diseases like Parkinson’s," said Scherzer. "Biomarkers are desperately needed to make clinical trials more efficient, less expensive and to monitor disease and treatment response. We are hopeful that this initiative will fast track new discoveries in this area."

According to Scherzer, most of our knowledge of the human brain is based on the analysis of just 1.5 percent of the human genome that encodes proteins. The first part of Scherzer’s project will examine the function of the remaining 98.5 percent of the genome that, so far, has been unexplored in the human brain. While this remainder had been previously dismissed as “junk”, it is now becoming clearer that parts of it actively regulate cell biology.  Scherzer and colleagues believe that “dark matter” RNA transcribed from stretches of so called “junk” DNA is active in brain cells and contributes to the complexity of normal dopamine neurons and, when corrupted, Parkinson’s disease.

"This offers a potentially groundbreaking opportunity for biomarker development. Initially, the team will search for these RNAs in brain tissue from individuals at the earliest stages of the disease. Then the team will look for related biomarkers in the bloodstream and cerebrospinal fluid in both healthy brains and those with Parkinson’s," Scherzer said.

Scherzer’s lab has been spearheading biomarker research in this field since 2004. With support from the Harvard NeuroDiscovery Center, the team already has 2,000 patients enrolled and followed in a longitudinal study with rich clinical data, along with one of the world’s largest biobanks of Parkinson’s tissue. The biobank was designed as an incubator for Parkinson’s research and until now was chiefly available for research collaborations within the Harvard-affiliated community. As part of this new project, this vast resource will be open to all NIH-funded investigators.

"Our ultimate goal is to personalize treatment for our patients with Parkinson’s," said Scherzer. "By opening up this vast collection of specimens, we are exploding the resources that are available to NIH-funded investigators looking at this disease. We hope to harness the power of collaboration to speed up biomarker discovery."

(Source: brighamandwomens.org)

Filed under parkinson's disease biomarker brain brain tissue genomics neuroscience science

172 notes

New gene variant may explain psychotic features in bipolar disorder
Researchers at Karolinska Institutet have found an explanation for why the level of kynurenic acid (KYNA) is higher in the brains of people with schizophrenia or bipolar disorder with psychosis. The study, published in the journal Molecular Psychiatry, identifies a gene variant associated with increased production of KYNA.
The discovery adds to our understanding of the link between inflammation and psychosis, and might pave the way for improved therapies. Kynurenic acid is a substance that affects several signalling pathways in the brain and is integral to cognitive function. Earlier studies of cerebrospinal fluid have shown that KYNA levels are elevated in the brains of patients with schizophrenia or bipolar disorder with psychotic features, but the reason has not been fully understood.
KMO is an enzyme involved in the production of KYNA, and the Karolinska Institutet team has now shown that some individuals carry a genetic variant of KMO that affects the enzyme’s quantity, resulting in higher levels of KYNA. The study also shows that patients with bipolar disorder who carry this variant had almost twice the risk of developing psychotic episodes.
KYNA is produced during inflammation, for example when the body is exposed to stress or infection, and stress and infection are also known to trigger psychotic episodes. The present study offers a plausible account of this process, one that is more likely to unfold in individuals with the gene variant linked to higher KYNA production. The researchers also believe the discovery can help explain certain features of schizophrenia and the development of other psychotic conditions.
"Psychosis related to bipolar disease has a very high degree of heredity, up to 80 per cent, but we don’t know which genes and which mechanisms are involved," says Martin Schalling, Professor of medical genetics at Karolinska Institutet’s Department of Molecular Medicine and Surgery, also affiliated to the Center for Molecular Medicine (CMM). "This is where our study comes in, with a new explanation that can be linked to signal systems activated by inflammation. This has consequences for diagnostics, and paves the way for new therapies, since there is a large arsenal of already approved drugs that modulate inflammation."

Filed under bipolar disorder kynurenic acid psychosis inflammation cognitive function neuroscience science

173 notes

Single gene might explain dramatic differences among people with schizophrenia
Some of the dramatic differences seen among patients with schizophrenia may be explained by a single gene that regulates a group of other schizophrenia risk genes. These findings appear in a new imaging-genetics study from the Centre for Addiction and Mental Health (CAMH).
The study revealed that people with schizophrenia who had a particular version of the microRNA-137 gene (MIR137) tended to develop the illness at a younger age and had distinct brain features – both associated with poorer outcomes – compared to patients who did not have this version. This work, led by Drs. Aristotle Voineskos and James Kennedy, appears in the latest issue of Molecular Psychiatry.
Treating schizophrenia is particularly challenging as the illness can vary from patient to patient. Some individuals stay hospitalized for years, while others respond well to treatment.
"What’s exciting about this study is that we could have a legitimate answer as to why some of these differences occur," explained Dr. Voineskos, a clinician-scientist in CAMH’s Campbell Family Mental Health Research Institute. "In the future, we might have the capability of using this gene to tell us about prognosis and how a person might respond to treatment."
"Drs. Voineskos and Kennedy’s findings are very important as they provide new insights into the genetic bases of this condition that affects thousands of Canadians and their families," said Dr. Anthony Phillips, Scientific Director at the Canadian Institutes of Health Research Institute of Neurosciences, Mental Health and Addiction.
Until now, sex has been the strongest predictor of the age at which schizophrenia develops. Typically, women develop the illness a few years later than men and experience a milder form of the disease.
"We showed that this gene has a bigger effect on age-at-onset than one’s gender has," said Dr. Voineskos, who heads the Kimel Family Translational Imaging-Genetics Research Laboratory at CAMH. "This may be a paradigm shift for the field."
The researchers studied MIR137 — a gene involved in turning on and off other schizophrenia-related genes — in 510 individuals living with schizophrenia. Patients with a specific version of the gene tended to develop the illness at a younger age: around 20.8 years, on average, compared with 23.4 years among those without this version.
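To give a feel for the kind of group comparison behind numbers like these, here is a minimal sketch of testing whether mean age-at-onset differs between two genotype groups. The data below are simulated and merely echo the reported means; this is an illustration of a standard Welch t statistic, not the study’s actual analysis:

```python
import math
import random
import statistics

def welch_t(a, b):
    """Welch's t statistic for comparing two group means.

    Illustrative only: a generic two-sample comparison of the sort
    used when contrasting age-at-onset between genotype groups.
    """
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    se = math.sqrt(va / len(a) + vb / len(b))  # standard error of the difference
    return (ma - mb) / se

# Simulated onset ages (hypothetical data echoing the reported means).
random.seed(0)
carriers = [random.gauss(20.8, 4.0) for _ in range(250)]
noncarriers = [random.gauss(23.4, 4.0) for _ in range(260)]
print(round(welch_t(carriers, noncarriers), 1))  # large negative t -> earlier onset in carriers
```

With group sizes in the hundreds, even a difference of a few years yields a t statistic far from zero, which is why a roughly three-year gap can be statistically compelling.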
"Although three years of difference in age-at-onset may not seem large, those years are important in the final development of brain circuits in the young adult," said Dr. Kennedy, Director of CAMH’s Neuroscience Research Department. "This can have major impact on disease outcome."
In a separate part of the study involving 213 people, the researchers used MRI and diffusion tensor magnetic resonance imaging (DT-MRI). Individuals with the particular gene version tended to have distinctive brain features: a smaller hippocampus, a brain structure involved in memory, and larger lateral ventricles, fluid-filled structures associated with disease outcome. These patients also tended to have more impairment in white matter tracts, the structures that connect brain regions and serve as the information highways of the brain.
Developing tests that screen for versions of this gene could be helpful in treating patients earlier and more effectively.
"We’re hoping that in the near future we can use this combination of genetics and brain imaging to predict how severe a version of illness someone might have," said Dr. Voineskos. "This would allow us to plan earlier for specific treatments and clinical service delivery and pursue more personalized treatment options right from the start." 
(Image: Akelei van Dam)

Filed under schizophrenia genes microRNA-137 genetics neuroimaging brain circuits hippocampus neuroscience science

115 notes

Computer Model May Help Athletes and Soldiers Avoid Brain Damage and Concussions
Concussions can occur in sports and in combat, but health experts do not know precisely which jolts, collisions and awkward head movements during these activities pose the greatest risks to the brain. To find out, Johns Hopkins engineers have developed a powerful new computer-based process that helps identify the dangerous conditions that lead to concussion-related brain injuries. This approach could lead to new medical treatment options and some sports rule changes to reduce brain trauma among players.
The research comes at a time when greater attention is being paid to assessing and preventing the head injuries sustained by both soldiers and athletes. Some kinds of head injuries are difficult to see with standard diagnostic imaging but can have serious long-term consequences. Concussions, once dismissed as a short-term nuisance, have more recently been linked to serious brain disorders.
“Concussion-related injuries can develop even when nothing has physically touched the head, and no damage is apparent on the skin,” said K. T. Ramesh, the Alonzo G. Decker Jr. Professor of Science and Engineering who led the research at Johns Hopkins. “Think about a soldier who is knocked down by the blast wave of an explosion, or a football player reeling after a major collision. The person may show some loss of cognitive function, but you may not immediately see anything in a CT scan or MRI that tells you exactly where and how much damage has been done to the brain. You don’t know what happened to the brain, so how do you figure out how to treat the patient?”
To help doctors answer this question, Ramesh led a team that used a powerful technique called diffusion tensor imaging, together with a computer model of the head, to identify injured axons, which are tiny but important fibers that carry information from one brain cell to another. These axons are concentrated in a kind of brain tissue known as “white matter,” and they appear to be injured during the so-called mild traumatic brain injury associated with concussions. Ramesh’s team has shown that the axons are injured most easily by strong rotations of the head, and the researchers’ process can calculate which parts of the brain are most likely to be injured during a specific event.
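Diffusion tensor imaging characterizes water diffusion in each voxel by a 3×3 tensor; summary measures such as fractional anisotropy (FA) drop when the tightly aligned axons of white matter are damaged. The following Python sketch shows the standard FA formula on made-up tensors; it is an illustration of the measure, not the researchers’ actual pipeline:

```python
import numpy as np

def fractional_anisotropy(tensor):
    """Fractional anisotropy (FA) of a 3x3 diffusion tensor.

    FA ranges from 0 (isotropic diffusion) to 1 (diffusion along a
    single axis, as in a healthy, tightly packed axon bundle).
    """
    evals = np.linalg.eigvalsh(tensor)           # eigenvalues of the tensor
    mean = evals.mean()
    num = np.sqrt(((evals - mean) ** 2).sum())
    den = np.sqrt((evals ** 2).sum())
    return np.sqrt(1.5) * num / den if den > 0 else 0.0

# Hypothetical tensors: diffusion mostly along one axis (intact fiber)...
fiber = np.diag([1.7e-3, 0.2e-3, 0.2e-3])
# ...versus nearly isotropic diffusion, as seen after axonal injury.
injured = np.diag([0.8e-3, 0.7e-3, 0.7e-3])
print(fractional_anisotropy(fiber))    # high FA
print(fractional_anisotropy(injured))  # low FA
```

Mapping where FA falls, and combining that with a mechanical model of how the head rotated, is the general idea behind localizing likely axonal injury.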
The team described its new technique in the Jan. 8 edition of the Journal of Neurotrauma. The lead author, Rika M. Wright, played a major role in the research while completing her doctoral studies in Johns Hopkins’ Whiting School of Engineering, supervised by Ramesh. Wright is now a postdoctoral research fellow at Carnegie Mellon University. Ramesh is continuing to conduct research using the technique at Johns Hopkins with support from the National Institutes of Health.
Beyond its use in evaluating combat and sports-related injuries, the work could have wider applications, such as detecting axonal damage among patients who have received head injuries in vehicle accidents or serious falls. “This is the kind of injury that may take weeks to manifest,” Ramesh said. “By the time you assess the symptoms, it may be too late for some kinds of treatment to be helpful. But if you can tell right away what happened to the brain and where the injury is likely to have occurred, you may be able to get a crucial head-start on the treatment.”

Filed under brain brain damage concussions brain injuries athletes computer model diffusion tensor imaging neuroscience science

97 notes

Is it a Stroke or Benign Dizziness? A Simple Bedside Test Can Tell
A bedside electronic device that measures eye movements can successfully determine whether the cause of severe, continuous, disabling dizziness is a stroke or something benign, according to results of a small study led by Johns Hopkins Medicine researchers.
"Using this device can directly predict who has had a stroke and who has not," says David Newman-Toker, M.D., Ph.D., an associate professor of neurology and otolaryngology at the Johns Hopkins University School of Medicine and leader of the study described in the journal Stroke. “We’re spending hundreds of millions of dollars a year on expensive stroke work-ups that are unnecessary, and probably missing the chance to save tens of thousands of lives because we aren’t properly diagnosing their dizziness or vertigo as stroke symptoms.”
Newman-Toker says if additional larger studies confirm these results, the device could one day be the equivalent of an electrocardiogram (EKG), a simple noninvasive test routinely used to rule out heart attack in patients with chest pain. And, he adds, universal use of the device could “virtually eliminate deaths from misdiagnosis and save a lot of time and money.”
To distinguish stroke from a more benign condition, such as vertigo linked to an inner ear disturbance, specialists typically use three eye movement tests that are essentially a stress test for the balance system. In the hands of specialists, these bedside clinical tests (without the device) have been shown in several large research studies to be extremely accurate — “nearly perfect, and even better than immediate MRI,” says Newman-Toker. One of those tests, known as the horizontal head impulse test, is the best predictor of stroke. To perform it, doctors or technicians ask patients to look at a target on the wall and keep their eyes on the target as doctors move the patients’ heads from side to side. But, says Newman-Toker, it requires expertise to determine whether a patient is making the fast corrective eye adjustments that would indicate a benign form of dizziness as opposed to a stroke.
For the new study, researchers instead performed the same test using a small, portable device — a video-oculography machine that detects minute eye movements that are difficult for most physicians to notice. The machine includes a set of goggles, akin to swimming goggles, with a USB-connected webcam and an accelerometer in the frame. The webcam is hooked up to a laptop where a continuous picture of the eye is taken. Software interprets eye position based on movements and views of the pupil, while the accelerometer measures the speed of the movement of the head.
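The quantity such a device extracts during the head impulse test is essentially the gain of the vestibulo-ocular reflex, that is, how completely the eye counter-rotates against the head. A simplified sketch with hypothetical numbers (real devices calibrate the traces and remove catch-up saccades far more carefully):

```python
import numpy as np

def vor_gain(head_vel, eye_vel):
    """Vestibulo-ocular reflex (VOR) gain over a head impulse.

    The eye normally counter-rotates against the head (gain near 1).
    A clearly reduced gain points to an inner-ear (peripheral) cause
    of vertigo; a normal gain in acute continuous vertigo is the
    worrying pattern that suggests stroke. Inputs: deg/s samples.
    """
    head_vel = np.asarray(head_vel, dtype=float)
    eye_vel = np.asarray(eye_vel, dtype=float)
    # Eye velocity has the opposite sign to head velocity, so negate.
    return -eye_vel.mean() / head_vel.mean()

# A simulated 150 ms rightward head impulse (hypothetical numbers).
t = np.linspace(0.0, 0.15, 38)
head = 200 * np.sin(np.pi * t / 0.15)   # peak ~200 deg/s
healthy_eye = -0.98 * head              # near-perfect counter-rotation
impaired_eye = -0.45 * head             # weak peripheral reflex

print(round(vor_gain(head, healthy_eye), 2))   # 0.98 -> reflex intact
print(round(vor_gain(head, impaired_eye), 2))  # 0.45 -> peripheral deficit
```

The accelerometer supplies the head-velocity trace and the webcam-tracked pupil supplies the eye-velocity trace; the counterintuitive clinical point is that a *normal* gain in a continuously dizzy patient is the red flag for stroke.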
Newman-Toker says the test could easily be employed to prevent misdiagnosis of as many as 100,000 strokes a year, leading to earlier stroke diagnosis and more efficient triage and treatment decisions for patients with disabling dizziness. Overlooked strokes mean delayed or missed treatments that lead to roughly 20,000 to 30,000 preventable deaths or disabilities a year, he says. The technology, he adds, could someday be used in a smartphone application to enable wider access to quick and accurate diagnosis of strokes whose main symptom is dizziness, as opposed to one-sided weakness or garbled speech.
The diagnosis of stroke in patients with severe dizziness, vomiting, difficulty walking and intolerance to head motion is difficult, Newman-Toker says. He estimates there are 4 million emergency department visits annually in the United States for dizziness or vertigo, at least half a million of which involve patients at high risk for stroke. The most common causes are benign inner ear conditions, but many emergency room doctors find it nearly impossible to tell the difference between the benign conditions and something more serious, such as a stroke. So they often rely on brain imaging, usually a CT scan, an expensive and inaccurate technology for this particular diagnosis.
The Hopkins-led study enrolled 12 patients at The Johns Hopkins Hospital and the University of Illinois College of Medicine at Peoria, who later underwent confirmatory MRI. Six were diagnosed with stroke and six with a benign condition using video-oculography. MRI later confirmed all 12 diagnoses.

Is it a Stroke or Benign Dizziness? A Simple Bedside Test Can Tell

A bedside electronic device that measures eye movements can successfully determine whether the cause of severe, continuous, disabling dizziness is a stroke or something benign, according to results of a small study led by Johns Hopkins Medicine researchers.

"Using this device can directly predict who has had a stroke and who has not," says David Newman-Toker, M.D., Ph.D., an associate professor of neurology and otolaryngology at the Johns Hopkins University School of Medicine and leader of the study described in the journal Stroke. “We’re spending hundreds of millions of dollars a year on expensive stroke work-ups that are unnecessary, and probably missing the chance to save tens of thousands of lives because we aren’t properly diagnosing their dizziness or vertigo as stroke symptoms.”

Newman-Toker says if additional larger studies confirm these results, the device could one day be the equivalent of an electrocardiogram (EKG), a simple noninvasive test routinely used to rule out heart attack in patients with chest pain. And, he adds, universal use of the device could “virtually eliminate deaths from misdiagnosis and save a lot of time and money.”

To distinguish stroke from a more benign condition, such as vertigo linked to an inner ear disturbance, specialists typically use three eye movement tests that are essentially a stress test for the balance system. In the hands of specialists, these bedside clinical tests (without the device) have been shown in several large research studies to be extremely accurate — “nearly perfect, and even better than immediate MRI,” says Newman-Toker. One of those tests, known as the horizontal head impulse test, is the best predictor of stroke. To perform it, doctors or technicians ask patients to look at a target on the wall and keep their eyes on the target as doctors move the patients’ heads from side to side. But, says Newman-Toker, it requires expertise to determine whether a patient is making the fast corrective eye adjustments that would indicate a benign form of dizziness as opposed to a stroke.

For the new study, researchers instead performed the same test using a small, portable device — a video-oculography machine that detects minute eye movements that are difficult for most physicians to notice. The machine includes a set of goggles, akin to swimming goggles, with a USB-connected webcam and an accelerometer in the frame. The webcam is hooked up to a laptop where a continuous picture of the eye is taken. Software interprets eye position based on movements and views of the pupil, while the accelerometer measures the speed of the movement of the head.
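The core measurement behind the head impulse test can be illustrated with a short sketch. This is not the study's actual software; the function names, the sample numbers and the 0.7 cutoff are hypothetical, chosen only to show the idea: a healthy vestibulo-ocular reflex (VOR) drives the eyes opposite the head at nearly equal speed (gain near 1.0), while a reduced gain with catch-up corrections suggests a benign inner-ear cause, and a normal gain in a patient with acute, continuous dizziness is the stroke-suggestive finding.

```python
def vor_gain(head_velocity, eye_velocity):
    """Ratio of peak compensatory eye speed to peak head speed (deg/s).

    Eye velocity samples are negative because the reflex moves the eyes
    opposite the direction of head movement.
    """
    head_peak = max(abs(v) for v in head_velocity)
    eye_peak = max(abs(v) for v in eye_velocity)
    return eye_peak / head_peak

def interpret(gain, low_cutoff=0.7):
    # The cutoff is an illustrative threshold, not a clinical value.
    if gain < low_cutoff:
        return "reduced VOR gain: consistent with a benign inner-ear cause"
    return "normal VOR gain: in acute vestibular syndrome, concerning for stroke"

# Head swings at up to 150 deg/s; the eyes compensate at up to 145 deg/s.
gain = vor_gain([0, 80, 150, 90, 10], [0, -78, -145, -88, -9])
print(round(gain, 2), "->", interpret(gain))
```

In practice the goggles' accelerometer supplies the head-velocity trace and the webcam-derived pupil positions supply the eye-velocity trace; the point of the device is that it measures these traces precisely enough that the comparison no longer depends on a specialist's trained eye.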

Newman-Toker says the test could be easily employed to prevent misdiagnosis of as many as 100,000 strokes a year, leading to earlier stroke diagnosis and more efficient triage and treatment decisions for patients with disabling dizziness. Overlooked strokes mean delayed or missed treatments that lead to roughly 20,000 to 30,000 preventable deaths or disabilities a year, he says. The technology, he adds, could someday be used in a smartphone application to enable wider access to a quick and accurate diagnosis of strokes whose main symptom is dizziness, as opposed to one-sided weakness or garbled speech.

The diagnosis of stroke in patients with severe dizziness, vomiting, difficulty walking and intolerance to head motion is difficult, Newman-Toker says. He estimates there are 4 million emergency department visits annually in the United States for dizziness or vertigo, at least half a million of which involve patients at high risk for stroke. The most common causes are benign inner ear conditions, but many emergency room doctors, Newman-Toker says, find it nearly impossible to tell the difference between the benign conditions and something more serious, such as a stroke. So they often rely on brain imaging, usually a CT scan, which is an expensive and inaccurate technology for this particular diagnosis.

The Hopkins-led study enrolled 12 patients at The Johns Hopkins Hospital and the University of Illinois College of Medicine at Peoria. Using video-oculography, six were diagnosed with stroke and six with a benign condition; follow-up MRI confirmed all 12 diagnoses.

Filed under brain stroke benign dizziness eye movements electronic device medicine science

244 notes

Green tea extract interferes with the formation of amyloid plaques in Alzheimer’s disease
Researchers at the University of Michigan have found a new potential benefit of a molecule in green tea: preventing the misfolding of specific proteins in the brain.
The aggregation of these proteins, called metal-associated amyloids, is associated with Alzheimer’s disease and other neurodegenerative conditions.
A paper published recently in the Proceedings of the National Academy of Sciences explained how Life Sciences Institute faculty member Mi Hee Lim and an interdisciplinary team of researchers used green tea extract to control the generation of metal-associated amyloid-β aggregates associated with Alzheimer’s disease in the lab.
The specific molecule in green tea, (−)-epigallocatechin-3-gallate, also known as EGCG, prevented aggregate formation and broke down existing aggregate structures in the proteins that contained metals—specifically copper, iron and zinc.
"A lot of people are very excited about this molecule," said Lim, noting that EGCG and other flavonoids in natural products have long been established as powerful antioxidants. "We used a multidisciplinary approach. This is the first example of structure-centric, multidisciplinary investigations by three principal investigators with three different areas of expertise."
The research team included chemists, biochemists and biophysicists.
While many researchers are investigating small molecules and metal-associated amyloids, most are looking from a limited perspective, said Lim, assistant professor of chemistry and research assistant professor at the Life Sciences Institute, where her lab is located and her research is conducted.
"But we believe you have to have a lot of approaches working together, because the brain is very complex," she said.
The PNAS paper was a starting point, Lim said, and her team’s next step is to “tweak” the molecule and then test its ability to interfere with plaque formation in fruit flies.
"We want to modify them for the brain, specifically to interfere with the plaques associated with Alzheimer’s," she said.
Lim plans to collaborate with Bing Ye, a neurobiologist in the LSI. Together, the researchers will test the new molecule’s power to inhibit potential toxicity of aggregates containing proteins and metals in fruit flies.

Filed under alzheimer's disease dementia green tea beta amyloid proteins flavonoids neuroscience science

43 notes

Age-Related Dementia May Begin with Neurons’ Inability to Rid Themselves of Unwanted Proteins
A team of European scientists from the University Medical Center Hamburg-Eppendorf (UKE) and the Cologne Excellence Cluster on Cellular Stress Responses in Aging-Associated Diseases (CECAD) at the University of Cologne in Germany has taken an important step closer to understanding the root cause of age-related dementia. In research involving both worms and mice, they have found that age-related dementia is likely the result of a declining ability of neurons to dispose of unwanted aggregated proteins. As protein disposal becomes significantly less efficient with increasing age, the buildup of these unwanted proteins ultimately leads to the development and progression of dementia. This research appears in the March 2013 issue of the journal Genetics.
“By studying disease progression in dementia, specifically by focusing on mechanisms neurons use to dispose of unwanted proteins, we show how these are interconnected and how these mechanisms deteriorate over time,” said Markus Glatzel, M.D., a researcher involved in the work from the Institute of Neuropathology at UKE in Hamburg, Germany. “This gives us a better understanding as to why dementias affect older persons; the ultimate aim is to use these insights to devise novel therapies to restore the full capacity of protein disposal in aged neurons.”
To make this discovery, the scientists carried out their experiments in worm and mouse models with a genetically determined dementia caused by protein accumulation in neurons. In the worm model, researchers in the lab of Thorsten Hoppe, Ph.D., from the CECAD Cluster of Excellence were able to inactivate distinct routes used for disposal of the unwanted proteins, providing valuable insight into the mechanisms that neurons use to cope with protein accumulation. These pathways were then assessed in young and aged mice. The study helps explain why the incidence of dementia increases exponentially with age. Additionally, neuronal protein disposal pathways may offer a therapeutic target for the development of drugs to treat or prevent dementias.
“This is an exciting study that helps us understand what’s going wrong at a cellular level in age-related dementias,” said Mark Johnston, Ph.D., Editor-in-Chief of the journal Genetics. “This research holds possibilities for future identification of substances that can prevent, stop, or reverse this cellular malfunction in humans.”
(Image: damato)

Filed under brain dementia aging neurons proteins animal model neuroscience science

43 notes

Brain tumours and peripheral neuropathy

Researchers from Plymouth University Peninsula Schools of Medicine and Dentistry are part of an international team which has for the first time identified the role of a tumour suppressor in peripheral neuropathy in those suffering multiple tumours of the brain and nervous system.

One in 25,000 people worldwide is affected by neurofibromatosis type 2 (NF2), a condition where the loss of a tumour suppressor called Merlin results in multiple tumours in the brain and nervous system.

Sufferers may have 20 to 30 tumours at any one time, and such numbers often lead to hearing loss, disability and eventually death. Those with NF2 may also experience peripheral neuropathy, in which the nerves that carry messages between the brain and spinal column and the rest of the body stop working properly.

Peripheral neuropathy leads to further complications for NF2 sufferers, such as pain and numbness, muscle problems, problems with body organs and other symptoms of nerve damage, such as bladder problems, uncontrollable sweating and sexual dysfunction.

The international team has for the first time identified the role of Merlin in regulating the integrity of axons. Axons are nerve fibres that transmit information around the body, and it is these fibres that are damaged in peripheral neuropathy.

The research team showed that Merlin regulates a protein called neurofilament which supplies structural support for the axon. A better understanding of this mechanism could lead to effective drug therapies to alleviate the symptoms of peripheral neuropathy in patients with NF2.

The results of the research are published this week in Nature Neuroscience.

(Source: plymouth.ac.uk)

Filed under peripheral neuropathy nerve fibres neurofibromatosis tumor nervous system brain neuroscience science

185 notes

Cell death in retina helps tune our internal clocks
With every sunrise and sunset, our eyes make note of the light as it waxes and wanes, a process that is critical to aligning our circadian rhythms to match the solar day so we are alert during the day and restful at night. Watching the sun come and go sounds like a peaceful process, but Johns Hopkins scientists have discovered that behind the scenes, millions of specialized cells in our eyes are fighting for their lives to help the retina set the stage to keep our internal clocks ticking.
In a study that appeared in a recent issue of Neuron, a team led by biologist Samer Hattar has found that there is a kind of turf war going on behind our eyeballs, where intrinsically photosensitive retinal ganglion cells (ipRGCs) are jockeying for the best position to receive information from rod and cone cells about light levels. By studying these specialized cells in mice, Hattar and his team found that the cells actually kill each other to seize more space and find the best position to do their job.
Understanding this fight could one day lead to victories against several conditions, including autism and some psychiatric disorders, where neural circuits influence our behavior. The results could help scientists have a better idea about how the circuits behind our eyes assemble to influence our physiological functions, said Hattar, an associate professor of biology in the Krieger School of Arts and Sciences.
“In a nutshell, death in our retina plays a vital role in assembling the retinal circuits that influence crucial physiological functions such as circadian rhythms and sleep-wake cycles,” Hattar said. “Once we have a greater understanding of the circuit formation underlying all of our neuronal abilities, this could be applied to any neurological function.”
Hattar and his team determined that the killing among rival ipRGCs is justifiable homicide: without this cell death, circadian blindness overcame the mice, which could no longer distinguish day from night. Hattar’s team studied mice that were genetically modified to prevent cell death by removing the Bax protein, an essential factor for cell death to occur. They discovered that if cell death is prevented, ipRGC distribution is highly affected, leading the surplus cells to bunch up and form ineffectual, ugly clumps incapable of receiving the light information from rods and cones needed to align circadian rhythms. To detect this, the researchers measured wheel-running activity in mice lacking both the Bax protein and melanopsin (so their ipRGCs could respond only through rod and cone input) and compared them to animals in which only the Bax gene was deleted.
What the authors uncovered was exciting: When death is prevented, the ability of rods and cones to signal light to our internal clocks is highly impaired. This shows that cell death plays an essential role in setting the circuitry that allows the retinal rods and cones to influence our circadian rhythms and sleep.
(Image: Advanced Retinal Institute, Inc.)

Filed under retina cell death retinal ganglion cells neural circuits circadian rhythms neurons neuroscience science
