Neuroscience

Articles and news from the latest research reports.


Study finds stem cell combination therapy improves traumatic brain injury outcomes
Traumatic brain injuries (TBI), sustained by close to 2 million Americans annually, including military personnel, are debilitating and devastating for patients and their families. Regardless of severity, those with TBI can suffer a range of motor, behavioral, intellectual and cognitive disabilities over the short or long term. Sadly, clinical treatments for TBI are few and largely ineffective.
In an effort to find an effective therapy, neuroscientists at the Center of Excellence for Aging and Brain Repair, Department of Neurosurgery in the USF Health Morsani College of Medicine, University of South Florida, have conducted several preclinical studies aimed at finding combination therapies to improve TBI outcomes.
In their study of several different therapies—alone and in combination—applied to laboratory rats modeled with TBI, USF researchers found that a combination of human umbilical cord blood cells (hUBCs) and granulocyte colony stimulating factor (G-CSF), a growth factor, was more therapeutic than either administered alone, or each with saline, or saline alone.
The study appeared in a recent issue of PLoS ONE.
“Chronic TBI is typically associated with major secondary molecular injuries, including chronic neuroinflammation, which not only contribute to the death of neuronal cells in the central nervous system, but also impede any natural repair mechanism,” said study lead author Cesar V. Borlongan, PhD, professor of neurosurgery and director of USF’s Center of Excellence for Aging and Brain Repair. “In our study, we used hUBCs and G-CSF alone and in combination. In previous studies, hUBCs have been shown to suppress inflammation, and G-CSF is currently being investigated as a potential therapeutic agent for patients with stroke or Alzheimer’s disease.”
Previous studies suggest that each agent has stand-alone therapeutic potential for TBI. G-CSF, for example, mobilizes stem cells from bone marrow that then infiltrate injured tissues and promote self-repair of neural cells, while hUBCs have been shown to suppress inflammation and promote cell growth.
The involvement of the immune system in the central nervous system, where it can either stimulate repair or worsen molecular damage, has been recognized as key to the progression of many neurological disorders, including TBI, as well as neurodegenerative diseases such as Parkinson’s disease, multiple sclerosis and some autoimmune diseases, the researchers report. Increased expression of MHCII-positive cells—cells that display a family of molecules mediating interactions between the immune system’s white blood cells—has been directly linked to neurodegeneration and cognitive decline in TBI.
“Our results showed that the combined therapy of hUBCs and G-CSF significantly reduced the TBI-induced loss of neuronal cells in the hippocampus,” said Borlongan. “Therapy with hUBCs and G-CSF alone or in combination produced beneficial results in animals with experimental TBI. G-CSF alone produced only short-lived benefits, while hUBCs alone afforded more robust and stable improvements. However, their combination offered the best motor improvement in the laboratory animals.”
“This outcome may indicate that the stem cells had more widespread biological action than the drug therapy,” said Paul R. Sanberg, distinguished professor at USF and principal investigator of the Department of Defense funded project. “Regardless, their combination had an apparent synergistic effect and resulted in the most effective amelioration of TBI-induced behavioral deficits.”
The researchers concluded that additional studies of this combination therapy are warranted in order to better understand the agents’ modes of action. While this research focused on motor improvements, they suggested that future combination therapy research should also include analysis of cognitive improvement in laboratory animals modeled with TBI.


Filed under TBI brain injury hUBCs G-CSF cytokines neurogenesis stem cell therapy neuroscience science


New approach makes cancer cells explode
Researchers at Karolinska Institutet have discovered that a substance called Vacquinol-1 makes cells from glioblastoma, the most aggressive type of brain tumour, literally explode. When mice were given the substance, which can be given in tablet form, tumour growth was reversed and survival was prolonged. The findings are published in the journal Cell.
The established treatments available for glioblastoma are surgery, radiation and chemotherapy. Yet even when all of these are given, average survival is just 15 months. It is therefore critical to find better treatments for malignant brain tumours.
Researchers at Karolinska Institutet and colleagues at Uppsala University have discovered an entirely new mechanism for killing tumour cells in glioblastoma. In an initial screening stage, they exposed tumour cells to a wide range of molecules, more than 200 kinds in all; any molecule that killed the cancer cells was flagged for further study. After extensive follow-up work, a single molecule stood out, and the researchers set out to determine why it caused cancer cell death.
They found that the molecule triggered uncontrolled vacuolization in the cancer cells, a process in which the cell carries substances from outside the cell into its interior. This transport occurs via vacuoles, which can roughly be described as blisters or bags made of cell membrane. The process is related to the one recognized by last year’s Nobel Prize in Physiology or Medicine, the discovery of how cellular vesicles move cargo from the interior of the cell to its surface.
Cell membranes collapsed
When the cancer cells filled with a large number of vacuoles, the cell membrane, the outer wall of the cell, collapsed, and the cell simply burst and died by necrosis.
“This is an entirely new mechanism for cancer treatment. A possible medicine based on this principle would therefore attack the glioblastoma in an entirely new way. This principle may also work for other cancer diseases, we have not really explored this yet,” says Patrik Ernfors, professor of tissue biology at the Department of Medical Biochemistry and Biophysics at Karolinska Institutet.
Mice transplanted with human glioblastoma cells were given the substance orally for five days. Average survival in the control group, which did not receive the substance, was about 30 days, whereas six of the eight mice that received it were still alive after 80 days. The journal considered the study of such interest that it published the article immediately.
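Results like the one above (controls surviving about 30 days on average, versus six of eight treated mice alive at day 80) are conventionally summarized with a Kaplan-Meier survival estimate. As a minimal sketch, the function below implements the estimator; the individual survival times fed to it are illustrative assumptions consistent with the figures quoted in the text, not data from the study.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times:  follow-up time for each animal
    events: 1 if death was observed, 0 if censored (still alive at study end)
    Returns a list of (time, survival_probability) pairs, one per observed death.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    curve = []
    for i in order:
        if events[i]:
            # Each death multiplies survival by the fraction surviving that day.
            surv *= (at_risk - 1) / at_risk
            curve.append((times[i], surv))
        at_risk -= 1
    return curve

# Illustrative (assumed) data: 6 of 8 treated mice censored alive at day 80,
# with the two deaths placed at assumed earlier time points.
treated_times = [35, 60, 80, 80, 80, 80, 80, 80]
treated_events = [1, 1, 0, 0, 0, 0, 0, 0]
print(kaplan_meier(treated_times, treated_events))  # → [(35, 0.875), (60, 0.75)]
```

The censored animals still count toward the at-risk denominator until day 80, which is why six survivors out of eight yields a final estimate of 0.75.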
“We now want to try to take this discovery in basic research through preclinical development and all the way to the clinic. The goal is to get into a phase 1 trial,” says Patrik Ernfors.


Filed under cancer cells glioblastoma brain tumour vacquinol-1 cancer neuroscience science


Bioimaging: Visualizing real-time development of capillary networks in adult brains
The advancement of microscopic photoimaging techniques has enabled the visualization of real-time cellular events in living organs. One such target is the brain capillary network, which forms the blood-brain barrier (BBB): an interface of vascular endothelial cells that controls the traffic of substances from the bloodstream into the brain. Damage and disruption to the BBB are implicated in the pathogenesis and progression of neurological disorders such as Alzheimer’s disease and epilepsy. However, the cellular interactions within the BBB are very difficult to study in vivo, so understanding of these mechanisms in living brains is limited.
Now, Kazuto Masamoto and co-workers at the University of Electro-Communications in Tokyo, National Institute of Radiological Sciences, and Keio University School of Medicine, have used 4D live imaging technology to study the effects of hypoxia (a deprivation of oxygen) on the BBB plasticity in live adult mice.
The team focused on how plastic changes in the BBB counteract hypoxia, looking in particular at endothelial cells and their communication with neighboring astrocytes, interactions that help control BBB traffic to meet neural demands. Using genetically modified mice whose endothelial cells express green fluorescent protein, Masamoto and colleagues imaged real-time changes in the BBB before and during a three-week period of hypoxia in the adult mouse cortex.
Their results showed that capillaries in the BBB, quiescent before hypoxia, began to sprout new blood vessels that in places joined into new networks. Neighboring astrocytes reacted quickly, wrapping the outside of the new vessels, activity the researchers believe helps stabilize BBB traffic and integrity.
Further investigations into the molecular mechanisms that control BBB plasticity are expected to lead to advances in the treatment of neurodegenerative disorders and cerebral ischemia, and may also suggest ways of preventing BBB dysfunction in diabetes, hypertension, and aging.


Filed under blood-brain barrier astrocytes endothelial cells neurodegenerative diseases neuroscience science


Genetic factor contributes to forgetfulness

University of Bonn psychologists prove genetic variation is underlying factor in higher incidence of forgetfulness

Misplaced your keys? Can’t remember someone’s name? Didn’t notice the stop sign? Those who frequently experience such cognitive lapses now have an explanation. Psychologists from the University of Bonn have found a connection between such everyday lapses and the DRD2 gene. Those who have a certain variant of this gene are more easily distracted and experience a significantly higher incidence of lapses due to a lack of attention. The team reports its results in the May issue of Neuroscience Letters; the article is already available online ahead of print.


Most of us are familiar with such everyday lapses: you can’t find your keys, again; you walk into another room and forget what you actually went there for; you are on the phone with someone and cannot remember their name. “Such short-term memory lapses are very common, but some people experience them particularly often,” said Prof. Dr. Martin Reuter from the department for Differential and Biological Psychology at the University of Bonn. Mistakes arising from such short-term lapses can become a hazard when, for example, a person overlooks a stop sign at an intersection. In the workplace, too, a lack of attention can become a problem, for instance when it results in forgetting to save essential data.

A gene “directing” your brain

"A familial clustering of such lapses suggests that they are subject to genetic effects," explained Dr. Sebastian Markett, the principal author and a member of Prof. Reuter’s team. In lab experiments, the group of scientists had already found indications earlier that the so-called dopamine D2 receptor gene (DRD2) plays a part in forgetfulness. DRD2 has an essential function in signal transmission within the frontal lobes. "This structure can be compared to a director coordinating the brain like an orchestra," Dr. Markett added. In this simile, the DRD2 gene would correspond to the baton, because it plays a part in dopamine transmission in the brain. If the baton skips a beat, the orchestra gets confused.

The psychologists from the University of Bonn tested a total of 500 women and men by taking a saliva sample and examining it using molecular biology methods. All humans carry the DRD2 gene, which comes in two variants distinguished by only one letter of the genetic code: one variant has a C (cytosine) at a particular locus, where the other has a T (thymine). According to the team’s analyses, about a quarter of the subjects carried exclusively the cytosine variant, while three quarters had a genotype with at least one thymine base.
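The genotype split quoted above (about 25 percent C/C, 75 percent carrying at least one T) can be translated into allele frequencies if one assumes Hardy-Weinberg proportions. That assumption, and the worked numbers below, are illustrative and not something the study reports:

```python
import math

def allele_freq_from_homozygote(p_homozygote: float) -> float:
    """Under Hardy-Weinberg equilibrium the C/C genotype frequency is p^2,
    so the C allele frequency is the square root of the homozygote fraction."""
    return math.sqrt(p_homozygote)

p_c = allele_freq_from_homozygote(0.25)  # ~25% of subjects were C/C
p_t = 1 - p_c
print(f"C allele: {p_c:.2f}, T allele: {p_t:.2f}")  # → C allele: 0.50, T allele: 0.50
```

With equal allele frequencies, the expected genotype mix is 25% C/C, 50% C/T and 25% T/T, which matches the quoted one-quarter versus three-quarters split.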

The scientists then wanted to find out whether this difference in the genetic code also had an effect on everyday behavior. In a self-assessment survey, subjects were asked how frequently they experience such lapses: how often they forget names or misplace their keys. The survey also asked about impulsivity-related factors, such as how easily a subject is distracted from the task at hand and how long they can maintain concentration.

Lapses can clearly be tied to the gene variant

The scientists used statistical methods to check whether the forgetfulness symptoms elicited by the surveys could be associated with one of the DRD2 gene variants. The results showed that functions such as attention and memory are less strongly expressed in carriers of the thymine variant of the gene than in the cytosine type. “The connection is obvious; such lapses can partially be attributed to this gene variant,” reported Dr. Markett. By their own accounts, subjects with the thymine DRD2 variant more frequently “fall victim” to forgetfulness or attention deficits, while the cytosine type seems to be protected. “This result matches the results of other studies very well,” added Dr. Markett.

Carriers of the gene variant linked to forgetfulness might take solace in the idea that they are not responsible for their genes and that this is simply their fate. Dr. Markett disagrees: “There are things you can do to compensate for forgetfulness: write yourself notes, or make more of an effort to put your keys down in one specific place rather than just anywhere.” Those who develop such strategies for the different areas of their lives are better able to handle their deficit.

(Source: www3.uni-bonn.de)

Filed under forgetfulness DRD2 dopamine memory frontal lobe neuroscience science


Computers See Through Faked Expressions of Pain Better Than People
A joint study by researchers at the University of California, San Diego and the University of Toronto has found that a computer system distinguishes real from faked expressions of pain more accurately than people can.
The work, titled “Automatic Decoding of Deceptive Pain Expressions,” is published in the latest issue of Current Biology.
“The computer system managed to detect distinctive dynamic features of facial expressions that people missed,” said Marian Bartlett, research professor at UC San Diego’s Institute for Neural Computation and lead author of the study. “Human observers just aren’t very good at telling real from faked expressions of pain.”
Senior author Kang Lee, professor at the Dr. Eric Jackman Institute of Child Study at the University of Toronto, said “humans can simulate facial expressions and fake emotions well enough to deceive most observers. The computer’s pattern-recognition abilities prove better at telling whether pain is real or faked.”
The research team found that humans could not discriminate real from faked expressions of pain better than random chance – and, even after training, improved only to a modest 55 percent accuracy. The computer system attained 85 percent accuracy.
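The gap between 55 and 85 percent is easier to appreciate against the chance baseline. As a sketch, the exact binomial tail probability below asks how likely each score would be by pure guessing; the trial count of 100 is an assumption for illustration, not the study's actual number of judged clips.

```python
from math import comb

def binom_tail(k: int, n: int, p: float = 0.5) -> float:
    """P(X >= k) for X ~ Binomial(n, p): the probability of scoring at
    least k correct out of n judgments by guessing with accuracy p."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

n = 100  # assumed number of judged expressions
print(f"55/100 correct by chance: {binom_tail(55, n):.3f}")  # quite plausible
print(f"85/100 correct by chance: {binom_tail(85, n):.2e}")  # vanishingly unlikely
```

Scoring 55 out of 100 by guessing is unremarkable (roughly a one-in-five outcome), while 85 out of 100 is astronomically improbable under chance, which is why only the machine's performance is clearly beyond guessing.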
“In highly social species such as humans,” said Lee, “faces have evolved to convey rich information, including expressions of emotion and pain. And, because of the way our brains are built, people can simulate emotions they’re not actually experiencing – so successfully that they fool other people. The computer is much better at spotting the subtle differences between involuntary and voluntary facial movements.”
“By revealing the dynamics of facial action through machine vision systems,” said Bartlett, “our approach has the potential to elucidate ‘behavioral fingerprints’ of the neural-control systems involved in emotional signaling.”
The single most predictive feature of falsified expressions, the study shows, is the mouth, and how and when it opens. Fakers’ mouths open with less variation and too regularly.
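One simple way to operationalize "less variation and too regularly" is the coefficient of variation of the intervals between mouth openings. This is a hedged sketch: the study's actual features and values are not specified here, and the interval sequences below are invented for illustration.

```python
from statistics import mean, stdev

def coefficient_of_variation(intervals):
    """Standard deviation divided by mean of inter-event intervals.
    Lower values indicate a more regular, machine-like rhythm."""
    return stdev(intervals) / mean(intervals)

# Illustrative (invented) seconds between successive mouth openings:
faked = [1.0, 1.1, 0.9, 1.0, 1.05]   # suspiciously even spacing
genuine = [0.4, 2.3, 0.9, 3.1, 0.6]  # irregular, driven by involuntary pain

print(f"faked:   {coefficient_of_variation(faked):.2f}")
print(f"genuine: {coefficient_of_variation(genuine):.2f}")
```

A classifier given such a feature would flag sequences whose variability falls below what involuntary pain expressions typically produce.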
“Further investigations,” said the researchers, “will explore whether over-regularity is a general feature of fake expressions.”
In addition to detecting pain malingering, the computer-vision system might be used to detect other real-world deceptive actions in the realms of homeland security, psychopathology, job screening, medicine, and law, said Bartlett.
“As with causes of pain, these scenarios also generate strong emotions, along with attempts to minimize, mask, and fake such emotions, which may involve ‘dual control’ of the face,” she said. “In addition, our computer-vision system can be applied to detect states in which the human face may provide important clues as to health, physiology, emotion, or thought, such as drivers’ expressions of sleepiness, students’ expressions of attention and comprehension of lectures, or responses to treatment of affective disorders.”


Filed under pain emotion facial expressions computer-vision system psychology neuroscience science


What singing fruit flies can tell us about quick decisions

You wouldn’t hear the mating song of the male fruit fly as you reached for the infested bananas in your kitchen. Yet, the neural activity behind the insect’s amorous call could help scientists understand how you made the quick decision to pull your hand back from the tiny swarm.


Male fruit flies base the pitch and tempo of their mating song on the movement and behavior of their desired female, Princeton University researchers have discovered. In the animal kingdom, lusty warblers such as birds typically have a mating song with a stereotyped pattern. A fruit fly’s song, however, is an unordered series of loud purrs and soft drones made by wing vibrations, the researchers reported in the journal Nature. A male adjusts his song in reaction to his specific environment, which in this case is the distance and speed of a female — the faster and farther away she’s moving, the louder he “sings.”

While the actors are small, the implications of these findings could be substantial for understanding rapid decision-making, explained corresponding author Mala Murthy, a Princeton assistant professor of molecular biology and the Princeton Neuroscience Institute. Fruit flies are a common model for studying the systems of more advanced beings such as humans, and have the basic components of more complex nervous systems, she said.

The researchers have provided a possible tool for studying the neural pathways behind how an organism engaged in a task adjusts its behavior to sudden changes, be it a leopard chasing a zigzagging gazelle, or a commuter navigating stop-and-go traffic, Murthy said. She and her co-authors created a model that could predict a fly’s choice of song in response to its changing environment, and identified the neural pathways involved in these decisions.

"Here we have natural courtship behavior and we have this discovery that males are using information about their sensory environment in real time to shape their song. That makes the fly system a unique model to study decision-making in a natural context," Murthy said.

"You can imagine that if a fly can integrate visual information quickly to modulate his song, the way in which it does that is probably a very basic equivalent of how a more complicated animal solves a similar problem," she said. "To figure out at the level of individual neurons how flies perform sensory-motor integration will give us insight into how a mammalian brain does it and, ultimately, maybe how a human brain does it."

Filed under fruit flies decision making mating song neural circuitry neuroscience science

147 notes

Researchers discover underlying genetics, marker for stroke, cardiovascular disease
Scientists studying the genomes of nearly 5,000 people have pinpointed a genetic variant tied to an increased risk for stroke, and have also uncovered new details about an important metabolic pathway that plays a major role in several common diseases. Together, their findings may provide new clues to underlying genetic and biochemical influences in the development of stroke and cardiovascular disease, and may also help lead to new treatment strategies.
"Our findings have the potential to identify new targets in the prevention and treatment of stroke, cardiovascular disease and many other common diseases," said Stephen R. Williams, Ph.D., a postdoctoral fellow at the University of Virginia Cardiovascular Research Center and the University of Virginia Center for Public Health Genomics, Charlottesville.
Dr. Williams, Michele Sale, Ph.D., associate professor of medicine, Brad Worrall, M.D., professor of neurology and public health sciences, all at the University of Virginia, and their team reported their findings March 20, 2014 in PLoS Genetics. The investigators were supported by the National Human Genome Research Institute (NHGRI) Genomics and Randomized Trials Network (GARNET) program (www.genome.gov/27541119).
Stroke is the fourth leading cause of death and a major cause of adult disability in the United States, yet its underlying genetics have been difficult to understand. Numerous genetic and environmental factors can contribute to a person having a stroke. “Our goals were to break down the risk factors for stroke,” Dr. Williams said.
The researchers focused on one particular biochemical pathway called the folate one-carbon metabolism (FOCM) pathway. They knew that abnormally high blood levels of the amino acid homocysteine are associated with an increased risk of common diseases such as stroke, cardiovascular disease and dementia. Homocysteine is a breakdown product of methionine, which is part of the FOCM pathway. The same pathway can affect many important cellular processes, including the methylation of proteins, DNA and RNA. DNA methylation is a mechanism that cells use to control which genes are turned on and off, and when.
But clinical trials of homocysteine-lowering therapies have not prevented disease, and the genetics underlying high homocysteine levels - and methionine metabolism gone awry - are not well defined.
Dr. Williams and his colleagues conducted genome-wide association studies of participants from two large long-term projects: the Vitamin Intervention for Stroke Prevention (VISP), a trial looking at ways to prevent a second ischemic stroke, and the Framingham Heart Study (FHS), which has followed the cardiovascular health and disease in a general population for decades. They also measured methionine metabolism - the ability to convert methionine to homocysteine - in both groups. In all, they studied 2,100 VISP participants and 2,710 FHS subjects.
In a genome-wide association study, researchers scan the genome to identify specific genomic variants associated with a disease. In this case, the scientists were trying to identify variants associated with a trait - the ability to metabolize methionine into homocysteine.
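As a hedged illustration of the core computation (not the study’s actual analysis pipeline), a single-variant association can be sketched as a simple regression of the trait — here, a made-up measure of methionine-to-homocysteine conversion — on allele dosage (0, 1, or 2 copies of the minor allele). Real GWAS analyses repeat this at millions of variants with covariates and stringent multiple-testing correction; all numbers below are synthetic:

```python
def association(dosages, trait):
    """Least-squares regression of a quantitative trait on genotype dosage."""
    n = len(dosages)
    mx = sum(dosages) / n
    my = sum(trait) / n
    sxx = sum((x - mx) ** 2 for x in dosages)
    sxy = sum((x - mx) * (y - my) for x, y in zip(dosages, trait))
    syy = sum((y - my) ** 2 for y in trait)
    beta = sxy / sxx               # effect size per copy of the minor allele
    r2 = (sxy ** 2) / (sxx * syy)  # fraction of trait variance explained
    return beta, r2

# Synthetic cohort: dosage at one variant and a conversion-rate measure
dosages = [0, 0, 1, 1, 1, 2, 2, 0, 1, 2]
trait = [1.0, 1.2, 1.6, 1.5, 1.7, 2.1, 2.0, 0.9, 1.4, 2.2]
beta, r2 = association(dosages, trait)
print(beta, r2)
```

The r2 value is the same kind of quantity as the “6 percent” and “13 percent” variance-explained figures the investigators report for the five genes combined.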
Investigators identified variants in five genes in the FOCM pathway that were associated with differences in a person’s ability to convert methionine to homocysteine. They found that among the five genes, one - the ALDH1L1 gene - was also strongly associated with stroke in the Framingham study. Malfunction of this gene has been linked to disruption of programmed cell death, a normal cellular process, and to the survival of cancer cells.
They also made important discoveries about the methionine-homocysteine process. “GNMT produces a protein that converts methionine to homocysteine. Of the five genes that we identified, it was the one most significantly associated with this process,” Dr. Williams said. “The analyses suggest that differences in GNMT are the major drivers behind the differences in methionine metabolism in humans.”
"It’s striking that the genes are in the same pathway, so we know that the genomic variants affecting that pathway contribute to the variability in disease and risk that we’re seeing," he said. "We may have found how genetic information controls the regulation of GNMT."
The group determined that the five genes accounted for 6 percent of the difference in individuals’ ability to process methionine into homocysteine among those in the VISP trial. The genes also accounted for 13 percent of the difference in those participants in the FHS, a remarkable result given the complex nature of methionine metabolism and its impact on cerebrovascular risk. In many complex diseases, genomic variants often account for less than 5 percent of such differences.
"This is a great example of the kinds of successful research efforts coming out of the GARNET program," said program director Ebony Madden, Ph.D. "GARNET scientists aim to identify variants that affect treatment response by doing association studies in randomized trials. These results show that variants in genes are associated with the differences in homocysteine levels in individuals."
The association of the ALDH1L1 gene variant with stroke is just one example of how the findings may potentially lead to new prevention efforts, and help develop new targets for treating stroke and heart disease, Dr. Williams said.
"As genome sequencing becomes more widespread, clinicians may be able to determine if a person’s risk for abnormally high levels of homocysteine is elevated," he said. "Changes could be made to an individual’s diet because of a greater risk for stroke and cardiovascular disease."
The investigators plan to study the other four genes in the pathway to try to better understand their potential roles in stroke and cardiovascular disease risk.

Filed under stroke cardiovascular disease health genetics medicine science

182 notes

Sniff study suggests humans can distinguish more than 1 trillion scents
The human sense of smell does not get the respect it deserves, new research suggests. In an experiment led by Andreas Keller, of Rockefeller’s Laboratory of Neurogenetics and Behavior, researchers tested volunteers’ ability to distinguish between complex mixtures of scents. Based on the sensitivity of these people’s noses and brains, the team calculated the human sense of smell can detect more than 1 trillion odor mixtures, far more discrete stimuli than previous smell studies have estimated.
The existing generally accepted number is just 10,000, says Leslie Vosshall, Robert Chemers Neustein Professor and head of the laboratory. “Everyone in the field had the general sense that this number was ludicrously small, but Andreas was the first to put the number to a real scientific test,” Vosshall says.
In fact, even 1 trillion may be understating it, says Keller. “The message here is that we have more sensitivity in our sense of smell than for which we give ourselves credit. We just don’t pay attention to it and don’t use it in everyday life,” he says.
The quality of an odor has multiple dimensions, because the odors we encounter in real life are composed of complex mixes of molecules. For instance, the characteristic scent of rose has 275 components, but only a small percentage of those dominate the perceived smell. That makes odor much more difficult to study than vision and hearing, where the stimulus varies along a single physical dimension (wavelength of light or frequency of sound). For comparison, researchers estimate the number of colors we can distinguish at between 2.3 and 7.5 million and audible tones at about 340,000.
To overcome this complexity, Keller combined odors and asked volunteers whether they could distinguish between mixtures with some components in common. “Our trick is we use mixtures of odor molecules, and we use the percentage of overlap between two mixtures to measure the sensitivity of a person’s sense of smell,” Keller says. To create his mixtures, Keller drew upon 128 odor molecules responsible for scents such as orange, anise and spearmint. He mixed these in combinations of 10, 20 and 30 with different proportions of components in common. The volunteers received three vials, two of which contained identical mixes, and they were asked to pick out the odd one.
This approach was inspired by previous work at the Weizmann Institute in Israel, in which researchers combined odors at similar intensities to create neutral smelling “olfactory white.” In that experiment and in Keller’s study, the researchers were interested in the perception of odor qualities, such as fishy, floral or musky — not their intensity. But since intensity can interfere with the perceived qualities, both had to account for it.
The results, published this week in Science, show that while individual volunteers’ performance varied greatly, on average they could tell the difference between mixtures containing as much as 51 percent of the same components. Once the mixes shared more than half of their components, fewer volunteers could tell the difference between them. This was true for mixes of 10, 20 and 30 odors.
By analyzing the data, the researchers could calculate the total number of distinguishable mixtures.
“It turns out that the resolution of the olfactory system is not extraordinary – you need to change a fair fraction of the components before the change can be reliably detected by more than 50 percent of the subjects,” says collaborator Marcelo O. Magnasco, head of the Laboratory of Mathematical Physics at Rockefeller. “However, because the number of combinations is quite literally astronomical, even after accounting for this limitation the total number of distinguishable odor combinations is quite large.” The 1 trillion estimate is almost certainly too low, the researchers say, because there are many, many more odor molecules in the real world that can be mixed in many more ways.
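The scale Magnasco describes is easy to check: the number of ways to choose a mixture’s components from the study’s 128-molecule palette grows combinatorially. This quick sketch counts component sets only, ignoring proportions, and is the raw count the discrimination limit is then applied to (the 1-trillion figure itself comes from the study’s full calculation, not from this snippet):

```python
import math

# Number of distinct component sets drawable from a palette of 128 odor
# molecules when each vial contains 10, 20, or 30 components.
for k in (10, 20, 30):
    print(k, math.comb(128, k))
```

Even the 30-component case alone exceeds 10^28 possible mixtures, which is why a coarse discrimination threshold still leaves an enormous number of distinguishable smells.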
Keller theorizes that our ancestors made far more use of their sense of smell than we do. Humans’ upright posture lifted our noses far from the ground, where most smells originate, and more recently, conveniences such as refrigerators and daily showers have effectively limited odors in the modern world. “This could explain our attitude that smell is unimportant, compared to hearing and vision,” he says.
Nevertheless, the sense of smell remains closely linked to human behavior, and studying it can tell us a lot about how our brains process complex information. The results of this study are a step toward an elusive quantitative science of odor perception that can help drive further research, Keller says.

Filed under olfaction smell odor perception olfactory system neuroscience science

205 notes

The Aging Brain Needs REST

Why do neurodegenerative diseases such as Alzheimer’s affect only the elderly? Why do some people live to be over 100 with intact cognitive function while others develop dementia decades earlier?

Image: A new study shows that a gene regulator called REST, dormant in the brains of young people (left), switches on in normal aging brains (center) to protect against various stresses, including abnormal proteins associated with neurodegenerative diseases. REST is lost in critical brain regions of people with Alzheimer’s (right). Credit: Yankner Lab

More than a century of research into the causes of dementia has focused on the clumps and tangles of abnormal proteins that appear in the brains of people with neurodegenerative diseases. However, scientists know that at least one piece of the puzzle has been missing because some people with these abnormal protein clumps show few or no signs of cognitive decline.

A new study offers an explanation for these longstanding mysteries. Researchers have discovered that a gene regulator active during fetal brain development, called REST, switches back on later in life to protect aging neurons from various stresses, including the toxic effects of abnormal proteins. The researchers also showed that REST is lost in critical brain regions of people with Alzheimer’s and mild cognitive impairment.

(Source: hms.harvard.edu)

Filed under dementia neurodegenerative diseases REST genetics neuroscience science

61 notes

Rats’ brains may “remember” odor experienced while under general anesthesia
Rats’ brains may remember odors they were exposed to while deeply anesthetized, suggests research in rats published in the April issue of Anesthesiology.
Previous research has led to the belief that sensory information is received by the brain under general anesthesia but not perceived by it. These new findings suggest the anesthetized brain not only receives sensory information but also registers it at the cellular level, even though the animal gives no behavioral sign of that information after recovering from anesthesia.
In the study, rats were exposed to a specific odor while under general anesthesia. Examination of the brain tissue after they had recovered from anesthesia revealed evidence of cellular imprinting, even though the rats behaved as if they had never encountered the odor before.
“It raises the question of whether our brains are being imprinted during anesthesia in ways we don’t recognize because we simply don’t remember,” said Yan Xu, Ph.D., lead author and vice chairman for basic sciences in the Department of Anesthesiology at the University of Pittsburgh School of Medicine. “The fact that an anesthetized brain can receive sensory information – and distinguish whether that information is novel or familiar during and after anesthesia, even if one does not remember receiving it – suggests a need to re-evaluate how the depth of anesthesia should be measured clinically.”
Researchers randomly assigned 107 rats to 12 different anesthesia and odor exposure paradigms: some were exposed to the same odor during and after anesthesia, some to air before and an odor after, some to familiar odors, others to novel odors, and still others were not exposed to odors at all. After the rats had recovered from the anesthesia, researchers watched whether they searched for hidden odor sources or investigated scented beads, to gauge their memory of the smell. Researchers then analyzed the rats’ brains at a cellular level. While the rats behaved as if they had no memory of being exposed to the odor under anesthesia, changes in their brain tissue at the cellular level suggested the rats “remembered” the exposure and no longer registered the odor as novel.
“This study reveals important new information about how anesthesia affects our brains,” said Dr. Xu. “The results highlight a need for additional research into the effects of general anesthesia on learning and memory.”

Filed under odors olfaction anesthesia memory learning neuroscience science
