Neuroscience

Articles and news from the latest research reports.

New Research on the Effects of Traumatic Brain Injury (TBI)
Considerable opportunity exists to improve interventions and outcomes of traumatic brain injury (TBI) in older adults, according to three studies published in the recent online issue of NeuroRehabilitation by researchers from the Icahn School of Medicine at Mount Sinai.
An Exploration of Clinical Dementia Phenotypes Among Individuals With and Without Traumatic Brain Injury
Some evidence suggests that a history of TBI is associated with an increased risk of dementia later in life, but the clinical features of dementia associated with TBI have not been well investigated.  Researchers at the Icahn School of Medicine as well as other institutions analyzed data from elderly individuals with dementia with and without a history of TBI to characterize the clinical profiles of patients with post-TBI dementia.
The results of the study indicate that compared to older adults with dementia with no history of TBI, those with a history of TBI had higher fluency and verbal memory scores and later onset of decline. However, their general health was worse, they were more likely to have received medical attention for depression, and were more likely to have a gait disorder, falls, and motor slowness.  These findings suggest that dementia among individuals with a history of TBI may represent a unique clinical phenotype that is distinct from that seen among elderly individuals who develop dementia without a history of TBI.
"Our study indicates that individuals with dementia with and without a history of TBI may present clinical characteristics that differ in subtle but meaningful ways," said Kristen Dams-O’Connor, PhD, first author of the study and an Assistant Professor of Rehabilitation Medicine at the Icahn School of Medicine at Mount Sinai. "It is imperative that clinicians take a history of TBI into account when making dementia diagnoses."
For this study, researchers used data from the National Alzheimer’s Coordinating Center (NACC) Uniform Data Set (UDS) collected between September 2005 and May 2012 to analyze 332 elderly individuals with dementia and a history of TBI and 664 elderly individuals with dementia and no history of TBI. Statistical analyses focused on evaluating differences in neurocognitive functioning, psychiatric functioning, medical history and health, clinical characteristics of dementia, and dementia diagnosis, using data collected at the baseline (first) NACC study visit.
Mortality of Elderly Individuals with TBI in the First 5 Years Following Injury
After observing a high rate of mortality among patients over the age of 55 in the first five years after sustaining a TBI, researchers at the Icahn School of Medicine at Mount Sinai wanted to learn more about the precise causes of these premature deaths.
The results of this study indicate that for approximately a third of the patients, death one to five years after TBI resulted from health conditions that were already present at the time of injury, suggesting the continuation of an ongoing process. The remaining patients died from conditions that appeared to unfold in the years after injury. According to the authors, each cause of death in this sample would have required proactive medical management, medical intervention, and medication compliance.
"Like those with other chronic health conditions, individuals with TBI could benefit from the development of a disease management model of primary care," said one of the study authors, Wayne Gordon, PhD, Jack Nash Professor and Vice Chair of the Department of Rehabilitation Medicine at the Icahn School of Medicine at Mount Sinai and Chief of the Rehabilitation Psychology and Neuropsychology service. "This study suggests that close medical management and lifestyle interventions may help to prevent premature death among elderly survivors of TBI in the future."
Researchers reviewed the charts of 30 individuals over the age of 55 who completed inpatient acute rehabilitation between 2003 and 2009 and who died one to four years after TBI, then compared that data to a matched sample of 30 patients who did not die. They found that 53 percent of deceased subjects had been diagnosed with gait abnormalities, 32 percent were taking respiratory medications at admission, and 17 percent were taking respiratory medications at discharge. Compared to patients who survived several years after injury, deceased patients were discharged from the hospital with significantly more medications.
Inpatient Rehabilitation for Traumatic Brain Injury: The Influence of Age on Treatments and Outcomes
For this study, researchers analyzed the difference in treatment and outcomes between elderly and younger patients with TBI. They found that patients over 65 had lower brain injury severity and a shorter length of stay in acute care. Elderly patients also received fewer hours of rehabilitation therapy, due to a shorter length of stay, and fewer hours of treatment per day, especially from psychology and therapeutic recreation. They gained less functional ability during and after rehabilitation, and had a very high mortality rate.
"We know significantly more about the treatment received by adolescents and young adults with TBI than we do about those over 65," said Marcel Dijkers, PhD, lead author and Research Professor in the Department of Rehabilitation Medicine at Mount Sinai.  "Our data indicates that elderly people can be rehabilitated successfully, but it raises a number of questions. For instance: is the high mortality due to the TBI or is it the result of the continuation of a condition that began pre-TBI?"
The researchers analyzed data on 1,419 patients with TBI admitted to nine TBI rehabilitation inpatient programs across the country between 2009 and 2011. They collected data through abstracting of medical records, point-of-care forms completed by therapists, and interviews conducted three and nine months after discharge.

Filed under TBI brain injury dementia brain rehabilitation neuroscience neurobiology medicine science

Sugar Cube-Sized Robotic Ants Mimic Real Foraging Behavior
For ants, the pheromone-laden foraging trails they leave behind are like lifelines: they direct the workers toward food hubs discovered earlier and guide them back to their nest.
These networks of trails can stretch for hundreds of feet, quite an achievement considering many worker ants are less than half an inch in length. One type of harvester ant can lay down a set of trails that stretches 82 feet from the entrance of its nest. The trails of a wood ant, an insect measuring just five millimeters (one-fifth of an inch), reach 656 feet, each one branching out into more pathways at up to 10 spots along the trail. The leafcutter ant can build a network that spreads across almost two and a half acres.
Ant species such as these tend to take the shortest path between their colony’s nest and a food source, following branches that stray as little as possible from the direction in which they began their journey. The forks in their network of trails, known as bifurcations, are not symmetrical and don’t branch out into angles of the same size. But do ants use a sophisticated sense of geometry to trace their path, measuring the angles of the roads before picking one?
To learn more, researchers at the New Jersey Institute of Technology (NJIT) and the Research Centre on Animal Cognition in France used miniature robots to replicate the behavior of a colony of Argentine ants on the move, reported today in the journal PLOS Computational Biology. This ant species has extremely poor eyesight and darts around at high speeds, yet it can maneuver through corridor after corridor, from home to food and vice versa.

Filed under robots robotics foraging trail networks ants colony behavior navigation skills alice neuroscience science

Child development varies and is hard to predict
On average, children take the first steps on their own at the age of 12 months. Many parents perceive this event as a decisive turning point. However, the timing is really of no consequence. Children who start walking early turn out later to be neither more intelligent nor more well-coordinated. This is the conclusion reached by a study supported by the Swiss National Science Foundation (SNSF).
Because parents pay great attention to their offspring, they often compare them with the other children in the sandpit or playground. Many of them worry that their child is lagging behind in terms of mental development if it sits up or starts to walk a bit later than other children. Now, however, in a statistical analysis of the developmental data of 222 children born healthy, researchers headed by Oskar Jenni of the Zurich Children’s Hospital and Valentin Rousson of Lausanne University have come to the conclusion that most of these fears are groundless.
Considerable variance
Within the framework of the Zurich longitudinal study, the paediatricians conducted a detailed study of the development of 119 boys and 103 girls. The researchers examined the children seven times during the first two years of their life and subsequently carried out motor and intelligence tests with them every two to three years after they reached school age. The results show that children sit up for the first time at an age of between slightly less than four months and thirteen months (average 6.5 months). They begin to walk at an age of between 8.5 months and 20 months (average 12 months). In other words, there is considerable variance.
The researchers found no correlation between the age at which the children reached these motor milestones and their performance in the intelligence and motor tests between the ages of seven and eighteen. In short, by the time they reach school age, children who start walking later than others are just as well-coordinated and intelligent as those who were up on their feet early.
More relaxed
Although the first steps that a child takes on its own represent a decisive turning point for most parents, the precise timing of this event is manifestly of no consequence. “That’s why I advise parents to be more relaxed if their child only starts walking at 16 or 18 months,” says Jenni. If a child still can’t walk unaided after 20 months, then further medical investigations are indicated.
(Image: Getty Images)

Filed under child development developmental milestones babies walking psychology neuroscience science

Bulging Eyes Of The Tarsier Provide Insight Into Evolution Of Human Vision
A new study, led by Dartmouth College, suggests that primates developed highly accurate, three-color vision that allowed them to shift to daytime living after eons of wandering in the dark.
The findings, published in the journal Proceedings of the Royal Society B: Biological Sciences, challenge the prevailing theory that trichromatic color vision, a hallmark event in primate evolution, evolved only after primates became diurnal. Learning to rise with the sun was an evolutionary shift that gave rise to anthropoid (higher) primates, which led to the human lineage.
Dr. Amanda D. Melin, a postdoctoral research associate in the Department of Anthropology at Dartmouth, led the team of scientists who based their findings on a genetic study of tarsiers, the enigmatic elfin primate that branched off early on from monkeys, apes and humans. These tiny animals, which measure between 3.3 and 6.5 inches in height, have a number of unusual traits – from communicating in pure ultrasound to their bulging eyes. Sensory specializations such as these have long fueled debate on the adaptive origins of anthropoid primates.
Previous research by this same team discovered the tarsiers’ ultrasound vocalizations last year. The new study sheds light on why the nocturnal animal’s ancestors had enhanced color vision better suited to daytime living, like that of their anthropoid cousins.
The team analyzed the genes that encode photopigments in the eye. This analysis revealed that the last common ancestor of living tarsiers had highly acute, three-color vision much like modern monkeys and apes. Normally, such findings would indicate a daytime lifestyle. The tarsier fossil record, however, shows enlarged eyes that suggest they were active mainly at night.
Because of these contradictory lines of evidence, the researchers suggest that early tarsiers were instead adapted to dim light levels, like bright moonlight or twilight. Such conditions are dark enough to favor large eyes, but still bright enough to support trichromatic color vision.
Keen-sightedness such as this might have helped higher primates to carve out a fully daytime niche, the authors suggest, allowing them to better see prey, predators and fellow primates. They would also be able to expand their territory in a life no longer limited to the shadows.

Filed under primates tarsiers vision trichromatic color vision evolution neuroscience science

Parkinson’s Disease Protein Gums up Garbage Disposal System in Cells

Clumps of α-synuclein protein in nerve cells are hallmarks of many degenerative brain diseases, most notably Parkinson’s disease.

“No one has been able to determine if Lewy bodies and Lewy neurites, hallmark pathologies in Parkinson’s disease, can be degraded,” says Virginia Lee, PhD, director of the Center for Neurodegenerative Disease Research at the Perelman School of Medicine, University of Pennsylvania.

“With the new neuron model system of Parkinson’s disease pathologies our lab has developed recently, we demonstrated that these aberrant clumps in cells resist degradation as well as impair the function of the macroautophagy system, one of the major garbage disposal systems within the cell.”

Macroautophagy, literally “self-eating,” is the degradation of unnecessary or dysfunctional cellular bits and pieces by a compartment in the cell called the lysosome.

Lee, also a professor of Pathology and Laboratory Medicine, and colleagues published their results in the early online edition of the Journal of Biological Chemistry this week.

Alpha-synuclein (α-syn) diseases all feature clumps of the protein and include Parkinson’s disease (PD) and an array of related disorders: PD with dementia, dementia with Lewy bodies, and multiple system atrophy. In most of these, α-syn forms insoluble aggregates of stringy fibrils that accumulate in the cell body and extensions of neurons.

These unwanted α-syn clumps are modified by abnormal attachments of many phosphate chemical groups as well as by the protein ubiquitin, a molecular tag for degradation. They are widely distributed in the central nervous system, where they are associated with neuron loss.

Using cell models in which intracellular α-syn clumps accumulate after taking up synthetic α-syn fibrils, the team showed that α-syn inclusions cannot be degraded, even though they are located near the lysosome and the proteasome, another type of garbage disposal in the cell.

The α-syn aggregates persist even after soluble α-syn levels within the cell are substantially reduced, suggesting that once formed, the α-syn inclusions are resistant to being cleared. What’s more, they found that α-syn aggregates impair the overall autophagy degradative process by delaying the maturation of autophagy machines known as autophagosomes, which may contribute to the increased cell death seen in clump-filled nerve cells. Understanding the impact of α-syn aggregates on autophagy may help elucidate therapies for α-syn-related neurodegeneration.

(Source: uphs.upenn.edu)

Filed under neurodegenerative diseases parkinson's disease nerve cells lysosome CNS autophagy neuroscience science

Smoking genes predict risk
Your DNA may play a significant role in determining whether or not you end up a smoker – and how easy you find it to kick the habit.
Many large studies have identified particular gene variants that are more common in smokers than in other people, suggesting that they play a role in nicotine dependence.
Now an international team of researchers has used these genetic clues to develop a ‘genetic risk profile’ and, to see how accurate it is, road-tested it on a well-known sample of Kiwis: the Dunedin Birth Cohort.
Researchers analysed data from the long-term study of 1,000 New Zealanders to identify whether individuals at high genetic risk got hooked on cigarettes more quickly as teens and whether, as adults, they had a harder time quitting.
The results, published in JAMA Psychiatry, showed that a person’s genetic risk profile did not predict whether he or she would try cigarettes. But for those who did try cigarettes, having a high-risk genetic profile predicted increased likelihood of heavy smoking and nicotine dependence.
This link was most apparent for teenagers: among teens who tried cigarettes, those with a high-risk genetic profile were 24 percent more likely to become daily smokers by age 15 and 43 percent more likely to become pack-a-day smokers by age 18.
As adults, those with high-risk genetic profiles were 22 percent more likely to fail in their attempts at quitting.
“The effects of genetic risk seem to be limited to people who start smoking as teens,” said author Daniel Belsky, a post-doctoral research fellow at Duke University.
“This suggests there may be something special about nicotine exposure in the adolescent brain, with respect to these genetic variants.”
The authors noted that their genetic risk profile isn’t yet accurate enough to be used for targeted interventions to prevent at-risk teens smoking, but it does highlight the critical adolescent period in addiction development.
“Public health policies that make it harder for teens to become regular smokers should continue to be a focus in antismoking efforts,” Belsky said.

Smoking genes predict risk

Your DNA may play a significant role in determining whether or not you end up a smoker – and how easy you find it to kick the habit.

Many large studies have identified particular gene variants that are more common in smokers than other people, suggesting the they play a role in nicotine dependence.

Now an international team of researchers have used these genetic clues develop a ‘genetic risk profile’, and to see how accurate it is, they have road-tested it on the on a well known sample of Kiwis: the Dunedin Birth Cohort.

Researchers analysed data from the long-term study of 1,000 New Zealanders to identify whether individuals at high genetic risk got hooked on cigarettes more quickly as teens and whether, as adults, they had a harder time quitting.

The results, published in JAMA Psychiatry, showed that a person’s genetic risk profile did not predict whether he or she would try cigarettes. But for those who did try cigarettes, having a high-risk genetic profile predicted increased likelihood of heavy smoking and nicotine dependence.

This link was most apparent for teenagers; Among teens who tried cigarettes, those with a high-risk genetic profile were 24 percent more likely to become daily smokers by age 15 and 43 percent more likely to become pack-a-day smokers by age 18.

As adults, those with high-risk genetic profiles were 22 percent more likely to fail in their attempts at quitting.
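What these relative figures mean in absolute terms depends on the baseline rate, which the article doesn’t give. A minimal sketch, using a purely hypothetical baseline for illustration:

```python
# Converting "X percent more likely" into an absolute probability.
# The baseline rate below is hypothetical, NOT a figure from the study.

def elevated_rate(base_rate, relative_increase):
    """Return the elevated probability given a baseline probability and a
    relative increase (e.g. 0.24 for '24 percent more likely')."""
    return base_rate * (1 + relative_increase)

# Suppose (hypothetically) 20% of teens who try cigarettes become daily
# smokers by age 15. A 24% relative increase then means:
base = 0.20
high_risk = elevated_rate(base, 0.24)
print(f"baseline: {base:.1%}, high genetic risk: {high_risk:.1%}")
```

The same helper applies to the other figures quoted (43 percent for pack-a-day smoking, 22 percent for failed quit attempts); the point is that a relative increase only translates into an absolute risk once a baseline is fixed.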

“The effects of genetic risk seem to be limited to people who start smoking as teens,” said author Daniel Belsky, a post-doctoral research fellow at Duke University.

“This suggests there may be something special about nicotine exposure in the adolescent brain, with respect to these genetic variants.”

The authors noted that their genetic risk profile isn’t yet accurate enough to be used for targeted interventions to prevent at-risk teens smoking, but it does highlight the critical adolescent period in addiction development.

“Public health policies that make it harder for teens to become regular smokers should continue to be a focus in antismoking efforts,” Belsky said.

Filed under smoking nicotine dependence adolescent brain genes genetics neuroscience science

35 notes

Surgical menopause may prime brain for stroke, Alzheimer’s

Women who abruptly and prematurely lose estrogen through surgical menopause face a two-fold increase in the risk of cognitive decline and dementia.


"This is what the clinical studies indicate and our animal studies looking at the underlying mechanisms back this up," said Brann, corresponding author of the study in the journal Brain. “We wanted to find out why that is occurring. We suspect it’s due to the premature loss of estrogen.”

In an effort to mimic what occurs in women, Brann and his colleagues examined rats 10 weeks after removal of their estrogen-producing ovaries; the animals had either been started immediately on low-dose estrogen therapy, started on therapy 10 weeks later, or never given estrogen.

When the researchers caused a stroke-like event in the brain’s hippocampus, a center of learning and memory, they found the rodents treated late or not at all experienced more brain damage, specifically to a region of the hippocampus called CA3 that is normally stroke-resistant.

To make matters worse, untreated or late-treated rats also began an abnormal, robust production of Alzheimer’s disease-related proteins in the CA3 region, even becoming hypersensitive to one of the most toxic of the beta amyloid proteins that are a hallmark of Alzheimer’s.

Both problems appear associated with the increased production of free radicals in the brain. In fact, when the researchers blocked the excessive production, heightened stroke sensitivity and brain cell death in the CA3 region were reduced.

Interestingly, the brain’s increased sensitivity to stressors such as inadequate oxygen was gender-specific, Brann said. Removing the testes of male rats didn’t affect stroke size or damage.

Although exactly how it works is unknown, estrogen appears to help protect younger females from problems such as stroke and heart attack; after menopause, their risk of these maladies rises to about the same level as in males. Follow-up studies are needed to see whether estrogen therapy also reduces sensitivity to the beta amyloid protein in the CA3 region, as the researchers expect, Brann noted.

Brann earlier showed that prolonged estrogen deprivation in aging rats dramatically reduces the number of brain receptors for the hormone as well as its ability to prevent strokes. Damage was forestalled if estrogen replacement was started shortly after hormone levels dropped, according to the 2011 study in the journal Proceedings of the National Academy of Sciences.

The much-publicized Women’s Health Initiative – a 12-year study of 161,808 women ages 50-79 – surprisingly found that hormone therapy generally increased rather than decreased stroke risk, along with other health problems. Critics said one problem with the study was that many of the women, like Brann’s aged rats, had gone years without hormone replacement, bolstering the case that timing is everything.

(Source: eurekalert.org)

Filed under beta amyloid brain damage cognitive decline dementia alzheimer's disease neuroscience science

93 notes

Should I trust my intuition?
Do we always make better decisions when we take more time to think? Or are there decisions where more time doesn’t really help?
A study led by Zachary Mainen, Director of the Champalimaud Neuroscience Programme, and published in the scientific journal Neuron, reports that when rats were challenged with a series of perceptual decision problems, their performance was just as good when they decided rapidly as when they took a much longer time to respond. Despite being encouraged to slow down and try harder, the subjects of this study achieved their maximum performance in less than 300 milliseconds.
'There are many kinds of decisions, and for some, having more time appears to be of no help. In these cases, you'd better go with your intuition, and that's what our subjects did', explains Mainen, who led the study while an Associate Professor at CSHL in the USA.
This study suggests that rats can be used as an animal model to investigate what is happening in the human brain when ‘intuitive’ decisions are being made. ‘Decision-making is not a well-understood process, but it appears to be surprisingly similar among species. This study provides a basis to begin to take apart one type of decision and see how it really works’, the author adds. 
(Image: Kristen Dold | Thinkstock)

Filed under decision-making animal model intuitive decisions neuroscience psychology science

228 notes

How herpesvirus invades nervous system
Northwestern Medicine scientists have identified a component of the herpesvirus that “hijacks” machinery inside human cells, allowing the virus to rapidly and successfully invade the nervous system upon initial exposure.
Led by Gregory Smith, associate professor in immunology and microbiology at Northwestern University Feinberg School of Medicine, researchers found that viral protein 1-2, or VP1/2, allows the herpesvirus to interact with cellular motors, known as dynein. Once the protein has overtaken this motor, the virus can speed along intracellular highways, or microtubules, to move unobstructed from the tips of nerves in skin to the nuclei of neurons within the nervous system.
This is the first time researchers have shown a viral protein directly engaging and subverting the cellular motor; most other viruses passively hitch a ride into the nervous system.
"This protein not only grabs the wheel, it steps on the gas," says Smith. "Overtaking the cellular motor to invade the nervous system is a complicated accomplishment that most viruses are incapable of achieving. Yet the herpesvirus uses one protein, no others required, to transport its genetic information over long distances without stopping."
Herpesviruses are widespread in humans, affecting more than 90 percent of adults in the United States. They are associated with several types of recurring diseases, including cold sores, genital herpes, chickenpox, and shingles. The viruses can lie dormant in humans for a lifetime, and most infected people do not know they are carriers. The infection can occasionally turn deadly, resulting in encephalitis in some cases.
Until now, scientists knew that herpesviruses travel quickly to reach neurons located deep inside the body, but the mechanism by which they advance remained a mystery.
Smith’s team conducted a variety of experiments with VP1/2 to demonstrate its important role in transporting the virus, including artificial activation and genetic mutation of the protein. The team studied the herpesvirus in animals, and also in human and animal cells in culture under high-resolution microscopy. In one experiment, scientists mutated the virus with a slower form of the protein dyed red, and raced it against a healthy virus dyed green. They observed that the healthy virus outran the mutated version down nerves to the neuron body to insert DNA and establish infection.
"Remarkably, this viral protein can be artificially activated, and in these conditions it zips around within cells in the absence of any virus. It is striking to watch," Smith says.
He says that understanding how the viruses move within people, especially from the skin to the nervous system, can help better prevent the virus from spreading.
Additionally, Smith says, “By learning how the virus infects our nervous system, we can mimic this process to treat unrelated neurologic diseases. Even now, laboratories are working on how to use herpesviruses to deliver genes into the nervous system and kill cancer cells.”
Smith’s team will next work to better understand how the protein functions. He notes that many researchers use viruses to learn how neurons are connected to the brain.
"Some of our mutants will advance brain mapping studies by resolving these connections more clearly than was previously possible," he says.

Filed under herpesvirus dynein viral protein nervous system neurons infection neuroscience science

5,096 notes

Which Came First, the Head or the Brain?
The sea anemone, a cnidarian, has no brain. It does have a nervous system, and its body has a clear axis, with a mouth on one side and a basal disk on the other. However, there is no organized collection of neurons comparable to the kind of brain found in bilaterians, animals that have both bilateral symmetry and a top and bottom. (Most animals except sponges, cnidarians, and a few other phyla are bilaterians.) So an interesting evolutionary question is, which came first, the head or the brain? Do animals such as sea anemones, which lack a brain, have something akin to a head?

In this issue of PLOS Biology, Chiara Sinigaglia and colleagues report that at least some developmental pathways seen in cnidarians share a common lineage with head and brain development in bilaterians. It might seem intuitive to expect to find genes involved in brain development around the mouth of the anemone, and previous work has suggested that the oral region in cnidarians corresponds to the head region of bilaterians. However, there has been debate over whether the oral or aboral pole of cnidarians is analogous to the anterior pole of bilaterians. At the start of its life cycle a sea anemone exists as a free-swimming planula, which then attaches to a surface and becomes a sea anemone. That free-swimming phase contains an apical tuft, a sensory structure at the front of the swimming animal’s body. The apical tuft is the part that attaches and becomes the aboral pole (the part distal from the mouth) of the adult anemone.

To test whether genetic expression in the aboral pole of cnidarians does in fact resemble the head patterning seen in bilaterians, the researchers analyzed gene expression in Nematostella vectensis, a sea anemone found in estuaries and bays. They focused on the six3 and FoxQ2 transcription factors, as these genes are known to regulate development of the anterior-posterior axis in bilaterian species. (six3 knockout mice, for example, fail to develop a forebrain, and in humans, six3 is known to regulate the development of forebrain and eyes.)

The N. vectensis genome contains one gene from the six3/6 group and four foxQ2 genes. Sinigaglia and colleagues found that NvSix3/6 and one of the foxQ2 genes, NvFoxQ2a, were expressed predominantly on the aboral pole of the developing cnidarian but, after gastrulation, were excluded from a small spot in that region (NvSix3/6 was also expressed in a small number of other cells of the planula that resembled neurons). Because of this, the authors call NvSix3/6 and NvFoxQ2a “ring genes,” and genes that are then expressed in that spot “spot genes.” The spot then develops into the apical tuft.

Through knockdown and rescue experiments, the researchers demonstrate that NvSix3/6 is required for the development of the aboral region; without it, the expression of spot genes is reduced or eliminated and the apical tuft of the planula doesn’t form. This suggests that development of the region distal from the cnidarian mouth appears to parallel the development of the bilaterian head.

This research demonstrates that at least a subset of the genes that cause head and brain formation in bilaterians are also differentially expressed in the aboral region of the sea anemone. The expression patterns are not identical to those in all bilaterians; however, the similarities suggest that the patterns of gene expression arose in an ancestor common to bilaterians and cnidarians, and that the process was then modified in bilaterians to produce a brain. So to answer the evolutionary question posed above, it seems that the developmental module that produces a head came first.

Filed under sea anemone cnidarians brain brain formation gene expression genes neuroscience science
