Neuroscience

Articles and news from the latest research reports.

Posts tagged science

102 notes

Children of addicted parents more likely to be depressed as adults
Children of parents who were addicted to drugs or alcohol are more likely to be depressed in adulthood, according to a new study by University of Toronto researchers.
“These findings underscore the intergenerational consequences of drug and alcohol addiction and reinforce the need to develop interventions that support healthy childhood development,” said the study’s lead author, Esme Fuller-Thomson, professor and Sandra Rotman Endowed Chair in the University of Toronto’s Factor-Inwentash Faculty of Social Work and the Department of Family and Community Medicine.
In a paper published online in the journal Psychiatry Research this month, investigators examined the association between parental addictions and adult depression in a representative sample of 6,268 adults, drawn from the 2005 Canadian Community Health Survey.
Of these respondents, 312 had had a major depressive episode in the year preceding the survey, and 877 reported that, while they were under the age of 18 and still living at home, at least one parent drank or used drugs “so often that it caused problems for the family.”
Results indicate that individuals whose parents were addicted to drugs or alcohol are more likely to develop depression than their peers. After adjusting for age, sex and race, parental addictions were associated with more than twice the odds of adult depression, says Fuller-Thomson.
“Even after adjusting for factors ranging from childhood maltreatment and parental unemployment to adult health behaviours including smoking and alcohol consumption, we found that parental addictions were associated with 69 per cent higher odds of depression in adulthood,” explains Fuller-Thomson. The study was co-authored with four graduate students at the University of Toronto: Robyn Katz, Vi Phan, Jessica Liddycoat and Sarah Brennenstuhl.
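The reported odds ratios can be made concrete with a small worked example. The counts below are hypothetical, chosen only to be consistent with the sample totals quoted above, not taken from the study, and the calculation shows how a crude odds ratio is read:

```python
# Hypothetical 2x2 table: adult depression by exposure to parental addiction.
# Counts are invented but consistent with the totals above
# (6,268 respondents, 877 exposed, 312 with a recent depressive episode).
exposed_depressed, exposed_not = 80, 797
unexposed_depressed, unexposed_not = 232, 5159

odds_exposed = exposed_depressed / exposed_not          # odds of depression, exposed group
odds_unexposed = unexposed_depressed / unexposed_not    # odds of depression, unexposed group
odds_ratio = odds_exposed / odds_unexposed

print(f"crude odds ratio = {odds_ratio:.2f}")
```

An adjusted odds ratio is read the same way: the fully adjusted figure of 1.69 that the authors report means 69 per cent higher odds of adult depression among those exposed, after the listed covariates are accounted for.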
This study could not determine the cause of the relationship between parental addictions and adult depression. Co-author Robyn Katz suggests that “It is possible that the prolonged and inescapable strain of parental addictions may permanently alter the way these children’s bodies react to stress throughout their life.
"One important avenue for future research is to investigate potential dysfunctions in cortisol production – the hormone that prepares us for ‘fight or flight’ – which may influence the later development of depression.”
“As an important first step, children who experience toxic stress at home can be greatly helped by the stable involvement of caring adults, including grandparents, teachers, coaches, neighbours and social workers,” said Fuller-Thomson. “Although more research is needed to determine if access to a responsive and loving adult decreases the likelihood of adult depression among children exposed to parental addictions, we do know that these caring relationships promote healthy development and buffer stress.”

Filed under parental addictions addiction depression adult depression psychology neuroscience science

122 notes

Brain diseases affecting more people and starting earlier than ever before
Professor Colin Pritchard’s latest research, published in Public Health Journal, has found that the sharp rise in dementia and other neurological deaths in people under 74 cannot be put down to the fact that we are living longer. The rise is because a higher proportion of older people are being affected by such conditions and, most alarmingly, the conditions are starting earlier, affecting people under 55.
Of the 10 biggest Western countries, the USA had the worst increase in all neurological deaths between 1979 and 2010, with men up 66% and women up 92%. The UK was 4th highest, with men up 32% and women up 48%. In terms of annual numbers of deaths, the UK rose from 4,500 to 6,500, and the USA from 14,500 to more than 28,500.
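The percentage rises quoted here are plain relative changes. The arithmetic on the annual death counts looks like this (note that the raw-count rises differ from the sex-specific percentages quoted above, which the paper reports separately):

```python
def percent_rise(before: float, after: float) -> float:
    """Relative change between two counts, expressed as a percentage."""
    return (after - before) / before * 100

# Annual neurological death counts quoted above, 1979 vs 2010.
uk_rise = percent_rise(4_500, 6_500)
usa_rise = percent_rise(14_500, 28_500)
print(f"UK counts up {uk_rise:.0f}%, USA counts up {usa_rise:.0f}%")
```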
Professor Pritchard of Bournemouth University says: “These statistics are about real people and families, and we need to recognise that there is an ‘epidemic’ that clearly is influenced by environmental and societal changes.”
Tessa Gutteridge, Director YoungDementia UK says that our society needs to learn that dementia is increasingly affecting people from an earlier age: “The lives of an increasing number of families struggling with working-age dementia are made so much more challenging by services which fail to keep pace with their needs and a society which believes dementia to be an illness of old age.”
Bournemouth University researchers Professor Colin Pritchard and Dr Andrew Mayers, along with the University of Southampton’s Professor David Baldwin, show that total neurological deaths, including the dementias, are rising and starting earlier, with heavy impacts on patients, their families, and health and social care services, exemplified by an 85% increase in UK Motor Neurone Disease deaths.
The research highlights an alarming ‘hidden epidemic’ of rising neurological deaths between 1979 and 2010 among adults under 74 in Western countries, especially the UK.
Total neurological deaths in both men and women rose significantly in 16 of the countries covered by the research, which is in sharp contrast to the major reductions in deaths from all other causes.
Over the period the UK had the third biggest neurological increase, up 32% in men and 48% in women, while women’s neurological deaths rose faster than men’s in most countries.
Professor Pritchard said, “These rises in neurological deaths, with the earlier onset of the dementias, are devastating for families and pose a considerable public health problem. It is NOT that we have more old people but rather more old people have more brain disease than ever before, including Alzheimer’s. For example there are two new British charities, The Young Parkinson’s Society and Young Dementia UK, which are a grass-roots response to these rises. The need for such charities would have been inconceivable a little more than 30 years ago.”
When asked what he thought caused the increases he replied,
“This has to be speculative, but it cannot be genetic because the period is too short. Whilst there will be some influence of more elderly people, that does not account for the earlier onset, the differences between countries, nor the fact that more women have been affected, as their lives have changed more than men’s over the period; all of this indicates multiple environmental factors. Consider the changes over the last 30 years: the explosion in electronic devices and rises in background non-ionising radiation (PCs, microwaves, TVs, mobile phones); road and air transport up four-fold, increasing background petrochemical pollution; chemical additives to food, and so on. There is no one factor; rather, it is the likely interaction between all these environmental triggers, reflecting changes in other conditions. For example, whilst cancer deaths are down substantially, cancer incidence continues to rise; levels of asthma are unprecedented; and the fall in male sperm counts and the rise of autoimmune diseases all point to lifestyle and environmental influences. These ‘statistics’ are about real people and families, and we need to recognise that there is an ‘epidemic’ that clearly is influenced by environmental and societal changes.”

Filed under brain diseases dementia alzheimer's disease health neuroscience science

804 notes

Brain implants: Restoring memory with a microchip
William Gibson’s popular science fiction tale “Johnny Mnemonic” foresaw sensitive information being carried by microchips in the brain by 2021. A team of American neuroscientists could be making this fantasy world a reality.
Their motivation is different but the outcome would be somewhat similar. Hailed as one of 2013’s top ten technological breakthroughs by MIT Technology Review, the work by the University of Southern California, North Carolina’s Wake Forest University and other partners has actually spanned a decade.
But the U.S.-wide team now thinks that it will see a memory device being implanted in a small number of human volunteers within two years and available to patients in five to 10 years. They can’t quite contain their excitement.
"I never thought I’d see this in my lifetime," said Ted Berger, professor of biomedical engineering at the University of Southern California in Los Angeles. "I might not benefit from it myself but my kids will."
Rob Hampson, associate professor of physiology and pharmacology at Wake Forest University, agrees. “We keep pushing forward, every time I put an estimate on it, it gets shorter and shorter.”
The scientists — who bring varied skills to the table, including mathematical modeling and psychiatry — believe they have cracked how long-term memories are made, stored and retrieved and how to replicate this process in brains that are damaged, particularly by stroke or localized injury.
Berger said they record a memory being made, in an undamaged area of the brain, then use that data to predict what a damaged area “downstream” should be doing. Electrodes are then used to stimulate the damaged area to replicate the action of the undamaged cells.
They concentrate on the hippocampus — part of the cerebral cortex which sits deep in the brain — where short-term memories become long-term ones. Berger has looked at how electrical signals travel through neurons there to form those long-term memories and has used his expertise in mathematical modeling to mimic these movements using electronics.
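The pipeline Berger describes is often summarised as a multi-input multi-output (MIMO) model: record spiking inputs upstream of the damaged region, predict the outputs the healthy tissue would have produced, and drive stimulating electrodes from the prediction. The sketch below is a deliberate simplification, substituting a plain linear least-squares map for the team’s nonlinear dynamical model; all data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: input spike-count vectors recorded upstream
# (e.g. CA3) alongside the output vectors a healthy downstream region
# (e.g. CA1) produced at the same time.
n_trials, n_in, n_out = 200, 16, 8
X = rng.poisson(3.0, size=(n_trials, n_in)).astype(float)
true_map = rng.normal(size=(n_in, n_out))
Y = X @ true_map + rng.normal(scale=0.1, size=(n_trials, n_out))

# Fit a linear input -> output map (stand-in for the team's MIMO model).
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# At run time a new input pattern arrives; the downstream region is damaged,
# so the predicted output pattern sets the stimulation targets, one per electrode.
new_input = rng.poisson(3.0, size=n_in).astype(float)
predicted_output = new_input @ W
print(predicted_output.shape)
```

The design choice that matters is the same one Hampson describes: if enough input-output pairs can be recorded, the damaged tissue's transfer function can be approximated and bypassed.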
Hampson, whose university has done much of the animal studies, adds: “We support and reinforce the signal in the hippocampus but we are moving forward with the idea that if you can study enough of the inputs and outputs to replace the function of the hippocampus, you can bypass the hippocampus.”
The team’s experiments on rats and monkeys have shown that certain brain functions can be replaced with signals via electrodes. You would think that the work of then creating an implant for people and getting such a thing approved would be a Herculean task, but think again.
For 15 years, people have been having brain implants to provide deep brain stimulation to treat epilepsy and Parkinson’s disease — a reported 80,000 people have now had such devices placed in their brains. So many of the hurdles have already been overcome — particularly the “yuck factor” and the fear factor.
"It’s now commonly accepted that humans will have electrodes put in them — it’s done for epilepsy, deep brain stimulation, (that has made it) easier for investigative research, it’s much more acceptable now than five to 10 years ago," Hampson says.
Much of the work that remains now is in shrinking down the electronics.
"Right now it’s not a device, it’s a fair amount of equipment," Hampson says. "We’re probably looking at devices in the five to 10 year range for human patients."
The ultimate goal in memory research would be to treat Alzheimer’s Disease but unlike in stroke or localized brain injury, Alzheimer’s tends to affect many parts of the brain, especially in its later stages, making these implants a less likely option any time soon.
Berger foresees a future, however, where drugs and implants could be used together to treat early dementia. Drugs could be used to enhance the action of cells that surround the most damaged areas, and the team’s memory implant could be used to replace a lot of the lost cells in the center of the damaged area. “I think the best strategy is going to involve both drugs and devices,” he says.
Unfortunately, the team found that its method can’t help patients with advanced dementia.
"When looking at a patient with mild memory loss, there’s probably enough residual signal to work with, but not when there’s significant memory loss," Hampson said.
Constantine Lyketsos, professor of psychiatry and behavioral sciences at Johns Hopkins Medicine in Baltimore, which is trialing a deep brain stimulation implant for Alzheimer’s patients, was a little skeptical of the other team’s claims.
"The brain has a lot of redundancy; it can function pretty well if it loses one or two parts. But memory involves circuits diffusely dispersed throughout the brain, so it’s hard to envision." However, he added that it was more likely to be successful in helping victims of stroke or localized brain injury, as indeed its makers are aiming to do.
The UK’s Alzheimer’s Society is cautiously optimistic.
"Finding ways to combat symptoms caused by changes in the brain is an ongoing battle for researchers. An implant like this one is an interesting avenue to explore," said Doug Brown, director of research and development.
Hampson says the team’s breakthrough is “like the difference between a cane, to help you walk, and a prosthetic limb — it’s two different approaches.”
It will still take time for many people to accept their findings and their claims, he says, but they don’t expect to have a shortage of volunteers stepping forward to try their implant — the project is partly funded by the U.S. military which is looking for help with battlefield injuries.
There are U.S. soldiers coming back from operations with brain trauma and a neurologist at DARPA (the Defense Advanced Research Projects Agency) is asking “what can you do for my boys?” Hampson says.
"That’s what it’s all about."

Filed under brain hippocampus memory memory device implants deep brain stimulation neuroscience science

118 notes

Sense of Touch Reproduced Through Prosthetic Hand

In a study recently published in IEEE Transactions on Neural Systems and Rehabilitation Engineering, neurobiologists at the University of Chicago show how an organism can sense a tactile stimulus, in real time, through an artificial sensor in a prosthetic hand.

Scientists have made tremendous advances toward building lifelike prosthetic limbs that move and function like the real thing. These are amazing accomplishments, but an important element to creating a realistic replacement for a hand is the sense of touch. Without somatosensory feedback from the fingertips about how hard you’re squeezing something or where it’s positioned relative to the hand, grasping an object is about as accurate as using one of those skill cranes to grab a stuffed animal at an arcade. Sure, you can do it, but you have to concentrate intently while watching every movement. You’re relying on your sense of vision to compensate for the lack of touch.

Sliman Bensmaia, assistant professor of organismal biology and anatomy at the University of Chicago, studies the neural basis of the sense of touch. Now, he and his colleagues are working with a robotic hand equipped with sensors that send electrical signals to electrodes implanted in the brain to recreate the same response to touch as a real hand.

Bensmaia spoke about how important the sense of touch is to creating a lifelike experience with a prosthetic limb.

“If you lose your somatosensory system it almost looks like your motor system is impaired,” he said. “If you really want to create an arm that can actually be used dexterously without the enormous amount of concentration it takes without sensory feedback, you need to restore the somatosensory feedback.”

The researchers performed a series of experiments with rhesus macaques that were trained to respond to stimulation of the hand. In one setting, they were gently poked on the hand with a physical probe at varying levels of pressure. In a second setting, some of the animals had electrodes implanted into the area of the brain that responds to touch. These animals were given electrical pulses to simulate the sensation of touch, and their hands were hidden so they wouldn’t see that they weren’t actually being touched.

Using data from the animals’ responses to each type of stimulus, the researchers were able to create a function, or equation, that described the requisite electrical pulse to go with each physical poke of the hand. Then, they repeated the experiments with a prosthetic hand that was wired to the brain implants. They touched the prosthetic hand with the physical probe, which in turn sent electrical signals to the brain.
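That fitted “function, or equation” can be pictured as a calibration curve: for each probe pressure, it gives the electrical pulse the animal treats as equivalent. The sketch below fits such a curve to invented calibration points; the study derived the real mapping from the animals’ behavioural responses, and nothing here reflects its actual parameters.

```python
import numpy as np

# Hypothetical calibration data: probe pressure (grams-force) and the
# stimulation amplitude (microamps) the animal responds to as equivalent.
pressure_gf = np.array([5, 10, 20, 40, 80], dtype=float)
equiv_amplitude_uA = np.array([12, 20, 33, 52, 80], dtype=float)

# Fit a simple log-linear calibration curve (a common psychophysics form).
coeffs = np.polyfit(np.log(pressure_gf), equiv_amplitude_uA, deg=1)

def pulse_for_pressure(p_gf: float) -> float:
    """Stimulation amplitude to deliver when the sensor reads p_gf."""
    return float(np.polyval(coeffs, np.log(p_gf)))

print(round(pulse_for_pressure(30.0), 1))
```

Once fitted, the curve runs in the other direction at run time: the prosthetic’s pressure sensor reads a value, and the implant delivers the matched pulse.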

Bensmaia said that the animals performed identically whether poked on their own hand or on the prosthetic one.

“This is the first time as far as I know where an animal or organism actually perceives a tactile stimulus through an artificial transducer,” Bensmaia said. “It’s an engineering milestone. But from a neuroengineering standpoint, this validates this function. You can use this function to have an animal perform this very precise task, precisely identically.”

The FDA is in the process of approving similar devices for human trials, and Bensmaia said he hopes such a system is implemented within the next year. Producing a lifelike sense of touch would go a long way toward improving the dexterity and performance of prosthetic hands, but he said it would also help bridge a mental divide for amputees or people who have lost the use of a limb. Until now, prosthetics and robotic arms feel more like tools than real replacements because they don’t produce the expected sensations.

“If every time you see your robotic arm touching something, you get a sensation that is projected to it, I think it’s very possible that in fact, you will consider this new thing as being part of your body,” he said.

(Source: newswise.com)

Filed under prosthetic limbs prosthetic hand artificial limbs tactile sensation somatosensory system neuroscience robotics science

134 notes

Study finds brain system for emotional self-control

Different brain areas are activated when we choose to suppress an emotion, compared to when we are instructed to inhibit an emotion, according to a new study from the UCL Institute of Cognitive Neuroscience and Ghent University.

In this study, published in Brain Structure and Function, the researchers scanned the brains of healthy participants and found that a key brain system, the dorso-medial prefrontal cortex, was activated when participants chose for themselves to suppress an emotion. The researchers had previously linked this brain area to deciding to inhibit movement.

"This result shows that emotional self-control involves a quite different brain system from simply being told how to respond emotionally," said lead author Dr Simone Kuhn (Ghent University).

In most previous studies, participants were instructed to feel or inhibit an emotional response. However, in everyday life we are rarely told to suppress our emotions, and usually have to decide ourselves whether to feel or control our emotions.

In this new study the researchers showed fifteen healthy women unpleasant or frightening pictures. The participants were given a choice to feel the emotion elicited by the image, or alternatively to inhibit the emotion, by distancing themselves through an act of self-control.

The researchers used functional magnetic resonance imaging (fMRI) to scan the brains of the participants. They compared this brain activity to another experiment where the participants were instructed to feel or inhibit their emotions, rather than choose for themselves.

Different parts of the brain were activated in the two situations. When participants decided for themselves to inhibit negative emotions, the scientists found activation in the dorso-medial prefrontal area of the brain. They had previously linked this brain area to deciding to inhibit movement.

In contrast, when participants were instructed by the experimenter to inhibit the emotion, a second, more lateral area was activated.

"We think controlling one’s emotions and controlling one’s behaviour involve overlapping mechanisms," said Dr Kuhn.

"We should distinguish between voluntary and instructed control of emotions, in the same way as we can distinguish between making up our own mind about what to do, versus following instructions."

Regulating emotions is part of our daily life, and is important for our mental health. For example, many people have to conquer fear of speaking in public, while some professionals such as health-care workers and firemen have to maintain an emotional distance from unpleasant or distressing scenes that occur in their jobs.

Professor Patrick Haggard (UCL Institute of Cognitive Neuroscience) co-author of the paper said the brain mechanism identified in this study could be a potential target for therapies.

"The ability to manage one’s own emotions is affected in many mental health conditions, so identifying this mechanism opens interesting possibilities for future research.

"Most studies of emotion processing in the brain simply assume that people passively receive emotional stimuli, and automatically feel the corresponding emotion. In contrast, the area we have identified may contribute to some individuals’ ability to rise above particular emotional situations.

"This kind of self-control mechanism may have positive aspects, for example making people less vulnerable to excessive emotion. But altered function of this brain area could also potentially lead to difficulties in responding appropriately to emotional situations."

(Source: eurekalert.org)

Filed under brain activity emotional response fMRI negative emotions psychology neuroscience science

52 notes

Imaging Technique Could Help Traumatic Brain Injury Patients
A new application of an existing medical imaging technology could help predict long-term damage in patients with traumatic brain injury, according to a recent UC San Francisco study.
The authors of the study analyzed brain scans using rapid automated resting-state magnetoencephalography (MEG) imaging, a technique used to map brain activity by recording magnetic fields produced by natural electrical currents in the brain. They discovered that “abnormally decreased functional connectivity” – or possible long-term brain damage – could persist years after a person suffers even a mild form of traumatic brain injury.
“We were hoping that areas of abnormal brain activity would match up with some of the functional measures such as patients’ symptoms after injury, and we saw such correlation,” said senior author Pratik Mukherjee, MD, PhD, associate professor in residence at the UCSF School of Medicine.
In a study published on April 19 in the Journal of Neurosurgery, UCSF researchers analyzed brain connectivity data on 14 male and seven female patients, whose median age was 29. Brain connectivity refers to a pattern of causal interactions between specific parts within a nervous system. Eleven patients had mild, one had moderate, and three had severe forms of traumatic brain injury. Six patients suffered no brain injury.
“Once we have connectivity information, we can create a template of what it looks like in a normal subject. When we have subjects that have had head injuries, we can compare their connectivity pattern to that of the normal subjects with an automated computer algorithm,” Mukherjee said. “And that will automatically detect areas of abnormally low and abnormally high connectivity compared to the normal database.” 
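The template comparison Mukherjee describes can be illustrated with a simple z-score test against a normative database; this is only a sketch of the general idea, not the study's software, and the function name, data layout, and threshold below are all assumptions.

```python
import numpy as np

# Hypothetical sketch: build a normative connectivity template from
# healthy controls, then flag regions where a patient's functional
# connectivity deviates strongly from it (abnormally low or high).

def flag_abnormal_connectivity(patient, controls, z_thresh=2.0):
    """patient:  1-D array of per-region connectivity values.
    controls: 2-D array (control subjects x regions).
    Returns boolean masks of abnormally low and high regions."""
    template = controls.mean(axis=0)           # normative template
    spread = controls.std(axis=0, ddof=1)      # normal variability
    z = (patient - template) / spread          # deviation per region
    return z < -z_thresh, z > z_thresh

# Toy data: 6 healthy controls, 5 brain regions
rng = np.random.default_rng(0)
controls = rng.normal(0.5, 0.05, size=(6, 5))
patient = controls.mean(axis=0).copy()
patient[2] -= 0.5                              # one region far below normal
low, high = flag_abnormal_connectivity(patient, controls)
# low[2] is True; every other region falls within the normal range
```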
MEG imaging provides much richer information than a typical magnetic resonance imaging (MRI) scan, which uses magnetic fields and radio wave energy to give a static image of the brain or other internal structures of the body.
“If you scan someone a couple months after the trauma with an MRI, and you scan them again a couple of years after the trauma, it’s going to look the same,” Mukherjee said. “With MEG, we can characterize these systems in much more fine-grained detail. It produces the most detailed activity mapping of the brain.”
Although MEG signals were first measured in 1968, the technology has not been widely used for patients with traumatic brain injury until recently. 
“It takes a minute or two to complete an MEG scan and it automatically detects the areas of abnormality using a computer algorithm,” Mukherjee said. “And it seems to be fairly sensitive because it’s showing us areas of abnormality even in people where MRIs missed some abnormalities.”
Every year approximately 1.7 million people in the United States suffer from traumatic brain injury, which costs the U.S. health care system an estimated $60 billion according to the U.S. Centers for Disease Control and Prevention. The most common forms of traumatic brain injury are suffered by athletes, members of the military, and those involved in motor vehicle collisions or occupational injuries.
“This is a preliminary study testing a new technique with a small sample, which makes it difficult to have enough statistical power to make such correlations,” Mukherjee said. “But I think this is an important step in our quest to help people suffering from traumatic brain injuries.”

Filed under TBI MEG imaging brain injury brain damage brain activity neuroscience science

42 notes

Researchers identify how cells control calcium influx

When brain cells are overwhelmed by an influx of too many calcium molecules, they shut down the channels through which these molecules enter the cells. Until now, the “stop” signal mechanism that cells use to control the molecular traffic was unknown.

In the new issue of the journal Neuron, UC Davis Health System scientists report that they have identified the mechanism. Their findings are relevant to understanding the molecular causes of the disruption of brain functioning that occurs in stroke and other neurological disorders.

"Too much calcium influx clearly is part of the neuronal dysfunction in Alzheimer’s disease and causes the neuronal damage during and after a stroke. It also contributes to chronic pain," said Johannes W. Hell, professor of pharmacology at UC Davis. Hell headed the research team that identified the mechanism that stops the flow of calcium molecules, which are also called ions, into the specialized brain cells known as neurons.

Hell explained that each day millions of molecules of calcium enter and exit each of the 100 billion neurons of the human brain. These calcium ions move in and out of neurons through pore-like structures, known as channels, that are located in the outer surface, or “skin,” of each cell.

The flow of calcium ions into brain cells generates the electrical impulses needed to stimulate such actions as the movement of muscles in our legs and the creation of new memories in the brain. The movement of calcium ions also plays a role in gene expression and affects the flexibility of the structures, called synapses, that are located between neurons and transmit electrical or chemical signals of various strengths from one cell to a second cell.

Neurons employ an unexpected and highly complex mechanism to down-regulate, or reduce, the activity of channels that are permitting too many calcium ions to enter neurons, Hell and his colleagues discovered. The mechanism, which leads to the elimination of the overly permissive ion channel, employs two proteins: α-actinin and the calcium-binding messenger protein calmodulin.

Located on the neuron’s outer surface, referred to as the plasma membrane, α-actinin stabilizes the type of ion channels that constitute a major source of calcium ion influx into brain cells, Hell explained. This protein is a component of the cytoskeleton, the scaffolding of cells. The ion channels that are a major source of calcium ions are referred to as Cav1.2 (L type voltage-dependent calcium channels).

The researchers also found that the calcium-binding messenger protein calmodulin, which is the cell’s main sensor for calcium ions, induces internalization, or endocytosis, of Cav1.2 to remove this channel from the cell surface, thus providing an important negative feedback mechanism for excessive calcium ion influx into a neuron, Hell explained.

The discovery that α-actinin and calmodulin play a role in controlling calcium ion influx expands upon Hell’s previous research on the molecular mechanisms that regulate the activity of various ion channels at the synapse.

One previous study proved relevant to understanding the biological mechanisms that underlie the body’s fight-or-flight response during stress.

In work published in the journal Science in 2001, Hell and colleagues reported that the regulation of Cav1.2 by adrenergic signaling during stress is performed by one of the adrenergic receptors (beta 2 adrenergic receptor) directly linked to Cav1.2.

"This protein-protein interaction ensures that the adrenergic regulation is fast, efficient and precisely targets this channel," Hell said.

"We showed that Cav1.2 is regulated by adrenergic signaling on a time scale of a few seconds, and this is mainly increasing its activity when needed, for example during danger, to make our brain work faster and better. The same channel is in the heart, where adrenergic stimulation increases channel/Ca influx activity, increasing the pacing and strength of our heart beat to meet the increased physical demands during danger."

(Source: universityofcalifornia.edu)

Filed under calcium influx calcium ions synapses neurons neuronal damage chronic pain neuroscience science

54 notes

Advance in tuberous sclerosis brain science
By manipulating the timing of disease-causing mutations in the brains of developing mice, Brown University researchers have found that early genetic deletions in the thalamus may play an important role in the course and severity of the developmental disease tuberous sclerosis complex. Findings appear in the journal Neuron.
Doctors often diagnose tuberous sclerosis complex (TSC) based on the abnormal growths the genetic disease causes in organs around the body. Those overt anatomical structures, however, belie the microscopic and mysterious neurological differences behind the disease’s troublesome behavioral symptoms: autism, intellectual disabilities, and seizures. In a new study in mice, Brown University researchers highlight a role for a brain region called the thalamus and show that the timing of gene mutation during thalamus development makes a huge difference in the severity of the disease.
TSC can arise in humans and mice alike when both alleles (the one from mom and the one from dad) of the TSC1 gene are deleted. One bad gene is often inherited and the other accumulates a mutation some time during embryonic development. This happens to one in 6,000 people.
“We don’t know when during development the mutations are occurring in the patients,” said Elizabeth Normand, a Brown neuroscience graduate student and lead author of the paper in the journal Neuron. “That’s why we chose to look at the timing. It can give us some insight into the role of genes during embryonic development.”
Normand and adviser Mark Zervas, assistant professor of biology, not only wanted to assess the timing but also to probe the role the thalamus might have in contributing to the neurological symptoms of the disease. To do both, their team genetically engineered a clever mouse model in which they could, with a dose of the drug tamoxifen, delete both alleles exclusively in thalamus neurons at the developmental stage of their choosing.
Their interest in the thalamus comes from its role in forging strong but intricate links to the cortex, which is where most other TSC researchers have focused. As for timing, they tested the effect of controlling allele deletions on day 12 of gestation in some mice and day 18 (just before birth) in others. Still other mice were left healthy as experimental controls.
Significant symptoms
Overall, the researchers found they could indeed generate TSC-like behavioral symptoms in the mice, such as seizures, by deleting TSC1 alleles in developing cells of the thalamus. They also found that the timing of the deletion mattered tremendously to the extent of the disease in the brain, the degree of abnormality, and the severity of TSC-like symptoms.
The mice whose alleles were deleted on embryonic day 12 fared much worse behaviorally than the mice whose alleles were deleted on embryonic day 18.
At two months of age, the mice with the embryonic day 12 deletion exhibited excessive self-grooming to the point where they experienced lesions. Among those mice, 10 of 11 experienced seizures at an average rate of more than three per hour.
The mice with the embryonic day 18 deletion, on the other hand, fared better without any over-grooming. By eight months of age, however, four of 17 of the mice did exhibit rare seizures.
These behavioral differences traced to differences in the way the mice’s brains became wired. A comparison of brain tissue from adult mice — some of which had the early TSC1 deletions and some of which didn’t — revealed differences in the connections between the thalamus and the cortex and in the electrical and physical properties of thalamus cells.
“We’re building off the core idea of the thalamus playing an important role in brain function and showing that if you disrupt the way that the thalamic neurons develop that you can get some of these behavioral consequences such as overgrooming or seizures,” said Zervas, who is affiliated with the Brown Institute for Brain Science.
The extent of mutant neurons was much greater in the mice with the embryonic day 12 versus day 18 mutations. In embryonic day 12 deleted mice, for example, the deletion disrupted the growth-regulating “mTOR” pathway in 70 percent of neurons versus only 29 percent of neurons in the embryonic day 18 deleted mice. The disruptions occurred in more areas of the thalamus in embryonic day 12 than in day 18 mice as well. The overactivity of mTOR in TSC is what produces the unusual growths around the body, though these new findings indicate additional roles for the mTOR pathway in brain development and function, Zervas said.
In future work, the team plans to study the effects of deleting the TSC1 allele at other days during development as well as to understand whether there is a threshold of mutant neurons with mTOR disruption at which TSC-like symptoms begin to emerge.

Filed under embryonic development gene mutation animal model tuberous sclerosis complex neuroscience science

36 notes

Researchers discover a missing link in signals contributing to neurodegeneration

In many neurodegenerative diseases, the neurons of the brain are over-stimulated, and this leads to their destruction. After many failed attempts and much scepticism, this process was finally shown last year to be a possible basis for treatment in some patients with stroke. But very few targets for drugs to block this process are known.

In a new highly detailed study, researchers have discovered a previously missing link between over-stimulation and destruction of brain tissue, and shown that this might be a target for future drugs. This research, led by the A. I. Virtanen Institute at the University of Eastern Finland in collaboration with scientists from Lausanne University Hospital, University of Lausanne and the company Xigen Pharma AG, was published in the Journal of Neuroscience. Research was funded mainly by the Academy of Finland.

What is this missing link? We have known for years that over-stimulated neurons produce nitric oxide molecules. Although this can activate a signal for destruction of cells, the small amount of nitric oxide produced cannot alone explain the damage to the brain. The team now show that a protein called NOS1AP links the nitric oxide that is produced to the damage that results. NOS1AP binds an initiator of cell destruction called MKK3 and also moves within the cell to the source of nitric oxide when cells are over-activated. The location of these proteins in cells causes them to convert the over-stimulation signal into a cell destruction response. The team designed a chemical that prevents NOS1AP from binding the source of nitric oxide. This reduces the cell destruction response in cells of the brain and, as a result, limits brain lesions in rodents.

Other funders are the European Union and the University of Eastern Finland. Researchers used the recently developed high-throughput imaging facilities at the A. I. Virtanen Institute. The researchers hope that continuation of their work could lead to improved treatments for diseases such as stroke, epilepsy and chronic conditions like Alzheimer’s disease. As NOS1AP is associated with schizophrenia, diabetes and sudden cardiac death, future research in this area may assist the treatment of a wider range of diseases.

(Source: aka.fi)

Filed under neurodegenerative diseases brain tissue cell destruction nitric oxide molecules neuroscience science

83 notes

Scientists show how nerve wiring self-destructs

Many medical issues affect nerves, from injuries in car accidents and side effects of chemotherapy to glaucoma and multiple sclerosis. The common theme in these scenarios is destruction of nerve axons, the long wires that transmit signals to other parts of the body, allowing movement, sight and sense of touch, among other vital functions.

Now, researchers at Washington University School of Medicine in St. Louis have found a way the body can remove injured axons, identifying a potential target for new drugs that could prevent the inappropriate loss of axons and maintain nerve function.

“Treating axonal degeneration could potentially help a lot of patients because there are so many diseases and conditions where axons are inappropriately lost,” says Aaron DiAntonio, MD, PhD, professor of developmental biology. “While this would not be a cure for any of them, the hope is that we could slow the progression of a whole range of diseases by keeping axons healthy.”

DiAntonio is senior author of the study that appears online May 9 in the journal Cell Reports.

While axonal degeneration appears to be a major culprit in diseases like multiple sclerosis, it also paradoxically plays an important role in properly wiring the nervous systems of developing embryos.

“When an embryo is building its nervous system, there can be inappropriate or excessive axonal sprouts, or axons that are only needed at one time in development and not later,” DiAntonio says. “These axons degenerate, and that’s very important for wiring the nervous system. And in adult organisms, it might be useful to have a clean and quick way to remove a damaged axon from a healthy nerve, instead of letting it decay and potentially damage its neighboring axons.”

DiAntonio compares the process to programmed cell death, or apoptosis, which is also important in embryonic development. Apoptosis culls unnecessary or damaged cells from the body. If cell death programs become overactive, they can kill healthy cells that should remain. And if apoptosis fails to destroy damaged cells in adults, it can lead to cancer.

The new discovery also underscores the relatively recent understanding that loss of axons is not a passive decay process resulting from injury. Just as apoptosis actively destroys cells, axonal degeneration results from a cellular program that actively removes the damaged axon. In certain diseases, the program may be inappropriately triggered.

“We want to understand axonal degeneration at the same level that we understand programmed cell death, in the hopes of developing drugs to block the process when it becomes overactive,” DiAntonio says.

DiAntonio’s major collaborators in this project include Jeffrey D. Milbrandt, MD, PhD, the James S. McDonnell Professor and head of the Department of Genetics, and first author Elisabetta Babetto, PhD, postdoctoral research scholar.

Studying mice, the researchers found that a gene called Phr1 plays a major role in governing the self-destruction of injured axons. When they removed Phr1 from adult mice, the severed portion of the axons remained intact for much longer than in genetically normal mice.

In the normal mice, a severed axon degenerated entirely after two days. In mice without Phr1, they found that about 75 percent of the severed axons remained at five days, with a quarter persisting at least 10 days after being cut. The mice showed no side effects and suffered no obvious problems due to the missing Phr1.

The findings raise the possibility that blocking the Phr1 protein with a drug could keep damaged axons alive and functional when the body would normally cause the axons to self-destruct.

DiAntonio emphasizes that he is not trying to save axons that have no connection to the rest of the nerve. The paradigm is simply a good way to model nerve injury. In many instances, such as a crush injury or disease processes in which the axon is not severed, blocking the Phr1 protein could potentially preserve an attached axon that would otherwise self-destruct.

Importantly, the research team also looked at optic nerves of the central nervous system, which are damaged in glaucoma, and found similar protective effects from the loss of Phr1.

“This is not the first gene identified whose loss protects mammalian axons from degeneration,” DiAntonio says. “But it is the first one that shows evidence of working in the central nervous system. So it could be important in conditions like glaucoma, multiple sclerosis and other neurodegenerative diseases where the central nervous system is the primary problem.”

DiAntonio also points out possible ways to help cancer patients. Many chemotherapy drugs cause damage to peripheral axons, which may limit the doses a patient can tolerate.

As part of the new study, the researchers showed that intact axons without Phr1 were protected from the damage caused by vincristine, a chemotherapy drug used to treat leukemia, neuroblastoma, Hodgkin’s disease and non-Hodgkin’s lymphoma, among other cancers.

“In this case, the loss of axons is not caused by disease,” DiAntonio says. “It’s caused by the drug doctors are giving. You know the date it will start. You know the date it will stop. This is probably where I am most optimistic that we could make an impact.”

(Source: news.wustl.edu)

Filed under nerve axons axonal degeneration nervous system apoptosis genes neuroscience science
