Neuroscience

Articles and news from the latest research reports.


Research Reveals Possible Reason for Cholesterol-Drug Side Effects
The U.S. Food and Drug Administration and physicians continue to document that some patients experience fuzzy thinking and memory loss while taking statins, a class of global top-selling cholesterol-lowering drugs. 
A University of Arizona research team has made a novel discovery in brain cells being treated with statin drugs: unusual swellings within neurons, which the team has termed the “beads-on-a-string” effect.
The team is not entirely sure why the beads form, said UA neuroscientist Linda L. Restifo, who leads the investigation. However, the team believes that further investigation of the beads will help inform why some people experience cognitive declines while taking statins.
"What we think we’ve found is a laboratory demonstration of a problem in the neuron that is a more severe version of what is happening in some people’s brains when they take statins," said Restifo, a UA professor of neuroscience, neurology and cellular and molecular medicine, and principal investigator on the project.
The team’s co-authored study was recently published in Disease Models & Mechanisms, a peer-reviewed journal. Robert Kraft, a former research associate in the department of neuroscience, is lead author on the article.
Restifo and Kraft cite clinical reports noting that statin users often are told by physicians that cognitive disturbances experienced while taking the drugs are likely due to aging or other causes. However, the UA team’s research offers additional evidence that such cognitive declines are likely a negative response to statins.
The team also has found that removing statins results in the disappearance of the beads-on-a-string and a restoration of normal growth.
With research continuing, the UA team intends to investigate how genetics may be involved in bead formation and, thus, could cause hypersensitivity to the drugs in some people. Team members believe the genetic differences could involve the neurons directly, or the statins’ interaction with the blood-brain barrier.
"This is a great first step on the road toward more personalized medication and therapy," said David M. Labiner, who heads the UA department of neurology. "If we can figure out a way to identify patients who will have certain side effects, we can improve therapeutic outcomes."
For now, the UA team has multiple external grants pending, and researchers hope that future research will greatly inform the medical community and patients.
"If we are able to do genetic studies, the goal will be to come up with a predictive test so that a patient with high cholesterol could be tested first to determine whether they have a sensitivity to statins," Restifo said.
Detecting, Understanding a Drug’s Side Effects
Restifo used the analogy of traffic to explain what she and her colleagues theorize. 
The beads indicate a sort of traffic jam, she explained. In the presence of statins, neurons undergo a “dramatic change in their morphology,” said Restifo, also a BIO5 Institute member.
"Those very, very dramatic and obvious swellings are inside the neurons and act like a traffic pileup that is so bad that it disrupts the function of the neurons," she said.
It was Kraft’s observations that led to the team’s novel discovery.
Restifo, Kraft and their colleagues had long been investigating mutations in genes, largely for the benefit of advancing discoveries toward the improved treatment of autism and other cognitive disorders.
At the time, and using a blind-screened library of 1,040 drug compounds, the team ran tests on fruit fly neurons, investigating the reduction of defects caused by a mutation when neurons were exposed to different drugs.
The team had shown that one mutation caused the neuron branches to be curly instead of straight, but certain drugs corrected this. The research findings were published in 2006 in the Journal of Neuroscience.
Then, something serendipitous occurred: Kraft observed that one compound, then another and then two more all created the same reaction – “these bulges, which we called ‘beads-on-a-string,’” Kraft said. “And they were the only drugs causing this effect.”
At the end of the earlier investigation, the team decoded the library and found that the four compounds that resulted in the beads-on-a-string were, in fact, statins.
"The ‘beads’ effect of the statins was like a bonus prize from the earlier experiment," Restifo said. "It was so striking, we couldn’t ignore it."
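The blinded-screen logic described above – score each coded compound for a phenotype, then decode the library afterward to learn the compounds’ identities – can be sketched in a few lines. This is a minimal illustration only: the compound IDs, the decoding key, and the particular drug names are invented for the example, not taken from the team’s actual 1,040-compound library.

```python
# Hypothetical sketch of a blinded compound screen. All IDs and the
# decoding key are invented for illustration.
phenotypes = {
    "C0412": "beads-on-a-string",
    "C0587": "beads-on-a-string",
    "C0731": "normal",
    "C0903": "beads-on-a-string",
    "C0991": "beads-on-a-string",
}

# During the screen, experimenters see only the blinded compound IDs.
bead_formers = {cid for cid, p in phenotypes.items() if p == "beads-on-a-string"}

# Decoding the library at the end reveals what each compound actually was.
library_key = {
    "C0412": "lovastatin",
    "C0587": "simvastatin",
    "C0731": "aspirin",
    "C0903": "fluvastatin",
    "C0991": "atorvastatin",
}
decoded = sorted(library_key[cid] for cid in bead_formers)
```

The point of the blinding is visible in the structure: the phenotype scoring never touches the compound names, so the "all four bead-formers turned out to be statins" result could not have been biased by expectation.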
In addition to detecting the beads effect, the team came upon yet another major finding: when statins are removed, the beads-on-a-string effect disappears, offering great promise to those being treated with the drugs.
"For some patients, just as much as statins work to save their lives, they can cause impairments," said Monica Chuang, a member of the team and a UA undergraduate researcher studying molecular and cellular biology and physiology.
"It’s not one-drug-fits-all," said Chuang, a UA junior who is also in the Honors College. "We suspect different gene mutations alter how people respond to statins."
Having been trained by Kraft in techniques to investigate cultured neurons, Chuang was testing gene mutations and found variation in sensitivity to statins. It was through the work of Chuang and Kraft that the team would later determine that, after removing the statins, the cells were able to repair themselves; the neurotoxicity was not permanent, Restifo said.
"In the clinical literature, you can read reports on fuzzy thinking, which stops when a patient stops taking statins. So, that was a very important demonstration of a parallel between the clinical reports and the laboratory phenomena," Restifo said.
The finding led the team to further investigate the neurotoxicity of statins.
"There is no question that these are very important and very useful drugs," Restifo said. Statins have been shown to lower cholesterol and prevent heart attacks and strokes.
But too much remains unknown about how the drugs’ effects may contribute to muscular, cognitive and behavioral changes.
"We don’t know the implications of the beads, but we have a number of hypotheses to test," Restifo said, adding that further studies should reveal exactly what happens when the transportation system within neurons is disrupted.
Also, given the move toward prescribing statins to children, an expanded understanding of the effects of statins on cognitive development is critical, Kraft said.
"If statins have an effect on how the nervous system matures, that could be devastating," Kraft said. "Memory loss or any sort of disruption of your memory and cognition can have quite severe effects and negative consequences."
Restifo and her colleagues have multiple grants pending that would enable the team to continue investigating several facets of statin neurotoxicity. Among the major questions: To what extent does genetics contribute to a person’s sensitivity to statins?
"We have no idea who is at risk. That makes us think that we can use this genetic laboratory assay to infer which of the genes make people susceptible," Restifo said.
"This dramatic change in the morphology of the neurons is something we can now use to ask questions and experiment in the laboratory," she said. "Our contribution is to find a way to ask about genetics and what the genetic vulnerability factors are."
The Possibility for Future Research, Advice
The team’s findings and future research could have important implications for the medical field and for patients with regard to treatment, communication and improved personalized medicine.
"It’s important to look into this to see if people may have some sort of predisposition to the beads effect, and that’s where we want to go with this research," Kraft said. "There must be more research into what effects these drugs have other than just controlling a person’s elevated cholesterol levels."
And even as additional research is ongoing, suggestions already exist for physicians, patients and families.
"Most physicians assume that if a patient doesn’t report side effects, there are no side effects," Labiner said.
"The paternalistic days of medication are hopefully behind us. They should be," Labiner said.
"We can treat lots of things, but the problem is if there are side effects that worsen the treatment, the patient is more likely to shy away from the medication. That’s a bad outcome," he said. "There’s got to be a give and take between the patient and physician."
Patients should feel empowered to ask questions, and deeper questions, about their health and treatment, and physicians should be very attentive to any reports of cognitive decline from patients on statins, she said.
For some, symptoms appear soon after starting statins; for others, it takes time. And the signs vary. People may begin losing track of dates, the time or their keys.
"These are not trivial things. This could have a significant impact on your daily life, your interpersonal relationships, your ability to hold a job," Restifo said.
"This is the part of the brain that allows us to think clearly, to plan, to hold onto memories," she said. "If people are concerned that they are having this problem, patients should ask their physicians."
Restifo said open and direct patient-physician communication is even more important for statin users with a family history of side effects from the drugs.
Also, physicians could work more closely with patients to investigate family history and determine a better dosage plan. Even placing additional questions on the family history questionnaire could be useful, she said.
"There is good clinical data that every-other-day dosing gives you most of the benefits, and maybe even prevents some of the accumulation of things that result in side effects," Restifo said, suggesting that physicians should try to get a better longitudinal picture of how people react while on statins.
"Statins have been around now for long enough and are widely prescribed to so many people," she said. "But increased awareness could be very helpful."


Colour a constant throughout ageing
Visionary study: Age may dim our eyes, but our brains make sure aspects of the rich world of colour experience defy the passing of time, a UK scientist has found.
It’s well known that our colour vision declines with age. Gradual yellowing of the lenses cuts out light in the blue range of the spectrum, while colour-sensing cone receptors on our retinas slowly lose sensitivity.
"Our ability to discriminate small colour differences declines as we age, there is no doubt about that," says neuroscientist Sophie Wuerger from the Department of Psychological Sciences, University of Liverpool.
But she has found our brains apparently compensate for at least some of these physical frailties. Her results are published online this week in the journal PLoS One.
Wuerger explored the colour perception of 185 people aged between 18 and 75 years with normal colour vision, an unusually large and diverse group for a study of this kind.
First, she used well-known data on how the lens changes with age to predict the light signal that would be sent to the brain by the volunteers’ retinas.
She then asked the participants to undertake a variety of tests that required them to select patches of colour representing pure red, green, yellow, or blue, under different lighting conditions.
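The prediction step can be pictured with a toy model. The sketch below is NOT Wuerger’s actual lens model: it simply assumes the lens’s extra optical density for short (blue) wavelengths grows linearly with age, with invented constants, to show how an aged lens would change the signal reaching the retina.

```python
# Toy model of age-related lens yellowing; functional form and the 0.004
# constant are invented for illustration, not Wuerger's lens-density data.
def lens_transmittance(wavelength_nm, age_years):
    """Fraction of light at this wavelength that passes the aged lens."""
    if wavelength_nm >= 550:
        return 1.0  # assume no age-related loss at long wavelengths
    extra_density = (550 - wavelength_nm) / 550 * 0.004 * age_years
    return 10 ** (-extra_density)

def retinal_signal(spectrum, age_years):
    """Attenuate each wavelength of the incoming light by the lens."""
    return {wl: power * lens_transmittance(wl, age_years)
            for wl, power in spectrum.items()}

# Flat test light: equal power at a blue, a green-yellow and a red wavelength.
flat_light = {450: 1.0, 550: 1.0, 650: 1.0}
young_eye = retinal_signal(flat_light, 18)
old_eye = retinal_signal(flat_light, 75)
```

Under any such model, the blue end of the predicted retinal signal falls steadily with age while long wavelengths are untouched – which is exactly what makes the near-constant colour perception reported below surprising.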
Constant perception
The idea was to compare the predicted physiological changes in the eye with the participants’ actual experience of colours.
"That’s the surprising bit. If you look just at the lens, it should introduce significant colour changes in older people, but we observed that … most of the time we have a very constant perception and it doesn’t change with age," says Wuerger.
The only age-related effects detected in the study were small changes that became apparent for green hues viewed under daylight.
In other words, although the colour signal being sent from the eye was changing significantly with age, the perception of colour was almost constant regardless of how old the study subject was.
This suggests that somewhere between the retina and the conscious perception of colour, the brain must recalibrate itself, she says.
"Something must be happening to change neural connections to maintain constant colour appearance," Wuerger says.
External standard
Exactly how this happens was not part of this study, but Wuerger offers one possible explanation.
"You could think our brain might be using some external standard like the blue sky or sunlight as a reference. There are things in the environment that don’t change and we could use them to recalibrate our visual system."
One useful clue about the mechanisms involved came from the fact that age did not affect all aspects of the visual system equally. While 18-year-olds and 75-year-olds were equally good at picking pure red or green and so on, older people were less able to distinguish between subtly different colours, particularly in the bluish range.
Because the recalibration doesn’t affect all our colour vision abilities, Wuerger concludes the adjustment isn’t likely to be taking place in the retina.
"I think that suggests that it must be happening later in the visual processing pathway, closer to the brain. We don’t have any proof of that but the experiments taken together suggest it’s … a kind of plasticity in the adult brain."
The next question might be why the brain performs this recalibration. What benefit is there in ensuring our perception of colours remains constant? For now, answering that question requires entering the realm of speculation.
Perhaps it has to do with a need to communicate colours effectively when describing objects, Wuerger ventures. “After all, to communicate colour meaningfully,” she says with a chuckle, “we all need to be - so to speak - on the same wavelength.”


Cancer Drug Prevents Build-up of Toxic Brain Protein

Researchers at Georgetown University Medical Center have used tiny doses of a leukemia drug to halt accumulation of toxic proteins linked to Parkinson’s disease in the brains of mice. This finding provides the basis to plan a clinical trial in humans to study the effects.


They say their study, published online May 10 in Human Molecular Genetics, offers a unique and exciting strategy for treating neurodegenerative diseases that feature an abnormal buildup of proteins, including Parkinson’s disease, Alzheimer’s disease, amyotrophic lateral sclerosis (ALS), frontotemporal dementia, Huntington’s disease and Lewy body dementia, among others.

“This drug, in very low doses, turns on the garbage disposal machinery inside neurons to clear toxic proteins from the cell. By clearing intracellular proteins, the drug prevents their accumulation in pathological inclusions called Lewy bodies and/or tangles, and also prevents amyloid secretion into the extracellular space between neurons, so proteins do not form toxic clumps or plaques in the brain,” says the study’s senior investigator, neuroscientist Charbel E-H Moussa, MB, PhD. Moussa heads the laboratory of dementia and Parkinsonism at Georgetown.

When the drug, nilotinib, is used to treat chronic myelogenous leukemia (CML), it forces cancer cells into autophagy — a biological process that leads to death of tumor cells in cancer.

“The doses used to treat CML are high enough that the drug pushes cells to chew up their own internal organelles, causing self-cannibalization and cell death,” Moussa says. “We reasoned that small doses — for these mice, an equivalent to one percent of the dose used in humans — would turn on just enough autophagy in neurons that the cells would clear malfunctioning proteins, and nothing else.”

Moussa, who has long sought a way to force neurons to clean up their garbage, came up with the idea of using cancer drugs that push autophagy in tumors to help diseased brains. “No one has tried anything like this before,” he says.

Moussa, and his two co-authors — graduate student Michaeline Hebron and Irina Lonskaya, PhD, a postdoctoral researcher in Moussa’s lab — searched for cancer drugs that can cross the blood-brain barrier. They discovered two candidates — nilotinib and bosutinib, which is also approved to treat CML. This study discusses experiments with nilotinib, but Moussa says that use of bosutinib is also beneficial.  

The mice used in this study over-express alpha-synuclein, the protein that builds up in Lewy bodies in Parkinson’s disease and dementia patients and which is found in many other neurodegenerative diseases. The animals were given one milligram of nilotinib every two days. (By contrast, the FDA approved use of up to 1,000 milligrams of nilotinib once a day for CML patients.)

“We successfully tested this in several disease models that have an accumulation of intracellular protein,” Moussa says. “It gets rid of alpha-synuclein and tau in a number of movement disorders, such as Parkinson’s disease as well as Lewy body dementia.”

The team also showed that movement and functionality in the treated mice was greatly improved, compared with untreated mice.

For such a therapy to be as successful as possible in patients, the agent would need to be used early in the course of a neurodegenerative disease, Moussa hypothesizes. Later use might still retard further extracellular plaque formation and the accumulation of intracellular proteins in inclusions such as Lewy bodies.

Moussa is planning a phase II clinical trial in participants who have been diagnosed with disorders that feature a build-up of alpha-synuclein, including Lewy body dementia, Parkinson’s disease, progressive supranuclear palsy (PSP) and multiple system atrophy (MSA).

(Source: explore.georgetown.edu)

Filed under neurodegenerative diseases parkinson's disease nilotinib chronic myelogenous leukemia neurology neuroscience science

102 notes

Children of addicted parents more likely to be depressed as adults
Children of parents who were addicted to drugs or alcohol are more likely to be depressed in adulthood, according to a new study by University of Toronto researchers.
“These findings underscore the intergenerational consequences of drug and alcohol addiction and reinforce the need to develop interventions that support healthy childhood development,” said the study’s lead author, Esme Fuller-Thomson, professor and Sandra Rotman Endowed Chair in the University of Toronto’s Factor-Inwentash Faculty of Social Work and the Department of Family and Community Medicine.
In a paper published online in the journal Psychiatry Research this month, investigators examined the association between parental addictions and adult depression in a representative sample of 6,268 adults, drawn from the 2005 Canadian Community Health Survey.
Of these respondents, 312 had had a major depressive episode in the year preceding the survey, and 877 reported that, while they were under the age of 18 and still living at home, at least one parent drank or used drugs “so often that it caused problems for the family.”
Results indicate that individuals whose parents were addicted to drugs or alcohol are more likely to develop depression than their peers. After adjusting for age, sex and race, parental addictions were associated with more than twice the odds of adult depression, says Fuller-Thomson.
“Even after adjusting for factors ranging from childhood maltreatment and parental unemployment to adult health behaviours including smoking and alcohol consumption, we found that parental addictions were associated with 69 per cent higher odds of depression in adulthood,” explains Fuller-Thomson. The study was co-authored with four graduate students at the University of Toronto: Robyn Katz, Vi Phan, Jessica Liddycoat and Sarah Brennenstuhl.
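The “more than twice the odds” figure is an odds ratio from a logistic-regression analysis. As a minimal sketch of the underlying arithmetic, here is an unadjusted odds ratio computed from a 2×2 table; the counts are invented for illustration and are not the actual Canadian Community Health Survey figures:

```python
# Unadjusted odds ratio from a 2x2 exposure/outcome table.
# The counts below are hypothetical, NOT the study's data.

def odds_ratio(exposed_cases, exposed_controls,
               unexposed_cases, unexposed_controls):
    """OR = (a/b) / (c/d) for a 2x2 contingency table."""
    return (exposed_cases / exposed_controls) / (
        unexposed_cases / unexposed_controls)

# Hypothetical counts: adult depression among respondents with
# versus without a history of parental addiction.
or_unadjusted = odds_ratio(80, 797, 232, 5159)  # roughly 2.2
```

The study’s 69-per-cent figure is the same quantity after covariates (childhood maltreatment, parental unemployment, health behaviours) are added to the regression, which typically shrinks the ratio.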
This study could not determine the cause of the relationship between parental addictions and adult depression. Co-author Robyn Katz suggests that “It is possible that the prolonged and inescapable strain of parental addictions may permanently alter the way these children’s bodies react to stress throughout their life.
"One important avenue for future research is to investigate potential dysfunctions in cortisol production – the hormone that prepares us for ‘fight or flight’ – which may influence the later development of depression.”
“As an important first step, children who experience toxic stress at home can be greatly helped by the stable involvement of caring adults, including grandparents, teachers, coaches, neighbours and social workers,” said Fuller-Thomson. “Although more research is needed to determine if access to a responsive and loving adult decreases the likelihood of adult depression among children exposed to parental addictions, we do know that these caring relationships promote healthy development and buffer stress.”

Filed under parental addictions addiction depression adult depression psychology neuroscience science

122 notes

Brain diseases affecting more people and starting earlier than ever before
Professor Colin Pritchard’s latest research, published in Public Health Journal, has found that the sharp rise in dementia and other neurological deaths in people under 74 cannot be put down to the fact that we are living longer. The rise is because a higher proportion of old people are being affected by such conditions and, more alarmingly, because these conditions are starting earlier, affecting people under 55.
Of the 10 biggest Western countries, the USA had the worst increase in neurological deaths between 1979 and 2010, with rates up 66% in men and 92% in women. The UK was fourth highest, up 32% in men and 48% in women. In absolute terms, annual deaths rose from 4,500 to 6,500 in the UK, and from 14,500 to more than 28,500 in the USA.
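The raw-count rises are straightforward to check. (Note that the sex-specific percentages in the text are evidently computed on a different basis, such as death rates, so they need not match these count-based figures.)

```python
def pct_change(old, new):
    """Percentage change from an old count to a new one."""
    return 100.0 * (new - old) / old

# Annual neurological deaths quoted in the article.
uk_rise = pct_change(4_500, 6_500)     # roughly +44%
usa_rise = pct_change(14_500, 28_500)  # roughly +97%
```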
Professor Pritchard of Bournemouth University says: “These statistics are about real people and families, and we need to recognise that there is an ‘epidemic’ that clearly is influenced by environmental and societal changes.”
Tessa Gutteridge, Director YoungDementia UK says that our society needs to learn that dementia is increasingly affecting people from an earlier age: “The lives of an increasing number of families struggling with working-age dementia are made so much more challenging by services which fail to keep pace with their needs and a society which believes dementia to be an illness of old age.”
Bournemouth University researchers, Professor Colin Pritchard and Dr Andrew Mayers, along with the University of Southampton’s Professor David Baldwin show that there are rises in total neurological deaths, including the dementias, which are starting earlier, impacting upon patients, their families and health and social care services, exemplified by an 85% increase in UK Motor Neurone Disease deaths.
The research highlights an alarming ‘hidden epidemic’ of rising neurological deaths between 1979 and 2010 among adults under 74 in Western countries, especially the UK.
Total neurological deaths in both men and women rose significantly in 16 of the countries covered by the research, which is in sharp contrast to the major reductions in deaths from all other causes.
Over the period the UK has the third biggest neurological increase, up 32% in men and 48% in women, whilst women’s neurological deaths rose faster than men’s in most countries.
Professor Pritchard said, “These rises in neurological deaths, with the earlier onset of the dementias, are devastating for families and pose a considerable public health problem. It is NOT that we have more old people but rather more old people have more brain disease than ever before, including Alzheimer’s. For example there are two new British charities, The Young Parkinson’s Society and Young Dementia UK, which are a grass-roots response to these rises. The need for such charities would have been inconceivable a little more than 30 years ago.”
When asked what he thought caused the increases he replied,
“This has to be speculative but it cannot be genetic because the period is too short. Whilst there will be some influence of more elderly people, it does not account for the earlier onset, the differences between countries, nor the fact that more women have been affected, as their lives have changed more than men’s over the period; all of this indicates multiple environmental factors. Consider the changes over the last 30 years: the explosion in electronic devices and rises in background non-ionising radiation (PCs, microwaves, TVs, mobile phones); road and air transport up four-fold, increasing background petro-chemical pollution; chemical additives to food, and so on. There is no one factor; rather, the likely interaction between all these environmental triggers, reflecting changes in other conditions. For example, whilst cancer deaths are down substantially, cancer incidence continues to rise; levels of asthma are unprecedented; and the fall in male sperm counts and the rise of auto-immune diseases all point to life-style and environmental influences. These ‘statistics’ are about real people and families, and we need to recognise that there is an ‘epidemic’ that clearly is influenced by environmental and societal changes.”

Filed under brain diseases dementia alzheimer's disease health neuroscience science

804 notes

Brain implants: Restoring memory with a microchip
William Gibson’s popular science fiction tale “Johnny Mnemonic” foresaw sensitive information being carried by microchips in the brain by 2021. A team of American neuroscientists could be making this fantasy world a reality.
Their motivation is different but the outcome would be somewhat similar. Hailed as one of 2013’s top ten technological breakthroughs by MIT, the work by the University of Southern California, North Carolina’s Wake Forest University and other partners has actually spanned a decade.
But the U.S.-wide team now thinks that it will see a memory device being implanted in a small number of human volunteers within two years and available to patients in five to 10 years. They can’t quite contain their excitement.
"I never thought I’d see this in my lifetime," said Ted Berger, professor of biomedical engineering at the University of Southern California in Los Angeles. "I might not benefit from it myself but my kids will."
Rob Hampson, associate professor of physiology and pharmacology at Wake Forest University, agrees. “We keep pushing forward, every time I put an estimate on it, it gets shorter and shorter.”
The scientists — who bring varied skills to the table, including mathematical modeling and psychiatry — believe they have cracked how long-term memories are made, stored and retrieved and how to replicate this process in brains that are damaged, particularly by stroke or localized injury.
Berger said they record a memory being made, in an undamaged area of the brain, then use that data to predict what a damaged area “downstream” should be doing. Electrodes are then used to stimulate the damaged area to replicate the action of the undamaged cells.
They concentrate on the hippocampus — part of the cerebral cortex which sits deep in the brain — where short-term memories become long-term ones. Berger has looked at how electrical signals travel through neurons there to form those long-term memories and has used his expertise in mathematical modeling to mimic these movements using electronics.
Hampson, whose university has done much of the animal studies, adds: “We support and reinforce the signal in the hippocampus but we are moving forward with the idea that if you can study enough of the inputs and outputs to replace the function of the hippocampus, you can bypass the hippocampus.”
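The record–predict–stimulate loop described above can be caricatured as fitting an input–output map from upstream activity to the downstream response it should drive. The team’s actual system uses a nonlinear multi-input multi-output model; the linear least-squares toy below, with simulated firing rates, is only meant to show the shape of the idea:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: firing rates of 5 "upstream" (CA3-like) units across 200
# trials, plus the downstream (CA1-like) response they should drive.
upstream = rng.normal(size=(200, 5))
true_map = np.array([0.5, -1.0, 0.3, 0.0, 2.0])
downstream = upstream @ true_map + 0.01 * rng.normal(size=200)

# Fit the input-output map by least squares from the recorded data,
# then use it to predict what a damaged downstream region *should*
# be doing -- the prediction would drive the stimulating electrodes.
w, *_ = np.linalg.lstsq(upstream, downstream, rcond=None)
predicted = upstream @ w
```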
The team’s experiments on rats and monkeys have shown that certain brain functions can be replaced with signals via electrodes. You would think that the work of then creating an implant for people and getting such a thing approved would be a Herculean task, but think again.
For 15 years, people have been having brain implants to provide deep brain stimulation to treat epilepsy and Parkinson’s disease — a reported 80,000 people have now had such devices placed in their brains. So many of the hurdles have already been overcome — particularly the “yuck factor” and the fear factor.
"It’s now commonly accepted that humans will have electrodes put in them — it’s done for epilepsy, deep brain stimulation, (that has made it) easier for investigative research, it’s much more acceptable now than five to 10 years ago," Hampson says.
Much of the work that remains now is in shrinking down the electronics.
"Right now it’s not a device, it’s a fair amount of equipment," Hampson says. "We’re probably looking at devices in the five to 10 year range for human patients."
The ultimate goal in memory research would be to treat Alzheimer’s Disease but unlike in stroke or localized brain injury, Alzheimer’s tends to affect many parts of the brain, especially in its later stages, making these implants a less likely option any time soon.
Berger foresees a future, however, where drugs and implants could be used together to treat early dementia. Drugs could be used to enhance the action of cells that surround the most damaged areas, and the team’s memory implant could be used to replace a lot of the lost cells in the center of the damaged area. “I think the best strategy is going to involve both drugs and devices,” he says.
Unfortunately, the team found that its method can’t help patients with advanced dementia.
"When looking at a patient with mild memory loss, there’s probably enough residual signal to work with, but not when there’s significant memory loss," Hampson said.
Constantine Lyketsos, professor of psychiatry and behavioral sciences at Johns Hopkins Medicine in Baltimore, which is trialing a deep brain stimulation implant for Alzheimer’s patients, was a little skeptical of the other team’s claims.
"The brain has a lot of redundancy; it can function pretty well if it loses one or two parts. But memory involves circuits diffusely dispersed throughout the brain, so it’s hard to envision." However, he added that the approach was more likely to be successful in helping victims of stroke or localized brain injury, as indeed its makers are aiming to do.
The UK’s Alzheimer’s Society is cautiously optimistic.
"Finding ways to combat symptoms caused by changes in the brain is an ongoing battle for researchers. An implant like this one is an interesting avenue to explore," said Doug Brown, director of research and development.
Hampson says the team’s breakthrough is “like the difference between a cane, to help you walk, and a prosthetic limb — it’s two different approaches.”
It will still take time for many people to accept their findings and their claims, he says, but they don’t expect to have a shortage of volunteers stepping forward to try their implant — the project is partly funded by the U.S. military which is looking for help with battlefield injuries.
There are U.S. soldiers coming back from operations with brain trauma and a neurologist at DARPA (the Defense Advanced Research Projects Agency) is asking “what can you do for my boys?” Hampson says.
"That’s what it’s all about."

Filed under brain hippocampus memory memory device implants deep brain stimulation neuroscience science

118 notes

Sense of Touch Reproduced Through Prosthetic Hand

In a study recently published in IEEE Transactions on Neural Systems and Rehabilitation Engineering, neurobiologists at the University of Chicago show how an organism can sense a tactile stimulus, in real time, through an artificial sensor in a prosthetic hand.

Scientists have made tremendous advances toward building lifelike prosthetic limbs that move and function like the real thing. These are amazing accomplishments, but an important element to creating a realistic replacement for a hand is the sense of touch. Without somatosensory feedback from the fingertips about how hard you’re squeezing something or where it’s positioned relative to the hand, grasping an object is about as accurate as using one of those skill cranes to grab a stuffed animal at an arcade. Sure, you can do it, but you have to concentrate intently while watching every movement. You’re relying on your sense of vision to compensate for the lack of touch.

Sliman Bensmaia, assistant professor of organismal biology and anatomy at the University of Chicago, studies the neural basis of the sense of touch. Now, he and his colleagues are working with a robotic hand equipped with sensors that send electrical signals to electrodes implanted in the brain to recreate the same response to touch as a real hand.

Bensmaia spoke about how important the sense of touch is to creating a lifelike experience with a prosthetic limb.

“If you lose your somatosensory system it almost looks like your motor system is impaired,” he said. “If you really want to create an arm that can actually be used dexterously without the enormous amount of concentration it takes without sensory feedback, you need to restore the somatosensory feedback.”

The researchers performed a series of experiments with rhesus macaques that were trained to respond to stimulation of the hand. In one setting, they were gently poked on the hand with a physical probe at varying levels of pressure. In a second setting, some of the animals had electrodes implanted into the area of the brain that responds to touch. These animals were given electrical pulses to simulate the sensation of touch, and their hands were hidden so they wouldn’t see that they weren’t actually being touched.

Using data from the animals’ responses to each type of stimulus, the researchers were able to create a function, or equation, that described the requisite electrical pulse to go with each physical poke of the hand. Then, they repeated the experiments with a prosthetic hand that was wired to the brain implants. They touched the prosthetic hand with the physical probe, which in turn sent electrical signals to the brain.
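The calibration step — finding “the requisite electrical pulse to go with each physical poke” — amounts to fitting a function from probe pressure to stimulation parameters. A minimal linear-fit sketch with invented calibration numbers (the real mapping, units, and functional form are the researchers’ own and are not published here):

```python
import numpy as np

# Hypothetical calibration data: probe force (mN) versus the
# stimulation current (uA) that evoked an equivalent behavioural
# response. All values are invented for illustration.
force_mN = np.array([10.0, 20.0, 40.0, 80.0])
current_uA = np.array([22.0, 34.0, 58.0, 106.0])

# Fit current = a * force + b from the calibration trials.
a, b = np.polyfit(force_mN, current_uA, 1)

def touch_to_pulse(force):
    """Map a force sensed by the prosthetic hand to the stimulation
    current that should reproduce that touch in the brain."""
    return a * force + b

pulse = touch_to_pulse(50.0)  # current for a new, unseen touch
```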

Bensmaia said that the animals performed identically whether poked on their own hand or on the prosthetic one.

“This is the first time as far as I know where an animal or organism actually perceives a tactile stimulus through an artificial transducer,” Bensmaia said. “It’s an engineering milestone. But from a neuroengineering standpoint, this validates this function. You can use this function to have an animal perform this very precise task, precisely identically.”

The FDA is in the process of approving similar devices for human trials, and Bensmaia said he hopes such a system is implemented within the next year. Producing a lifelike sense of touch would go a long way toward improving the dexterity and performance of prosthetic hands, but he said it would also help bridge a mental divide for amputees or people who have lost the use of a limb. Until now, prosthetics and robotic arms feel more like tools than real replacements because they don’t produce the expected sensations.

“If every time you see your robotic arm touching something, you get a sensation that is projected to it, I think it’s very possible that in fact, you will consider this new thing as being part of your body,” he said.

(Source: newswise.com)

Filed under prosthetic limbs prosthetic hand artificial limbs tactile sensation somatosensory system neuroscience robotics science

134 notes

Study finds brain system for emotional self-control

Different brain areas are activated when we choose to suppress an emotion, compared to when we are instructed to inhibit an emotion, according to a new study from the UCL Institute of Cognitive Neuroscience and Ghent University.

In this study, published in Brain Structure and Function, the researchers scanned the brains of healthy participants and found that a key brain system — the dorso-medial prefrontal cortex — was activated when participants chose for themselves to suppress an emotion, an area the team had previously linked to deciding to inhibit movement.

"This result shows that emotional self-control involves a quite different brain system from simply being told how to respond emotionally," said lead author Dr Simone Kuhn (Ghent University).

In most previous studies, participants were instructed to feel or inhibit an emotional response. However, in everyday life we are rarely told to suppress our emotions, and usually have to decide ourselves whether to feel or control our emotions.

In this new study the researchers showed fifteen healthy women unpleasant or frightening pictures. The participants were given a choice to feel the emotion elicited by the image, or alternatively to inhibit the emotion, by distancing themselves through an act of self-control.

The researchers used functional magnetic resonance imaging (fMRI) to scan the brains of the participants. They compared this brain activity to another experiment where the participants were instructed to feel or inhibit their emotions, rather than choose for themselves.

Different parts of the brain were activated in the two situations. When participants decided for themselves to inhibit negative emotions, the scientists found activation in the dorso-medial prefrontal area of the brain. They had previously linked this brain area to deciding to inhibit movement.

In contrast, when participants were instructed by the experimenter to inhibit the emotion, a second, more lateral area was activated.
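In practice, a contrast like this is typically tested by comparing each participant's activation estimates across the two conditions. As a rough illustration (not the authors' actual pipeline, and with invented numbers), a paired t-test on hypothetical per-participant activation values for a single brain region might look like this:

```python
# Toy sketch of a paired comparison between conditions. The beta values
# below are made-up illustrative numbers for 15 participants, not data
# from the study.
import math

# Hypothetical activation estimates for one region of interest
chosen     = [1.8, 2.1, 1.5, 2.4, 1.9, 2.2, 1.7, 2.0, 2.3, 1.6, 2.1, 1.8, 2.5, 1.9, 2.0]
instructed = [1.1, 1.3, 0.9, 1.5, 1.2, 1.4, 1.0, 1.2, 1.6, 0.8, 1.3, 1.1, 1.7, 1.0, 1.2]

def paired_t(a, b):
    """Paired-samples t statistic: mean of differences / SE of differences."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

t = paired_t(chosen, instructed)
print(f"paired t({len(chosen) - 1}) = {t:.2f}")
```

A large positive t here would indicate the region is reliably more active in the self-chosen condition; real fMRI analyses run this kind of test voxel-by-voxel with corrections for multiple comparisons.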

"We think controlling one’s emotions and controlling one’s behaviour involve overlapping mechanisms," said Dr Kuhn.

"We should distinguish between voluntary and instructed control of emotions, in the same way as we can distinguish between making up our own mind about what to do, versus following instructions."

Regulating emotions is part of our daily life, and is important for our mental health. For example, many people have to conquer fear of speaking in public, while some professionals such as health-care workers and firemen have to maintain an emotional distance from unpleasant or distressing scenes that occur in their jobs.

Professor Patrick Haggard (UCL Institute of Cognitive Neuroscience) co-author of the paper said the brain mechanism identified in this study could be a potential target for therapies.

"The ability to manage one’s own emotions is affected in many mental health conditions, so identifying this mechanism opens interesting possibilities for future research.

"Most studies of emotion processing in the brain simply assume that people passively receive emotional stimuli, and automatically feel the corresponding emotion. In contrast, the area we have identified may contribute to some individuals’ ability to rise above particular emotional situations.

"This kind of self-control mechanism may have positive aspects, for example making people less vulnerable to excessive emotion. But altered function of this brain area could also potentially lead to difficulties in responding appropriately to emotional situations."

(Source: eurekalert.org)

Filed under brain activity emotional response fMRI negative emotions psychology neuroscience science

52 notes

Imaging Technique Could Help Traumatic Brain Injury Patients

A new application of an existing medical imaging technology could help predict long-term damage in patients with traumatic brain injury, according to a recent UC San Francisco study.

The authors of the study analyzed brain scans using rapid automated resting-state magnetoencephalography (MEG), a technique used to map brain activity by recording the magnetic fields produced by natural electrical currents in the brain. They discovered that “abnormally decreased functional connectivity” – or possible long-term brain damage – could persist years after a person suffers even a mild form of traumatic brain injury.

“We were hoping that areas of abnormal brain activity would match up with some of the functional measures such as patients’ symptoms after injury, and we saw such correlation,” said senior author Pratik Mukherjee, MD, PhD, associate professor in residence at the UCSF School of Medicine.

In a study published on April 19 in the Journal of Neurosurgery, UCSF researchers analyzed brain connectivity data on 14 male and seven female patients, whose median age was 29. Brain connectivity refers to the pattern of causal interactions between specific parts of a nervous system. Eleven patients had mild, one had moderate, and three had severe forms of traumatic brain injury; six participants were healthy controls with no brain injury.

“Once we have connectivity information, we can create a template of what it looks like in a normal subject. When we have subjects that have had head injuries, we can compare their connectivity pattern to that of the normal subjects with an automated computer algorithm,” Mukherjee said. “And that will automatically detect areas of abnormally low and abnormally high connectivity compared to the normal database.”

MEG provides much richer information than a typical magnetic resonance imaging (MRI) scan, which uses magnetic fields and radio wave energy to produce a static image of the brain or other internal structures of the body.

“If you scan someone a couple months after the trauma with an MRI, and you scan them again a couple of years after the trauma, it’s going to look the same,” Mukherjee said. “With MEG, we can characterize these systems in much more fine-grained detail. It produces the most detailed activity mapping of the brain.”

Although MEG signals were first measured in 1968, the technology has not been widely used for patients with traumatic brain injury until recently.

“It takes a minute or two to complete an MEG scan and it automatically detects the areas of abnormality using a computer algorithm,” Mukherjee said. “And it seems to be fairly sensitive because it’s showing us areas of abnormality even in people where MRIs missed some abnormalities.”

Every year approximately 1.7 million people in the United States suffer a traumatic brain injury, which costs the U.S. health care system an estimated $60 billion, according to the U.S. Centers for Disease Control and Prevention. The most common forms of traumatic brain injury are suffered by athletes, members of the military, and those involved in motor vehicle collisions or occupational injuries.

“This is a preliminary study testing a new technique with a small sample, which makes it difficult to have enough statistical power to make such correlations,” Mukherjee said. “But I think this is an important step in our quest to help people suffering from traumatic brain injuries.”
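The template-and-compare approach Mukherjee describes can be illustrated with a simple z-score screen: build a normative distribution of connectivity per region from control scans, then flag patient regions that fall far outside it. This is only a hypothetical sketch; the region names and connectivity values below are invented for illustration.

```python
# Illustrative sketch (not the UCSF algorithm): flag brain regions whose
# functional connectivity deviates strongly from a normative control set.
import math

# Made-up connectivity strengths per region from six control scans
controls = {
    "frontal":  [0.62, 0.58, 0.65, 0.60, 0.63, 0.59],
    "temporal": [0.55, 0.53, 0.57, 0.54, 0.56, 0.52],
    "parietal": [0.48, 0.50, 0.47, 0.49, 0.51, 0.46],
}

# Made-up values for one patient scan
patient = {"frontal": 0.61, "temporal": 0.38, "parietal": 0.49}

def flag_abnormal(patient, controls, z_cutoff=3.0):
    """Return regions whose connectivity deviates > z_cutoff SDs from controls."""
    flagged = {}
    for region, values in controls.items():
        n = len(values)
        mean = sum(values) / n
        sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
        z = (patient[region] - mean) / sd
        if abs(z) > z_cutoff:
            flagged[region] = round(z, 1)
    return flagged

print(flag_abnormal(patient, controls))
```

Here the patient's temporal connectivity sits many standard deviations below the control template and would be flagged automatically, while the other regions pass; a real pipeline would do this over hundreds of regions or voxels.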

Filed under TBI MEG imaging brain injury brain damage brain activity neuroscience science

42 notes

Researchers identify how cells control calcium influx

When brain cells are overwhelmed by an influx of too many calcium molecules, they shut down the channels through which these molecules enter the cells. Until now, the “stop” signal mechanism that cells use to control the molecular traffic was unknown.

In the new issue of the journal Neuron, UC Davis Health System scientists report that they have identified the mechanism. Their findings are relevant to understanding the molecular causes of the disruption of brain functioning that occurs in stroke and other neurological disorders.

"Too much calcium influx clearly is part of the neuronal dysfunction in Alzheimer’s disease and causes the neuronal damage during and after a stroke. It also contributes to chronic pain," said Johannes W. Hell, professor of pharmacology at UC Davis. Hell headed the research team that identified the mechanism that stops the flow of calcium molecules, which are also called ions, into the specialized brain cells known as neurons.

Hell explained that each day millions of molecules of calcium enter and exit each of the 100 billion neurons of the human brain. These calcium ions move in and out of neurons through pore-like structures, known as channels, that are located in the outer surface, or “skin,” of each cell.

The flow of calcium ions into brain cells generates the electrical impulses needed to stimulate such actions as the movement of muscles in our legs and the creation of new memories in the brain. The movement of calcium ions also plays a role in gene expression and affects the flexibility of the structures, called synapses, that are located between neurons and transmit electrical or chemical signals of various strengths from one cell to a second cell.

Neurons employ an unexpected and highly complex mechanism to down-regulate, or reduce, the activity of channels that are permitting too many calcium ions to enter the cell, Hell and his colleagues discovered. The mechanism, which leads to the elimination of the overly permissive ion channel, employs two proteins: α-actinin and the calcium-binding messenger protein calmodulin.

Located on the neuron’s outer surface, referred to as the plasma membrane, α-actinin stabilizes the type of ion channels that constitute a major source of calcium ion influx into brain cells, Hell explained. This protein is a component of the cytoskeleton, the scaffolding of cells. The ion channels that are a major source of calcium ions are referred to as Cav1.2 (L type voltage-dependent calcium channels).

The researchers also found that the calcium-binding messenger protein calmodulin, which is the cell’s main sensor for calcium ions, induces internalization, or endocytosis, of Cav1.2 to remove this channel from the cell surface, thus providing an important negative feedback mechanism for excessive calcium ion influx into a neuron, Hell explained.
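The negative feedback loop the researchers describe can be caricatured with a toy two-variable model: calcium enters through surface channels, and high calcium (via calmodulin) drives those channels off the surface, throttling further influx. This is my own illustration with arbitrary rate constants, not a model from the study.

```python
# Toy negative-feedback model (illustrative only): calcium influx scales
# with the number of surface Cav1.2 channels, and calcium in turn drives
# calmodulin-mediated endocytosis that removes channels from the surface.

def simulate(steps=2000, dt=0.01):
    channels = 1.0      # surface channel density (arbitrary units)
    calcium = 0.0       # intracellular calcium (arbitrary units)
    influx_rate = 1.0   # calcium influx per unit of channel
    pump_rate = 0.5     # calcium extrusion/buffering
    endo_rate = 0.8     # strength of calcium-driven channel endocytosis
    recycle = 0.1       # rate at which channels return to the surface
    for _ in range(steps):
        # calcium rises with open channels, falls via pumps
        calcium += dt * (influx_rate * channels - pump_rate * calcium)
        # channels recycle back but are internalized when calcium is high
        channels += dt * (recycle * (1 - channels) - endo_rate * calcium * channels)
        channels = max(channels, 0.0)
    return calcium, channels

ca, ch = simulate()
print(f"steady state: calcium={ca:.2f}, channels={ch:.2f}")
```

Starting from a full complement of surface channels, the system settles at a reduced channel density and a bounded calcium level: the feedback caps influx rather than letting calcium climb without limit, which is the qualitative behavior the study attributes to calmodulin-driven endocytosis of Cav1.2.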

The discovery that α-actinin and calmodulin play a role in controlling calcium ion influx expands upon Hell’s previous research on the molecular mechanisms that regulate the activity of various ion channels at the synapse.

One previous study proved relevant to understanding the biological mechanisms that underlie the body’s fight-or-flight response during stress.

In work published in the journal Science in 2001, Hell and colleagues reported that the regulation of Cav1.2 by adrenergic signaling during stress is performed by one of the adrenergic receptors (beta 2 adrenergic receptor) directly linked to Cav1.2.

"This protein-protein interaction ensures that the adrenergic regulation is fast, efficient and precisely targets this channel," Hell said.

"We showed that Cav1.2 is regulated by adrenergic signaling on a time scale of a few seconds, and this is mainly increasing its activity when needed, for example during danger, to make our brain work faster and better. The same channel is in the heart, where adrenergic stimulation increases channel/Ca influx activity, increasing the pacing and strength of our heart beat to meet the increased physical demands during danger."

(Source: universityofcalifornia.edu)

Filed under calcium influx calcium ions synapses neurons neuronal damage chronic pain neuroscience science
