Neuroscience

Articles and news from the latest research reports.


Nicotine withdrawal reduces response to rewards across species
Cigarette smoking is a leading cause of preventable death worldwide and is associated with approximately 440,000 deaths in the United States each year, according to the U.S. Centers for Disease Control and Prevention, yet nearly 20 percent of the U.S. population continues to smoke cigarettes. While more than half of U.S. smokers try to quit every year, fewer than 10 percent manage to remain smoke-free, and relapse commonly occurs within 48 hours of smoking cessation. A better understanding of withdrawal and of why quitting is so difficult could lead to more effective treatments to help smokers quit.
In a first-of-its-kind study on nicotine addiction, scientists measured a behavior that can be quantified the same way in different species, in this case humans and rats: responsiveness to rewards during nicotine withdrawal. Findings from this study were published online on Sept. 10, 2014, in JAMA Psychiatry.
Reward responsiveness is the ability to recognize and derive pleasure from rewards such as food, money and sex. A reduced ability to respond to rewards is a behavioral process associated with depression in humans. In prior studies of nicotine withdrawal, investigators used very different behavioral measures in humans and in rats, limiting our understanding of this important brain reward system.
Using a translational behavioral approach, the researchers found that nicotine withdrawal reduced reward responsiveness in much the same way in human smokers, particularly those with a history of depression, and in nicotine-treated rats. Michele Pergadia, Ph.D., associate professor of clinical biomedical science in the Charles E. Schmidt College of Medicine at Florida Atlantic University, completed the human study while at Washington University School of Medicine; Andre Der-Avakian, Ph.D., completed the rat study at the University of California, San Diego (UCSD). Their colleagues included senior collaborators Athina Markou, Ph.D., at UCSD and Diego Pizzagalli, Ph.D., at Harvard Medical School.
Pergadia, one of the lead authors, notes that replicating experimental results across species is a major step forward: it allows for greater generalizability and offers a more reliable means of identifying the behavioral and neurobiological mechanisms behind the complicated phenomenon of nicotine withdrawal in people addicted to tobacco.
"The fact that the effect was similar across species using this translational task not only provides us with a ready framework to proceed with additional research to better understand the mechanisms underlying withdrawal of nicotine, and potentially new treatment development, but it also makes us feel more confident that we are actually studying the same behavior in humans and rats as the studies move forward," said Pergadia.
Pergadia and colleagues plan future studies that will include a systematic examination of depression vulnerability as it relates to reward sensitivity; the time course of withdrawal-related reward deficits, including their effects on relapse to smoking; and identification of the brain processes that give rise to these behaviors.
Pergadia emphasizes that the ultimate goal of this line of research is to improve treatments that manage nicotine withdrawal-related symptoms and thereby increase success during efforts to quit.
"Many smokers are struggling to quit, and there is a real need to develop new strategies to aid them in this process. Therapies targeting this reward dysfunction during withdrawal may prove to be useful," said Pergadia.

Filed under nicotine nicotine withdrawal reward system tobacco smoking neuroscience science


Cells Put Off Protein Production During Times of Stress

Living cells are like miniature factories, responsible for the production of more than 25,000 different proteins with very specific 3-D shapes. And just as an overwhelmed assembly line can begin making mistakes, a stressed cell can end up producing misshapen proteins that are unfolded or misfolded.


(Image caption: A color-enhanced electron micrograph shows the nucleus of a cell (blue) adjacent to the rough endoplasmic reticulum (green), where proteins are manufactured from mRNA templates produced by the nucleus. Credit: University of Edinburgh, via the Wellcome Trust)

Now Duke University researchers in North Carolina and Singapore have shown that the cell recognizes the buildup of these misfolded proteins and responds by reshuffling its workload, much like a stressed-out employee might temporarily move papers from an overflowing inbox into a junk drawer.

The study, which appears Sept. 11, 2014 in Cell, could lend insight into diseases that result from misfolded proteins piling up, such as Alzheimer’s disease, ALS, Huntington’s disease, Parkinson’s disease, and type 2 diabetes.

“We have identified an entirely new mechanism for how the cell responds to stress,” said Christopher V. Nicchitta, Ph.D., a professor of cell biology at Duke University School of Medicine. “Essentially, the cell remodels the organization of its protein production machinery in order to compartmentalize the tasks at hand.” 

The general architecture and workflow of these cellular factories has been understood for decades. First, DNA’s master blueprint, which is locked tightly in the nucleus of each cell, is transcribed into messenger RNA or mRNA. Then this working copy travels to the ribosomes standing on the surface of a larger accordion-shaped structure called the endoplasmic reticulum (ER). The ribosomes on the ER are tiny assembly lines that translate the mRNAs into proteins.

When a cell gets stressed, either by overheating or starvation, its proteins no longer fold properly. These unfolded proteins can set off an alarm — called the unfolded protein response, or UPR — to slow down the assembly line and clean up the improperly folded products. Nicchitta wondered whether the stress response might also employ other tactics to deal with the problem.

In this study, Nicchitta and his colleagues treated tissue culture cells with a stress-inducing agent called thapsigargin. They then separated the cells' mRNAs into two groups: those associated with ribosomes on the endoplasmic reticulum, and those associated with free-floating ribosomes in the neighboring fluid-filled space known as the cytosol.

The researchers found that when the cells were stressed, they quickly moved mRNAs from the endoplasmic reticulum to the cytosol. Once the stress was resolved, the mRNAs went back to their spots on the production floor of the endoplasmic reticulum. 

“You can slow down protein production, but sometimes slowing down the workflow is not enough,” Nicchitta said. “You can activate genes to help chew up the misfolded proteins, but sometimes they are accumulating too quickly. Here we have discovered a mechanism that does one better — it effectively puts everything on hold. Once things get back to normal, the mRNAs are released from the holding pattern.” 

Interestingly, the researchers found that this stress-induced relocation affected only the subset of mRNAs that give rise to secreted proteins, such as hormones, or membrane proteins, such as growth factor receptors — the very types of proteins that set off the stress response when they misfold. The researchers aren't sure yet what this means.

Nicchitta is currently searching for the factors that ultimately determine which mechanisms cells employ during the stress response. He has already pinpointed one promising candidate, and is looking to see how cells respond to stress when that factor is manipulated.

(Source: today.duke.edu)

Filed under neurodegenerative diseases stress endoplasmic reticulum thapsigargin cytoplasm neuroscience science


Breast milk is brain food
You are what you eat, the saying goes, and now a study conducted by researchers at UC Santa Barbara and the University of Pittsburgh suggests that the oft-repeated adage applies not just to physical health but to brain power as well.
In a paper published in the early online edition of the journal Prostaglandins, Leukotrienes and Essential Fatty Acids, the researchers compared the fatty acid profiles of breast milk from women in over two dozen countries with how well children from those same countries performed on academic tests.

Their findings show that the amount of omega-3 docosahexaenoic acid (DHA) in a mother’s milk — fats found primarily in certain fish, nuts and seeds — is the strongest predictor of test performance. It outweighs national income and the number of dollars spent per pupil in schools.
DHA alone accounted for about 20 percent of the differences in test scores among countries, the researchers found.
On the other hand, the amount of omega-6 fat in mother’s milk — fats that come from vegetable oils such as corn and soybean — predicts lower test scores. When the amounts of DHA and linoleic acid (LA) — the most common omega-6 fat — were considered together, they explained nearly half of the differences in test scores. In countries where mothers’ diets contain more omega-6, the beneficial effects of DHA seem to be reduced.
More omega-3, less omega-6
“Human intelligence has a physical basis in the huge size of our brains — some seven times larger than would be expected for a mammal with our body size,” said Steven Gaulin, UCSB professor of anthropology and co-author of the paper. “Since there is never a free lunch, those big brains need lots of extra building materials — most importantly, they need omega-3 fatty acids, especially DHA. Omega-6 fats, however, undermine the effects of DHA and seem to be bad for brains.”
Both kinds of omega fat must be obtained through diet. But because diets vary from place to place, for their study Gaulin and his co-author, William D. Lassek, M.D., a professor at the University of Pittsburgh’s Graduate School of Public Health and a retired assistant surgeon general, estimated the DHA and LA content — the good fat and the bad fat — in diets in 50 countries by examining published studies of the fatty acid profiles of women’s breast milk.
The profiles are a useful measure for two reasons, according to Gaulin. First, because various kinds of fats interfere with one another in the body, breast milk DHA shows how much of this brain-essential fat survives competition with omega-6. Second, children receive their brain-building fats from their mothers. Breast milk profiles indicate the amount of DHA children in each region receive in the womb, through breastfeeding, and from the local diet available to their mothers and to them after they are weaned.
The academic test results came from the Programme for International Student Assessment (PISA), which administers standardized tests in 58 nations. Gaulin and Lassek averaged the three PISA tests — math, science and reading ability — as their measure of cognitive performance. There were 28 countries for which the researchers found information about both breast milk and test scores.
DHA content: best predictor of math test performance
“Looking at those 28 countries, the DHA content of breast milk was the single best predictor of math test performance,” Gaulin said. The second-best indicator was the amount of omega-6, and its effect is the opposite. “Considering the benefits of omega-3 and the detriment of omega-6, we can get pretty darn close to explaining half the difference in scores between countries,” he added. Considered together, DHA and LA are twice as effective at predicting test scores as either is alone, Gaulin said.
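The "variance explained" figures above (about 20 percent for DHA alone, nearly half for DHA and LA together) are ordinary regression R-squared values. As a toy sketch of that idea, the Python below fits least-squares models with one and then two predictors and compares R-squared; every country value in it is invented for illustration and is not the study's data.

```python
# Toy illustration of "variance explained" (R^2): regress hypothetical
# country test scores on breast-milk DHA alone, then on DHA plus LA
# (omega-6). All numbers are invented, not the study's data.

def ols_r2(predictors, y):
    """R^2 of ordinary least squares with an intercept.

    predictors: list of predictor columns; y: response column.
    Solves the normal equations by Gaussian elimination.
    """
    n = len(y)
    cols = [[1.0] * n] + predictors      # prepend the intercept column
    k = len(cols)
    a = [[sum(cols[i][t] * cols[j][t] for t in range(n)) for j in range(k)]
         for i in range(k)]              # X^T X
    c = [sum(cols[i][t] * y[t] for t in range(n)) for i in range(k)]  # X^T y
    for i in range(k):                   # forward elimination with pivoting
        p = max(range(i, k), key=lambda r: abs(a[r][i]))
        a[i], a[p] = a[p], a[i]
        c[i], c[p] = c[p], c[i]
        for r in range(i + 1, k):
            f = a[r][i] / a[i][i]
            for j in range(i, k):
                a[r][j] -= f * a[i][j]
            c[r] -= f * c[i]
    beta = [0.0] * k
    for i in reversed(range(k)):         # back substitution
        beta[i] = (c[i] - sum(a[i][j] * beta[j]
                              for j in range(i + 1, k))) / a[i][i]
    yhat = [sum(beta[j] * cols[j][t] for j in range(k)) for t in range(n)]
    ybar = sum(y) / n
    ss_res = sum((yv - yh) ** 2 for yv, yh in zip(y, yhat))
    ss_tot = sum((yv - ybar) ** 2 for yv in y)
    return 1.0 - ss_res / ss_tot

# Hypothetical per-country values: DHA helps scores, LA hurts them,
# plus unmodeled "noise" so neither fit is perfect.
dha = [0.20, 0.30, 0.10, 0.40, 0.35, 0.15, 0.25, 0.45, 0.05, 0.30]
la = [12.0, 8.0, 15.0, 6.0, 9.0, 14.0, 11.0, 5.0, 16.0, 10.0]
noise = [3.0, -2.0, 1.0, -1.0, 2.0, -3.0, 0.0, 1.0, -2.0, 2.0]
scores = [450 + 200 * d - 4 * l + e for d, l, e in zip(dha, la, noise)]

r2_dha = ols_r2([dha], scores)
r2_both = ols_r2([dha, la], scores)
```

Because the one-predictor model is nested inside the two-predictor model, adding LA can only raise R-squared; the interesting empirical point in the paper is how much it rises.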
Gaulin and Lassek considered two economic factors as well: per capita gross domestic product (a measure of average wealth in each nation) and per student expenditures on education. “Each of these factors helps explain some of the differences between nations in test scores, but the fatty acid profile of the average mother’s milk in a given country is a better predictor of the average cognitive performance in that country than is either of the conventional socioeconomic measures people use,” said Gaulin.
From their analysis, the researchers conclude that both economic wellbeing and diet make a difference in cognitive test performance, and children are best off when they have both factors in their favor. “But if you had to choose one, you should choose the better diet rather than the better economy,” Gaulin said.
The current research follows a study published in 2008 that showed that the children of women who had larger amounts of gluteofemoral fat “depots” performed better on academic tests than those of mothers with less. “At that time we weren’t trying to identify the dietary cause,” explained Gaulin. “We found that this depot that has been evolutionarily elaborated in women is important to building a good brain. We were content at that time to show that as a way of understanding why the female body is as evolutionarily distinctive as it is.”
Now the researchers are looking at diet as the key to brain-building fat, since mothers need to acquire these fats in the first place.
Their results are particularly interesting in 21st-century North America, Gaulin noted, because our current agribusiness-based diets provide very low levels of DHA — among the lowest in the world. Thanks to two heavily government-subsidized crops — corn and soybeans — the average U.S. diet is heavy in the bad omega-6 fatty acids and far too light on the good omega-3s, Gaulin said.
Wrong kind of polyunsaturated fat
“Back in the 1960s, in the middle of the cardiovascular disease epidemic, people got the idea that saturated fats were bad and polyunsaturated fats were good,” he explained. “That’s one reason margarine became so popular. But the polyunsaturated fats that were increased were the ones with omega-6, not omega-3. So our message is that not only is it advisable to increase omega 3 intake, it’s highly advisable to decrease omega-6 — the very fats that in the 1960s and ’70s we were told we should be eating more of.”
Gaulin added that mayonnaise is, in general, the most omega-6-laden food in the average person’s refrigerator. “If you have too much of one — omega-6 — and too little of the other — omega-3 — you’re going to end up paying a price cognitively,” he said.
The issue is a huge concern for women, Gaulin noted, because “that’s where kids’ brains come from. But it’s important for men as well because they have to take care of the brains their moms gave them.
“Just like a racecar burns up some of its motor oil with every lap, your brain burns up omega-3 and you need to replenish it every day,” he said.
(Image: Stacy Librandi)


Filed under breast milk breastfeeding omega-3 cognitive performance health psychology neuroscience science


Sleep disorders widely undiagnosed in individuals with multiple sclerosis

In what may be the largest study of sleep problems among individuals with multiple sclerosis (MS), researchers at UC Davis have found that widely undiagnosed sleep disorders may be at the root of the most common and disabling symptom of the disease: fatigue.


Conducted among more than 2,300 individuals with multiple sclerosis in Northern California, the large, population-based study found that, overall, more than 70 percent of participants screened positive for one or more sleep disorders.

The research highlights the importance of diagnosing the root causes of fatigue among individuals with MS, as sleep disorders may affect the course of the disease as well as the overall health and well-being of sufferers, the authors said.

The study, “The Underdiagnosis of Sleep Disorders in Patients with Multiple Sclerosis,” is published online today in the Journal of Clinical Sleep Medicine.

“A large percentage of MS subjects in our study are sleep deprived and screened positive for one or more sleep disorders,” said Steven Brass, associate clinical professor and director of the Neurology Sleep Clinical Program and co-medical director of the UC Davis Sleep Medicine Laboratory.

“The vast majority of these sleep disorders are potentially undiagnosed and untreated,” he said. “This work suggests that patients with MS may have sleep disorders requiring independent diagnosis and management.”

Fatigue is the hallmark of multiple sclerosis, an inflammatory disease affecting the white matter and spinal cord of sufferers. MS symptoms include loss of vision, vertigo, weakness and numbness. Patients also may experience psychiatric symptoms. Disease onset generally is between the ages of 20 and 50 years. The cause of MS is not known, although it is believed to be an autoimmune condition.

Sleep disorders are known to occur more frequently among patients with MS. To gauge the extent of sleep disorders, such as obstructive sleep apnea and insomnia, Brass and his colleagues surveyed members of the Northern California Chapter of the National MS Society. Subjects were recruited in 2011.

More than 11,000 surveys were mailed to prospective participants. Of those, 2,375 respondents met the criteria and were included in the study. Consistent with the reported epidemiology of multiple sclerosis, most were female (81 percent) and Caucasian (88 percent). The mean age of the participants was 54.

Participants were asked to complete a 10-page survey, which included a detailed sleep history and questions assessing obstructive sleep apnea, daytime sleepiness, insomnia and restless legs syndrome.

Most of the participants - nearly 52 percent - said it took them more than half an hour to fall asleep at night, and nearly 11 percent reported taking a medication to fall asleep. Close to 38 percent of participants screened positive for obstructive sleep apnea. Nearly 32 percent had moderate to severe insomnia and nearly 37 percent had restless legs syndrome.

However, most of the participants had not been diagnosed with a sleep disorder by a physician. While nearly 38 percent reported having obstructive sleep apnea, only a little more than 4 percent reported being diagnosed by a physician with the condition. Similar statistics were seen for other sleep disorders.
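The gap the study describes is simply two proportions over the same denominator: the share of respondents who screen positive versus the share already carrying a diagnosis. A minimal sketch of that comparison follows; the apnea counts are chosen to roughly match the article's percentages, while the diagnosed counts for insomnia and restless legs are invented for illustration.

```python
# Share of respondents screening positive for a sleep disorder versus
# the share already diagnosed by a physician. Counts for apnea roughly
# match the article's percentages; the other diagnosed counts are invented.

TOTAL = 2375  # respondents who met inclusion criteria

screened_positive = {"obstructive sleep apnea": 898,      # ~38 percent
                     "moderate-to-severe insomnia": 758,  # ~32 percent
                     "restless legs syndrome": 872}       # ~37 percent
diagnosed = {"obstructive sleep apnea": 101,              # ~4 percent
             "moderate-to-severe insomnia": 150,
             "restless legs syndrome": 180}

def pct(count, total=TOTAL):
    """Percentage of the study population, rounded to one decimal."""
    return round(100.0 * count / total, 1)

for disorder in screened_positive:
    print(f"{disorder}: {pct(screened_positive[disorder])}% screened "
          f"positive vs {pct(diagnosed[disorder])}% diagnosed")
```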

“This study shows that sleep disorder frequency, sleep patterns and complaints of excessive daytime sleepiness suggest that sleep problems may be a hidden epidemic in the MS population, separate from MS fatigue,” Brass said.

(Source: ucdmc.ucdavis.edu)

Filed under MS sleep sleep problems daytime sleepiness sleep apnea neuroscience science


Brain inflammation dramatically disrupts memory retrieval networks
Brain inflammation can rapidly disrupt our ability to retrieve complex memories of similar but distinct experiences, according to UC Irvine neuroscientists Jennifer Czerniawski and John Guzowski.
Their study – which appears today in The Journal of Neuroscience – specifically identifies how immune system signaling molecules, called cytokines, impair communication among neurons in the hippocampus, an area of the brain critical for discrimination memory. The findings offer insight into why cognitive deficits occur in people undergoing chemotherapy and in those with autoimmune or neurodegenerative diseases.
Moreover, since cytokines are elevated in the brain in each of these conditions, the work suggests potential therapeutic targets to alleviate memory problems in these patients.
“Our research provides the first link among immune system activation, altered neural circuit function and impaired discrimination memory,” said Guzowski, the James L. McGaugh Chair in the Neurobiology of Learning & Memory. “The implications may be beneficial for those who have chronic diseases, such as multiple sclerosis, in which memory loss occurs and even for cancer patients.”
What he found interesting is that increased cytokine levels in the hippocampus only affected complex discrimination memory, the type that lets us differentiate among generally similar experiences – what we did at work or ate at dinner, for example. A simpler form of memory processed by the hippocampus – which would be akin to remembering where you work – was not altered by brain inflammation.
In the study, Czerniawski, a UCI postdoctoral scholar, exposed rats to two similar but discernible environments over several days. They received a mild foot shock daily in one, making them apprehensive about entering that specific site. Once the rodents showed that they had learned the difference between the two environments, some were given a low dose of a bacterial agent to induce a neuroinflammatory response, leading to cytokine release in the brain. Those animals were then no longer able to distinguish between the two environments.
Afterward, the researchers explored the activity patterns of neurons – the primary cell type for information processing – in the rats’ hippocampi using a gene-based cellular imaging method developed in the Guzowski lab. In the rodents that received the bacterial agent (and exhibited memory deterioration), the networks of neurons activated in the two environments were very similar, unlike those in the animals not given the agent (whose memories remained strong). This finding suggests that cytokines impaired recall by disrupting the function of these specific neuron circuits in the hippocampus.
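One simple way to make "the networks activated in the two environments were very similar" concrete is to treat each environment's active neurons as a set of cell IDs and measure their overlap, for example with a Jaccard index. The sketch below does exactly that; the cell IDs are invented, and the study used its own imaging-based analysis rather than necessarily this metric.

```python
# Quantifying how similar two activated neuron ensembles are, using the
# Jaccard index: shared cells / all cells active in either environment.
# Cell IDs are invented for illustration.

def jaccard(a, b):
    """Overlap of two sets: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Control rat: the two environments recruit largely distinct ensembles.
control_env_a = {1, 2, 3, 4, 5, 6}
control_env_b = {5, 6, 7, 8, 9, 10}

# Inflamed rat: nearly the same ensemble is reactivated in both.
inflamed_env_a = {1, 2, 3, 4, 5, 6}
inflamed_env_b = {1, 2, 3, 4, 5, 7}

print(jaccard(control_env_a, control_env_b))    # low overlap: 0.2
print(jaccard(inflamed_env_a, inflamed_env_b))  # high overlap: 5/7
```

In this framing, the paper's result is that inflammation pushes the between-environment overlap toward the "inflamed" case, as if the two experiences were being stored by one and the same circuit.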
“The cytokines caused the neural network to react as if no learning had taken place,” said Guzowski, associate professor of neurobiology & behavior. “The neural circuit activity was back to the pattern seen before learning.”
The work may also shed light on a chemotherapy-related mental phenomenon known as “chemo brain,” in which cancer patients find it difficult to efficiently process information. UCI neuro-oncologists have found that chemotherapeutic agents destroy stem cells in the brain that would have become neurons for creating and storing memories.
Dr. Daniela Bota, who co-authored that study, is currently collaborating with Guzowski’s research group to see if brain inflammation may be another of the underlying causes of “chemo brain” symptoms.
She said they’re looking for a simple intervention, such as an anti-inflammatory or steroid drug, that could lessen post-chemo inflammation. Bota will test this approach on patients, pending the outcome of animal studies.
“It will be interesting to see if limiting neuroinflammation will give cancer patients fewer or no problems,” she said. “It’s a wonderful idea, and it presents a new method to limit brain cell damage, improving quality of life. This is a great example of basic science and clinical ideas coming together to benefit patients.”

Brain inflammation dramatically disrupts memory retrieval networks

Brain inflammation can rapidly disrupt our ability to retrieve complex memories of similar but distinct experiences, according to UC Irvine neuroscientists Jennifer Czerniawski and John Guzowski.

Their study – which appears today in The Journal of Neuroscience – specifically identifies how immune system signaling molecules, called cytokines, impair communication among neurons in the hippocampus, an area of the brain critical for discrimination memory. The findings offer insight into why cognitive deficits occur in people undergoing chemotherapy and those with autoimmune or neurodegenerative diseases.

Moreover, since cytokines are elevated in the brain in each of these conditions, the work suggests potential therapeutic targets to alleviate memory problems in these patients.

“Our research provides the first link among immune system activation, altered neural circuit function and impaired discrimination memory,” said Guzowski, the James L. McGaugh Chair in the Neurobiology of Learning & Memory. “The implications may be beneficial for those who have chronic diseases, such as multiple sclerosis, in which memory loss occurs and even for cancer patients.”

What he found interesting is that increased cytokine levels in the hippocampus only affected complex discrimination memory, the type that lets us differentiate among generally similar experiences – what we did at work or ate at dinner, for example. A simpler form of memory processed by the hippocampus – which would be akin to remembering where you work – was not altered by brain inflammation.

In the study, Czerniawski, a UCI postdoctoral scholar, exposed rats to two similar but discernable environments over several days. They received a mild foot shock daily in one, making them apprehensive about entering that specific site. Once the rodents showed that they had learned the difference between the two environments, some were given a low dose of a bacterial agent to induce a neuroinflammatory response, leading to cytokine release in the brain. Those animals were then no longer able to distinguish between the two environments.

Afterward, the researchers explored the activity patterns of neurons – the primary cell type for information processing – in the rats’ hippocampi using a gene-based cellular imaging method developed in the Guzowski lab. In the rodents that received the bacterial agent (and exhibited memory deterioration), the networks of neurons activated in the two environments were very similar, unlike those in the animals not given the agent (whose memories remained strong). This finding suggests that cytokines impaired recall by disrupting the function of these specific neuron circuits in the hippocampus.
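The ensemble-similarity comparison described above can be illustrated with a toy calculation. This is only a sketch of the general idea, not the lab's actual analysis, and the cell IDs are invented: the overlap between the sets of neurons active in each environment is measured, with high overlap meaning the network treats the two environments as the same.

```python
# Hypothetical sketch of the ensemble-overlap idea; cell IDs are invented.

def ensemble_overlap(env_a_cells, env_b_cells):
    """Jaccard overlap between the sets of neurons active in two environments."""
    a, b = set(env_a_cells), set(env_b_cells)
    return len(a & b) / len(a | b)

# A healthy rat that discriminates the environments recruits partly distinct ensembles:
control = ensemble_overlap({1, 2, 3, 4, 5, 6}, {5, 6, 7, 8, 9, 10})
# An inflamed rat recruits nearly the same ensemble in both:
inflamed = ensemble_overlap({1, 2, 3, 4, 5, 6}, {1, 2, 3, 4, 5, 7})

print(round(control, 2))   # low overlap: distinct representations
print(round(inflamed, 2))  # high overlap: environments not discriminated
```

In this framing, the finding is that inflamed animals looked like the second case: the two environments evoked nearly identical ensembles, as if no discrimination learning had occurred.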

“The cytokines caused the neural network to react as if no learning had taken place,” said Guzowski, associate professor of neurobiology & behavior. “The neural circuit activity was back to the pattern seen before learning.”

The work may also shed light on a chemotherapy-related mental phenomenon known as “chemo brain,” in which cancer patients find it difficult to efficiently process information. UCI neuro-oncologists have found that chemotherapeutic agents destroy stem cells in the brain that would have become neurons for creating and storing memories.

Dr. Daniela Bota, who co-authored that study, is currently collaborating with Guzowski’s research group to see if brain inflammation may be another of the underlying causes of “chemo brain” symptoms.

She said they’re looking for a simple intervention, such as an anti-inflammatory or steroid drug, that could lessen post-chemo inflammation. Bota will test this approach on patients, pending the outcome of animal studies.

“It will be interesting to see if limiting neuroinflammation will give cancer patients fewer or no problems,” she said. “It’s a wonderful idea, and it presents a new method to limit brain cell damage, improving quality of life. This is a great example of basic science and clinical ideas coming together to benefit patients.”

Filed under neuroinflammation memory hippocampus cytokines immune system neuroscience science

88 notes

Grey matter matters when measuring our tolerance of risk

There is a link between our brain structure and our tolerance of risk, new research suggests.

Dr Agnieszka Tymula, an economist at the University of Sydney, is one of the lead authors of a new study that identifies what might be considered the first stable ‘biomarker’ for financial risk-attitudes.


Using a whole-brain analysis, Dr Tymula and international collaborators found that the grey matter volume of a region in the right posterior parietal cortex was significantly predictive of individual risk attitudes. Men and women with higher grey matter volume in this region exhibited less risk aversion.

"Individual risk attitudes are correlated with the grey matter volume in the posterior parietal cortex suggesting existence of an anatomical biomarker for financial risk-attitude," said Dr Tymula.

This means tolerance of risk “could potentially be measured in billions of existing medical brain scans.”

But she has cautioned against making a causal link between brain structure and behaviour. More research will be needed to establish whether structural changes in the brain lead to changes in risk attitude, whether an individual's risky choices alter his or her brain structure, or both.
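The relationship being reported is a simple one in statistical terms: across individuals, risk tolerance rises with grey matter volume in the identified region. The sketch below illustrates that kind of volume-behaviour correlation with a linear fit; it is not the study's actual whole-brain analysis, and all numbers are invented.

```python
import numpy as np

# Illustrative only: invented data mimicking a positive volume-behaviour link.
rng = np.random.default_rng(0)
volume = rng.normal(10.0, 1.0, 50)                     # arbitrary volume units
risk_tolerance = 0.5 * volume + rng.normal(0, 0.3, 50) # built-in positive slope

slope, intercept = np.polyfit(volume, risk_tolerance, 1)
r = np.corrcoef(volume, risk_tolerance)[0, 1]

print(f"slope={slope:.2f}, r={r:.2f}")  # positive slope: more volume, more tolerance
```

As the article notes, a correlation like this says nothing about direction of causation; that is exactly the caveat Dr Tymula raises.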

"The findings fit nicely with our previous findings on risk attitude and ageing. In our Proceedings of the National Academy of Sciences 2013 paper we found that as people age they become more risk averse,” she said.

"From other work we know that cortex thins substantially as we age. It is possible that changes in risk attitude over lifespan are caused by thinning of the cortex."

The findings are published in the September 10 issue of The Journal of Neuroscience.

Filed under gray matter brain structure decision making risk aversion parietal cortex neuroscience science

360 notes

Deconstructing the placebo response: Why does it work in treating depression?

In the past three decades, the power of placebos has gone through the roof in treating major depressive disorder. In clinical trials for treating depression over that period, researchers have reported significant increases in patients’ response rates to placebos — the simple sugar pills given to patients who think they may be actual medication.

New research conducted by UCLA psychiatrists helps explain how placebos can have such a powerful effect on depression.

“In short,” said Andrew Leuchter, the study’s first author and a professor of psychiatry at the UCLA Semel Institute for Neuroscience and Human Behavior, “if you think a pill is going to work, it probably will.”

The UCLA researchers examined three forms of treatment. One was supportive care in which a therapist assessed the patient’s risk and symptoms, and provided emotional support and encouragement but refrained from providing solutions to the patient’s issues that might result in specific therapeutic effects. The other two treatments provided the same type of therapy, but patients also received either medication or placebos.

The researchers found that treatment incorporating either type of pill — real medication or placebo — yielded better outcomes than supportive care alone. Further, the success of the placebo treatment was closely correlated with people’s expectations before they began treatment. Those who believed that medication was likely to help them were much more likely to respond to placebos. Their belief in the effectiveness of medication was not related to the likelihood of benefitting from medication, however.

“Our study indicates that belief in ‘the power of the pill’ uniquely drives the placebo response, while medications are likely to work regardless of patients’ belief in their effectiveness,” Leuchter said.

The study appears in the current online edition of the British Journal of Psychiatry.

At the beginning and end of the study, patients were asked to complete the Hamilton Rating Scale for Depression, giving researchers a quantitative assessment of how their depression levels changed during treatment. Those who received antidepressant medication and supportive care improved an average of 46 percent, patients who received placebos and supportive care improved an average of 36 percent, and those who received supportive care alone improved an average of just 5 percent.
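The improvement figures above are percent reductions in Hamilton Rating Scale for Depression (HDRS) scores between baseline and end of treatment. A minimal sketch of that arithmetic, with invented example scores chosen to land near the reported group averages:

```python
# Percent reduction in a depression rating score; the scores are invented examples.

def percent_improvement(baseline, final):
    """Percent reduction in HDRS score from baseline to end of treatment."""
    return 100.0 * (baseline - final) / baseline

print(percent_improvement(24, 13))  # ~46%, e.g. medication + supportive care
print(percent_improvement(25, 16))  # 36%, e.g. placebo + supportive care
print(percent_improvement(20, 19))  # 5%, e.g. supportive care alone
```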

“Interestingly, while we found that medication was more effective than placebo, the difference was modest,” Leuchter said.

The researchers also found that people who received supportive care alone were more likely to discontinue treatment early than those who received pills.

People with major depressive disorder have a persistent low mood, low self-esteem and a loss of pleasure in things they once enjoyed. The disorder can be disabling, and it can affect a person’s family, work or school life, sleeping and eating habits, and overall health.

In the double-blind study, 88 people ages 18 to 65 who had been diagnosed with depression were given eight weeks of treatment. Twenty received supportive care alone, 29 received a placebo with supportive care and 39 received actual medication with supportive care.

The researchers measured the patients’ expectations for how effective they thought medication and general treatment would be, as well as their impressions of the strength of their relationship with the supportive care provider.

“These results suggest a unique role for people’s expectations about their medication in engendering a placebo response,” Leuchter said. “Higher expectations of medication effectiveness predicted an improvement in placebo-treated subjects, and it’s important to note that people’s expectations about how effective a medication may be were already formed before they entered the trial.”

Leuchter said the research indicates that factors such as direct-to-consumer advertising may be shaping people’s attitudes about medication. “It may not be an accident that placebo response rates have soared at the same time the pharmaceutical companies are spending $10 billion a year on consumer advertising.”

(Image credit: © Chris Lamphear)

Filed under placebo major depressive disorder depression mental health health medicine science

98 notes

(Image caption: An undated handout picture released by Japan’s Riken research institute and Foundation for Biomedical Research and Innovation, shows a retina sheet prepared from iPS cells of a woman for transplant surgery. Japanese researchers on Friday conducted the world’s first surgery to implant “iPS” stem cells in a human body in a major boost to regenerative medicine, two institutions involved said. — PHOTO: AFP/RIKEN AND FOUNDATION FOR BIOMEDICAL RESEARCH AND INNOVATION. Adapted from: The Straits Times)

Japanese doctors test method for restoring impaired vision

Japanese doctors have successfully carried out the first ever implantation of a retina grown from induced pluripotent stem cells (iPS).

The recipient was a 70-year-old woman suffering from macular degeneration.

The procedure took place Friday at the Institute of Biomedical Research and Innovation in the southern city of Kobe, under the direction of a group of scientists from the Riken Institute.

Researchers extracted skin samples from the patient to grow iPS cells capable of serving as retinal tissue, which were then used to surgically replace part of the macula, the main photoreceptor layer of the retina.

The scientists said that their priority was not to attempt to restore the patient’s sight, but to determine if there are any unforeseen side effects, such as tumours, arising from the procedure.

According to the researchers, who will monitor the patient’s progress over the next four years, the patient will have already lost most of the cells responsible for vision, so the transplant may bring only slight improvement or merely slow the rate of degeneration.

Macular degeneration is an age-related disease that currently affects about 700,000 people in Japan and is a leading cause of blindness worldwide.

Filed under stem cells iPS cells macular degeneration regenerative medicine medicine science

102 notes

Say ‘ahh’ to let your smartphone check for Parkinson’s disease

Smartphones are designed to be curious. Having already learned about your friendships, your family and the pattern of your daily routine, designers are now interested in your health and fitness.

A new crop of apps and wearable devices continuously measure and analyse vital signs such as movement and heart rate, claiming to count calories, optimise sleep quality and guide diet. While cynics might be tempted to dismiss these products as glorified pedometers for lycra-clad smartphone addicts, new research shows that the hardware inside existing consumer devices can already reliably detect degenerative, life-changing disorders, including Parkinson’s disease.

Parkinson’s currently affects between seven and 10 million people worldwide, and there is no cure. The disease can be diagnosed from a number of characteristic symptoms, including muscle tremor, changes in speech and difficulty with movement. However, diagnosis is challenging and usually involves regular visits to the doctor. It is estimated that one in five people with Parkinson’s are never diagnosed. Even when a diagnosis is made, it can be difficult to accurately assess how effective treatment is in managing the disease.
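One way consumer hardware can pick up a symptom like tremor is frequency analysis of a sensor trace: Parkinsonian resting tremor typically sits in a roughly 4-6 Hz band, which a phone's accelerometer can resolve. The sketch below is illustrative only — it is not the researchers' method, and the sample rate and signal are simulated — but it shows the basic spectral screening idea.

```python
import numpy as np

# Illustrative sketch: screen a simulated accelerometer trace for a 4-6 Hz
# tremor peak. Sample rate, amplitudes and noise level are all assumptions.
fs = 50.0                                     # assumed sample rate, Hz
t = np.arange(0, 10, 1 / fs)                  # 10 s recording
signal = 0.8 * np.sin(2 * np.pi * 5 * t)      # simulated 5 Hz tremor
signal += 0.1 * np.random.default_rng(1).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak_hz = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC component

print(f"dominant frequency: {peak_hz:.1f} Hz")
print("in tremor band" if 4 <= peak_hz <= 6 else "outside tremor band")
```

A real screening app would of course need far more than a single spectral peak — voice features, movement tests and clinical validation — which is what the research described here investigates.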

Read more

Filed under parkinson's disease technology health science

244 notes

Research Shows How Brain Can Tell Magnitude of Errors

University of Pennsylvania researchers have made another advance in understanding how the brain detects errors caused by unexpected sensory events. This type of error detection is what allows the brain to learn from its mistakes, which is critical for improving fine motor control.  

Their previous work explained how the brain can distinguish true error signals from noise; their new findings show how it can tell the difference between errors of different magnitudes. Fine-tuning a tennis serve, for example, requires that the brain distinguish whether it needs to make a minor correction if the ball barely misses the target or a much bigger correction if it is way off.

The study was led by Javier Medina, an assistant professor in the Department of Psychology in Penn’s School of Arts & Sciences, and Farzaneh Najafi, then a graduate student in the Department of Biology. They collaborated with postdoctoral fellow Andrea Giovannucci and associate professor Samuel S. H. Wang of Princeton University.

It was published in the journal eLife.

Our movements are controlled by neurons known as Purkinje cells. Each muscle receives instructions from a dedicated set of hundreds of these brain cells. The instructions sent by each set of Purkinje cells are constantly fine-tuned by climbing fibers, a specialized group of neurons that alert Purkinje cells whenever an unexpected stimulus occurs.

“An unexpected stimulus is often a sign that something has gone wrong,” Medina said. “When this happens, climbing fibers send signals to their related Purkinje cells that an error has occurred. These Purkinje cells can then make changes to avoid the error in the future.”

These error signals are mixed in with random firings of the climbing fibers, however, and researchers were long mystified about how the brain tells the difference between this noise and the useful, error-related information it needs to improve motor control.

Medina and his team showed the mechanism behind this differentiation in a study published earlier this year. By using a non-invasive microscopy technique that could monitor the Purkinje cells of awake and active mice, the researchers could measure the level of calcium within these cells when they received signals from climbing fibers.

The unexpected stimuli in this experiment were random puffs of air to the face, which caused the mice to blink. The researchers located Purkinje cells that control the mice’s eyelids and saw that calcium levels necessary for neuroplasticity, i.e., the brain’s ability to learn, were greater when the mice got an error signal triggered by a puff of air than they were after a random signal.

While being able to make such a distinction is critical to the brain’s ability to improve motor control, more information is needed to fine-tune it.  

“We wanted to see if the Purkinje cells could tell the difference not just between random firings and true error signals but between smaller and bigger errors,” Medina said.

In their new study, the researchers used the same experimental set-up, with one key difference. They used air puffs of different durations: 15 milliseconds and 30 milliseconds.

What they found was that the eyelid-associated Purkinje cells filled with different amounts of calcium depending on the length of the puff; the longer ones produced larger spikes in calcium levels.        

In addition, the researchers saw that different percentages of eyelid-related Purkinje cells respond depending on the length of the puff.  

“Though there is a large population of climbing fibers that can give error-related information to the relevant Purkinje cells when they encounter something unexpected, not all of them fire each time,” Medina said. “We saw that there is information coded in the number of climbing fibers that fire. The longer puffs corresponded to more climbing fibers sending signals to their Purkinje cells.”
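The population-coding idea Medina describes — larger errors recruiting more climbing fibers — can be caricatured in a few lines. This is a toy model, not the study's analysis: each fiber is assumed to fire independently with a probability that grows with error size, and the firing probabilities and fiber count are invented.

```python
import numpy as np

# Toy model of magnitude coding: bigger errors recruit more climbing fibers.
# Fiber count and firing probabilities are invented for illustration.
rng = np.random.default_rng(42)
n_fibers = 1000

def fibers_recruited(p_fire):
    """Number of climbing fibers that fire for an error of a given strength."""
    return int((rng.random(n_fibers) < p_fire).sum())

short_puff = fibers_recruited(0.3)  # smaller error: lower firing probability
long_puff = fibers_recruited(0.6)   # larger error: higher firing probability

print(short_puff < long_puff)       # larger error recruits more fibers
```

On this picture, the downstream Purkinje cells can read out error magnitude from how many of their climbing-fiber inputs are active, matching the calcium results: longer puffs, more recruited fibers, larger calcium spikes.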

Their study could help explain how practice makes perfect, even when errors are imperceptibly small.

“If you felt a short puff and a long puff, you might not be able to say which one was which, but Purkinje cells can tell the difference,” Medina said. “The difference between a ‘very good’ and an ‘awesome’ tennis serve rests on being able to distinguish errors even as tiny as that.” 

Filed under cerebellum purkinje cells motor learning motor control brain cells climbing fibers neuroscience science
