Neuroscience


Pain Relief Through Distraction: It's Not All in Your Head

ScienceDaily (May 17, 2012) — Mental distractions make pain easier to take, and those pain-relieving effects aren’t just in your head, according to a report published online on May 17 in Current Biology, a Cell Press publication.

The findings, based on high-resolution spinal fMRI (functional magnetic resonance imaging) scans taken as people experienced painful levels of heat, show that mental distraction actually inhibits the response to incoming pain signals at the earliest stage of central pain processing.

"The results demonstrate that this phenomenon is not just a psychological phenomenon, but an active neuronal mechanism reducing the amount of pain signals ascending from the spinal cord to higher-order brain regions," said Christian Sprenger of the University Medical Center Hamburg-Eppendorf.

Those effects involve endogenous opioids, which are naturally produced by the brain and play a key role in the relief of pain, the new evidence shows.

The research group asked participants to complete either a hard or an easy memory task, both requiring them to remember letters, while the researchers simultaneously applied a painful level of heat to their arms.

When study participants were more distracted by the harder of the two memory tasks, they did indeed perceive less pain. What’s more, their less painful experience was reflected by lower activity in the spinal cord as observed by fMRI scans. (fMRI is often used to measure changes in brain activity, Sprenger explained, and recent advances have made it possible to extend this tool for use in the spinal cord.)

Sprenger and colleagues then repeated the study, this time giving participants either naloxone, a drug that blocks the effects of opioids, or a simple saline infusion. The pain-relieving effect of distraction dropped by 40 percent under the opioid antagonist compared with saline, evidence that endogenous opioids play an essential role.

The findings show just how deeply mental processes can go in altering the experience of pain, and that may have clinical importance.

"Our findings strengthen the role of cognitive-behavioral therapeutic approaches in the treatment of pain diseases, as it could be extrapolated that these approaches might also have the potential to alter the underlying neurobiological mechanisms as early as in the spinal cord," the researchers say.

Source: Science Daily

May 17, 2012
#science #neuroscience #brain #psychology #pain
Suspicion resides in two regions of the brain

May 17, 2012

Fool me once, shame on you. Fool me twice, shame on my parahippocampal gyrus.


Read Montague, Ph.D., and colleagues at the Virginia Tech Carilion Research Institute discovered two distinct sites for suspicion in the brain: the amygdala, which correlates strongly with a baseline distrustfulness, and the parahippocampal gyrus, which acts like a cerebral lie detector. Credit: Virginia Tech

Scientists at the Virginia Tech Carilion Research Institute have found that suspicion resides in two distinct regions of the brain: the amygdala, which plays a central role in processing fear and emotional memories, and the parahippocampal gyrus, which is associated with declarative memory and the recognition of scenes.

"We wondered how individuals assess the credibility of other people in simple social interactions," said Read Montague, director of the Human Neuroimaging Laboratory and the Computational Psychiatry Unit at the Virginia Tech Carilion Research Institute, who led the study. "We found a strong correlation between the amygdala and a baseline level of distrust, which may be based on a person’s beliefs about the trustworthiness of other people in general, his or her emotional state, and the situation at hand. What surprised us, though, is that when other people’s behavior aroused suspicion, the parahippocampal gyrus lit up, acting like an inborn lie detector.”

The scientists used functional magnetic resonance imaging, or fMRI, to study the neural basis of suspicion. Seventy-six pairs of players, each with a buyer and a seller, competed in 60 rounds of a simple bargaining game while having their brains scanned. At the beginning of each round, the buyer would learn the value of a hypothetical widget and suggest a price to the seller. The seller would then set the price. If the seller’s price fell below the widget’s given value, the trade would go through, with the seller receiving the selling price and the buyer receiving any difference between the selling price and the actual value. If the seller’s price exceeded the value, though, the trade would not execute, and neither party would receive cash.
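
The payoff rule of the bargaining game can be sketched in a few lines. This is an illustrative reconstruction from the article's description, not code from the study; the tie case (a price exactly equal to the widget's value) is not specified in the article and is treated here as no trade.

```python
def settle_round(value: float, seller_price: float) -> tuple:
    """One round of the buyer-seller game: returns (seller_payoff, buyer_payoff).

    If the seller's price falls below the widget's value, the trade goes
    through: the seller receives the price and the buyer keeps the
    difference. Otherwise no cash changes hands. The strict inequality
    (and hence the tie-breaking rule) is an assumption for illustration.
    """
    if seller_price < value:
        return seller_price, value - seller_price
    return 0.0, 0.0
```

Note the buyer's incentive to shade the suggestion: the lower the seller's resulting price, the larger the buyer's share of the surplus, which is exactly what made the "strategist" buyers' deception profitable.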

The authors found, as detailed in a previous paper, that buyers fell into three strategic categories: 42 percent were incrementalists, who were relatively honest about the widget’s value; 37 percent were conservatives, who adopted the strategy of withholding information; and 21 percent were strategists, who were actively deceptive, mimicking incrementalist behavior by sending high suggestions during low-value trials and then reaping greater benefits by sending low suggestions during high-value trials.

The sellers had a monetary incentive to read the buyers’ strategic profiles correctly, yet they received no feedback about the accuracy of the information they were receiving, so they could not confirm any suspicions about patterns of behavior. Without feedback, the sellers were forced to decide whether they should trust the buyers based on the pricing suggestions alone. “The more uncertain a seller was about a buyer’s credibility,” Montague said, “the more active his or her parahippocampal gyrus became.”

The authors believe a person’s baseline suspicion may have important consequences for his or her financial success. “People with a high baseline suspicion were often interacting with fairly trustworthy buyers, so in ignoring the information those buyers provided, they were giving up potential profits,” said Meghana Bhatt, the first author on the research paper. “The ability to recognize credible information in a competitive environment can be just as important as detecting untrustworthy behavior.”

The findings may also have implications for such psychiatric conditions as paranoia and anxiety disorders, said Montague. “The fact that increased amygdala activation corresponds to an inability to detect trustworthy behavior may provide insight into the social interactions of people with anxiety disorders, who often have increased activity in this area of the brain,” he said.

Provided by Virginia Tech

Source: medicalxpress.com

May 17, 2012
#science #neuroscience #brain #psychology
Alzheimer's Gene Causes Brain's Blood Vessels to Leak Toxins and Die

ScienceDaily (May 16, 2012) — A well-known genetic risk factor for Alzheimer’s disease triggers a cascade of signaling that ultimately results in leaky blood vessels in the brain, allowing toxic substances to pour into brain tissue in large amounts, scientists report May 16 in the journal Nature.


The left photo shows destructive proteins (green) lining blood vessels in living brain tissue of mice with the human ApoE4 gene; after the drug cyclosporine A is added, the harmful proteins are nearly gone (right). (Credit: Image courtesy of University of Rochester Medical Center)

The results come from a team of scientists investigating why a gene called ApoE4 makes people more prone to developing Alzheimer’s. People who carry two copies of the gene have roughly eight to 10 times the risk of developing Alzheimer’s disease compared with people who do not.

A team of scientists from the University of Rochester, the University of Southern California, and other institutions found that ApoE4 works through cyclophilin A, a well-known bad actor in the cardiovascular system that causes inflammation in atherosclerosis and other conditions. The researchers showed that cyclophilin A opens the gates to the brain assault seen in Alzheimer’s.

"We are beginning to understand much more about how ApoE4 may be contributing to Alzheimer’s disease," said Robert Bell, Ph.D., a post-doctoral associate at Rochester who is first author of the paper. "In the presence of ApoE4, increased cyclophilin A causes a breakdown of the cells lining the blood vessels in Alzheimer’s disease in the same way it does in cardiovascular disease or abdominal aneurysm. This establishes a new vascular target to fight Alzheimer’s disease."

The team found that ApoE4 makes it more likely that cyclophilin A will accumulate in large amounts in cells that help maintain the blood-brain barrier, a network of tightly bound cells that line the insides of blood vessels in the brain and carefully regulate which substances are allowed to enter and exit brain tissue.

ApoE4 creates a cascade of molecular signaling that weakens the barrier, causing blood vessels to become leaky. This makes it more likely that toxic substances will leak from the vessels into the brain, damaging cells like neurons and reducing blood flow dramatically by choking off blood vessels.

Doctors have long known that the brain changes seen in Alzheimer’s patients — the death of crucial brain cells called neurons — begin years or even decades before symptoms appear. The cascade described in Nature takes place even earlier in the disease process.

The idea that vascular problems are at the heart of Alzheimer’s disease is one championed for more than two decades by Berislav Zlokovic, M.D., Ph.D., the leader of the team and a neuroscientist formerly with the University of Rochester Medical Center and now at USC. For 20 years, Zlokovic has investigated how blood flow in the brain is affected in people with the disease, and how the blood-brain barrier allows nutrients to pass into the brain, and harmful substances to exit the brain.

At Rochester, Zlokovic struck up a collaboration with Bradford Berk, M.D., Ph.D., a cardiologist and CEO of the Medical Center. For more than two decades Berk has studied cyclophilin A, showing how it promotes destructive forces in blood vessels and how it’s central to the processes that contribute to cardiovascular diseases like atherosclerosis and heart attack.

"As a cardiologist, I’ve been interested in understanding the role of cyclophilin A in patients who suffer from cardiovascular illness," said Berk, a professor at the Aab Cardiovascular Research Institute. "Now our collaboration in Rochester has resulted in the discovery that it also has an important role in Alzheimer’s disease. The finding reinforces the basic research enterprise — you never know when knowledge gained in one area will turn out to be crucial in another."

In studies of mice, the team found that animals carrying the ApoE4 gene had five times as much cyclophilin A as other mice in cells known as pericytes, which are crucial to maintaining the integrity of the blood-brain barrier. Blood vessels died, blood did not flow as completely through the brain as it did in other mice, and harmful substances such as thrombin, fibrin, and hemosiderin entered the brain tissue.

When the team blocked the action of cyclophilin A, either by knocking out its gene or by using the drug cyclosporine A to inhibit it, the damage in the mice was reversed. Blood flow returned to normal, and the unhealthy leakage of toxic substances from the blood vessels into the brain was cut by 80 percent.

The team outlined the chain of events involved. Briefly:

  • When ApoE4 is present, cyclophilin A is much more plentiful;
  • Cyclophilin A causes an increase in the inflammatory molecule NF Kappa B;
  • NF Kappa B boosts levels of matrix metalloproteinases (MMPs), molecules known to damage blood vessels and reduce blood flow.

Altogether, the activity results in a dramatic boost in the amount of toxic substances in brain tissue. And when the cascade is interrupted at any of several points — when ApoE4 is not present, when cyclophilin A is blocked or shut off, or when NF Kappa B or the MMPs are inhibited — the blood-brain barrier is restored, blood flow returns to normal, and toxic substances do not leak into brain tissue.

For many years, researchers studying Alzheimer’s disease have focused largely on amyloid beta, a protein structure that accumulates in the brains of patients with Alzheimer’s disease. The latest work points up the importance of other approaches, said Zlokovic, an adjunct professor at Rochester. At USC, Zlokovic is also deputy director of the Zilkha Neurogenetic Institute, director of the Center for Neurodegeneration and Regeneration, and professor and chair of the Department of Physiology and Biophysics.

"Our study has shown major neuronal injury resulting from vascular defects that are not related to amyloid beta," said Zlokovic. "This damage results from a breakdown of the blood-brain barrier and a reduction in blood flow.

"Amyloid beta definitely has an important role in Alzheimer’s disease," added Zlokovic. "But it’s very important to investigate other leads, perhaps where amyloid beta isn’t as centrally involved."

Source: Science Daily

May 17, 2012
#science #neuroscience #brain #psychology #alzheimer
Human Genes Transplanted Into Zebrafish: Helps Identify Genes Related to Autism, Schizophrenia and Obesity

ScienceDaily (May 16, 2012) — What can a fish tell us about human brain development? Researchers at Duke University Medical Center transplanted a set of human genes into a zebrafish and then used it to identify genes responsible for head size at birth.


Here are images of live zebrafish that were studied for genetics and head size to give insight into human head size. The top fish does not have the gene KCTD13 and its head size and brain size are larger; the middle fish is normal; the fish on the bottom expresses too much of the gene and has the smallest head and brain size. (Credit: Christelle Golzio, Duke Center for Human Disease Modeling and Duke Department of Cell Biology)


Head size in human babies is a feature related to autism, a condition that recent figures show to be more common than previously reported: 1 in 88 children, according to a March 2012 study. Head size is also a feature of other major neurological disorders, such as schizophrenia.

"In medical research, we need to dissect events in biology so we can understand the precise mechanisms that give rise to neurodevelopmental traits," said senior author Nicholas Katsanis, Ph.D., Jean and George Brumley Jr., MD, Professor of Developmental Biology, and Professor of Pediatrics and Cell Biology. "We need expert scientists to work side by side with clinicians who see such anatomic and other problems in patients, if we are to effectively solve many of our medical problems."

The study was published online in the journal Nature on May 16.

Katsanis knew that a region on chromosome 16 was one of the largest genetic contributors to autism and schizophrenia, but a conversation at a European medical meeting pointed him to information that changes within that same region of the genome also were related to changes in a newborn’s head size.

The problem was difficult to address because the region had large deletions and duplications in DNA, which are the most common mutational mechanisms in humans. “Interpretation is harrowingly hard,” said Katsanis, who is also director of the Duke Center for Human Disease Modeling.

The reason is that a duplication of DNA or missing DNA usually involves several genes. “It is very difficult to go from ‘here is a region with many genes, sometimes over 50’ to ‘these are the genes that are driving this pathology,’” Katsanis said.

"There was a light bulb moment," Katsanis said. "The area of the genome we were exploring gave rise to reciprocal (opposite) defects in terms of brain cell growth, so we realized that overexpressing a gene in question might give one phenotype — a smaller head, while shutting down the same gene might yield the other, a larger head."

The researchers transplanted a common duplication area of human chromosome 16 known to contain 29 genes into zebrafish embryos and then systematically turned up the activity of each transplanted human gene to find which might cause a small head (microcephaly) in the fish. They then suppressed the same gene set and asked whether any of them caused the reciprocal defect: larger heads (macrocephaly).

The researchers knew that deletion of the region that contained these 29 genes occurred in 1.7% of children with autism.

It took the team a few months to dissect such a “copy number variant” — an alteration of the genome that results in an abnormal number of one or more sections of chromosomal DNA.

"Now we can go from a genetic finding that is dosage-sensitive and start asking reasonable questions about this gene as it pertains to neurocognitive traits, which is a big leap," Katsanis said. Neurocognitive refers to the ability to think, concentrate, reason, remember, process information, learn, understand and speak.

Many human conditions have anatomical features that are also related to genetics, he said. “There are major limitations in studying autistic or schizophrenic behavior in zebrafish, but we can measure head size, jaw size, or facial abnormalities.”

The single gene in question, KCTD13, is responsible for driving head size in zebrafish by regulating the creation and destruction of new neurons (brain cells). This discovery let the team focus on the analogous gene in humans. “This gene contributes to autism cases, and probably is associated with schizophrenia and also childhood obesity,” Katsanis said.

Once the gene has been uncovered, researchers can examine the protein it produces. “Once you have the protein, you can start asking valuable functional questions and learning what the gene does in the animal or human,” Katsanis said.

Copy number variants, such as the ones this team found on chromosome 16, are now thought to be one of the most common sources of genetic mutations. Hundreds, if not thousands, of such chromosomal deletions and duplications have been found in patients with a broad range of clinical problems, particularly neurodevelopmental disorders.

"Now we may have an efficient tool for dissecting them, which gives us the ability to improve both diagnosis and understanding of disease mechanisms," Katsanis said.

The current study suggests that KCTD13 is a major contributor to some cases of autism, but also points to the synergistic action of this gene with two other genes in the region, named MVP and MAPK3, Katsanis said.

Source: Science Daily

May 17, 2012
#science #neuroscience #genetics #psychology
Internet Usage Patterns May Signify Depression

ScienceDaily (May 16, 2012) — In a new study analyzing Internet usage among college students, researchers at Missouri University of Science and Technology have found that students who show signs of depression tend to use the Internet differently than those who show no symptoms of depression.

Using actual Internet usage data collected from the university’s network, the researchers identified nine fine-grained patterns of Internet usage that may indicate depression. For example, students showing signs of depression tend to use file-sharing services more than their counterparts, and also use the Internet in a more random manner, frequently switching among several applications.

The researchers’ findings provide new insights on the association between Internet use and depression compared to existing studies, says Dr. Sriram Chellappan, an assistant professor of computer science at Missouri S&T and the lead researcher in the study.

"The study is believed to be the first that uses actual Internet data, collected unobtrusively and anonymously, to associate Internet usage with signs of depression," Chellappan says. Previous research on Internet usage has relied on surveys, which are "a far less accurate way" of assessing how people use the Internet, he says.

"This is because when students themselves reported their volume and type of Internet activity, the amount of Internet usage data is limited because people’s memories fade with time," Chellappan says. "There may be errors and social desirability bias when students report their own Internet usage." Social desirability bias refers to the tendency of survey respondents to answer questions in a manner that will be viewed favorably by others.

Chellappan and his fellow researchers collected a month’s worth of Internet data for 216 Missouri S&T undergraduate students. The data was collected anonymously and unobtrusively, and students involved in the study were assigned pseudonyms to keep their identities hidden from the researchers.

Before the researchers collected the usage data from the campus network, the students were tested to determine whether they showed signs of depression. The researchers then analyzed the usage data of the study participants. They found that students who showed signs of depression used the Internet much differently than the other study participants.

Chellappan and his colleagues found that depressed students tended to use file-sharing services, send email and chat online more than the other students. Depressed students also made heavier use of high "packets per flow" applications (high-bandwidth applications often associated with online videos and games) than their counterparts did.

Students who showed signs of depression also tended to use the Internet in a more “random” manner — frequently switching among applications, perhaps from chat rooms to games to email. Chellappan thinks that randomness may indicate trouble concentrating, a characteristic associated with depression.

The randomness stood out to Chellappan after his graduate student, Raghavendra Kotikalapudi, examined the “flow duration entropy” of students’ online usage. Flow duration entropy refers to the consistency of Internet use during certain periods of time. The lower the flow duration entropy, the more consistent the Internet use.

"Students showing signs of depression had high flow duration entropy, which means that the duration of Internet flows of these students is highly inconsistent," Chellappan says.
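
The flow duration entropy described above can be illustrated with a short sketch. The fixed-width binning of durations and the use of Shannon entropy here are assumptions for illustration; the paper's exact definition may differ.

```python
import math
from collections import Counter

def flow_duration_entropy(durations, bin_width=5.0):
    """Shannon entropy (in bits) of a user's Internet flow durations.

    Durations (e.g. in seconds) are bucketed into fixed-width bins and
    the entropy of the resulting histogram is computed. Lower entropy
    means more consistent flow lengths; higher entropy means the user's
    sessions vary unpredictably. The bin width is an invented parameter
    for this sketch, not a value from the study.
    """
    bins = Counter(int(d // bin_width) for d in durations)
    total = sum(bins.values())
    return -sum((n / total) * math.log2(n / total) for n in bins.values())
```

For example, a user whose flows all last about the same time lands in one bin and scores an entropy of zero, while a user whose flows are scattered across many durations scores higher, matching the article's point that inconsistent usage corresponded to depressive symptoms.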

Based on the Center for Epidemiologic Studies Depression (CES-D) scale administered at the start of the study, about 30 percent of the students met the minimum criteria for depression. Nationally, previous studies suggest that between 10 percent and 40 percent of all American college students suffer from depression.

The campus information technology department provided the on-campus Internet usage data for each pseudonymous participant for the month of February 2011.

The researchers’ analysis of the month’s worth of data led Chellappan and his colleagues to conclude that students who were identified as exhibiting symptoms of depression used the Internet differently than the other students in the study.

Chellappan’s research has been accepted for publication in a forthcoming issue of IEEE Technology and Society Magazine.

The chief author of the paper is Kotikalapudi, who received his master of science degree in computer science from Missouri S&T in December 2011. His co-authors are Chellappan; Dr. Frances Montgomery, Curators’ Teaching Professor of psychological science; Dr. Donald C. Wunsch, the M.K. Finley Missouri Distinguished Professor of Computer Engineering; and Karl F. Lutzen, information security officer for Missouri S&T’s IT department.

Chellappan is now interested in using these findings to develop software that could be installed on home computers to help individuals determine whether their Internet usage patterns may indicate depression. The software would unobtrusively monitor Internet usage and alert individuals if their usage patterns indicate symptoms of depression.

"The software would be a cost-effective and an in-home tool that could proactively prompt users to seek medical help if their Internet usage patterns indicate possible depression," Chellappan says. "The software could also be installed on campus networks to notify counselors of students whose Internet usage patterns are indicative of depressive behavior."

Chellappan also believes the method used to connect Internet use and depression could also help diagnose other mental disorders like anorexia, bulimia, attention deficit hyperactivity disorder or schizophrenia.

"We could also investigate associations between other Internet features like visits to social networking sites, late night Internet use and randomness in time of Internet use with depressive symptoms," he says. "Applications of this study to diagnose and treat mental disorders for other vulnerable groups like the elderly and military veterans are also significant."

Source: Science Daily

May 17, 2012
#science #neuroscience #brain #psychology #depression
Head Impacts in Contact Sports May Reduce Learning in College Athletes

ScienceDaily (May 16, 2012) — A new study suggests that head impacts experienced during contact sports such as football and hockey may worsen some college athletes’ ability to acquire new information. The research is published in the May 16, 2012, online issue of Neurology®, the medical journal of the American Academy of Neurology.


A new study suggests that head impacts experienced during contact sports such as football and hockey may worsen some college athletes’ ability to acquire new information. (Credit: © modestil / Fotolia)

The study involved college athletes at three Division I schools and compared 214 athletes in contact sports with 45 athletes in non-contact sports such as track, crew and Nordic skiing, at the beginning and at the end of their seasons. The contact sport athletes wore special helmets that recorded acceleration and other data at the time of any head impact.

The contact sport athletes experienced an average of 469 head impacts during the season. Athletes were not included in the study if they were diagnosed with a concussion during the season.

All of the athletes took tests of thinking and memory skills before and after the season. A total of 45 contact sport athletes and 55 non-contact sport athletes from one of the schools also took an additional set of tests of concentration, working memory and other skills.

"The good news is that overall there were few differences in the test results between the athletes in contact sports and the athletes in non-contact sports," said study author Thomas W. McAllister, MD, of The Geisel School of Medicine at Dartmouth in Lebanon, N.H. "But we did find that a higher percentage of the contact sport athletes had lower scores than would have been predicted after the season on a measure of new learning than the non-contact sport athletes."

A total of 22 percent of the contact sport athletes performed worse than expected on the test of new learning, compared with 4 percent of the non-contact sport athletes.

McAllister noted that the study did not find differences in test results between the two groups of athletes at the beginning of the season, suggesting that the cumulative head impacts that contact athletes had incurred over many previous seasons did not result in reduced thinking and memory skills in the overall group.

"These results are somewhat reassuring, given the recent heightened concern about the potential negative effects of these sports," he said. "Nevertheless, the findings do suggest that repetitive head impacts may have a negative effect on some athletes."

McAllister said it’s possible that some people may be genetically more sensitive to head impacts.

Source: Science Daily

May 17, 2012
#science #neuroscience #brain #psychology
Character Traits Determined Genetically? Genes May Hold the Key to a Life of Success, Study Suggests

ScienceDaily (May 16, 2012) — Genes play a greater role in forming character traits — such as self-control, decision making or sociability — than was previously thought, new research suggests.


Identical twin boys. Genes play a greater role in forming character traits — such as self-control, decision making or sociability — than was previously thought, new research suggests. (Credit: © vgm6 / Fotolia)

A study of more than 800 sets of twins found that genetics were more influential in shaping key traits than a person’s home environment and surroundings.

Psychologists at the University of Edinburgh, who carried out the study, say that genetically influenced characteristics could well be the key to how successful a person is in life.

The study of twins in the US — most aged 50 and over — used a series of questions to test how they perceived themselves and others. Questions included “Are you influenced by people with strong opinions?” and “Are you disappointed about your achievements in life?”

The results were then measured against the Ryff Psychological Well-Being Scale, which assesses and standardizes these characteristics.

By tracking their answers, the research team found that identical twins — whose DNA is essentially the same — were twice as likely to share traits as non-identical twins.

Psychologists say the findings are significant because the stronger the genetic link, the more likely it is that these character traits are carried through a family.
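
The logic of doubling the gap between identical and non-identical twin correlations is the basis of the classical Falconer heritability estimate. The formula below is the textbook illustration of that reasoning, not the study's own method; twin studies of this size typically fit more elaborate biometric models to scores such as the Ryff scale.

```python
def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """Falconer's classic estimate of heritability from twin correlations:

        h^2 = 2 * (r_MZ - r_DZ)

    Identical (MZ) twins share essentially all of their DNA while
    non-identical (DZ) twins share about half, so doubling the gap
    between the two pairwise correlations attributes that gap to
    genetic influence. Shown for illustration only.
    """
    return 2.0 * (r_mz - r_dz)
```

For example, if identical twins correlate at 0.5 on a well-being trait and non-identical twins at 0.25, the estimated heritability is 0.5; if the two correlations are equal, the estimate is zero, i.e. no detectable genetic contribution.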

Professor Timothy Bates, of the University of Edinburgh’s School of Philosophy, Psychology and Language Sciences, said that the genetic influence was strongest on a person’s sense of self-control.

Researchers found that genes affected a person’s sense of purpose, how well they get on with people and their ability to continue learning and developing.

Professor Bates added: “Ever since the ancient Greeks, people have debated the nature of a good life and the nature of a virtuous life. Why do some people seem to manage their lives, have good relationships and cooperate to achieve their goals while others do not? Previously, the role of family and the environment around the home often dominated people’s ideas about what affected psychological well-being. However, this work highlights a much more powerful influence from genetics.”

The study, which builds on previous research that found that happiness is underpinned by genes, is published online in the Journal of Personality.

Source: Science Daily

May 17, 2012
#neuroscience #psychology #science #genetics
Damaged Connections in Phineas Gage's Brain: Famous 1848 Case of Man Who Survived Accident Has Modern Parallel

ScienceDaily (May 16, 2012) — Poor Phineas Gage. In 1848, the supervisor for the Rutland and Burlington Railroad in Vermont was using a 13-pound, 3-foot-7-inch rod to pack blasting powder into a rock when he triggered an explosion that drove the rod through his left cheek and out of the top of his head. As reported at the time, the rod was later found, “smeared with blood and brains.”


Recreation of Gage accident. (Credit: Copyright John Darrell Van Horn and the UCLA Laboratory of Neuro Imaging, 2012)

Miraculously, Gage lived, becoming the most famous case in the history of neuroscience — not only because he survived a horrific accident that led to the destruction of much of his left frontal lobe but also because of the injury’s reported effects on his personality and behavior, which were said to be profound. Gage went from being an affable 25-year-old to one who was fitful, irreverent and profane. His friends and acquaintances said he was “no longer Gage.”

Over the years, various scientists have studied and argued about the exact location and degree of damage to Gage’s cerebral cortex and the impact it had on his personality. Now, for the first time, researchers at UCLA, using brain-imaging data that was lost to science for a decade, have broadened the examination of Gage to look at the damage to the white matter “pathways” that connect various regions of the brain.

Reporting in the May 16 issue of the journal PLoS ONE, Jack Van Horn, a UCLA assistant professor of neurology, and colleagues note that while approximately 4 percent of the cerebral cortex was intersected by the rod’s passage, more than 10 percent of Gage’s total white matter was damaged. The passage of the tamping iron caused widespread damage to the white matter connections throughout Gage’s brain, which likely was a major contributor to the behavioral changes he experienced.

Because white matter and its myelin sheath — the fatty coating around the nerve fibers that form the basic wiring of the brain — connect the billions of neurons that allow us to reason and remember, the research not only adds to the lore of Phineas Gage but may eventually lead to a better understanding of multiple brain disorders that are caused in part by similar damage to these connections.

"What we found was a significant loss of white matter connecting the left frontal regions and the rest of the brain," said Van Horn, who is a member of UCLA’s Laboratory of Neuro Imaging (LONI). "We suggest that the disruption of the brain’s ‘network’ considerably compromised it. This may have had an even greater impact on Mr. Gage than the damage to the cortex alone in terms of his purported personality change."

LONI is part of an ambitious joint effort with Massachusetts General Hospital and the National Institutes of Health to document the trillions of microscopic links between every one of the brain’s 100 billion neurons — the so-called “connectome.” And because mapping the brain’s physical wiring eventually will lead to answers about what causes mental conditions that may be linked to the breakdown of these connections, it was appropriate, as well as historically interesting, to take a new look at the damage to Gage’s brain.

Since Gage’s 189-year-old skull, which is on display in the Warren Anatomical Museum at Harvard Medical School, is now fragile and unlikely to be subjected to medical imaging again, the researchers had to track down the last known imaging data, from 2001, which had been lost for some 10 years due to various circumstances at Brigham and Women’s Hospital, a teaching affiliate of Harvard.

The authors were able to recover the computed tomographic data files and managed to reconstruct the scans, which revealed the highest-quality resolution available for modeling Gage’s skull. Next, they utilized advanced computational methods to model and determine the exact trajectory of the tamping iron that shot through his skull. Finally, because the original brain tissue was, of course, long gone, the researchers used modern-day brain images of males that matched Gage’s age and (right) handedness, then used software to position a composite of these 110 images into Gage’s virtual skull, the assumption being that Gage’s anatomy would have been similar.

Van Horn found that nearly 11 percent of Gage’s white matter was damaged, along with 4 percent of the cortex.

"Our work illustrates that while cortical damage was restricted to the left frontal lobe, the passage of the tamping iron resulted in the widespread interruption of white matter connectivity throughout his brain, so it likely was a major contributor to the behavioral changes he experienced," Van Horn said. "Connections were lost between the left frontal, left temporal and right frontal cortices and the left limbic structures of the brain, which likely had considerable impact on his executive as well as his emotional functions."

And while Gage’s personality changed, he eventually was able to travel and find employment as a stagecoach driver for several years in South America. Ultimately, he died in San Francisco, 12 years after the accident.

Van Horn noted a modern parallel.

"The extensive loss of white matter connectivity, affecting both hemispheres, plus the direct damage by the rod, which was limited to the left cerebral hemisphere, is not unlike modern patients who have suffered a traumatic brain injury," he said. "And it is analogous to certain forms of degenerative diseases, such as Alzheimer’s disease or frontotemporal dementia, in which neural pathways in the frontal lobes are degraded, which is known to result in profound behavioral changes."

Van Horn noted that the quantification of the changes to Gage’s brain’s pathways might well provide important insights for clinical assessment and outcome-monitoring in modern-day brain trauma patients.

Source: Science Daily

May 17, 2012
#science #neuroscience #brain #psychology
Positive feedback in the developing brain

May 16, 2012

(Medical Xpress) — When an animal is born, its early experiences help map out the still-forming connections in its brain. As neurons in sensory areas of the brain fire in response to sights, smells, and sounds, synapses begin to form, laying the neuronal groundwork for activity later in life. Not all parts of the brain receive input directly from the external world, however, and researchers have wondered how these regions build their wiring early in development.

image

The output of this indirect-pathway neuron in the striatum of a mouse brain has been genetically silenced. The neuron has been filled through the attached electrode with a red fluorophore to measure its spine density and the number of active synapses. In the background, other indirect pathway neurons are seen in green and red. Credit: Bernardo Sabatini

New research from Howard Hughes Medical Institute investigator Bernardo Sabatini and colleagues on the basal ganglia, a region of the brain that controls motor planning, indicates that development here follows a different strategy. The new findings suggest that wiring of the basal ganglia during early development is driven not only by experience, but also by a self-reinforcing loop of neuronal signaling. As the loop strengthens, more synapses form.

The basal ganglia help an animal select its actions based on sensory and social context, as well as past experience. The new clues about how the basal ganglia get wired shortly after birth, described in the May 13, 2012, issue of the journal Nature, may help scientists understand what happens when the region goes awry: in Parkinson’s disease, for example, degradation of neurons in the basal ganglia interferes with patients’ ability to initiate appropriate movements, and in drug addiction, overstimulation of the basal ganglia spurs inappropriate actions. Sabatini says his team’s findings also suggest that the process can be easily perturbed during development, and may contribute to human disorders such as cerebral palsy and attention deficit hyperactivity disorder.

Although the basal ganglia do not receive direct messages from the external world, this region of the brain is by no means anatomically isolated: it receives signals from all over the cortex, and its output eventually returns to the cortex. Sabatini, who is at Harvard Medical School, explains that to select a motor action, the brain likely signals through that whole loop. “The question is, how do you lay down the circuits for those patterns?”

The basal ganglia are complex, containing many clusters of cells, some of which send excitatory signals and others inhibitory. Sabatini’s group focused on the basal ganglia’s main input station, the striatum. The striatum uses the information it receives to help direct movement in two ways: a ‘direct’ pathway stimulates motor actions and an ‘indirect’ pathway inhibits them. To learn how striatal activity affects circuit development, Sabatini’s team studied mutant mice whose indirect or direct pathways were turned off (because they were unable to release the inhibitory chemical messenger, GABA).

The group expected that silencing these neurons would prevent them from forming connections with the neurons that should have been receiving their signals. To their surprise, the silenced neurons survived and wired themselves to their targets normally. Unexpectedly, however, silencing the striatum’s direct pathway seemed to prevent formation of the connections sending input to the striatum. Silencing the indirect pathway upped the number of inputs. “We went into this study thinking completely differently,” says Sabatini. “What we found is that silencing these neurons doesn’t really change their output patterns — of course they are silenced, but they still find their targets and survive — but instead drastically influences their inputs.”

To see whether individual cells help set up the basal ganglia circuit, Sabatini’s group turned off a select few striatal neurons, rather than whole pathways, in the mice. They found that silencing these neurons did not affect excitatory connections to the area, suggesting that circuit-level activity patterns set up the basal ganglia’s wiring, rather than individual genes or molecules within cells. “It’s hard to believe that there are molecular cues that specify these structures, because it would be way too complicated,” Sabatini says.

When the group dampened activity in neurons that project from the brain’s cortex to the striatum during development, then examined the brain when the mouse had reached early adulthood (25 days after birth), they saw fewer neuronal connections in the striatum compared to mice that had developed normally, suggesting that early perturbations in development can have lasting effects. “That experiment is what told us that it’s the ongoing activity of cortical neurons that is driving this process in the striatum,” Sabatini says. The axons — the slender processes of the neuron that carry electrical impulses — stimulate striatal cells by releasing the excitatory neurotransmitter glutamate, telling them to make more synapses and stabilize them, he adds.

Sabatini believes that the basal ganglia tests random connection patterns after an animal is born and reinforces the correct ones. This type of plasticity of the basal ganglia probably lasts into adulthood, because animals are constantly learning to take new actions. Using genetically engineered mice that allow researchers to control exactly which neurons to inactivate and when, Sabatini’s group is now studying how perturbations affect the wiring later in life.

Sabatini expects that these results will get us a step closer to understanding human disease. “Maybe we will show that there’s hope for therapy,” he adds. “If it is plastic, maybe we can recover.”

Provided by Howard Hughes Medical Institute

Source: medicalxpress.com

May 16, 2012
#science #neuroscience #brain #psychology
Let's get moving: Unravelling how locomotion starts

May 16, 2012

(Medical Xpress) — Scientists at the University of Bristol have shed new light on one of the great unanswered questions of neuroscience: how the brain initiates rhythmic movements like walking, running and swimming.

image

The Xenopus frog tadpole is a small, simple vertebrate

While experiments in the 1970s using electrical brain stimulation identified areas of the brain responsible for starting locomotion, the precise neuron-by-neuron pathway has not been described in any vertebrate – until now. 

To find this pathway, Dr. Edgar Buhl and colleagues in Bristol’s School of Biological Sciences studied a small, simple vertebrate: the Xenopus frog tadpole.

They found that the pathway to initiate swimming consists of just four types of neurons.  By touching skin on the head of the tadpole and applying cellular neurophysiology and anatomy techniques, the scientists identified nerve cells that detect the touch on the skin, two types of brain nerve cells which pass on the signal, and the motor nerve cells that control the swimming muscles. 

Dr. Buhl said: “These findings address the longstanding question of how locomotion is initiated following sensory stimulation and, for the first time in any vertebrate, define in detail a direct pathway responsible.  They could thus be of great evolutionary interest and could also open the path to understanding initiation of locomotion in other vertebrates.”

When mechanisms in the brain that initiate locomotion break down – for example, in people with Parkinson’s disease – starting to walk becomes a real problem.  Therefore, understanding the initiation of swimming in tadpoles could be a first step towards understanding the initiation of locomotion in more complex vertebrates, including people, and may eventually have implications for treating movement disorders such as Parkinson’s.

The research is published today in the Journal of Physiology.

Provided by University of Bristol

Source: medicalxpress.com

May 16, 2012
#science #neuroscience #brain #psychology
Surgeons Restore Some Hand Function to Quadriplegic Patient

May 15th, 2012

Technique could help those with C6, C7 spinal cord injuries.

Surgeons at Washington University School of Medicine in St. Louis have restored some hand function in a quadriplegic patient with a spinal cord injury at the C7 vertebra, the lowest bone in the neck. Instead of operating on the spine itself, the surgeons rerouted working nerves in the upper arms. These nerves still “talk” to the brain because they attach to the spine above the injury.

Following the surgery, performed at Barnes-Jewish Hospital, and one year of intensive physical therapy, the patient regained some hand function, specifically the ability to bend the thumb and index finger. He can now feed himself bite-size pieces of food and write with assistance.

The case study, published online May 15 in the Journal of Neurosurgery, is, to the authors’ knowledge, the first reported case of using nerve transfer to restore the ability to flex the thumb and index finger after a spinal cord injury.

“This procedure is unusual for treating quadriplegia because we do not attempt to go back into the spinal cord where the injury is,” says surgeon Ida K. Fox, MD, assistant professor of plastic and reconstructive surgery at Washington University, who treats patients at Barnes-Jewish Hospital. “Instead, we go out to where we know things work — in this case the elbow — so that we can borrow nerves there and reroute them to give hand function.”

image

To detour around the block in this patient’s C7 spinal cord injury and return hand function, Mackinnon operated in the upper arms. There, the working nerves that connect above the injury (green) and the non-working nerves that connect below the injury (red) run parallel to each other, making it possible to tap into a functional nerve and direct those signals to a non-functional neighbor (yellow arrow). Image adapted from an illustration by Eric Young in the accompanying press release.

Although patients with spinal cord injuries at the C6 and C7 vertebrae have no hand function, they do have shoulder, elbow and some wrist function because the associated nerves attach to the spinal cord above the injury and connect to the brain. Since the surgeon must tap into these working nerves, the technique will not benefit patients who have lost all arm function due to higher injuries — in vertebrae C1 through C5.

The surgery was developed and performed by the study’s senior author Susan E. Mackinnon, MD, chief of the Division of Plastic and Reconstructive Surgery at Washington University School of Medicine. Specializing in injuries to peripheral nerves, she has pioneered similar surgeries to return function to injured arms and legs.

Mackinnon originally developed this procedure for patients with arm injuries specifically damaging the nerves that provide the ability to flex the thumb and index finger. This is the first time she has applied this peripheral nerve technique to return limb function after a spinal cord injury.

[Video: Surgeons restore some hand function to quadriplegic patient]

“Many times these patients say they would like to be able to do very simple things,” Fox says. “They say they would like to be able to feed themselves or write without assistance. If we can restore the ability to pinch, between thumb and index finger, it can return some very basic independence.”

Mackinnon cautions that the hand function restored to the patient was not instantaneous and required intensive physical therapy. It takes time to retrain the brain to understand that nerves that used to bend the elbow now provide pinch, she says.

Though this study reports only one case, Mackinnon and her colleagues do not anticipate a limited window of time during which a patient with a similar spinal cord injury must be treated with this nerve transfer technique. This patient underwent the surgery almost two years after his injury. As long as the nerve remains connected to the support and nourishment of the spinal cord, even though it no longer “talks” to the brain, the nerve and its associated muscle remain healthy, even years after the injury.

“The spinal cord is the control center for the nerves, which run like spaghetti all the way out to the tips of the fingers and the tips of the toes,” says Mackinnon, the Sydney M. Shoenberg Jr. and Robert H. Shoenberg Professor and director of the School of Medicine’s Center for Nerve Injury and Paralysis. “Even nerves below the injury remain healthy because they are still connected to the spinal cord. The problem is that these nerves no longer ‘talk’ to the brain because the spinal cord injury blocks the signals.”

To detour around the block in this patient’s C7 spinal cord injury and return hand function below the level of the injury, Mackinnon operated in the upper arms. There, the working nerves that connect above the injury and the non-working nerves that connect below the injury run parallel to each other, making it possible to tap into a functional nerve and direct those signals to a non-functional neighbor.

In this case, Mackinnon took a non-working nerve that controls the ability to pinch and plugged it into a working nerve that drives one of two muscles that flex the elbow. After the surgery, the biceps still flexes the elbow, but a second muscle, the brachialis, which also used to provide elbow flexion, now bends the thumb and index finger.

“This is not a particularly expensive or overly complex surgery,” Mackinnon says. “It’s not a hand or a face transplant, for example. It’s something we would like other surgeons around the country to do.”

By Julia Evangelou Strait

Source: Neuroscience News

May 16, 2012
#science #neuroscience
This Is Your Brain On Sugar: Study in Rats Shows High-Fructose Diet Sabotages Learning, Memory

ScienceDaily (May 15, 2012) — Attention, college students cramming between midterms and finals: Binging on soda and sweets for as little as six weeks may make you stupid.

image

New research suggests that binging on soda and sweets for as little as six weeks may make you stupid. (Credit: © RTimages / Fotolia)

A new UCLA rat study is the first to show how a diet steadily high in fructose slows the brain, hampering memory and learning — and how omega-3 fatty acids can counteract the disruption. The peer-reviewed Journal of Physiology publishes the findings in its May 15 edition.

"Our findings illustrate that what you eat affects how you think," said Fernando Gomez-Pinilla, a professor of neurosurgery at the David Geffen School of Medicine at UCLA and a professor of integrative biology and physiology in the UCLA College of Letters and Science. "Eating a high-fructose diet over the long term alters your brain’s ability to learn and remember information. But adding omega-3 fatty acids to your meals can help minimize the damage."

While earlier research has revealed how fructose harms the body through its role in diabetes, obesity and fatty liver, this study is the first to uncover how the sweetener influences the brain.

The UCLA team zeroed in on high-fructose corn syrup, an inexpensive liquid six times sweeter than cane sugar, that is commonly added to processed foods, including soft drinks, condiments, applesauce and baby food. The average American consumes more than 40 pounds of high-fructose corn syrup per year, according to the U.S. Department of Agriculture. “We’re not talking about naturally occurring fructose in fruits, which also contain important antioxidants,” explained Gomez-Pinilla, who is also a member of UCLA’s Brain Research Institute and Brain Injury Research Center. “We’re concerned about high-fructose corn syrup that is added to manufactured food products as a sweetener and preservative.”

Gomez-Pinilla and study co-author Rahul Agrawal, a UCLA visiting postdoctoral fellow from India, studied two groups of rats that each consumed a fructose solution as drinking water for six weeks. The second group also received omega-3 fatty acids in the form of flaxseed oil and docosahexaenoic acid (DHA), which protects against damage to the synapses — the chemical connections between brain cells that enable memory and learning.

"DHA is essential for synaptic function — brain cells’ ability to transmit signals to one another," Gomez-Pinilla said. "This is the mechanism that makes learning and memory possible. Our bodies can’t produce enough DHA, so it must be supplemented through our diet."

The animals were fed standard rat chow and trained on a maze twice daily for five days before starting the experimental diet. The UCLA team tested how well the rats were able to navigate the maze, which contained numerous holes but only one exit. The scientists placed visual landmarks in the maze to help the rats learn and remember the way.

Six weeks later, the researchers tested the rats’ ability to recall the route and escape the maze. What they saw surprised them.

"The second group of rats navigated the maze much faster than the rats that did not receive omega-3 fatty acids," Gomez-Pinilla said. "The DHA-deprived animals were slower, and their brains showed a decline in synaptic activity. Their brain cells had trouble signaling each other, disrupting the rats’ ability to think clearly and recall the route they’d learned six weeks earlier."

The DHA-deprived rats also developed signs of resistance to insulin, a hormone that controls blood sugar and regulates synaptic function in the brain. A closer look at the rats’ brain tissue suggested that insulin had lost much of its power to influence the brain cells.

"Because insulin can penetrate the blood-brain barrier, the hormone may signal neurons to trigger reactions that disrupt learning and cause memory loss," Gomez-Pinilla said.

He suspects that fructose is the culprit behind the DHA-deficient rats’ brain dysfunction. Eating too much fructose could block insulin’s ability to regulate how cells use and store sugar for the energy required for processing thoughts and emotions.

"Insulin is important in the body for controlling blood sugar, but it may play a different role in the brain, where insulin appears to disturb memory and learning," he said. "Our study shows that a high-fructose diet harms the brain as well as the body. This is something new."

Gomez-Pinilla, a native of Chile and an exercise enthusiast who practices what he preaches, advises people to keep fructose intake to a minimum and swap sugary desserts for fresh berries and Greek yogurt, which he keeps within arm’s reach in a small refrigerator in his office. An occasional bar of dark chocolate that hasn’t been processed with a lot of extra sweetener is fine too, he said.

Still planning to throw caution to the wind and indulge in a hot-fudge sundae? Then also eat foods rich in omega-3 fatty acids, like salmon, walnuts and flaxseeds, or take a daily DHA capsule. Gomez-Pinilla recommends one gram of DHA per day.

"Our findings suggest that consuming DHA regularly protects the brain against fructose’s harmful effects," said Gomez-Pinilla. "It’s like saving money in the bank. You want to build a reserve for your brain to tap when it requires extra fuel to fight off future diseases."

Source: Science Daily

May 16, 2012
#science #neuroscience #brain #memory #psychology
Chronic Child Abuse Strong Indicator of Negative Adult Experiences

ScienceDaily (May 15, 2012) — Child abuse and neglect are strong predictors of major health and emotional problems, but little is known about how the chronicity of the maltreatment may increase future harm, apart from other risk factors in a child’s life.

image

This chart illustrates the individual childhood and adult outcomes according to the number of reports that occurred before the event of interest. Because it was possible for some children to enter the study period with a pre-existing condition, these are indicated as gray or black bars with the legend indicating the outcome occurred “before the study.” Chronicity is associated with increasing risk for all but child maltreatment perpetration, violent delinquency, and head or brain injury. In these cases, there is a slight decline in prevalence for the highest category compared with middle categories, but in all cases having reports was associated with higher rates of outcomes. (Credit: Image courtesy of Washington University in St. Louis)

In a new study published in the current issue of the journal Pediatrics, Melissa Jonson-Reid, PhD, child welfare expert and a professor at the Brown School at Washington University in St. Louis, looked at how chronic maltreatment impacted the future health and behavior of children and adults.

The study tracked children by number of child maltreatment reports (zero to four or more) and followed the children into early adulthood, by which time some of the children had become parents.

The study sought to determine how well the number of child maltreatment reports predicted poor outcomes in adolescence, such as delinquency, substance abuse in the teen years or getting a sexually transmitted disease.

"For every measure studied, a more chronic history of child maltreatment reports was powerfully predictive of worse outcomes," Jonson-Reid says.

"For most outcomes, having a single maltreatment report put children at a 20 percent to 50 percent higher risk than non-maltreated comparison children."

In addition, a series of adult outcomes were tracked to see if the chronicity of maltreatment still mattered after controlling for the poor outcomes in adolescence. Adult outcomes included adult substance abuse or growing up and having children whom they then maltreated.

"In models of adult outcomes, children with four or more reports were at least twice as likely to later abuse their own children and have contact with the mental health system, even when controlling for the negative outcomes during adolescence." Jonson-Reid says that there appears to be good reason to put resources into preventing ongoing maltreatment.

"Successfully interrupting chronic child maltreatment may well reduce risk of a wide range of other costly child and adolescent health and behavioral problems," she says.

Jonson-Reid cites a recently published Centers for Disease Control and Prevention study estimating lifetime costs for a single year’s worth of children reported for maltreatment at $242 billion.

"What our study illustrates is that these costs are even more likely to accrue for children who continue to be re-reported," she says.

The study also found that maltreatment predicts a range of negative adolescent outcomes, and those adolescent outcomes then predict poor adult outcomes.

"If the poor outcomes in adolescence can be dealt with effectively, then later adult outcomes may also be forestalled," Jonson-Reid says.

"Our findings could therefore be interpreted as supporting many current evidence-based interventions that seek to improve behavioral and social functioning among children and adolescents who have experienced trauma like abuse or neglect."

Source: Science Daily

May 15, 2012
#science #neuroscience #psychology
Mystery Gene Reveals New Mechanism for Anxiety Disorders

ScienceDaily (May 15, 2012) — A novel mechanism for anxiety behaviors, including a previously unrecognized inhibitory brain signal, may inspire new strategies for treating psychiatric disorders, University of Chicago researchers report.

By testing the controversial role of a gene called Glo1 in anxiety, scientists uncovered a new inhibitory factor in the brain: the metabolic by-product methylglyoxal. The system offers a tantalizing new target for drugs designed to treat conditions such as anxiety disorder, epilepsy, and sleep disorders.

The study, published in the Journal of Clinical Investigation, found that animals with multiple copies of the Glo1 gene were more likely to exhibit anxiety-like behavior in laboratory tests. Further experiments showed that Glo1 increased anxiety-like behavior by lowering levels of methylglyoxal (MG). Conversely, inhibiting Glo1 or raising MG levels reduced anxiety behaviors.

"Animals transgenic for Glo1 had different levels of anxiety-like behavior, and more copies made them more anxious," said Abraham Palmer, PhD, assistant professor of human genetics at the University of Chicago Medicine and senior author of the study. "We showed that Glo1 was causally related to anxiety-like behavior, rather than merely correlated."

In 2005, a comparison of different mouse strains found a link between anxiety-like behaviors and Glo1, the gene encoding the metabolic enzyme glyoxalase 1. However, subsequent studies questioned the link, and the lack of an obvious connection between glyoxalase 1 and brain function or behavior made some scientists skeptical.


May 15, 2012
#science #neuroscience #brain #psychology #anxiety
Drug from lizard saliva reduces cravings for food

May 15, 2012

A drug made from the saliva of the Gila monster lizard is effective in reducing the craving for food. Researchers at the Sahlgrenska Academy, University of Gothenburg, have tested the drug on rats, which after treatment ceased their cravings for both food and chocolate.

image

In a study with rats published in the Journal of Neuroscience, Assistant Professor Karolina Skibicka and her colleagues show that exendin-4 effectively reduces the cravings for food. Credit: Photo: University of Gothenburg

An increasing number of patients suffering from type 2 diabetes are offered a pharmaceutical preparation called Exenatide, which helps them to control their blood sugar. The drug is a synthetic version of a natural substance called exendin-4, which is obtained from a rather unusual source – the saliva of the Gila monster lizard (Heloderma suspectum), North America’s largest lizard.

Researchers at the Sahlgrenska Academy at the University of Gothenburg have now found an entirely new and unexpected effect of the lizard substance.

In a study with rats published in the Journal of Neuroscience, Assistant Professor Karolina Skibicka and her colleagues show that exendin-4 effectively reduces the cravings for food.

"This is both unknown and quite unexpected effect," comments an enthusiastic Karolina Skibicka:

" Our decision to eat is linked to the same mechanisms in the brain which control addictive behaviours. We have shown that exendin-4 affects the reward and motivation regions of the brain"

“The implications of the findings are significant,” states Suzanne Dickson, Professor of Physiology at the Sahlgrenska Academy: “Most dieting fails because we are obsessed with the desire to eat, especially tempting foods like sweets. As exendin-4 suppresses the cravings for food, it can help obese people to take control of their weight,” suggests Professor Dickson.

Research on exendin-4 also gives hope for new ways to treat diseases related to eating disorders, for example, compulsive overeating.

Another hypothesis for the Gothenburg researchers’ continuing studies is that exendin-4 may be used to reduce the craving for alcohol.

"It is the same brain regions which are involved in food cravings and alcohol cravings, so it would be very interesting to test whether exendin-4 also reduces the cravings for alcohol,” suggests Assistant Professor Skibicka.

Provided by University of Gothenburg

Source: medicalxpress.com

May 15, 2012
#neuroscience #science #psychology
Active lifestyle in elderly keeps their brains running

May 15, 2012

(Medical Xpress) — New research from Uppsala University, Sweden, suggests that an active lifestyle in late life protects grey matter and cognitive functions in humans. The findings are now published in the scientific journal Neurobiology of Aging.

In a new study, a multidisciplinary research team from Uppsala University systematically studied 331 men and women at the age of 75 years. The researchers examined whether an active lifestyle is tied to brain health in seniors living in Uppsala, Sweden. The brain structure of each participant was measured using magnetic resonance imaging (MRI), and various memory tests were administered in order to monitor the seniors’ cognitive status.

“We found that those elderly who reported being more active in their daily routine had greater grey and white matter volumes and performed better on various memory tests, compared to those who had a sedentary lifestyle. Interestingly, active elderly also had more grey matter in the precuneus, a brain region that typically shrinks at the beginning of Alzheimer’s disease. Our findings suggest that an active lifestyle is a promising strategy for counteracting cognitive aging late in life,” says Christian Benedict.

The data for the study were taken from the major epidemiological study Prospective Investigation of the Vasculature in Uppsala Seniors (PIVUS). http://www.medsci.uu.se/pivus/

More information: Benedict C et al., Association between physical activity and brain health in older adults, Neurobiology of Aging, in press. http://www.sciencedirect.com/science/article/pii/S0197458012002618

Provided by Uppsala University

Source: medicalxpress.com

May 15, 2012
#science #neuroscience #brain #psychology
First Gene Therapy Successful Against Aging-Associated Decline: Mouse Lifespan Extended Up to 24% With a Single Treatment

ScienceDaily (May 14, 2012) — A new study shows that inducing cells to express telomerase, the enzyme that, metaphorically speaking, slows down the biological clock, can extend lifespan. The research provides a “proof of principle” that this “feasible and safe” approach can effectively “improve health span.”


Pictured are Maria A. Blasco and Bruno M. Bernardes de Jesús (co-author) in the CNIO building in Madrid. (Credit: CNIO)

A number of studies have shown that it is possible to lengthen the average life of individuals of many species, including mammals, by acting on specific genes. To date, however, this has meant altering the animals’ genes permanently from the embryonic stage — an approach impracticable in humans. Researchers at the Spanish National Cancer Research Centre (CNIO), led by its director María Blasco, have demonstrated that the mouse lifespan can be extended by the application in adult life of a single treatment acting directly on the animal’s genes. And they have done so using gene therapy, a strategy never before employed to combat aging. The therapy has been found to be safe and effective in mice.

The results were recently published in the journal EMBO Molecular Medicine. The CNIO team, in collaboration with Eduard Ayuso and Fátima Bosch of the Centre of Animal Biotechnology and Gene Therapy at the Universitat Autònoma de Barcelona (UAB), treated adult (one-year-old) and aged (two-year-old) mice, with the gene therapy delivering a “rejuvenating” effect in both cases, according to the authors.

Mice treated at the age of one lived longer by 24% on average, and those treated at the age of two, by 13%. The therapy, furthermore, produced an appreciable improvement in the animals’ health, delaying the onset of age-related diseases — like osteoporosis and insulin resistance — and achieving improved readings on aging indicators like neuromuscular coordination.

The gene therapy consisted of treating the animals with a DNA-modified virus, the viral genes having been replaced by those of the telomerase enzyme, which plays a key role in aging. Telomerase repairs the extreme ends or tips of chromosomes, known as telomeres, and in doing so slows the cell’s and therefore the body’s biological clock. When the animal is infected, the virus acts as a vehicle, depositing the telomerase gene in the cells.

This study “shows that it is possible to develop a telomerase-based anti-aging gene therapy without increasing the incidence of cancer,” the authors affirm. “Aged organisms accumulate damage in their DNA due to telomere shortening, [this study] finds that a gene therapy based on telomerase production can repair or delay this kind of damage,” they add.

'Resetting' the biological clock

Telomeres are the caps that protect the ends of chromosomes, but they cannot do so indefinitely: each time the cell divides, the telomeres get shorter, until they are so short that they lose all functionality. The cell, as a result, stops dividing and ages or dies. Telomerase gets around this by preventing telomeres from shortening, or even by rebuilding them. What it does, in essence, is stop or reset the cell’s biological clock.
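The shortening-and-reset dynamic described above can be caricatured in a few lines of code. This is a toy numerical sketch, not a model from the study; every number in it (starting length, loss per division, critical length) is an illustrative assumption:

```python
# Toy model (illustrative only): telomeres lose a fixed amount of DNA per
# cell division; telomerase activity offsets part of that loss, extending
# the number of divisions before the telomere becomes critically short.

def divisions_until_senescence(telomere_len=10000, loss_per_division=100,
                               telomerase_gain=0, critical_len=4000):
    """Count cell divisions before the telomere drops below critical_len."""
    net_loss = loss_per_division - telomerase_gain
    if net_loss <= 0:            # telomerase fully compensates: clock never runs out
        return float("inf")
    divisions = 0
    while telomere_len - net_loss >= critical_len:
        telomere_len -= net_loss
        divisions += 1
    return divisions

# Without telomerase the clock runs out; partial activity stretches it out.
print(divisions_until_senescence(telomerase_gain=0))    # 60 divisions
print(divisions_until_senescence(telomerase_gain=50))   # 120 divisions
```

The point of the sketch is only the qualitative shape of the mechanism: any net shortening sets a hard division limit, and telomerase raises or removes that limit.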

But in most cells the telomerase gene is only active before birth; the cells of an adult organism, with few exceptions, have no telomerase. The exceptions in question are adult stem cells and cancer cells, which divide limitlessly and are therefore immortal — in fact several studies have shown that telomerase expression is the key to the immortality of tumour cells.

It is precisely this risk of promoting tumour development that has set back the investigation of telomerase-based anti-aging therapies.

In 2007, Blasco’s group demonstrated that it was feasible to prolong the lives of transgenic mice, whose genome had been permanently altered at the embryonic stage, by causing their cells to express telomerase and, also, extra copies of cancer-resistant genes. These animals live 40% longer than is normal and do not develop cancer.

The mice subjected to the gene therapy now under test are likewise free of cancer. Researchers believe this is because the therapy begins when the animals are adult, so their cells do not have time to accumulate the number of aberrant divisions needed for tumours to appear.

Also important is the kind of virus employed to carry the telomerase gene to the cells. The authors selected demonstrably safe viruses that have been successfully used in gene therapy treatment of hemophilia and eye disease. Specifically, they are non-replicating viruses derived from others that are non-pathogenic in humans.

This study is viewed primarily as “a proof-of-principle that telomerase gene therapy is a feasible and generally safe approach to improve healthspan and treat disorders associated with short telomeres,” state Virginia Boccardi (Second University of Naples) and Utz Herbig (New Jersey Medical School-University Hospital Cancer Centre) in a commentary published in the same journal.

Although this therapy may not find application as an anti-aging treatment in humans, in the short term at least, it could open up a new treatment option for ailments linked with the presence in tissue of abnormally short telomeres, as in some cases of human pulmonary fibrosis.

More healthy years

As Blasco says, “aging is not currently regarded as a disease, but researchers tend increasingly to view it as the common origin of conditions like insulin resistance or cardiovascular disease, whose incidence rises with age. In treating cell aging, we could prevent these diseases.”

With regard to the therapy under testing, Bosch explains: “Because the vector we use expresses the target gene (telomerase) over a long period, we were able to apply a single treatment. This might be the only practical solution for an anti-aging therapy, since other strategies would require the drug to be administered over the patient’s lifetime, multiplying the risk of adverse effects.”

Source: Science Daily

May 15, 2012
#science #neuroscience #brain #psychology
Smoked Cannabis Reduces Some Symptoms of Multiple Sclerosis

May 14th, 2012

Controlled trial shows improved spasticity, reduced pain after smoking medical marijuana.

A clinical study of 30 adult patients with multiple sclerosis (MS) at the University of California, San Diego School of Medicine has shown that smoked cannabis may be an effective treatment for spasticity – a common and disabling symptom of this neurological disease.

The placebo-controlled trial also resulted in reduced perception of pain, although participants also reported short-term, adverse cognitive effects and increased fatigue. The study will be published in the Canadian Medical Association Journal on May 14.

Principal investigator Jody Corey-Bloom, MD, PhD, professor of neurosciences and director of the Multiple Sclerosis Center at UC San Diego, and colleagues randomly assigned participants to either the intervention group (which smoked cannabis once daily for three days) or the control group (which smoked identical placebo cigarettes, also once a day for three days). After an 11-day interval, the participants crossed over to the other group.

“We found that smoked cannabis was superior to placebo in reducing symptoms and pain in patients with treatment-resistant spasticity, or excessive muscle contractions,” said Corey-Bloom.

Earlier reports suggested that the active compounds of medical marijuana were potentially effective in treating neurologic conditions, but most studies focused on orally administered cannabinoids. There were also anecdotal reports from MS patients who endorsed smoking marijuana to relieve symptoms of spasticity.

However, this trial used a more objective measurement: a modified Ashworth scale, which graded the intensity of muscle tone by measuring such things as resistance in range of motion and rigidity. The secondary outcome, pain, was measured using a visual analogue scale. The researchers also looked at physical performance (using a timed walk) and cognitive function and – at the end of each visit – asked patients to assess their feeling of “highness.”

Although generally well tolerated, smoking cannabis did have mild effects on attention and concentration. The researchers noted that larger, long-term studies are needed to confirm their findings and determine whether lower doses can result in beneficial effects with less cognitive impact.

The current study is the fifth clinical test of the possible efficacy of cannabis for clinical use reported by the University of California Center for Medicinal Cannabis Research (CMCR). Four other human studies on control of neuropathic pain also reported positive results.

Source: Neuroscience News

May 15, 2012
#science #neuroscience #brain #psychology
New Type of Retinal Prosthesis Could Better Restore Sight to Blind

May 14th, 2012

Using tiny solar-panel-like cells surgically placed underneath the retina, scientists at the Stanford University School of Medicine have devised a system that may someday restore sight to people who have lost vision because of certain types of degenerative eye diseases.

This device — a new type of retinal prosthesis — involves a specially designed pair of goggles, which are equipped with a miniature camera and a pocket PC designed to process the visual data stream. The resulting images would be displayed on a liquid crystal microdisplay embedded in the goggles, similar to what’s used in video goggles for gaming. Unlike regular video goggles, though, the images would be beamed from the LCD using laser pulses of near-infrared light to a photovoltaic silicon chip — one-third the thickness of a strand of hair — implanted beneath the retina.

Electric currents from the photodiodes on the chip would then trigger signals in the retina, which then flow to the brain, enabling a patient to regain vision.

A study, to be published online May 13 in Nature Photonics, discusses how scientists tested the photovoltaic stimulation using the prosthetic device’s diode arrays in rat retinas in vitro, eliciting electric responses from retinal cells that are widely accepted indicators of visual activity. The scientists are now testing the system in live rats, taking both physiological and behavioral measurements, and are hoping to find a sponsor to support tests in humans.

“It works like the solar panels on your roof, converting light into electric current,” said Daniel Palanker, PhD, associate professor of ophthalmology and one of the paper’s senior authors. “But instead of the current flowing to your refrigerator, it flows into your retina.” Palanker is also a member of the Hansen Experimental Physics Laboratory at Stanford and of the interdisciplinary Stanford research program, Bio-X. The study’s other senior author is Alexander Sher, PhD, of the Santa Cruz Institute of Particle Physics at UC Santa Cruz; its co-first authors are Keith Mathieson, PhD, a visiting scholar in Palanker’s lab, and James Loudin, PhD, a postdoctoral scholar. Palanker and Loudin jointly conceived and designed the prosthesis system and the photovoltaic arrays.


This pinpoint-sized photovoltaic chip (upper right corner) is implanted under the retina in a blind rat to restore sight. The center image shows how the chip comprises an array of photodiodes, which can be activated by pulsed near-infrared light to stimulate neural signals in the eye that then propagate to the brain. A higher-magnification view (lower left corner) shows a single pixel of the implant, which has three diodes around the perimeter and an electrode in the center. The diodes turn light into an electric current, which flows from the chip into the inner layer of retinal cells. Adapted from Stanford image courtesy of the Daniel Palanker lab.


May 15, 2012
#science #neuroscience #brain #vision #psychology
Sleepwalking more prevalent among US adults than previously suspected

May 14, 2012

What goes bump in the night? In many U.S. households: people. That’s according to new Stanford University School of Medicine research, which found that about 3.6 percent of U.S. adults are prone to sleepwalking. The work also showed an association between nocturnal wanderings and certain psychiatric disorders, such as depression and anxiety.

The study, the researchers noted, “underscores the fact that sleepwalking is much more prevalent in adults than previously appreciated.”

Maurice Ohayon, MD, DSc, PhD, professor of psychiatry and behavioral sciences, is the lead author of the paper, which will appear in the May 15 issue of Neurology, the medical journal of the American Academy of Neurology.

Sleepwalking is a disorder “of arousal from non-REM sleep.” While wandering around at night can be harmless and is often played for laughs — anyone remember the Simpsons episode where Homer began wandering around and doing silly things in his sleep? — sleepwalking can have serious consequences. Episodes can result in injuries to the wanderer or others and lead to impaired psychosocial functioning.

It is thought that medication use and certain psychological and psychiatric conditions can trigger sleepwalking, but the exact causes are unknown. Also unclear to experts in the field is the prevalence.

"Apart from a study we did 10 years ago in the European general population, where we reported a prevalence of 2 percent of sleepwalking," the researchers wrote in their paper, "there are nearly no data regarding the prevalence of nocturnal wanderings in the adult general population. In the United States, the only prevalence rate was published 30 years ago."

For this study, the first to use a large, representative sample of the U.S. general population to demonstrate the number of sleepwalkers, the researchers also aimed to evaluate the importance of medication use and mental disorders associated with sleepwalking. Ohayon and his colleagues secured a sample of 19,136 individuals from 15 states and then used phone surveys to gather information on participants’ mental health, medical history and medication use.

Participants were asked specific questions related to sleepwalking, including frequency of episodes during sleep, duration of the sleep disorder and any inappropriate or potentially dangerous behaviors during sleep. Those who didn’t report any episodes in the last year were asked if they had sleepwalked during their childhood. Participants were also queried about whether there was a family history of sleepwalking and whether they had other parasomnia symptoms, such as sleep terrors and violent behaviors during sleep.

The researchers determined that as many as 3.6 percent of the sample reported at least one episode of sleepwalking in the previous year, with 1 percent saying they had two or more episodes in a month. Because of the number of respondents who reported having episodes during childhood or adolescence, lifetime prevalence of sleepwalking was found to be 29.2 percent.
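As a back-of-envelope check on those percentages, the implied headcounts can be computed directly. This is a rough sketch that assumes the reported rates apply uniformly to the full phone-survey sample of 19,136; the study itself may have used weighting that changes these figures:

```python
sample_size = 19136  # participants surveyed across 15 states

# Approximate headcounts implied by the reported prevalence rates
past_year_episode = round(sample_size * 0.036)  # at least one episode in the past year
monthly_episodes  = round(sample_size * 0.010)  # two or more episodes per month
lifetime          = round(sample_size * 0.292)  # lifetime prevalence

print(past_year_episode, monthly_episodes, lifetime)  # 689 191 5588
```

In other words, roughly 689 respondents reported a past-year episode, about 191 reported monthly episodes, and well over five thousand had sleepwalked at some point in their lives.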

The study also showed that people with depression were 3.5 times more likely to sleepwalk than those without, and people with alcohol abuse/dependence or obsessive-compulsive disorder were also significantly more likely to have sleepwalking episodes. In addition, individuals taking SSRI antidepressants were three times more likely to sleepwalk twice a month or more than those who didn’t.

"There is no doubt an association between nocturnal wanderings and certain conditions, but we don’t know the direction of the causality," said Ohayon. "Are the medical conditions provoking sleepwalking, or is it vice versa? Or perhaps it’s the treatment that is responsible."

Although more research is needed, the work could help raise awareness of this association among primary care physicians. “We’re not expecting them to diagnose sleepwalking, but they might detect symptoms that could be indices of sleepwalking,” said Ohayon.

Among the researchers’ other findings:

  • Sleepwalking was mostly chronic in duration, with just over 80 percent of those who have sleepwalked reporting they’ve done so for more than five years.
  • Sleepwalking was not associated with gender and seemed to decrease with age.
  • Nearly one-third of individuals with nocturnal wandering had a family history of the disorder.
  • People using over-the-counter sleeping pills had a higher likelihood of reporting sleepwalking episodes at least two times per month. (Indeed, a sleeping pill was the trigger for Homer Simpson’s middle-of-the-night shenanigans.)

D. Léger, MD, PhD, from the Université Paris Descartes in France, was senior author of the study. Researchers from the University of Minnesota Medical School, the Hôpital Gui-de-Chauliac in Montpellier, France, and Duke University School of Medicine were also involved.

Provided by Stanford University Medical Center

Source: medicalxpress.com

May 15, 2012
#science #neuroscience #brain #psychology #depression #anxiety
Brain circuitry is different for women with anorexia and obesity

May 14, 2012

Why does one person become anorexic and another obese? A study recently published by a University of Colorado School of Medicine researcher shows that reward circuits in the brain are sensitized in anorexic women and desensitized in obese women. The findings also suggest that eating behavior is related to brain dopamine pathways involved in addictions.

Guido Frank, MD, assistant professor and director of the Developmental Brain Research Program at the CU School of Medicine, and his colleagues used functional magnetic resonance imaging (fMRI) to examine brain activity in 63 women who were either anorexic or obese. Scientists compared them to women of “normal” weight. The participants were visually conditioned to associate certain shapes with either a sweet or a non-sweet solution and then received the taste solutions expectedly or unexpectedly. This task has been associated with brain dopamine function in the past.

The authors found that during these fMRI sessions, an unexpected sweet-tasting solution resulted in increased neural activation of reward systems in the anorexic patients and diminished activation in obese individuals. In rodents, food restriction and weight loss have been associated with greater dopamine-related reward responses in the brain.

"It is clear that in humans the brain’s reward system helps to regulate food intake" said Frank. "The specific role of these networks in eating disorders such as anorexia nervosa and, conversely, obesity, remains unclear.”

Scientists agree that more research is needed in this area. The study was published in Neuropsychopharmacology.

Provided by University of Colorado Denver

Source: medicalxpress.com

May 15, 2012
#science #neuroscience #brain #psychology #anorexia #obesity
How to minimize stroke damage

May 14, 2012

Following a stroke, factors as varied as blood sugar, body temperature and position in bed can affect patient outcomes, Loyola University Medical Center researchers report.

In a review article in the journal MedLink Neurology, first author Murray Flaster, MD, PhD, and colleagues summarize the latest research on caring for ischemic stroke patients. (Most strokes are ischemic, meaning they are caused by blood clots.)

"The period immediately following an acute ischemic stroke is a time of significant risk,” the Loyola neurologists write. “Meticulous attention to the care of the stroke patient during this time can prevent further neurologic injury and minimize common complications, optimizing the chance of functional recovery.”

Stroke care has two main objectives – minimizing injury to brain tissue and preventing and treating the many neurologic and medical complications that can occur just after a stroke.

The authors discuss the many complex factors that affect outcomes. For example, there is considerable evidence of a link between hyperglycemia (high blood sugar) and poor outcomes after stroke. The authors recommend strict blood sugar control, using frequent finger-stick glucose checks and aggressive insulin treatment.

For each 1 degree C increase in the body temperature of stroke patients, the risk of death or severe disability more than doubles. Therapeutic cooling has been shown to help cardiac arrest patients, and clinical trials are underway to determine whether such cooling could also help stroke patients. Until those trials are completed, the goal should be to keep normal temperatures (between 95.9 and 99.5 degrees F).
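Taken at face value, the per-degree doubling describes an exponential dose-response in temperature. The sketch below is illustrative only: the exact doubling factor and the 37 °C baseline are assumptions for the example, not clinical guidance:

```python
def relative_risk(temp_c, baseline_c=37.0, factor_per_degree=2.0):
    """Relative risk of death or severe disability vs. a baseline body
    temperature, assuming risk multiplies by `factor_per_degree` for each
    1 degree C rise (an illustrative simplification of the cited finding)."""
    return factor_per_degree ** (temp_c - baseline_c)

print(relative_risk(38.0))  # 2.0: one degree of fever, at least double the risk
print(relative_risk(39.0))  # 4.0: two degrees compound multiplicatively
```

The compounding is the practical point: because each degree multiplies rather than adds risk, even modest fever control in stroke patients matters.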

Position in bed also is important, because sitting upright decreases blood flow in the brain. A common practice is to keep the patient lying flat for 24 hours. If a patient has orthopnea (difficulty breathing while lying flat), the head of the bed should be kept at the lowest elevation the patient can tolerate.

The authors discuss many other issues in stroke care, including blood pressure management; blood volume; statin therapy; management of complications such as pneumonia and sepsis; heart attack and other cardiac problems; blood clots; infection; malnutrition and aspiration; brain swelling; seizures; recurrent stroke; and brain hemorrhages.

Studies have shown that hospital units that specialize in stroke care decrease mortality, increase the likelihood of being discharged to home and improve functional status and quality of life.

All patients should receive supportive care — including those who suffer major strokes and the elderly. “Even in these populations, the majority of patients will survive their stroke,” the authors write. “The degree of functional recovery, however, may be dramatically impacted by the intensity and appropriateness of supportive care.”

Provided by Loyola University Health System

Source: medicalxpress.com

May 15, 2012
#science #neuroscience #brain #stroke #psychology
Brain oscillations reveal that our senses do not experience the world continuously

May 14, 2012

(Medical Xpress) — It has long been suspected that humans do not experience the world continuously, but rather in rapid snapshots.

Now, researchers at the University of Glasgow have demonstrated this is indeed the case. Just as the body goes through a 24-hour sleep-wake cycle controlled by a circadian clock, brain function undergoes such cyclic activity – albeit at a much faster rate.

Professor Gregor Thut of the Institute of Neuroscience and Psychology said: “Rhythms are intrinsic to biological systems. The circadian rhythm, with its very slow periodicity of sleep and wake cycles every 24 hours, has an obvious, periodic effect on bodily functions.

“Brain oscillations – the recurrent neural activity that we see in the brain – also show periodicity but cycle at much faster speeds. What we wanted to know was whether brain function was affected in a cyclic manner by these rapid oscillations.”

The researchers studied a prominent brain rhythm associated with visual cortex functioning that cycles at a rate of 10 times per second (10 Hz).

They used a ‘simple trick’ to affect the oscillations of this rhythm, which involved presenting a brief sound to ‘reset’ the oscillation.

Testing subsequent visual perception using transcranial magnetic stimulation of the visual cortex revealed a cyclic pattern at the very rapid rate of brain oscillations, in time with the underlying brainwaves.

Prof Thut said: “Rhythmicity therefore is indeed omnipresent not only in brain activity but also brain function. For perception, this means that despite experiencing the world as a continuum, we do not sample our world continuously but in discrete snapshots determined by the cycles of brain rhythms.”
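The arithmetic behind the "snapshot" framing is simple: a 10 Hz rhythm implies one perceptual sampling window roughly every 100 ms. A quick illustration of the timescales involved (my comparison with the circadian cycle, not a claim from the paper):

```python
freq_hz = 10.0                   # the visual-cortex rhythm studied here
period_ms = 1000.0 / freq_hz     # 100.0 ms per cycle: one "snapshot" window
snapshots_per_second = freq_hz   # 10 discrete samples of the world each second

# Contrast with the circadian rhythm mentioned in the article
circadian_period_ms = 24 * 60 * 60 * 1000.0
ratio = circadian_period_ms / period_ms  # the 10 Hz cycle is 864,000x faster

print(period_ms, ratio)
```

So "much faster" here spans nearly six orders of magnitude: the same periodic principle, from day-long cycles down to tenth-of-a-second ones.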

The research, ‘Sounds reset rhythms of visual cortex and corresponding human visual perception’ is published in the journal Current Biology.

Provided by University of Glasgow

Source: medicalxpress.com

May 15, 2012
#science #neuroscience #brain #psychology
Let there be light: It's good for our brains

May 14, 2012 By Sandy Evangelista

(Medical Xpress) — Swiss scientists have proven that light intensity influences our cognitive performance and how alert we feel, and that these positive effects last until early evening.


Credit: 2012 EPFL

Tests conducted in EPFL’s Solar Energy and Building Physics Laboratory (LESO) have confirmed the hypothesis that light influences our subjective feeling of sleepiness. The research team, led by Mirjam Münch, also showed that the effects of light exposure last until the early evening, and that light intensity has an impact on cognitive mechanisms. The results of this research were recently published in the journal Behavioral Neuroscience.

Light synchronizes our biological clocks. It is collected in the eye by photoreceptors that use a photopigment (a pigment that changes when exposed to light) known as melanopsin. These cells, which differ from rods and cones, are considered a third class of photoreceptor in the retina and were discovered just ten years ago. They are not there to form an image, but to perceive and absorb photons in the visible light spectrum, and they are especially sensitive to blue light.

Exploring office lighting

Münch and her team wanted to know how our circadian rhythm could be influenced by our perception of light during the daytime. They created realistic office lighting conditions and recruited 29 young participants. “For this study, we took into account the intensity of natural and artificial light without specifically evaluating their spectra.”

From daytime to dusk

To synchronize their internal biological clocks, the volunteers had to maintain a regular sleep schedule during the seven days leading up to the test. They wore bracelets equipped with light sensors and accelerometers, so that the scientists could monitor their movements.

The study itself took place over two eight-hour sessions. The participants spent the first six hours in an experiment room, first in well-lit conditions (1000-2000 lux, more or less equivalent to natural light in a room). In the second session, the light intensity was about 170 lux, which is what the eye perceives in a room without a window, lit with artificial light. For this experiment, light intensity was measured at eye level. Every 30 minutes, the subjects were asked to assess how alert or sleepy they felt.

Finally, at the end of each session, the participants underwent two hours of supplemental memory tests in a darkened room – less than 6 lux. During these last two hours, the researchers took saliva samples in order to measure cortisol and melatonin concentrations. These two hormones are produced in a 24-hour cycle by the human body.

Boosted by the light

The volunteers who were subjected to higher light intensity during the afternoon were more alert all the way into the early evening. When they were subjected to light intensity ten times weaker, however, they showed signs of sleepiness and obtained lower scores on the memory tests.

These results were observed even in the absence of changes in cortisol and melatonin concentrations in their saliva. “With this study, we have discovered that light intensity has a direct effect on the subjective feeling of sleepiness as well as on objective cognitive performance, and that the benefits of more intense light during the daytime last long past the time of exposure,” concludes Münch.

Provided by Ecole Polytechnique Federale de Lausanne

Source: medicalxpress.com

May 14, 2012
#science #neuroscience #brain #psychology
Powerful Function of Single Protein That Controls Neurotransmission Discovered

ScienceDaily (May 13, 2012) — Scientists at Weill Cornell Medical College have discovered that a single protein — alpha 2 delta — exerts a spigot-like function, controlling the volume of neurotransmitters and other chemicals that flow across the synapses between brain neurons. The study, published online in Nature, shows how brain cells talk to each other through these signals, relaying thoughts, feelings and actions, and how this powerful molecule plays a crucial role in regulating effective communication.

In the study, the investigators also suggest how the widely used pain drug Lyrica might work. The alpha 2 delta protein is the target of this drug, and the new work suggests how other drugs could be developed to twist particular neurotransmitter spigots on and off to treat neurological disorders. The research findings surprised the research team, which includes scientists from University College London.

"We are amazed that any single protein has such power," says the study’s lead investigator Dr. Timothy A. Ryan, professor of Biochemistry and associate professor of Biochemistry in Anesthesiology at Weill Cornell Medical College. "It is indeed rare to identify a biological molecule’s function that is so potent, that seems to be controlling the effectiveness of neurotransmission."

The researchers found that alpha-2-delta determines how many calcium channels will be present at the synaptic junction between neurons. The transmission of chemical signals is triggered at the synapse by the entry of calcium through these channels, so the volume and speed of neurotransmission depend on the availability of these channels.

Researchers discovered that taking away alpha 2 delta from brain cells prevented calcium channels from getting to the synapse. “But if you add more alpha 2 delta, you can triple the number of channels at synapses,” Dr. Ryan says. “This change in abundance was tightly linked to how well synapses carry out their function, which is to release neurotransmitters.”

Before this study, it was known that Lyrica, which is used for neuropathic pain, seizures and fibromyalgia, binds to alpha 2 delta, but little was understood about how this protein works to control synapses.


May 14, 2012
#science #neuroscience #brain #psychology
Vitamin K2: New Hope for Parkinson's Patients?

ScienceDaily (May 11, 2012) — Neuroscientist Patrik Verstreken, associated with VIB and KU Leuven, succeeded in undoing the effect of one of the genetic defects that leads to Parkinson’s using vitamin K2. His discovery gives hope to Parkinson’s patients.


Male fruit fly (Drosophila melanogaster). Scientists have succeeded in undoing the effect of one of the genetic defects that lead to Parkinson’s using vitamin K2. The research was done in fruit flies. (Credit: © Studiotouch / Fotolia)

This research was done in collaboration with colleagues from Northern Illinois University (US) and was recently published in the journal Science.

"It appears from our research that administering vitamin K2 could possibly help patients with Parkinson’s. However, more work needs to be done to understand this better," says Patrik Verstreken.

Malfunctioning power plants are at the root of Parkinson’s.

If we look at cells as small factories, then mitochondria are the power plants responsible for supplying the energy for their operation. They generate this energy by transporting electrons. In Parkinson’s patients, the activity of mitochondria and the transport of electrons are disrupted, so the mitochondria no longer produce sufficient energy for the cell. This has major consequences, as the cells in certain parts of the brain start dying off, disrupting communication between neurons. The results are the typical symptoms of Parkinson’s: lack of movement (akinesia), tremors and muscle stiffness.

The exact cause of this neurodegenerative disease is not known. In recent years, however, scientists have been able to describe several genetic defects (mutations) found in Parkinson’s patients, including the so-called PINK1 and Parkin mutations, which both lead to reduced mitochondrial activity. By studying these mutations, scientists hope to unravel the mechanisms underlying the disease process.

Paralyzed fruit flies

Fruit flies (Drosophila) are frequently used in lab experiments because of their short life spans and breeding cycles, among other things. Within two weeks of her emergence, every female is able to produce hundreds of offspring. By genetically modifying fruit flies, scientists can study the function of certain genes and proteins. Patrik Verstreken and his team used fruit flies with a genetic defect in PINK1 or Parkin that is similar to the one associated with Parkinson’s. They found that the flies with a PINK1 or Parkin mutation lost their ability to fly.

Upon closer examination, they discovered that the mitochondria in these flies were defective, just as in Parkinson’s patients. Because of this they generated less intracellular energy — energy the insects needed to fly. When the flies were given vitamin K2, the energy production in their mitochondria was restored and the insects’ ability to fly improved. The researchers were also able to determine that the energy production was restored because the vitamin K2 had improved electron transport in the mitochondria. This in turn led to improved energy production.

Conclusion

Vitamin K2 plays a role in the energy production of defective mitochondria. Because defective mitochondria are also found in Parkinson’s patients with a PINK1 or Parkin mutation, vitamin K2 potentially offers hope for a new treatment for Parkinson’s.

Source: Science Daily

May 14, 2012
#science #neuroscience #brain #psychology #parkinson
Gene therapy for hearing loss: Potential and limitations

May 11, 2012

Regenerating sensory hair cells, which produce electrical signals in response to vibrations within the inner ear, could form the basis for treating age- or trauma-related hearing loss. One way to do this could be with gene therapy that drives new sensory hair cells to grow.

Researchers at Emory University School of Medicine have shown that introducing a gene called Atoh1 into the cochleae of young mice can induce the formation of extra sensory hair cells.

Their results show the potential of a gene therapy approach, but also demonstrate its current limitations. The extra hair cells produce electrical signals like normal hair cells and connect with neurons. However, after the mice are two weeks old, which is before puberty, inducing Atoh1 has little effect. This suggests that an analogous treatment in adult humans would also not be effective by itself.

The findings were published May 9 in the Journal of Neuroscience.

"We’ve shown that hair cell regeneration is possible in principle," says Ping Chen, PhD, associate professor of cell biology at Emory University School of Medicine. “In this paper, we have identified which cells are capable of becoming hair cells under the influence of Atoh1, and we show that there are strong age-dependent limitations on the effects of Atoh1 by itself.”

The first author of the paper, Michael Kelly, now a postdoctoral fellow at the National Institute on Deafness and Other Communication Disorders, was a graduate student in Emory’s Neuroscience program.

Kelly and his coworkers engineered mice to turn on the Atoh1 gene in the inner ear in response to the antibiotic doxycycline. Previous researchers had used a virus to introduce Atoh1 into the cochleae of animals. That approach resembles gene therapy but has the disadvantage of being slightly different each time, Chen says. In contrast, the engineered mice turn on the Atoh1 gene in specific cells along the lining of the inner ear, called the cochlear epithelium, but only when fed doxycycline.

Young mice given doxycycline for two days had extra sensory hair cells, both in the parts of the cochlea where developing hair cells usually appear and in additional locations (see accompanying image).

The extra hair cells could generate electrical signals, although those signals weren’t as strong as those of mature hair cells. The extra hair cells also appeared to attract neuronal fibers, suggesting that their signals could connect to the rest of the nervous system.

"They can generate electrical signals, but we don’t know if they can really function in the context of hearing," Chen says. "For that to happen, the hair cells’ signals need to be coordinated and integrated."

Although doxycycline could turn on Atoh1 all over the surface of the cochlea, extra sensory hair cells did not appear everywhere. When Chen’s team removed cochleae from the mice and grew them in culture dishes, they were able to provoke even more hair cells to grow by adding a drug that inhibits the Notch pathway.

Manipulating the Notch pathway affects several aspects of embryonic development and in some contexts appears to cause cancer, so the approach needs to be refined further. Chen says that it may be possible to unlock the age-related limits on hair cell regeneration by supplying additional genes or drugs in combination with Atoh1, and the results with the Notch drug provide an example.

"Our future goals are to develop approaches to stimulate hair cell formation in older animals, and to examine functional recovery after Atoh1 induction," she says.

Provided by Emory University

Source: medicalxpress.com

May 14, 2012
#science #neuroscience #brain #psychology
Study raises questions about use of anti-epilepsy drugs in newborns

May 11, 2012

A brain study in infant rats demonstrates that the anti-epilepsy drug phenobarbital stunts neuronal growth, which could prompt new questions about using the first-line drug to treat epilepsy in human newborns.

In Annals of Neurology EarlyView, posted online May 11, researchers at Georgetown University Medical Center (GUMC) report that phenobarbital given to rat pups about a week old changed the way the animals’ brains were wired, causing cognitive abnormalities later in life.

The researchers say it has been known that some of the drugs used to treat epilepsy increase the number of neurons that die shortly after birth in the rat brain, but, until this study, no one had shown whether this action had any adverse impact on subsequent brain development.

"Our study is the first to show that the exposure to these drugs — and just a single exposure — can prevent brain circuits from developing their normal connectivity, meaning they may not be wired correctly, which can have long-lasting effects on brain function,” says the study’s senior investigator, Karen Gale, Ph.D., a professor of pharmacology at GUMC. “These findings suggest that in the growing brain, these drugs are not as benign as one would like to believe.”

For their study, the Georgetown researchers examined four agents, including phenobarbital.

"The good news is not all anti-epilepsy drugs have this disruptive effect in the animal studies," Gale says.

The researchers found that the anti-epilepsy drug levetiracetam did not stunt synaptic growth. Animals treated with a third drug, lamotrigine, showed neural maturation, but it was delayed. An additional finding involved melatonin. When added to phenobarbital, it appeared to prevent the persistent adverse neural effects in the rat pups. Melatonin has been used clinically to protect cells from injury in humans.

"Many clinicians have been advocating for a reexamination of the use of these drugs in infants, and our findings provide experimental data to support that need," says the study’s co-lead investigator, Patrick A. Forcelli, Ph.D., a postdoctoral fellow in the department of pharmacology and physiology at GUMC. "Phenobarbital has been used to treat seizures for over 100 years — well before a Food and Drug Administration approval process was established — and for more than 50 years, it has been the first drug of choice in the treatment of seizures in neonates."


May 14, 2012
#science #neuroscience #brain #epilepsy #psychology
Confirmation of repeated patterns of neurons indicates stereotypical organization throughout brain's cerebral cortex

May 11, 2012

Neurons are arranged in periodic patterns that repeat over large distances in two areas of the cerebral cortex, suggesting that the entire cerebral cortex has a stereotyped organization, reports a team of researchers led by Toshihiko Hosoya of the RIKEN Brain Science Institute. The entire cortex has a stereotypical layered structure with the same cell types arranged in the same way, but how neurons are organized in the other orientation—parallel to the brain’s surface—is poorly understood.


Figure 1: In the mouse visual cortex, neurons expressing id2 mRNA (magenta) are found in regularly repeating clusters. Reproduced from Ref. 1 © 2011 Hisato Maruoka et al., RIKEN Brain Science Institute

Hosoya and his colleagues therefore examined layer V of the mouse cortex, which contains two classes of large pyramidal neurons that look identical but differ in the connections they form. One projects axons straight down to regions beneath the cortex; the other projects to the cortex on the opposite side of the brain.

First, the researchers examined expression of the id2 gene in cells of the visual cortex, because these cells form clusters in that part of the brain. They found that id2 is expressed in nearly all cells that project axons downward, but not in those that cross over. Hosoya and colleagues verified this by visualizing the connections of cells using fluorescent cholera toxin, which binds to cell membranes and travels along the axons.

Further examination of gene expression patterns in tissue slices revealed that the cells are arranged in clusters aligned perpendicular to the brain’s surface, and that the clusters are organized in a regular pattern, with the same basic unit repeating every thirty micrometers (Fig. 1). They also observed the same pattern in layer V of the somatosensory cortex, suggesting that this organization is common to other cortical areas.

By generating a strain of mutant mice expressing green fluorescent protein in the progenitor cells that produce the cells in layer V during brain development, Hosoya and his colleagues investigated the embryonic origin of these cells. This revealed that each cluster contains neurons that are produced by different progenitor cells.

Finally, the researchers showed that the regular pattern persists in the adult visual cortex, and that neurons in each cluster show the same activity patterns in response to visual stimulation. “Our preliminary data suggest that at least several other areas in the cortex have the same structure,” says Hosoya. “It’s likely that the entire cortex has the same organization, and I expect that the human cortex has the same structure.”

Provided by RIKEN

Source: medicalxpress.com

May 14, 2012
#science #neuroscience #brain #neuron #psychology
Astrocytes found to bridge gap between global brain activity and localized circuits

May 11, 2012

Global network activity in the brain modulates local neural circuitry via calcium signaling in non-neuronal cells called astrocytes (Fig. 1), according to research led by Hajime Hirase of the RIKEN Brain Science Institute. The finding clarifies the link between two important processes in the brain.


Figure 1: Astrocytes are star-shaped cells with numerous fine projections that ensheath synapses in the brain. © 2012 Hajime Hirase

Activity in large-scale brain networks is thought to modulate changes in neuronal connectivity, so-called ‘synaptic plasticity’, in the cerebral cortex. The neurotransmitter acetylcholine regulates global brain activity associated with attention and awareness, and is involved in plasticity.

To investigate how these processes are linked, Hirase and his colleagues simultaneously stimulated the whiskers of mice and the nucleus basalis of Meynert (NBM), a basal forebrain structure containing neurons that synthesize acetylcholine and project widely to the cortex. Using electrodes and an imaging technique called two-photon microscopy, performed through a ‘cranial window’, they monitored the responses of cells in the barrel cortex, which receives inputs from the whiskers.

Recordings from the electrodes showed that repeated co-stimulation of the whiskers and NBM induced plasticity in the barrel cortex. This plasticity depended on two types of receptors—muscarinic acetylcholine receptors (mAChRs) and N-methyl-D-aspartic acid receptors (NMDARs). Two-photon imaging further revealed that activation of the mAChRs during co-stimulation elevated the concentration of calcium ions within astrocytes of the barrel cortex.

The researchers repeated these experiments in mutant mice lacking the receptor that controls the release of calcium ions in astrocytes. Since co-stimulation of whiskers and NBM did not induce plasticity in the mutants, Hirase and colleagues concluded that calcium signaling in astrocytes acts as a ‘gate’ linking the changes in global brain state induced by acetylcholine to activity in local cortical circuits.

Furthermore, the researchers found that stimulation of the NBM led to an increase in the extracellular concentration of the amino acid D-serine in the normal, but not the mutant, mice. D-serine is secreted by astrocytes and activates NMDARs. Hirase’s team had previously shown that astrocytes are electrically silent in living rodents even in the presence of neural activity. The new findings showed that the biochemical, as opposed to electrical, activation of astrocytes induces them to release the transmitter that modulates synaptic plasticity in the neuronal circuitry.

“Our study is probably the first to show that calcium signaling in astrocytes is related to neuronal circuit plasticity in living animals,” says Hirase. “We are now studying if this type of calcium signaling occurs in all parts of an astrocyte or is restricted to some parts of the cell.”

Provided by RIKEN

Source: medicalxpress.com

May 14, 2012
#science #neuroscience #brain #psychology
Mild traumatic brain injury may alter brain's neuronal circuit excitability and contribute to brain network dysfunction

May 11, 2012

Even mild head injuries can cause significant abnormalities in brain function that last for several days, which may explain the neurological symptoms experienced by some individuals who have sustained a head injury associated with sports, accidents or combat, according to a study by Virginia Commonwealth University School of Medicine researchers.

These findings, published in the May issue of the Journal of Neuroscience, advance research in the field of traumatic brain injury (TBI), enabling researchers to better understand what structural or functional brain changes underlie posttraumatic disorders – a question that until now has remained unanswered.

Previous research has shown that even a mild case of TBI can result in long-lasting neurological issues that include slowing of cognitive processes, confusion, chronic headache, posttraumatic stress disorder and depression.

The VCU team, led by Kimberle M. Jacobs, Ph.D., associate professor in the Department of Anatomy and Neurobiology, demonstrated for the first time, using sophisticated bioimaging and electrophysiological approaches, that mild injury can cause structural disruption of axons in the brain while also changing the way the neurons fire in areas where they have not been structurally altered. Axons are nerve fibers in the brain responsible for conducting electrical impulses. The team used models of mild traumatic brain injury and followed morphologically identified neurons in live cortical slices.

“These findings should help move the field forward by providing a unique bioimaging and electrophysiological approach to assess the evolving changes evoked by mild TBI and their potential therapeutic modulation,” said co-investigator, John T. Povlishock, Ph.D., professor and chair of the VCU School of Medicine’s Department of Anatomy and Neurobiology and director of the Commonwealth Center for the Study of Brain Injury.

According to Povlishock, additional benefit may also derive from the use of this model system with repetitive injuries to determine if repeated insults exacerbate the observed abnormalities.

Provided by Virginia Commonwealth University

Source: medicalxpress.com

May 14, 2012
#science #neuroscience #brain #psychology
Maternal Antibodies to Gluten Linked to Schizophrenia Risk in Children

May 11th, 2012

Babies born to women with sensitivity to gluten appear to be at increased risk for certain psychiatric disorders later in life, according to research by scientists at Karolinska Institutet in Sweden and Johns Hopkins Children’s Center in Baltimore.

The team’s findings, published in The American Journal of Psychiatry, add to a growing body of evidence that many “adult” diseases may take root before and shortly after birth.

“Lifestyle and genes are not the only factors that shape disease risk, and factors and exposures before, during and after birth can help pre-program much of our adult health,” said investigator Robert Yolken, M.D., a neuro-virologist at Johns Hopkins Children’s Center. “Our study is an illustrative example suggesting that a dietary sensitivity before birth could be a catalyst in the development of schizophrenia or a similar condition 25 years later.”

Maternal infections and other inflammatory disorders during pregnancy have long been linked to greater risk for schizophrenia in the offspring but, the Swedish and U.S. investigators say, this is the first study that points to maternal food sensitivity as a possible culprit in the development of such disorders. The findings establish a strong link but do not mean that gluten sensitivity will invariably cause schizophrenia, the investigators caution. The research, however, does suggest an intriguing new mechanism that may drive up risk and illuminate possible prevention strategies.

“Our research not only underscores the importance of maternal nutrition during pregnancy and its lifelong effects on the offspring, but also suggests one potential cheap and easy way to reduce risk if we were to find further proof that gluten sensitivity exacerbates or drives up schizophrenia risk,” said study lead investigator Håkan Karlsson, M.D., Ph.D., a neuroscientist at Karolinska Institutet and former neuro-virology fellow at Johns Hopkins.

The team’s findings are based on an examination of 764 birth records and neonatal blood samples of Swedes born between 1975 and 1985. Some 211 of them subsequently developed non-affective psychoses, such as schizophrenia and delusional disorders.

Using stored neonatal blood samples, the investigators measured levels of IgG antibodies to milk and wheat. IgG antibodies are markers of immune system reaction triggered by the presence of certain proteins. Because a mother’s antibodies cross the placenta during pregnancy to confer immunity to the baby, a newborn’s elevated IgG levels are proof of protein sensitivity in the mother.

Children born to mothers with abnormally high levels of antibodies to the wheat protein gluten had nearly twice the risk of developing schizophrenia later in life, compared with children who had normal levels of gluten antibodies. The link persisted even after researchers accounted for other factors known to increase schizophrenia risk, including maternal age, gestational age, mode of delivery and the mother’s immigration status. The risk for psychiatric disorders was not increased among those with elevated levels of antibodies to milk protein.

The researchers say the suspicion that food sensitivity in the mother can affect her child’s risk for psychiatric disorders stems from an observation made in the wake of World War II by U.S. Army researcher F. Curtis Dohan, M.D. Dohan noted that food scarcity in post-war Europe and wheat-poor diets led to notably fewer hospital admissions for schizophrenia. The link was merely observational, but it has piqued the curiosity of scientists ever since.

Researchers in the past also have observed that people diagnosed with schizophrenia have disproportionately high rates of celiac disease, a rare autoimmune disorder characterized by gluten sensitivity. Although it is a hallmark of the condition, gluten sensitivity alone is not enough to diagnose celiac disease. Other studies have found that some people with schizophrenia have gluten sensitivity without other signs of celiac disease, the researchers note.

Yolken and Karlsson say the team already is conducting follow-up studies to clarify how gluten or sensitivity to it increases schizophrenia risk and whether it does so only in those genetically predisposed.

Source: Neuroscience News

May 14, 2012
#science #neuroscience #brain #psychology #schizophrenia
Neurodegeneration 'Switched Off' in Mice

ScienceDaily (May 10, 2012) — Researchers at the Medical Research Council (MRC) Toxicology Unit at the University of Leicester have identified a major pathway leading to brain cell death in mice with neurodegenerative disease. The team was able to block the pathway, preventing brain cell death and increasing survival in the mice.


Scientists have identified a major pathway leading to brain cell death in mice with neurodegenerative disease. The team was able to block the pathway, preventing brain cell death and increasing survival in the mice. (Credit: © pressmaster / Fotolia)

In human neurodegenerative diseases, including Alzheimer’s, Parkinson’s and prion diseases, proteins “mis-fold” in a variety of different ways, resulting in the build-up of mis-shapen proteins. These form the plaques found in Alzheimer’s and the Lewy bodies found in Parkinson’s disease.

The researchers studied mice with neurodegeneration caused by prion disease. These mouse models currently provide the best animal representation of human neurodegenerative disorders, where it is known that the build-up of mis-shapen proteins is linked with brain cell death.

They found that the build up of mis-folded proteins in the brains of these mice activates a natural defense mechanism in cells, which switches off the production of new proteins. This would normally switch back ‘on’ again, but in these mice the continued build-up of mis-shapen protein keeps the switch turned ‘off’. This is the trigger point leading to brain cell death, as those key proteins essential for nerve cell survival are not made.

By injecting a protein that blocks the ‘off’ switch of the pathway, the scientists were able to restore protein production, independently of the build-up of mis-shapen proteins, and halt the neurodegeneration. The brain cells were protected, protein levels and synaptic transmission (the way in which brain cells signal to each other) were restored, and the mice lived longer, even though only a very small part of their brain had been treated.

Mis-shapen proteins in human neurodegenerative diseases, such as Alzheimer’s and Parkinson’s, also over-activate this fundamental pathway controlling protein synthesis in patients’ brains, making it a common target underlying these different clinical conditions. The results suggest that treatments focused on this pathway could be protective in a range of neurodegenerative diseases in which mis-shapen proteins build up and cause neurons to die.

Professor Giovanna Mallucci, who led the team, said, “What’s exciting is the emergence of a common mechanism of brain cell death, across a range of different neurodegenerative disorders, activated by the different mis-folded proteins in each disease. The fact that, in mice with prion disease, we were able to manipulate this mechanism and protect the brain cells means we may have a way forward in how we treat other disorders. Instead of targeting individual mis-folded proteins in different neurodegenerative diseases, we may be able to target the shared pathways and rescue brain cell degeneration irrespective of the underlying disease.”

Professor Hugh Perry, chair of the MRC’s Neuroscience and Mental Health Board, said, “Neurodegenerative diseases such as Alzheimer’s and Parkinson’s are debilitating and largely untreatable conditions. Alzheimer’s disease and related disorders affect over seven million people in Europe, and this figure is expected to double every 20 years as the population ages across Europe. The MRC believes that research such as this, which looks at the fundamental mechanisms of these devastating diseases, is absolutely vital. Understanding the mechanism that leads to neuronal dysfunction prior to neuronal loss is a critical step in finding ways to arrest disease progression.”

Source: Science Daily

May 14, 2012
#science #neuroscience #brain #psychology #alzheimer #parkinson
Glial Cells Supply Nerve Fibers with Energy-Rich Metabolic Products

May 10th, 2012

Glial cells pass on metabolites to neurons.

Around 100 billion neurons in the human brain enable us to think, feel and act. They transmit electrical impulses to remote parts of the brain and body via long nerve fibres known as axons. This communication requires enormous amounts of energy, which the neurons are thought to generate from sugar. Axons are closely associated with glial cells which, on the one hand, surround them with an electrically insulating myelin sheath and, on the other hand, support their long-term function. Klaus-Armin Nave and his research group from the Max Planck Institute of Experimental Medicine in Göttingen have now discovered a possible mechanism by which these glial cells in the brain can support their associated axons and keep them alive in the long term.

Oligodendrocytes are a group of highly specialised glial cells in the central nervous system. They are responsible for the formation of the fat-rich myelin sheath that surrounds the nerve fibres as an insulating layer. The comparison with the coating on electricity cables is an obvious one; however, myelin does much more than insulate: it increases the transmission speed of the axons and also reduces their ongoing energy consumption. The extreme importance of myelin for a functioning nervous system is shown by the diseases that arise from a defective insulating layer, such as multiple sclerosis.

Interestingly, the function of the oligodendrocytes goes far beyond the mere provision of myelin. Klaus-Armin Nave and his team at the Max Planck Institute in Göttingen already succeeded in demonstrating years ago that healthy glial cells are also essential for the long-term function and survival of the axons themselves, irrespective of myelination. “The way in which the oligodendrocytes functionally support their associated axons was not clear to us up to now,” says Nave. In a new study, the researchers were able to show that the glial cells are involved in, among other things, the replenishment of energy in the nerve fibres. “They could be described as the petrol stations on the data highway of the axons,” says Nave, explaining the results.


Electron microscope cross-section image of the nerve fibres (axons) of the optic nerve. Axons are surrounded by special glial cells, the oligodendrocytes, wrapping themselves around the axons in several layers. Between the axons, there are extensions of astrocytes, another type of glial cells. © K.-A.Nave/MPI f. Experimental Medicine

But how does the energy refuelling work? Is there a metabolic connection between the oligodendrocytes and axons? To find out, Ursula Fünfschilling generated genetically modified mice in which mitochondrial function was deliberately disrupted in the oligodendrocytes by inactivating the Cox10 gene. This affects the final stage of sugar breakdown in the mitochondria, where energy is harnessed in a process known as the respiratory chain. If a link in this chain is missing, in this instance cytochrome oxidase, which is functional only when cells have the enzyme Cox10, the glial cells gradually lose the capacity for cellular respiration in their mitochondria. "Without functioning respiration, the manipulated glial cells of the nervous system should have died," explains the scientist. That is, unless the small amount of energy harnessed by splitting glucose into pyruvate or lactic acid, a process known as glycolysis, is sufficient for them.

And this is precisely what the scientists observed in their mice: the animals’ myelin initially formed in the normal way. The subsequent loss of the mitochondrial respiratory chain did not appear to affect the glial cells in the central nervous system. Even one year later, no neurodegenerative changes were observed in the brain. The scientists assume that in the early weeks of life, a phase of maximum energy requirement, the mutated oligodendrocytes still rely on many intact mitochondria. More mature oligodendrocytes appear to reduce mitochondrial respiration and switch to energy generation through increased glycolysis. In healthy glial cells this has the advantage that the metabolic products arising from the breakdown of glucose can be used as building blocks for myelin synthesis. In addition, the lactic acid produced in the oligodendrocytes can be passed on to the axons, where it can be used to produce energy with the help of the axons’ own mitochondria.

“The complete loss of the respiratory chain in the deliberately modified oligodendrocytes probably amplifies a developmental step that unfolds naturally,” explains Nave. Thus the loss of glial mitochondria does not lead to a deterioration of the axons’ energy supply but, on the contrary, to an oversupply of usable lactic acid. The affected nerve pathways demonstrably have no problem metabolising the lactic acid from the oligodendrocytes; transport proteins ensure its rapid transfer between the oligodendrocytes and their myelinated axons.

This finding provides a new understanding of the role of oligodendrocytes: in addition to their known role in myelination, they can directly supply the axons with glucose products that, with the help of axonal mitochondria, serve as fuel in periods of high activity. This metabolic coupling between glial cells and axons could explain, among other things, why in many myelin diseases – multiple sclerosis, for example – the affected demyelinated axons often suffer irreversible damage.

Source: Neuroscience News

May 14, 2012 · 16 notes
#science #neuroscience #brain #psychology #neuron
Key Cellular Mechanisms Behind the Onset of Tinnitus Identified

ScienceDaily (May 10, 2012) — Research into hearing loss after exposure to loud noises could lead to the first drug treatments to prevent the development of tinnitus.

Researchers in the University of Leicester’s Department of Cell Physiology and Pharmacology have identified a cellular mechanism that could underlie the development of tinnitus following exposure to loud noises. The discovery could lead to novel tinnitus treatments, and investigations into potential drugs to prevent tinnitus are currently underway.

Tinnitus is a sensation of phantom sounds, usually ringing or buzzing, heard in the ears when no external noise is present. It commonly develops after exposure to loud noises (acoustic over-exposure), and scientists have speculated that it results from damage to nerve cells connected to the ears.

Although hearing loss and tinnitus affect around ten percent of the population, there are currently no drugs available to treat or prevent tinnitus.

University of Leicester researcher Dr Martine Hamann, who led the study published in the journal Hearing Research, said: “We need to know the implications of acoustic over exposure, not only in terms of hearing loss but also what’s happening in the brain and central nervous system. It’s believed that tinnitus results from changes in excitability in cells in the brain — cells become more reactive, in this case more reactive to an unknown sound.”

Dr Hamann and her team, including PhD student Nadia Pilati, looked at cells in an area of the brain called the dorsal cochlear nucleus — the relay carrying signals from nerve cells in the ear to the parts of the brain that decode and make sense of sounds. Following exposure to loud noises, some of the nerve cells (neurons) in the dorsal cochlear nucleus start to fire erratically, and this uncontrolled activity eventually leads to tinnitus.

Dr Hamann said: “We showed that exposure to loud sound triggers hearing loss a few days after the exposure to the sound. It also triggers this uncontrolled activity in the neurons of the dorsal cochlear nucleus. This is all happening very quickly, in a matter of days.”

In a key breakthrough, achieved in collaboration with GSK, which sponsored Dr Pilati’s PhD, the team also discovered the specific cellular mechanism that leads to the neurons’ over-activity. Malfunctions in specific potassium channels that help regulate the nerve cells’ electrical activity mean the neurons cannot return to an equilibrium resting state.

Ordinarily, these cells fire only in regular patterns and therefore return regularly to a resting state. If the potassium channels are not working properly, however, the cells cannot return to rest and instead fire continuously in random bursts, creating the sensation of constant noise where none exists.

Dr Hamann explained: “In normal conditions the channel helps to drag down the cellular electrical activity to its resting state and this allows the cell to function with a regular pattern. After exposure to loud sound, the channel is functioning less and therefore the cell is constantly active, being unable to reach its resting state and displaying those irregular bursts.”
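The mechanism Dr Hamann describes can be caricatured in a toy simulation (an illustration of the general principle only, not the study's biophysical model): a spike-triggered, potassium-like adaptation current normally drags the membrane back toward rest, and weakening it leaves the cell firing almost continuously.

```python
# Toy adaptive integrate-and-fire neuron (illustrative only, not the
# study's model). After each spike, a potassium-like adaptation current
# builds up and drags the membrane back toward rest; weakening that
# current (g_k) leaves the cell firing almost continuously.
def count_spikes(g_k, steps=5000, dt=0.1):
    v = -65.0          # membrane potential (mV)
    w = 0.0            # adaptation ("potassium") current
    v_rest, v_thresh, v_reset = -65.0, -50.0, -60.0
    drive = 20.0       # constant background drive
    spikes = 0
    for _ in range(steps):
        v += (-(v - v_rest) + drive - g_k * w) * dt
        w += -w / 50.0 * dt      # adaptation decays (~50 ms time constant)
        if v >= v_thresh:        # spike: reset and boost adaptation
            spikes += 1
            v = v_reset
            w += 1.0
    return spikes

healthy = count_spikes(g_k=2.0)  # strong adaptation: firing held in check
damaged = count_spikes(g_k=0.0)  # weak channels: runaway firing
```

With these invented parameters the weak-channel cell fires many times more often than the healthy one, which is the qualitative picture of excitability change described above.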

Although many researchers have investigated the mechanisms underlying tinnitus, this is the first time that cellular bursting activity has been characterised and linked to specific potassium channels. Identifying the potassium channels involved in the early stages of tinnitus opens up new possibilities for preventing tinnitus with early drug treatments.

Dr Hamann’s team is currently investigating potential drugs that could regulate the damaged cells, preventing their erratic firing and returning them to a resting state. If suitable drug compounds are discovered, they could be given to patients who have been exposed to loud noises to protect them against the onset of tinnitus.

These investigations are still in the preliminary stages, and any drug treatment would still be years away.

Source: Science Daily

May 10, 2012 · 6 notes
#science #neuroscience #hearing #psychology #brain
Testosterone-Fueled Infantile Males Might Be a Product of Mom's Behavior

ScienceDaily (May 10, 2012) — By comparing the testosterone levels of five-month old pairs of twins, both identical and non-identical, University of Montreal researchers were able to establish that testosterone levels in infancy are not inherited genetically but rather determined by environmental factors.


Angry boy. Testosterone levels in infancy are not inherited genetically but rather determined by environmental factors, new research suggests. (Credit: © crestajohnson / Fotolia)

"Testosterone is a key hormone for the development of male reproductive organs, and it is also associated with behavioural traits, such as sexual behaviour and aggression," said lead author Dr. Richard E. Tremblay of the university’s Research Unit on Children’s Psychosocial Maladjustment. "Our study is the largest to be undertaken with newborns, and our results contrast with the findings gained by scientists working with adolescents and adults, indicating that testosterone levels are inherited."

The findings were presented in an article published in Psychoneuroendocrinology on May 7, 2012.

The researchers took saliva samples from 314 pairs of twins and measured the levels of testosterone. They then compared the similarity in testosterone levels between identical and fraternal twins to determine the contribution of genetic and environmental factors. Results indicated that differences in levels of testosterone were due mainly to environmental factors. “The study was not designed to specifically identify these environmental factors, which could include a variety of environmental conditions, such as maternal diet, maternal smoking, breastfeeding and parent-child interactions.”
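The logic of that comparison can be sketched with the classical Falconer decomposition used in twin studies (a textbook formula, not necessarily the exact statistical model of this paper): heritability is estimated from how much more alike identical (MZ) twins are than fraternal (DZ) twins.

```python
# Classical twin-design decomposition (Falconer's formula) -- a textbook
# sketch, not necessarily the study's exact statistical model.
def falconer(r_mz, r_dz):
    """r_mz, r_dz: within-pair trait correlations for identical (MZ)
    and fraternal (DZ) twin pairs."""
    h2 = 2 * (r_mz - r_dz)  # additive genetic contribution (heritability)
    c2 = 2 * r_dz - r_mz    # shared (family) environment
    e2 = 1 - r_mz           # unique environment + measurement error
    return h2, c2, e2

# Hypothetical correlations: if MZ and DZ pairs are about equally
# similar -- the pattern reported here for infant testosterone -- the
# genetic estimate comes out near zero and environment dominates.
h2, c2, e2 = falconer(r_mz=0.45, r_dz=0.43)
```

With these invented correlations the genetic estimate h2 is only 0.04, matching the qualitative pattern reported: when identical and fraternal pairs are about equally similar, environment rather than genes explains the variation.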

"Because our study suggests that testosterone levels in infants are determined by the circumstances in which the child develops before and after birth, further studies will be needed to find out exactly what these influencing factors are and to what extent they change from birth to puberty," Tremblay said.

Source: Science Daily

May 10, 2012 · 6 notes
#science #neuroscience #brain #psychology
Evolution’s Gift May Also Be at the Root of a Form of Autism

May 10th, 2012

A recently evolved pattern of gene activity in the language and decision-making centers of the human brain is missing in a disorder associated with autism and learning disabilities, a new study by Yale University researchers shows.

“This is the cost of being human,” said Nenad Sestan, associate professor of neurobiology, researcher at Yale’s Kavli Institute for Neuroscience, and senior author of the paper. “The same evolutionary mechanisms that may have gifted our species with amazing cognitive abilities have also made us more susceptible to psychiatric disorders such as autism.”

The findings are reported in the May 11 issue of the journal Cell.

In the Cell paper, Kenneth Kwan, the lead author, and other members of the Sestan laboratory identified the evolutionary changes that led the NOS1 gene to become active specifically in the parts of the developing human brain that form the adult centers for speech and language and decision-making. This pattern of NOS1 activity is controlled by a protein called FMRP and is missing in Fragile X syndrome, a disorder caused by a genetic defect on the X chromosome that disrupts FMRP production. Fragile X syndrome, the leading inherited form of intellectual disability, is also the most common single-gene cause of autism. The loss of NOS1 activity may contribute to some of the many cognitive deficits suffered by those with Fragile X syndrome, such as lower IQ, attention deficits, and speech and language delays, the authors say.

The pattern of NOS1 activity in these brain centers does not occur in the developing mouse brain — suggesting that it is a more recent evolutionary adaptation possibly involved in the wiring of neural circuits important for higher cognitive abilities. The findings of the Cell paper support this hypothesis. The study also provides insights into how genetic deficits in early development, a time when brain circuits are formed, can lead to disorders such as autism, in which symptoms appear after birth.

“This is an example of where the function of genetic changes that likely drove aspects of human brain evolution was disrupted in disease, possibly reverting some of our newly acquired cognitive abilities and thus contributing to a psychiatric outcome,” Kwan said.


Artist’s representation of the early developmental brain cells that, when disrupted, cause Fragile X syndrome. Adapted from a Yale University press release image.

By Bill Hathaway

Source: Neuroscience News

May 10, 2012 · 19 notes
#science #neuroscience #psychology #brain #autism
Researchers identify genetic mutation causing rare form of spinal muscular atrophy

May 10, 2012

Scientists have confirmed that mutations of a gene are responsible for some cases of a rare, inherited disease that causes progressive muscle degeneration and weakness: spinal muscular atrophy with lower extremity predominance, also known as SMA-LED.

"Typical spinal muscular atrophies begin in infancy or early childhood and are fatal, involving all motor neurons, but SMA-LED predominantly affects nerve cells controlling muscles of the legs. It is not fatal and the prognosis is good, although patients usually are moderately disabled and require assistive devices such as bracing and wheelchairs throughout their lives," said Robert H. Baloh, MD, PhD, director of Cedars-Sinai Medical Center’s Neuromuscular Division and senior author of a Neurology article describing the new findings on DYNC1H1.

DYNC1H1 encodes a molecule inside cells that acts as a motor to transport cellular components. Using cells cultured from patients, Baloh’s group showed that the mutation disrupts this motor’s function. The researchers also found that some subjects with mutations had global developmental delay in addition to weakness, indicating that the brain is involved as well.

"Our observations suggest that a range of DYNC1H1-related disease exists in humans – from a widespread neurodevelopmental abnormality of the central nervous system to more selective involvement of certain motor neurons, which manifests as spinal muscular atrophy," Baloh said.

He pointed out that while this molecule is responsible for some inheritable cases of spinal muscular atrophy with lower extremity predominance, the genetic mutation is absent in others. The search continues, therefore, to find other culprit genetic mutations and develop biological therapies to correct them.

"Although this is a rare form of motor neuron disease, it tells us that dynein function – the molecular motor – is crucial for the development and maintenance of motor neurons, which we hope will provide insight into the common form of spinal muscular atrophy and also amyotrophic lateral sclerosis," Baloh said. ALS (also known as Lou Gehrig’s disease) is a progressive neurodegenerative disease that affects nerve cells in the brain and spinal cord.

Baloh, an expert in treating and studying neuromuscular and neurodegenerative diseases, joined Cedars-Sinai in early 2012, working with other physicians and scientists in the Department of Neurology and the Regenerative Medicine Institute to establish one of the most comprehensive neuromuscular disease treatment and research teams in California.

Provided by Cedars-Sinai Medical Center

Source: medicalxpress.com

May 10, 2012 · 3 notes
#science #neuroscience #psychology #biology #disease
Mathematical model unlocks key to brain wiring

May 10, 2012

(Medical Xpress) — A new mathematical model predicting how nerve fibres make connections during brain development could aid understanding of how some cognitive disorders occur.

The model, constructed by scientists at the Queensland Brain Institute (QBI) and School of Mathematics and Physics at the University of Queensland (UQ), gives new insight into how changing chemical levels in nerve fibres can modify nerve wiring underpinning connections in the brain.

Professor Geoff Goodhill says that while scientists have long known that changing these chemical levels can alter where nerve fibres grow, only now are they beginning to understand why.

“Our mathematical model allows us to predict precisely how these chemical levels control the direction in which nerve fibres grow, during both neural development and regeneration after injury,” he said.
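What such a prediction looks like can be conveyed with a deliberately minimal gradient-following sketch (an illustration of chemotactic guidance in general, not the QBI model): the simulated growth cone compares the guidance chemical's concentration on its two sides and turns toward the higher side.

```python
import math

def grow_axon(concentration, start, heading, steps=200,
              step_len=1.0, sensor_offset=0.5, gain=1.0):
    """Trace a fibre through a chemical field.

    concentration: function (x, y) -> attractant level at that point.
    """
    x, y = start
    path = [(x, y)]
    for _ in range(steps):
        # sample the field at the left and right edges of the growth cone
        lx = x + sensor_offset * math.cos(heading + math.pi / 2)
        ly = y + sensor_offset * math.sin(heading + math.pi / 2)
        rx = x + sensor_offset * math.cos(heading - math.pi / 2)
        ry = y + sensor_offset * math.sin(heading - math.pi / 2)
        c_left = concentration(lx, ly)
        c_right = concentration(rx, ry)
        mean = max((c_left + c_right) / 2.0, 1e-9)
        # turn toward the side with more attractant, scaled by the
        # fractional concentration difference across the cone
        heading += gain * (c_left - c_right) / mean
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
        path.append((x, y))
    return path

# an attractant whose level rises with y: a fibre launched along the
# x-axis should bend upward toward the source
path = grow_axon(lambda x, y: 1.0 + 0.05 * y, start=(0.0, 0.0), heading=0.0)
```

Raising the gradient steepness or the sensor gain in this toy makes the fibre turn more sharply, which is the qualitative sense in which changing chemical levels redirects nerve-fibre growth.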

Correct brain wiring is fundamental for normal brain function.

Recent discoveries suggest that wiring problems may underpin a number of nervous system disorders including autism, dyslexia, Down syndrome, Tourette’s syndrome and Parkinson’s disease.

The new model, published in the Cell Press journal Neuron, demonstrates the important role mathematics can play in understanding how the brain develops – and, perhaps ultimately, in preventing such disorders.

Provided by University of Queensland 

Source: medicalxpress.com

May 10, 2012 · 7 notes
#science #neuroscience #brain #psychology
Researchers move closer to delaying dementia

May 10, 2012

(Medical Xpress) — Scientists at the University of Queensland’s Queensland Brain Institute are one step closer to developing new therapies for treating dementia.

QBI’s Dr Jana Vukovic said the work was aimed at understanding the molecular mechanism that may impair learning and memory in the ageing population.

“Ageing slows the production of new nerve cells, reducing the brain’s ability to form new memories,” said Dr Vukovic, who performed the work in the laboratory of Professor Perry Bartlett, the Director of QBI at The University of Queensland.

"But our research shows for the first time that the brain cells usually responsible for mediating immunity, microglia, have an inhibitory effect on memory during ageing.

“Furthermore, they have shown that a molecule produced by nerve cells, fractalkine, can reverse this process and stimulate stem cells to produce new neurons.”

The discovery, published in The Journal of Neuroscience today, came after QBI scientists observed that the increased production of new neurons in mice that were actively running was due to the release of fractalkine in the hippocampus – the brain structure responsible for specific types of learning and memory.

Professor Bartlett said it had been known for some time that exercise increased the production of new nerve cells in the hippocampus in young and even aged mice.

“But this study found that it is fractalkine that appears to be specifically mediating this effect by making the microglia produce factors that activate the stem cells that produce new nerve cells,” he said.

“Once the cells are activated they divide and produce new cells, which underpin the animal’s ability to learn and form memories.

"This means that fractalkine may form the basis for the development of future therapies.

“The discovery is especially exciting because we have found that older animals suffering cognitive decline showed significantly lower levels of fractalkine.

“We are seeking ways of increasing fractalkine levels in patients with cognitive decline, and hoping this may be a new frontline therapy in treating dementia.”

Dr Vukovic said that until relatively recently, it was thought the adult brain was incapable of generating new neurons.

“But work from Professor Bartlett’s laboratory over the past 20 years has demonstrated that the brains of adult animals, including humans, retain the ability to make new nerve cells,” she said.

“The challenge is to find out how to stimulate this production in the aged animal and human where production has slowed.”

The latest work was a significant step toward achieving this goal, she said.

Provided by University of Queensland

Source: medicalxpress.com

May 10, 2012 · 8 notes
#science #neuroscience #brain #psychology
Think global, act local: New roles for protein synthesis at synapses

May 10, 2012

(Medical Xpress) — How do we build a memory in the brain? It is well known that animals (and humans) need new proteins to establish long-term memories. During learning, information is stored at the synapses, the junctions connecting nerve cells. Synapses also require new proteins in order to change their strength (synaptic plasticity). Historically, scientists have focused on the cell body as the place where the required proteins are synthesized. In recent years, however, there has been increasing focus on the dendrites and axons (the compartments that meet to form synapses) as a potential site of protein synthesis.

Protein synthesis machines have been observed there, as well as a limited number of their templates, the messenger RNA (mRNA) molecules. The limited number of mRNAs observed in dendrites and axons placed constraints on the constellation of proteins that could be synthesized to help synapses work and change. Researchers from Erin Schuman’s lab at the Max Planck Institute (MPI) for Brain Research used next-generation sequencing to directly identify a very large number (over 2500) of mRNA molecules present in axons and dendrites. Using high-resolution imaging techniques, they were able to both quantify and visualize individual mRNA molecules. They published their findings in the latest issue of Neuron.

[Video]
Erin Schuman and her colleagues describe how they were able to detect numerous new mRNAs in the processes of neurons with unprecedented sensitivity. Video: Neuron.

Using microarray approaches and/or in situ hybridization techniques, different groups had each identified a hundred or so mRNAs that might reside in the dendrites. By analyzing and comparing three of these studies, the Schuman team discovered something surprising: not a single mRNA type was found in all three. This observation made the scientists at the MPI for Brain Research wonder whether the mRNAs discovered so far were just the tip of the iceberg, with many more mRNA molecules waiting to be found.

To find out, the researchers dissected the neuropil layer of the rat hippocampus. This layer contains a high concentration of axons and dendrites but lacks the cell bodies of pyramidal neurons (the principal cell type in the hippocampus and other brain areas). Sensitive high-resolution sequencing techniques made it possible to detect mRNAs that, owing to their low concentrations, had not been discovered before. The researchers found an impressive 2550 unique mRNAs present in the dendrites and/or axons. To determine their relative abundance, the scientists in Erin Schuman’s lab used the NanoString nCounter, a new technique allowing the high-resolution visualization and quantification of single mRNA molecules. They found that the concentration of mRNAs in the neuronal cells varies by three orders of magnitude. The researchers were also able to classify many of the mRNAs and determine their function in synaptic plasticity; these include signaling molecules, scaffolds and receptors for neurotransmitter molecules. In addition, many mRNAs coding for proteins implicated in diseases such as autism were discovered in the dendrites and axons. Finally, using advanced imaging techniques, the researchers could directly visualize some of the mRNAs in neuronal dendrites, hundreds of micrometers from the cell body.

These results reveal a previously unappreciated, enormous potential for the local protein synthesis machinery to supply, maintain and modify the dendritic and synaptic protein population. Neurons appear to use local control in much the same way that modern societies have learned that the most efficient means of distributing goods is through local distribution centers.

Provided by Max Planck Society

Source: medicalxpress.com

May 10, 2012 · 6 notes
#science #neuroscience #brain #memory #psychology
Researchers say genes and vascular risk modify effects of aging on brain and cognition

May 9, 2012 

Efforts to understand how the aging process affects the brain and cognition have expanded beyond simply comparing younger and older adults.

"Everybody ages differently. By looking at genetic variations and individual differences in markers of vascular health, we begin to understand that preventable factors may affect our chances for successful aging," said Wayne State University psychology doctoral student Andrew Bender, lead author of a study supported by the National Institute on Aging of the National Institutes of Health and now in press in the journal Neuropsychologia.

The report, “Age-related Differences in Memory and Executive Functions in Healthy APOE ε4 Carriers: The Contribution of Individual Differences in Prefrontal Volumes and Systolic Blood Pressure,” focuses on carriers of the ε4 variant of the apolipoprotein (APOE) gene, present in roughly 25 percent of the population. Compared to those who possess other forms of the APOE gene, carriers of the ε4 allele are at significantly greater risk for Alzheimer’s, dementia and cardiovascular disease.

Many studies also have shown that nondemented carriers of the APOE ε4 variant have smaller brain volumes and perform less well on cognitive tests than carriers of other gene variants. Those findings, however, are not consistent, and a possible explanation may come from examining interactions between the risky genes and other factors, such as markers of cardiovascular health. Prior research in typical samples of older adults has shown that indeed other vascular risk factors — such as elevated cholesterol, hypertension or diabetes — can exacerbate the impact of the APOE ε4 variant on brain and cognition, but it is unclear if such synergy of risks is present in healthy adults.

Thus, Wayne State researchers evaluated a group of volunteers from 19 to 77 years of age who self-reported as exceptionally healthy on a questionnaire that screened for a number of conditions, representing a “best case scenario” of healthy aging. The research project, led by Naftali Raz, Ph.D., professor of psychology and director of the Lifespan Cognitive Neuroscience Research Program at WSU’s Institute of Gerontology, tested different cognitive abilities known for their sensitivity to aging and the effects of the APOE ε4 variant. Those abilities include speed of information processing, working memory (holding and manipulating information in one’s mind) and episodic memory (memory for events).

Researchers also measured participants’ blood pressure, performed genetic testing to determine which APOE variant participants carried, and measured the volumes of several critical brain regions using a high-resolution structural magnetic resonance imaging brain scan. Bender and Raz showed that for older APOE ε4 carriers, even minor increases in systolic blood pressure (the higher of the two numbers that are reported in blood pressure measures) were linked with smaller volumes of the prefrontal cortex and prefrontal white matter, slower speed of information processing, reduced working memory capacity and worse verbal memory. Notably, they said, that pattern was not evident in those who lacked the ε4 gene variant.

The study concludes that the APOE ε4 gene may make its carriers sensitive to negative effects of relatively subtle elevations in systolic blood pressure, and that the interplay between two risk factors, genetic and physiological, is detrimental to the key brain structures and associated cognitive functions.
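Statistically, a pattern that appears in one genetic group but not the other is an interaction (moderation) effect. The sketch below illustrates that logic on simulated data; all numbers are invented and this is not the study's analysis.

```python
import random

def slope(xs, ys):
    """Least-squares slope of y regressed on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (yv - my) for x, yv in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

random.seed(0)
bp = [random.uniform(100, 160) for _ in range(500)]  # systolic BP (mmHg)

# invented prefrontal volumes: BP reduces volume only in e4 carriers
vol_e4 = [100 - 0.2 * (b - 120) + random.gauss(0, 2) for b in bp]
vol_non = [100 + random.gauss(0, 2) for _ in bp]

slope_e4 = slope(bp, vol_e4)        # clearly negative
slope_non = slope(bp, vol_non)      # near zero: no BP effect without e4
interaction = slope_e4 - slope_non  # the gene-by-BP interaction
```

A nonzero `interaction` term is what distinguishes "blood pressure harms carriers specifically" from "blood pressure harms everyone": the BP slope differs between genetic groups rather than merely being negative overall.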

"Although genes play a significant role in shaping the effects of age and vascular risk on the brain and cognition, the impact of single genetic variants is relatively small, and there are quite a few of them. Thus, one’s aging should not be seen through the lens of one’s genetic profile," cautioned the study’s authors. They continued, "The negative impact of many genetic variations needs help from other risk factors, and while there isn’t much one can do about genes, a lot can be done about vascular risk factors such as blood pressure or cholesterol."

"Everybody should try to keep those in check, although people with certain genetic variants more so than others," Raz said. "Practically speaking, even with the best deck of genetic cards dealt to you, it still makes sense to reduce risk through whatever works: exercise, diet or, if those fail, medication."

Because the study is part of a longitudinal project, he and Bender said the immediate task now is to determine how the interaction between risky genes and vascular risk factors affects the trajectory of age-related changes — not differences, as in this cross-sectional study — in brain and cognition.

Provided by Wayne State University

Source: medicalxpress.com

May 10, 2012 · 1 note
#science #neuroscience #brain #psychology
Chronic cocaine use triggers changes in brain's neuron structure

May 9, 2012

Chronic exposure to cocaine reduces the expression of a protein known to regulate brain plasticity, according to new, in vivo research on the molecular basis of cocaine addiction. That reduction drives structural changes in the brain, which produce greater sensitivity to the rewarding effects of cocaine.


The research, led by UB’s Dietz, suggests a potential new target for development of a treatment for cocaine addiction. Credit: Douglas Levere, UB Communications

The finding suggests a potential new target for development of a treatment for cocaine addiction. It was published last month in Nature Neuroscience by researchers at the University at Buffalo and Mount Sinai School of Medicine.

"We found that chronic cocaine exposure in mice led to a decrease in this protein’s signaling," says David Dietz, PhD, assistant professor of pharmacology and toxicology in the School of Medicine and Biomedical Sciences, who did the work while at Mt. Sinai. "The reduction of the expression of the protein, called Rac1, then set in motion a cascade of events involved in structural plasticity of the brain — the shape and growth of neuronal processes in the brain. Among the most important of these events is the large increase in the number of physical protrusions or spines that grow out from the neurons in the reward center of the brain.

"This suggests that Rac1 may control how exposure to drugs of abuse, like cocaine, may rewire the brain in a way that makes an individual more susceptible to the addicted state," says Dietz.

The presence of these spines reflects the heightened reward effect the individual obtains from exposure to cocaine. By changing the level of Rac1 expression, Dietz and his colleagues were able to control whether or not the mice became addicted, preventing the cocaine-driven enhancement of the brain’s reward center.

To do the experiment, Dietz and his colleagues used a novel tool that allowed light activation to control Rac1 expression – the first time a light-activated protein had been used to modulate brain plasticity.

"We can now understand how proteins function in a very temporal pattern, so we could look at how regulating genes at a specific time point could affect behavior, such as drug addiction, or a disease state," says Dietz.

In his UB lab, Dietz is continuing his research on the relationship between behavior and brain plasticity, looking, for example, at how plasticity might determine how much of a drug an animal takes and how persistent the animal is in trying to get the drug.

Provided by University at Buffalo

Source: medicalxpress.com

May 10, 2012 · 3 notes
#science #neuroscience #brain #psychology
Scientists identify neurotransmitters that lead to forgetting

May 9, 2012

While we often think of memory as a way of preserving the essential idea of who we are, little thought is given to how important forgetting is to our wellbeing – whether what we forget belongs in the “horrible memories department” or just reflects the minutiae of day-to-day living.

Despite the fact that forgetting is normal, exactly how we forget—the molecular, cellular, and brain circuit mechanisms underlying the process—is poorly understood.

Now, in a study that appears in the May 10, 2012 issue of the journal Neuron, scientists from the Florida campus of The Scripps Research Institute have pinpointed a mechanism that is essential for forming memories in the first place and, as it turns out, is equally essential for eliminating them after memories have formed.

"This study focuses on the molecular biology of active forgetting," said Ron Davis, chair of the Scripps Research Department of Neuroscience who led the project. "Until now, the basic thought has been that forgetting is mostly a passive process. Our findings make clear that forgetting is an active process that is probably regulated."

The Two Faces of Dopamine

To better understand the mechanisms of forgetting, Davis and his colleagues studied Drosophila, or fruit flies, a key model for memory research whose findings have proven highly applicable to humans. The flies were put in situations where they learned that certain smells were associated with either a positive reinforcement, like food, or a negative one, such as a mild electric shock. The scientists then observed changes in the flies’ brains as they remembered or forgot the new information.

The results showed that a small subset of dopamine neurons actively regulate the acquisition of memories and the forgetting of these memories after learning, using a pair of dopamine receptors in the brain. Dopamine is a neurotransmitter that plays an important role in a number of processes including punishment and reward, memory, learning and cognition.

But how can a single neurotransmitter, dopamine, have two seemingly opposite roles in both forming and eliminating memories? And how can these two dopamine receptors serve acquiring memory on the one hand, and forgetting on the other?

The study suggests that when a new memory is first formed, an active, dopamine-based forgetting mechanism—ongoing dopamine neuron activity—begins to erase it unless some importance is attached to it through consolidation, a process that may shield important memories from dopamine-driven forgetting.

The study shows that specific neurons in the brain release dopamine to two different receptors known as dDA1 and DAMB, located on what are called mushroom bodies because of their shape; these densely packed networks of neurons are vital for memory and learning in insects. The study found the dDA1 receptor is responsible for memory acquisition, while DAMB is required for forgetting.

When dopamine neurons begin the signaling process, the dDA1 receptor becomes overstimulated and begins to form memories, an essential part of memory acquisition. Once that memory is acquired, however, these same dopamine neurons continue signaling. Except this time, the signal goes through the DAMB receptor, which triggers forgetting of those recently acquired, but not yet consolidated, memories.

Jacob Berry, a graduate student in the Davis lab who led the experimentation, showed that inhibiting dopamine signaling after learning enhanced the flies’ memory, while hyperactivating those same neurons after learning erased memory. And a mutation in one of the receptors, dDA1, produced flies unable to learn, while a mutation in the other, DAMB, blocked forgetting.

Intriguing Issues

While Davis was surprised by the mechanisms the study uncovered, he was not surprised that forgetting is an active process. “Biology isn’t designed to do things in a passive way,” he said. “There are active pathways for constructing things, and active ones for degrading things. Why should forgetting be any different?”

The study also brings into focus a number of intriguing issues, Davis said—savant syndrome, for example.

"Savants have a high capacity for memory in some specialized areas," he said. "But maybe it isn’t memory that gives them this capacity, maybe they have a bad forgetting mechanism. This also might be a strategy for developing drugs to promote cognition and memory—what about drugs that inhibit forgetting as cognitive enhancers?"

Provided by The Scripps Research Institute

Source: medicalxpress.com

May 10, 2012
#science #neuroscience #brain #psychology
Why Do People Choke When the Stakes Are High? Loss Aversion May Be the Culprit

ScienceDaily (May 9, 2012) — In sports, on a game show, or just on the job, what causes people to choke when the stakes are high? A new study by researchers at the California Institute of Technology (Caltech) suggests that when there are high financial incentives to succeed, people can become so afraid of losing their potentially lucrative reward that their performance suffers.

image

In the study, each participant was asked to control this virtual object on a screen. The virtual object consisted of two weighted balls connected by a spring. The task was to place the object, which stretched and contracted as a weighted spring would in real life, into a square target within two seconds. (Credit: Image courtesy of California Institute of Technology)

It is a somewhat unexpected conclusion. After all, you would think that the more people are paid, the harder they will work, and the better they will do their jobs — until they reach the limits of their skills. That notion tends to hold true when the stakes are low, says Vikram Chib, a postdoctoral scholar at Caltech and lead author on a paper published in the May 10 issue of the journal Neuron. Previous research, however, has shown that if you pay people too much, their performance actually declines.

Some experts have attributed this decline to too much motivation: they think that, faced with the prospect of earning an extra chunk of cash, you might get so excited that you will fail to do the task properly. But now, after looking at brain-scan data of volunteers performing a specific motor task, the Caltech team says that what actually happens is that you become worried about losing your potential prize. The researchers also found that the more someone is afraid of loss, the worse they perform.

In the study, each participant was asked to control a virtual object on a screen by moving an index finger that had a tracking device attached to it. The virtual object consisted of two weighted balls connected by a spring. The task was to place the object, which stretched and contracted as a weighted spring would in real life, into a square target within two seconds.

The researchers controlled for individual skill levels by customizing the size of the target so that everyone would have the same success rate. That way, people who happened to be really good or bad at this task would not skew the data.

After a training period, the subjects were asked to perform the task while inside an fMRI machine, which measures blood flow in the brain — a proxy for brain activity, since wherever a brain is active, it needs extra oxygen, and thus a larger volume of blood. By monitoring blood flow, the researchers can pinpoint areas of the brain that turn on when a particular task is performed.

The task began with the researchers offering the participants a randomized range of rewards — from $0 to $100 — if they could successfully place the object into the square within the time limit. At the end of hundreds of trials — each with varying reward amounts — the participant was given their reward, based on the result of just one of the trials, picked at random.

As expected, the team found that performance improved as the incentives increased — but only when the cash reward amounts were at the low end of the spectrum. Once the rewards passed a certain threshold, which depended on the individual, performance began to fall off.

Incentives are known to activate a part of your brain called the ventral striatum, Chib says; the researchers thus expected to see the ventral striatum become increasingly active as they bumped up the prizes. And if the conventional thought were correct — that the reason for the observed performance decline was over-motivation — they would expect the striatum to continue showing a lot of activation when the incentives became high enough for performance to suffer.

What they found, instead, was that when the participants were shown their potential rewards, activity in the striatum did indeed increase with rising incentives. But once the volunteers started doing the task, striatal activity decreased with rising incentives. They also noticed that the less activity they saw in a participant’s striatum, the worse that person performed on the task.

Other studies have shown that decreasing striatal activity is related to fear or aversion to loss, Chib says. “When people see the incentive that they’re being offered, they initially encode it as a gain,” he explains. “But when they’re actually doing the task, the thing that causes them to perform poorly is that they worry about losing a potential incentive they haven’t even received yet.” He adds, “We’re showing loss aversion even though there are no explicit losses anywhere in the task — that’s very strange and something you really wouldn’t expect.”

To further test their hypothesis, Chib and his colleagues decided to measure how loss-averse each participant was. They had the participants play a coin-flip game in which there was an equal chance they could win or lose varying amounts of money.

Each participant was offered varying potential win-loss amounts ($20-$20, $20-$10, $20-$5, for example), and then given the opportunity to either accept each possible gamble or decline it. The win-loss ratio at which the subjects chose to take the gamble provided a measure of how loss-averse each person was; someone willing to gamble even when they might win or lose $20 is less loss-averse than someone who is only willing to gamble if they can win $20 but only lose $5.
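The acceptance threshold described above maps onto the standard way of quantifying loss aversion: the gain-to-loss ratio at which a subject is just willing to accept a 50/50 gamble. The sketch below shows that convention on the article's own example amounts; the exact fitting procedure used in the paper may differ.

```python
# Estimate a loss-aversion coefficient from accepted 50/50 coin-flip gambles.
# Convention (after Kahneman & Tversky): the subject accepts a gamble when
# gain/loss >= lambda, so the smallest accepted ratio estimates lambda.

def loss_aversion(accepted):
    """accepted: list of (gain, loss) pairs the subject agreed to gamble on.
    Returns the gain/loss ratio of the least favorable accepted gamble."""
    return min(gain / loss for gain, loss in accepted)

# Someone willing to risk losing $20 to win $20 is barely loss-averse...
tolerant = loss_aversion([(20, 20), (20, 10), (20, 5)])  # 1.0
# ...while someone who only gambles at win $20 / lose $5 is strongly so.
averse = loss_aversion([(20, 5)])                        # 4.0
```

Under this reading, the study's finding is that the higher a participant's coefficient, the lower the incentive level at which their task performance peaked and began to decline.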

Once the numbers had been crunched and compared to the original experiment, it turned out that the more loss-averse a participant was, the worse they did on the task when the stakes were high. And for a particularly loss-averse person, the threshold at which their performance started to decline did not have to be very high. “If you’re more loss-averse, it really hurts you,” Chib says. “You’re going to reach peak performance at a lower incentive level, and your performance is also going to be worse for higher incentives.”

"Previously, it’s been shown that the ventral striatum is involved in mediating performance increases in response to rising incentives," says John O’Doherty, professor of psychology and coauthor of the paper. "But our study shows that changes in activity in this same region can, under certain situations, also lead to worsening performance."

While this study only involved a specific motor task and financial incentives, these results may well be universal, says Shinsuke Shimojo, the Gertrude Baltimore Professor of Experimental Psychology and another coauthor of the study. “The implications and applications can include any sort of decision making that contains high stakes and uncertainties, such as business and politics.”

These findings, the researchers say, might be used to develop new ways to motivate people to perform better or to train them to be less loss-averse. “This loss aversion can be an important way of deciding how to set up incentive mechanisms and how to figure out who’s going to perform well and who isn’t,” Chib says. “If you can train somebody to be less loss-averse, maybe you can help them avoid performing poorly in stressful situations.”

Source: Science Daily

May 10, 2012
#science #neuroscience #brain #psychology
Response to first drug treatment may signal likelihood of future seizures in people with epilepsy

May 9, 2012

How well people with newly diagnosed epilepsy respond to their first drug treatment may signal the likelihood that they will continue to have more seizures, according to a study published in the May 9, 2012, online issue of Neurology, the medical journal of the American Academy of Neurology.

"Our research shows a pattern based on how a person responds to initial treatment and specifically, to their first two courses of drug treatment," said study author Patrick Kwan, MD, PhD, with the University of Melbourne in Australia.

For the study, 1,098 people from Scotland between the ages of nine and 93 with newly diagnosed epilepsy were followed for as long as 26 years after being given their first drug therapy. Participants were considered seizure-free if they had no seizures for at least a year without changes in their treatment. If they had further seizures, a second drug was chosen to be given alone or to be added to the first. If seizures continued, a third drug regimen was selected, and the process continued for up to nine drug regimens.

The study found that 50 percent of the people were seizure-free after the first drug tried, 13 percent were seizure-free after the second drug regimen tried and 4 percent were seizure-free after the third drug regimen tried. Less than two percent of the participants stopped having seizures on additional drug treatment courses up to the seventh one tried, and none became seizure-free after that.
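The per-regimen figures imply steeply diminishing returns, and some rough arithmetic on them (approximating the "less than two percent" for regimens four through seven as a combined 1 percentage point) shows how the totals line up with the 68 percent seizure-free at the end of the study:

```python
# Rough cumulative arithmetic on the reported response rates: percent of
# all 1,098 participants becoming seizure-free on each successive regimen.
# The <2% spread across regimens 4-7 is approximated here as a single 1%.
per_regimen = {"1st": 50, "2nd": 13, "3rd": 4, "4th-7th": 1}

cumulative = 0
for regimen, pct in per_regimen.items():
    cumulative += pct
    print(f"after {regimen} regimen: ~{cumulative}% seizure-free")
```

The running total reaches roughly 68 percent, and 63 of those percentage points (over nine in ten of the eventual responders) come from the first two drugs alone, which is the pattern behind the editorial's advice to re-evaluate patients who fail two treatment courses.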

The research also found that 37 percent of people in the study became seizure-free within six months of treatment. Another 22 percent became seizure-free more than six months after starting treatment. Both groups continued to be seizure-free. However, 16 percent had fluctuating periods of seizure freedom and relapses, and 25 percent were never seizure-free for one year.

At the end of the study, 749 people (68 percent) were seizure-free and 678 people (62 percent) were on only one drug. The results were independent of the age when the person had the first seizure or the type of epilepsy.

"A person who doesn’t respond well to two courses of epilepsy drug treatment should be further evaluated to verify an epilepsy diagnosis and to identify whether surgery is the best next step," said Patricia E. Penovich, MD, with the Minnesota Epilepsy Group PA and the University of Minnesota School of Medicine in St. Paul, Minn., and a Fellow with the American Academy of Neurology, who wrote an accompanying editorial on the study.

Provided by American Academy of Neurology

Source: medicalxpress.com

May 9, 2012
#science #neuroscience #brain #psychology
The music of the (hemi)spheres sheds new light on schizophrenia

May 9, 2012

In 1619, the pioneering astronomer Johannes Kepler published Harmonices Mundi, in which he analyzed data on the movement of planets and asserted that the laws of nature governing planetary motion show features of harmonic relationships in music. In so doing, Kepler provided important support for the then-controversial model of the universe proposed by Copernicus.

In the latest issue of Biological Psychiatry, researchers at the University of California, San Diego suggest that careful analyses of the electrical signals of brain activity, measured using electroencephalography (EEG), may reveal important harmonic relationships in the electrical activity of brain circuits.

The underlying premise is a simple one - that brain function is expressed by circuits that fire, and therefore generate oscillating EEG signals, at different frequencies.

High-frequency EEG activity called gamma, for example, might reflect the activity of fast-spiking cells, often a subclass of inhibitory nerve cells containing parvalbumin. Represented musically, this would be a high pitch, i.e., toward the right side of the piano.

Lower frequency EEG activity, called theta, might come from cells that fire with a lower frequency.

As circuits interact with each other, one would see different “musical combinations”, like the chords of music, emerging in the EEG signal. Abnormalities in the structure and function of brain circuits would be reflected in cacophonous music, chords where the musical “voices” are firing at the wrong rate (pitch), volume (amplitude), or timing.

It is increasingly evident that schizophrenia is a disorder characterized by disturbances in the “music of the brain hemispheres.” This new report describes relationships between low- and high-frequency EEG oscillations in the human brain produced when high frequency auditory stimuli are presented to a research subject. The authors observed relatively slower oscillations and reduced cross-phase synchrony (for example, peak of theta coinciding with peak of gamma) in schizophrenia patients compared to healthy study participants.

Dr. John Krystal, Editor of Biological Psychiatry, commented, “The new findings highlight the importance of understanding the relationships between different circuits. It seems that cortical abnormalities in schizophrenia disturb brain function, in part, by disturbing the ‘tuning’ of brain circuits in relation to each other.”

Provided by Elsevier

Source: medicalxpress.com

May 9, 2012
#science #neuroscience #brain #psychology
Researchers Discover a New Family of Key Mitochondrial Proteins for the Function and Variability of the Brain

May 9th, 2012

This family comprises a cluster of six genes that may be altered in neurological conditions, such as Parkinson’s and Charcot-Marie-Tooth disease.

A team headed by Eduardo Soriano at the Institute for Research in Biomedicine (IRB Barcelona) has published a study in Nature Communications describing a new family of six genes whose function regulates the movement and position of mitochondria in neurons. Many neurological conditions, including Parkinson’s and various types of Charcot-Marie-Tooth disease, are caused by alterations of genes that control mitochondrial transport, a process that provides the energy required for cell function.

“We have identified a set of new genes that are highly expressed in the nervous system and have a specific function in a biological process that is crucial for the activity and viability of the nervous system”, explains Eduardo Soriano, head of the Neurobiology and Cell Regeneration group at IRB Barcelona and full professor at the University of Barcelona (UB).

By means of comparative genomic analyses, the scientists have discovered that these genes are found only in placental mammals, the so-called Eutheria, which are characterized by internal fertilization and development. “This finding indicates the relevance of mitochondrial biology. When the brain evolved in size, function and structure, the mitochondrial transport process also became more complex and probably required additional regulatory mechanisms”, says Soriano. “Likewise, given the origin of the gene cluster, in the transition between primitive mammals, such as marsupials (kangaroos), and the remaining placental mammals, it is tempting to propose that the cluster is linked to the increased complexity of the cerebral cortex in the lineage that leads to humans”, adds the full UB professor Jordi Garcia-Fernàndez, collaborator in the study.

image

In the image, red indicates the localization of mitochondria in a neuron. The new proteins described help to regulate their positions in the cell. Image adapted from IRB Barcelona press release image.

Correct brain function is highly energy-demanding. However, this energy must be finely distributed throughout neurons — cells whose ramifications can reach tens of centimetres in length, from the brain to the limbs. This cluster of genes forms part of the “wheel” machinery of mitochondria and regulates their localization within each cell on the basis of its energy requirements. “These genes would be like an extra control in cellular mitochondrial trafficking and they interact with the major proteins associated with the regulation of mitochondrial transport”, explains Soriano.

Another striking characteristic of these new proteins is that they are found both in mitochondria, the function of which has already been described, and in the cell nucleus, where their function is unknown. “They may also be involved in the regulation of gene expression, a possibility that we are now studying”. In addition to their potential involvement in brain pathologies, the researchers believe that these proteins may be related to metabolic diseases and cancer.

Source: Neuroscience News

May 9, 2012
#science #neuroscience #brain #psychology
Virtual reality allows researchers to measure brain activity during behavior at unprecedented resolution

May 9, 2012

Researchers have developed a new technique which allows them to measure brain activity in large populations of nerve cells at the resolution of individual cells. The technique, reported today in the journal Nature, has been developed in zebrafish to represent a simplified model of how brain regions work together to flexibly control behaviour.

Our thoughts and actions are the product of large populations of nerve cells, called neurons, working in harmony, often millions at a time. Measuring brain activity at detailed resolution in these groups of cells during behaviour has proved extremely challenging. Currently, scientists are restricted to measuring activity in individual brain areas of, for example, moving rats, typically from no more than a few hundred neurons at a time.

Dr Misha Ahrens, a Sir Henry Wellcome Postdoctoral Fellow based at Harvard University and the University of Cambridge, worked with colleagues to develop a technique which allows neuroscientists to study as many as 2,000 neurons simultaneously, anywhere in the brain of a transparent zebrafish. Their work was funded by the Wellcome Trust and the National Institutes of Health.

Dr Ahrens and colleagues created a virtual environment for zebrafish, which allowed them to measure activity in the neurons as the fish ‘moved’. In reality, the zebrafish was paralysed so that the researchers could image its brain; the fish ‘moved’ through the virtual environment via the activity of its motor neuron axons, the cells responsible for generating movement.

Zebrafish are often used as a simple organism to study genetics and characteristics of the nervous system that are conserved in humans. They are genetically modifiable, so by manipulating the fish’s genetic make-up, Dr Ahrens and colleagues created a fish in which all neurons contained a particular protein that increases its fluorescence when the cells are active. The fish are transparent, and so the team were able to use a laser-scanning microscope to see activity in any neuron in the brain of the fish, and up to 2,000 neurons simultaneously.

Dr Ahrens explains: “Our behaviour is determined by thousands, possibly millions, of nerve cells working in harmony. The zebrafish performs complex behaviors, with a brain of about 100,000 neurons, almost all of which are accessible to optical recording of neural activity. Our new technique will help us examine how large networks mediate behaviour, while at the same time telling us what each individual cell is doing.”

Using the technique, Dr Ahrens and colleagues asked the question: do zebrafish adapt their behaviour in response to changes in their environment? To do this, they manipulated the virtual environment to simulate the fish suddenly becoming more “muscular”. This served as a simplified version of what happens when the brain needs to adapt the way it drives behavior, for example, when water temperature changes the efficacy of the muscles, or when the fish gets injured.

Dr Ahrens adds: “The paralyzed fish in the virtual world do indeed adapt their behaviour, by adjusting the amount of impulses the brain sends to the muscles. They also ‘remember’ this change for a while. Imaging the brain everywhere during this behaviour, we identified certain brain regions that were involved, most notably the cerebellum and related structures. This technique opens the possibility that eventually, the behaviour may be used to gain insights into human motor control and motor control deficits.

"Our own motor control is continuously recalibrating itself in a similar way to the fish’s to cope with ever changing conditions of our body and environment, such as when we injure a leg, or if we’re walking on a slippery floor or carrying a heavy bag. The zebrafish’s behaviour is an ultra-simplified version of this and we have been able to gain some insight into how its brain structures drive behaviour. This might someday help us understand how damage to certain brain regions in humans affects the way in which the brain integrates sensory information to control body movements."

Understanding the brain is one of the Wellcome Trust’s five strategic challenges.

Provided by Wellcome Trust

Source: medicalxpress.com

May 9, 2012
#science #neuroscience #brain #psychology
Reduction of excess brain activity improves memory in amnestic mild cognitive impairment

May 9, 2012

Research published in the May 10 issue of the journal Neuron describes a potential new therapeutic approach for improving memory and modifying disease progression in patients with amnestic mild cognitive impairment. The study finds that excess brain activity may be doing more harm than good in some conditions that cause mild cognitive decline and memory impairment.

Elevated activity in specific parts of the hippocampus, a brain region involved in memory, is often seen in disorders associated with an increased risk for Alzheimer’s disease. Amnestic mild cognitive impairment (aMCI), where memory is worse than would be expected for a person’s age, is one such disorder. “In the case of early aMCI, it has been suggested that the increased hippocampal activation may serve a beneficial function by recruiting additional neural resources to compensate for those that are lost,” explains senior study author, Dr. Michela Gallagher, from Johns Hopkins University. “However, animal studies have raised the alternative view that this excess activation may be contributing to memory impairment.”

Dr. Gallagher and colleagues tested how a reduction of hippocampal activity would impact human patients with aMCI. The researchers used a low dose of a drug used clinically to treat epilepsy, for the purpose of reducing hippocampal activity in subjects with aMCI to levels that were similar to activity levels in healthy, age-matched subjects in a control group. The researchers found that treatment with the drug improved performance on a memory task. These findings point to the therapeutic potential of reducing excess activation in the hippocampus in aMCI.

The results also have broader significance as elevated activity in the hippocampus is also observed in other conditions that are thought to precede Alzheimer’s disease, and may be one of the underlying mechanisms of neurodegeneration. “Apart from a direct role in memory impairment, there is concern that elevated activity in vulnerable neural networks could be causing additional damage and, possibly, widespread disease-related degeneration that underlies cognitive decline and the conversion to Alzheimer’s disease,” concludes Dr. Gallagher. “Therefore, reducing the elevated activity in the hippocampus may help to restore memory and protect the brain.”

Provided by Cell Press

More information: Bakker et al., “Reduction of hippocampal hyperactivity improves cognition in amnestic mild cognitive impairment.” Neuron, DOI: 10.1016/j.neuron.2012.03.023

Source: medicalxpress.com

May 9, 2012
#science #neuroscience #brain #psychology #memory