Neuroscience

February 2012

Feb 15, 2012
#science #neuroscience #brain #psychology #motor disease
Turmeric-Based Drug Effective On Alzheimer Flies

ScienceDaily (Feb. 14, 2012) — Curcumin, a substance extracted from turmeric, prolongs life and enhances activity of fruit flies with a nervous disorder similar to Alzheimer's, according to new research. The study, conducted at Linköping University, indicates that it is the initial stages of fibril formation and fragments of the amyloid fibrils that are most toxic to neurons.

Above left are the survival curves for “Alzheimer flies” treated (dashed line) and those not treated with curcumin. The flies that were administered curcumin lived longer and were more active. The scientists identified an accelerated formation of amyloid plaque in the treated flies, which seemed to protect the nerve cells. On the right we see microscopic images of neurons (blue) and plaque (green) in the fruit fly’s brain. The study strengthens the hypothesis that a curcumin-based drug can contribute to toxic fibrils being encapsulated (bottom left of the figure). (Credit: Per Hammarström, Ina Caesar)

Ina Caesar, the lead author, published the results of the study in the journal PLoS ONE.

For several years curcumin has been studied as a possible drug candidate to combat Alzheimer’s disease, which is characterized by the accumulation of sticky amyloid-beta and Tau protein fibres. Linköping researchers wanted to investigate how the substance affected transgenic fruit flies (Drosophila melanogaster), which developed evident Alzheimer’s symptoms. The fruit fly is increasingly used as a model for neurodegenerative diseases.

Five groups of diseased flies with different genetic manipulations were administered curcumin. They lived up to 75% longer and maintained their mobility longer than the sick flies that did not receive the substance.

However, the scientists saw no decrease of amyloid in the brain or eyes. Curcumin did not dissolve the amyloid plaque; on the contrary, it accelerated the formation of fibres by reducing the amount of their precursor forms, known as oligomers.

"The results confirm our belief that it is the oligomers that are most harmful to the nerve cells," says Professor Per Hammarström, who led the study.

"We now see that small molecules in an animal model can influence the amyloid form. To our knowledge the encapsulation of oligomers is a new and exciting treatment strategy," he said.

Several theories have been established about how oligomers can instigate the disease process. According to one hypothesis, they become trapped at synapses, inhibiting nerve impulse signals. Others claim that they cause cell death by puncturing the cell membrane.

Curcumin is extracted from the root of the herbaceous plant turmeric and has been used as a medicine for thousands of years. More recently, it has been tested against pain, thrombosis and cancer.

Source: Science Daily

Feb 15, 2012 · 1 note
#science #neuroscience #psychology #brain #alzheimer
People Forage for Memories in the Same Way Birds Forage for Berries

February 14th, 2012

Humans move between ‘patches’ in their memory using the same strategy as bees flitting between flowers for pollen or birds searching among bushes for berries.

Researchers at the University of Warwick and Indiana University have identified parallels between animals looking for food in the wild and humans searching for items within their memory – suggesting that people with the best ‘memory foraging’ strategies are better at recalling items.

Scientists asked people to name as many animals as they could in three minutes and then compared the results with a classic model of optimal foraging in the real world, the marginal value theorem, which predicts how long animals will stay in one patch before jumping to another.

Dr Thomas Hills, associate professor in the psychology department at the University of Warwick, said: “A bird’s food tends to be clumped together in a specific patch – for example on a bush laden with berries.

“But when the berries on a bush are depleted to the point where the bird’s energy is best focused on another more fruitful bush, it will move on.

“This kind of behaviour is predicted by the marginal value theorem, for a wide variety of animals.

“Because of the way human attention has evolved, we wondered if humans might use the same strategies to forage in memory. It turns out, they do.

“When faced with a memory task, we focus on specific clusters of information and jump between them like a bird between bushes. For example, when hunting for animals in memory, most people start with a patch of household pets—like dog, cat and hamster.

“But then as this patch becomes depleted, they look elsewhere. They might then alight on another semantically distinct ‘patch’, for example predatory animals such as lion, tiger and jaguar.”

The study shows that people who either stayed too long or left too soon from one ‘patch’ did not recall as many animals as those who better judged the best time to switch between patches.

In other words, people who most closely adhered to the marginal value theorem produced more items.
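The marginal value theorem's rule can be stated concretely: stay in a patch until the instantaneous gain rate drops to the long-run average rate across the whole environment, travel time included. The numerical sketch below illustrates the idea only; the diminishing-returns gain curve, the parameter values and the `optimal_patch_time` helper are illustrative assumptions, not taken from the study.

```python
import math

def optimal_patch_time(A=10.0, tau=2.0, travel=1.0):
    """Find the patch residence time t* predicted by the marginal
    value theorem: leave when the marginal gain g'(t) falls to the
    long-run average rate g(t) / (t + travel).

    Assumes a diminishing-returns gain curve g(t) = A*(1 - exp(-t/tau));
    curve shape and parameters are invented for this sketch.
    """
    def g(t):
        return A * (1.0 - math.exp(-t / tau))

    def g_prime(t):
        return (A / tau) * math.exp(-t / tau)

    # f(t) = g'(t)*(t + travel) - g(t) starts positive and decreases
    # monotonically, so bisection finds the single crossing point.
    lo, hi = 1e-6, 1e3
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if g_prime(mid) * (mid + travel) - g(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Raising `travel` (the cost of reaching the next bush, or the next semantic category) lengthens the optimal stay, which is the kind of trade-off the researchers compared against the animal-naming data.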

The study, “Optimal Foraging in Semantic Memory,” published in Psychological Review, asked 141 undergraduates (46 men and 95 women) at Indiana University to name as many animals as they could in three minutes.

The researchers then analysed the responses using a categorisation scheme and a semantic space model, called BEAGLE, which identifies clusters in the memory landscape based on the way words are related to one another in natural language.

Source: Neuroscience News 

Feb 15, 2012 · 6 notes
#science #neuroscience #psychology #brain #memory
Undergrad's work details protein's role in neurological disorders

February 14, 2012 

(Medical Xpress) — A UT Dallas undergraduate’s research is revealing new information about a key protein’s role in the development of epilepsy, autism and other neurological disorders. This work could one day lead to new treatments for the conditions.

Senior neuroscience student Francisco Garcia has worked closely with Dr. Marco Atzori, associate professor in the School of Behavioral and Brain Sciences (BBS), on several papers that outline their findings about interleukin 6 (IL-6) and hyper-excitability. An article on the project is slated for publication in Biological Psychiatry later this year.

Scientists know that stress elevates the levels of pro-inflammatory cytokines (signaling molecules used in intercellular communication) and promotes hyper-excitable conditions within the central nervous system. This hyper-excitability is thought to be a factor in epilepsy, autism and anxiety disorders.

Garcia and Atzori hypothesized that the protein IL-6 acutely and directly induces hyper-excitability by altering the balance between excitation and inhibition within synaptic communication. In other words, IL-6 is not just present when hyper-excitability occurs in the nervous system. It may actually cause it in some circumstances, Garcia said.

The UT Dallas research team administered IL-6 to rat brain tissue and monitored its synaptic excitability. The brain tissue exhibited higher than normal excitability in their synapses, a symptom that may lead to misfiring of signals in epilepsy and other conditions.

The researchers then injected sgp130, a novel drug that acts as an IL-6 blocker, into the laboratory animals’ brains. The substance limited excitability and appeared to prevent the conditions that lead to related neurological and psychiatric disorders, Garcia said.

“This finding has the potential to lead to eventual new treatments for epilepsy, anxiety disorders or autism,” Garcia said.

The next stage of his research will involve looking at how IL-6 might affect development of other types of neurological problems. Human trials could follow sometime in the future.

Garcia is a native of Mexico, and he plans to pursue his master’s degree in neuroscience at UT Dallas after finishing his undergraduate studies. He credits the BBS faculty with allowing him to participate in laboratory experiments and expand his research skills.

“The UT Dallas faculty members have been great about giving me the opportunity to learn the techniques of a lab researcher,” he said. “It’s been a great experience to work as an undergraduate with such highly respected scientists as Dr. Atzori and Dr. Michael Kilgard.”

Atzori also praised Garcia’s efforts.

“Francisco has been an intelligent, hard-working and experimentally gifted student who contributed way more than the average undergraduate to the projects of the laboratory,” Atzori said. “I am proud that a fine piece of research with great potential for research and clinical applications has been carried out thanks to his enthusiasm and dedication. Francisco’s work in my laboratory is an example of the achievements possible when an institution like UT Dallas invests in and nurtures its research environment.”

Provided by University of Texas at Dallas

Source: medicalxpress.com

Feb 15, 2012 · 1 note
#science #neuroscience #brain #psychology #disorder
Trouble sleeping? It may affect your memory later on

February 14, 2012 in Neuroscience

The amount and quality of sleep you get at night may affect your memory later in life, according to research that was released today and will be presented at the American Academy of Neurology’s 64th Annual Meeting in New Orleans April 21 to April 28, 2012.

"Disrupted sleep appears to be associated with the build-up of amyloid plaques, a hallmark marker of Alzheimer’s disease, in the brains of people without memory problems," said study author Yo-El Ju, MD, with Washington University School of Medicine in St. Louis and a member of the American Academy of Neurology. "Further research is needed to determine why this is happening and whether sleep changes may predict cognitive decline."

Researchers tested the sleep patterns of 100 people between the ages of 45 and 80 who were free of dementia. Half of the group had a family history of Alzheimer’s disease. A device was placed on the participants for two weeks to measure sleep. Sleep diaries and questionnaires were also analyzed by researchers.

After the study, it was discovered that 25 percent of the participants had evidence of amyloid plaques, which can appear years before the symptoms of Alzheimer’s disease begin. The average time a person spent in bed during the study was about eight hours, but the average sleep time was 6.5 hours due to short awakenings in the night.

The study found that people who woke up more than five times per hour were more likely to have amyloid plaque build-up compared to people who didn’t wake up as much. The study also found those people who slept “less efficiently” were more likely to have the markers of early stage Alzheimer’s disease than those who slept more efficiently. In other words, those who spent less than 85 percent of their time in bed actually sleeping were more likely to have the markers than those who spent more than 85 percent of their time in bed actually sleeping.

"The association between disrupted sleep and amyloid plaques is intriguing, but the information from this study can’t determine a cause-effect relationship or the direction of this relationship. We need longer-term studies, following individuals’ sleep over years, to determine whether disrupted sleep leads to amyloid plaques, or whether brain changes in early Alzheimer’s disease lead to changes in sleep," Ju said. "Our study lays the groundwork for investigating whether manipulating sleep is a possible strategy in the prevention or slowing of Alzheimer disease."

Provided by American Academy of Neurology

Source: medicalxpress.com

Feb 15, 2012 · 2 notes
#science #neuroscience #psychology #brain #sleep
Study finds child abuse can lead to stunted brain development

February 14, 2012 by Bob Yirka in Neuroscience


(Medical Xpress) — A small team of researchers has found that various forms of child abuse can lead to stunted development in certain regions of the brain. The research carried out by Martin Teicher, Carl Anderson and Ann Polcari, all working in the Boston area, relied on questionnaires and MRI brain scans to determine that certain parts of the hippocampus, all known to be sensitive to stress, were up to six percent smaller in adults who as children had been sexually, verbally or physically abused. The team has published their results in the Proceedings of the National Academy of Sciences.

The three affected areas (the cornu ammonis, the dentate gyrus and the subiculum) are all located in the hippocampus and are known to be vulnerable to stress, which leads to less cell development than would normally occur in the absence of abuse.

To test the relationship between brain development and childhood abuse, the research team enlisted a group of otherwise healthy adult volunteers: 73 men and 120 women, all between the ages of 18 and 25. All were given questionnaires that delved into their childhood, specifically addressing issues of verbal, mental and physical abuse and other types of stresses such as the death of someone close to them or problems between parents. All were also given brain scans using an MRI machine. The team then compared the answers given on the questionnaires to the possibly impacted areas in the hippocampus of each volunteer. In so doing, they found that the brain regions under study were 5.8 to 6.5 percent smaller than average for those that reported such childhood stresses.

The researchers suggest that smaller brain regions due to childhood stress may help explain the abnormally high levels of mental illness (depression, bipolar disorder, anxiety, etc.) seen in adults who endured abuse as children, and why so many wind up with drug dependency problems. They also noted that one of the impacted regions, the subiculum, serves as a relay, moving information in and out of the hippocampus, which can have a direct impact on dopamine production. Those with reduced subiculum volume have been found to have problems with drug addiction and, in some cases, develop schizophrenia.

The researchers believe that increased stress leads to higher levels of the hormone cortisol, which in turn can slow or even stop the growth of new neurons in the brain, permanently stunting certain brain regions.

The researchers hope their results will further highlight the damage done when children are subjected to adverse living conditions, leading perhaps to earlier interventions and to treatments that prevent the stunting of these brain regions, paving the way for a better quality of life for those who were abused as children.

Source: medicalxpress.com

Feb 15, 2012 · 1 note
#science #neuroscience #psychology #brain
Feb 14, 2012 · 11 notes
#brain #fMRI #love #neuroscience #science #psychology
Discovery Of Complex Wiring Of Nervous System Provides Clues To Neurological Diseases And Cancer

Article Date: 14 Feb 2012 - 1:00 PST

Researchers at the Salk Institute have discovered a startling feature of early brain development that helps to explain how complex neuron wiring patterns are programmed using just a handful of critical genes. The findings, published in Cell, may help scientists develop new therapies for neurological disorders, such as amyotrophic lateral sclerosis (ALS), and provide insight into certain cancers.

The Salk researchers discovered that only a few proteins on the leading edge of a motor neuron’s axon - its outgoing electrical “wire” - and within the extracellular soup it travels through guide the nerve as it emerges from the spinal cord. These molecules can attract or repel the axon, depending on the long and winding path it must take to finally connect with its target muscle.

“The budding neuron has to detect the local environment it is growing through and decide where it is, and whether to grow straight, move to the left or right, or stop,” says the study’s senior investigator, Sam Pfaff, a professor in Salk’s Gene Expression Laboratory and a Howard Hughes Medical Institute investigator.

“It does this by mixing and matching just a handful of protein products to create complexes that tell a growing neuron which way to go, in the same way that a car uses the GPS signals it receives to guide it through an unfamiliar city,” he says.

The brain contains millions of times more neuron connections than there are genes in the DNA of brain cells. This is one of the first studies to try to understand how a growing neuron integrates many different pieces of information in order to navigate to its eventual target and make a functional connection.

“We focused on motor neurons that control muscle movements, but the same kind of thing is going on throughout embryonic development of the entire nervous system, during which millions of axons make trillions of decisions as they move to their targets,” he says. “It is the exquisite specificity with which they grow that underlies the basic architecture and proper function of the nervous system.”

These findings might eventually shed new light on a number of clinical disorders related to faulty nerve cell functioning, such as ALS, which is also known as Lou Gehrig’s disease, says the first author on the paper, Dario Bonanomi, a post-doctoral researcher in Pfaff’s laboratory.

“These are the motor neurons that die in diseases like Lou Gehrig’s disease and that are linked to a genetic disorder in children known as spinal muscular atrophy,” Bonanomi says.

“It is also a jumping-off point to try to understand the basis for defects that might arise during fetal development of the nervous system,” he added. “A better understanding of those signals might help us regenerate and rewire circuits following diseases or injuries of the nervous system.”

The researchers say the study also offers insights into cancer development, because a protein the researchers found to be crucial to the “push and pull” signaling system, Ret, is also linked to cancer. Mutations that activate Ret are linked to a number of different kinds of tumors.

The other protein receptors described in the study, known as Ephs, have also been implicated in cancer, Pfaff says.

“This study suggests that the way cells detect signals in their environment is likely a universal strategy,” he says, “and we know that genes and proteins known to function primarily during embryonic development have been linked to cancer.”

“Controlling neuronal growth requires very potent signaling molecules, and it makes sense they would be linked to disease,” Pfaff says. “We hope our findings help further unravel these connections.”

Source: Medical News Today 

Feb 14, 2012
#science #neuroscience #psychology #brain #disease
Mathematical Model Reveals System Of Compensating For Reduced Cellular Energy In The Brain

Article Date: 14 Feb 2012 - 1:00 PST

A distinctive pattern of brain activity associated with conditions including deep anesthesia, coma and congenital brain disorders appears to represent the brain’s shift into a protective, low-activity state in response to reduced metabolic energy. A mathematical model developed by a Massachusetts General Hospital (MGH)-based research team accurately predicts and explains for the first time how the condition called burst suppression is elicited when brain cells’ energy supply becomes insufficient. Their report has been released online in PNAS Early Edition.

“The seemingly unrelated brain states that lead to burst suppression - deep anesthesia, coma, hypothermia and some developmental brain disorders - all represent a depressed metabolic state,” says Emery Brown, MD, PhD, of the MGH Department of Anesthesia, Critical Care and Pain Medicine, senior author of the report. “We believe we have identified something fundamental about brain neurochemistry, neuroanatomy and neurophysiology that may help us plan better therapies for brain protection and design future anesthetics.”

Burst suppression is an electroencephalogram (EEG) pattern in which periods of normal, high brain activity - the bursts - are interrupted by stretches of greatly reduced activity that can last 10 seconds or longer. Burst suppression has been observed in deep general anesthesia, in induced hypothermia - used to protect the brain or other structures from damage caused by trauma or reduced blood flow - in coma, and in infants with serious neurodevelopmental disorders. It also has transiently been observed in some premature infants. Previous investigations of burst suppression focused on characterizing the structure of the EEG patterns and understanding the brain’s responsiveness to external stimuli while in this state, not on the underlying mechanism.

Lead author ShiNung Ching, PhD, a postdoctoral fellow in Brown’s lab, had been working with Nancy Kopell, PhD, a professor of Mathematics at Boston University and co-author of the PNAS article, to develop mathematical models of different brain states under general anesthesia. In developing a model for burst suppression, they focused on what the associated conditions have in common - a significant reduction in the brain’s metabolic state. In order for a signal to pass from one nerve cell to another, the balance between sodium ions outside the cell and potassium ions within the cell needs to be correct. Maintaining that balance requires that structures called ion pumps, fueled by the cellular energy molecule ATP, function correctly. The model developed by Ching and his colleagues revealed that, when brain energy supplies drop too low and cause a deficiency in ATP, potassium leaks from the nerve cells and signal transmission halts.

“It looks like burst suppression shifts the brain into an altered physiologic state to allow for the regeneration of ATP, which is the essential metabolic substrate,” Ching explains. “During suppression, the brain is trying to recover enough ATP to restart. If the substrate doesn’t regenerate quickly enough, the system will have these brief bursts of activity, stop and then need to recover again. The length of suppression is governed by how quickly ATP regenerates, which matches the observation that the deeper someone is anesthetized, the longer the periods of suppression.”
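The dynamic Ching describes, activity burning ATP faster than it regenerates and suppression lasting until the supply recovers, has the shape of a relaxation oscillator. The toy simulation below is a loose illustration of that idea only; the parameters, units and hysteresis thresholds are all invented for the sketch, and it is not the published MGH model.

```python
def simulate_burst_suppression(steps=2000, dt=0.01,
                               recovery_rate=0.5, burn_rate=2.0,
                               on_threshold=0.9, off_threshold=0.3):
    """Toy relaxation oscillator: bursting consumes ATP faster than it
    regenerates, so activity shuts off when ATP falls below
    off_threshold and resumes once it recovers past on_threshold.
    Returns the fraction of time spent bursting (the duty cycle).

    All parameters and the hysteresis rule are assumptions made for
    this sketch, not values from the study.
    """
    atp, bursting, burst_steps = 1.0, True, 0
    for _ in range(steps):
        if bursting:
            atp -= burn_rate * dt                    # activity burns ATP
            burst_steps += 1
            if atp < off_threshold:
                bursting = False                     # suppression begins
        else:
            atp += recovery_rate * (1.0 - atp) * dt  # ATP regenerates
            if atp > on_threshold:
                bursting = True                      # burst resumes
    return burst_steps / steps
```

Slowing the regeneration rate lengthens the suppression periods and shrinks the bursting fraction, which mirrors the observation that the more deeply someone is anesthetized, the longer the periods of suppression.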

Brown adds, “When we use general anesthesia to place patients with serious neurologic injuries into induced comas to allow their brains to heal, we take them down to a level of burst suppression. But there are a lot of questions regarding how deeply anesthetized an individual patient should be - how often the bursts should occur - and how long we should maintain that state. By elucidating what appears to be a fundamental energy-preserving mechanism within the brain, this model may help us think about using burst suppression to guide induced coma and track recovery from brain injuries. This is also a great example of how studying anesthesia can help us learn something very basic about the brain.”  

Source: Medical News Today

Feb 14, 2012 · 1 note
#science #neuroscience #psychology #brain
New Imaging Methods Show Challenges of Identifying Cognitive Abilities in Severely Brain-Injured Patients

ScienceDaily (Feb. 13, 2012) — Only by employing complex machine-learning techniques to decipher repeated advanced brain scans were researchers at NewYork-Presbyterian/Weill Cornell able to provide evidence that a patient with a severe brain injury could, in her way, communicate accurately.

Their study, published in the Feb. 13 issue of the Archives of Neurology, demonstrates how difficult it is to determine whether a patient can communicate using only measured brain activity, even if it is possible for them to generate reliable patterns of brain activation in response to instructed commands. Patients in a minimally conscious state or who have locked-in syndrome (normal cognitive function with severe motor impairment) and can follow commands in the absence of a motor response may not generate clearly interpretable communications using the same patterns of brain activity, the researchers say.

While less sophisticated methods have been shown successful, the authors say their new approach provides important new insights into brain function and level of consciousness. It also identifies mechanisms of variation in brain activity supporting cognitive function after injury.

"In these studies we have reanalyzed earlier published data that demonstrated an effort to communicate using brain activations alone that apparently failed but was nonetheless a clear effort to generate a response," says Dr. Nicholas D. Schiff, professor of neurology and neuroscience and professor of public health at Weill Cornell Medical College, and a neurologist at NewYork-Presbyterian Hospital/Weill Cornell Medical Center. "Importantly, the reanalysis with new, more sensitive methods provides evidence that the problem with communication may reflect a mismatch of our expectations in designing the assessment, rather than a failure on the subject’s part in an attempt to accurately communicate with us."

"Our study shows that multivariate, machine-learning methods can be useful in determining whether patients are attempting to communicate, specifically when applied to data that already show evidence of a signal in univariate, more standard methods of analysis," says the study’s lead author, Jonathan Bardin, a fourth-year neuroscience graduate student at Weill Cornell Medical College.

"It is our clinical and ethical imperative to learn as much as possible about their ability to communicate," he says. "A simple bedside exam is not good enough."

"We need a set of methods that are both powerful and simple, and we are not there yet, as this study shows," adds Dr. Schiff. "We are using quite complex tasks to perhaps detect just the few of many patients who are conscious."

Patients Differ in Abilities

This study is a continuation of NewYork-Presbyterian/Weill Cornell research into how fMRI can establish a line of communication with brain-injured patients in order to understand if they can benefit from rehabilitation, and to gauge their level of pain and other clinical parameters that would improve care and quality of life.

It specifically follows up on a study published in the journal Brain last February that demonstrated use of fMRI to detect consciousness in six patients (either locked-in or minimally conscious) resulted in a wide, and largely unpredictable, variation in the ability of patients to respond to a simple command (such as “imagine swimming — now stop”) and then using the same command to answer simple yes/no or multiple-choice questions. This variation was apparent when compared with their ability to interact at the bedside using gestures or voice.

Some patients unable to communicate by gestures or voice were unable to do the mental tests, while others unable to communicate by gestures or voice were intermittently able to answer the researchers’ questions using mental imagery. And, intriguingly, some patients with the ability to communicate through gestures or voice were unable to do the mental tasks.

The researchers say these findings suggest that no exam yet exists that can accurately assess the higher-level functioning that may be, and certainly seems to be, occurring in a number of severely brain-injured patients.

"There are people whose personal autonomy is abridged because they don’t have a good motor channel to express themselves despite, in some cases, having a clear mind and opinions and desires about themselves and the world," Dr. Schiff says about those results.

"Not all minimally conscious patients are the same, and not all patients with locked-in syndrome are the same," he says.

Sensitive and Flexible Methods Are Needed

The main new result of this study is a reinterpretation of findings from a 25-year-old patient who was the only one of six to show an ability to use the fMRI signal for communication in the earlier research. But her results were confusing: she seemed to be consistently selecting the answer directly after the correct one, Bardin says.

"It’s often seen in patients like this — she had a stroke that damaged her brain — that there can be a cognitive delay in some area of the brain. fMRI is a readout of blood flow rather than actual neural activity, so these delays could be caused by an interruption of blood flow due to damage, or could just mean patients are working on the problem more slowly, and the answer looks wrong because it is given in the next response period."

To understand this, Bardin employed a newer technique, which he says has sprung out of machine-learning research, to instruct a computer to evaluate multiple fMRI scans from the patient after she answered the two questions a number of times.

This so-called multivariate approach used the same data gathered for the first study, whose typical “univariate” analysis looked specifically at functioning in the brain’s Supplementary Motor Area (SMA), which is active when “normal” subjects imagine doing something.

In contrast, the multivariate analysis examines whether there is a pattern of activity in any part of the brain that is consistent from one scan to the next.

"When there is significant damage to the brain, it can rewire itself so that functions associated with SMA could be processed somewhere else," Bardin says.
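The contrast between the two analyses can be made concrete: a univariate test asks whether the signal in one predefined region (such as the SMA) rises on command, while a multivariate decoder asks whether the whole activity pattern, wherever it sits, is consistent from one scan to the next. Below is a minimal leave-one-out nearest-centroid decoder in plain Python; it is a generic stand-in for such pattern classifiers, not the method used in the study, and its inputs are hypothetical per-trial voxel vectors.

```python
def nearest_centroid_cv(patterns, labels):
    """Leave-one-out nearest-centroid decoding of multi-voxel activity
    patterns: each held-out trial is assigned the label of the closer
    class-mean pattern. Returns decoding accuracy in [0, 1]."""
    n = len(patterns)
    correct = 0
    for i in range(n):
        # compute class centroids from all trials except the held-out one
        centroids = {}
        for c in set(labels):
            rows = [patterns[j] for j in range(n)
                    if j != i and labels[j] == c]
            centroids[c] = [sum(col) / len(rows) for col in zip(*rows)]
        # predict the class whose centroid is nearest in squared distance
        pred = min(centroids, key=lambda cl: sum(
            (a - b) ** 2 for a, b in zip(patterns[i], centroids[cl])))
        if pred == labels[i]:
            correct += 1
    return correct / n
```

Accuracy reliably above chance would indicate a reproducible pattern somewhere in the scans, the kind of evidence the reanalysis was looking for even when the SMA signal alone seemed to fail.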

Using this complex approach, the researchers found that, indeed, the patient had consistently attempted to communicate answers to both questions — but at a delayed speed.

The researchers say that one approach to analyze fMRI scans is not better than the other for all patients and that univariate methods should always be carried out first. Multivariate approaches can be especially sensitive to noise, leading to false positives if used on their own. If the standard approach reveals a signal, the multivariate approach could be used to gain further insights and possibly identify response in patients where the univariate results are ambiguous.

"We did all these things to simply show that we think this patient was trying to communicate," Bardin says. "You have to be very careful in your data analysis before saying anything strongly about what a patient can or cannot do."

"Rigid experimental paradigms like those used in the field can very well miss important information about these patients," Dr. Schiff says. "This is all extremely complex and messy, but we should expect that. Given the injuries some of our patients suffer, their cognitive abilities are very difficult to detect behaviorally or through simplistic tests or scans."

Source: Science Daily

Feb 14, 2012 · 1 note
#science #neuroscience #psychology #brain
Brain-Imaging Technique Predicts Who Will Suffer Cognitive Decline Over Time

ScienceDaily (Feb. 13, 2012) — Cognitive loss and brain degeneration currently affect millions of adults, and the number will increase, given the population of aging baby boomers. Today, nearly 20 percent of people age 65 or older suffer from mild cognitive impairment and 10 percent have dementia.

These are baseline and follow-up brain scans of a patient who converted to Alzheimer’s disease after two years (images to right of white line); they show high medial temporal binding at baseline (lower left) and follow-up (lower right), but also more baseline binding in frontal (upper images) and lateral temporal regions. Warmer colors (yellows, reds) indicate higher binding levels. A second patient did not convert to Alzheimer’s after two years (images to left of white line), showing medial temporal (lower scans) but very mild frontal (upper scans) binding at baseline and follow-up. (Credit: UCLA)

UCLA scientists previously developed a brain-imaging tool to help assess the neurological changes associated with these conditions. The UCLA team now reports in the February issue of the journal Archives of Neurology that the brain-scan technique effectively tracked and predicted cognitive decline over a two-year period.

The team has created a chemical marker called FDDNP that binds to both plaque and tangle deposits — the hallmarks of Alzheimer’s disease — which can then be viewed using a positron emission tomography (PET) brain scan, providing a “window into the brain.” Using this method, researchers are able to pinpoint where in the brain these abnormal protein deposits are accumulating.

"We are finding that this may be a useful neuro-imaging marker that can detect changes early, before symptoms appear, and it may be helpful in tracking changes in the brain over time," said study author Dr. Gary Small, UCLA’s Parlow-Solomon Professor on Aging and a professor of psychiatry at the Semel Institute for Neuroscience and Human Behavior at UCLA.

Small noted that FDDNP-PET scanning is the only available brain-imaging technique that can assess tau tangles. Autopsy findings have found that tangles correlate with Alzheimer’s disease progression much better than do plaques.

For the study, researchers performed brain scans and cognitive assessments on the subjects at baseline and then again two years later. The study involved 43 volunteer participants, with an average age of 64, who did not have dementia. At the start of the study, approximately half (22) of the participants were aging normally and the other half (21) had mild cognitive impairment, or MCI, a condition that increases a person’s risk of developing Alzheimer’s disease.

Researchers found that for both groups, increases in FDDNP binding in the frontal, posterior cingulate and global areas of the brain at the two-year follow-up correlated with progression of cognitive decline. These areas of the brain are involved in decision-making, complex reasoning, memory and emotions. Higher initial baseline FDDNP binding in both subject groups was associated with a decline in cognitive functioning in areas such as language and attention at the two-year follow-up.

"We found that increases in FDDNP binding in key brain areas correlated with increases in clinical symptoms over time," said study author Dr. Jorge R. Barrio, who holds UCLA’s Plott Chair in Gerontology and is a professor of molecular and medical pharmacology at the David Geffen School of Medicine at UCLA. "Initial binding levels were also predictive of future cognitive decline."

Among the subjects with mild cognitive impairment, the level of initial binding in the frontal and parietal areas of the brain provided the greatest accuracy in identifying those who developed Alzheimer’s disease after two years. Of the 21 subjects with MCI, six were diagnosed with Alzheimer’s at follow-up, and these six subjects had higher initial frontal and parietal binding values than the other subjects in the MCI group.

In the normal aging subjects, three developed mild cognitive impairment after two years. Two of these three participants had had the highest baseline binding values in the temporal, parietal and frontal brain regions among this group.

Researchers said the next step in research will involve a longer duration of follow-up with larger samples of subjects. In addition, the team is using this brain-imaging technique in clinical trials to help track novel therapeutics for brain aging, such as curcumin, a chemical found in turmeric spice.

"Tracking the effectiveness of such treatments may help accelerate drug discovery efforts," Small, the author of the new book "The Alzheimer’s Prevention Program," said. "Because FDDNP appears to predict who will develop dementia, it may be particularly useful in tracking the effectiveness of interventions designed to delay the onset of dementia symptoms and eventually prevent the disease."

Small recently received research approval from the U.S. Food and Drug Administration to use FDDNP-PET to study people with mild cognitive impairment to determine whether a high-potency form of curcumin — a spice with anti-amyloid, anti-tau and anti-inflammatory properties — can prevent Alzheimer’s disease and the accumulation of plaques and tangles in the brain.

UCLA owns three U.S. patents on the FDDNP chemical marker. The Office of Intellectual Property at UCLA is actively seeking a commercial partner to bring this promising technology to market.

Source: Science Daily

Feb 14, 2012
#science #neuroscience #psychology #brain
Neuron memory key to taming chronic pain

February 13, 2012 

For some, the pain is so great that they can’t even bear to have clothes touch their skin. For others, it means that every step is a deliberate and agonizing choice. Whether the pain is caused by arthritic joints, an injury to a nerve or a disease like fibromyalgia, research now suggests there are new solutions for those who suffer from chronic pain.

A team of researchers led by McGill neuroscientist Terence Coderre, who is also affiliated with the Research Institute of the McGill University Health Centre, has found the key to understanding how memories of pain are stored in the brain. More importantly, the researchers are also able to suggest how these memories can be erased, making it possible to ease chronic pain.

It has long been known that the central nervous system “remembers” painful experiences: they leave a memory trace of pain. And when there is new sensory input, the pain memory trace in the brain magnifies the feeling, so that even a gentle touch can be excruciating.

"Perhaps the best example of a pain memory trace is found with phantom limb pain," suggests Coderre. "Patients may have a limb amputated because of gangrene, and because the limb was painful before it was amputated, even though the limb is gone, the patients continue to feel they are suffering from pain in the absent limb. That’s because the brain remembers the pain. In fact, there’s evidence that any pain that lasts more than a few minutes will leave a trace in the nervous system." It’s this memory of pain, which exists at the neuronal level, that is critical to the development of chronic pain. But until now, it was not known how these pain memories were stored at the level of the neurons.

Recent work has shown that the protein kinase PKMzeta plays a crucial role in building and maintaining memory by strengthening the connections between neurons. Now Coderre and his colleagues have discovered that PKMzeta is also the key to understanding how the memory of pain is stored in the neurons. They were able to show that after painful stimulation, the level of PKMzeta increases persistently in the central nervous system (CNS).

Even more importantly, the researchers found that by blocking the activity of PKMzeta at the neuronal level, they could reverse the hypersensitivity to pain that neurons developed after irritating the skin by applying capsaicin – the active ingredient in hot peppers. Moreover, erasing this pain memory trace was found to reduce both persistent pain and heightened sensitivity to touch.

Coderre and his colleagues believe that building on this study to devise ways to target PKMzeta in pain pathways could have a significant effect for patients with chronic pain. “Many pain medications target pain at the peripheral level, by reducing inflammation, or by activating analgesia systems in the brain to reduce the feeling of pain,” says Coderre. “This is the first time that we can foresee medications that will target an established pain memory trace as a way of reducing pain hypersensitivity. We believe it’s an avenue that may offer new hope to those suffering from chronic pain.”

Provided by McGill University

Source: medicalxpress.com

Feb 14, 2012
#science #neuroscience #psychology #pain
EEG Pattern Reflects Brain's Shift Into Low-Energy, Protective Mode

ScienceDaily (Feb. 10, 2012) — A distinctive pattern of brain activity associated with conditions including deep anesthesia, coma and congenital brain disorders appears to represent the brain’s shift into a protective, low-activity state in response to reduced metabolic energy. A mathematical model developed by a Massachusetts General Hospital (MGH)-based research team accurately predicts and explains for the first time how the condition called burst suppression is elicited when brain cells’ energy supply becomes insufficient. Their report has been released online in PNAS Early Edition.

"The seemingly unrelated brain states that lead to burst suppression — deep anesthesia, coma, hypothermia and some developmental brain disorders — all represent a depressed metabolic state," says Emery Brown, MD, PhD, of the MGH Department of Anesthesia, Critical Care and Pain Medicine, senior author of the report. "We believe we have identified something fundamental about brain neurochemistry, neuroanatomy and neurophysiology that may help us plan better therapies for brain protection and design future anesthetics."

Burst suppression is an electroencephalogram (EEG) pattern in which periods of normal, high brain activity — the bursts — are interrupted by stretches of greatly reduced activity that can last 10 seconds or longer. Burst suppression has been observed in deep general anesthesia, in induced hypothermia — used to protect the brain or other structures from damage caused by trauma or reduced blood flow — in coma, and in infants with serious neurodevelopmental disorders. It also has transiently been observed in some premature infants. Previous investigations of burst suppression focused on characterizing the structure of the EEG patterns and understanding the brain’s responsiveness to external stimuli while in this state, not on the underlying mechanism.

Lead author ShiNung Ching, PhD, a postdoctoral fellow in Brown’s lab, had been working with Nancy Kopell, PhD, a professor of Mathematics at Boston University and co-author of the PNAS article, to develop mathematical models of different brain states under general anesthesia. In developing a model for burst suppression, they focused on what the associated conditions have in common — a significant reduction in the brain’s metabolic state. In order for a signal to pass from one nerve cell to another, the balance between sodium ions outside the cell and potassium ions within the cell needs to be correct. Maintaining that balance requires that structures called ion pumps, fueled by the cellular energy molecule ATP, function correctly. The model developed by Ching and his colleagues revealed that, when brain energy supplies drop too low and cause a deficiency in ATP, potassium leaks from the nerve cells and signal transmission halts.

"It looks like burst suppression shifts the brain into an altered physiologic state to allow for the regeneration of ATP, which is the essential metabolic substrate," Ching explains. "During suppression, the brain is trying to recover enough ATP to restart. If the substrate doesn’t regenerate quickly enough, the system will have these brief bursts of activity, stop and then need to recover again. The length of suppression is governed by how quickly ATP regenerates, which matches the observation that the deeper someone is anesthetized, the longer the periods of suppression."
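The dynamic Ching describes, in which the length of suppression is governed by how quickly ATP regenerates, lends itself to a toy dynamical sketch: activity burns ATP faster than it rebuilds, suppression begins when ATP runs low, and bursting resumes once enough substrate has recovered. The sketch below is an illustrative caricature with invented constants and thresholds, not the published MGH model; `regen_rate` simply stands in for the depth of metabolic depression.

```python
def simulate(regen_rate, steps=10_000, dt=0.01):
    """Toy burst-suppression cycle: ATP (arbitrary units) falls during
    bursts and recovers during suppression. Returns a True/False trace
    of whether the network is bursting at each time step."""
    atp, bursting, trace = 1.0, True, []
    for _ in range(steps):
        if bursting:
            atp -= (0.5 - regen_rate) * dt   # net ATP loss while active
            if atp <= 0.2:                   # substrate too low: suppression begins
                bursting = False
        else:
            atp += regen_rate * dt           # ATP rebuilds during quiescence
            if atp >= 0.9:                   # enough substrate: burst resumes
                bursting = True
        trace.append(bursting)
    return trace

def mean_suppression_length(trace):
    """Average run length (in steps) of consecutive suppressed samples."""
    runs, current = [], 0
    for bursting in trace:
        if not bursting:
            current += 1
        elif current:
            runs.append(current)
            current = 0
    if current:
        runs.append(current)
    return sum(runs) / len(runs) if runs else 0.0
```

Lowering `regen_rate` (slower ATP regeneration, i.e. deeper metabolic depression) stretches the suppression periods in this toy model, matching the observation that the deeper someone is anesthetized, the longer the periods of suppression.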

Brown adds, “When we use general anesthesia to place patients with serious neurologic injuries into induced comas to allow their brains to heal, we take them down to a level of burst suppression. But there are a lot of questions regarding how deeply anesthetized an individual patient should be — how often the bursts should occur — and how long we should maintain that state. By elucidating what appears to be a fundamental energy-preserving mechanism within the brain, this model may help us think about using burst suppression to guide induced coma and track recovery from brain injuries. This is also a great example of how studying anesthesia can help us learn something very basic about the brain.”

Brown is the Warren Zapol Professor of Anesthesia at Harvard Medical School. He also is a professor of Computational Neuroscience and Health Sciences and Technology at Massachusetts Institute of Technology. Additional co-authors of the PNAS report are Patrick Purdon, PhD, MGH Anesthesia, and Sujith Vijayan, PhD, Boston University Mathematics. The study was supported by grants from the National Institutes of Health and the National Science Foundation.

Source: Science Daily

Feb 13, 2012
#science #neuroscience #psychology #brain #EEG
Researchers Develop Gene Therapy to Boost Brain Repair for Demyelinating Diseases

February 10th, 2012

Our bodies are full of tiny superheroes—antibodies that fight foreign invaders, cells that regenerate, and structures that ensure our systems run smoothly. One such structure is myelin—a material that forms a protective, insulating cape around the axons of our nerve cells so that they can send signals quickly and efficiently. But myelin, and the specialized cells called oligodendrocytes that make it, become damaged in demyelinating diseases like multiple sclerosis (MS), leaving neurons without their myelin sheaths. As a consequence, the affected neurons can no longer communicate correctly and are prone to damage. Researchers from the California Institute of Technology (Caltech) now believe they have found a way to help the brain replace damaged oligodendrocytes and myelin.

The therapy, which has been successful in promoting remyelination in a mouse model of MS, is outlined in a paper published February 8 in The Journal of Neuroscience.

“We’ve developed a gene therapy to stimulate production of new oligodendrocytes from stem and progenitor cells—both of which can become more specialized cell types—that are resident in the adult central nervous system,” says Benjamin Deverman, a postdoctoral fellow in biology at Caltech and lead author of the paper. “In other words, we’re using the brain’s own progenitor cells as a way to boost repair.”

The therapy uses leukemia inhibitory factor (LIF), a naturally occurring protein known to promote the self-renewal of neural stem cells and to reduce immune-cell attacks on myelin in other MS mouse models.

“What hadn’t been done before our study was to use gene therapy in the brain to stimulate these cells to remyelinate,” says Paul Patterson, the Biaggini Professor of Biological Sciences at Caltech and senior author of the study.

According to the researchers, LIF enables remyelination by stimulating oligodendrocyte progenitor cells to proliferate and make new oligodendrocytes. The brain has the capacity to produce oligodendrocytes, but often fails to prompt a high enough repair response after demyelination.

“Researchers had been skeptical that a single factor could lead to remyelination of damaged cells,” says Deverman. “It was thought that you could use factors to stimulate the division and expansion of the progenitor population, and then add additional factors to direct those progenitors to turn into the mature myelin-forming cells. But in our mouse model, when we give our LIF therapy, it both stimulates the proliferation of the progenitor cells and allows them to differentiate into mature oligodendrocytes.”

In other words, once the researchers stimulated the proliferation of the progenitor cells, it appeared that the progenitors knew just what was needed—the team did not have to instruct the cells at each stage of development. And they found that LIF elicited such a strong response that the treated brain’s levels of myelin-producing oligodendrocytes were restored to those found in healthy populations.

The researchers note, too, that by placing LIF directly in the brain, one avoids potential side effects of the treatment that may arise when the therapy is infused into the bloodstream.

“This new application of LIF is an avenue of therapy that has not been explored in human patients with MS,” says Deverman, who points out that LIF’s benefits might also be good for spinal-cord injury patients since the demyelination of spared neurons may contribute to disability in that disorder.

To move the research closer to human clinical trials, the team will work to build better viral vectors for the delivery of LIF. “The way this gene therapy works is to use a virus that can deliver the genetic material—LIF—into cells,” explains Patterson. “This kind of delivery has been used before in humans, but the worry is that you can’t control the virus. You can’t necessarily target the right place, and you can’t control how much of the protein is being made.”

Which is why he and Deverman are developing viruses that can target LIF production to specific cell types and can turn it on and off externally, providing a means to regulate LIF levels. They also plan to test the therapy in additional MS mouse models.

“For MS, the current therapies all work by modulating or suppressing the immune system, because it’s thought to be a disease in which inflammation leads to immune-associated loss of oligodendrocytes and damage to the neurons,” says Deverman. “Those therapies can reduce the relapse rate in patients, but they haven’t shown much of an effect on the long-term progression of the disease. What are needed are therapies that promote repair. We hope this may one day be such a therapy.”

Source: Neuroscience News

Feb 13, 2012
#science #neuroscience #psychology #brain
Flipping a Light Switch in the Cell: Quantum Dots Used for Targeted Neural Activation

February 9th, 2012

New technique holds promise for better understanding of brain disorders.

Quantum dot film. Optically excited quantum dots in close proximity to a cell control the opening of ion channels. Credit: Lugo et al., University of Washington.

Source: Neuroscience News

Feb 10, 2012
#science #neuroscience #brain #parkinson #alzheimer
FDA-approved drug rapidly clears amyloid from the brain, reverses Alzheimer's symptoms in mice

February 9, 2012

Neuroscientists at Case Western Reserve University School of Medicine have made a dramatic breakthrough in their efforts to find a cure for Alzheimer’s disease. The researchers’ findings, published in the journal Science, show that use of a drug in mice appears to quickly reverse the pathological, cognitive and memory deficits caused by the onset of Alzheimer’s. The results point to the significant potential that the medication, bexarotene, has to help the roughly 5.4 million Americans suffering from the progressive brain disease.

Bexarotene has been approved for the treatment of cancer by the U.S. Food and Drug Administration for more than a decade. These experiments explored whether the medication might also be used to help patients with Alzheimer’s disease, and the results were more than promising.

Alzheimer’s disease arises in large part from the body’s inability to clear naturally occurring amyloid beta from the brain. In 2008, Case Western Reserve researcher Gary Landreth, PhD, professor of neurosciences at the university’s medical school, discovered that the main cholesterol carrier in the brain, Apolipoprotein E (ApoE), facilitates the clearance of amyloid beta proteins. Landreth is also the senior author of the new study.

Landreth and his colleagues chose to explore the effectiveness of bexarotene for increasing ApoE expression. The elevation of brain ApoE levels, in turn, speeds the clearance of amyloid beta from the brain. Bexarotene acts by stimulating retinoid X receptors (RXR), which control how much ApoE is produced.

In particular, the researchers were struck by the speed with which bexarotene improved memory deficits and behavior even as it also acted to reverse the pathology of Alzheimer’s disease. The present view of the scientific community is that small soluble forms of amyloid beta cause the memory impairments seen in animal models and humans with the disease. Within six hours of administering bexarotene, however, soluble amyloid levels fell by 25 percent; even more impressive, the effect lasted as long as three days. Finally, this shift was correlated with rapid improvement in a broad range of behaviors in three different mouse models of Alzheimer’s.

One example of the improved behaviors involved the typical nesting instinct of the mice. When Alzheimer’s-diseased mice encountered material suited for nesting – in this case, tissue paper – they did nothing to create a space to nest. This reaction demonstrated that they had lost the ability to associate the tissue paper with the opportunity to nest. Just 72 hours after the bexarotene treatment, however, the mice began to use the paper to make nests. Administration of the drug also improved the ability of the mice to sense and respond to odors.

Bexarotene treatment also worked quickly to stimulate the removal of amyloid plaques from the brain. The plaques are compacted aggregates of amyloid that form in the brain and are the pathological hallmark of Alzheimer’s disease. Researchers found that more than half of the plaques had been cleared within 72 hours. Ultimately, the reduction totaled 75 percent. It appears that the bexarotene reprogrammed the brain’s immune cells to “eat” or phagocytose the amyloid deposits. This observation demonstrated that the drug addresses the amount of both soluble and deposited forms of amyloid beta within the brain and reverses the pathological features of the disease in mice.

This study identifies a link between the primary genetic risk factor for Alzheimer’s disease and a potential therapy to address it. Humans have three forms of ApoE: ApoE2, ApoE3, and ApoE4. Possession of the ApoE4 gene greatly increases the likelihood of developing Alzheimer’s disease. Previously, the Landreth laboratory had shown that this form of ApoE was impaired in its ability to clear amyloid. The new work suggests that elevating ApoE levels in the brain may be an effective therapeutic strategy for clearing the forms of amyloid associated with impaired memory and cognition.

"This is an unprecedented finding," says Paige Cramer, PhD candidate at Case Western Reserve School of Medicine and first author of the study. "Previously, the best existing treatment for Alzheimer’s disease in mice required several months to reduce plaque in the brain."

Added Professor Landreth: “This is a particularly exciting and rewarding study because of the new science we have discovered and the potential promise of a therapy for Alzheimer’s disease. We need to be clear; the drug works quite well in mouse models of the disease. Our next objective is to ascertain if it acts similarly in humans. We are at an early stage in translating this basic science discovery into a treatment.”

Daniel Wesson, PhD, assistant professor of neurosciences at Case Western Reserve School of Medicine and co-author of the study, agreed.

"Many often think of Alzheimer’s as a problem of remembering and learning, but the prevalent reality is this disease spreads throughout the brain, resulting in serious insults to numerous functions," he said. "The results of this study, showing the preservation of behaviors across a wide spectrum, and accompanying brain function, are tremendously exciting and suggest great promise in the utility of this approach in treatment of Alzheimer’s disease."

Bexarotene has a good safety and side-effect profile. The Case Western Reserve researchers hope these attributes will help speed the transition to clinical trials of the drug.

Professor Landreth said modest resources funded this self-described “far-fetched idea.” Crucial support came from the Blanchette Hooker Rockefeller Foundation, the Thome Foundation, and the National Institutes of Health.

Provided by Case Western Reserve University

Source: medicalxpress.com

Feb 10, 2012
#science #neuroscience #psychology #alzheimer #brain
Feb 9, 2012
#science #neuroscience #psychology #brain
'Explorers,' who embrace the uncertainty of choices, use specific part of cortex

February 8, 2012

"Explorers," whose decision-making style embraces the possibilities of uncertainty, use specific parts (red) of the right rostrolateral prefrontal cortex to make calculations based on relative uncertainty. Credit: Badre-Frank Lab/Brown University

Life shrouds most choices in mystery. Some people inch toward a comfortable enough spot and stick close to that rewarding status quo. Out to dinner, they order the usual. Others consider their options systematically or randomly. But many choose to grapple with the uncertainty head on. “Explorers” order the special because they aren’t sure they’ll like it. It’s a strategy of maximizing rewards by discovering whether as yet unexplored options might yield better returns. In a new study, Brown University researchers show that such explorers use a specific part of their brain to calculate the relative uncertainty of their choices, while non-explorers do not.

The study, published in the journal Neuron, newly exposes an aspect of the brain’s architecture for producing decisions and learning, said co-author David Badre, assistant professor of cognitive, linguistic, and psychological sciences at Brown. There had been no consensus that a precise area of the prefrontal cortex, in this case the right rostrolateral prefrontal cortex, would be so clearly associated with a specific operation, such as performing the uncertainty comparison that supports a decision-making strategy.

"There has long been a debate about the functional organization of the frontal cortex," Badre said. "There has been a notion that the frontal lobe lacks specialization when exercising cognitive control, that it’s undifferentiated. This study provides evidence that there is a kind of organization. This is an example of how higher-order functions such as decision-making may relate to the frontal lobe’s more general functional architecture."

Stop the clock

To spot explorer behavior among their 15 participants, Badre and Michael Frank, associate professor of cognitive, linguistic, and psychological sciences, slid them into an MRI scanner and presented them with a game to play. Participants had to stop the sweeping hand of a virtual clock to win points in different rounds. They were told that they could maximize their rewards by responding quickly in some rounds and slowly in others. The trick was that they did not know from round to round which response prevailed, and the number of points they could win was highly variable. They therefore had to employ a strategy to discover how to maximize their rewards among uncertain options, keeping track of the current expected value of fast and slow responses in each round.

While the MRI scanner tracked the blood flow in the brains of the subjects — a proxy for neural activity — the game’s software tracked their response times in each round. The computer then fed the game’s data into mathematical models devised to determine whether participants adapted their response times by taking relative uncertainty into account or adapted in another manner.

Over dozens of rounds a clear pattern emerged. Regardless of which version of the model they used, the researchers found that about half the subjects were engaging in exploratory behavior based on uncertainty: Their choices of response times correlated strongly with the choices that had the greatest outcome uncertainty.
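The contrast the models test, tracking expected value alone versus adding a bonus for relative uncertainty, can be sketched in a few lines. This is a hypothetical stand-in for the fitted models, not the authors' actual code; the payoffs, learning rate, and uncertainty update are all invented for illustration.

```python
import random

class Learner:
    """Toy value learner for a two-option task ("fast" vs. "slow" response).
    A positive explore_bonus weights choices toward the option whose
    outcome is currently more uncertain; zero gives a pure exploiter."""

    def __init__(self, explore_bonus):
        self.mean = {"fast": 0.0, "slow": 0.0}         # running reward estimates
        self.uncertainty = {"fast": 1.0, "slow": 1.0}  # shrinks as an option is sampled
        self.bonus = explore_bonus

    def choose(self):
        return max(("fast", "slow"),
                   key=lambda a: self.mean[a] + self.bonus * self.uncertainty[a])

    def update(self, action, reward, lr=0.1):
        self.mean[action] += lr * (reward - self.mean[action])
        self.uncertainty[action] *= 1 - lr

def run(bonus, payoffs, trials=200, seed=0):
    """Play `trials` rounds against noisy payoffs; return the choice sequence."""
    rng = random.Random(seed)
    agent = Learner(bonus)
    picks = []
    for _ in range(trials):
        action = agent.choose()
        agent.update(action, payoffs[action] + rng.gauss(0, 0.5))  # noisy reward
        picks.append(action)
    return picks
```

With `bonus=0` the agent settles on whichever option looks best first and never samples the alternative; with a positive bonus it keeps returning to the more uncertain option, the signature the study attributes to explorers.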

Badre, Frank, and their team then looked at the MRI scans, reasoning that if decision-making is based on relative uncertainty, then the subjects’ brains must somehow represent this uncertainty. Sure enough, as relative uncertainty between choice options increased, so did activation in the right rostrolateral prefrontal cortex. This effect was substantially stronger in the explorers than the nonexplorers.

The result is the first to show that this region of the brain keeps track of relative uncertainty to guide exploration, but is consistent with previous studies that have shown an association between the right rostrolateral prefrontal cortex and relative comparisons. It also provides a potential explanation for Frank’s previous findings that explorers were more likely to have a variation in a gene called COMT that affects dopamine levels in the prefrontal cortex.

From cortex to choice

Frank said researchers still don’t know why some people employ the explorer strategy while others do not, but they might not be so different. According to one hypothesis, they all have an aversion to uncertainty and ambiguity.

"The difference could be that some people are averse to ambiguity in the time point where they make a single decision and other people are averse to ambiguity about their strategy over the long run," Frank said.

In other words, explorers may seek to reduce uncertainty by confronting it, rather than avoiding it.

Badre said that while the study has no direct clinical implications, the findings may still inform efforts to understand a broad set of disorders that affect frontal lobe function.

"There are a lot of diseases and disorders that affect the frontal lobes," Badre said. "They affect the ability to live independently, to carry out the day and make good decisions that get you where you want to go. The more we know about the specificity of these systems, the better that you can diagnose and suggest treatments."

Provided by Brown University

Source: medicalxpress.com

Feb 9, 2012
#science #neuroscience #psychology #brain
Scientists delve into the brain roots of hunger and eating

February 8, 2012

Synaptic plasticity – the ability of the synaptic connections between the brain’s neurons to change over time – has been shown to be a key to memory formation and the acquisition of new learning behaviors. Now research led by a scientific team at Beth Israel Deaconess Medical Center (BIDMC) reveals that the neural circuits controlling hunger and eating behaviors are also governed by plasticity.

Described in the February 9, 2012 issue of the journal Neuron, the findings show that during fasting, the AgRP neurons that drive feeding behaviors actually undergo anatomical changes that cause them to become more active, which results in their “learning” to be more responsive to hunger-promoting neural stimuli.

"The role of plasticity has generally not been evaluated in neuronal circuits that control feeding behavior and with this new discovery we can start to unravel the basic mechanisms underpinning hunger and gain a greater understanding of the factors that influence weight gain and obesity," explains senior author Bradford Lowell, MD, PhD, an investigator in BIDMC’s Division of Endocrinology, Diabetes and Metabolism and Professor of Medicine at Harvard Medical School (HMS).

Adds BIDMC Chairman of Neurology Clifford Saper, MD, PhD, “For most animals, finding enough food to survive is their biggest daily challenge, and so the brain’s increase in feeding drive may be adaptive. But, for humans who are overweight, reducing this drive to the AgRP neurons may prove to be a path to future weight loss therapies.”

The roots of hunger, eating, and weight are based in the brain’s complex and rapid-fire neurocircuitry. Over the years, nerve cells containing agouti-related peptide (AgRP) protein and pro-opiomelanocortin (POMC) protein have emerged as critical players in feeding behaviors. Located in the hypothalamus, the brain area that controls automatic body functions, AgRP neurons have been shown to drive eating and weight gain while POMC neurons inhibit feeding behaviors, causing satiety and weight loss.

Previous work by the Lowell lab and others had demonstrated that when AgRP neurons in mice are artificially switched on, the animals eat voraciously, consuming four times more than control animals. “The ‘switched-on’ animals search in an unrelenting fashion for food, and when given a task to obtain pellets, will work five times harder to get them,” Lowell explains.

Given the important role played by AgRP neurons, the scientists had a great interest in understanding the factors that regulate their activity. While much focus had centered on hormones, including leptin, insulin and ghrelin, as the possible mechanisms directly affecting neuronal activity, the Lowell team hypothesized that other nerve cells might be behind the regulation.

Neurons communicate with one another via neurotransmitters, chemical messengers that traverse synapses, the specialized junctions between upstream and downstream neurons. Glutamate is one such excitatory neurotransmitter.

"Studies in other regions of the brain [for example those controlling learning and reward and addiction behaviors] have demonstrated that glutamate synapses are highly plastic, changing in their strength and sometimes even in their number," explains Lowell. Shown to exert powerful control over behavior, synaptic plasticity is brought about when glutamate binds to NMDA receptors on downstream neurons.

"NMDA receptors are unusual and really interesting," he adds. "When glutamate gets released by upstream neurons and binds to NMDA receptors, calcium enters the downstream neuron. This, in turn, engages signal transduction pathways that cause synaptic plasticity. In other parts of the brain, such as the hippocampus, NMDA receptors drive plasticity which serves to encode memories."

Led by co-first authors Tiemin Liu, PhD, Dong Kong, PhD, Bhavik P. Shah, PhD, and Chianping Ye, PhD, the investigators created and studied mice genetically engineered to lack glutamate-binding NMDA receptors on the AgRP neurons. For the sake of comparison, they also created mice genetically engineered to lack NMDA receptors on POMC neurons.

They found that while mice lacking NMDA receptors on POMC neurons showed no change in feeding behavior, the situation was dramatically different in the mice lacking NMDA receptors on AgRP neurons.

"These mice ate a lot less and were much skinnier than a group of control mice," explains Lowell. Furthermore, the scientists found that a 24-hour period of fasting – which causes intense hunger in the control mice – was associated with a 67 percent increase in the number of dendritic spines on the AgRP neurons.

"Dendritic spines are tiny structures attached to the neuron’s dendrites, the tree-like branches that receive incoming signals from upstream neurons," explains Lowell. "These structures are the physical site, the subcellular communication hub, where synaptic input from upstream glutamate-releasing neurons is received, typically one synaptic input per spine."

"I’ve been studying spines for a long time and I’ve never before seen a manipulation that triggered such rapid and robust changes in spine number," says coauthor Bernardo Sabatini, MD, PhD, a Howard Hughes Medical Institute investigator in the Department of Neurobiology at Harvard Medical School. "Clearly, feeding is plugging in to the most basic mechanisms that control synapse and spine number in these cells. This may be a great system to understand not only feeding behavior, but also to understand the cell biology behind dynamic synapse formation and retraction."

When the control mice were refed – and their hunger alleviated – the number of spines dropped back to normal. (In contrast, fasting had no effect on spine number in the mutant mice lacking NMDA receptors on AgRP neurons.) These dramatic changes in spine number, tightly associated with states of hunger and satiety in control mice – and the absence of any such changes in mice lacking NMDA receptors on the downstream AgRP neurons – strongly suggest that structural plasticity of excitatory glutamate synapses on AgRP neurons is an important regulator of feeding behavior, says Lowell.

"Obesity is a major risk factor for type 2 diabetes, cardiovascular disease, and certain types of cancer," he adds. "By understanding the neurobiological mechanisms underlying feeding behaviors, we can work on treatments for a problem that has now become a global epidemic. These findings move us closer to a mechanistic understanding of how various factors controlling hunger might work."

Provided by Beth Israel Deaconess Medical Center

Source: medicalxpress.com

Feb 9, 2012
#science #neuroscience #psychology #brain
Neuroscientists link brain-wave pattern to energy consumption

February 8, 2012 by Anne Trafton

Emery Brown, an MIT professor of brain and cognitive sciences and health sciences and technology, left, and ShiNung Ching, a postdoc in Brown’s lab. Photo: M. Scott Brauer

Different brain states produce different waves of electrical activity, with the alert brain, relaxed brain and sleeping brain producing easily distinguishable electroencephalogram (EEG) patterns. These patterns change even more dramatically when the brain goes into certain deeply quiescent states during general anesthesia or a coma. 

MIT and Harvard University researchers have now figured out how one such quiescent state, known as burst suppression, arises. The finding, reported in the online edition of the Proceedings of the National Academy of Sciences the week of Feb. 6, could help researchers better monitor other states in which burst suppression occurs. For example, it is also seen in the brains of heart attack victims who are cooled to prevent brain damage due to oxygen deprivation, and in the brains of patients deliberately placed into a medical coma to treat a traumatic brain injury or intractable seizures.

During burst suppression, the brain is quiet for up to several seconds at a time, punctuated by short bursts of activity. Emery Brown, an MIT professor of brain and cognitive sciences and health sciences and technology and an anesthesiologist at Massachusetts General Hospital, set out to study burst suppression in the anesthetized brain and other brain states in hopes of discovering a fundamental mechanism for how the pattern arises. Such knowledge could help scientists figure out how much burst suppression is needed for optimal brain protection during induced hypothermia, when this state is created deliberately. 

“You might be able to develop a much more principled way to guide therapy for using burst suppression in cases of medical coma,” says Brown, senior author of the PNAS paper. “The question is, how do you know that patients are sufficiently brain-protected? Should they have one burst every second? Or one every five seconds?”

Modeling electrical activity

ShiNung Ching, a postdoc in Brown’s lab and lead author of the PNAS paper, developed a model to describe how burst suppression arises, based on the behavior of neurons in the brain. Neuron firing is controlled by the activity of channels that allow ions such as potassium and sodium to flow in and out of the cell, altering its voltage.

For each neuron, “we’re able to mathematically model the flow of ions into and out of the cell body, through the membrane,” Ching says. In this study, the team combined many neurons to create a model of a large brain network. By showing how both cooling and certain anesthetic drugs reduce the brain’s use of ATP (the cell’s energy currency), the researchers were able to generate burst-suppression patterns consistent with those actually seen in human patients. 

This is the first time that reductions in metabolic activity at the level of individual neurons have been linked to burst suppression; the finding suggests that the brain likely uses burst suppression to conserve vital energy during times of trauma.
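
The paper’s actual model is a Hodgkin–Huxley-style network coupled to ATP dynamics; a far cruder toy sketch (every name and constant below is invented for illustration, none come from the paper) can still reproduce the qualitative prediction that a lower metabolic rate yields longer suppression epochs.

```python
def burst_suppression_toy(metabolic_rate=1.0, steps=5000, dt=1.0):
    """Crude two-state sketch, not the paper's Hodgkin-Huxley network:
    firing burns ATP; when ATP runs low, a KATP-like brake silences the
    unit until ATP recovers, producing alternating bursts and suppression."""
    atp, firing = 1.0, True
    trace = []
    for _ in range(steps):
        if firing:
            atp -= 0.01 * dt                    # activity consumes ATP
            if atp < 0.2:                       # brake engages
                firing = False
        else:
            atp += 0.002 * metabolic_rate * dt  # recovery rate set by metabolism
            if atp > 0.9:                       # brake releases
                firing = True
        trace.append(1 if firing else 0)
    return trace

# Cooling or anesthesia is modelled only as a lower metabolic rate:
normal = burst_suppression_toy(metabolic_rate=1.0)
cooled = burst_suppression_toy(metabolic_rate=0.3)
```

In this sketch the “cooled” trace spends a larger fraction of its time suppressed, which is the qualitative signature the model ties to reduced ATP availability.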

“What’s really exciting about this is the idea that the metabolic regulation of cell energy stores plays a role in the observed dynamics of EEG. That’s a different way to think about the determinants of EEG,” says Nicholas Schiff, a professor of neurology and neuroscience at Weill Cornell Medical College who was not involved in this research. 

The developing brain

Burst suppression is also seen in babies born prematurely. As these babies get older, their brain patterns move into the normal continuous pattern. Brown speculates that in premature infants, the brain may be protecting itself by conserving energy.

“When you’re looking at these kids develop, we can easily start to suggest ways of tracking their improvement quantitatively. So the same algorithms we use to track burst suppression in the operating room could be used to track the disappearance of burst suppression in these kids,” Brown says.

Such tracking could help doctors determine whether premature infants are moving toward normal development or have an underlying brain disorder that might otherwise go undiagnosed, Ching says. 

In future studies, the researchers plan to study premature infants as well as patients whose brains are cooled and those in induced comas. Such studies could reveal just how much burst suppression is enough to protect the brain in those vulnerable situations.

Provided by Massachusetts Institute of Technology

Source: medicalxpress.com

Feb 9, 2012
#science #neuroscience #psychology #brain
Brain Proteins May Be Key to Aging

Deterioration of long-lived proteins on the surface of neuronal nuclei in the brain could lead to age-related defects in nervous function.

By Bob Grant | February 8, 2012

Scientists have found that aptly named extremely long-lived proteins (ELLPs) in the brains of rats can persist for more than one year—a result that suggests the proteins, also found in human brains, last an entire lifetime. Most proteins only last a day or two before being recycled. The researchers reported their findings last week in Science.
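
As a back-of-envelope illustration of why year-long persistence is so striking: under ordinary exponential turnover with an assumed two-day half-life (a typical textbook figure, not a number from this study), essentially none of the original protein pool would survive a year.

```python
def fraction_remaining(half_life_days, elapsed_days):
    """Exponential turnover: fraction of the original protein pool
    that has never been replaced after the elapsed time."""
    return 0.5 ** (elapsed_days / half_life_days)

typical = fraction_remaining(2, 365)   # assumed 2-day half-life: vanishingly small
ellp = fraction_remaining(365, 365)    # hypothetical 1-year half-life: half remains
```

Still detecting the same protein molecules after more than a year therefore points to components that are essentially never replaced, rather than merely replaced slowly.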

A team at the Salk Institute for Biological Studies made the discovery while studying ELLPs that are part of the nuclear pore complex (NPC), which is a transport channel that regulates the flow of molecules into or out of the nucleus in neurons. Because the persistent ELLPs are more likely to accumulate molecular damage, NPC function may eventually become compromised, allowing more toxins into the nucleus. This could result in alterations to DNA, subsequent changes in gene activity, and signs of cellular aging. “Most cells, but not neurons, combat functional deterioration of their protein components through the process of protein turnover, in which the potentially impaired parts of the proteins are replaced with new functional copies,” said senior author Martin Hetzer, of Salk’s Molecular and Cell Biology Laboratory, in a statement. “Our results also suggest that nuclear pore deterioration might be a general aging mechanism leading to age-related defects in nuclear function, such as the loss of youthful gene expression programs.”

In addition to aging, the results may provide key clues to the development of neurodegenerative disorders like Alzheimer’s and Parkinson’s diseases.

Source: TheScientist

Feb 9, 2012
#science #neuroscience #psychology #brain
Research links 'brain waves' to cognition, attention and diagnosing disorders

February 7, 2012

Professor Jason Mattingley, Foundation Chair in Cognitive Neuroscience at The University of Queensland, released his findings into ‘brain waves’ at the Australian Neuroscience Society’s (ANS) annual conference last week.

'Brain waves' are the oscillations produced by the brain, which are thought to contribute to its remarkable capacity to integrate information about the world.

According to Professor Mattingley’s research, brain oscillations can be linked to sleep, navigation, cognition, attention, and to diagnosing a wide range of disorders including autism, schizophrenia and epilepsy.

To understand how the brain filters information during visual attention and perception, Professor Mattingley and his fellow researchers had subjects perform tasks involving flickering stimuli on a computer display. This included embedding colour-coded visual information to see how well subjects could track a specific target colour amid a wealth of distracting information.

“Imagine the brain as a stadium full of sports fans. Each spectator is like an individual neuron in the brain. Now imagine the spectators starting a Mexican wave that sweeps through the crowd from one side of the stadium to the other. Our research shows that neurons in the brain act in much the same way. Distinct waves of neural activity, moving at different speeds and in different directions, help coordinate neurons across widely separated areas of the brain,” Professor Mattingley said.

“We can measure these brain waves as people engage in different tasks, such as focusing their attention on just one colour in a multi-coloured display. The measurements we take from the brain are a bit like the ripples from a handful of pebbles thrown into a pond.”

“While interesting in their own right, these studies are also relevant to brain dysfunction, as defects in neural responses to flickering visual stimuli have been found in individuals with autism, schizophrenia, and epilepsy, and such oscillations have been found to be significantly altered in aging, depression, and neurodegenerative disorders. Using these tasks may help to both diagnose and understand the basis for differences in brain function in people with these conditions.”

The Australian Neuroscience Society’s (ANS) annual conference brings together researchers in search of a greater understanding of the human nervous system and its functions.

As part of the program, around 100 international speakers and delegates shared their insights into the peripheral senses – touch, sight, hearing and smell – as well as perception, cognition, learning and memory, with a particular focus on neurological and neurodegenerative disease.

Provided by University of Queensland

Source: medicalxpress.com

Feb 8, 2012
#science #neuroscience #psychology
Warning! Collision imminent! The brain's quick interceptions help you navigate the world

February 7, 2012

Researchers at The Neuro and the University of Maryland have figured out the mathematical calculations that specific neurons employ in order to inform us of our distance from an object and the 3-D velocities of moving objects and surfaces relative to ourselves.

When you are about to collide into something and manage to swerve away just in the nick of time, what exactly is happening in your brain? A new study from the Montreal Neurological Institute and Hospital – The Neuro, McGill University shows how the brain processes visual information to figure out when something is moving towards you or when you are about to head into a collision. The study, published in the Proceedings of the National Academy of Sciences (PNAS), provides vital insight into our sense of vision and a greater understanding of the brain.

Highly specialized neurons located in the brain’s visual cortex, in an area known as MST, respond selectively to motion patterns such as expansion, rotation, and deformation. However, the computations underlying such selectivity were unknown until now.

Using mathematical models and sophisticated recording techniques, researchers have discovered how individual MST neurons function. “Area MST is typical of high-level visual cortex, in that information about important aspects of vision can be seen in the firing patterns of single neurons. A classic example is a neuron that only fires when the subject is looking at the image of a particular face. This type of neuron has to gather information from other neurons that are selective to simpler features, like lines, colors, and textures, and combine these pieces of information in a fairly sophisticated way,” says Dr. Christopher Pack, neuroscientist at The Neuro and senior author. “Similarly, for motion detection, neurons have to combine input from many other neurons earlier in the visual pathway, in order to determine whether something is moving toward you or just drifting past.”

The brain’s visual pathway is made up of building blocks. Neurons in the retina respond to very simple stimuli, such as small spots of light. Further along the pathway, neurons respond to more complex stimuli, such as straight lines, by combining inputs from earlier neurons. Neurons still further along respond to even more complex stimuli, such as combinations of lines (angles), ultimately leading to neurons that can respond to, or recognize, faces and objects.

Source: medicalxpress.com

Feb 8, 2012
#science #neuroscience #psychology #brain
Study of Live Human Neurons Reveals Parkinson's Origins

ScienceDaily (Feb. 7, 2012) — Parkinson’s disease researchers at the University at Buffalo have discovered how mutations in the parkin gene cause the disease, which afflicts at least 500,000 Americans and for which there is no cure.

The results are published in the current issue of Nature Communications. The UB findings reveal potential new drug targets for the disease as well as a screening platform for discovering new treatments that might mimic the protective functions of parkin. UB has applied for patent protection on the screening platform.

"This is the first time that human dopamine neurons have ever been generated from Parkinson’s disease patients with parkin mutations," says Jian Feng, PhD, professor of physiology and biophysics in the UB School of Medicine and Biomedical Sciences and the study’s lead author.

As the first study of human neurons affected by parkin, the UB research overcomes a major roadblock in research on Parkinson’s disease and on neurological diseases in general. The problem has been that human neurons live in a complex network in the brain and thus are off-limits to invasive studies, Feng explains.

"Before this, we didn’t even think about being able to study the disease in human neurons," he says. "The brain is so fully integrated. It’s impossible to obtain live human neurons to study."

But studying human neurons is critical in Parkinson’s disease, Feng explains, because animal models that lack the parkin gene do not develop the disease; thus, human neurons are thought to have “unique vulnerabilities.”

"Our large brains may use more dopamine to support the neural computation needed for bipedal movement, compared to the quadrupedal movement of almost all other animals," he says. In 2007, Japanese researchers announced they had converted human cells into induced pluripotent stem cells (iPSCs), which mimic embryonic stem cells and can be converted into nearly any cell type in the body. Feng and his UB colleagues saw their enormous potential and have been working with the technology ever since.

"This new technology was a game-changer for Parkinson’s disease and for other neurological diseases," says Feng. "It finally allowed us to obtain the material we needed to study this disease."

The current paper is the fruition of the UB team’s ability to “reverse engineer” human neurons from human skin cells taken from four subjects: two with a rare type of Parkinson’s disease in which the parkin mutation is the cause of their disease and two healthy subjects who served as controls.

"Once parkin is mutated, it can no longer precisely control the action of dopamine, which supports the neural computation required for our movement," says Feng.

The UB team also found that parkin mutations prevent it from tightly controlling the production of monoamine oxidase (MAO), which catalyzes dopamine oxidation.

"Normally, parkin makes sure that MAO, which can be toxic, is expressed at a very low level so that dopamine oxidation is under control," Feng explains. "But we found that when parkin is mutated, that regulation is gone, so MAO is expressed at a much higher level. The nerve cells from our Parkinson’s patients had much higher levels of MAO expression than those from our controls. We suggest in our study that it might be possible to design a new class of drugs that would dial down the expression level of MAO."

He notes that one of the drugs currently used to treat Parkinson’s disease inhibits the enzymatic activity of MAO and has been shown in clinical trials to slow down the progression of the disease.

Parkinson’s disease is caused by the death of dopamine neurons. In the vast majority of cases, the reason for this is unknown, Feng explains. But in 10 percent of Parkinson’s cases, the disease is caused by mutations of genes, such as parkin: the subjects with Parkinson’s in the UB study had this rare form of the disease.

"We found that a key reason for the death of dopamine neurons is oxidative stress due to the overproduction of MAO," explains Feng. "But before the death of the neurons, the precise action of dopamine in supporting neural computation is disrupted by parkin mutations. This paper provides the first clues about what the parkin gene is doing in healthy controls and what it fails to achieve in Parkinson’s patients."

He notes that in this study these defects were reversed by delivering the normal parkin gene into the patients’ neurons, offering hope that these neurons could serve as a screening platform for discovering new drug candidates that mimic the protective functions of parkin and potentially even lead to a cure for Parkinson’s.

While parkin mutations are responsible for only a small percentage of Parkinson’s cases, Feng notes that understanding how parkin works is relevant to all Parkinson’s patients. His ongoing research on sporadic Parkinson’s disease, in which the cause is unknown, points in the same direction.

Source: ScienceDaily

Feb 7, 2012
#science #neuroscience #psychology #parkinson
Why the Middle Finger Has Such a Slow Connection

ScienceDaily (Feb. 7, 2012) — Each part of the body has its own nerve cell area in the brain; we therefore have a map of our bodies in our heads. The functional significance of these maps is largely unclear. RUB neuroscientists have now shown what effects they can have, using reaction time measurements combined with learning experiments and computational modelling. They demonstrate that inhibitory influences from neighbouring “finger nerve cells” affect a finger’s reaction time. The outer fingers — i.e. the thumb and little finger — therefore react faster than the middle finger, which is exposed to the “cross fire” of a neighbour on each side. Through targeted learning, this speed handicap can be compensated for.

The working group led by PD Dr. Hubert Dinse (Neural Plasticity Lab at the Institute for Neural Computation) reports in the current issue of PNAS.

Thumb and little finger are the quickest

The researchers set subjects a simple task to measure the speed of decision: they showed them an image on a monitor that represented all ten fingers. When one of the fingers was marked, the subjects were to press a corresponding key as quickly as possible with that finger. The thumb and little finger were the fastest; the middle finger brought up the rear. “You might think that this has anatomical reasons or is a matter of practice,” said Dr. Dinse, “but we were able to rule that out through further tests. In principle, each finger is able to react equally quickly. Only in the selection task is the middle finger at a distinct disadvantage.”

Computer simulation depicts brain maps

To explain their observations, the researchers used computer simulations based on a so-called mean-field model. It is especially suited for modelling large neuronal networks in the brain. For these simulations, each individual finger is represented by a group of nerve cells, which are arranged in the form of a topographic map of the fingers based on the actual conditions in the somatosensory cortex of the brain. “Adjacent fingers are adjacent in the brain too, and thus also in the simulation,” explained Dr. Dinse. The communication of the nerve cells amongst themselves is organised so that the nerve cells interact through mutual excitation and inhibition.

Inhibitory influences from both sides slow down the middle finger

The computer simulations showed that the longer reaction time of the middle finger in a multiple-choice task is a consequence of the fact that the middle finger lies within the inhibition range of the two adjacent fingers. The thumb and little finger, on the other hand, each receive an inhibitory effect of comparable strength from only one adjacent finger. “In other words, the high level of inhibition received by the nerve cells of the middle finger means that it takes longer for the excitation to build up — they therefore react more slowly,” said Dr. Dinse.
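
The article doesn’t give the model’s equations, but the asymmetry it describes can be sketched with a minimal rate model (all constants below are invented for illustration, not taken from the paper): each finger’s unit integrates a constant excitatory drive minus a fixed inhibitory leak per anatomical neighbour, and we time how long it takes to reach a firing threshold.

```python
def time_to_threshold(n_fingers=5, target=2, drive=1.0, inhibition=0.25,
                      threshold=10.0, dt=0.1, max_steps=100000):
    """Minimal rate-model sketch, not the paper's mean-field model:
    the cued unit integrates excitatory drive while each anatomical
    neighbour contributes a constant inhibitory leak."""
    neighbours = 1 if target in (0, n_fingers - 1) else 2
    activity, steps = 0.0, 0
    while activity < threshold and steps < max_steps:
        activity += dt * (drive - inhibition * neighbours)
        steps += 1
    return steps * dt

thumb = time_to_threshold(target=0)    # edge finger: one inhibitory neighbour
middle = time_to_threshold(target=2)   # inner finger: two inhibitory neighbours
```

With two inhibitory neighbours the middle unit’s net drive is smaller, so it crosses threshold later; shrinking `inhibition`, as the stimulation-induced plasticity in the study is reported to do, closes the gap.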

Targeted reduction of the inhibition through learning

From the results of the computer simulation it can be concluded that weaker inhibition from the neighbouring fingers would shorten the reaction time of the middle finger. This would require a so-called plastic change in the brain — a specialty of the Neural Plasticity Lab, which has for years been developing learning protocols that induce such changes. One such protocol is the repeated stimulation of certain nerve cell groups, which the laboratory has already used in many experiments. “If, for example, you stimulate one finger electrically or by means of vibration for two to three hours, then its representation in the brain changes,” explained Dr. Dinse. The result is an improvement in the sense of touch, a measurable reduction of inhibitory processes in this brain area, and an enlargement of the representation of the stimulated finger.

Second experiment confirms the prediction

The Bochum researchers then conducted a second experiment in which the middle finger of the right hand was subjected to such stimulation. The result was a significant shortening of the reaction time of this finger in the selection task. “This finding confirms our prediction” Dr. Dinse summed up. Thus, for the first time, Bochum’s researchers have established a direct link between the so-called lateral inhibitory processes and decision making processes. They have shown that learning processes that change the cortical maps can have far-reaching implications not only for simple discrimination tasks, but also for decision processes that were previously attributed to the so-called “higher” cortical areas. 

Source: ScienceDaily

Feb 7, 2012
#science #neuroscience #psychology
Sharp Images from the Living Mouse Brain

February 6th, 2012

This STED image of a nerve cell in the upper brain layer of a living mouse shows in previously impossible detail the very fine dendritic protrusions of a nerve cell, the so-called spines, at which the synapses are located. The inset shows the mushroom-shaped head of such a dendritic spine at which the nerve cells receive information from their peers. © Max Planck Institute for Biophysical Chemistry

Source: Neuroscience News

Feb 7, 2012
#science #neuroscience #psychology #brain
It's not solitaire: Brain activity differs when one plays against others

February 6, 2012

Rock, paper or scissors? Learning while playing a strategic game against others involves a different pattern of brain activity than learning from the consequences of one’s own actions, researchers found. Credit: L. Brian Stauffer

Researchers have found a way to study how our brains assess the behavior – and likely future actions – of others during competitive social interactions. Their study, described in a paper in the Proceedings of the National Academy of Sciences, is the first to use a computational approach to tease out differing patterns of brain activity during these interactions, the researchers report.

"When players compete against each other in a game, they try to make a mental model of the other person’s intentions, what they’re going to do and how they’re going to play, so they can play strategically against them," said University of Illinois postdoctoral researcher Kyle Mathewson, who conducted the study as a doctoral student in the Beckman Institute with graduate student Lusha Zhu and economics professor and Beckman affiliate Ming Hsu, who now is at the University of California, Berkeley. "We were interested in how this process happens in the brain."

Previous studies have tended to consider only how one learns from the consequences of one’s own actions, called reinforcement learning, Mathewson said. These studies have found heightened activity in the basal ganglia, a set of brain structures known to be involved in the control of muscle movements, goals and learning. Many of these structures signal via the neurotransmitter dopamine.

"That’s been pretty well studied and it’s been figured out that dopamine seems to carry the signal for learning about the outcome of our own actions," Mathewson said. "But how we learn from the actions of other people wasn’t very well characterized."

Researchers call this type of learning “belief learning.”

To better understand how the brain processes information in a competitive setting, the researchers used functional magnetic resonance imaging (fMRI) to track activity in the brains of participants while they played a competitive game, called a Patent Race, against other players. The goal of the game was to invest more than one’s opponent in each round to win a prize (a patent worth considerably more than the amount wagered), while minimizing one’s own losses (the amount wagered in each trial was lost). The fMRI tracked activity at the moment the player learned the outcome of the trial and how much his or her opponent had wagered.

A computational model evaluated the players’ strategies and the outcomes of the trials to map the brain regions involved in each type of learning.
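
The article describes the two learning rules without giving equations. A common textbook contrast looks roughly like the sketch below (rock–paper–scissors stands in for the Patent Race here, and the learning rates are invented): reinforcement learning updates only the chosen action’s value from its realised payoff, while belief learning updates a model of the opponent and then revalues every action against it, including actions never yet played.

```python
ACTIONS = ["rock", "paper", "scissors"]

def payoff(mine, theirs):
    """+1 win, 0 tie, -1 loss."""
    if mine == theirs:
        return 0
    wins = {("rock", "scissors"), ("paper", "rock"), ("scissors", "paper")}
    return 1 if (mine, theirs) in wins else -1

def reinforcement_update(values, chosen, reward, lr=0.2):
    # reinforcement learning: only the action actually taken learns
    values[chosen] += lr * (reward - values[chosen])

def belief_update(beliefs, opponent_action, lr=0.2):
    # belief learning: track the opponent's action frequencies instead
    for a in ACTIONS:
        beliefs[a] += lr * ((a == opponent_action) - beliefs[a])

def belief_values(beliefs):
    # every own action is revalued against the believed opponent mix,
    # even actions the player has never chosen
    return {m: sum(beliefs[t] * payoff(m, t) for t in ACTIONS)
            for m in ACTIONS}
```

In model-based fMRI studies of this kind, trial-by-trial signals from fitted models like these are regressed against brain activity; the sketch above only illustrates the difference in what each rule updates.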

"Both types of learning were tracked by activity in the ventral striatum, which is part of the basal ganglia," Mathewson said. "That’s traditionally known to be involved in reinforcement learning, so we were a little bit surprised to see that belief learning also was represented in that area."

Belief learning also spurred activity in the rostral anterior cingulate, a structure deep in the front of the brain. This region is known to be involved in error processing, regret and “learning with a more social and emotional flavor,” Mathewson said.

The findings offer new insight into the workings of the brain as it is engaged in strategic thinking, Hsu said, and may aid the understanding of neuropsychiatric illnesses that undermine those processes.

"There are a number of mental disorders that affect the brain circuits implicated in our study," Hsu said. "These include schizophrenia, depression and Parkinson’s disease. They all affect these dopaminergic regions in the frontal and striatal brain areas. So to the degree that we can better understand these ubiquitous social functions in strategic settings, it may help us understand how to characterize and, eventually, treat the social deficits that are symptoms of these diseases."

Provided by University of Illinois at Urbana-Champaign

Source: medicalxpress.com

Feb 7, 2012
#science #neuroscience #brain #psychology
Magnetic research for better brain health

February 6, 2012

A pioneering therapy that uses magnetic pulses to stimulate the brain to treat conditions such as Parkinson’s disease, depression, schizophrenia, epilepsy and stroke is now better understood thanks to researchers from The University of Western Australia and the Université Pierre et Marie Curie in France.

Research Associate Professor Jennifer Rodger from UWA’s School of Animal Biology said she and her team tested the therapy - known as repetitive transcranial magnetic stimulation (rTMS) - on mice to find out how it can be applied to treating human neurological disease.

The research was published recently in the prestigious FASEB Journal.

"Our work demonstrated for the first time that pulsed magnetic fields promote changes in brain chemicals that correct abnormal brain connections, resulting in improved behaviour and brain function," joint lead author Dr Rodger said.

"rTMS is an exciting therapy that stimulates the brain. It has shown promising results in treating the damaged human brain. Our research helps to explain how this therapy works on the cells of the brain. Previously, evidence of its usefulness was mainly from anecdotal clinical evidence.

"Our results greatly increase our understanding of the specific cellular and molecular events that occur in the brain during rTMS therapy. We are the first to show that changes in brain circuits underpin these beneficial effects. Our results have implications for how rTMS is used in humans to treat disease and improve brain function."

Dr Rodger explained that the structural and functional changes caused by the therapy in malfunctioning circuits were not seen in the normal healthy brain, suggesting that the therapy could have minimal side effects in humans.

Provided by University of Western Australia

Source: medicalxpress.com

Feb 7, 2012
#science #neuroscience #psychology #brain
Magnetic therapy becoming more popular for treating depression

February 6, 2012

(Medical Xpress) — A new magnetic therapy that treats major depression recently received a major boost when the government announced Medicare will cover the procedure in Illinois.

The treatment, called transcranial magnetic stimulation (TMS), sends short pulses of magnetic fields to the brain. TMS “is rapidly gaining momentum,” said Dr. Murali Rao of Loyola University Medical Center, one of the first Chicago-area centers to offer TMS. There are now nearly 300 such centers in the United States.

At Loyola, about two-thirds of Rao’s TMS patients so far report that their depression has significantly lessened or gone away completely.

Before receiving TMS, Nan Miller had failed nine antidepressants and suffered increasingly severe cycles of depression over seven years. There were times when she couldn’t get out of bed or eat. “I just wanted to die,” she said. She had even tried electroconvulsive therapy (formerly known as electroshock) but did not want to consider that option anymore.

Miller said that a few weeks after beginning TMS treatments, she was eating lunch when she suddenly realized depression did not consume her anymore. “I could almost hear the chains breaking, the darkness lifting and the heaviness dissolving,” she said. “I feel about 10 years younger and 20 shades lighter.”

The Food and Drug Administration approved TMS in 2009 for patients who have major depression and have failed at least one antidepressant. The FDA has approved one TMS system, NeuroStar®, made by Neuronetics.

The patient reclines in a comfortable padded chair. A magnetic coil, placed next to the left side of the head, sends short pulses of magnetic fields to the surface of the brain. This produces currents that stimulate brain cells. The currents, in turn, affect mood-regulatory circuits deeper in the brain. The resulting changes in the brain appear to be beneficial to patients who suffer depression.

Each treatment lasts 35 to 40 minutes. Patients typically undergo three treatments per week for four to six weeks.

The treatments do not require anesthesia or sedation. Afterward, a patient can immediately resume normal activities, including driving. Studies have found that patients do not experience memory loss or seizures. Side effects include mild headache or tingling in the scalp, which can be treated with Tylenol.

Together, psychotherapy and antidepressants successfully treat only about one-third of patients who suffer major depression. TMS is a noninvasive treatment option now available for the other two-thirds of patients, who experience only partial relief from depression or no relief at all, Rao said.

Provided by Loyola University Health System

Source: medicalxpress.com

Feb 7, 2012
#science #neuroscience #psychology #depression
Mom’s Love Good for Child’s Brain

January 30th, 2012

The hippocampus (highlighted in fuchsia) is a key brain structure important to learning, memory and stress response. New research shows that children who were nurtured by their mothers early in life have a larger hippocampus than children who were not nurtured as much. Credit: Washington University Medical School from press release

Source: Neuroscience News

Feb 6, 2012
#science #neuroscience #psychology #brain
DNA Test that Identifies Down Syndrome in Pregnancy Can Also Detect Trisomy 18 and Trisomy 13

February 2nd, 2012

A newly available DNA-based prenatal blood test that can identify a pregnancy with Down syndrome can also identify two additional chromosome abnormalities: trisomy 18 (Edwards syndrome) and trisomy 13 (Patau syndrome). The test for all three defects can be offered as early as 10 weeks of pregnancy to women who have been identified as being at high risk for these abnormalities.

These are the results of an international, multicenter study published online today in the journal Genetics in Medicine. The study, the largest and most comprehensive done to date, adds to the documented capability (study published in Genetics in Medicine in October 2011) of the tests by examining results in 62 pregnancies with trisomy 18 and 12 pregnancies with trisomy 13. Together with the Down syndrome pregnancies reported earlier, 286 trisomic pregnancies and 1,702 normal pregnancies are included in the report.

The research was led by Glenn Palomaki, PhD, and Jacob Canick, PhD, of the Division of Medical Screening and Special Testing in the Department of Pathology and Laboratory Medicine at Women & Infants Hospital of Rhode Island and The Warren Alpert Medical School of Brown University, and included scientists at Sequenom Inc. and Sequenom Center for Molecular Medicine, San Diego, CA, and an independent academic laboratory at the University of California at Los Angeles.

The test identified 100% (59/59) of the trisomy 18 pregnancies and 91.7% (11/12) of the trisomy 13 pregnancies. The associated false positive rates were 0.28% and 0.97%, respectively. Overall, testing failed to provide a clinical interpretation in 17 women (0.9%); three of these women had a trisomy 18 pregnancy. By slightly raising the threshold for a positive test for chromosomes 18 and 13, the detection rate remained constant while the false positive rate fell to as low as 0.1%. These findings, along with the detailed information learned from testing such a large number of samples, demonstrate that the new test will be highly effective when offered to women considering invasive testing.
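As a sanity check, the detection rates quoted above follow directly from the raw counts reported in the study (a minimal sketch; the helper name is illustrative, not from the paper):

```python
def detection_rate(detected, affected):
    """Sensitivity: the fraction of affected pregnancies the test flags."""
    return detected / affected

# Counts reported in the study
print(f"trisomy 18: {detection_rate(59, 59):.1%}")  # 100.0%
print(f"trisomy 13: {detection_rate(11, 12):.1%}")  # 91.7%
```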

“Our previous work demonstrated the ability to identify Down syndrome, the most common trisomy. These new data extend the finding to the next two most common trisomies and will allow for wider use of such testing with the ability to identify all three common trisomies,” said Dr. Palomaki. “The new DNA test can now also be offered to women identified as being at high risk for trisomy 18 or trisomy 13, as well as those at high risk for Down syndrome.”

“This highly sensitive and specific DNA test has the potential to affect couples’ decision-making,” says Dr. Canick. “A woman whose pregnancy was identified as high risk, and who earlier would have chosen not to have invasive diagnostic testing, might now consider the DNA test as a safe way to obtain further information before making a final decision.” The US Centers for Disease Control and Prevention estimated in 1995 that about one in every 200 invasive diagnostic procedures causes a pregnancy miscarriage.

Trisomy 18, also called Edwards syndrome, is a serious disorder, with up to 70% of affected first-trimester fetuses being spontaneously lost during pregnancy. Among those born alive, half die within a week, and only 5% survive the first year. All have serious medical and developmental problems. About 1,330 infants with trisomy 18 would be born in the US each year in the absence of prenatal diagnosis. Trisomy 13, also called Patau syndrome, is less common but equally serious. About 600 infants with trisomy 13 would be born in the US each year in the absence of prenatal diagnosis. Like Down syndrome, trisomy 18 and trisomy 13 become more common as maternal age increases. For comparison, about 7,730 Down syndrome cases would be born each year in the absence of prenatal diagnosis. Current prenatal screening tests for trisomy 18 and trisomy 13 rely on both biochemical and ultrasound markers. For more information, visit the US National Library of Medicine’s PubMed Health.

This industry-sponsored project, awarded to Drs. Palomaki and Canick and Women & Infants Hospital in 2008, enrolled 4,500 women at 27 prenatal diagnostic centers throughout the world. Women & Infants also served as one of the enrollment centers under the direction of maternal-fetal medicine specialist and director of Perinatal Genetics, Barbara O’Brien, MD.

“It is clinically more relevant that all three trisomies can be detected by this test,” said Dr. O’Brien. “Having access to such a comprehensive, DNA-based test that can be done early in pregnancy will give us more information so that we can better guide which patients should consider diagnostic testing.”

Women & Infants Hospital has been an international center for prenatal screening research. For more than three decades, Drs. Palomaki and Canick have collaborated with others in developing and improving screening tests for Down syndrome and other fetal abnormalities. In 1988, Drs. Palomaki and Canick were involved in the development of triple marker screening. The team was able to convert its findings into prenatal screening tests now used throughout the world. Dr. Canick’s lab in 1998 was the first in the US to offer quad marker screening, and in the past decade it was the laboratory center for the NIH-funded FASTER Trial, which compared first- and second-trimester screening.

Source: Neuroscience News

Feb 6, 2012
#science #neuroscience #psychology #genetics
Gender Specific Behavior Traced To Hormone-Controlled Genes In The Brain

Article Date: 06 Feb 2012 - 0:00 PST

Men and women may be equals, but they often behave differently when it comes to sex and parenting. Now a study of the differences between the brains of male and female mice in the Cell Press journal Cell provides insight into how our own brains might be programmed for these stereotypically different behaviors.

The new evidence shows that the sex hormones - testosterone, estrogen, and progesterone - act in a key region of the brain, switching certain genes on and others off. When the researchers tinkered with each of these genes one by one, animals showed subtle but important shifts in individual sex-specific behaviors, such as how males mate or females care for their pups.

“What this means is that complex behaviors like male mating or maternal care in mice can be deconstructed at the genetic level,” said Nirao Shah of the University of California, San Francisco. The findings present a cellular and molecular representation of gender that is remarkable in its complexity, the researchers say.

Shah’s team made these discoveries after screening mouse brains for genes that show differences in expression in males versus females. The researchers focused specifically on the hypothalamus, a region previously implicated in the control of sex-specific behaviors. Their screen produced a list of 16 genes with clear sex differences in distinct neurons in the hypothalamus. Surprisingly, Shah’s team found that many of these genes also show sex differences in the amygdala, a part of the brain important for emotions.

In further studies, the researchers examined the effects of a subset of these individual genes. Mice missing only one of these 16 genes seemed to behave normally. But upon closer observation, these mice showed significant differences in sex-specific behaviors. For instance, Shah explained, females mutant for one gene took longer to return their pups to the nest and to fight off intruders. “They still take care of their pups, but less effectively,” he said.

In other experiments, deletion of a single gene produced females that were two-fold less receptive to mating with males. Similarly, males mutant for another gene were less interested in females. Together these results mean that sex-specific behaviors can be controlled in modular fashion, such that the loss of any one gene leads to subtle but potentially important changes.

“At the superficial level, the mice appear normal, but this is pretty significant variation in behavior,” Shah said. It suggests that variation in such genes might explain not just differences between the sexes, but also differences in behaviors within one sex or the other - why some male mice are more aggressive than other males or some females more attentive to their offspring than other females.

The researchers don’t yet know exactly how these differences in gene expression lead to those differences in behavior, although Shah says some of the genes are known to be involved in sending or receiving neural messages in the brain. It also remains to be seen how the male and female gene expression programs might be influenced by the animals’ social interactions and experiences.

There is still a lot to learn about what makes males and females tick. “This gene list of sex differences in the brain is probably just a small subset of what we will eventually unearth,” Shah said.  

Source: Medical News Today

Feb 6, 2012
#science #neuroscience #psychology #genetics #brain
Memory Function - Decaffeinated Coffee May Help

Article Date: 05 Feb 2012 - 0:00 PST

Drinking decaffeinated coffee may improve the brain energy metabolism associated with type 2 diabetes, according to a study published in Nutritional Neuroscience and carried out by researchers at Mount Sinai School of Medicine. Impaired brain energy metabolism is a known risk factor for dementia and other neurodegenerative disorders such as Alzheimer’s disease.

Giulio Maria Pasinetti, MD, PhD, and team decided to investigate whether dietary supplementation with a standard decaffeinated coffee prior to diabetes onset could improve insulin resistance and glucose utilization in mice with diet-induced type 2 diabetes.

The mice were given the supplement for five months, after which the researchers assessed the genetic response in the animals’ brains. They found that the brains metabolized glucose more effectively, using it for cellular energy. People with type 2 diabetes have reduced glucose utilization in the brain, which often leads to neurocognitive problems.

Dr. Pasinetti stated:

"Impaired energy metabolism in the brain is known to be tightly correlated with cognitive decline during aging and in subjects at high risk for developing neurodegenerative disorders. This is the first evidence showing the potential benefits of decaffeinated coffee preparations for both preventing and treating cognitive decline caused by type 2 diabetes, aging, and/or neurodegenerative disorders."



Drinking coffee is not recommended for everyone because of its association with cardiovascular health risks, including elevated blood cholesterol and blood pressure, both of which raise the risk of heart disease, stroke, and premature death. However, these negative effects are mainly attributed to coffee’s high caffeine content, and the study findings suggest that some components of decaffeinated coffee are beneficial to health, at least in mice.

Dr. Pasinetti wants to investigate whether decaffeinated coffee as a dietary supplement in humans can act as a preventive measure.

He concludes:

"In light of recent evidence suggesting that cognitive impairment associated with Alzheimer’s disease and other age-related neurodegenerative disorders may be traced back to neuropathological conditions initiated several decades before disease onset, developing preventive treatments for such disorders is critical."


Petra Rattue 

Source: Medical News Today

Feb 6, 2012
#science #neuroscience #psychology #brain #memory
Hearing Metaphors Activates Brain Regions Involved in Sensory Experience

ScienceDaily (Feb. 3, 2012) — When a friend tells you she had a rough day, do you feel sandpaper under your fingers? The brain may be replaying sensory experiences to help understand common metaphors, new research suggests.

Regions of the brain activated by hearing textural metaphors are shown in green. Yellow and red show regions activated by sensory experience of textures visually and through touch. (Credit: Image courtesy of Emory University)

Linguists and psychologists have debated how much the parts of the brain that mediate direct sensory experience are involved in understanding metaphors. George Lakoff and Mark Johnson, in their landmark work ‘Metaphors we live by’, pointed out that our daily language is full of metaphors, some of which are so familiar (like “rough day”) that they may not seem especially novel or striking. They argued that metaphor comprehension is grounded in our sensory and motor experiences.

New brain imaging research reveals that a region of the brain important for sensing texture through touch, the parietal operculum, is also activated when someone listens to a sentence with a textural metaphor. The same region is not activated when a similar sentence expressing the meaning of the metaphor is heard.

The results were published online this week in the journal Brain & Language.

"We see that metaphors are engaging the areas of the cerebral cortex involved in sensory responses even though the metaphors are quite familiar," says senior author Krish Sathian, MD, PhD, professor of neurology, rehabilitation medicine, and psychology at Emory University. "This result illustrates how we draw upon sensory experiences to achieve understanding of metaphorical language."

Sathian is also medical director of the Center for Systems Imaging at Emory University School of Medicine and director of the Rehabilitation R&D Center of Excellence at the Atlanta Veterans Affairs Medical Center.

Seven college students who volunteered for the study were asked to listen to sentences containing textural metaphors as well as sentences that were matched for meaning and structure, and to press a button as soon as they understood each sentence. Blood flow in their brains was monitored by functional magnetic resonance imaging. On average, response to a sentence containing a metaphor took slightly longer (0.84 vs 0.63 seconds).
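In relative terms, the reported difference means metaphor comprehension took roughly a third longer; a quick back-of-the-envelope check using the figures quoted above (variable names are illustrative):

```python
metaphor_rt = 0.84  # mean response time for metaphor sentences (seconds)
literal_rt = 0.63   # mean response time for matched literal sentences (seconds)

slowdown = (metaphor_rt - literal_rt) / literal_rt
print(f"metaphor sentences took {slowdown:.0%} longer")  # 33% longer
```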

In a previous study, the researchers had already mapped out, for each of these individuals, which parts of the students’ brains were involved in processing actual textures by touch and sight. This allowed them to establish with confidence the link within the brain between metaphors involving texture and the sensory experience of texture itself.

"Interestingly, visual cortical regions were not activated by textural metaphors, which fits with other evidence for the primacy of touch in texture perception," says research associate Simon Lacey, PhD, the first author of the paper.

The researchers did not find metaphor-specific differences in cortical regions well known to be involved in generating and processing language, such as Broca’s or Wernicke’s areas. However, this result doesn’t rule out a role for these regions in processing metaphors, Sathian says. Also, other neurologists have seen that injury to various areas of the brain can interfere with patients’ understanding of metaphors.

"I don’t think that there’s only one area responsible for metaphor processing," Sathian says. "Actually, several recent lines of research indicate that engagement with abstract concepts is distributed around the brain." "I think our research highlights the role of neural networks, rather than a single area of the brain, in these processes. What could be happening is that the brain is conducting an internal simulation as a way to understand the metaphor, and that’s why the regions associated with touch get involved. This also demonstrates how complex processes involving symbols, such as appreciating a painting or understanding a metaphor, do not depend just on evolutionarily new parts of the brain, but also on adaptations of older parts of the brain."

Sathian’s future plans include asking whether similar relationships exist for other senses, such as vision. The researchers also plan to probe whether magnetic stimulation of the brain in regions associated with sensory experience can interfere with understanding metaphors.

The research was supported by the National Institutes of Health and the National Science Foundation.

Source: ScienceDaily

Feb 6, 2012
#science #neuroscience #psychology #brain
Feb 4, 2012
Treating Brain Injuries With Stem Cell Transplants - Promising Results

Article Date: 04 Feb 2012 - 10:00 PST

The February edition of Neurosurgery reports that experiments in brain-injured rats have shown that stem cells injected via the carotid artery travel directly to the brain, greatly enhancing functional recovery. According to lead researcher Dr Toshiya Osanai, of Hokkaido University Graduate School of Medicine in Sapporo, Japan, the carotid artery injection technique, combined with some form of in vivo optical imaging to track the stem cells after transplantation, could become part of a new approach to stem cell transplantation for human traumatic brain injury (TBI).

Dr. Osanai and team assessed a new “intra-arterial” technique of stem cell transplantation in rats, with the aim of delivering the stem cells directly to the brain without having to go through the general circulation. They induced TBI in the animals before injecting stem cells into the carotid artery seven days later.

The stem cells were obtained from the rats’ bone marrow and were labeled with “quantum dots” prior to being injected. Quantum dots are biocompatible, fluorescent semiconductor nanocrystals that emit near-infrared light at long wavelengths able to penetrate bone and skin, enabling non-invasive monitoring of the stem cells for a period of four weeks following transplantation.

This in vivo optical imaging technique enabled the scientists to observe that the injected stem cells entered the brain on the first pass, without entering the general circulation. The stem cells began migrating from the capillaries into the injured part of the brain within three hours.

At week 4, the researchers noted that the rats in the stem cell transplant group achieved a substantial recovery of motor function, compared with the untreated animals that had no signs of recovery.

The team learnt, after examining the treated brains, that the stem cells had transformed into different brain cell types and aided in healing the injured brain area.

Over the last few years, the potential of stem cell therapy for treating illnesses and conditions has been growing rapidly.

Developing stem cell therapy for brain injury in human patients

Stem cells represent a potential, new important method of treatment for those who suffered brain injuries, TBI and stroke. But even though bone marrow stem cells, similar to the ones used in the new study, are a promising source of donor cells, many questions remain open regarding the optimal timing, dose and route of stem cell delivery.


In the new animal study, the rats were injected with the stem cells one week after TBI - a “clinically relevant” interval, since that is the minimum time it takes to develop stem cells from bone marrow.

Transplanting the stem cells into the carotid artery is a fairly simple procedure that delivers the cells directly to the brain.

The experiments have also provided key evidence that stem cell treatment can promote healing after TBI with a substantial recovery of function.

Dr. Osanai and team write that by using in vivo optical imaging:

"The present study was the first to successfully track donor cells that were intra-arterially transplanted into the brain of living animals over four weeks."

A similar form of imaging technology could also prove beneficial for monitoring the effects of stem cell transplantation in humans, although the tracking will pose challenges, due to the human skull and scalp being much thicker than in rats.

The researchers conclude:

"Further studies are warranted to apply in vivo optical imaging clinically.”

Written by Petra Rattue

Source: Medical News Today

Feb 4, 2012
#science #neuroscience #psychology #brain
Discovery of Extremely Long-Lived Proteins May Provide Insight Into Cell Aging and Neurodegenerative Diseases

ScienceDaily (Feb. 3, 2012) — One of the big mysteries in biology is why cells age. Now scientists at the Salk Institute for Biological Studies report that they have discovered a weakness in a component of brain cells that may explain how the aging process occurs in the brain.

This microscope image shows extremely long-lived proteins, or ELLPs, glowing green on the outside of the nucleus of a rat brain cell. DNA inside the nucleus is pictured in blue. The Salk scientists discovered that the ELLPs, which form channels through the wall of the nucleus, lasted for more than a year without being replaced. Deterioration of these proteins may allow toxins to enter the nucleus, resulting in cellular aging. (Credit: Courtesy of Brandon Toyama, Salk Institute for Biological Studies)

The scientists discovered that certain proteins, called extremely long-lived proteins (ELLPs), which are found on the surface of the nucleus of neurons, have a remarkably long lifespan.

While the lifespan of most proteins totals two days or less, the Salk Institute researchers identified ELLPs in the rat brain that were as old as the organism, a finding they reported February 3 in Science.

The Salk scientists are the first to discover an essential intracellular machine whose components include proteins of this age. Their results suggest the proteins last an entire lifetime, without being replaced.

ELLPs make up the transport channels on the surface of the nucleus - gates that control what materials enter and exit. Their long lifespan might be an advantage if not for the wear and tear these proteins experience over time. Unlike other proteins in the body, ELLPs are not replaced when they incur aberrant chemical modifications and other damage.

Damage to the ELLPs weakens the ability of the three-dimensional transport channels that are composed of these proteins to safeguard the cell’s nucleus from toxins, says Martin Hetzer, a professor in Salk’s Molecular and Cell Biology Laboratory, who headed the research. These toxins may alter the cell’s DNA and thereby the activity of genes, resulting in cellular aging.

Funded by the Ellison Medical Foundation and the Glenn Foundation for Medical Research, Hetzer’s research group is the only lab in the world that is investigating the role of these transport channels, called the nuclear pore complex (NPC), in the aging process.

Previous studies have revealed that alterations in gene expression underlie the aging process. But, until the Hetzer lab’s discovery that mammals’ NPCs possess an Achilles’ heel that allows DNA-damaging toxins to enter the nucleus, the scientific community has had few solid clues about how these gene alterations occur.

"The fundamental defining feature of aging is an overall decline in the functional capacity of various organs such as the heart and the brain," says Hetzer. "This decline results from deterioration of the homeostasis, or internal stability, within the constituent cells of those organs. Recent research in several laboratories has linked breakdown of protein homeostasis to declining cell function."

The results that Hetzer and his team report suggest that declining neuron function may originate in ELLPs that deteriorate as a result of damage over time.

"Most cells, but not neurons, combat functional deterioration of their protein components through the process of protein turnover, in which the potentially impaired parts of the proteins are replaced with new functional copies," says Hetzer.

"Our results also suggest that nuclear pore deterioration might be a general aging mechanism leading to age-related defects in nuclear function, such as the loss of youthful gene expression programs," he adds.

The findings may prove relevant to understanding the molecular origins of aging and such neurodegenerative disorders as Alzheimer’s disease and Parkinson’s disease.

In previous studies, Hetzer and his team discovered large filaments in the nuclei of neurons of old mice and rats, whose origins they traced to the cytoplasm. Such filaments have been linked to various neurological disorders including Parkinson’s disease. Whether the misplaced molecules are a cause, or a result, of the disease has not yet been determined.

Also in previous studies, Hetzer and his team documented age-dependent declines in the functioning of NPCs in the neurons of healthy aging rats, which are laboratory models of human biology.

Hetzer’s team includes his colleagues at the Salk Institute as well as John Yates III, a professor in the Department of Chemical Physiology of The Scripps Research Institute.

When Hetzer decided three years ago to investigate whether the NPC plays a role in initiating or contributing to the onset of aging and certain neurodegenerative diseases, some members of the scientific community warned him that such a study was too bold and would be difficult and expensive to conduct. But Hetzer was determined despite the warnings.

Source: ScienceDaily

Feb 4, 2012
#science #neuroscience #psychology #disease
Feb 4, 2012
#science #neuroscience #psychology #brain #brain wave
Human Brains Wire Up Slowly but Surely

by Jon Cohen on 1 February 2012, 6:00 PM

Synaptic division. Compared with chimpanzees, human children slowly wire their brains. Credit: Fotosearch

As the father-to-son exchange in the old Cat Stevens song advised, “take your time, think a lot, … think of everything you’ve got.” Turns out the mellow ’70s folkie had stumbled upon what may explain a key feature of our brains that sets us apart from our closest relatives: We unhurriedly make synaptic connections through much of our early childhoods, and this plasticity enables us to slowly wire our brains based on our experiences. Given that humans and chimpanzees share 98.8% of the same genes, researchers have long wondered what drives our unique cognitive and social skills. Yes, chimpanzees are smart and cooperative to a degree, but we clearly outshine them when it comes to abstract thinking, self-regulation, assimilation of cultural knowledge, and reasoning abilities. Now a study that looks at postmortem brain samples from humans, chimpanzees, and macaques collected from before birth to up to the end of the life span for each of these species has found a key difference in the expression of genes that control the development and function of synapses, the connections among neurons through which information flows.

As researchers describe in a report published online today in Genome Research, they analyzed the expression of some 12,000 genes—part of the so-called transcriptome—from each species. They found 702 genes in the prefrontal cortex (PFC) of humans that had a pattern of expression over time that differed from the two other species. (The PFC plays a central role in social behavior, working toward goals, and reasoning.) By comparison, genes in the chimpanzee PFC at various life stages had only 55 unique expression patterns—12-fold fewer than found in humans.
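The “12-fold” figure follows directly from the two counts just given; a trivial check (variable names are illustrative):

```python
human_specific = 702  # human-specific expression trajectories in the PFC
chimp_specific = 55   # chimpanzee-specific trajectories

fold_difference = human_specific / chimp_specific
print(f"{fold_difference:.1f}-fold")  # 12.8-fold, reported as "12-fold"
```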

The genes the researchers analyzed have myriad functions. But when the researchers created five modules that lumped together genes that were co-expressed, they found that the module in humans that’s most closely tied to synapse formation and function had a “drastically” different developmental trajectory. These genes were turned on high from just after birth until about 5 years of age; the same genes in chimpanzees and macaques began to stop expressing themselves shortly after birth. “We might have discovered one of the differences that makes human brains work differently from chimpanzees and macaques,” says lead researcher Philipp Khaitovich, an evolutionary biologist who works at both the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and the Chinese Academy of Sciences (CAS) in Shanghai, China.

The researchers, including Svante Pääbo of the Leipzig institute and Xiling Liu of CAS, went a step further and actually counted more than 7000 synapses visible in electron micrographs from the three species at different ages. They found that the number of synapses in macaques and chimpanzees skyrocketed shortly after birth but did not peak in humans until about 4 years of age. “Humans have much more time to form synaptic connections,” Khaitovich concludes.

In their analyses, the researchers factored in that humans have much longer life spans than the other species and develop and mature more slowly in general. Their findings still stood out, even when adjusting for this developmental delay.

The work builds on behavioral evidence that showed the advantages of a prolonged childhood, as well as several other studies that have found differences in chimpanzee and human genes involved with synapse formation and function. But no group has ever done such a thorough comparative, longitudinal analysis of the brain transcriptomes of these three species, says Todd Preuss, a neuroscientist at the Yerkes National Primate Research Center in Atlanta. “The whole thing is a technical tour de force,” Preuss says.

Nenad Sestan, a neurobiologist at Yale University who published a comprehensive analysis of the transcriptome of human brains from embryos to late adulthood in the 27 October 2011 issue of Nature, says the new work “is novel and provocative.” Sestan says to clarify differences between the species, the field now needs to examine more brain regions “to have a clearer idea of how specific this may be to the dorsolateral prefrontal cortex.”

The findings from Khaitovich and colleagues promise to spark future studies that address profound questions about everything from evolution to gene regulation. For example, they suggest in their report that the differences they found may also separate us from Neandertals, as evidence suggests that these extinct humans had faster cranial and dental development than modern humans.

Neurologist Eric Courchesne of the University of California, San Diego, says the new findings also mesh with his own studies of autism and brain overgrowth. Courchesne has found that the brains of autistic children grow more quickly than normal, which he theorizes prevents them from having enough experiences to properly wire neurons. “This is an absolutely fascinating study that will have great importance for advancing understanding of human disorders of early brain development as well as illuminating the evolutionary changes in neural development,” Courchesne says.

Source: ScienceNow

Feb 4, 2012 · 1 note
#science #neuroscience #psychology #brain
New procedure repairs severed nerves in minutes, restoring limb use in days or weeks

February 3rd, 2012 in Neuroscience 

American scientists believe a new procedure to repair severed nerves could result in patients recovering in days or weeks, rather than months or years. The team used a cellular mechanism similar to that used by many invertebrates to repair damage to nerve axons. Their results are published today in the Journal of Neuroscience Research.

"We have developed a procedure which can repair severed nerves within minutes so that the behavior they control can be partially restored within days and often largely restored within two to four weeks," said Professor George Bittner from the University of Texas. "If further developed in clinical trials this approach would be a great advance on current procedures that usually imperfectly restore lost function within months at best."

The team studied the mechanisms all animal cells use to repair damage to their membranes and focused on invertebrates, which have a superior ability to regenerate nerve axons compared to mammals. An axon is a long extension arising from a nerve cell body that communicates with other nerve cells or with muscles.

This research success arises from Bittner’s discovery that nerve axons of invertebrates which have been severed from their cell body do not degenerate within days, as happens with mammals, but can survive for months, or even years.

The severed proximal nerve axon in invertebrates can also reconnect with its surviving distal nerve axon to produce much quicker and much better restoration of behaviour than occurs in mammals.

"Severed invertebrate nerve axons can reconnect proximal and distal ends of severed nerve axons within seven days, allowing a rate of behavioural recovery that is far superior to mammals," said Bittner. "In mammals the severed distal axonal stump degenerates within three days and it can take nerve growths from proximal axonal stumps months or years to regenerate and restore use of muscles or sensory areas, often with less accuracy and with much less function being restored."

The team described their success in applying this process to rats in two research papers published today. The team were able to repair severed sciatic nerves in the upper thigh, with results showing the rats were able to use their limb within a week and had much function restored within 2 to 4 weeks, in some cases to almost full function.

"We used rats as an experimental model to demonstrate how severed nerve axons can be repaired. Without our procedure, the return of nearly full function rarely comes close to happening," said Bittner. "The sciatic nerve controls all muscle movement of the leg of all mammals and this new approach to repairing nerve axons could almost certainly be just as successful in humans."

To explore the long-term implications and medical uses of this procedure, physicians and other scientist-collaborators at Harvard Medical School and Vanderbilt Medical School and Hospitals are conducting studies to obtain approval to begin clinical trials.

"We believe this procedure could produce a transformational change in the way nerve injuries are repaired," concluded Bittner.

Provided by Wiley

"New procedure repairs severed nerves in minutes, restoring limb use in days or weeks." February 3rd, 2012. http://medicalxpress.com/news/2012-02-procedure-severed-nerves-minutes-limb.html

Feb 4, 2012 · 4 notes
#science #neuroscience #psychology
Renowned physicist invents microscope that can peer at living brain cells

February 3, 2012

Schematic drawing of the upright STED microscope used for the experiments. Image: Science, DOI:10.1126/science.1215369

(PhysOrg.com) — Ever since scientists began studying the brain, they’ve wanted to get a better look at what was going on. Researchers have poked and prodded and looked at dead cells under electron microscopes, but never before have they been able to get high resolution microscopic views of actual living brain cells as they function inside of a living animal. Now, thanks to work by physicist Stefan Hell and his colleagues at the Max Planck Institute in Germany, that dream is realized. In a paper published in Science, Hell and his team describe the workings of their marvelous discovery.

Hell (which in German means “bright”) and others at the Institute have been working for years on ultra high resolution microscopes that go by the name “stimulated emission depletion” or STED microscopes. Now, they’ve taken their work to a whole new level by cutting away a small portion of a mouse’s skull and replacing it with a glass window and then placing their latest STED microscope against the glass to peer inside. To make it easier to see what is what, the team first genetically altered the mouse to make certain brain cells fluorescent. Then, to allow for focusing exclusively on just those cells that are lit up, they added software to the microscope to blot out anything that was not lit up. The result is super high resolution real time imagery of the neurons that exist on the exterior part of a living mouse brain. 

(video)

STED time-lapse recording of a single spine at an interval of 10 seconds. The measurement includes 128 z-stacks consisting of 5 slices each. Most of the rapid remodeling of the spine head appears continuous and smooth at this frame rate. No damage is observed at the dendrite or the spine after recording a total of 640 slices. The movie was acquired in a different experiment than the spines in Fig.1. Scale bar = 1µm. Video: DOI:10.1126/science.1215369

The new microscope provides clear resolution down to 70 nanometers, four times finer than ever achieved before, and is enough to allow scientists to watch the actual movement of dendritic spines, which may help researchers understand why they move.
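For context on that resolution figure: STED sharpens the Abbe diffraction limit by the standard factor sqrt(1 + I/I_sat), where I is the depletion-beam intensity and I_sat the dye's saturation intensity. The numbers below are illustrative only, not taken from the paper.

```python
import math

def sted_resolution(wavelength_nm, na, saturation_factor):
    """STED resolution: d = wavelength / (2 * NA * sqrt(1 + I/I_sat))."""
    return wavelength_nm / (2 * na * math.sqrt(1 + saturation_factor))

# Illustrative values (not from the paper): 600 nm light, NA 1.3 objective.
lam, na = 600.0, 1.3
print(f"diffraction limit: ~{sted_resolution(lam, na, 0):.0f} nm")
for s in (1, 10, 100):
    print(f"I/I_sat = {s:>3}: ~{sted_resolution(lam, na, s):.0f} nm")
```

With these made-up numbers, driving the depletion beam to about ten times saturation already pushes a roughly 230 nm diffraction limit down to about 70 nm.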

It is likely that researchers will find many varied uses for the new microscope. One prominent area will almost certainly involve looking into what psychiatric drugs are really doing within synapses, perhaps leading to breakthroughs in pharmaceutical drugs that are better able to target specific illnesses.

One downside to any new scientific breakthrough, however, is the natural tendency of many to move from excitement to wondering what will come next. In this case, Hell and his team have already started contemplating ways to allow researchers to study any cell in the living brain at such high resolution, not just those that lie on the surface.

More information: Nanoscopy in a Living Mouse Brain, Science 3 February 2012: Vol. 335 no. 6068 p. 551. DOI: 10.1126/science.1215369

"Renowned physicist invents microscope that can peer at living brain cells." February 3rd, 2012. http://www.physorg.com/news/2012-02-renowned-physicist-microscope-peer-brain.html

Feb 4, 2012
#brain #science #neuroscience #psychology #physics
Feb 3, 2012 · 1 note
#placebo #placebo effect #brain
Placebo Effect: New Study Shows How to Boost the Power of Pain Relief, Without Drugs

ScienceDaily (Feb. 3, 2012) — Placebos reduce pain by creating an expectation of relief. Distraction — say, doing a puzzle — relieves it by keeping the brain busy. But do they use the same brain processes? Neuroimaging suggests they do. When applying a placebo, scientists see activity in the dorsolateral prefrontal cortex. That’s the part of the brain that controls high-level cognitive functions like working memory and attention — which is what you use to do that distracting puzzle.

Now a new study challenges the theory that the placebo effect is a high-level cognitive function. The authors — Jason T. Buhle, Bradford L. Stevens, and Jonathan J. Friedman of Columbia University and Tor D. Wager of the University of Colorado Boulder — reduced participants’ pain in two ways: by giving them a placebo, or by occupying them with a difficult memory task. But when they put the two together, “the level of pain reduction that people experienced added up. There was no interference between them,” says Buhle. “That suggests they rely on separate mechanisms.” The findings, published in Psychological Science, a journal of the Association for Psychological Science, could help clinicians maximize pain relief without drugs.

In the study, 33 participants came in for three separate sessions. In the first, experimenters applied heat to the skin with a little metal plate and calibrated each individual’s pain perceptions. In the second session, some of the people applied an ordinary skin cream they were told was a powerful but safe analgesic. The others put on what they were told was a regular hand cream. In the placebo-only trials, participants stared at a cross on the screen and rated the pain of numerous applications of heat — the same level, though they were told it varied. For other trials they performed a tough memory task — distraction and placebo simultaneously. For the third session, those who’d had the plain cream got the “analgesic” and vice versa. The procedure was the same.

The results: With either the memory task or the placebo alone, participants felt less pain than during the trials when they just stared at the cross. Together, the two effects added up; they didn’t interact or interfere with each other. The data suggest that the placebo effect does not require executive attention or working memory.
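The additivity logic is the crux of the design: if placebo and distraction shared a mechanism, their combined effect should fall short of the sum of their separate effects (a negative interaction term). A simulated 2x2 version of that check, with made-up effect sizes rather than anything from the actual data, looks like this:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000                                         # simulated trials per condition
base, placebo, distraction = 60.0, -8.0, -10.0   # made-up effect sizes

ratings, design = [], []
for p in (0, 1):
    for d in (0, 1):
        mu = base + p * placebo + d * distraction   # purely additive
        ratings.append(mu + rng.normal(0, 5, n))
        design.append(np.tile([1, p, d, p * d], (n, 1)))
y = np.concatenate(ratings)
X = np.vstack(design)

# Least squares fit: intercept, placebo, distraction, interaction.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 1))  # interaction term (last) should sit near zero
```

An interaction coefficient near zero, as in the study's data, is what licenses the conclusion that the two pain-relief routes operate independently.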

So what about that neuroimaging? “Neuroimaging is great,” says Buhle, “but because each brain region does many things, when you see activation in a particular area, you don’t know what cognitive process is driving it.” This study tested the theory about how placebos work with direct behavioral observation.

The findings are promising for pain relief. Clinicians use both placebos and distraction — for instance, virtual reality in burn units. But they weren’t sure if one might diminish the other’s efficacy. “This study shows you can use them together,” says Buhle, “and get the maximum bang for your buck without medications.”

Source: ScienceDaily

Feb 3, 2012 · 1 note
#science #neuroscience #psychology #placebo
Schizophrenia: When Hallucinatory Voices Suppress Real Ones, New Electronic Application May Help

ScienceDaily (Feb. 3, 2012) — When a patient afflicted with schizophrenia hears inner voices something is taking place inside the brain that prevents the individual from perceiving real voices. A simple electronic application may help the patient learn to shift focus.

 

Image captures of the brain show how neurons are activated in healthy control subjects when hearing actual voices (top row) whereas activation fails to occur in patients who experience auditory hallucinations. (Credit: Kenneth Hugdahl)

"The patient experiences the inner voices as 100 per cent real, just as if someone was standing next to him and speaking" explains Professor Kenneth Hugdahl of the University of Bergen. "At the same time, he can’t hear voices of others actually present in the same room."

Auditory hallucinations are one of the most common symptoms associated with schizophrenia.

Neural activity ceases

Dr Hugdahl’s research group has made use of a variety of neuroimaging techniques, including functional magnetic resonance imaging (fMRI), to see quite literally what happens inside the brain when the inner voices make their presence known. The project received funding under the NevroNor national initiative on neuroscientific research, administered under the auspices of the Research Council of Norway.

Images of patients’ brains reveal a spontaneous activation of neurons in a particular area of the brain — specifically the rear, upper region of the left temporal lobe. This is the area responsible for speech perception, and when healthy people hear speech it becomes activated. So what happens when patients with schizophrenia hear a real voice and a hallucinatory one at the same time?

"It would be natural to assume that neural activity would increase somewhat — even twofold. But quite the opposite takes place; we actually observed that the activity ceased altogether," states Professor Hugdahl.

Losing contact with the outside world

In order to learn more about what was happening, Hugdahl and his colleagues Kristiina Kompus and René Westerhausen carried out a meta-analysis of 23 studies. These studies focused either on spontaneous inner-voice triggered neural activation in subjects with schizophrenia or the stimulatory reaction prompted by actual sounds in both healthy and schizophrenic subjects.

It emerged that many researchers had observed either that a spontaneous activation of neurons occurs in patients hearing inner voices or that the patients’ perception of actual voices becomes suppressed when these are heard simultaneously with inner voices. No one had seen the connection between these findings.

"Previously, we thought these were two separate phenomena. But our analyses revealed that the one causes the other: when neurons become activated by inner voices it inhibits perception of outside speech. The neurons become ‘preoccupied’ and can’t ‘process’ voices from the outside," explains Professor Hugdahl.

"This may explain why schizophrenic patients close themselves off so completely and lose touch with the outside world when experiencing hallucinations," he suggests.

Electronic app designed to improve impulse control

Hugdahl and his colleagues made yet another discovery that may well help explain how the lives of these individuals become consumed by inner voices. It turns out that the frontal lobe in the brains of schizophrenia patients does not function exactly the way it should. As a result, these patients have a lesser degree of impulse control and are unable to filter out their inner voices.

"Every one of us hears inner voices or melodies from time to time. The difference between non-afflicted individuals and schizophrenia patients is that the former manage to tune these out better," the professor points out.

If patients could learn to stifle inner noise it could have a huge impact on our ability to treat schizophrenia, he states. To this end, Professor Hugdahl’s research group has developed an application that can be used on mobile phones and other simple electronic devices, to help patients improve their filters.

Wearing headphones, the patient is exposed to simple speech sounds with different sounds played in each ear. The task is to practice hearing the sound in one ear while blocking out sound in the other. The application has only been tested on two patients with schizophrenia so far. The response from these patients is promising, Dr Hugdahl relates.

"The voices are still there, but the test subjects feel that they have control over the voices instead of the other way around. The patient feels it is a breakthrough since it means he can actively shift his focus from the inner voices over to the sounds coming from the outside," the professor explains.
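The task described above, a different sound delivered to each ear, is a classic dichotic-listening setup and is easy to prototype. The sketch below only builds the stereo stimulus as a NumPy array; the tone frequencies and duration are arbitrary choices, not details of Hugdahl's application, and actual playback would need an audio library.

```python
import numpy as np

def dichotic_pair(freq_left, freq_right, seconds=1.0, rate=44100):
    """Two different pure tones, one per ear, as a stereo (N, 2) float array."""
    t = np.arange(int(seconds * rate)) / rate
    left = np.sin(2 * np.pi * freq_left * t)
    right = np.sin(2 * np.pi * freq_right * t)
    return np.column_stack([left, right]).astype(np.float32)

# e.g. a 440 Hz tone in the left ear, 550 Hz in the right
stim = dichotic_pair(440, 550)
print(stim.shape)  # (44100, 2)
```

The training exercise would then consist of attending to one channel while ignoring the other, with speech sounds rather than pure tones in the real application.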

Source: ScienceDaily

Feb 3, 2012 · 7 notes
#science #neuroscience #psychology #brain #schizophrenia
Noise Exposure Can Cause Long-Lasting Changes To Sensory Pathways; Touch-Sensing Nerve Cells May Lead To Future Tinnitus Treatments

Article Date: 03 Feb 2012 - 0:00 PST

We all know that it can take a little while for our hearing to bounce back after listening to our iPods too loud or attending a raucous concert. But new research at the University of Michigan Health System suggests over-exposure to noise can actually cause more lasting changes to our auditory circuitry - changes that may lead to tinnitus, commonly known as ringing in the ears.

U-M researchers previously demonstrated that after hearing damage, touch-sensing “somatosensory” nerves in the face and neck can become overactive, seeming to overcompensate for the loss of auditory input in a way the brain interprets - or “hears” - as noise that isn’t really there.

The new study, which appears in The Journal of Neuroscience, found that somatosensory neurons maintain a high level of activity following exposure to loud noise, even after hearing itself returns to normal.

The findings were made in guinea pigs, but mark an important step toward potential relief for people plagued by tinnitus, says lead investigator Susan E. Shore, Ph.D., of U-M’s Kresge Hearing Research Institute and a professor of otolaryngology and molecular and integrative physiology at the U-M Medical School.

“The animals that developed tinnitus after a temporary loss in their hearing after loud noise exposure were the ones who had sustained increases in activity in these neural pathways,” Shore says. “In the future it may be possible to treat tinnitus patients by dampening the hyperactivity by reprogramming these auditory-touch circuits in the brain.”

In normal hearing, a part of the brain called the dorsal cochlear nucleus is the first stop for signals arriving from the ear via the auditory nerve. But it’s also a hub where “multitasking” neurons process other sensory signals, such as touch, together with hearing information.

During hearing loss, the other sensory signals entering the dorsal cochlear nucleus are amplified, Shore’s earlier research found. This overcompensation by the somatosensory neurons, which carry information about touch, vibration, skin temperature and pain, is believed to fuel tinnitus in many cases.

Tinnitus affects up to 50 million people in the United States and millions more worldwide, according to the American Tinnitus Association. It can range from intermittent and mildly annoying to chronic, severe and debilitating. There is no cure.

It especially affects baby boomers, who, as they reach an age at which hearing tends to diminish, increasingly find that tinnitus moves in. The condition most commonly occurs with hearing loss, but can also follow head and neck trauma, such as after an auto accident, or dental work. Tinnitus is the number one disability afflicting members of the armed forces.

The involvement of touch sensing (or “somatosensory”) nerves in the head and neck explains why many tinnitus sufferers can change the volume and pitch of the sound by clenching their jaw, or moving their head and neck, Shore explains.

While the new study builds on previous discoveries by Shore and her team, many aspects are new.

“This is the first research to show that, in the animals that developed tinnitus after hearing returned to normal, increased excitation from the somatosensory nerves in the head and neck continued. This dovetails with our previous research, which suggests this somatosensory excitation is a major component of tinnitus,” says Shore, who serves on the scientific advisory committee of the American Tinnitus Association.

“The better we understand the underlying causes of tinnitus, the better we’ll be able to develop new treatments,” she adds.

Source: Medical News Today 

Feb 3, 2012 · 30 notes
#science #neuroscience #psychology #ear #tinnitus