Neuroscience


Synchronized Brains: Feeling Strong Emotions Makes People's Brains 'Tick Together'

ScienceDaily (May 24, 2012) — Experiencing strong emotions synchronizes brain activity across individuals, a research team at Aalto University and Turku PET Centre in Finland has revealed.


Experiencing strong emotions synchronizes brain activity across individuals. (Credit: Image courtesy of Aalto University)

Human emotions are highly contagious. Seeing others’ emotional expressions, such as smiles, often triggers a corresponding emotional response in the observer. Such synchronization of emotional states across individuals may support social interaction: when all group members share a common emotional state, their brains and bodies process the environment in a similar fashion.

Researchers at Aalto University and Turku PET Centre have now found that feeling strong emotions makes different individuals’ brain activity literally synchronous.

The results revealed that feeling strong unpleasant emotions in particular synchronized the brain’s emotion-processing networks in the frontal and midline regions. By contrast, experiencing highly arousing events synchronized activity in the networks supporting vision, attention and the sense of touch.

"Sharing others’ emotional states provides the observers with a somatosensory and neural framework that facilitates understanding others’ intentions and actions and allows them to ‘tune in’ or ‘sync’ with them. Such automatic tuning facilitates social interaction and group processes," says Adjunct Professor Lauri Nummenmaa from Aalto University, Finland.

"The results have major implications for current neural models of human emotions and group behavior. They also deepen our understanding of mental disorders involving abnormal socioemotional processing," Nummenmaa says.

Participants’ brain activity was measured with functional magnetic resonance imaging while they were viewing short pleasant, neutral and unpleasant movies.

Source: Science Daily

May 25, 2012 · 668 notes
#science #neuroscience #brain #psychology
Protein Necessary for Behavioral Flexibility Discovered

ScienceDaily (May 24, 2012) — Researchers have identified a protein necessary to maintain behavioral flexibility, which allows us to modify our behaviors to adjust to circumstances that are similar, but not identical, to previous experiences. Their findings, which appear in the journal Cell Reports, may offer new insights into addressing autism and schizophrenia — afflictions marked by impaired behavioral flexibility.

Our stored memories from previous experiences allow us to repeat certain tasks. For instance, after driving to a particular location, we recall the route the next time we make that trip. However, sometimes circumstances change — one road on the route is temporarily closed — and we need to make adjustments to reach our destination. Our behavioral flexibility allows us to make such changes and, then, successfully complete our task. It is driven, in part, by protein synthesis, which produces experience-dependent changes in neural function and behavior.

However, this process is impaired for many, preventing an adjustment in behavior when faced with different circumstances. In the Cell Reports study, the researchers sought to understand how protein synthesis is regulated during behavioral flexibility.

To do so, they focused on the kinase PERK, an enzyme that regulates protein synthesis. PERK is known to modify eIF2α, a factor that is required for proper protein synthesis. Their experiments involved comparing normal lab mice, which possessed the enzyme, with those that lacked it.

In their study, the mice were required to navigate a water maze, finding and climbing onto a platform to get out of the water. Both normal mice and those lacking PERK learned to complete this task.

However, in a second step, the researchers tested the mice’s behavioral flexibility by moving the maze’s platform to another location, thereby requiring them to respond to a change in the terrain. Here, the normal mice located the platform, but those lacking PERK were unable to do so or took significantly more time to complete the task.

A second experiment offered a different test of the role of PERK in aiding behavioral flexibility. In this measure, both normal and mutant mice heard an audible tone that was followed by a mild foot shock. At this stage, all of the mice developed a normal fear response — freezing at the tone in anticipation of the foot shock. However, the researchers subsequently removed the foot shock from the procedure and the mice heard only the tone. Eventually, the normal mice adjusted their responses so they did not freeze after hearing the tone. However, the mutant mice continued to respond as if they expected a foot shock to follow.

The researchers sought additional support for their conclusion that the absence of PERK may contribute to impaired behavioral flexibility in human neurological disorders. To do so, they conducted postmortem analyses of human frontal cortex samples from patients afflicted with schizophrenia, who often exhibit behavioral inflexibility, and unaffected individuals. The samples from the control group showed normal levels of PERK while those from the schizophrenic patients had significantly reduced levels of the protein.

"A rapidly expanding list of neurological disorders and neurodegenerative diseases, including Alzheimer’s disease, Parkinson’s disease, and Fragile X syndrome, have already been linked to aberrant protein synthesis," explained Eric Klann, a professor in NYU’s Center for Neural Science and one of the study’s co-authors. "Our results show the significance of PERK in maintaining behavioral flexibility and how its absence might be associated with schizophrenia. Further studies clarifying the specific role of PERK-regulated protein synthesis in the brain may provide new avenues to tackle such widespread and often debilitating neurological disorders."

Source: Science Daily

May 25, 2012 · 19 notes
#science #neuroscience #brain #psychology
Boundary stops molecule right where it needs to be

May 24, 2012

A molecule responsible for the proper formation of a key portion of the nervous system finds its way to the proper place not because it is actively recruited, but instead because it can’t go anywhere else.

Researchers at Baylor College of Medicine have identified a distal axonal cytoskeleton as the boundary that ensures AnkyrinG clusters where it needs to in order to function properly.

The findings appear in the current edition of Cell.

"It has been known that AnkyrinG is needed for the axon initial segment to form. Without the axon initial segment there would be no output of information within the nervous system,” said Dr. Matthew Rasband, associate professor of neuroscience at BCM. “Every known protein found at the axon initial segment depends on AnkyrinG, so if it is eliminated then the axon initial segment doesn’t form and the neuron doesn’t fire.”

To answer the question of how AnkyrinG gets to where it needs to be for proper function, Rasband, along with first author Dr. Mauricio Galiano, postdoctoral associate in neuroscience at BCM, and colleagues, began by analyzing how the axon initial segment forms. They found that AnkyrinG always appeared in exactly the same spot during development.

"It would start to enter into the axon and then it was almost as if it hit a wall and couldn’t go any further," Rasband said. "We would see it stop very close to the cell body and then it would backfill. This showed us that there was some type of boundary or barrier marking that area."

To further study the properties of the boundary they began to look at ways they could disrupt or move it to test the effects of AnkyrinG clustering in different areas.

In cell culture and mouse models, they were able to move the boundary to different distances along the axon, which allowed the researchers to change the length of the axon initial segment. If the boundary was farther from the cell body, the segment was longer; if it was closer to the cell body, the segment was shorter.

When researchers removed the boundary altogether, AnkyrinG would not cluster in the appropriate area and the axon initial segment would not form.

"We had anticipated there was a kind of molecule that recruited AnkyrinG, but instead we found a barrier that excludes it," Rasband said. "These results have important implications because they imply a similar exclusion mechanism might be functioning not only at the axon initial segment, but at all of the places where AnkyrinG is found."

Rasband said that in many disorders, such as autism and epilepsy, the proteins AnkyrinG is responsible for organizing are disrupted, so understanding how this molecule functions properly could one day help identify treatment targets for these diseases.

Provided by Baylor College of Medicine

Source: medicalxpress.com

May 24, 2012 · 7 notes
#science #neuroscience #brain #psychology
Locating ground zero: How the brain's emergency workers find the disaster area

May 24, 2012

Like emergency workers rushing to a disaster scene, cells called microglia speed to places where the brain has been injured, to contain the damage by ‘eating up’ any cellular debris and dead or dying neurons. Scientists at the European Molecular Biology Laboratory (EMBL) in Heidelberg, Germany, have now discovered exactly how microglia detect the site of injury, thanks to a relay of molecular signals. Their work, published today in Developmental Cell, paves the way for new medical approaches to conditions where microglia’s ability to locate hazardous cells and material within the brain is compromised.


Microglia (green) move to the site of injury (arrow) to clear up debris. Credit: Copyright EMBL/Peri

"Considering that they help keep our brain healthy, we know surprisingly little about microglia," says Francesca Peri, who led the work. "Now, for the first time, we’ve identified the mechanism that allows microglia to detect brain injury, and how that emergency call is transmitted from neuron to neuron.”


When microglia (green) cannot detect ATP (bottom), they don’t move to the injury site as they usually would (top). Credit: Copyright EMBL/Peri

When an emergency occurs, cries can alert bystanders, who will dial the emergency number. A call will go out over the radio, and ambulances, police or fire engines in the area will respond as needed. In the brain, Peri and colleagues found, injured neurons send out their own distress cry: they release a molecule called glutamate. Neighbouring neurons sense that glutamate and respond by taking up calcium. As glutamate spreads out from the injury site, this creates a wave of calcium uptake. Along that wave, as neurons take up calcium they release a third molecule, called ATP. When the wave comes within reach, a microglial cell detects that ATP and takes it as a call to action, moving in that direction – essentially tracing the wave backwards until it reaches the injury.

Scientists knew already that microglia can detect ATP, but this molecule doesn’t last long outside of cells, so there were doubts about how ATP alone could be a signal that carried far enough to reach microglia located far from the site of injury. The trick, as Peri and colleagues discovered, is the long-lasting glutamate-driven calcium wave that can travel the length of the brain. Thanks to this wave, the ATP signal is not just emitted by the injured cells, but is repeatedly sent out by the neurons along the way, until it reaches microglia.

Dirk Sieger and Christian Moritz in Peri’s lab took advantage of the fact that zebrafish have transparent heads, which allow scientists to peer down a microscope straight into the fish’s brain. They used a laser to injure a few of the fish’s brain cells, and watched fluorescently-labelled microglia move in on the injury. When they genetically engineered zebrafish to make neurons’ calcium levels traceable under the microscope, too, the scientists were able to confirm that when the calcium wave reached microglia, these cells immediately started moving toward the injury.

Knowing all the steps in this process, and how they feed into each other, could help in designing treatments to improve microglia’s detection ability, which goes awry in conditions such as Alzheimer’s and Parkinson’s diseases.

Provided by European Molecular Biology Laboratory

Source: medicalxpress.com

May 24, 2012 · 12 notes
#science #neuroscience #brain #psychology
Persistent sensory experience is good for aging brain

May 24, 2012

Despite a long-held scientific belief that much of the wiring of the brain is fixed by the time of adolescence, a new study shows that changes in sensory experience can cause massive rewiring of the brain, even as one ages. In addition, the study found that this rewiring involves fibers that supply the primary input to the cerebral cortex, the part of the brain that is responsible for sensory perception, motor control and cognition. These findings promise to open new avenues of research on brain remodeling and aging.

Published in the May 24, 2012 issue of Neuron, the study was conducted by researchers at the Max Planck Florida Institute (MPFI) and at Columbia University in New York.

"This study overturns decades-old beliefs that most of the brain is hard-wired before a critical period that ends when one is a young adult," said MPFI neuroscientist Marcel Oberlaender, PhD, first author on the paper. "By changing the nature of sensory experience, we were able to demonstrate that the brain can rewire, even at an advanced age. This may suggest that if one stops learning and experiencing new things as one ages, a substantial amount of connections within the brain may be lost."

The researchers conducted their study by examining the brains of older rats, focusing on an area of the brain known as the thalamus, which processes and delivers information obtained from sensory organs to the cerebral cortex. Connections between the thalamus and the cortex have been thought to stop changing by early adulthood, but this was not found to be the case in the rodents studied.

Being nocturnal animals, rats mainly rely on their whiskers as active sensory organs to explore and navigate their environment. For this reason, the whisker system is an ideal model for studying whether the brain can be remodeled by changing sensory experience. By simply trimming the whiskers, and preventing the rats from receiving this important and frequent form of sensory input, the scientists sought to determine whether extensive rewiring of the connections between the thalamus and cortex would occur.

On examination, they found that the animals with trimmed whiskers had altered axons, nerve fibers along which information is conveyed from one neuron (nerve cell) to many others; those whose whiskers were not trimmed had no axonal changes. Their findings were particularly striking as the rats were considered relatively old – meaning that this rewiring can still take place at an age not previously thought possible. Also notable was that the rewiring happened rapidly – in as little as a few days.

"We’ve shown that the structure of the rodent brain is in constant flux, and that this rewiring is shaped by sensory experience and interaction with the environment," said Dr. Oberlaender. "These changes seem to be life-long and may pertain to other sensory systems and species, including people. Our findings open the possibility of new avenues of research on development of the aging brain using quantitative anatomical studies combined with noninvasive imaging technologies suitable for humans, such as functional MRI (fMRI)."

The study was possible due to recent advances in high-resolution imaging and reconstruction techniques, developed in part by Dr. Oberlaender at MPFI. These novel methods enable researchers to automatically and reliably trace the fine and complex branching patterns of individual axons, with typical diameters less than a thousandth of a millimeter, throughout the entire brain.

Provided by Tartaglia Communications

Source: medicalxpress.com

May 24, 2012 · 16 notes
#science #neuroscience #brain #psychology
The auditory cortex adapts agilely when we concentrate

May 24, 2012

How sensory perception arises in the human cerebral cortex has yet to be fully explained. The different areas of the cortex function in cooperation, and no perception is the outcome of a single area working alone. In his doctoral dissertation for the Department of Biomedical Engineering and Computational Science at Aalto University, Jaakko Kauramäki shows that the auditory cortex is not left to its own devices.

Kauramäki’s dissertation in the field of cognitive neuroscience studied neural top-down processes, that is, the ways the brain as a system handles sounds arriving at the auditory cortex in the temporal lobes.

Bottom-up processes, by contrast, move from parts towards a whole, analysing a sound in a hierarchical chain of stages that build small, fine-grained features into a unified auditory sensation.

"The operation of the system as a whole can be affected by focusing on a specific task or sound. In my research I focused precisely on how the top-down effects manifest themselves on the auditory cortex," Kauramäki explains.

The right kind of noise promotes concentration and reinforces perception?

Kauramäki studied the auditory cortex in two separate tasks: responses shaped by selective attention during sound recognition, and responses during lipreading. He recorded the electrical and magnetic activity of the cortex using electroencephalography (EEG) and magnetoencephalography (MEG), respectively.

"Forty years ago a so-called ‘gain effect’ was formulated: focusing attention enhances responses on the auditory cortex, which means that attention helps us perceive audio stimuli better," says Kauramäki.

In the attention tests, Kauramäki masked the sounds played for the test subjects with different frequencies of noise – and made a discovery. During periods of selective attention, the enhanced responses on the auditory cortex depended on the type of noise used: the frequency content of the noise affected the prominence of the responses. The responses are not only enhanced; they are feature- and task-specific.

"Similar results have not been obtained earlier because the stimuli used in the experiments have been too simple. The noise mask added a combinatory effect that brought the specificity and selectivity of the responses to the fore."

"Focusing attention may then be easier in a rich sound environment. Complete silence is of course an extreme case, but in total silence the auditory cortex begins to create connections out of thin air, to make up sensory perceptions."

"Then again, the more stimuli there are in the environment, the harder it becomes to focus. In attention disorders such as ADHD, it may be precisely this top-down ability to filter sounds that is lacking," Kauramäki suspects.

In the lipreading tasks, Kauramäki did not encounter such a frequency dependency. Instead, lipreading suppressed the auditory cortex’s responses. The reason for this is the neural response of the speech production system.

"The suppressing effect is caused by the adaptation of the areas on the auditory cortex that specialise in speech. Suppressing occurs even when the speech is inaudible – the articulatory gestures of the mouth alone activate parts of the auditory cortex."

For Kauramäki the result suggests that the neural responses of the speech production system can reach the auditory cortex and thus reinforce perception.

"In noisy meetings, for example, it pays off to concentrate on the face of whoever is speaking: lipreading helps in the processing. It may suppress the reaction of the auditory cortex, but the big picture becomes clearer."

Provided by Aalto University

Source: medicalxpress.com

May 24, 2012 · 10 notes
#science #neuroscience #brain #psychology
World's biggest stroke clot-buster trial reveals patient benefits

May 24, 2012

(Medical Xpress) — Patients given a clot-busting drug within six hours of a stroke are more likely to make a better recovery than those who do not receive the treatment, new research has found.

The trial was set up in 2000 by the University of Sydney’s Professor Richard Lindley, while he was employed at the University of Edinburgh.

The study of more than 3000 patients is the world’s largest trial of the drug rt-PA and was coordinated at the University of Edinburgh. Since coming to Sydney Medical School in 2003, Professor Lindley has continued as the co-principal investigator of the research.

The findings of the study are published today in The Lancet, alongside an analysis of all other trials of the drug carried out in the past 20 years.

The trial found that following treatment with the drug rt-PA, which is given intravenously to patients who have suffered an acute ischaemic stroke, more patients were able to look after themselves.

"The trial results, together with the updated review, mean that rt-PA can now be offered to a much wider group of patients presenting with stroke", Professor Lindley said.

A patient’s chances of making a complete recovery within six months of a stroke were also increased.

An ischaemic stroke happens when the brain’s blood supply is interrupted by a blood clot. The damage caused can be permanent or fatal.

Researchers now know that for every 1000 patients given rt-PA within three hours of stroke, 80 more will survive and live without help from others than if they had not been given the drug.
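That 80-per-1000 figure can be restated as a "number needed to treat". The sketch below is purely illustrative arithmetic, not the trial's statistical analysis; the function names are our own:

```python
# Hypothetical sketch of the "number needed to treat" arithmetic implied by
# the trial figure: 80 additional independent survivors per 1000 treated.
# These helper names are illustrative, not from the study itself.

def absolute_risk_reduction(extra_good_outcomes: int, patients_treated: int) -> float:
    """Fraction of treated patients who gain an additional good outcome."""
    return extra_good_outcomes / patients_treated

def number_needed_to_treat(arr: float) -> float:
    """Patients who must be treated for one additional good outcome."""
    return 1.0 / arr

arr = absolute_risk_reduction(80, 1000)   # 0.08
nnt = number_needed_to_treat(arr)         # 12.5

print(f"ARR = {arr:.2f}, NNT = {nnt:.1f}")
```

On these numbers, roughly one extra patient survives independently for every 12 to 13 treated within three hours.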

The benefits of using rt-PA come at a price, the researchers say. Patients face an increased risk of death within seven days of treatment because the drug can cause a secondary bleed in the brain. Despite the risks, the research team concluded that the benefits were seen in a wide variety of patients.

Stroke experts stress that these mortality figures need to be viewed in the context of deaths from stroke. Without treatment, one third of people who suffer a stroke die, with another third left permanently dependent and disabled.

Researchers say the threat of death and disability means many stroke patients are prepared to take the early risks of being treated with rt-PA to avoid being disabled.

The authors conclude that for those who do not experience bleeding, the drug improves patients’ longer term recovery.

About half of those who took part in the trial were over 80.

"The trial underlines the benefits of treating patients with the drug as soon as possible and provides the first reliable evidence that treatment is effective for those aged 80 and over," Professor Lindley said.

The study also found no reason to restrict use of rt-PA - also known as alteplase - on the basis of how severe a patient’s stroke has been.

Chief investigator Professor Peter Sandercock of the University of Edinburgh’s Centre for Clinical Brain Sciences said: “Our trial shows that it is crucial that treatment is given as fast as possible to all suitable patients.”

Provided by University of Sydney

Source: medicalxpress.com

May 24, 2012 · 7 notes
#science #neuroscience #brain #psychology #stroke
Genetic 'reset switch' enables signaling pathway to induce multiple developmental outcomes for olfactory neurons

May 24, 2012

Within the nervous system, a handful of signaling pathways modulate development of a cornucopia of different neuronal subtypes. “Even small alterations in neuron differentiation pathways can disrupt subsequent circuit organization and catalyze the genesis of neurological disorders,” explains Adrian Moore of the RIKEN Brain Science Institute in Wako.


Figure 1: Interplay between Notch signaling and Hamlet activity gives rise to diverse olfactory receptor neurons (ORNs), each with distinct structures and subsets of olfactory receptors (left). The precursor cell (right) divides to yield two daughter cells, one of which undergoes Notch (N)-mediated gene activation. Hamlet (Ham) subsequently resets Notch’s genetic effects, and the absence or subsequent restoration of Notch signaling determines which type of ORN (Naa or Nab) will result from differentiation. Credit: 2012 Adrian Moore, RIKEN Brain Science Institute

Recent work from Moore’s team, which includes Keita Endo of the University of Tokyo, has revealed mechanisms governing this complexity in the fruit fly olfactory system. Within the antennae—the fly equivalent of the nose—it was known that cells called neuronal precursors undergo multiple rounds of ‘asymmetric division’, wherein each resulting daughter cell follows a distinct developmental path, yielding different combinations of olfactory receptor neurons (ORNs). Moore’s team showed specifically that ORN precursors undergo two rounds of division, yielding four different cellular subtypes, three of which will typically mature into ORNs.

Earlier work from Endo showed that the activation or suppression of signaling by the Notch protein helps differentiate these cellular fates, but other factors were clearly involved. Their joint research demonstrated that a second protein, Hamlet, modulates the effects of Notch. 

“This [process] provides an important foundation for all future studies of odorant receptor expression and axon targeting control on the olfactory system,” says Moore. The researchers found that presence or absence of Notch and Hamlet activity plays a central role in establishing the identity of these subtypes, and this in turn determines both the connections formed by the resulting ORNs as well as the subset of olfactory receptor proteins that will be expressed (Fig. 1). 

Moore and Endo’s study also revealed a surprising mode of action for Hamlet. Chromosomal DNA is wrapped around clusters of protein, and chemical changes to those proteins profoundly alter local gene activity—a mechanism called ‘epigenetic regulation’. They found that Hamlet selectively deactivates genes activated by Notch by triggering such changes. This means that immature ORNs produced by division of a Notch-activated cell can essentially be ‘reset’ by Hamlet. The ultimate developmental fate of those cells is then determined, in part, by whether or not they subsequently undergo a new round of Notch activation. 

Moore and colleagues also observed that, beyond simply switching off active Notch genes, Hamlet may define subsets of target genes that can subsequently be reactivated by Notch signaling. “The modifications induced by Hamlet may help establish cell fate by marking gene promoters for use later during differentiation,” says Moore. “This could prove fundamental to understanding the process of neuronal diversification.”

Provided by RIKEN

Source: medicalxpress.com

May 24, 2012 · 4 notes
#science #neuroscience #brain #psychology #neuron
No new neurons in the human olfactory bulb

May 24, 2012

(Medical Xpress) — Research from Karolinska Institutet shows that the human olfactory bulb - a structure in the brain that processes sensory input from the nose - differs from that of other mammals in that no new neurons are formed in this area after birth. The discovery, which is published in the scientific journal Neuron, is based on the age-determination of the cells using the carbon-14 method, and might explain why the human sense of smell is normally much worse than that of other animals.

"I’ve never been so astonished by a scientific discovery," says lead investigator Jonas Frisén, Tobias Foundation Professor of stem cell research at Karolinska Institutet. "What you would normally expect is for humans to be like other animals, particularly apes, in this respect."

It was long thought that all brain neurons were formed up to the time of birth, after which production stopped. A paradigm shift occurred when scientists found that nerve cells were being continually formed from stem cells in the mammalian brain, which changed scientific views on the plasticity of the brain and raised hopes of being able to replace neurons lost during some types of neurological disease.

In the adult mammal, new nerve cells are formed in two regions of the brain: the hippocampus and the olfactory bulb. While the former has an important part to play in memory, the latter is essential to the interpretation of smells. However, owing to the difficulty of studying the formation of new neurons in humans, the extent to which this phenomenon also occurs in the human brain has remained unclear. In this present study, researchers at Karolinska Institutet and their Austrian and French colleagues made use of the sharp rise in atmospheric carbon-14 caused by Cold War nuclear tests to find an answer to this question.

Carbon-14 is incorporated into DNA, making it possible to gauge the age of cells by measuring how much of the isotope they contain. Doing this, the team found that the olfactory bulb neurons in their adult human subjects had carbon-14 levels that matched those in the atmosphere at the time of their birth. This is a strong indication that there is no significant generation of new neurons in this part of the brain, something that sets humans apart from all other mammals.
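The logic of this "bomb-pulse" dating can be sketched as follows. The atmospheric values and function below are hypothetical placeholders, not the study's actual data or code, included only to show how a measured DNA carbon-14 level is matched to a birth year:

```python
# Illustrative sketch of bomb-pulse dating: a cell's DNA carbon-14 level
# reflects atmospheric levels at the time the cell was born, so matching a
# measured level against the atmospheric record estimates the cell's birth
# year. The atmospheric values below are made-up placeholders.

ATMOSPHERIC_C14 = {   # year -> relative atmospheric 14C level (hypothetical)
    1950: 1.00,
    1963: 1.90,       # peak of Cold War nuclear testing
    1975: 1.35,
    1990: 1.15,
    2005: 1.05,
}

def estimate_birth_year(measured_level: float) -> int:
    """Return the year whose atmospheric 14C level best matches the cell's DNA."""
    return min(ATMOSPHERIC_C14, key=lambda y: abs(ATMOSPHERIC_C14[y] - measured_level))

# A neuron whose DNA 14C matches the atmosphere of the subject's birth year
# suggests no turnover: the cell is as old as the person.
print(estimate_birth_year(1.88))  # closest to the 1963 peak
```

In the study, olfactory bulb neurons matched the subjects' birth-year atmosphere, pointing to little or no postnatal neurogenesis there.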

"Humans are less dependent on their sense of smell for their survival than many other animals, which may be related to the loss of new cell generation in the olfactory bulb, but this is just speculation,” says Professor Frisén.

Professor Frisén and his team now plan to study the extent of neuron generation in the hippocampus, a part of the brain that is important for higher cerebral functions in humans.

Provided by Karolinska Institutet

Source: medicalxpress.com

May 24, 2012 · 12 notes
#science #neuroscience #brain #psychology #neuron
'Obesity Genes' May Influence Food Choices, Eating Patterns

ScienceDaily (May 23, 2012) — Blame it on your genes? Researchers from The Miriam Hospital’s Weight Control and Diabetes Research Center say individuals with variations in certain “obesity genes” tend to eat more meals and snacks, consume more calories per day and often choose the same types of high fat, sugary foods.


Blame it on your genes? Researchers say individuals with variations in certain “obesity genes” tend to eat more meals and snacks and consume more calories per day. (Credit: © Gennadiy Poznyakov / Fotolia)

Their study, published online by the American Journal of Clinical Nutrition and appearing in the June issue, reveals certain variations within the FTO and BDNF genes — which have been previously linked to obesity — may play a role in eating habits that can cause obesity.

The findings suggest it may be possible to minimize genetic risk by changing one’s eating patterns and being vigilant about food choices, in addition to adopting other healthy lifestyle habits, like regular physical activity.

"Understanding how our genes influence obesity is critical in trying to understand the current obesity epidemic, yet it’s important to remember that genetic traits alone do not mean obesity is inevitable," said lead author Jeanne M. McCaffery, Ph.D., of The Miriam Hospital’s Weight Control and Diabetes Research Center.

"Our lifestyle choices are critical when it comes to determining how thin or heavy we are, regardless of your genetic traits," she added. "However, uncovering genetic markers can possibly pinpoint future interventions to control obesity in those who are genetically predisposed."

Previous research has shown individuals who carry a variant of the fat mass and obesity-associated gene (FTO) or the brain-derived neurotrophic factor gene (BDNF) are at increased risk for obesity. The genes have also been linked with overeating in children, and this is one of the first studies to extend this finding to adults. Both FTO and BDNF are expressed in the part of the brain that controls eating and appetite, although the mechanisms by which these gene variations influence obesity are still unknown.

As part of the Look AHEAD (Action for Health in Diabetes) trial, more than 2,000 participants completed a questionnaire about their eating habits over the past six months and also underwent genotyping. Researchers focused on nearly a dozen genes that have been previously associated with obesity. They then examined whether these genetic markers influenced the pattern or content of the participants’ diet.

Variations in the FTO gene specifically were significantly associated with a greater number of meals and snacks per day, greater percentage of energy from fat and more servings of fats, oils and sweets. The findings are largely consistent with previous research in children.

Researchers also discovered that individuals with BDNF variations consumed more servings from the dairy and the meat, eggs, nuts and beans food groups. They also consumed approximately 100 more calories per day, which McCaffery notes could have a substantial influence on one’s weight.

"We show that at least some of the genetic influence on obesity may occur through patterns of dietary intake," she said. "The good news is that eating habits can be modified, so we may be able to reduce one’s genetic risk for obesity by changing these eating patterns."

McCaffery says that while this research greatly expands their knowledge on how genetics may influence obesity, the data must be replicated before the findings can be translated into possible clinical measures.

Source: Science Daily

May 24, 2012
#science #neuroscience #brain #psychology #obesity
Antioxidant Urate Could Protect Against Parkinson’s Disease

May 23rd, 2012

Study supports urate protection against Parkinson’s disease, hints at novel mechanism

In vitro study indicates urate protection extends beyond antioxidant effect

Use of the antioxidant urate to protect against the neurodegeneration caused by Parkinson’s disease appears to rely on more than urate’s ability to protect against oxidative damage. In the May issue of the open-access journal PLoS One, researchers from the MassGeneral Institute for Neurodegenerative Diseases (MGH-MIND) describe experiments suggesting the involvement of a novel mechanism in urate’s protection of cultured brain cells against Parkinson’s-like damage.

“Our experiments showed, unexpectedly, that urate’s ability to protect neurons requires the presence of neighboring cells called astrocytes,” says Michael Schwarzschild, MD, PhD, of MGH-MIND, the study’s senior author. “The results suggest there may be multiple ways that raising urate could help protect against neurodegeneration in diseases like Parkinson’s and further support the development of treatments designed to elevate urate in the brain.” Schwarzschild and colleagues in the Parkinson’s Study Group currently are conducting a clinical trial investigating one approach to that strategy.

Characterized by tremors, rigidity, difficulty walking and other symptoms, Parkinson’s disease is caused by destruction of brain cells that produce the neurotransmitter dopamine. Several epidemiological studies suggested that healthy people with elevated levels of urate, a normal component of the blood, may have a reduced risk of developing Parkinson’s disease, and investigations by Schwarzschild’s team found that Parkinson’s patients with higher naturally occurring urate levels had slower progression of their symptoms.

The current study was designed to investigate whether both added urate and urate already present within the cells protect cultured dopamine-producing neurons against Parkinson-like degeneration. In addition, since previous studies suggested that urate’s protective effects depended on the presence of astrocytes (star-shaped cells of the central nervous system that provide both structural and metabolic support to neurons), the MGH-MIND team explored how the presence of astrocytes affects the ability of urate to protect against damage induced by MPP+, a toxic molecule that produces the same kind of neurodegeneration seen in Parkinson’s and is widely used in research studies.

image

Raising urate levels could help to protect against neurodegenerative diseases like Parkinson’s. Image adapted from Flickr user Niels_Olson.

The experiments showed that, while added urate reduced MPP+-induced cell death by about 50 percent in cultured dopamine-producing mouse neurons, urate treatment virtually eliminated neuronal death in cultures containing both neurons and astrocytes. They also showed that reducing intracellular urate levels by induced expression of the enzyme that breaks it down increased neuronal vulnerability to MPP+ toxicity significantly in cultures that included astrocytes but only slightly in neuron-rich cultures. The fact that the presence of astrocytes greatly increases the protection conferred by both externally applied urate and urate produced within cells indicates that the effect depends on more than urate’s ability to directly protect neurons against oxidative stress.

“A valuable next step will be determining whether endogenous urate is protective in live animal models of Parkinson’s disease,” says Schwarzschild. “It also will be important to determine whether we can selectively increase urate levels in brain cells by targeting urate transporter molecules. The approach now in early clinical trials examines whether treatment with the urate precursor inosine, which increases urate levels throughout the body, can slow the progression of the disease. If we could raise urate levels in brain cells without changing them in the rest of the body, we could avoid the risks of excessive urate, which when accumulated in joints can cause gout.”

Source: Neuroscience News

May 24, 2012
#science #neuroscience #brain #psychology #parkinson
Study shows how immune cells change wiring of the developing mouse brain

May 23, 2012

Researchers have shown in mice how immune cells in the brain target and remove unused connections between brain cells during normal development. This research, supported by the National Institutes of Health, sheds light on how brain activity influences brain development, and highlights the newly found importance of the immune system in how the brain is wired, as well as how the brain forms new connections throughout life in response to change.

Disease-fighting cells in the brain, known as microglia, can prune the billions of tiny connections (or synapses) between neurons, the brain cells that transmit information through electric and chemical signals. This new research demonstrates that microglia respond to neuronal activity to select synapses to prune, and shows how this pruning relies on an immune response pathway – the complement system – to eliminate synapses in the way that bacterial cells or other pathogenic debris are eliminated. The study was led by Beth Stevens, Ph.D., assistant professor of neurology at Boston Children’s Hospital and Harvard Medical School.

The brain is created with many more synapses than it retains into adulthood. As the brain develops, it goes through dynamic changes to refine its circuitry, trimming away the synaptic connections that do not have a lot of activity, and preserving the stronger, more active synapses. This process, known as synaptic pruning, is a key part of normal brain development.

Scientists do not have a clear understanding of how these synapses are selected, targeted and then pruned. However, the precise elimination of unused synapses and the strengthening of those that are most needed are essential for normal brain function. Many childhood disorders, such as amblyopia (a loss of vision in one eye that can occur when the eyes are misaligned), various forms of mental retardation, epilepsy and autism are thought to be due to abnormal brain development.

Microglia originate in the bone marrow and transform into an activated state to defend the body against infections. Activated microglia are also found in other disease states, ranging from stroke to Alzheimer’s disease. It is not always clear, however, if these cells cause degeneration of brain cells, or if they are part of the brain’s recovery process. In more recent years, several research groups reported that activated microglia are also present in the normal brain. Additionally, during the most robust synaptic pruning periods there is an increased number of activated microglia present and clustered around synapses.

As reported in the May 24 issue of Neuron, scientists in Dr. Stevens’s lab used the visual system in mice to study synaptic pruning, a model that undergoes robust change and remodeling during development and which has circuitry that is well-defined and easy to manipulate. Researchers labeled neurons that project from the eye into an area of the brain called the lateral geniculate nucleus, or LGN, and found that reactive microglia contained portions of the synapses from the labeled neurons. They also saw that these labeled pieces of synaptic material were specifically found inside the microglia’s lysosomes – compartments responsible for digesting foreign particles.

The researchers then investigated whether the amount of neuronal activity at a synapse determines whether microglia target it for removal. They used a drug to increase activity in the neurons projecting from one eye and saw less pruning of synapses in the corresponding brain region, as compared to the untreated eye. When they used a drug to reduce activity, this resulted in more pruning compared to the untreated eye. The researchers think microglia select a synapse for removal based on the synapse’s level of activity. This may be directly relevant to amblyopia: children with amblyopia preferentially use one eye, and vision in the less used eye deteriorates due to the loss of synapses and cells in the LGN.

Earlier research revealed that proteins involved in the complement system are found near synapses during development and are necessary for pruning. To see if these same proteins are used by microglia to shape neuronal connections, the researchers disrupted complement pathway proteins that are found only in the brain’s immune cells. Their results indicate that these complement proteins signal the microglia to trim away synapses, and suggest that immune system pathways are key to proper synaptic pruning.

"The concept that microglia prune synapses using immune system pathways has been difficult to prove,” said Edmund Talley, Ph.D., program director at the National Institute of Neurological Disorders and Stroke. “This exquisitely careful and meticulous research confirms the role of microglia in brain development, plasticity and learning.”

Dr. Stevens said the study sheds light on the role of microglia in the normal brain, and supports further investigations into the role of microglia in brain disease. “Almost every neurodegenerative brain disease involves several interesting common denominators,” she said. “It’s becoming increasingly recognized that early synapse loss is a hallmark of many neurodegenerative diseases.”

Provided by NIH/National Institute of Neurological Disorders and Stroke

Source: medicalxpress.com

May 24, 2012
#science #neuroscience #brain #psychology
Brain research shows visual perception system unconsciously affects our preferences

May 23, 2012

When grabbing a coffee mug out of a cluttered cabinet or choosing a pen to quickly sign a document, what brain processes guide your choices?

New research from Carnegie Mellon University’s Center for the Neural Basis of Cognition (CNBC) shows that the brain’s visual perception system automatically and unconsciously guides decision-making through valence perception. Published in the journal Frontiers in Psychology, the review hypothesizes that valence, which can be defined as the positive or negative information automatically perceived in the majority of visual information, integrates visual features and associations from experience with similar objects or features. In other words, it is the process that allows our brains to rapidly make choices between similar objects.

The findings offer important insights into consumer behavior in ways that traditional consumer marketing focus groups cannot address. For example, asking individuals to react to package designs, ads or logos is simply ineffective. Instead, companies can use this type of brain science to more effectively assess how unconscious visual valence perception contributes to consumer behavior.

To transfer the research’s scientific application to the online video market, the CMU research team is in the process of founding the start-up company neonlabs through the support of the National Science Foundation (NSF) Innovation Corps (I-Corps).

"This basic research into how visual object recognition interacts with and is influenced by affect paints a much richer picture of how we see objects," said Michael J. Tarr, the George A. and Helen Dunham Cowan Professor of Cognitive Neuroscience and co-director of the CNBC. “What we now know is that common, household objects carry subtle positive or negative valences and that these valences have an impact on our day-to-day behavior.”

Tarr added that the NSF I-Corps program has been instrumental in helping the neonlabs team take this basic idea and learn how to turn it into a viable company. “The I-Corps program gave us unprecedented access to highly successful, experienced entrepreneurs and venture capitalists who provided incredibly valuable feedback throughout the development process,” he said.

NSF established I-Corps for the sole purpose of assessing the readiness of transitioning new scientific opportunities into valuable products through a public-private partnership. The CMU team of Tarr, Sophie Lebrecht, a CNBC and Tepper School of Business postdoctoral fellow, Babs Carryer, an embedded entrepreneur at CMU’s Project Olympus, and Thomas Kubilius, president of Pittsburgh-based Bright Innovation and adjunct professor of design at CMU, were awarded a $50,000, six-month grant to investigate how understanding valence perception could be used to make better consumer marketing decisions. They are launching neonlabs to apply their model of visual preference to increase click rates on online videos, by identifying the most visually appealing thumbnail from a stream of video. The web-based software product selects a thumbnail based on neuroimaging data on object perception and valence, crowd sourced behavioral data and proprietary computational analyses of large amounts of video streams.

"Everything you see, you automatically dislike or like, prefer or don’t prefer, in part, because of valence perception," said Lebrecht, lead author of the study and the entrepreneurial lead for the I-Corps grant. "Valence links what we see in the world to how we make decisions."

Lebrecht continued, “Talking with companies such as YouTube and Hulu, we realized that they are looking for ways to keep users on their sites longer by clicking to watch more videos. Thumbnails are a huge problem for any online video publisher, and our research fits perfectly with this problem. Our approach streamlines the process and chooses the screenshot that is the most visually appealing based on science, which will in the end result in more user clicks.”

Today (May 23), Lebrecht will join the other 23 I-Corps project teams in Palo Alto, Calif., for the final presentation of each team’s I-Corps journey from basic science idea to real-world business application. She will present neonlabs’ solution, outlining the customer landscape, competition and business model.

Carnegie Mellon is well known for its entrepreneurial culture. The university’s Greenlighting Startups initiative, a portfolio of five business incubators, is designed to speed company creation at CMU. In the past 15 years, Carnegie Mellon faculty and students have helped to create more than 300 companies and 9,000 jobs; the university averages 15 to 20 new startups each year.

"CMU has been an amazing place to build neonlabs," Lebrecht said. "There’s a great intellectual community and facilities here as well as people unbelievably experienced in tech transfer and startups who have been so incredibly generous with their time."

Provided by Carnegie Mellon University

Source: medicalxpress.com

May 24, 2012
#science #neuroscience #brain #psychology #vision
Robust White Matter Helps Keep Us Smart As We Age

May 23rd, 2012

Well-connected brains make you smarter in older age
Brains that maintain healthy nerve connections as we age help keep us sharp in later life, new research funded by the charity Age UK has found.

Older people with robust brain ‘wiring’, that is, the nerve fibres that connect different, distant brain areas, can process information quickly, and this makes them generally smarter, the study suggests.

According to the findings, joining distant parts of the brain together with better wiring improves mental performance, suggesting that intelligence is not found in a single part of the brain.

However a loss of condition of this wiring or ‘white matter’, the billions of nerve fibres that transmit signals around the brain, can negatively affect our intelligence by altering these networks and slowing down our processing speed.

The research by the University of Edinburgh shows for the first time that the deterioration of white matter with age is likely to be a significant cause of age-related cognitive decline.

The research team used three different brain imaging techniques in compiling the results, including two that have never been used before in the study of intelligence.

image

Healthy nerve connections in the brain help to reduce mental decline and dementia in older people. Image by Flickr user Brian Auer. See below for attribution.

These techniques measure the amount of water in brain tissue, indicate structural loss in the brain, and show how well the nerve fibres are insulated.

The researchers examined scans and results of thinking and reaction time tests from 420 people in the Lothian Birth Cohort 1936, a group of nearly 1,100 people whose intelligence and general health have been tracked since they were 11 years old.

The research was part of the Disconnected Mind Project, a large study of the causes of people’s differences in cognitive ageing, led by Professor Ian Deary.

Study author Doctor Lars Penke said, “Our results suggest a first plausible way in which brain structure differences lead to higher intelligence. The results are exciting for our understanding of human intelligence differences at all ages.”

“They also suggest a clear target for seeking treatment for mental difficulties, be they pathological or age-related. That the brain’s nerve connections tend to stay the same throughout the brain means we can now look at factors that affect the overall condition of the brain, like its blood supply.”

Professor Deary said that uncovering the secrets of good thinking skills in old age is a high priority. “The research team is now looking at what keeps the brain’s connections healthy. We value our thinking skills, and research should address how we might retain them or slow their decline with age.”

Doctor Mark Bastin, who co-authored the study, said, “These findings are exciting as they show how quantitative brain imaging can provide novel insights into the links between brain structure and cognitive ability. This is a key research area given the importance of identifying strategies for retaining good mental ability into older age.”

Professor James Goodwin, Head of Research at Age UK, said: “This research is very exciting as it could have a real impact on tackling mental decline in later life, including dementia. With new understanding on how the brain functions we can work out why mental faculties decline with age in some people and not others and look at what can be done to improve our minds’ chances of ageing better.”

Source: Neuroscience News

May 24, 2012
#science #neuroscience #brain #psychology
Researchers uncover new ways sleep-wake patterns are like clockwork

May 23, 2012

Researchers at New York University and Albert Einstein College of Medicine of Yeshiva University have discovered new ways neurons work together to ease the transition between sleep and wakefulness. Their findings, which appear in the journal Neuron, provide additional insights into sleep-wake patterns and offer methods to explore what may disrupt them.

Their study explored the biological, or circadian, clocks of Drosophila fruit flies, which are commonly used for research in this area. This is because it is relatively easy to find mutants with malfunctioning biological clocks and then to identify the genes underlying the altered behavior. Such studies in fruit flies have allowed the identification of similar “clock genes” in mammals, which function in largely the same manner as they do in a fly’s clock.

In the Neuron study, the researchers moved up a level to study how pacemaker clock neurons—which express clock genes—interact with each other. Specifically, they looked at the relationship between master pacemaker neurons, which control the overall pace of the circadian system, and non-master pacemaker neurons, whose role in circadian rhythms has been less clear.

To do so, they examined flies with normally functioning master and non-master clock neurons and compared them with mutant flies in which the signaling of these neurons was either increased or decreased. These comparisons allowed the researchers to isolate the individual roles of these neurons and, in particular, to understand how master and non-master pacemaker neurons work together to control circadian rhythms.

Their results revealed a previously unknown role for non-master pacemaker neurons. Specifically, these neurons employ a neurotransmitter, glutamate, which suppresses signaling of the master pacemaker neurons during the evening. Artificially increasing this suppression by the non-master clock neurons in the morning made it much harder for flies to wake up. So in normal flies, these non-master pacemaker neurons have to stand aside at dawn, allowing the master pacemaker neurons to fire to wake up the fly. The authors concluded that the balance between signaling of these two groups of clock neurons helps to set the precise time of the transition between sleep and wakefulness.

"Our work shifts the emphasis away from clock genes and starts to address how clock neurons function in a neural network to regulate behavior," explained Justin Blau, an associate professor in NYU’s Department of Biology and one of the study’s co-authors. "And it shows the importance of studying individual groups of clock neurons, since different subsets can have opposite effects on animal behavior.”

"This work helps to elucidate the neurotransmitters and receptors that facilitate communication between specific groups of nerve cells that regulate circadian rhythm," said co-author Myles Akabas, professor of Physiology & Biophysics and of Neuroscience at Albert Einstein College of Medicine. "It demonstrates the power of collaborative interdisciplinary research to address the molecular and cellular basis for behavior."

Provided by New York University

Source: medicalxpress.com

May 23, 2012
#science #neuroscience #brain #psychology
Reverse engineering epilepsy's 'miracle' diet

May 23, 2012 by R. Alan Leo

For decades, neurologists have known that a diet high in fat and extremely low in carbohydrates can reduce epileptic seizures that resist drug therapy. But how the diet worked, and why, was a mystery—so much so that in 2010, The New York Times Magazine called it “Epilepsy’s Big, Fat Miracle.”

Now, researchers at Dana-Farber Cancer Institute and Harvard Medical School have proposed an answer, linking resistance to seizures to a protein that modifies cellular metabolism in the brain. The research, to be published in the May 24th issue of the journal Neuron, may lead to the development of new treatments for epilepsy.

The research was led jointly by Nika Danial, HMS assistant professor of cell biology at Dana-Farber Cancer Institute, and Gary Yellen, professor of neurobiology at Harvard Medical School. The first author was Alfredo Giménez-Cassina, a research fellow in Danial’s lab.

Epilepsy is a neurological disorder characterized by repeated seizures, an electrical storm in the brain that can manifest as convulsions, loss of motor control, or loss of consciousness. Some cases of epilepsy can be improved by a diet that drastically reduces sugar intake, triggering neurons to switch from their customary fuel of glucose to fat byproducts called ketone bodies. The so-called ketogenic diet, which mimics effects of starvation, was described more than 80 years ago and received renewed interest in the 1990s. Recent studies corroborate that it works, but shed little light on how.

"The connection between metabolism and epilepsy has been such a puzzle," said Yellen, who was introduced to the ketogenic diet through his wife, Elizabeth Thiele, HMS professor of neurology, who directs the Pediatric Epilepsy Program at MassGeneral Hospital for Children, but was not directly involved in the study. "I’ve met a lot of kids whose lives are completely changed by this diet," Yellen said. "It’s amazingly effective, and it works for many kids for whom drugs don’t work."

"We knew we needed to come at this link between metabolism and epilepsy from a new angle," said Danial, who had previously discovered a surprising double duty for a protein known for its role in apoptosis: The protein, BCL-2-associated Agonist of Cell Death, or BAD, also regulated glucose metabolism.

Giménez-Cassina further discovered that certain modifications in BAD switched metabolism in brain cells from glucose to ketone bodies. “It was then that we realized we had come upon a metabolic switch to do what the ketogenic diet does to the brain without any actual dietary therapy,” said Giménez-Cassina, who went on to show that these same BAD modifications protect against seizures in experimental models of epilepsy. Still, it wasn’t clear exactly how.

Yellen suspected the solution involved potassium ion channels. While sodium and calcium ion channels tend to excite cells, including neurons, potassium channels tend to suppress cell electrical activity. His lab had previously linked ketone bodies to the activation of ATP-sensitive potassium (KATP) channels in neurons. Yellen had hypothesized that the ketogenic diet worked because ketone bodies provide neurons enough fuel for normal function, but when the electrical and energy storm of an epileptic seizure threatens, the activated KATP channels can shut the storm down. But the effects of diets are broad and complex, so it was impossible to say for sure.

The effects that Danial’s lab had discovered—BAD’s ability to alter metabolism and seizures—offered a new avenue for studying the therapeutic effects of altered metabolism. Together, the researchers decided to investigate whether Danial’s switch governed Yellen’s pathway, and whether they could reverse engineer the seizure protection of a ketogenic diet.

They could. Working in genetically altered mice, the researchers modified the BAD protein to reduce glucose metabolism and increase ketone body metabolism in the brain. Seizures decreased, but the benefit was erased when they knocked out the KATP channel—strong evidence that a BAD-KATP pathway conferred resistance to epileptic seizures. Further experiments suggested that it was indeed BAD’s role in metabolism, not cell death, that mattered. The findings make the BAD protein a promising target for new epilepsy drugs.

"Diet sounds like this wholesome way to treat seizures, but it’s very hard. I mean, diets in general are hard, and this diet is really hard," said Yellen, whose wife’s Center for Dietary Therapy in Epilepsy hosts a candy-free Halloween party for its many patients on the ketogenic diet. “So finding a pharmacological substitute for this would make lots of people really happy.”

Provided by Harvard Medical School

Source: medicalxpress.com

May 23, 2012
#science #neuroscience #brain #psychology #epilepsy
Treating pain with transplants

May 23, 2012

A new study finds that transplanting embryonic cells into adult mouse spinal cord can alleviate persistent pain. The research, published by Cell Press in the May 24th issue of the journal Neuron, suggests that reduced pain results from successful integration of the embryonic cells into the host spinal cord. The findings open avenues for clinical strategies aimed not just at treating the symptoms of chronic debilitating pain, but correcting the underlying disease pathology.

There are two major classes of chronic pain: inflammatory pain that results from injury to tissue, such as muscle and bone, and neuropathic pain from injury to nerves, for example, in the limbs or face. Damage to nerves can occur after physical trauma and from chemotherapy drugs. With neuropathic pain, the pain occurs in the absence of stimulation, and there is hypersensitivity and exacerbated pain to stimuli that would not normally cause pain. Neuropathic pain is thought to involve the loss of inhibitory neurons that release the chemical GABA, which is an inhibitory neurotransmitter that controls the excitability of neurons, including neurons that transmit pain information.

"Pharmacological approaches to managing neuropathic pain enhance GABA-mediated inhibition. However, some patients do not respond to these therapies and there are significant adverse side effects," explains senior study author, Dr. Allan Basbaum from the University of California, San Francisco. "Therefore, new therapeutic approaches for neuropathic pain are essential." Dr. Basbaum and colleagues explored whether replacement of the damaged inhibitory neurons might be useful for reducing neuropathic pain.

The researchers transplanted immature GABA neurons from mouse fetal brain into the spinal cord of mice with nerve injury-induced pain, a model for human neuropathic pain. The transplanted cells not only survived, but made connections with appropriate targets and integrated into the host spinal cord circuitry. This resulted in an almost complete reversal of the mechanical hypersensitivity generated in a nerve injury model of neuropathic pain. In contrast, the transplant procedure was not effective at reducing pain in a mouse model of inflammatory pain, which is induced by tissue injury.

Taken together, the findings have exciting implications for a cell-based treatment of neuropathic pain in humans. “Our strategy not only ameliorates the symptoms of neuropathic pain but, importantly, is also potentially disease modifying,” concludes Dr. Basbaum. “It is worth considering whether transplants such as these might have clinical utility in humans, a great advantage being that the adverse side effects associated with drug administration can be avoided.”

Provided by Cell Press

Source: medicalxpress.com

May 23, 2012
#science #neuroscience #psychology #pain #brain
Dementia patients reveal how we construct a picture of the future

May 23, 2012

(Medical Xpress) — Our ability to imagine and plan our future depends on brain regions that store general knowledge, new research shows.

Dr. Muireann Irish from Neuroscience Research Australia (NeuRA) found that dementia patients who can no longer recall general knowledge – for example, the names of famous people or popular songs – are also unable to imagine themselves in the future.

"We already know that if memory of past events is compromised, as is the case in Alzheimer’s disease, then the ability to imagine future scenarios is also impaired,” says Dr. Irish.

"We have now discovered that damage to parts of the brain that store knowledge of facts and meanings can also produce the same effect," she says.

Thinking about the future is an important ability because it helps us to plan and anticipate the consequences of our actions.

“For example, a person with dementia may leave the oven on, partly because they forget the appropriate action, but also because they cannot project forward in time to anticipate the dangerous consequences this might have,” says Dr. Irish.

Dr. Irish and colleagues used MRI to study people with Alzheimer’s disease, in whom memories of past experiences are lost, as well as patients with semantic dementia, who have lost the ability to remember facts (semantic memory) but have little trouble remembering past experiences.

Surprisingly, she found that the semantic dementia group was as impaired as the Alzheimer’s group when imagining future events, even though their memory of past experiences was relatively intact.

“This is an important finding, as it points to multiple regions in the brain that are responsible for our ability to imagine and plan for the future,” she says.

Provided by Neuroscience Research Australia

Source: medicalxpress.com

May 23, 2012
#science #neuroscience #brain #psychology
Discoveries Into Perception Via Popular Magic Tricks

ScienceDaily (May 22, 2012) — In recent studies, researchers at Barrow Neurological Institute at St. Joseph’s Hospital and Medical Center have revealed how and why audiences perceive some magic tricks, findings that could have real-world implications in military tactics, marketing and sports.


A professional magician believed that if he moved his hand in a straight line while performing a trick the audience would focus on the beginning and end points of the motion, but not in between. In contrast, he believed if he moved his hand in a curved motion the audience would follow his hand’s trajectory from beginning to end. (Credit: © luzitanija / Fotolia)

Susana Martinez-Conde, PhD, of Barrow’s Laboratory of Visual Neuroscience, and Stephen Macknik, PhD, of Barrow’s Laboratory of Behavioral Neurophysiology are well known for their research into magic and illusions. Their most recent original research projects, published in Frontiers in Human Neuroscience, offer additional insight into perception and cognition.

One of the studies was initiated by professional magician Apollo Robbins, who believed that audience members directed their attention differently depending on the type of hand motion used. Robbins believed that if he moved his hand in a straight line while performing a trick the audience would focus on the beginning and end points of the motion, but not in between. In contrast, he believed if he moved his hand in a curved motion the audience would follow his hand’s trajectory from beginning to end.

By studying the eye movements of individuals as they watched Robbins perform, Barrow researchers confirmed Robbins’ theory. Perhaps more importantly, they also found that the different types of hand motion triggered two different types of eye movement. The researchers discovered that curved motion engaged smooth pursuit eye movements (in which the eye follows a moving object smoothly), whereas straight motion led to saccadic eye movements (in which the eye jumps from one point of interest to another).
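The distinction between the two eye-movement types can be sketched with a simple velocity-threshold rule of the kind commonly used in gaze analysis. This is an illustrative heuristic, not the method used in the Barrow study; the 30°/s cutoff and the function name are assumptions.

```python
def classify_eye_movements(angles_deg, sample_rate_hz, threshold_deg_s=30.0):
    """Label each inter-sample interval as 'saccade' (fast) or
    'pursuit/fixation' (slow) based on angular velocity."""
    labels = []
    for a0, a1 in zip(angles_deg, angles_deg[1:]):
        velocity = abs(a1 - a0) * sample_rate_hz  # degrees per second
        labels.append("saccade" if velocity > threshold_deg_s
                      else "pursuit/fixation")
    return labels

# Example: gaze angle sampled at 100 Hz; a slow drift followed by a jump.
gaze = [0.0, 0.1, 0.2, 0.3, 5.3, 5.4]
print(classify_eye_movements(gaze, sample_rate_hz=100))
```

Real classifiers smooth the velocity signal and handle noise and blinks; the point here is only that saccades are roughly an order of magnitude faster than pursuit, which is what makes the two easy to tell apart in eye-tracking records.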

"Not only is this discovery important for magicians, but the knowledge that curved motion attracts attention differently from straight motion could have wide-reaching implications — for example, in predator-prey evasion techniques in the natural world, military tactics, sports strategies and marketing," says Martinez-Conde. This finding is believed to be the first discovery in the neuroscientific literature initiated by a magician, rather than a scientist.

In another study, the researchers worked with professional magician Mac King to investigate magicians’ use of social cues — like the position of their gaze — to misdirect observers.

They studied a popular coin-vanishing trick, in which King tosses a coin up and down in his right hand before “tossing” it to his left hand, where it subsequently disappears. In reality, the magician only simulates tossing the coin to the left hand, an implied motion that essentially tricks the neurons into responding as they would have if the coin had actually been thrown.

The Barrow researchers discovered that social misdirection does not always help magic. By presenting two different videos of King — one in which the audience could see his face and another in which his face was hidden — they found that social misdirection did not play a role in this particular trick.

"We wondered if the observer’s perception of magic was going to be different if they could see the magician’s head and eye position. To our surprise, it didn’t matter," says Martinez-Conde. "This indicates that social misdirection in magic is more complicated than previously believed, and not necessary for the perception of all magic tricks."

Source: Science Daily

May 23, 2012
#science #neuroscience #brain #psychology #perception
Neuron-Nourishing Cells Appear to Retaliate in Alzheimer's

ScienceDaily (May 22, 2012) — When brain cells start oozing too much of the amyloid protein that is the hallmark of Alzheimer’s disease, the astrocytes that normally nourish and protect them deliver a suicide package instead, researchers report.


Drs. Michael Dinkins (from left), Guanghu Wang and Erhard Bieberich. (Credit: Image courtesy of Georgia Health Sciences University)

Amyloid is secreted by all neurons, but rates increase with aging and dramatically accelerate in Alzheimer’s. Astrocytes, which deliver blood, oxygen and nutrients to neurons in addition to hauling off some of their garbage, become activated and inflamed by excessive amyloid.

Now researchers have shown that another way astrocytes respond is by packaging the lipid ceramide with the protein Par-4; each can do damage on its own, but together they form a more “deadly duo,” said Dr. Erhard Bieberich, biochemist at the Medical College of Georgia at Georgia Health Sciences University.

"If the neuron makes something toxic and dumps it at your door, what would you do?" said Bieberich, corresponding author of the study published in the Journal of Biological Chemistry. “You would probably do something to defend yourself.”

The researchers hypothesize that this lipid-coated package ultimately kills both cells, which could help explain the brain-cell death and shrinkage that occur in Alzheimer’s. “If the astrocytes die, the neurons die,” Bieberich said, noting that studies suggest excess amyloid alone does not kill brain cells. “There must be a secondary process toxifying the amyloid; otherwise the neuron would self-intoxicate before it made a big plaque,” he said. “The neuron would die first.”

One of many avenues for future pursuit is whether a ceramide antibody could be a viable Alzheimer’s treatment. In the researchers’ studies of brain cells of humans with Alzheimer’s as well as an animal model of the disease, antibodies to ceramide and Par-4 prevented astrocytes’ amyloid-induced death.

Ceramide and Par-4 get packaged in lipid-coated vesicles called exosomes; all cells secrete thousands of these vesicles but scientists are only beginning to understand their normal function. When exosomes become deadly, they are called apoxosomes.

Ceramide and Par-4 are typically found not in a vesicle but in two distinct parts of the cell; ceramide appears to take the lead in bringing the two together when confronted with amyloid. Bieberich and colleagues at the University of Georgia reported in 2003 that this deadly duo helps eliminate the duplicate brain cells that arise early in brain development, whose survival could result in a malformed brain. They suspected then that the duo might also play a role in Alzheimer’s.

Risk factors for Alzheimer’s include aging, family history and genetics, according to the Alzheimer’s Association. Increasing evidence suggests that Alzheimer’s also shares many of the same risk factors for cardiovascular disease, such as high cholesterol, high blood pressure and inactivity.

Source: Science Daily

May 22, 2012
#science #neuroscience #brain #psychology #alzheimer
Learning and memory: The role of neo-neurons revealed

May 22, 2012

(Medical Xpress) — Researchers at the Institut Pasteur and the CNRS have recently identified, in mice, the role played by neo-neurons formed in the adult brain. Using selective stimulation, the researchers were able to show that these neo-neurons increase the ability to learn and memorize difficult cognitive tasks. This newly discovered capacity of neo-neurons to assimilate complex information could open up new avenues in the treatment of some neurodegenerative diseases. The study is available online on the Nature Neuroscience journal’s website.


Section of a mouse brain observed using a fluorescence microscope. The green filaments represent neo-neurons in an organized network. Credit: Institut Pasteur

The discovery that new neurons could be formed in the adult brain created quite a stir in 2003 by debunking the age-old belief that a person is born with a set number of neurons and that any loss of neurons is irreversible. This discovery was all the more incredible considering that the function of these new neurons remained undetermined. That is, until today.

Using mouse models, the team working under Pierre-Marie Lledo, head of the Laboratory for Perception and Memory (Institut Pasteur/CNRS), recently revealed the role these neo-neurons formed in the adult brain play in learning and memory. With the help of an optogenetics-based experimental approach, developed by this same team and published in December 2010, the researchers were able to show that when stimulated by a brief flash of light, these neo-neurons facilitate both learning and the memorization of complex tasks. The mice were able to memorize information given during the learning activity more quickly and to remember exercises even 50 days after experimentation had ended. The study also shows that neo-neurons generated just after birth confer no comparable advantage for either learning or memory; in this respect, only the neurons produced by the adult brain have any considerable significance.

“This study shows that the activity of just a few neurons produced in the adult brain can still have considerable effects on cognitive processes and behavior. Moreover, this work helps to illustrate how the brain assimilates new stimuli, since the electrical activity we mimic using flashes of light is normally produced within the brain’s attention centers,” explains the study’s director, Pierre-Marie Lledo.

Beyond simply discovering the functional contribution of these neo-neurons, the study has also reaffirmed the clear link between “mood” (defined here by a specific pattern of stimulation) and cerebral activity. It has been shown that curiosity, attentiveness and pleasure all promote the formation of neo-neurons and consequently the acquisition of new cognitive abilities. Conversely, a state of depression is detrimental to the production of new neurons and triggers a vicious cycle that prolongs this state of despondency. These results, and the optogenetics technologies that enabled this study, may prove very useful for devising therapeutic protocols that aim to counter the development of neurological or psychiatric diseases.

Provided by CNRS

Source: medicalxpress.com

May 22, 2012
#science #neuroscience #brain #psychology #memory
GPS for the brain: Researchers develop new brain map

May 22, 2012

University of Georgia researchers have developed a map of the human brain that shows great promise as a new guide to the inner workings of the body’s most complex and critical organ.

With this map, researchers hope to create a next-generation brain atlas as an alternative to the one created by German anatomist Korbinian Brodmann more than 100 years ago, which is still commonly used in clinical and research settings.

Tianming Liu, assistant professor of computer science in the UGA Franklin College of Arts and Sciences, and his students Dajiang Zhu and Kaiming Li identified 358 landmarks throughout the brain related to memory, vision, language, arousal regulation and many other fundamental bodily operations. Their findings were published in the April issue of Cerebral Cortex.

The landmarks were discovered using diffusion tensor imaging, a sophisticated neuroimaging technique that allows scientists to visualize nerve fiber connections throughout the brain. Unlike many other neuroimaging studies, their map does not focus only on one section of the brain but rather the whole cerebral cortex.

"Previously, researchers would examine at most three or four small brain networks," Liu said. "We want to examine the whole brain connection, and this is the so-called connectome."

The new map provides a clearer picture of how different areas of the brain are physically connected and how these connections relate to basic brain function. Liu and his team examined hundreds of healthy young adults to establish the landmarks, which they call dense individualized and common connectivity-based cortical landmarks, or DICCCOL.

After extensive testing and comparison, the team determined that these nodes are present in every normal brain, meaning they can be used as a basis of comparison for those with damaged brain tissue or altered brain function.

"DICCCOL is very similar to a GPS system," Zhu said, "only it’s a GPS map of the human brain."

Now, thanks in part to a five-year, $1.6 million grant from the National Institutes of Health, Liu and collaborators Xiaoping Hu and Claire Coles at Emory University are preparing to test their brain map by comparing healthy brains with those of children whose brains were damaged by exposure to cocaine while in the womb.

Prenatal cocaine exposure, or PCE, can cause serious damage to brain networks. Because of this, analysis of the damage provides Liu and his team with an excellent opportunity to evaluate the usefulness of their map.

After comparing the PCE brains to those of healthy individuals, they hope to determine the segments of the brain responsible for physical or mental disabilities observed in children exposed to cocaine.

"The PCE brain is disrupted in a systematic way; the whole brain is wrongly wired," Liu said. "We want to test our map in one of the worst cases, and then we will know if it will work in other cases."

Once the robustness of their map is established, Liu and his team hope that it may prove useful in the evaluation of many other brain disorders, such as Alzheimer’s disease, Parkinson’s disease or stroke.

"This really is a fundamental technology," Liu said. "When we establish these DICCCOLS, we can very easily extend this project to other populations, to other brain diseases."

More information: Liu’s team published their DICCCOL data sets, which include the source code and diffusion tensor images, at http://dicccol.cs.uga.edu so other researchers may use the findings in their own experiments.

The article, “DICCCOL: Dense Individualized and Common Connectivity-Based Cortical Landmarks,” is available at http://cercor.oxfordjournals.org/content/early/2012/04/05/cercor.bhs072.short

Provided by University of Georgia

Source: medicalxpress.com

May 22, 2012
#science #neuroscience #brain #psychology
Seventy-Two Percent of Teenagers Experienced Reduced Hearing Ability After Attending Concert

ScienceDaily (May 21, 2012) — Seventy-two percent of teenagers participating in a study experienced reduced hearing ability following exposure to a pop rock performance by a popular female singer.


Seventy-two percent of teenagers participating in a study experienced reduced hearing ability following exposure to a pop rock performance by a popular female singer. (Credit: © DWP / Fotolia)

M. Jennifer Derebery, MD, a House Clinic physician, along with colleagues at the House Research Institute, tested teens’ hearing before and after a concert and presented the study findings at the American Otological Society meeting on April 21, 2012. The study has been accepted for publication in an upcoming issue of Otology & Neurotology.

The hearing loss that may be experienced after a pop rock concert is not generally believed to be permanent. It is called a temporary threshold shift and usually disappears within 16-48 hours, after which a person’s hearing returns to previous levels.

“Teenagers need to understand that a single exposure to loud noise, either from a concert or a personal listening device, can lead to hearing loss,” said Derebery, the study’s lead author. “With multiple exposures to noise over 85 decibels, the tiny hair cells may stop functioning and the hearing loss may be permanent.”

In the study, twenty-nine teenagers were given free tickets to a rock concert. To ensure a similar level of noise exposure for the teens, there were two blocks of seats within close range of each other. The seats were located in front of the stage at the far end of the venue approximately 15-18 rows up from the floor.

Parental consent was obtained for all of the underage study participants. The importance of using hearing protection was explained to the teenagers. Researchers then offered hearing protection to the subjects and encouraged them to use the foam ear plugs. However, only three teenagers chose to do so.

Three adult researchers sat with the teenagers. Using a calibrated sound pressure meter, they recorded 1,645 measurements of sound levels in A-weighted decibels (dBA) during the 26 songs played over the three-hour concert. The levels ranged from 82 to 110 dBA, with an average of 98.5 dBA. The mean level exceeded 100 dBA for 10 of the 26 songs.

The decibel levels experienced at the concert exceeded what is allowable in the workplace, according to the Occupational Safety and Health Administration (OSHA). OSHA’s safe-listening guidelines set time limits for exposure to sound levels of 85 dB and greater in the workplace. The volumes recorded during the concert would have exceeded those limits in less than 30 minutes. In fact, one third of the teen listeners showed a temporary threshold shift that would not be acceptable in adult workplace environments.
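For a rough sense of how quickly such levels use up an allowable daily noise dose, the permissible exposure time can be computed from a criterion level and an exchange rate. The sketch below uses NIOSH-style parameters (85 dBA over 8 hours, 3-dB exchange rate); these parameters and the function name are illustrative assumptions, not figures from the study.

```python
def permissible_minutes(level_dba, criterion_dba=85.0, exchange_db=3.0):
    """Minutes of exposure allowed before reaching a full daily noise dose.
    Every `exchange_db` decibels above the criterion halves the allowed time."""
    return 480.0 / 2 ** ((level_dba - criterion_dba) / exchange_db)

# At the concert's average level of 98.5 dBA:
print(round(permissible_minutes(98.5), 1))  # roughly 21 minutes
```

Under this rule, the concert’s average level of 98.5 dBA would exhaust a full day’s dose in roughly 21 minutes, consistent with the under-30-minutes figure reported; OSHA’s own permissible exposure limit (90 dBA criterion, 5-dB exchange rate) is more lenient.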

Following the concert, the majority of the study participants also were found to have a significant reduction in the Distortion Product Otoacoustic Emissions (OAE) test. This test checks the function of the tiny outer hair cells in the inner ear, which are believed to be the most vulnerable to damage from prolonged noise exposure and are crucial to normal hearing, the ability to hear soft (low-level) sounds, and the ability to understand speech, especially in noisy environments. With exposure to loud noise, the outer hair cells show a reduction in their ability to function, which may later recover. However, it is known that with repeated exposure to loud noise, the tiny hair cells may become permanently damaged. Recent animal research suggests that a single exposure to loud noise may result in permanent damage to the hearing nerve connections themselves that are necessary to hear sound.

Following the concert, 53.6 percent of the teens said they did not think they were hearing as well as before. Twenty-five percent reported experiencing tinnitus, or ringing in their ears, which they did not have before the concert.

Researchers are especially concerned because, in the most recent government health survey in the United States, the National Health and Nutrition Examination Survey (NHANES) 2005-2006, 20% of adolescents were found to have at least slight hearing loss, a 31% increase from a similar survey conducted in 1988-1994.

The findings of the study clearly indicate that more research is necessary to determine whether the guidelines for noise exposure need to be revised for teenagers. More research is also needed to determine whether teenagers’ ears are more sensitive to noise than those of adults.

“It also means we definitely need to be doing more to ensure the sound levels at concerts are not so loud as to cause hearing loss and neurological damage in teenagers, as well as adults,” said Derebery. “Only 3 of our 29 teens chose to use ear protection, even when it was given to them and they were encouraged to do so. We have to assume this is typical behavior for most teen listeners, so we have the responsibility to get the sound levels down to safer levels.”

Researchers recommend that teenagers and young adults take an active role in protecting their hearing by using one of the many sound-meter “apps” available for smartphones. These meters give a rough estimate of the noise level, allowing listeners to take steps to protect their hearing, such as wearing ear plugs at a concert.

In addition, Derebery and the study co-authors would like to see concert promoters and the musicians themselves take steps to lower sound levels as well as encourage young concert goers to use hearing protection.

Source: Science Daily

May 22, 2012
#science #neuroscience #brain #hearing #psychology
What Baboons Can Teach Us About Social Status

ScienceDaily (May 21, 2012) — Turns out it’s not bad being top dog, or in this case, top baboon.


Wounded baboon. (Credit: Image courtesy of University of Notre Dame)

A new study by University of Notre Dame biologist Beth Archie and colleagues from Princeton and Duke Universities finds that high-ranking male baboons recover more quickly from injuries and are less likely to become ill than other males.

Archie, Jeanne Altmann of Princeton and Susan Alberts of Duke examined health records from the Amboseli Baboon Research Project in Kenya. They found that high rank is associated with faster wound healing. The finding is somewhat surprising, given that top-ranked males also experience high stress, which should suppress immune responses. They also found that social status is a better predictor of wound healing than age.

"In humans and animals, it has always been a big debate whether the stress of being on top is better or worse than the stress of being on the bottom," said Archie, lead researcher on the study. "Our results suggest that, while animals in both positions experience stress, several factors that go along with high rank might serve to protect males from the negative effects of stress."

"The power of this study is in identifying the biological mechanisms that may confer health benefits to high-ranking members of society," said George Gilchrist, program director in the National Science Foundation (NSF)’s Division of Biology, which funded the research. "We know that humans have such benefits, but it took meticulous long-term research on baboon society to tease out the specific mechanisms. The question remains of causation: Is one a society leader because of stronger immune function or vice versa?"

The researchers examined 27 years of data on naturally occurring illness and injuries in wild male baboons, a notably large data set. Although research on health and disease in laboratory animals has been quite extensive, this study is one of the most comprehensive ever conducted on animals in a natural setting.

The research team investigated how differences in age, physical condition, stress, reproductive effort and testosterone levels contribute to status-related differences in immune functions. Previous research found that high testosterone levels and intense reproductive efforts can suppress immune function and are highest among high-ranking males.

However, Archie and her colleagues found that high-ranking males were less likely to become ill and recovered faster from injuries and illnesses than low-ranking males. The authors suggest that chronic stress, old age and poor physical condition associated with low rank may suppress immune function in low-ranking males.

"The complex interplay among social context, physiology and immune system-mediated health costs and benefits illustrates the power of interdisciplinary research," said Carolyn Ehardt, NSF program director for biological anthropology, which co-funded the research. "This research begins to tease apart the trade-offs in both high and low status in primates, including ourselves, which may lead to understanding the effects of social status on death and disease — not inconsequential for society as a whole."

Source: Science Daily

May 22, 2012
#science #neuroscience #psychology #biology
Newly Discovered Protein Makes Sure Brain Development Isn't 'Botched'

ScienceDaily (May 21, 2012) — Johns Hopkins scientists have discovered a protein that appears to play an important regulatory role in deciding whether stem cells differentiate into the cells that make up the brain, as well as countless other tissues. This finding, published in the April Developmental Cell, could eventually shed light on developmental disorders as well as a variety of conditions that involve the generation of new neurons into adulthood, including depression, stroke, and posttraumatic stress disorder.

Researchers have long known that a small group of proteins called Notch plays a pivotal role in helping the immature cells present in embryos to develop into the variety of cells present throughout the body, including those that make up the brain, blood, kidneys and muscles.

"Notch signaling is involved in almost all aspects of tissue development," explains study leader Valina Dawson, Ph.D., a professor in the departments of Neurology, Neuroscience, and Physiology and co-director of the Stem Cell and Neuroregeneration Programs at the Institute for Cell Engineering at the Johns Hopkins University School of Medicine.

However, she says, even for researchers who have been studying Notch for decades, how this small group of proteins manages the development of such a diverse array of tissues and organs in the body remains unknown. It’s a pivotal mystery to solve, Dawson adds, since problems in Notch signaling seem to be involved in various cancers, Alzheimer’s disease, juvenile stroke and many other health problems.

In their new study, Dawson and her colleagues shed light on one way Notch proteins might be regulated, through a protein they recently discovered in the lab. This protein seemed to be involved in development, but at first, the researchers didn’t know its function.

To determine what purpose this protein serves in cells, Dawson, postdoctoral fellow Zhikai Chi, M.D., Ph.D., and their colleagues started by trying to determine what other proteins it’s able to bind to. By adding the mystery protein to cell cultures that expressed a variety of other proteins, they determined that the unknown protein altered cellular activity in those expressing Notch.

Since Notch is involved intimately in determining the fate of brain precursor cells, driving neural stem cells to proliferate and determining whether they become neurons or supporting cells known as glia, the researchers next examined how this mystery protein affected brain development in mouse embryos. They found that by increasing expression of the unknown protein, more neurons developed in certain parts of the developing brain, including the intermediate zone and cortical plate. In contrast, decreasing expression led to fewer neurons. Taken together, Dawson says, these experiments provided even more evidence that their unknown protein was somehow influencing Notch.

To determine exactly how the mystery protein was affecting Notch, the researchers examined the effect of the protein on neural stem cells in the process of differentiating into mature cell types. Increasing the amount of the unknown protein swayed development as if Notch wasn’t working. Since the unknown protein appeared to prevent Notch from acting on cells, the researchers named it Botch for “blocks Notch.”

With Botch’s role now clear, the researchers turned next to the mechanism behind how this protein exerts its influence. A series of experiments suggests that Botch interacts with Notch in the Golgi body, a cellular organelle involved in modifying proteins. For Notch to act in development, an immature version of this protein needs to be cleaved in order for the protein to be rearranged. Botch appears to prevent this pivotal modification from taking place, reducing the amount of mature Notch available to do its job.

Because Botch appears to play such an important role in regulating Notch, Dawson says, it could be involved in a number of diseases in which the generation of new neurons is misregulated. She and her colleagues are already performing some preliminary experiments to determine whether Botch expression might vary from the norm in diseases such as depression, which has been linked to a decrease in neurogenesis in the brain’s hippocampus. Eventually, researchers might be able to develop drugs that act on Botch to restart stalled neurogenesis, potentially treating depression and other diseases in which a lack of neurogenesis is thought to play a role.

"There are potentially some very large neurological problems that could be addressed through changing Botch activity," Dawson says.

Source: Science Daily

May 22, 2012
#science #neuroscience #brain #psychology
Weight struggles? Blame new neurons in your hypothalamus

May 21, 2012

New nerve cells formed in a select part of the brain could hold considerable sway over how much you eat and consequently weigh, new animal research by Johns Hopkins scientists suggests in a study published in the May issue of Nature Neuroscience.

The idea that the brain is still forming new nerve cells, or neurons, into adulthood has become well-established over the past several decades, says study leader Seth Blackshaw, Ph.D., an associate professor in the Solomon H. Snyder Department of Neuroscience at the Johns Hopkins University School of Medicine. However, he adds, researchers had previously thought that this process, called neurogenesis, only occurs in two brain areas: the hippocampus, involved in memory, and the olfactory bulb, involved in smell.

More recent research suggests that a third area, the hypothalamus — associated with a variety of bodily functions, including sleep, body temperature, hunger and thirst — also produces new neurons. However, the precise source of this neurogenesis and the function of these newborn neurons remained a mystery.

To answer these questions, Blackshaw and his colleagues used mice as a model system. The researchers started by investigating whether any particular part of the hypothalamus had a high level of cell growth, suggesting that neurogenesis was occurring. They injected the animals with a compound called bromodeoxyuridine (BrdU), which selectively incorporates itself into newly replicating DNA of dividing cells, where it’s readily detectable. Within a few days, the researchers found high levels of BrdU in an area of the hypothalamus called the median eminence, which lies on the base of the brain’s fluid-filled third ventricle.

Further tests showed that these rapidly proliferating cells were tanycytes, a good candidate for producing new neurons since they have many characteristics in common with cells involved in neurogenesis during early development. To confirm that tanycytes were indeed producing new neurons and not other types of cells, Blackshaw and his colleagues selectively bred mice that produced a fluorescent protein only in their tanycytes. Within a few weeks, they found neurons that also fluoresced, proof that these cells came from tanycyte progenitors.

With the source of hypothalamic neurogenesis settled, the researchers turned to the question of function. Knowing that many previous studies have suggested that animals raised on a high-fat diet are at significantly greater risk of obesity and metabolic syndrome as adults, Blackshaw’s team wondered whether hypothalamic neurogenesis might play a role in this phenomenon.

The researchers fed mice a diet of high-fat chow starting at weaning and looked for evidence of neurogenesis at several different time points. While very young animals showed no difference compared with mice fed normal chow, neurogenesis quadrupled in adults that had consistently eaten the high-fat chow since weaning. These animals gained more weight and had higher fat mass than animals raised on normal chow.

When Blackshaw and his colleagues killed off new neurons in the high-fat eaters by irradiating just their median eminences with precise X-ray beams, the mice gained significantly less weight and fat than animals who had eaten the same diet and were considerably more active, suggesting that these new neurons play a critical role in regulating weight, fat storage and energy expenditure.

"People typically think growing new neurons in the brain is a good thing — but it’s really just another way for the brain to modify behavior," Blackshaw explains. He adds that hypothalamic neurogenesis is probably a mechanism that evolved to help wild animals, and our ancestors, survive. Wild animals that encounter a rich and abundant food source would be well served to eat as much as possible, since such a resource is typically scarce in nature.

Being exposed to such a resource during youth, and consequently encouraging the growth of neurons that would promote more food intake and energy storage in the future, would be advantageous. However, Blackshaw explains, for lab animals as well as people in developed countries, who have nearly unlimited access to abundant food, such neurogenesis isn’t necessarily beneficial — it could encourage excessive weight gain and fat storage when they’re not necessary.

If the team’s work is confirmed in future studies, he adds, researchers might eventually use these findings as a basis to treat obesity by inhibiting hypothalamic neurogenesis, either by irradiating the median eminence or developing drugs that inhibit this process.

Provided by Johns Hopkins University School of Medicine

Source: medicalxpress.com

May 22, 2012
#science #neuroscience #brain #psychology
Growth factor in stem cells may spur recovery from multiple sclerosis

May 21, 2012

A substance in human mesenchymal stem cells that promotes growth appears to spur restoration of nerves and their function in rodent models of multiple sclerosis (MS), researchers at Case Western Reserve University School of Medicine have found.

Their study appeared in the online version of Nature Neuroscience on Sunday, May 20.

In animals injected with hepatocyte growth factor, inflammation declined and neural cells grew. Perhaps most important, the myelin sheath, which protects nerves and their ability to gather and send information, regrew, covering lesions caused by the disease.

"The importance of this work is we think we’ve identified the driver of the recovery," said Robert H. Miller, professor of neurosciences at the School of Medicine and vice president for research at Case Western Reserve University.

Miller, neurosciences instructor Lianhua Bai and biology professor Arnold I. Caplan, designed the study. They worked with Project Manager Anne DeChant, and research assistants Jordan Hecker, Janet Kranso and Anita Zaremba, from the School of Medicine; and Donald P. Lennon, a research assistant from the university’s Skeletal Research Center.

In MS, the immune system attacks myelin, risking injury to exposed nerves’ intricate wiring. When damaged, nerve signals can be interrupted, causing loss of balance and coordination, cognitive ability and other functions. Over time, intermittent losses may become permanent.

Miller and Caplan reported in 2009 that when they injected human mesenchymal stem cells into rodent models of MS, the animals recovered from the damage wrought by the disease. Based on their work, a clinical trial is underway in which MS patients are injected with their own stem cells.

In this study, the researchers first wanted to test whether it is the presence of the stem cells themselves, or something the cells produce, that promotes recovery. They injected mice with the medium in which mesenchymal stem cells, culled from bone marrow, grew.

All 11 animals, which have a version of MS, showed a rapid reduction in functional deficits.

Analysis showed that the disease remained on course unless the injected molecules were of a certain size; that is, a molecular weight between 50 and 100 kilodaltons.

Research by others and results of their own work indicated hepatocyte growth factor, which is secreted by mesenchymal stem cells, was a likely instigator.

The scientists injected animals with 50 or 100 nanograms of the growth factor every other day for five days. The level of signaling molecules that promote inflammation decreased while the level of signaling molecules that counter inflammation increased. Neural cells grew and nerves laid bare by MS were rewrapped with myelin. The 100-nanogram injections appeared to provide slightly better recovery.

To test the system further, researchers tied up cell-surface receptors, in this case cMet receptors that are known to work with the growth factor.

When they jammed the receptors with a function-blocking cMet antibody, neither the mesenchymal stem cell medium nor the hepatocyte growth factor injections had any effect on the disease. In another test, injections of an anti-hepatocyte growth factor also blocked recovery.

The researchers will continue their studies, to determine if they can screen mesenchymal stem cells for those that produce the higher amounts of hepatocyte growth factor needed for effective treatment. That could lead to a more precise cell therapy.

"Could we now take away the mesenchymal stem cells and treat only with hepatocyte growth factor?" Miller asked. "We’ve shown we can do that in an animal, but it’s not clear if we can do that in a patient."

They also plan to test whether other factors may be used to stimulate the cMet receptors and induce recovery.

Provided by Case Western Reserve University

Source: medicalxpress.com

May 21, 2012
#science #neuroscience #brain #psychology
Rare neurons discovered in monkey brains

May 21, 2012

Max Planck scientists discover brain cells in monkeys that may be linked to self-awareness and empathy in humans.

The anterior insular cortex is a small brain region that plays a crucial role in human self-awareness and in related neuropsychiatric disorders. A unique cell type – the von Economo neuron (VEN) – is located there. For a long time, the VEN was assumed to be unique to humans, great apes, whales and elephants. Henry Evrard, neuroanatomist at the Max Planck Institute for Biological Cybernetics in Tübingen, Germany, has now discovered that the VEN also occurs in the insula of macaque monkeys. The morphology, size and distribution of the monkey VEN suggest that it is at least a primal anatomical homolog of the human VEN. This finding offers new and much-needed opportunities to examine in detail the connections and functions of a cell and brain region that could have a key role in human self-awareness and in mental disorders including autism and specific forms of dementia.

The insular cortex, or simply insula, is a hidden cortical region folded and tucked away deep in the brain – an island within the cortex. Within the last decade, the insula has emerged from obscurity as having a key role in diverse functions usually linked to our internal bodily states, to our emotions, to our self-awareness, and to our social interactions. The very anterior part of the insula in particular is where humans consciously sense subjective emotions, such as love, hate, resentment, self-confidence or embarrassment. In relation to these feelings, the anterior insula is involved in various psychopathologies. Damage to the insula leads to apathy and to the inability to tell what feelings we or our conversational partner experience. These deficits, and alterations of the insula itself, are also encountered in autism and other highly detrimental neuropsychiatric disorders, including the behavioural variant of frontotemporal dementia (bvFTD).

The von Economo neuron (VEN) occurs almost exclusively in the anterior insula and anterior cingulate cortex. Until recently it was believed that the VEN is present only in humans, great apes and some large-brained mammals with complex social behaviour, such as whales and elephants. In contrast to the typical neighbouring pyramidal neuron, which is present in all mammals and all brain regions, the VEN has a peculiar spindle shape and is about three times as large. The VENs’ numerical density is selectively altered in autism and bvFTD. Henry Evrard and his team at the Max Planck Institute for Biological Cybernetics in Tübingen have now discovered VENs in the anterior insula of macaque monkeys. His present work provides compelling evidence that monkeys possess at least a primitive form of the human VEN, although they do not have the ability to recognize themselves in a mirror, a behavioural hallmark of self-awareness.

"This means that, contrary to previous belief, highly concentrated VEN populations are not exclusive to hominids, but also occur in other primate species," explains Henry Evrard. "The VEN phylogeny needs to be reexamined. Most importantly, the much-needed analysis of the connections and physiology of these specific neurons is now possible." Knowing the functions of the VEN and its connections to other regions of the brain in monkeys could give us clues about the evolution of the anatomical substrate of self-awareness in humans, and may help us better understand serious neuropsychiatric disabilities including autism, or even addictions such as those to drugs or smoking.

Provided by Max Planck Society

Source: medicalxpress.com

May 21, 2012
#science #neuroscience #brain #psychology #neuron
Research holds out hope for stroke patients

May 21, 2012

(Medical Xpress) — People with a curious condition that causes them to apply make-up on only one side of their face, or ignore food on half of their plate, are playing a new role in understanding stroke recovery.

Researchers from the Queensland Brain Institute (QBI) at The University of Queensland have found that patients with the condition, known as ‘unilateral spatial neglect’, tend to have the worst outcomes in regaining lost bodily function after stroke, leading them to believe attention may have an important impact on recovering successfully.

Unilateral spatial neglect is typically caused by strokes on the right hand side of the brain and manifests in patients ignoring the left side of their body.

People with the condition may ignore food on the left hand side of their plate or, if asked to draw a clock, squash all 12 numbers into the right side of the clock face, leaving the other side blank.

They may also fail to shave, or to put make-up on, the left side of their faces. In severe cases, they behave as though the left side of their world does not exist.

“We know that brain plasticity plays a critical role in recovering from stroke,” says Professor Jason Mattingley, who holds the Foundation Chair in Cognitive Neuroscience at The University of Queensland.

“The fact that people with spatial neglect tend to have poorer recovery of motor function suggested to us that attention may be important for guiding plasticity following stroke.”

Current research being undertaken by the Mattingley laboratory is exploring this link.

“What we’re trying to do is explore what effect attention has on brain plasticity, and how attention might be used in neurorehabilitation” says Professor Mattingley.

Volunteers first undergo a magnetic resonance imaging (MRI) scan, which provides researchers with a three-dimensional picture of the brain.

“In terms of their structure, brains are like fingerprints – no two are exactly the same, even though superficially they seem very similar,” Professor Mattingley explains.

The MRI scan allows researchers to guide a transcranial magnetic stimulation (TMS) coil into position upon a volunteer’s scalp.

The device induces a small electrical current in the underlying brain tissue, causing it to become more active.

The researchers specifically target a part of the motor cortex that controls the thumb muscle in the left hand.

“It’s well established that the more often neurons activate at the same time, the more likely they are to communicate efficiently in the future. This is how the brain learns,” says Professor Mattingley.

“We’re exploiting that general principle in this research.”
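The principle Professor Mattingley describes, often summarized as "cells that fire together wire together," can be illustrated with a minimal Hebbian update rule. This is a textbook toy model, not the study's TMS protocol; the weight values and learning rate are invented purely for illustration:

```python
# Toy Hebbian update: a connection weight grows only when the two
# neurons it links are active at the same time. Textbook illustration
# only; the numbers here are invented, not from the study.

def hebbian_update(weight, pre_active, post_active, rate=0.1):
    """Strengthen the synapse only when both neurons fire together."""
    if pre_active and post_active:
        weight += rate
    return weight

w = 0.5
# Repeated coincident activity strengthens the connection...
for _ in range(5):
    w = hebbian_update(w, True, True)
# ...while uncorrelated activity leaves it unchanged.
w = hebbian_update(w, True, False)
print(round(w, 2))  # 1.0
```

Repeatedly pairing pre- and postsynaptic activity drives the weight up, which is the sense in which coincident firing makes future communication more efficient.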

Dr Marc Kamke, Research Fellow at QBI explains: “By adjusting the type of brain stimulation delivered we can artificially induce short-term changes that resemble naturally-occurring plasticity.”

But what the researchers have found is that the effects of stimulation upon a brain’s plasticity are dependent on attention.

“When we ask people to undertake a visual task that is irrelevant to the brain stimulation, but that demands a great deal of their attention, we observe a reduction in plasticity,” Dr Marc Kamke explains.

“When the task does not require much attention, however, the brain’s plastic response is apparent.”

“These results show that attention plays an important role in guiding brain plasticity,” says Professor Mattingley.

He adds, “while practical applications remain several steps away, this knowledge may ultimately help us develop more effective strategies for physical therapy after stroke.”

The results of the research, which was funded by the National Health and Medical Research Council of Australia, are published this week in The Journal of Neuroscience.

Provided by University of Queensland 

Source: medicalxpress.com

May 21, 2012
#science #neuroscience #brain #psychology #stroke
Songbirds' Learning Hub in Brain Offers Insight Into Motor Control

ScienceDaily (May 20, 2012) — To learn its signature melody, the male songbird uses a trial-and-error process to mimic the song of its father, singing the tune over and over again, hundreds of times a day, making subtle changes in the pitch of the notes. For the male Bengalese finch, this rigorous training process begins around the age of 40 days and is completed about day 90, just as he becomes sexually mature and ready to use his song to woo females.

(Credit: © fasphotographic / Fotolia)

To accomplish this feat, the finch’s brain must receive and process large quantities of information about its performance and use that data to precisely control the complex vocal actions that allow it to modify the pitch and pattern of its song.

Now, scientists at UCSF have shown that a key brain structure acts as a learning hub, receiving information from other regions of the brain and figuring out how to use that information to improve its song, even when it’s not directly controlling the action. These insights may help scientists figure out new ways to treat neurological disorders that impair movement such as Huntington’s disease and Parkinson’s disease.

The research is reported as an advance online publication on May 20, 2012 by the journal Nature, and will appear at a later date in the journal’s print edition.

Years of research conducted in the lab of Michael Brainard, PhD, an associate professor of physiology at UCSF, has shown that adult finches can keep track of slight differences in the individual “syllables,” or notes, they play and hear, and make mental computations that allow them to alter the pitch.

For previous experiments, Brainard and his colleagues developed a training process that induced adult finches to calibrate their song. They created a computer program that could recognize the pitch of every syllable the bird sang. The computer also delivered a sound the birds didn’t like — a kind of white noise — at the very moment they uttered a specific note. Within a few hours, the finches learned to alter the pitch of that syllable to avoid hearing the unpleasant sound.
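The program described above boils down to a simple contingency rule: punish one target syllable whenever its pitch crosses a threshold. A hypothetical sketch of that rule follows; the syllable label, threshold, and pitch values are invented for illustration, and the lab's actual software of course works on live audio rather than pre-labeled notes:

```python
# Hypothetical sketch of the pitch-contingent training program.
# TARGET_SYLLABLE, the threshold, and the pitches are invented values.

TARGET_SYLLABLE = "b"        # the one syllable the program punishes
PITCH_THRESHOLD_HZ = 2800.0  # singing above this triggers the noise

def should_play_noise(syllable, pitch_hz):
    """True when the aversive white noise should be delivered."""
    return syllable == TARGET_SYLLABLE and pitch_hz > PITCH_THRESHOLD_HZ

# One rendition of a song: (syllable, pitch in Hz). Only the target
# syllable, sung above threshold, draws the white-noise blast, so the
# bird can escape punishment by lowering that syllable's pitch.
song = [("a", 2400.0), ("b", 2900.0), ("c", 3100.0)]
punished = [s for s, p in song if should_play_noise(s, p)]
print(punished)  # ['b']
```

Because only the target syllable is ever punished, the birds learn within hours to shift that syllable's pitch, which is the behavior the experiments exploit.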

In the new research, the UCSF neuroscientists used their technology to investigate how the learning process is controlled by the brain. A prevailing theory suggests that new learning is controlled by a “smart” brain structure called the basal ganglia, a cluster of interconnected brain regions involved in motor control and learning.

"It’s the first place where the brain is putting two and two together," said Jonathan Charlesworth, a recent graduate of UCSF’s neuroscience PhD program and the first author of the new paper. "If you remove the basal ganglia in a bird that hasn’t yet learned to sing, it will never learn to do so."

Once a basic, frequently repeated skill such as typing, singing the same song or shooting a basketball from the free-throw line is learned, the theory suggests, control of that activity is carried out by the motor pathway, the part of the nervous system that transmits signals from the brain to muscles. But for the basic routine to change — for a player to shoot from another spot on the basketball court or a bird to sing at a different pitch — the basal ganglia must again get involved, providing feedback that allows learning based on trial and error, the theory suggests.

What remained unclear is what makes the basal ganglia so “smart” and enables them to support such detailed trial-and-error learning. Was it something to do with their structure? Or were they getting information from elsewhere?

The scientists sought to answer this question by blocking the output of a key basal ganglia circuit while training male finches to alter their song using the white-noise blasts. As long as the basal ganglia were kept from sending signals to the motor pathway, the finches didn’t change their tune or show signs of learning. But when Brainard’s team stopped blocking the basal ganglia, something surprising happened: the finches immediately changed the pitch of their song, with no additional practice.

"It’s as if a golfer went to the driving range and was terrible, hitting the ball into the trees all day and not getting any better," said Charlesworth. "Then, at the end of the day, you throw a switch and all of a sudden you’re hitting the fairway like you’re Tiger Woods."

Normally, you’d expect improvement in skill performance like this to take time as the basal ganglia evaluate information, make changes and get new feedback, Brainard said.

"The surprise here is that the basal ganglia can pay attention, observe what other motor structures are doing and get information even when they aren’t involved in motor control," Brainard said. "They covertly learned how to improve skill performance and this explains how they did it."

These findings suggest that the basal ganglia’s “smartness” is due in large part to the steady flow of information they receive about the commands of other motor structures. It also portrays the basal ganglia as far more versatile than previously understood, able to learn how to calibrate fine-motor skills by acting as a specialized hub that receives information from various parts of the brain and responds to that information with new directives.

The findings also support the notion that problems in the basal ganglia circuit’s ability to receive information and learn from it may help trigger the movement disorders that are symptoms of Huntington’s and Parkinson’s, Brainard said.

Source: Science Daily

May 21, 2012
#science #neuroscience #brain #psychology
Oxytocin Improves Brain Function in Children With Autism

ScienceDaily (May 19, 2012) — Preliminary results from an ongoing, large-scale study by Yale School of Medicine researchers show that oxytocin — a naturally occurring substance produced in the brain and throughout the body — increased brain function in regions that are known to process social information in children and adolescents with autism spectrum disorders (ASD).

(Credit: Image courtesy of Yale University)

A Yale Child Study Center research team that includes postdoctoral fellow Ilanit Gordon and Kevin Pelphrey, the Harris Associate Professor of Child Psychiatry and Psychology, will present the results on May 19 at the International Meeting for Autism Research.

"Our findings provide the first, critical steps toward devising more effective treatments for the core social deficits in autism, which may involve a combination of clinical interventions with an administration of oxytocin," said Gordon. "Such a treatment approach will fundamentally improve our understanding of autism and its treatment."

Social-communicative dysfunctions are a core characteristic of autism, a neurodevelopmental disorder that can have an enormous emotional and financial burden on the affected individual, their families, and society.

Gordon said that while a great deal of progress has been made in the field of autism research, there remain few effective treatments and none that directly target the core social dysfunction. Oxytocin has recently received attention for its involvement in regulating social abilities because of its role in many aspects of social behavior and social cognition in humans and other species.

To assess the impact of oxytocin on brain function, Gordon and her team conducted a first-of-its-kind, double-blind, placebo-controlled study on children and adolescents aged 7 to 18 with ASD. The team members gave the children a single dose of oxytocin in a nasal spray and used functional magnetic resonance brain imaging to observe its effect.

The team found that oxytocin increased activations in brain regions known to process social information. Gordon said these brain activations were linked to tasks involving multiple social information processing routes, such as seeing, hearing, and processing information relevant to understanding other people.

Source: Science Daily

May 20, 2012
#science #neuroscience #brain #psychology #autism
How Exercise Affects the Brain: Age and Genetics Play a Role

ScienceDaily (May 18, 2012) — Exercise clears the mind. It gets the blood pumping and more oxygen is delivered to the brain. This is familiar territory, but Dartmouth’s David Bucci thinks there is much more going on.

(Credit: © Galina Barskaya / Fotolia)

"In the last several years there have been data suggesting that neurobiological changes are happening — [there are] very brain-specific mechanisms at work here," says Bucci, an associate professor in the Department of Psychological and Brain Sciences.

From his studies, Bucci and his collaborators have revealed important new findings:

  • The effects of exercise on both memory and the brain differ depending on whether the exerciser is an adolescent or an adult.
  • A gene has been identified which seems to mediate the degree to which exercise has a beneficial effect. This has implications for the potential use of exercise as an intervention for mental illness.

Bucci began his pursuit of the link between exercise and memory with attention deficit hyperactivity disorder (ADHD), one of the most common childhood psychological disorders. Bucci is concerned that the treatment of choice seems to be medication.

"The notion of pumping children full of psycho-stimulants at an early age is troublesome," Bucci cautions. "We frankly don’t know the long-term effects of administering drugs at an early age — drugs that affect the brain — so looking for alternative therapies is clearly important."

Anecdotal evidence from colleagues at the University of Vermont started Bucci down the ADHD track. In observations of children with ADHD at Vermont summer camps, those who were athletes or played team sports responded better to behavioral interventions than more sedentary children did. While systematic empirical data are lacking, this association of exercise with a reduction in characteristic ADHD behaviors was persuasive enough for Bucci.

This lead, coupled with his interest in learning and memory and their underlying brain functions, prompted Bucci and teams of graduate and undergraduate students to embark upon a project of scientific inquiry investigating the potential connection between exercise and brain function. They published papers documenting their results, with the most recent now available in the online version of the journal Neuroscience.

Bucci is quick to point out that “the teams of both graduate and undergraduates are responsible for all this work, certainly not just me.” Michael Hopkins, a graduate student at the time, is first author on the papers.

Early on, experiments with laboratory rats that exhibit ADHD-like behavior demonstrated that exercise reduced the extent of these behaviors. The researchers also found that exercise was more beneficial for female rats than for males, mirroring the way it differentially affects male and female children with ADHD.

Moving forward, they investigated a mechanism through which exercise seems to improve learning and memory: “brain-derived neurotrophic factor” (BDNF), which is involved in growth of the developing brain. The degree of BDNF expression in exercising rats correlated positively with improved memory, and exercising as an adolescent had longer-lasting effects than the same duration of exercise undertaken as an adult.

"The implication is that exercising during development, as your brain is growing, is changing the brain in concert with normal developmental changes, resulting in your having more permanent wiring of the brain in support of things like learning and memory," says Bucci. "It seems important to [exercise] early in life."

Bucci’s latest paper was a move to take the studies of exercise and memory in rats and apply them to humans. The subjects in this new study were Dartmouth undergraduates and individuals recruited from the Hanover community.

Bucci says that, “the really interesting finding was that, depending on the person’s genotype for that trophic factor [BDNF], they either did or did not reap the benefits of exercise on learning and memory. This could mean that you may be able to predict which ADHD child, if we genotype them and look at their DNA, would respond to exercise as a treatment and which ones wouldn’t.”

Bucci concludes that the notion that exercise is good for health including mental health is not a huge surprise. “The interesting question in terms of mental health and cognitive function is how exercise affects mental function and the brain.” This is the question Bucci, his colleagues, and students continue to pursue.

Source: Science Daily

May 19, 2012
#science #neuroscience #brain #psychology
Acid in the brain: Team develops new way to look at brain function

May 18, 2012

University of Iowa neuroscientist John Wemmie, M.D., Ph.D., is interested in the effect of acid in the brain. His studies suggest that increased acidity, or low pH, in the brain is linked to panic disorders, anxiety, and depression. But his work also suggests that changes in acidity are important for normal brain activity too.


University of Iowa researchers have developed an MRI-based method to detect and monitor pH changes in living brains. The image shows MRI brain scans of human subject breathing air (left) or air containing 7.5 percent carbon dioxide (middle). The difference between the two scans (shown right) shows increased brain acidity in red caused by carbon dioxide inhalation as measured by the new MRI-based strategy. Credit: Vincent Magnotta, University of Iowa

"We are interested in the idea that pH might be changing in the functional brain because we’ve been hot on the trail of receptors that are activated by low pH,” says Wemmie, a UI associate professor of psychiatry. “The presence of these receptors implies the possibility that low pH might be playing a signaling role in normal brain function.”

Wemmie’s studies have shown that these acid-sensing proteins are required for normal fear responses and for learning and memory in mice. However, while you can buy a kit to measure the pH (acidity) of your garden soil, there currently is no easy way to measure pH changes in the brain.

Wemmie teamed up with Vincent Magnotta, Ph.D., UI associate professor of radiology, psychiatry, and biomedical engineering. Drawing on Magnotta’s expertise in developing MRI (magnetic resonance imaging)-based brain imaging techniques, the researchers developed and tested a new, non-invasive method to detect and monitor pH changes in living brains.

According to Wemmie, the new imaging technique provides the best evidence so far that pH changes do occur with normal function in the intact human brain. The findings were published May 7 in the Proceedings of the National Academy of Sciences (PNAS) Early Edition.

Specifically, the study showed the MRI-based method was able to detect global changes in brain pH in mice. Breathing carbon dioxide, which lowers pH (makes the brain more acidic), increased the signal, while bicarbonate injections, which raise brain pH, decreased the MRI signal. The relationship between the signal and the pH was linear over the range that was tested.
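Because the reported relationship is linear, a signal-to-pH calibration could in principle be fit with ordinary least squares. The sketch below uses invented (pH, signal) pairs purely to illustrate the idea; the study's actual calibration data are not reproduced here:

```python
# Hypothetical signal-to-pH calibration via ordinary least squares.
# The (pH, signal) pairs are invented for this example; the study only
# reports that the relationship was linear over the tested range.

ph_values = [6.8, 6.9, 7.0, 7.1, 7.2]
signals = [1.30, 1.25, 1.20, 1.15, 1.10]  # lower pH -> stronger signal

n = len(ph_values)
mean_x = sum(ph_values) / n
mean_y = sum(signals) / n
# Least-squares fit of signal = slope * pH + intercept.
slope = (sum((x - mean_x) * (y - mean_y)
             for x, y in zip(ph_values, signals))
         / sum((x - mean_x) ** 2 for x in ph_values))
intercept = mean_y - slope * mean_x

def ph_from_signal(signal):
    """Invert the fitted line to estimate pH from a measured signal."""
    return (signal - intercept) / slope

print(round(slope, 3), round(ph_from_signal(1.20), 2))  # -0.5 7.0
```

With a calibration like this in hand, a measured signal in any brain region could be mapped back to an estimated local pH, which is what makes the linearity of the relationship useful.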

Importantly, the method also seems able to detect localized brain activity. When human volunteers viewed a flashing checkerboard — a classic experiment that activates a particular brain region involved in vision — the MRI method detected a drop in pH in that region. The team also confirmed the pH drop using other methods.

"Our study tells us, first, we have a technique that we believe can measure pH changes in the brain, and second, this MRI-based technique suggests that pH changes do occur with brain function,” Magnotta says.

"The results support our original idea that brain activity can change local pH in human brains during normal activity, meaning that pH change in conjunction with the pH-sensitive receptors could be part of a signaling system that affects brain activity and cognitive function," Wemmie adds.

A new way to view brain activity

Importantly, this technique may also provide a new way to image the brain.

Currently, functional MRI (fMRI) measures brain activity by detecting a signal that’s due to oxygen levels in the blood flowing to active brain regions. The UI team showed that their method responds to pH changes but is not influenced by changes in blood oxygenation. Conversely, fMRI does not respond to changes in pH.

"What we show is our method of detecting brain activity probably depends on pH changes and, more than that, it is distinct from the signal that fMRI measures," says Wemmie. "This gives us another tool to study brain activity."

pH and brain function

Wemmie’s previous studies have suggested a role for pH changes in certain psychiatric diseases, including anxiety and depression. With the new method, he and his colleagues hope to explore how pH is involved in these conditions.

“Brain activity is likely different in people with brain disorders, such as bipolar disorder or depression, and that might be reflected in this measure,” Wemmie says. “And perhaps most important, at the end of the day: could this signal be abnormal or perturbed in human psychiatric disease? And if so, might it be a target for manipulation and treatment?”

Provided by University of Iowa

Source: medicalxpress.com

May 19, 2012
#science #neuroscience #brain #psychology
With fat: What's good or bad for the heart may be the same for the brain

May 18, 2012

It has been known for years that eating too many foods containing “bad” fats, such as saturated fats or trans fats, isn’t healthy for your heart. However, according to new research from Brigham and Women’s Hospital (BWH), one “bad” fat—saturated fat—was found to be associated with worse overall cognitive function and memory in women over time. By contrast, a “good” fat—monounsaturated fat—was associated with better overall cognitive function and memory.

This study is published online by Annals of Neurology, a journal of the American Neurological Association and Child Neurology Society, on May 18, 2012.

The research team analyzed data from the Women’s Health Study—originally a cohort of nearly 40,000 women, 45 years and older. The researchers focused on data from a subset of 6,000 women, all over the age of 65. The women participated in three cognitive function tests, which were spaced out every two years for an average testing span of four years. These women filled out very detailed food frequency surveys at the start of the Women’s Health Study, prior to the cognitive testing.

"When looking at changes in cognitive function, what we found is that the total amount of fat intake did not really matter, but the type of fat did,” explained Olivia Okereke, MD, MS, BWH Department of Psychiatry.

Women who consumed the highest amounts of saturated fat, which can come from animal fats such as red meat and butter, had worse overall cognition and memory over the four years of testing than those who consumed the lowest amounts. Women who ate the most monounsaturated fat, which can be found in olive oil, had better patterns of cognitive scores over time.

"Our findings have significant public health implications," said Okereke. "Substituting in the good fat in place of the bad fat is a fairly simple dietary modification that could help prevent decline in memory."

Okereke notes that strategies to prevent cognitive decline in older people are particularly important. Even subtle declines in cognitive functioning can lead to higher risk of developing more serious problems, like dementia and Alzheimer’s disease.

Provided by Brigham and Women’s Hospital

Source: medicalxpress.com

May 18, 2012
#science #neuroscience #brain #psychology
Various metabolic risk factors could be linked to diabetes-related pain with major implications for treatment

May 17, 2012

Around 1 in 50 people in the general population and 1 in 6 of those aged over 40 years experience neuropathy (damage to the nerves of the peripheral nervous system), which can cause numbness, tingling, pain, or weakness. The most common cause of neuropathy is diabetes, and up to half of diabetes patients can be affected. Currently, the only treatments for neuropathy are glucose control (which often only delays it) and pain management. Yet less than half of patients are treated for pain, despite the availability of many effective therapies. Growing evidence suggests that various metabolic risk factors, including prediabetes, could be linked with neuropathy and thus be targets for new disease-modifying drugs. The issues are discussed in a Review in the June issue of The Lancet Neurology, by Dr Brian C Callaghan and colleagues, all of the University of Michigan, Ann Arbor, MI, USA.

Diabetes can cause various patterns of so-called diabetic neuropathy, but the most common presentation is a distal symmetrical polyneuropathy (DSP), in which symptoms begin in the feet and spread up the limbs. Patients experience decreased quality of life, both physically and mentally. DSP can cause balance problems, which may lead to falls. Neuropathy is one of three main risk factors for falls in patients with diabetes, along with retinopathy and vestibular dysfunction. Patients with diabetic DSP are two to three times more likely to fall than those with diabetes and no neuropathy. Additionally, patients with severe DSP are at risk of ulcerations and lower-extremity amputations, with 15% developing an ulcer during the course of their disease. Diabetes is the leading cause of lower-extremity amputations, roughly 80,000 of which are undertaken in the USA every year in patients with the disorder. Indeed, patients with diabetes are 15 times more likely than people without diabetes to have this life-changing complication.

Overall, costs associated with diabetic neuropathy in the USA are estimated to be between $4.6 billion and $13.7 billion, with most of the expense attributed to those with type 2 diabetes. Indeed, neuropathy is associated with a quarter of the total costs of diabetes care in the USA.

Since the data linking prediabetes (a condition with higher than normal blood sugar levels, but not yet high enough for a diabetes diagnosis) with neuropathy are conflicting, a comprehensive study is needed to establish whether or not it is one of the metabolic drivers that underlie the onset and progression of neuropathy. The answer has direct implications for potential therapies for many patients with neuropathy. Currently one third of adult Americans meet criteria for prediabetes, but less than 5% of these people have received a formal diagnosis of prediabetes from their health-care providers and only a small percentage are being treated. Establishing a causal relation between prediabetes and neuropathy would change the clinical management of a substantial number of patients.

Research suggests that various metabolic factors (components of ‘metabolic syndrome’) other than blood glucose control—such as levels of LDL (bad) cholesterol and high blood pressure—might have a role in the development of neuropathy. The authors say that there are promising lines of investigation that could lead to improved prevention and treatment of the disorder. The magnitude of the effect of glucose control on neuropathy is much smaller in patients with type 2 diabetes than in those with type 1 diabetes. In view of this small effect size and the fact that many patients with type 2 diabetes continue to develop neuropathy despite adequate glucose control, discovery of modifiable risk factors for neuropathy is essential. Callaghan and colleagues are currently conducting such a study.

The authors conclude: “Components of the metabolic syndrome, including prediabetes, are potential risk factors for neuropathy, and studies are needed to establish whether they are causally related to neuropathy. These lines of enquiry will have direct implications for the development of new treatments for diabetic neuropathy.”

Provided by Lancet

Source: medicalxpress.com

May 18, 2012
#neuroscience #pain #psychology #science
Training the Brain Could Help Reduce Pain

ScienceDaily (May 17, 2012) — Training the brain to reduce pain could be a promising approach for treating phantom limb pain and complex regional pain syndrome, according to an internationally known neuroscience researcher speaking May 17 at the American Pain Society’s Annual Scientific Meeting.

G. Lorimer Moseley, PhD, professor of clinical neurosciences at University of South Australia and Neuroscience Research Australia, and head of the Body in Mind research team, told the plenary session audience that the brain stores maps of the body that are integrated with neurological systems that survey, regulate, and protect the integrity of the body physically and psychologically. These cortical maps govern movement, sensation and perception, and there is growing evidence, according to Moseley, showing that disruptions of brain maps occur in people with chronic pain. The best evidence is from those with phantom limb pain and complex regional pain syndrome, but there is also data from chronic back pain.

Moseley’s research is focused on the role of the brain and mind in chronic and complex pain disorders. Through collaborations with clinicians, scientists and patients, the Body in Mind team is exploring how the brain and its representation of the body change when pain persists, how the mind influences physiological regulation of the body, and how the changes in the brain and mind can be normalized with treatment.

"We’re learning that chronic pain is associated with disruption of brain maps of the body and of the space around the body. When the brain determines the location of a sensory event, it integrates the location of the event in the body with a map of space. Disruption of these processes might be contributing to the problem," said Moseley. He added that it is possible for the body to be unharmed but the brain will respond by causing pain because it misinterpreted a benign stimulus as an attack. "We want to gradually train the brain to stop trying to protect body tissue that doesn’t need protecting."

Moseley said the brain can “rewire” itself, a process called neuroplasticity. Often painful stimuli triggered by a broken bone or other trauma cause the brain to rewire and, as a result, the damage signal is never switched off after the initial body trauma is resolved. The result: Chronic pain. So if the brain is capable of changing to cause persistent pain, can it be changed back to normal to alleviate pain?

"The brain is the focal point of the pain experience, but the plasticity phenomena can be harnessed to help alleviate pain," Moseley said.

He further stated that disrupted cortical body maps may contribute to the development or maintenance of chronic pain and, therefore, could be viable targets for treatment. One treatment approach involves targeting motor systems through a process Moseley calls graded motor imagery. It relies on using visual images to help the brain change its perceptions of the body after prolonged pain stimuli. “For someone with phantom limb pain, the brain’s body map still includes the severed arm or leg, and without any real stimuli from the region, it continues to produce pain,” Moseley explained.

He reported that studies with graded motor imagery have shown encouraging results in complex regional pain syndrome and in phantom limb pain.

"Our work shows that the complex neural connections in the brain not only are associated with chronic pain, they can be reconnected or manipulated through therapy that alters brain perceptions and produce pain relief," said Moseley.

Source: Science Daily

May 18, 2012
#science #neuroscience #brain #psychology #pain
Pain Relief Through Distraction: It's Not All in Your Head

ScienceDaily (May 17, 2012) — Mental distractions make pain easier to take, and those pain-relieving effects aren’t just in your head, according to a report published online on May 17 in Current Biology, a Cell Press publication.

The findings, based on high-resolution spinal fMRI (functional magnetic resonance imaging) recorded as people experienced painful levels of heat, show that mental distractions actually inhibit the response to incoming pain signals at the earliest stage of central pain processing.

"The results demonstrate that this phenomenon is not just a psychological phenomenon, but an active neuronal mechanism reducing the amount of pain signals ascending from the spinal cord to higher-order brain regions," said Christian Sprenger of the University Medical Center Hamburg-Eppendorf.

Those effects involve endogenous opioids, which are naturally produced by the brain and play a key role in the relief of pain, the new evidence shows.

The research group asked participants to complete either a hard or an easy memory task, both requiring them to remember letters, while the researchers simultaneously applied a painful level of heat to their arms.

When study participants were more distracted by the harder of the two memory tasks, they did indeed perceive less pain. What’s more, their less painful experience was reflected by lower activity in the spinal cord as observed by fMRI scans. (fMRI is often used to measure changes in brain activity, Sprenger explained, and recent advances have made it possible to extend this tool for use in the spinal cord.)

Sprenger and colleagues then repeated the study, this time giving participants either a drug called naloxone, which blocks the effects of opioids, or a simple saline infusion. The pain-relieving effects of distraction dropped by 40 percent during the application of the opioid antagonist compared to saline, evidence that endogenous opioids play an essential role.

The findings show just how deeply mental processes can go in altering the experience of pain, and that may have clinical importance.

"Our findings strengthen the role of cognitive-behavioral therapeutic approaches in the treatment of pain diseases, as it could be extrapolated that these approaches might also have the potential to alter the underlying neurobiological mechanisms as early as in the spinal cord," the researchers say.

Source: Science Daily

May 17, 2012
#science #neuroscience #brain #psychology #pain
Suspicion resides in two regions of the brain

May 17, 2012

Fool me once, shame on you. Fool me twice, shame on my parahippocampal gyrus.


Read Montague, Ph.D., and colleagues at the Virginia Tech Carilion Research Institute discovered two distinct sites for suspicion in the brain: the amygdala, which correlates strongly with a baseline distrustfulness, and the parahippocampal gyrus, which acts like a cerebral lie detector. Credit: Virginia Tech

Scientists at the Virginia Tech Carilion Research Institute have found that suspicion resides in two distinct regions of the brain: the amygdala, which plays a central role in processing fear and emotional memories, and the parahippocampal gyrus, which is associated with declarative memory and the recognition of scenes.

"We wondered how individuals assess the credibility of other people in simple social interactions," said Read Montague, director of the Human Neuroimaging Laboratory and the Computational Psychiatry Unit at the Virginia Tech Carilion Research Institute, who led the study. "We found a strong correlation between the amygdala and a baseline level of distrust, which may be based on a person’s beliefs about the trustworthiness of other people in general, his or her emotional state, and the situation at hand. What surprised us, though, is that when other people’s behavior aroused suspicion, the parahippocampal gyrus lit up, acting like an inborn lie detector.”

The scientists used functional magnetic resonance imaging, or fMRI, to study the neural basis of suspicion. Seventy-six pairs of players, each with a buyer and a seller, competed in 60 rounds of a simple bargaining game while having their brains scanned. At the beginning of each round, the buyer would learn the value of a hypothetical widget and suggest a price to the seller. The seller would then set the price. If the seller’s price fell below the widget’s given value, the trade would go through, with the seller receiving the selling price and the buyer receiving any difference between the selling price and the actual value. If the seller’s price exceeded the value, though, the trade would not execute, and neither party would receive cash.
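The round-by-round payoff rule of the bargaining game described above is simple enough to sketch in code. The following Python function is a minimal illustration, not part of the study; the function name, the variable names, and the handling of the equal-price case (which the article does not specify) are all assumptions.

```python
def settle_round(value, seller_price):
    """Payoffs for one round of the widget bargaining game.

    Per the description above: if the seller's price falls below the
    widget's true value, the trade goes through, with the seller
    receiving the price and the buyer keeping the difference.
    Otherwise the trade does not execute and neither party receives
    anything. (The equal-price case is unspecified in the article;
    here it is treated as a failed trade.)
    """
    if seller_price < value:
        return value - seller_price, seller_price  # (buyer, seller)
    return 0, 0
```

For example, with a widget worth 10 and a seller price of 6, the buyer nets 4 and the seller 6; at any price of 10 or above, both earn nothing.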

The authors found, as detailed in a previous paper, that buyers fell into three strategic categories: 42 percent were incrementalists, who were relatively honest about the widget’s value; 37 percent were conservatives, who adopted the strategy of withholding information; and 21 percent were strategists, who were actively deceptive, mimicking incrementalist behavior by sending high suggestions during low-value trials and then reaping greater benefits by sending low suggestions during high-value trials.

The sellers had a monetary incentive to read the buyers’ strategic profiles correctly, yet they received no feedback about the accuracy of the information they were receiving, so they could not confirm any suspicions about patterns of behavior. Without feedback, the sellers were forced to decide whether they should trust the buyers based on the pricing suggestions alone. “The more uncertain a seller was about a buyer’s credibility,” Montague said, “the more active his or her parahippocampal gyrus became.”

The authors believe a person’s baseline suspicion may have important consequences for his or her financial success. “People with a high baseline suspicion were often interacting with fairly trustworthy buyers, so in ignoring the information those buyers provided, they were giving up potential profits,” said Meghana Bhatt, the first author on the research paper. “The ability to recognize credible information in a competitive environment can be just as important as detecting untrustworthy behavior.”

The findings may also have implications for such psychiatric conditions as paranoia and anxiety disorders, said Montague. “The fact that increased amygdala activation corresponds to an inability to detect trustworthy behavior may provide insight into the social interactions of people with anxiety disorders, who often have increased activity in this area of the brain,” he said.

Provided by Virginia Tech

Source: medicalxpress.com

May 17, 2012
#science #neuroscience #brain #psychology
Alzheimer's Gene Causes Brain's Blood Vessels to Leak Toxins and Die

ScienceDaily (May 16, 2012) — A well-known genetic risk factor for Alzheimer’s disease triggers a cascade of signaling that ultimately results in leaky blood vessels in the brain, allowing toxic substances to pour into brain tissue in large amounts, scientists report May 16 in the journal Nature.


The left photo shows destructive proteins (green) lining blood vessels in living brain tissue of mice with the human ApoE4 gene; after the drug cyclosporine A is added, the harmful proteins are nearly gone (right). (Credit: Image courtesy of University of Rochester Medical Center)

The results come from a team of scientists investigating why a gene called ApoE4 makes people more prone to developing Alzheimer’s. People who carry two copies of the gene have roughly eight to 10 times the risk of developing Alzheimer’s disease compared with people who do not.

A team of scientists from the University of Rochester, the University of Southern California, and other institutions found that ApoE4 works through cyclophilin A, a well-known bad actor in the cardiovascular system that causes inflammation in atherosclerosis and other conditions. It is cyclophilin A, the team found, that opens the gates to the brain assault seen in Alzheimer’s.

"We are beginning to understand much more about how ApoE4 may be contributing to Alzheimer’s disease," said Robert Bell, Ph.D., the post-doctoral associate at Rochester who is first author of the paper. "In the presence of ApoE4, increased cyclophilin A causes a breakdown of the cells lining the blood vessels in Alzheimer’s disease in the same way it does in cardiovascular disease or abdominal aneurysm. This establishes a new vascular target to fight Alzheimer’s disease."

The team found that ApoE4 makes it more likely that cyclophilin A will accumulate in large amounts in cells that help maintain the blood-brain barrier, a network of tightly bound cells that line the insides of blood vessels in the brain and carefully regulate what substances are allowed to enter and exit brain tissue.

ApoE4 creates a cascade of molecular signaling that weakens the barrier, causing blood vessels to become leaky. This makes it more likely that toxic substances will leak from the vessels into the brain, damaging cells like neurons and reducing blood flow dramatically by choking off blood vessels.

Doctors have long known that the changes in the brain seen in Alzheimer’s patients — the death of crucial brain cells called neurons — begin happening years or even decades before symptoms appear. The steps described in Nature occur much earlier in the disease process.

The idea that vascular problems are at the heart of Alzheimer’s disease is one championed for more than two decades by Berislav Zlokovic, M.D., Ph.D., the leader of the team and a neuroscientist formerly with the University of Rochester Medical Center and now at USC. For 20 years, Zlokovic has investigated how blood flow in the brain is affected in people with the disease, and how the blood-brain barrier allows nutrients to pass into the brain, and harmful substances to exit the brain.

At Rochester, Zlokovic struck up a collaboration with Bradford Berk, M.D., Ph.D., a cardiologist and CEO of the Medical Center. For more than two decades Berk has studied cyclophilin A, showing how it promotes destructive forces in blood vessels and how it’s central to the forces that contribute to cardiovascular diseases like atherosclerosis and heart attack.

"As a cardiologist, I’ve been interested in understanding the role of cyclophilin A in patients who suffer from cardiovascular illness," said Berk, a professor at the Aab Cardiovascular Research Institute. "Now our collaboration in Rochester has resulted in the discovery that it also has an important role in Alzheimer’s disease. The finding reinforces the basic research enterprise — you never know when knowledge gained in one area will turn out to be crucial in another."

In studies of mice, the team found that mice carrying the ApoE4 gene had five times as much cyclophilin A compared to other mice in cells known as pericytes, which are crucial to maintaining the integrity of the blood-brain barrier. Blood vessels died, blood did not flow as completely through the brain as it did in other mice, and harmful substances like thrombin, fibrin, and hemosiderin, entered the brain tissue.

When the team blocked the action of cyclophilin A, either by knocking out its gene or by using the drug cyclosporine A to inhibit it, the damage in the mice was reversed. Blood flow resumed to normal, and unhealthy leakage of toxic substances from the blood vessels into the brain was slashed by 80 percent.

The team outlined the chain of events involved. Briefly:

  • When ApoE4 is present, cyclophilin A is much more plentiful;
  • Cyclophilin A causes an increase in the inflammatory molecule NF Kappa B;
  • NF Kappa B boosts levels of molecules known as matrix metalloproteinases (MMPs), which are known to damage blood vessels, reducing blood flow.

Altogether, the activity results in a dramatic boost in the amount of toxic substances in brain tissue. And when the cascade is interrupted at any of several points — when ApoE4 is not present, when cyclophilin A is blocked or shut off, or when NF Kappa B or the MMPs are inhibited — the blood-brain barrier is restored, blood flow returns to normal, and toxic substances do not leak into brain tissue.

For many years, researchers studying Alzheimer’s disease have been focused largely on amyloid beta, a protein structure that accumulates in the brains of patients with Alzheimer’s disease. The latest work points up the importance of other approaches, said Zlokovic, an adjunct professor at Rochester. At USC, Zlokovic is also deputy director of the Zilkha Neurogenetic Institute, director of the Center for Neurodegeneration and Regeneration, and professor and chair of the Department of Physiology and Biophysics.

"Our study has shown major neuronal injury resulting from vascular defects that are not related to amyloid beta," said Zlokovic. "This damage results from a breakdown of the blood-brain barrier and a reduction in blood flow.

"Amyloid beta definitely has an important role in Alzheimer’s disease," added Zlokovic. "But it’s very important to investigate other leads, perhaps where amyloid beta isn’t as centrally involved."

Source: Science Daily

May 17, 2012
#science #neuroscience #brain #psychology #alzheimer
Human Genes Transplanted Into Zebrafish: Helps Identify Genes Related to Autism, Schizophrenia and Obesity

ScienceDaily (May 16, 2012) — What can a fish tell us about human brain development? Researchers at Duke University Medical Center transplanted a set of human genes into a zebrafish and then used it to identify genes responsible for head size at birth.


Here are images of live zebrafish that were studied for genetics and head size to give insight into human head size. The top fish does not have the gene KCTD13 and its head size and brain size are larger; the middle fish is normal; the fish on the bottom expresses too much of the gene and has the smallest head and brain size. (Credit: Christelle Golzio, Duke Center for Human Disease Modeling and Duke Department of Cell Biology)

Researchers at Duke University Medical Center transplanted a set of human genes into a zebrafish and then used it to identify genes responsible for head size at birth.

Head size in human babies is a feature related to autism, a condition that recent figures have shown to be more common than previously reported: 1 in 88 children, according to a March 2012 study. Head size is also a feature of other major neurological disorders, such as schizophrenia.

"In medical research, we need to dissect events in biology so we can understand the precise mechanisms that give rise to neurodevelopmental traits," said senior author Nicholas Katsanis, Ph.D., Jean and George Brumley Jr., MD, Professor of Developmental Biology, and Professor of Pediatrics and Cell Biology. "We need expert scientists to work side by side with clinicians who see such anatomic and other problems in patients, if we are to effectively solve many of our medical problems."

The study was published online in the journal Nature on May 16.

Katsanis knew that a region on chromosome 16 was one of the largest genetic contributors to autism and schizophrenia, but a conversation at a European medical meeting pointed him to information that changes within that same region of the genome also were related to changes in a newborn’s head size.

The problem was difficult to address because the region had large deletions and duplications in DNA, which are the most common mutational mechanisms in humans. “Interpretation is harrowingly hard,” said Katsanis, who is also director of the Duke Center for Human Disease Modeling.

The reason is that a duplication of DNA or missing DNA usually involves several genes. “It is very difficult to go from ‘here is a region with many genes, sometimes over 50’ to ‘these are the genes that are driving this pathology,’” Katsanis said.

"There was a light bulb moment," Katsanis said. "The area of the genome we were exploring gave rise to reciprocal (opposite) defects in terms of brain cell growth, so we realized that overexpressing a gene in question might give one phenotype — a smaller head, while shutting down the same gene might yield the other, a larger head."

The researchers transplanted a common duplication area of human chromosome 16 known to contain 29 genes into zebrafish embryos and then systematically turned up the activity of each transplanted human gene to find which might cause a small head (microcephaly) in the fish. They then suppressed the same gene set and asked whether any of them caused the reciprocal defect: larger heads (macrocephaly).

The researchers knew that deletion of the region that contained these 29 genes occurred in 1.7% of children with autism.

It took the team a few months to dissect such a “copy number variant” — an alteration of the genome that results in an abnormal number of one or more sections of chromosomal DNA.

"Now we can go from a genetic finding that is dosage-sensitive and start asking reasonable questions about this gene as it pertains to neurocognitive traits, which is a big leap," Katsanis said. Neurocognitive refers to the ability to think, concentrate, reason, remember, process information, learn, understand and speak.

Many human conditions have anatomical features that are also related to genetics, he said. “There are major limitations in studying autistic or schizophrenic behavior in zebrafish, but we can measure head size, jaw size, or facial abnormalities.”

The single gene in question, KCTD13, is responsible for driving head size in zebrafish by regulating the creation and destruction of new neurons (brain cells). This discovery let the team focus on the analogous gene in humans. “This gene contributes to autism cases, and probably is associated with schizophrenia and also childhood obesity,” Katsanis said.

Once the gene has been uncovered, researchers can examine the protein it produces. “Once you have the protein, you can start asking valuable functional questions and learning what the gene does in the animal or human,” Katsanis said.

Copy number variants, such as the ones this team found on chromosome 16, are now thought to be one of the most common sources of genetic mutations. Hundreds, if not thousands, of such chromosomal deletions and duplications have been found in patients with a broad range of clinical problems, particularly neurodevelopmental disorders.

"Now we may have an efficient tool for dissecting them, which gives us the ability to improve both diagnosis and understanding of disease mechanisms," Katsanis said.

The current study suggests that KCTD13 is a major contributor to some cases of autism, but also points to the synergistic action of this gene with two other genes in the region, named MVP and MAPK3, Katsanis said.

Source: Science Daily

May 17, 2012
#science #neuroscience #genetics #psychology
Internet Usage Patterns May Signify Depression

ScienceDaily (May 16, 2012) — In a new study analyzing Internet usage among college students, researchers at Missouri University of Science and Technology have found that students who show signs of depression tend to use the Internet differently than those who show no symptoms of depression.

Using actual Internet usage data collected from the university’s network, the researchers identified nine fine-grained patterns of Internet usage that may indicate depression. For example, students showing signs of depression tend to use file-sharing services more than their counterparts, and also use the Internet in a more random manner, frequently switching among several applications.

The researchers’ findings provide new insights on the association between Internet use and depression compared to existing studies, says Dr. Sriram Chellappan, an assistant professor of computer science at Missouri S&T and the lead researcher in the study.

"The study is believed to be the first that uses actual Internet data, collected unobtrusively and anonymously, to associate Internet usage with signs of depression," Chellappan says. Previous research on Internet usage has relied on surveys, which are "a far less accurate way" of assessing how people use the Internet, he says.

"This is because when students themselves reported their volume and type of Internet activity, the amount of Internet usage data is limited because people’s memories fade with time," Chellappan says. "There may be errors and social desirability bias when students report their own Internet usage." Social desirability bias refers to the tendency of survey respondents to answer questions in a manner that will be viewed favorably by others.

Chellappan and his fellow researchers collected a month’s worth of Internet data for 216 Missouri S&T undergraduate students. The data was collected anonymously and unobtrusively, and students involved in the study were assigned pseudonyms to keep their identities hidden from the researchers.

Before the researchers collected the usage data from the campus network, the students were tested to determine whether they showed signs of depression. The researchers then analyzed the usage data of the study participants. They found that students who showed signs of depression used the Internet much differently than the other study participants.

Chellappan and his colleagues found that depressed students tended to use file-sharing services, send email and chat online more than the other students. Depressed students also made heavier use of high “packets per flow” applications, the high-bandwidth applications often associated with online videos and games, than their counterparts did.

Students who showed signs of depression also tended to use the Internet in a more “random” manner — frequently switching among applications, perhaps from chat rooms to games to email. Chellappan thinks that randomness may indicate trouble concentrating, a characteristic associated with depression.

The randomness stood out to Chellappan after his graduate student, Raghavendra Kotikalapudi, examined the “flow duration entropy” of students’ online usage. Flow duration entropy refers to the consistency of Internet use during certain periods of time. The lower the flow duration entropy, the more consistent the Internet use.

"Students showing signs of depression had high flow duration entropy, which means that the duration of Internet flows of these students is highly inconsistent," Chellappan says.
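The article does not give the researchers’ exact formula, but “flow duration entropy” is conventionally the Shannon entropy of a histogram of flow durations. A minimal sketch, assuming a simple fixed-width binning (the bin width and sample durations below are illustrative, not the study’s):

```python
import math
from collections import Counter

def flow_duration_entropy(durations, bin_width=30):
    """Shannon entropy (in bits) of a histogram of flow durations.

    durations: flow lengths in seconds; bin_width: histogram bin size in seconds.
    Higher entropy means less consistent ("more random") usage.
    """
    bins = Counter(int(d // bin_width) for d in durations)
    total = sum(bins.values())
    return sum(-(n / total) * math.log2(n / total) for n in bins.values())

# Consistent user: every flow lasts about the same time, so entropy is 0.
consistent = [61, 62, 63, 64, 65, 66]
# Inconsistent user: flow lengths are all over the map, so entropy is high.
erratic = [5, 300, 45, 1200, 15, 600]
print(flow_duration_entropy(consistent))  # 0.0
print(flow_duration_entropy(erratic))     # ~2.25 bits
```

On this toy data the erratic user’s durations spread across five bins while the consistent user’s fall into one, matching the paper’s description: higher flow duration entropy, less consistent use.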

At the beginning of the study, the 216 participating students were tested to determine whether they exhibited symptoms of depression. Based on the Center for Epidemiologic Studies-Depression (CES-D) scale, about 30 percent of the students in the study met the minimum criteria for depression. Nationally, previous studies show that between 10 percent and 40 percent of all American students suffer from depression.
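The CES-D screen referenced above is conventionally scored by summing 20 items, each rated 0–3, and flagging totals at or above a cutoff, commonly 16. A simplified sketch (real CES-D scoring also reverse-codes four positively worded items, omitted here for brevity):

```python
CESD_CUTOFF = 16  # widely used minimum criterion for depressive symptoms

def cesd_score(responses):
    """Sum 20 CES-D item responses, each rated 0 (rarely) to 3 (most days)."""
    if len(responses) != 20 or not all(0 <= r <= 3 for r in responses):
        raise ValueError("CES-D expects 20 items rated 0-3")
    return sum(responses)  # total ranges 0-60

def meets_minimum_criteria(responses, cutoff=CESD_CUTOFF):
    """True if the total score meets the screening cutoff."""
    return cesd_score(responses) >= cutoff

print(meets_minimum_criteria([1] * 20))  # score 20 >= 16 -> True
```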

To ensure that participants were not identified during the study, each participant was assigned a pseudonym. The campus information technology department then provided the on-campus Internet usage data for each participant from the month of February 2011.

Chellappan’s research has been accepted for publication in a forthcoming issue of IEEE Technology and Society Magazine.

The chief author of the paper is Kotikalapudi, who received his master of science degree in computer science from Missouri S&T in December 2011. His co-authors are Chellappan; Dr. Frances Montgomery, Curators’ Teaching Professor of psychological science; Dr. Donald C. Wunsch, the M.K. Finley Missouri Distinguished Professor of Computer Engineering; and Karl F. Lutzen, information security officer for Missouri S&T’s IT department.

Chellappan is now interested in using these findings to develop software that could be installed on home computers to help individuals determine whether their Internet usage patterns may indicate depression. The software would unobtrusively monitor Internet usage and alert individuals if their usage patterns indicate symptoms of depression.

"The software would be a cost-effective and an in-home tool that could proactively prompt users to seek medical help if their Internet usage patterns indicate possible depression," Chellappan says. "The software could also be installed on campus networks to notify counselors of students whose Internet usage patterns are indicative of depressive behavior."

Chellappan also believes the method used to connect Internet use and depression could help diagnose other mental disorders like anorexia, bulimia, attention deficit hyperactivity disorder or schizophrenia.

"We could also investigate associations between other Internet features like visits to social networking sites, late night Internet use and randomness in time of Internet use with depressive symptoms," he says. "Applications of this study to diagnose and treat mental disorders for other vulnerable groups like the elderly and military veterans are also significant."

Source: Science Daily

May 17, 2012
#science #neuroscience #brain #psychology #depression
Head Impacts in Contact Sports May Reduce Learning in College Athletes

ScienceDaily (May 16, 2012) — A new study suggests that head impacts experienced during contact sports such as football and hockey may worsen some college athletes’ ability to acquire new information. The research is published in the May 16, 2012, online issue of Neurology®, the medical journal of the American Academy of Neurology.

A new study suggests that head impacts experienced during contact sports such as football and hockey may worsen some college athletes’ ability to acquire new information. (Credit: © modestil / Fotolia)

The study involved college athletes at three Division I schools and compared 214 athletes in contact sports to 45 athletes in non-contact sports such as track, crew and Nordic skiing at the beginning and at the end of their seasons. The contact sport athletes wore special helmets that recorded head acceleration and other data at the time of any impact.

The contact sport athletes experienced an average of 469 head impacts during the season. Athletes were not included in the study if they were diagnosed with a concussion during the season.

All of the athletes took tests of thinking and memory skills before and after the season. A total of 45 contact sport athletes and 55 non-contact sport athletes from one of the schools also took an additional set of tests of concentration, working memory and other skills.

"The good news is that overall there were few differences in the test results between the athletes in contact sports and the athletes in non-contact sports," said study author Thomas W. McAllister, MD, of The Geisel School of Medicine at Dartmouth in Lebanon, N.H. "But we did find that a higher percentage of the contact sport athletes had lower scores than would have been predicted after the season on a measure of new learning than the non-contact sport athletes."

A total of 22 percent of the contact sport athletes performed worse than expected on the test of new learning, compared to four percent of the non-contact sport athletes.
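The article does not report a significance test for this contrast, but it can be sketched with the standard two-proportion z-test. The counts below are back-calculated from the reported percentages and group sizes (roughly 47 of 214 contact athletes versus 2 of 45 non-contact athletes) and are an assumption, since the subgroup sizes for this measure are not stated:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z-statistic using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts reconstructed from the reported percentages.
z = two_proportion_z(47, 214, 2, 45)
print(round(z, 2))  # ~2.73, above the 1.96 threshold for p < .05
```

Under these assumed counts the difference would be statistically significant, consistent with the study’s emphasis on the contrast.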

McAllister noted that the study did not find differences in test results between the two groups of athletes at the beginning of the season, suggesting that the cumulative head impacts that contact athletes had incurred over many previous seasons did not result in reduced thinking and memory skills in the overall group.

"These results are somewhat reassuring, given the recent heightened concern about the potential negative effects of these sports," he said. "Nevertheless, the findings do suggest that repetitive head impacts may have a negative effect on some athletes."

McAllister said it’s possible that some people may be genetically more sensitive to head impacts.

Source: Science Daily

May 17, 2012
#science #neuroscience #brain #psychology
Character Traits Determined Genetically? Genes May Hold the Key to a Life of Success, Study Suggests

ScienceDaily (May 16, 2012) — Genes play a greater role in forming character traits — such as self-control, decision making or sociability — than was previously thought, new research suggests.

Identical twin boys. Genes play a greater role in forming character traits — such as self-control, decision making or sociability — than was previously thought, new research suggests. (Credit: © vgm6 / Fotolia)

A study of more than 800 sets of twins found that genetics were more influential in shaping key traits than a person’s home environment and surroundings.

Psychologists at the University of Edinburgh, who carried out the study, say that genetically influenced characteristics could well be the key to how successful a person is in life.

The study of twins in the US, most aged 50 and over, used a series of questions to test how they perceived themselves and others. Questions included “Are you influenced by people with strong opinions?” and “Are you disappointed about your achievements in life?”

The results were then measured according to the Ryff Psychological Well-Being Scale, which assesses and standardizes these characteristics.

By tracking their answers, the research team found that identical twins — whose DNA is [presumed to be] exactly the same — were twice as likely to share traits compared with non-identical twins.

Psychologists say the findings are significant because the stronger the genetic link, the more likely it is that these character traits are carried through a family.
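The “twice as likely to share traits” comparison is the basis of the classic twin-study decomposition known as Falconer’s formula, which splits trait variance into genetic, shared-environment and unique-environment components. A sketch with illustrative correlations, not the study’s actual numbers or necessarily its exact method:

```python
def falconer_estimates(r_mz, r_dz):
    """Classic Falconer decomposition from twin trait correlations.

    r_mz: correlation in identical (monozygotic) twins
    r_dz: correlation in fraternal (dizygotic) twins
    Returns (heritability, shared_environment, unique_environment).
    """
    h2 = 2 * (r_mz - r_dz)  # additive genetic variance (A)
    c2 = r_mz - h2          # shared environment (C) = 2*r_dz - r_mz
    e2 = 1 - r_mz           # unique environment (E)
    return h2, c2, e2

# Illustrative: identical twins correlate 0.5 on a well-being measure,
# fraternal twins 0.25 -- the "twice as likely" pattern in the article.
h2, c2, e2 = falconer_estimates(0.5, 0.25)
print(h2, c2, e2)  # 0.5 0.0 0.5
```

When the identical-twin correlation is exactly double the fraternal one, the shared-environment estimate is zero and genetics accounts for the entire familial resemblance, which is why such a pattern points to genetic rather than home-environment influence.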

Professor Timothy Bates, of the University of Edinburgh’s School of Philosophy, Psychology and Language Sciences, said that the genetic influence was strongest on a person’s sense of self-control.

Researchers found that genes affected a person’s sense of purpose, how well they get on with people and their ability to continue learning and developing.

Professor Bates added: “Ever since the ancient Greeks, people have debated the nature of a good life and the nature of a virtuous life. Why do some people seem to manage their lives, have good relationships and cooperate to achieve their goals while others do not? Previously, the role of family and the environment around the home often dominated people’s ideas about what affected psychological well-being. However, this work highlights a much more powerful influence from genetics.”

The study, which builds on previous research that found that happiness is underpinned by genes, is published online in the Journal of Personality.

Source: Science Daily

May 17, 2012
#neuroscience #psychology #science #genetics
Damaged Connections in Phineas Gage's Brain: Famous 1848 Case of Man Who Survived Accident Has Modern Parallel

ScienceDaily (May 16, 2012) — Poor Phineas Gage. In 1848, the supervisor for the Rutland and Burlington Railroad in Vermont was using a 13-pound, 3-foot-7-inch rod to pack blasting powder into a rock when he triggered an explosion that drove the rod through his left cheek and out of the top of his head. As reported at the time, the rod was later found, “smeared with blood and brains.”

Recreation of Gage accident. (Credit: Copyright John Darrell Van Horn and the UCLA Laboratory of Neuro Imaging, 2012)

Miraculously, Gage lived, becoming the most famous case in the history of neuroscience — not only because he survived a horrific accident that led to the destruction of much of his left frontal lobe but also because of the injury’s reported effects on his personality and behavior, which were said to be profound. Gage went from being an affable 25-year-old to one who was fitful, irreverent and profane. His friends and acquaintances said he was “no longer Gage.”

Over the years, various scientists have studied and argued about the exact location and degree of damage to Gage’s cerebral cortex and the impact it had on his personality. Now, for the first time, researchers at UCLA, using brain-imaging data that was lost to science for a decade, have broadened the examination of Gage to look at the damage to the white matter “pathways” that connect various regions of the brain.

Reporting in the May 16 issue of the journal PLoS ONE, Jack Van Horn, a UCLA assistant professor of neurology, and colleagues note that while approximately 4 percent of the cerebral cortex was intersected by the rod’s passage, more than 10 percent of Gage’s total white matter was damaged. The passage of the tamping iron caused widespread damage to the white matter connections throughout Gage’s brain, which likely was a major contributor to the behavioral changes he experienced.

Because white matter and its myelin sheath — the fatty coating around the nerve fibers that form the basic wiring of the brain — connect the billions of neurons that allow us to reason and remember, the research not only adds to the lore of Phineas Gage but may eventually lead to a better understanding of multiple brain disorders that are caused in part by similar damage to these connections.

"What we found was a significant loss of white matter connecting the left frontal regions and the rest of the brain," said Van Horn, who is a member of UCLA’s Laboratory of Neuro Imaging (LONI). "We suggest that the disruption of the brain’s ‘network’ considerably compromised it. This may have had an even greater impact on Mr. Gage than the damage to the cortex alone in terms of his purported personality change."

LONI is part of an ambitious joint effort with Massachusetts General Hospital and the National Institutes of Health to document the trillions of microscopic links between every one of the brain’s 100 billion neurons — the so-called “connectome.” And because mapping the brain’s physical wiring eventually will lead to answers about what causes mental conditions that may be linked to the breakdown of these connections, it was appropriate, as well as historically interesting, to take a new look at the damage to Gage’s brain.

Since Gage’s 189-year-old skull, which is on display in the Warren Anatomical Museum at Harvard Medical School, is now fragile and unlikely to again be subjected to medical imaging, the researchers had to track down the last known imaging data, from 2001, which had been lost due to various circumstances at Brigham and Women’s Hospital, a teaching affiliate of Harvard, for some 10 years.

The authors were able to recover the computed tomographic data files and managed to reconstruct the scans, which revealed the highest-quality resolution available for modeling Gage’s skull. Next, they utilized advanced computational methods to model and determine the exact trajectory of the tamping iron that shot through his skull. Finally, because the original brain tissue was, of course, long gone, the researchers used modern-day brain images of males who matched Gage’s age and (right) handedness, then used software to position a composite of these 110 images into Gage’s virtual skull, the assumption being that Gage’s anatomy would have been similar.

Van Horn found that nearly 11 percent of Gage’s white matter was damaged, along with 4 percent of the cortex.

"Our work illustrates that while cortical damage was restricted to the left frontal lobe, the passage of the tamping iron resulted in the widespread interruption of white matter connectivity throughout his brain, so it likely was a major contributor to the behavioral changes he experienced," Van Horn said. "Connections were lost between the left frontal, left temporal and right frontal cortices and the left limbic structures of the brain, which likely had considerable impact on his executive as well as his emotional functions."

And while Gage’s personality changed, he eventually was able to travel and find employment as a stagecoach driver for several years in South America. Ultimately, he died in San Francisco, 12 years after the accident.

Van Horn noted a modern parallel.

"The extensive loss of white matter connectivity, affecting both hemispheres, plus the direct damage by the rod, which was limited to the left cerebral hemisphere, is not unlike modern patients who have suffered a traumatic brain injury," he said. "And it is analogous to certain forms of degenerative diseases, such as Alzheimer’s disease or frontal temporal dementia, in which neural pathways in the frontal lobes are degraded, which is known to result in profound behavioral changes."

Van Horn noted that the quantification of the changes to Gage’s brain’s pathways might well provide important insights for clinical assessment and outcome-monitoring in modern-day brain trauma patients.

Source: Science Daily

May 17, 2012
#science #neuroscience #brain #psychology
Positive feedback in the developing brain

May 16, 2012

(Medical Xpress) — When an animal is born, its early experiences help map out the still-forming connections in its brain. As neurons in sensory areas of the brain fire in response to sights, smells, and sounds, synapses begin to form, laying the neuronal groundwork for activity later in life. Not all parts of the brain receive input directly from the external world, however, and researchers have wondered how these regions build their wiring early in development.

The output of this indirect-pathway neuron in the striatum of a mouse brain has been genetically silenced. The neuron has been filled through the attached electrode with a red fluorophore to measure its spine density and the number of active synapses. In the background, other indirect pathway neurons are seen in green and red. Credit: Bernardo Sabatini

New research from Howard Hughes Medical Institute investigator Bernardo Sabatini and colleagues on the basal ganglia, a region of the brain that controls motor planning, indicates that development here follows a different strategy. The new findings suggest that wiring of the basal ganglia during early development is driven not only by experience, but also by a self-reinforcing loop of neuronal signaling. As the loop strengthens, more synapses form.

The basal ganglia help an animal select its actions based on sensory and social context, as well as past experience. The new clues about how the basal ganglia get wired shortly after birth, described in the May 13, 2012, issue of the journal Nature, may help scientists understand what happens when the area goes awry, as in Parkinson’s disease, in which degradation of neurons in the basal ganglia interferes with patients’ ability to initiate appropriate movements, or in drug addiction, in which overstimulation of the basal ganglia spurs inappropriate actions. Sabatini says his team’s findings also suggest that the process can be easily perturbed during development, and may contribute to human disorders such as cerebral palsy and attention deficit hyperactivity disorder.

Although the basal ganglia do not receive direct messages from the external world, this region of the brain is by no means anatomically isolated: it receives signals from all over the cortex, and its output eventually returns to the cortex. Sabatini, who is at Harvard Medical School, explains that to select a motor action, the brain likely signals through that whole loop. “The question is, how do you lay down the circuits for those patterns?”

The basal ganglia are complex, containing many clusters of cells, some of which send excitatory signals and others inhibitory. Sabatini’s group focused on the basal ganglia’s main input station, the striatum. The striatum uses the information it receives to help direct movement in two ways: a ‘direct’ pathway stimulates motor actions and an ‘indirect’ pathway inhibits them. To learn how striatal activity affects circuit development, Sabatini’s team studied mutant mice whose indirect or direct pathways were turned off (because they were unable to release the inhibitory chemical messenger, GABA).

The group expected that silencing these neurons would prevent them from forming connections with the neurons that should have been receiving their signals. To their surprise, the silenced neurons survived and wired themselves to their targets normally. Unexpectedly, however, silencing the striatum’s direct pathway seemed to prevent formation of the connections sending input to the striatum. Silencing the indirect pathway upped the number of inputs. “We went into this study thinking completely differently,” says Sabatini. “What we found is that silencing these neurons doesn’t really change their output patterns — of course they are silenced, but they still find their targets and survive — but instead drastically influences their inputs.”

To see whether individual cells help set up the basal ganglia circuit, Sabatini’s group turned off a select few striatal neurons, rather than whole pathways, in the mice. They found that silencing these neurons did not affect excitatory connections to the area, suggesting that circuit-level activity patterns set up the basal ganglia’s wiring, rather than individual genes or molecules within cells. “It’s hard to believe that there are molecular cues that specify these structures, because it would be way too complicated,” Sabatini says.

When the group dampened activity in neurons that project from the brain’s cortex to the striatum during development, then examined the brain when the mouse had reached early adulthood (25 days after birth), they saw fewer neuronal connections in the striatum than in mice that had developed normally, suggesting that early perturbations in development can have lasting effects. “That experiment is what told us that it’s the ongoing activity of cortical neurons that is driving this process in the striatum,” Sabatini says. The axons — the slender processes of the neuron that carry electrical impulses — stimulate striatal cells by releasing the excitatory neurotransmitter glutamate, telling them to make more synapses and stabilize them, he adds.

Sabatini believes that the basal ganglia tests random connection patterns after an animal is born and reinforces the correct ones. This type of plasticity of the basal ganglia probably lasts into adulthood, because animals are constantly learning to take new actions. Using genetically engineered mice that allow researchers to control exactly which neurons to inactivate and when, Sabatini’s group is now studying how perturbations affect the wiring later in life.

Sabatini expects that these results will get us a step closer to understanding human disease. “Maybe we will show that there’s hope for therapy,” he adds. “If it is plastic, maybe we can recover.”

Provided by Howard Hughes Medical Institute

Source: medicalxpress.com

May 16, 2012
#science #neuroscience #brain #psychology
Let's get moving: Unravelling how locomotion starts

May 16, 2012

(Medical Xpress) — Scientists at the University of Bristol have shed new light on one of the great unanswered questions of neuroscience: how the brain initiates rhythmic movements like walking, running and swimming.

The Xenopus frog tadpole is a small, simple vertebrate

While experiments in the 1970s using electrical brain stimulation identified areas of the brain responsible for starting locomotion, the precise neuron-by-neuron pathway has not been described in any vertebrate – until now. 

To find this pathway, Dr. Edgar Buhl and colleagues in Bristol’s School of Biological Sciences studied a small, simple vertebrate: the Xenopus frog tadpole.

They found that the pathway to initiate swimming consists of just four types of neurons.  By touching skin on the head of the tadpole and applying cellular neurophysiology and anatomy techniques, the scientists identified nerve cells that detect the touch on the skin, two types of brain nerve cells which pass on the signal, and the motor nerve cells that control the swimming muscles. 

Dr. Buhl said: “These findings address the longstanding question of how locomotion is initiated following sensory stimulation and, for the first time in any vertebrate, define in detail a direct pathway responsible.  They could thus be of great evolutionary interest and could also open the path to understanding initiation of locomotion in other vertebrates.”

When mechanisms in the brain that initiate locomotion break down – for example, in people with Parkinson’s disease – starting to walk becomes a real problem.  Therefore, understanding the initiation of swimming in tadpoles could be a first step towards understanding the initiation of locomotion in more complex vertebrates, including people, and may eventually have implications for treating movement disorders such as Parkinson’s.

The research is published today in the Journal of Physiology.

Provided by University of Bristol

Source: medicalxpress.com

May 16, 2012
#science #neuroscience #brain #psychology
Surgeons Restore Some Hand Function to Quadriplegic Patient

May 15th, 2012

Technique could help those with C6, C7 spinal cord injuries.

Surgeons at Washington University School of Medicine in St. Louis have restored some hand function in a quadriplegic patient with a spinal cord injury at the C7 vertebra, the lowest bone in the neck. Instead of operating on the spine itself, the surgeons rerouted working nerves in the upper arms. These nerves still “talk” to the brain because they attach to the spine above the injury.

Following the surgery, performed at Barnes-Jewish Hospital, and one year of intensive physical therapy, the patient regained some hand function, specifically the ability to bend the thumb and index finger. He can now feed himself bite-size pieces of food and write with assistance.

The case study, published online May 15 in the Journal of Neurosurgery, is, to the authors’ knowledge, the first reported case of using nerve transfer to restore the ability to flex the thumb and index finger after a spinal cord injury.

“This procedure is unusual for treating quadriplegia because we do not attempt to go back into the spinal cord where the injury is,” says surgeon Ida K. Fox, MD, assistant professor of plastic and reconstructive surgery at Washington University, who treats patients at Barnes-Jewish Hospital. “Instead, we go out to where we know things work — in this case the elbow — so that we can borrow nerves there and reroute them to give hand function.”

To detour around the block in this patient’s C7 spinal cord injury and return hand function, Mackinnon operated in the upper arms. There, the working nerves that connect above the injury (green) and the non-working nerves that connect below the injury (red) run parallel to each other, making it possible to tap into a functional nerve and direct those signals to a non-functional neighbor (yellow arrow). (Credit: adapted from an image by Eric Young in the accompanying press release)

Although patients with spinal cord injuries at the C6 and C7 vertebrae have no hand function, they do have shoulder, elbow and some wrist function because the associated nerves attach to the spinal cord above the injury and connect to the brain. Since the surgeon must tap into these working nerves, the technique will not benefit patients who have lost all arm function due to higher injuries — in vertebrae C1 through C5.

The surgery was developed and performed by the study’s senior author Susan E. Mackinnon, MD, chief of the Division of Plastic and Reconstructive Surgery at Washington University School of Medicine. Specializing in injuries to peripheral nerves, she has pioneered similar surgeries to return function to injured arms and legs.

Mackinnon originally developed this procedure for patients with arm injuries specifically damaging the nerves that provide the ability to flex the thumb and index finger. This is the first time she has applied this peripheral nerve technique to return limb function after a spinal cord injury.

[Video: Surgeons restore some hand function to quadriplegic patient]

“Many times these patients say they would like to be able to do very simple things,” Fox says. “They say they would like to be able to feed themselves or write without assistance. If we can restore the ability to pinch, between thumb and index finger, it can return some very basic independence.”

Mackinnon cautions that the hand function restored to the patient was not instantaneous and required intensive physical therapy. It takes time to retrain the brain to understand that nerves that used to bend the elbow now provide pinch, she says.

Though this study reports only one case, Mackinnon and her colleagues do not anticipate a limited window of time during which a patient with a similar spinal cord injury must be treated with this nerve transfer technique. This patient underwent the surgery almost two years after his injury. As long as the nerve remains connected to the support and nourishment of the spinal cord, even though it no longer “talks” to the brain, the nerve and its associated muscle remain healthy, even years after the injury.

“The spinal cord is the control center for the nerves, which run like spaghetti all the way out to the tips of the fingers and the tips of the toes,” says Mackinnon, the Sydney M. Shoenberg Jr. and Robert H. Shoenberg Professor and director of the School of Medicine’s Center for Nerve Injury and Paralysis. “Even nerves below the injury remain healthy because they are still connected to the spinal cord. The problem is that these nerves no longer ‘talk’ to the brain because the spinal cord injury blocks the signals.”

To detour around the block in this patient’s C7 spinal cord injury and return hand function below the level of the injury, Mackinnon operated in the upper arms. There, the working nerves that connect above the injury and the non-working nerves that connect below the injury run parallel to each other, making it possible to tap into a functional nerve and direct those signals to a non-functional neighbor.

In this case, Mackinnon took a non-working nerve that controls the ability to pinch and plugged it into a working nerve that drives one of two muscles that flex the elbow. After the surgery, the biceps still flexes the elbow, but a second muscle, the brachialis, which used to help flex the elbow, now bends the thumb and index finger.

“This is not a particularly expensive or overly complex surgery,” Mackinnon says. “It’s not a hand or a face transplant, for example. It’s something we would like other surgeons around the country to do.”

By Julia Evangelou Strait

Source: Neuroscience News

May 16, 2012
#science #neuroscience
This Is Your Brain On Sugar: Study in Rats Shows High-Fructose Diet Sabotages Learning, Memory

ScienceDaily (May 15, 2012) — Attention, college students cramming between midterms and finals: Binging on soda and sweets for as little as six weeks may make you stupid.

New research suggests that binging on soda and sweets for as little as six weeks may make you stupid. (Credit: © RTimages / Fotolia)

A new UCLA rat study is the first to show how a diet steadily high in fructose slows the brain, hampering memory and learning — and how omega-3 fatty acids can counteract the disruption. The peer-reviewed Journal of Physiology publishes the findings in its May 15 edition.

"Our findings illustrate that what you eat affects how you think," said Fernando Gomez-Pinilla, a professor of neurosurgery at the David Geffen School of Medicine at UCLA and a professor of integrative biology and physiology in the UCLA College of Letters and Science. "Eating a high-fructose diet over the long term alters your brain’s ability to learn and remember information. But adding omega-3 fatty acids to your meals can help minimize the damage."

While earlier research has revealed how fructose harms the body through its role in diabetes, obesity and fatty liver, this study is the first to uncover how the sweetener influences the brain.

The UCLA team zeroed in on high-fructose corn syrup, an inexpensive liquid six times sweeter than cane sugar that is commonly added to processed foods, including soft drinks, condiments, applesauce and baby food. The average American consumes more than 40 pounds of high-fructose corn syrup per year, according to the U.S. Department of Agriculture. “We’re not talking about naturally occurring fructose in fruits, which also contain important antioxidants,” explained Gomez-Pinilla, who is also a member of UCLA’s Brain Research Institute and Brain Injury Research Center. “We’re concerned about high-fructose corn syrup that is added to manufactured food products as a sweetener and preservative.”

Gomez-Pinilla and study co-author Rahul Agrawal, a UCLA visiting postdoctoral fellow from India, studied two groups of rats that each consumed a fructose solution as drinking water for six weeks. The second group also received omega-3 fatty acids in the form of flaxseed oil and docosahexaenoic acid (DHA), which protects against damage to the synapses — the chemical connections between brain cells that enable memory and learning.

"DHA is essential for synaptic function — brain cells’ ability to transmit signals to one another," Gomez-Pinilla said. "This is the mechanism that makes learning and memory possible. Our bodies can’t produce enough DHA, so it must be supplemented through our diet."

The animals were fed standard rat chow and trained on a maze twice daily for five days before starting the experimental diet. The UCLA team tested how well the rats were able to navigate the maze, which contained numerous holes but only one exit. The scientists placed visual landmarks in the maze to help the rats learn and remember the way.

Six weeks later, the researchers tested the rats’ ability to recall the route and escape the maze. What they saw surprised them.

"The second group of rats navigated the maze much faster than the rats that did not receive omega-3 fatty acids," Gomez-Pinilla said. "The DHA-deprived animals were slower, and their brains showed a decline in synaptic activity. Their brain cells had trouble signaling each other, disrupting the rats’ ability to think clearly and recall the route they’d learned six weeks earlier."

The DHA-deprived rats also developed signs of resistance to insulin, a hormone that controls blood sugar and regulates synaptic function in the brain. A closer look at the rats’ brain tissue suggested that insulin had lost much of its power to influence the brain cells.

"Because insulin can penetrate the blood-brain barrier, the hormone may signal neurons to trigger reactions that disrupt learning and cause memory loss," Gomez-Pinilla said.

He suspects that fructose is the culprit behind the DHA-deficient rats’ brain dysfunction. Eating too much fructose could block insulin’s ability to regulate how cells use and store sugar for the energy required for processing thoughts and emotions.

"Insulin is important in the body for controlling blood sugar, but it may play a different role in the brain, where insulin appears to disturb memory and learning," he said. "Our study shows that a high-fructose diet harms the brain as well as the body. This is something new."

Gomez-Pinilla, a native of Chile and an exercise enthusiast who practices what he preaches, advises people to keep fructose intake to a minimum and swap sugary desserts for fresh berries and Greek yogurt, which he keeps within arm’s reach in a small refrigerator in his office. An occasional bar of dark chocolate that hasn’t been processed with a lot of extra sweetener is fine too, he said.

Still planning to throw caution to the wind and indulge in a hot-fudge sundae? Then also eat foods rich in omega-3 fatty acids, like salmon, walnuts and flaxseeds, or take a daily DHA capsule. Gomez-Pinilla recommends one gram of DHA per day.

"Our findings suggest that consuming DHA regularly protects the brain against fructose’s harmful effects," said Gomez-Pinilla. "It’s like saving money in the bank. You want to build a reserve for your brain to tap when it requires extra fuel to fight off future diseases."

Source: Science Daily

May 16, 2012 · 28 notes
#science #neuroscience #brain #memory #psychology
Chronic Child Abuse Strong Indicator of Negative Adult Experiences

ScienceDaily (May 15, 2012) — Child abuse and neglect are strong predictors of major health and emotional problems, but little is known about how the chronicity of the maltreatment may increase future harm apart from other risk factors in a child’s life.


This chart illustrates the individual childhood and adult outcomes according to the number of reports that occurred before the event of interest. Because it was possible for some children to enter the study period with a pre-existing condition, these are indicated as gray or black bars with the legend indicating the outcome occurred “before the study.” Chronicity is associated with increasing risk for all but child maltreatment perpetration, violent delinquency, and head or brain injury. In these cases, there is a slight decline in prevalence for the highest category compared with middle categories, but in all cases having reports was associated with higher rates of outcomes. (Credit: Image courtesy of Washington University in St. Louis)

In a new study published in the current issue of the journal Pediatrics, Melissa Jonson-Reid, PhD, child welfare expert and a professor at the Brown School at Washington University in St. Louis, looked at how chronic maltreatment impacted the future health and behavior of children and adults.

The study tracked children by number of child maltreatment reports (zero to four or more) and followed the children into early adulthood, by which time some of the children had become parents.

The study sought to determine how well the number of child maltreatment reports predicted poor outcomes in adolescence, such as delinquency, substance abuse in the teen years or getting a sexually transmitted disease.

"For every measure studied, a more chronic history of child maltreatment reports was powerfully predictive of worse outcomes," Jonson-Reid says.

"For most outcomes, having a single maltreatment report put children at a 20 percent to 50 percent higher risk than non-maltreated comparison children.

In addition, a series of adult outcomes was tracked to see whether the chronicity of maltreatment still mattered after controlling for the poor outcomes in adolescence. Adult outcomes included substance abuse and going on to maltreat one’s own children.

"In models of adult outcomes, children with four or more reports were about least twice as likely to later abuse their own children and have contact with the mental health system, even when controlling for the negative outcomes during adolescence." Jonson-Reid says that there appears to be good reason to put resources into preventing ongoing maltreatment.

"Successfully interrupting chronic child maltreatment may well reduce risk of a wide range of other costly child and adolescent health and behavioral problems," she says.

Jonson-Reid cites a recently published Centers for Disease Control and Prevention study estimating the lifetime costs for a single year’s worth of children reported for maltreatment at $124 billion.

"What our study illustrates is that these costs are even more likely to accrue for children who continue to be re-reported," she says.

The study also found that maltreatment predicts a range of negative adolescent outcomes, and those adolescent outcomes then predict poor adult outcomes.

"If the poor outcomes in adolescence can be dealt with effectively, then later adult outcomes may also be forestalled," Jonson-Reid says.

"Our findings could therefore be interpreted as supporting many current evidence-based interventions that seek to improve behavioral and social functioning among children and adolescents who have experienced trauma like abuse or neglect."

Source: Science Daily

May 15, 2012 · 13 notes
#science #neuroscience #psychology
Mystery Gene Reveals New Mechanism for Anxiety Disorders

ScienceDaily (May 15, 2012) — A novel mechanism for anxiety behaviors, including a previously unrecognized inhibitory brain signal, may inspire new strategies for treating psychiatric disorders, University of Chicago researchers report.

By testing the controversial role of a gene called Glo1 in anxiety, scientists uncovered a new inhibitory factor in the brain: the metabolic by-product methylglyoxal. The system offers a tantalizing new target for drugs designed to treat conditions such as anxiety disorder, epilepsy, and sleep disorders.

The study, published in the Journal of Clinical Investigation, found that animals with multiple copies of the Glo1 gene were more likely to exhibit anxiety-like behavior in laboratory tests. Further experiments showed that Glo1 increased anxiety-like behavior by lowering levels of methylglyoxal (MG). Conversely, inhibiting Glo1 or raising MG levels reduced anxiety behaviors.

"Animals transgenic for Glo1 had different levels of anxiety-like behavior, and more copies made them more anxious," said Abraham Palmer, PhD, assistant professor of human genetics at the University of Chicago Medicine and senior author of the study. "We showed that Glo1 was causally related to anxiety-like behavior, rather than merely correlated."

In 2005, a comparison of different mouse strains found a link between anxiety-like behaviors and Glo1, the gene encoding the metabolic enzyme glyoxalase 1. However, subsequent studies questioned the link, and the lack of an obvious connection between glyoxalase 1 and brain function or behavior made some scientists skeptical.


May 15, 2012 · 27 notes
#science #neuroscience #brain #psychology #anxiety