Neuroscience


How to minimize stroke damage

May 14, 2012

Following a stroke, factors as varied as blood sugar, body temperature and position in bed can affect patient outcomes, Loyola University Medical Center researchers report.

In a review article in the journal MedLink Neurology, first author Murray Flaster, MD, PhD and colleagues summarize the latest research on caring for ischemic stroke patients. (Most strokes are ischemic, meaning they are caused by blood clots.)

“The period immediately following an acute ischemic stroke is a time of significant risk,” the Loyola neurologists write. “Meticulous attention to the care of the stroke patient during this time can prevent further neurologic injury and minimize common complications, optimizing the chance of functional recovery.”

Stroke care has two main objectives – minimizing injury to brain tissue and preventing and treating the many neurologic and medical complications that can occur just after a stroke.

The authors discuss the many complex factors that affect outcomes. For example, there is considerable evidence of a link between hyperglycemia (high blood sugar) and poor outcomes after stroke. The authors recommend strict blood sugar control, using frequent finger-stick glucose checks and aggressive insulin treatment.

For each 1 degree C increase in the body temperature of stroke patients, the risk of death or severe disability more than doubles. Therapeutic cooling has been shown to help cardiac arrest patients, and clinical trials are underway to determine whether such cooling could also help stroke patients. Until those trials are completed, the goal should be to maintain normal body temperature (between 95.9 and 99.5 degrees F, or 35.5 and 37.5 degrees C).

Position in bed also is important, because sitting upright decreases blood flow in the brain. A common practice is to keep the patient lying flat for 24 hours. If a patient has orthopnea (difficulty breathing while lying flat), the head of the bed should be kept at the lowest elevation the patient can tolerate.

The authors discuss many other issues in stroke care, including blood pressure management; blood volume; statin therapy; management of complications such as pneumonia and sepsis; heart attack and other cardiac problems; blood clots; infection; malnutrition and aspiration; brain swelling; seizures; recurrent stroke; and brain hemorrhages.

Studies have shown that hospital units that specialize in stroke care decrease mortality, increase the likelihood of being discharged to home and improve functional status and quality of life.

All patients should receive supportive care — including those who suffer major strokes and the elderly. “Even in these populations, the majority of patients will survive their stroke,” the authors write. “The degree of functional recovery, however, may be dramatically impacted by the intensity and appropriateness of supportive care.”

Provided by Loyola University Health System

Source: medicalxpress.com

May 15, 2012
#science #neuroscience #brain #stroke #psychology
Brain oscillations reveal that our senses do not experience the world continuously

May 14, 2012

(Medical Xpress) — It has long been suspected that humans do not experience the world continuously, but rather in rapid snapshots.

Now, researchers at the University of Glasgow have demonstrated this is indeed the case. Just as the body goes through a 24-hour sleep-wake cycle controlled by a circadian clock, brain function undergoes such cyclic activity – albeit at a much faster rate.

Professor Gregor Thut of the Institute of Neuroscience and Psychology said: “Rhythms are intrinsic to biological systems. The circadian rhythm, with its very slow periodicity of sleep and wake cycles every 24 hours has an obvious, periodic effect on bodily functions.

“Brain oscillations – the recurrent neural activity that we see in the brain – also show periodicity but cycle at much faster speeds. What we wanted to know was whether brain function was affected in a cyclic manner by these rapid oscillations.”

The researchers studied a prominent brain rhythm associated with visual cortex function that cycles 10 times per second (10 Hz).

They used a ‘simple trick’ to affect the oscillations of this rhythm which involved presenting a brief sound to ‘reset’ the oscillation.

Testing subsequent visual perception using transcranial magnetic stimulation of the visual cortex revealed a cyclic pattern at the very rapid rate of the brain oscillations, in time with the underlying brainwaves.

Prof Thut said: “Rhythmicity therefore is indeed omnipresent not only in brain activity but also brain function. For perception, this means that despite experiencing the world as a continuum, we do not sample our world continuously but in discrete snapshots determined by the cycles of brain rhythms.”

The research, ‘Sounds reset rhythms of visual cortex and corresponding human visual perception’, is published in the journal Current Biology.

Provided by University of Glasgow

Source: medicalxpress.com

May 15, 2012
#science #neuroscience #brain #psychology
Let there be light: It's good for our brains

May 14, 2012 By Sandy Evangelista

(Medical Xpress) — Swiss scientists have shown that light intensity influences our cognitive performance and how alert we feel, and that these positive effects last into the early evening.


Credit: 2012 EPFL

Tests conducted in EPFL’s Solar Energy and Building Physics Laboratory (LESO) have confirmed the hypothesis that light influences our subjective feeling of sleepiness. The research team, led by Mirjam Münch, also showed that the effects of light exposure last until the early evening, and that light intensity has an impact on cognitive mechanisms. The results of this research were recently published in the journal Behavioral Neuroscience.

Light synchronizes our biological clocks. It is detected in the eye by photoreceptors containing the photopigment melanopsin (a pigment that changes when exposed to light). These cells, which differ from rods and cones, are considered a third class of photoreceptors in the retina and were discovered only about ten years ago. They are not there to form an image, but to perceive and absorb photons in the visible light spectrum, and they are especially sensitive to blue light.

Exploring office lighting

Münch and her team wanted to know how our circadian rhythm could be influenced by our perception of light during the daytime. They created realistic office lighting conditions and recruited 29 young participants. “For this study, we took into account the intensity of natural and artificial light without specifically evaluating their spectra,” Münch says.

From daytime to dusk

To synchronize their internal biological clocks, the volunteers had to maintain a regular sleep schedule during the seven days leading up to the test. They wore bracelets equipped with light sensors and accelerometers, so that the scientists could monitor their movements.

The study itself took place over two eight-hour sessions, the first six hours of each spent in an experiment room. In the first session the room was well lit (1,000 to 2,000 lux, roughly equivalent to natural daylight in a room); in the second, the light intensity was about 170 lux, which is what the eye perceives in a windowless room lit with artificial light. Light intensity was measured at eye level. Every 30 minutes, the subjects were asked to assess how alert or sleepy they felt.

Finally, at the end of each session, the participants underwent two hours of supplemental memory tests in a darkened room (less than 6 lux). During these last two hours, the researchers took saliva samples in order to measure cortisol and melatonin concentrations, two hormones produced by the human body in a 24-hour cycle.

Boosted by the light

The volunteers who were subjected to higher light intensity during the afternoon were more alert all the way into the early evening. When they were subjected to light intensity ten times weaker, however, they showed signs of sleepiness and obtained lower scores on the memory tests.

These results were observed even in the absence of changes in cortisol and melatonin concentrations in their saliva. “With this study, we have discovered that light intensity has a direct effect on the subjective feeling of sleepiness as well as on objective cognitive performance, and that the benefits of more intense light during the daytime last long past the time of exposure,” concludes Münch.

Provided by Ecole Polytechnique Federale de Lausanne

Source: medicalxpress.com

May 14, 2012
#science #neuroscience #brain #psychology
Powerful Function of Single Protein That Controls Neurotransmission Discovered

ScienceDaily (May 13, 2012) — Scientists at Weill Cornell Medical College have discovered that a single protein — alpha 2 delta — exerts a spigot-like function, controlling the volume of neurotransmitters and other chemicals that flow across the synapses between brain neurons. The study, published online in Nature, shows how brain cells talk to each other through these signals, relaying thoughts, feelings and actions, and how this powerful molecule plays a crucial role in regulating effective communication.

In the study, the investigators also suggest how the widely used pain drug Lyrica might work. The alpha 2 delta protein is the target of this drug, and the new work suggests how other drugs could be developed to twist particular neurotransmitter spigots on and off to treat neurological disorders. The research findings surprised the team, which includes scientists from University College London.

"We are amazed that any single protein has such power," says the study’s lead investigator Dr. Timothy A. Ryan, professor of Biochemistry and associate professor of Biochemistry in Anesthesiology at Weill Cornell Medical College. "It is indeed rare to identify a biological molecule’s function that is so potent, that seems to be controlling the effectiveness of neurotransmission."

The researchers found that alpha 2 delta determines how many calcium channels will be present at the synaptic junction between neurons. The transmission of chemical signals is triggered at the synapse by the entry of calcium through these channels, so the volume and speed of neurotransmission depend on the availability of these channels.

Researchers discovered that taking away alpha 2 delta from brain cells prevented calcium channels from getting to the synapse. “But if you add more alpha 2 delta, you can triple the number of channels at synapses,” Dr. Ryan says. “This change in abundance was tightly linked to how well synapses carry out their function, which is to release neurotransmitters.”

Before this study, it was known that Lyrica, which is used for neuropathic pain, seizures and fibromyalgia, binds to alpha 2 delta, but little was understood about how this protein works to control synapses.


May 14, 2012
#science #neuroscience #brain #psychology
Vitamin K2: New Hope for Parkinson's Patients?

ScienceDaily (May 11, 2012) — Neuroscientist Patrik Verstreken, associated with VIB and KU Leuven, has used vitamin K2 to undo the effect of one of the genetic defects that lead to Parkinson’s. His discovery gives hope to Parkinson’s patients.


Male fruit fly (Drosophila melanogaster). Scientists have succeeded in using vitamin K2 to undo the effect of one of the genetic defects that lead to Parkinson’s. The research was done in fruit flies. (Credit: © Studiotouch / Fotolia)

This research was done in collaboration with colleagues from Northern Illinois University (US) and was recently published in the journal Science.

"It appears from our research that administering vitamin K2 could possibly help patients with Parkinson’s. However, more work needs to be done to understand this better," says Patrik Verstreken.

Malfunctioning power plants at the basis of Parkinson’s

If we think of cells as small factories, then mitochondria are the power plants responsible for supplying the energy for their operation. They generate this energy by transporting electrons. In Parkinson’s patients, the activity of mitochondria and the transport of electrons are disrupted, so the mitochondria no longer produce sufficient energy for the cell. This has major consequences, as cells in certain parts of the brain start dying off, disrupting communication between neurons. The result is the typical symptoms of Parkinson’s: lack of movement (akinesia), tremors and muscle stiffness.

The exact cause of this neurodegenerative disease is not known. In recent years, however, scientists have been able to describe several genetic defects (mutations) found in Parkinson’s patients, including the so-called PINK1 and Parkin mutations, which both lead to reduced mitochondrial activity. By studying these mutations, scientists hope to unravel the mechanisms underlying the disease process.

Paralyzed fruit flies

Fruit flies (Drosophila) are frequently used in lab experiments because of their short life spans and breeding cycles, among other things. Within two weeks of emergence, every female can produce hundreds of offspring. By genetically modifying fruit flies, scientists can study the function of particular genes and proteins. Patrik Verstreken and his team used fruit flies with a genetic defect in PINK1 or Parkin similar to the defects associated with Parkinson’s. They found that flies with a PINK1 or Parkin mutation lost their ability to fly.

Upon closer examination, they discovered that the mitochondria in these flies were defective, just as in Parkinson’s patients, and as a result generated less intracellular energy — energy the insects needed to fly. When the flies were given vitamin K2, energy production in their mitochondria was restored and the insects’ ability to fly improved. The researchers were also able to determine that energy production was restored because vitamin K2 had improved electron transport in the mitochondria.

Conclusion

Vitamin K2 plays a role in the energy production of defective mitochondria. Because defective mitochondria are also found in Parkinson’s patients with a PINK1 or Parkin mutation, vitamin K2 potentially offers hope for a new treatment for Parkinson’s.

Source: Science Daily

May 14, 2012
#science #neuroscience #brain #psychology #parkinson
Gene therapy for hearing loss: Potential and limitations

May 11, 2012

Regenerating sensory hair cells, which produce electrical signals in response to vibrations within the inner ear, could form the basis for treating age- or trauma-related hearing loss. One way to do this could be with gene therapy that drives new sensory hair cells to grow.

Researchers at Emory University School of Medicine have shown that introducing a gene called Atoh1 into the cochleae of young mice can induce the formation of extra sensory hair cells.

Their results show the potential of a gene therapy approach, but also demonstrate its current limitations. The extra hair cells produce electrical signals like normal hair cells and connect with neurons. However, after the mice are two weeks old, which is before puberty, inducing Atoh1 has little effect. This suggests that an analogous treatment in adult humans would also not be effective by itself.

The findings were published May 9 in the Journal of Neuroscience.

"We’ve shown that hair cell regeneration is possible in principle," says Ping Chen, PhD, associate professor of cell biology at Emory University School of Medicine. “In this paper, we have identified which cells are capable of becoming hair cells under the influence of Atoh1, and we show that there are strong age-dependent limitations on the effects of Atoh1 by itself.”

The first author of the paper, Michael Kelly, now a postdoctoral fellow at the National Institute on Deafness and Other Communication Disorders, was a graduate student in Emory’s Neuroscience program.

Kelly and his coworkers engineered mice to turn on the Atoh1 gene in the inner ear in response to the antibiotic doxycycline. Previous experimenters had used a virus to introduce Atoh1 into the cochleae of animals. This approach resembles gene therapy, but has the disadvantage of being slightly different each time, Chen says. In contrast, the mice have the Atoh1 gene turned on in specific cells along the lining of the inner ear, called the cochlear epithelium, but only when fed doxycycline.

Young mice given doxycycline for two days had extra sensory hair cells, both in the parts of the cochlea where developing hair cells usually appear and in additional locations.

The extra hair cells could generate electrical signals, although those signals weren’t as strong as those of mature hair cells. The extra hair cells also appeared to attract neuronal fibers, suggesting that their signals could connect to the rest of the nervous system.

"They can generate electrical signals, but we don’t know if they can really function in the context of hearing,” Chen says. “For that to happen, the hair cells’ signals need to be coordinated and integrated.”

Although doxycycline could turn on Atoh1 all over the surface of the cochlea, extra sensory hair cells did not appear everywhere. When Chen’s team removed cochleae from the mice and grew them in culture dishes, they were able to provoke even more hair cells to grow by adding a drug that inhibits the Notch pathway.

Manipulating the Notch pathway affects several aspects of embryonic development and in some contexts appears to cause cancer, so the approach needs to be refined further. Chen says that it may be possible to unlock the age-related limits on hair cell regeneration by supplying additional genes or drugs in combination with Atoh1, and the results with the Notch drug provide an example.

"Our future goals are to develop approaches to stimulate hair cell formation in older animals, and to examine functional recovery after Atoh1 induction," she says.

Provided by Emory University

Source: medicalxpress.com

May 14, 2012
#science #neuroscience #brain #psychology
Study raises questions about use of anti-epilepsy drugs in newborns

May 11, 2012

A brain study in infant rats demonstrates that the anti-epilepsy drug phenobarbital stunts neuronal growth, which could prompt new questions about using the first-line drug to treat epilepsy in human newborns.

In a paper posted online May 11 in Annals of Neurology EarlyView, researchers at Georgetown University Medical Center (GUMC) report that the anti-epilepsy drug phenobarbital, given to rat pups about a week old, changed the way the animals’ brains were wired, causing cognitive abnormalities later in life.

The researchers say it has been known that some of the drugs used to treat epilepsy increase the number of neurons that die shortly after birth in the rat brain, but, until this study, no one had shown whether this had any adverse impact on subsequent brain development.

"Our study is the first to show that the exposure to these drugs — and just a single exposure — can prevent brain circuits from developing their normal connectivity, meaning they may not be wired correctly, which can have long-lasting effects on brain function,” says the study’s senior investigator, Karen Gale, Ph.D., a professor of pharmacology at GUMC. “These findings suggest that in the growing brain, these drugs are not as benign as one would like to believe.”

For their study, the Georgetown researchers examined four agents, including phenobarbital.

"The good news is not all anti-epilepsy drugs have this disruptive effect in the animal studies," Gale says.

The researchers found that the anti-epilepsy drug levetiracetam did not stunt synaptic growth. Animals treated with a third drug, lamotrigine, showed neural maturation, but it was delayed. An additional finding involved melatonin. When added to phenobarbital, it appeared to prevent the persistent adverse neural effects in the rat pups. Melatonin has been used clinically to protect cells from injury in humans.

"Many clinicians have been advocating for a reexamination of the use of these drugs in infants, and our findings provide experimental data to support that need," says the study’s co-lead investigator, Patrick A. Forcelli, Ph.D., a postdoctoral fellow in the department of pharmacology and physiology at GUMC. "Phenobarbital has been used to treat seizures for over 100 years — well before a Food and Drug Administration approval process was established— and for more than 50 years, it has been the first drug of choice in the treatment of seizures in neonates."


May 14, 2012
#science #neuroscience #brain #epilepsy #psychology
Confirmation of repeated patterns of neurons indicates stereotypical organization throughout brain's cerebral cortex

May 11, 2012

Neurons are arranged in periodic patterns that repeat over large distances in two areas of the cerebral cortex, suggesting that the entire cerebral cortex has a stereotyped organization, reports a team of researchers led by Toshihiko Hosoya of the RIKEN Brain Science Institute. The cortex is known to have a stereotyped layered structure, with the same cell types arranged in the same way, but how neurons are organized in the other orientation — parallel to the brain’s surface — is poorly understood.


Figure 1: In the mouse visual cortex, neurons expressing id2 mRNA (magenta) are found in regularly repeating clusters. Reproduced from Ref. 1 © 2011 Hisato Maruoka et al., RIKEN Brain Science Institute

Hosoya and his colleagues therefore examined layer V of the mouse cortex, which contains two classes of large pyramidal neurons that look identical but differ in the connections they form: one projects axons straight down to regions beneath the cortex; the other projects to the cortex on the opposite side of the brain.

First, the researchers examined expression of the id2 gene in cells of the visual cortex, because these cells form clusters in that part of the brain. They found that id2 is expressed in nearly all cells that project axons downward, but not in those that cross over. Hosoya and colleagues verified this by visualizing the connections of cells using fluorescent cholera toxin, which binds to cell membranes and travels along the axons.

Further examination of gene expression patterns in tissue slices revealed that the cells are arranged in clusters aligned perpendicular to the brain’s surface, and that the clusters are organized in a regular pattern, with the same basic unit repeating every 30 micrometers (Fig. 1). They also observed the same pattern in layer V of the somatosensory cortex, suggesting that this organization may be common to other cortical areas.

By generating a strain of mutant mice expressing green fluorescent protein in the progenitor cells that produce the cells in layer V during brain development, Hosoya and his colleagues investigated the embryonic origin of these cells. This revealed that each cluster contains neurons that are produced by different progenitor cells.

Finally, the researchers showed that the regular pattern persists in the adult visual cortex, and that neurons in each cluster show the same activity patterns in response to visual stimulation. “Our preliminary data suggest that at least several other areas in the cortex have the same structure,” says Hosoya. “It’s likely that the entire cortex has the same organization, and I expect that the human cortex has the same structure.”

Provided by RIKEN

Source: medicalxpress.com

May 14, 2012
#science #neuroscience #brain #neuron #psychology
Astrocytes found to bridge gap between global brain activity and localized circuits

May 11, 2012

Global network activity in the brain modulates local neural circuitry via calcium signaling in non-neuronal cells called astrocytes (Fig. 1), according to research led by Hajime Hirase of the RIKEN Brain Science Institute. The finding clarifies the link between two important processes in the brain.

image

Figure 1: Astrocytes are star-shaped cells with numerous fine projections that ensheath synapses in the brain. © 2012 Hajime Hirase

Activity in large-scale brain networks is thought to modulate changes in neuronal connectivity, so-called ‘synaptic plasticity’, in the cerebral cortex. The neurotransmitter acetylcholine regulates global brain activity associated with attention and awareness, and is involved in plasticity.

To investigate how these processes are linked, Hirase and his colleagues simultaneously stimulated the whiskers of mice and the nucleus basalis of Meynert (NBM), a basal forebrain structure containing neurons that synthesize acetylcholine and project widely to the cortex. Using electrodes and an imaging technique called two-photon microscopy, performed through a ‘cranial window’, they monitored the responses of cells in the barrel cortex, which receives inputs from the whiskers.

Recordings from the electrodes showed that repeated co-stimulation of the whiskers and NBM induced plasticity in the barrel cortex. This plasticity depended on two types of receptors — muscarinic acetylcholine receptors (mAChRs) and N-methyl-D-aspartic acid receptors (NMDARs). Two-photon imaging further revealed that activation of the mAChRs during co-stimulation elevated the concentration of calcium ions within astrocytes of the barrel cortex.

The researchers repeated these experiments in mutant mice lacking the receptor that controls the release of calcium ions in astrocytes. Since co-stimulation of whiskers and NBM did not induce plasticity in the mutants, Hirase and colleagues concluded that calcium signaling in astrocytes acts as a ‘gate’ linking the changes in global brain state induced by acetylcholine to activity in local cortical circuits.

Furthermore, the researchers found that stimulation of the NBM led to an increase in the extracellular concentration of the amino acid D-serine in the normal, but not the mutant, mice. D-serine is secreted by astrocytes and activates NMDARs. Hirase’s team had previously shown that astrocytes are electrically silent in living rodents even in the presence of neural activity. The new findings show that biochemical, as opposed to electrical, activation of astrocytes induces them to release the transmitter that modulates synaptic plasticity in the neuronal circuitry.

“Our study is probably the first to show that calcium signaling in astrocytes is related to neuronal circuit plasticity in living animals,” says Hirase. “We are now studying if this type of calcium signaling occurs in all parts of an astrocyte or is restricted to some parts of the cell.”

Provided by RIKEN

Source: medicalxpress.com

May 14, 2012
#science #neuroscience #brain #psychology
Mild traumatic brain injury may alter brain's neuronal circuit excitability and contribute to brain network dysfunction

May 11, 2012

Even mild head injuries can cause significant abnormalities in brain function that last for several days, which may explain the neurological symptoms experienced by some individuals who have experienced a head injury associated with sports, accidents or combat, according to a study by Virginia Commonwealth University School of Medicine researchers.

These findings, published in the May issue of the Journal of Neuroscience, advance research in the field of traumatic brain injury (TBI), enabling researchers to better understand what brain structural or functional changes underlie posttraumatic disorders – a question that until now has remained unclear.

Previous research has shown that even a mild case of TBI can result in long-lasting neurological issues that include slowing of cognitive processes, confusion, chronic headache, posttraumatic stress disorder and depression.

The VCU team, led by Kimberle M. Jacobs, Ph.D., associate professor in the Department of Anatomy and Neurobiology, demonstrated for the first time, using sophisticated bioimaging and electrophysiological approaches, that mild injury can cause structural disruption of axons in the brain while also changing the way the neurons fire in areas where they have not been structurally altered. Axons are nerve fibers in the brain responsible for conducting electrical impulses. The team used models of mild traumatic brain injury and followed morphologically identified neurons in live cortical slices.

“These findings should help move the field forward by providing a unique bioimaging and electrophysiological approach to assess the evolving changes evoked by mild TBI and their potential therapeutic modulation,” said co-investigator, John T. Povlishock, Ph.D., professor and chair of the VCU School of Medicine’s Department of Anatomy and Neurobiology and director of the Commonwealth Center for the Study of Brain Injury.

According to Povlishock, additional benefit may also derive from the use of this model system with repetitive injuries to determine if repeated insults exacerbate the observed abnormalities.

Provided by Virginia Commonwealth University

Source: medicalxpress.com

May 14, 2012
#science #neuroscience #brain #psychology
Maternal Antibodies to Gluten Linked to Schizophrenia Risk in Children

May 11, 2012

Babies born to women with sensitivity to gluten appear to be at increased risk for certain psychiatric disorders later in life, according to research by scientists at Karolinska Institutet in Sweden and Johns Hopkins Children’s Center in Baltimore.

The team’s findings, published in The American Journal of Psychiatry, add to a growing body of evidence that many “adult” diseases may take root before and shortly after birth.

“Lifestyle and genes are not the only factors that shape disease risk, and factors and exposures before, during and after birth can help pre-program much of our adult health,” said investigator Robert Yolken, M.D., a neuro-virologist at Johns Hopkins Children’s Center. “Our study is an illustrative example suggesting that a dietary sensitivity before birth could be a catalyst in the development of schizophrenia or a similar condition 25 years later.”

Maternal infections and other inflammatory disorders during pregnancy have long been linked to greater risk for schizophrenia in the offspring but, the Swedish and U.S. investigators say, this is the first study that points to maternal food sensitivity as a possible culprit in the development of such disorders. The findings establish a strong link but do not mean that gluten sensitivity will invariably cause schizophrenia, the investigators caution. The research, however, does suggest an intriguing new mechanism that may drive up risk and illuminate possible prevention strategies.

“Our research not only underscores the importance of maternal nutrition during pregnancy and its lifelong effects on the offspring, but also suggests one potential cheap and easy way to reduce risk if we were to find further proof that gluten sensitivity exacerbates or drives up schizophrenia risk,” said study lead investigator Håkan Karlsson, M.D., Ph.D., a neuroscientist at Karolinska Institutet and former neuro-virology fellow at Johns Hopkins.

The team’s findings are based on an examination of 764 birth records and neonatal blood samples of Swedes born between 1975 and 1985. Some 211 of them subsequently developed non-affective psychoses, such as schizophrenia and delusional disorders.

Using stored neonatal blood samples, the investigators measured levels of IgG antibodies to milk and wheat. IgG antibodies are markers of an immune reaction triggered by the presence of certain proteins. Because a mother’s antibodies cross the placenta during pregnancy to confer immunity to the baby, elevated IgG levels in a newborn are evidence of protein sensitivity in the mother.

Children born to mothers with abnormally high levels of antibodies to the wheat protein gluten had nearly twice the risk of developing schizophrenia later in life, compared with children who had normal levels of gluten antibodies. The link persisted even after researchers accounted for other factors known to increase schizophrenia risk, including maternal age, gestational age, mode of delivery and the mother’s immigration status. The risk for psychiatric disorders was not increased among those with elevated levels of antibodies to milk protein.
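For readers unfamiliar with how such risk estimates are derived, here is a rough sketch of an odds-ratio calculation of the kind used in case-control designs like this one. The counts below are invented purely for illustration; they are not the study's data.

```python
# Illustrative odds-ratio calculation for a case-control comparison.
# All counts are hypothetical, NOT taken from the Swedish birth cohort.

def odds_ratio(exposed_cases, exposed_controls,
               unexposed_cases, unexposed_controls):
    """Odds of disease among the exposed divided by odds among the unexposed."""
    return (exposed_cases / exposed_controls) / (unexposed_cases / unexposed_controls)

# Hypothetical: offspring of mothers with high anti-gluten IgG ("exposed")
# versus offspring of mothers with normal IgG ("unexposed").
or_gluten = odds_ratio(exposed_cases=40, exposed_controls=60,
                       unexposed_cases=171, unexposed_controls=493)
print(round(or_gluten, 2))  # 1.92 with these invented counts
```

With these made-up counts the ratio comes out near 2, mirroring the "nearly twice the risk" reported above; the published estimate was additionally adjusted for confounders such as maternal age and mode of delivery.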

The researchers say the suspicion that food sensitivity in the mother can affect her child’s risk for psychiatric disorders stems from an observation made in the wake of World War II by U.S. Army researcher F. Curtis Dohan, M.D. Dohan noted that food scarcity in post-war Europe and wheat-poor diets led to notably fewer hospital admissions for schizophrenia. The link was merely observational, but it has piqued the curiosity of scientists ever since.

Researchers in the past also have observed that people diagnosed with schizophrenia have disproportionately high rates of celiac disease, a rare autoimmune disorder characterized by gluten sensitivity. Although it is a hallmark of the condition, gluten sensitivity alone is not enough to diagnose celiac disease. Other studies have found that some people with schizophrenia have gluten sensitivity without other signs of celiac disease, the researchers note.

Yolken and Karlsson say the team already is conducting follow-up studies to clarify how gluten or sensitivity to it increases schizophrenia risk and whether it does so only in those genetically predisposed.

Source: Neuroscience News

May 14, 2012 · 21 notes
#science #neuroscience #brain #psychology #schizophrenia
Neurodegeneration 'Switched Off' in Mice

ScienceDaily (May 10, 2012) — Researchers at the Medical Research Council (MRC) Toxicology Unit at the University of Leicester have identified a major pathway leading to brain cell death in mice with neurodegenerative disease. The team was able to block the pathway, preventing brain cell death and increasing survival in the mice.

[Image]

Scientists have identified a major pathway leading to brain cell death in mice with neurodegenerative disease. The team was able to block the pathway, preventing brain cell death and increasing survival in the mice. (Credit: © pressmaster / Fotolia)

In human neurodegenerative diseases, including Alzheimer’s, Parkinson’s and prion diseases, proteins “mis-fold” in a variety of different ways, resulting in the build-up of mis-shapen proteins. These form the plaques found in Alzheimer’s and the Lewy bodies found in Parkinson’s disease.

The researchers studied mice with neurodegeneration caused by prion disease. These mouse models currently provide the best animal representation of human neurodegenerative disorders, in which the build-up of mis-shapen proteins is known to be linked with brain cell death.

They found that the build-up of mis-folded proteins in the brains of these mice activates a natural defense mechanism in cells, which switches off the production of new proteins. This would normally switch back ‘on’ again, but in these mice the continued build-up of mis-shapen protein keeps the switch turned ‘off’. This is the trigger point leading to brain cell death, as the key proteins essential for nerve cell survival are not made.

By injecting a protein that blocks the ‘off’ switch of the pathway, the scientists were able to restore protein production, independently of the build-up of mis-shapen proteins, and halt the neurodegeneration. The brain cells were protected, protein levels and synaptic transmission (the way in which brain cells signal to each other) were restored, and the mice lived longer, even though only a very small part of their brain had been treated.

Mis-shapen proteins in human neurodegenerative diseases, such as Alzheimer’s and Parkinson’s, also over-activate this fundamental pathway controlling protein synthesis in the brains of patients, making it a common target underlying these different clinical conditions. The results suggest that treatments focused on this pathway could be protective in a range of neurodegenerative diseases in which mis-shapen proteins build up and cause neurons to die.

Professor Giovanna Mallucci, who led the team, said, “What’s exciting is the emergence of a common mechanism of brain cell death, across a range of different neurodegenerative disorders, activated by the different mis-folded proteins in each disease. The fact that, in mice with prion disease, we were able to manipulate this mechanism and protect the brain cells means we may have a way forward in how we treat other disorders. Instead of targeting individual mis-folded proteins in different neurodegenerative diseases, we may be able to target the shared pathways and rescue brain cell degeneration irrespective of the underlying disease.”

Professor Hugh Perry, chair of the MRC’s Neuroscience and Mental Health Board, said, “Neurodegenerative diseases such as Alzheimer’s and Parkinson’s are debilitating and largely untreatable conditions. Alzheimer’s disease and related disorders affect over seven million people in Europe, and this figure is expected to double every 20 years as the population ages across Europe. The MRC believes that research such as this, which looks at the fundamental mechanisms of these devastating diseases, is absolutely vital. Understanding the mechanism that leads to neuronal dysfunction prior to neuronal loss is a critical step in finding ways to arrest disease progression.”

Source: Science Daily

May 14, 2012 · 9 notes
#science #neuroscience #brain #psychology #alzheimer #parkinson
Glial Cells Supply Nerve Fibers with Energy-Rich Metabolic Products

May 10th, 2012

Glial cells pass on metabolites to neurons.

Around 100 billion neurons in the human brain enable us to think, feel and act. They transmit electrical impulses to remote parts of the brain and body via long nerve fibres known as axons. This communication requires enormous amounts of energy, which the neurons are thought to generate from sugar. Axons are closely associated with glial cells which, on the one hand, surround them with an electrically insulating myelin sheath and, on the other hand, support their long-term function. Klaus-Armin Nave and his research group from the Max Planck Institute of Experimental Medicine in Göttingen have now discovered a possible mechanism by which these glial cells in the brain can support their associated axons and keep them alive in the long term.

Oligodendrocytes are a group of highly specialised glial cells in the central nervous system. They are responsible for the formation of the fat-rich myelin sheath that surrounds the nerve fibres as an insulating layer. The comparison with the coating on electricity cables is an obvious one; however, myelin can do much more than act as the insulating layer on electricity cables: it increases the transmission speed of the axons and also reduces ongoing energy consumption. The extreme importance of myelin for a functioning nervous system is shown by the diseases that arise from a defective insulating layer, such as multiple sclerosis.

Interestingly, the function of the oligodendrocytes goes far beyond the mere provision of myelin. Klaus-Armin Nave and his team at the Max Planck Institute in Göttingen already succeeded in demonstrating years ago that healthy glial cells are also essential for the long-term function and survival of the axons themselves, irrespective of myelination. “The way in which the oligodendrocytes functionally support their associated axons was not clear to us up to now,” says Nave. In a new study, the researchers were able to show that the glial cells are involved in, among other things, the replenishment of energy in the nerve fibres. “They could be described as the petrol stations on the data highway of the axons,” says Nave, explaining the results.

[Image]

Electron microscope cross-section image of the nerve fibres (axons) of the optic nerve. Axons are surrounded by special glial cells, the oligodendrocytes, wrapping themselves around the axons in several layers. Between the axons, there are extensions of astrocytes, another type of glial cells. © K.-A.Nave/MPI f. Experimental Medicine

But how does the energy refuelling work? Is there a metabolic connection between the oligodendrocytes and axons? To find out, Ursula Fünfschilling generated genetically modified mice: the function of the mitochondria was deliberately disrupted in the oligodendrocytes through the inactivation of the Cox10 gene. This affects the final stages of sugar breakdown taking place in the mitochondria, where energy is harnessed in a process known as the respiratory chain. If a link in this chain is missing (in this instance cytochrome oxidase, which is only functional when cells have the enzyme Cox10), the glial cells gradually lose the capacity for cell respiration in their mitochondria. “Without respiration of their own, the manipulated glial cells of the nervous system should have died,” explains the scientist. That is, unless the low level of energy harnessed from the splitting of glucose into pyruvate or lactic acid, a process known as glycolysis, is sufficient for them.

And this is precisely what the scientists observed in their mice: the animals’ myelin was initially formed in the normal way. The loss of the mitochondrial respiratory chain, which started at this point, did not appear to affect the glial cells in the central nervous system. Even one year later, no neurodegenerative changes were observed in the brain. The scientists assume that in the early weeks of life, a phase characterised by maximum energy requirement, the mutated oligodendrocytes still rely on many intact mitochondria. More mature oligodendrocytes later appear to reduce mitochondrial respiration and switch to energy generation through increased glycolysis. In healthy glial cells this has the advantage that the metabolic products arising during the breakdown of glucose can be used as building blocks for myelin synthesis. In addition, the lactic acid that arises in the oligodendrocytes can be passed on to the axons, where it can be used to produce energy with the help of the axons’ own mitochondria.
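The energy trade-off described here can be made concrete with textbook ATP yields (rough approximations, not measurements from this study): glycolysis alone yields about 2 ATP per glucose, while full mitochondrial oxidation yields roughly 30, so a purely glycolytic glial cell survives on little energy while exporting lactate that is still worth a great deal of ATP to the axon.

```python
# Back-of-the-envelope ATP accounting (textbook approximations,
# NOT measurements from the Max Planck study).

ATP_GLYCOLYSIS_PER_GLUCOSE = 2   # glucose -> 2 pyruvate/lactate, no mitochondria needed
ATP_OXIDATIVE_PER_GLUCOSE = 30   # full oxidation of glucose, approximate modern estimate
ATP_PER_LACTATE_OXIDIZED = 14    # approximate yield when a mitochondrion burns one lactate

# A glycolytic oligodendrocyte keeps the 2 ATP per glucose for itself...
glial_gain = ATP_GLYCOLYSIS_PER_GLUCOSE
# ...and can hand the axon two lactate molecules per glucose,
# worth roughly 28 ATP once the axon's own mitochondria oxidize them.
axonal_gain = 2 * ATP_PER_LACTATE_OXIDIZED
print(glial_gain, axonal_gain)  # 2 28
```

The arithmetic shows why the mutant mice survive: almost all of the energy in each glucose molecule is still recoverable, it is simply harvested in the axon rather than in the glial cell.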

“The complete loss of the respiratory chain in the deliberately modified oligodendrocytes probably just accelerates a developmental step that unfolds naturally,” explains Nave. Thus the loss of glial mitochondria does not lead to a deterioration of the energy supply to the axons but, conversely, to an oversupply of exploitable lactic acid. The affected nerve pathways themselves demonstrably have no problem metabolising the lactic acid from oligodendrocytes. Transport proteins ensure the rapid transfer of the lactic acid between the oligodendrocytes and their myelinated axons.

This finding provides a new understanding of the role of oligodendrocytes: in addition to their known significance for myelination, they can directly provide the axons with glucose products that can be used as fuel, with the help of axonal mitochondria, in periods of high activity. This metabolic coupling between glial cells and axons could explain, among other things, why in many myelin diseases, for example multiple sclerosis, the affected demyelinated axons often suffer irreversible damage.

Source: Neuroscience News

May 14, 2012 · 16 notes
#science #neuroscience #brain #psychology #neuron
Key Cellular Mechanisms Behind the Onset of Tinnitus Identified

ScienceDaily (May 10, 2012) — Research into hearing loss after exposure to loud noises could lead to the first drug treatments to prevent the development of tinnitus.

Researchers in the University of Leicester’s Department of Cell Physiology and Pharmacology have identified a cellular mechanism that could underlie the development of tinnitus following exposure to loud noises. The discovery could lead to novel tinnitus treatments, and investigations into potential drugs to prevent tinnitus are currently underway.

Tinnitus is a sensation of phantom sounds, usually ringing or buzzing, heard in the ears when no external noise is present. It commonly develops after exposure to loud noises (acoustic over-exposure), and scientists have speculated that it results from damage to nerve cells connected to the ears.

Although hearing loss and tinnitus affect around ten percent of the population, there are currently no drugs available to treat or prevent tinnitus.

University of Leicester researcher Dr Martine Hamann, who led the study published in the journal Hearing Research, said: “We need to know the implications of acoustic over exposure, not only in terms of hearing loss but also what’s happening in the brain and central nervous system. It’s believed that tinnitus results from changes in excitability in cells in the brain — cells become more reactive, in this case more reactive to an unknown sound.”

Dr Hamann and her team, including PhD student Nadia Pilati, looked at cells in an area of the brain called the dorsal cochlear nucleus — the relay carrying signals from nerve cells in the ear to the parts of the brain that decode and make sense of sounds. Following exposure to loud noises, some of the nerve cells (neurons) in the dorsal cochlear nucleus start to fire erratically, and this uncontrolled activity eventually leads to tinnitus.

Dr Hamann said: “We showed that exposure to loud sound triggers hearing loss a few days after the exposure to the sound. It also triggers this uncontrolled activity in the neurons of the dorsal cochlear nucleus. This is all happening very quickly, in a matter of days.”

In a key breakthrough in collaboration with GSK who sponsored Dr Pilati’s PhD, the team also discovered the specific cellular mechanism that leads to the neurons’ over-activity. Malfunctions in specific potassium channels that help regulate the nerve cell’s electrical activity mean the neurons cannot return to an equilibrium resting state.

Ordinarily, these cells fire in a regular pattern and return to a resting state between impulses. However, if the potassium channels are not working properly, the cells cannot return to a resting state and instead fire continuously in random bursts, creating the sensation of constant noise when none exists.

Dr Hamann explained: “In normal conditions the channel helps to drag down the cellular electrical activity to its resting state and this allows the cell to function with a regular pattern. After exposure to loud sound, the channel is functioning less and therefore the cell is constantly active, being unable to reach its resting state and displaying those irregular bursts.”

Although many researchers have investigated the mechanisms underlying tinnitus, this is the first time that cellular bursting activity has been characterised and linked to specific potassium channels. Identifying the potassium channels involved in the early stages of tinnitus opens up new possibilities for preventing tinnitus with early drug treatments.

Dr Hamann’s team is currently investigating potential drugs that could regulate the damaged cells, preventing their erratic firing and returning them to a resting state. If suitable drug compounds are discovered, they could be given to patients who have been exposed to loud noises to protect them against the onset of tinnitus.

These investigations are still in the preliminary stages, and any drug treatment would still be years away.

Source: Science Daily

May 10, 2012 · 6 notes
#science #neuroscience #hearing #psychology #brain
Testosterone-Fueled Infantile Males Might Be a Product of Mom's Behavior

ScienceDaily (May 10, 2012) — By comparing the testosterone levels of five-month-old pairs of twins, both identical and non-identical, University of Montreal researchers were able to establish that testosterone levels in infancy are not inherited genetically but rather determined by environmental factors.

[Image]

Angry boy. Testosterone levels in infancy are not inherited genetically but rather determined by environmental factors, new research suggests. (Credit: © crestajohnson / Fotolia)

"Testosterone is a key hormone for the development of male reproductive organs, and it is also associated with behavioural traits, such as sexual behaviour and aggression," said lead author Dr. Richard E. Tremblay of the university’s Research Unit on Children’s Psychosocial Maladjustment. "Our study is the largest to be undertaken with newborns, and our results contrast with the findings gained by scientists working with adolescents and adults, indicating that testosterone levels are inherited."

The findings were presented in an article published in Psychoneuroendocrinology on May 7, 2012.

The researchers took saliva samples from 314 pairs of twins and measured the levels of testosterone. They then compared the similarity in testosterone levels between identical and fraternal twins to determine the contribution of genetic and environmental factors. Results indicated that differences in levels of testosterone were due mainly to environmental factors. “The study was not designed to specifically identify these environmental factors, which could include a variety of conditions, such as maternal diet, maternal smoking, breastfeeding and parent-child interactions.”
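The logic of the twin comparison can be sketched with Falconer's classic formula, which estimates heritability as twice the difference between identical-twin and fraternal-twin correlations: if identical (MZ) twins are scarcely more alike than fraternal (DZ) twins, heritability is near zero and environmental factors dominate. The correlations below are hypothetical, chosen only to illustrate the reasoning; they are not values from the Montreal study.

```python
# Falconer's formula sketch: partitioning twin similarity into genetic
# and shared-environment components. The correlations are hypothetical,
# NOT values reported by the University of Montreal study.

def falconer_h2(r_mz, r_dz):
    """Narrow-sense heritability estimate: h^2 = 2 * (r_MZ - r_DZ)."""
    return 2 * (r_mz - r_dz)

def shared_env_c2(r_mz, r_dz):
    """Shared-environment estimate: c^2 = 2 * r_DZ - r_MZ."""
    return 2 * r_dz - r_mz

# If identical twins correlate barely more than fraternal twins,
# the trait looks environmentally driven rather than inherited:
h2 = round(falconer_h2(r_mz=0.30, r_dz=0.26), 2)
c2 = round(shared_env_c2(r_mz=0.30, r_dz=0.26), 2)
print(h2, c2)  # 0.08 0.22
```

With these illustrative numbers, only about 8 percent of the variance would be attributed to genes, which is the pattern consistent with the study's conclusion that infant testosterone levels are shaped mainly by environment.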

"Because our study suggests that testosterone levels in infants are determined by the circumstances in which the child develops before and after birth, further studies will be needed to find out exactly what these influencing factors are and to what extent they change from birth to puberty," Tremblay said.

Source: Science Daily

May 10, 2012 · 6 notes
#science #neuroscience #brain #psychology
Evolution’s Gift May Also Be at the Root of a Form of Autism

May 10th, 2012

A recently evolved pattern of gene activity in the language and decision-making centers of the human brain is missing in a disorder associated with autism and learning disabilities, a new study by Yale University researchers shows.

“This is the cost of being human,” said Nenad Sestan, associate professor of neurobiology, researcher at Yale’s Kavli Institute for Neuroscience, and senior author of the paper. “The same evolutionary mechanisms that may have gifted our species with amazing cognitive abilities have also made us more susceptible to psychiatric disorders such as autism.”

The findings are reported in the May 11 issue of the journal Cell.

In the Cell paper, Kenneth Kwan, the lead author, and other members of the Sestan laboratory identified the evolutionary changes that led the NOS1 gene to become active specifically in the parts of the developing human brain that form the adult centers for speech and language and decision-making. This pattern of NOS1 activity is controlled by a protein called FMRP and is missing in Fragile X syndrome, a disorder caused by a genetic defect on the X chromosome that disrupts FMRP production. Fragile X syndrome, the leading inherited form of intellectual disability, is also the most common single-gene cause of autism. The loss of NOS1 activity may contribute to some of the many cognitive deficits suffered by those with Fragile X syndrome, such as lower IQ, attention deficits, and speech and language delays, the authors say.

The pattern of NOS1 activity in these brain centers does not occur in the developing mouse brain — suggesting that it is a more recent evolutionary adaptation possibly involved in the wiring of neural circuits important for higher cognitive abilities. The findings of the Cell paper support this hypothesis. The study also provides insights into how genetic deficits in early development, a time when brain circuits are formed, can lead to disorders such as autism, in which symptoms appear after birth.

“This is an example of where the function of genetic changes that likely drove aspects of human brain evolution was disrupted in disease, possibly reverting some of our newly acquired cognitive abilities and thus contributing to a psychiatric outcome,” Kwan said.

[Image]

Artist representation of early developmental brain cells that when disrupted cause Fragile X syndrome. Adapted from Yale University press release image.

By Bill Hathaway

Source: Neuroscience News

May 10, 2012 · 19 notes
#science #neuroscience #psychology #brain #autism
Researchers identify genetic mutation causing rare form of spinal muscular atrophy

May 10, 2012

Scientists have confirmed that mutations of a gene are responsible for some cases of a rare, inherited disease that causes progressive muscle degeneration and weakness: spinal muscular atrophy with lower extremity predominance, also known as SMA-LED.

"Typical spinal muscular atrophies begin in infancy or early childhood and are fatal, involving all motor neurons, but SMA-LED predominantly affects nerve cells controlling muscles of the legs. It is not fatal and the prognosis is good, although patients usually are moderately disabled and require assistive devices such as bracing and wheelchairs throughout their lives," said Robert H. Baloh, MD, PhD, director of Cedars-Sinai Medical Center’s Neuromuscular Division and senior author of a Neurology article describing the new findings on DYNC1H1.

DYNC1H1 encodes part of dynein, a molecule inside cells that acts as a motor to transport cellular components. Using cells cultured from patients, Baloh’s group showed that the mutation disrupts this motor’s function. The researchers found that some subjects with mutations had global developmental delay in addition to weakness, indicating the brain also is involved.

"Our observations suggest that a range of DYNC1H1-related disease exists in humans – from a widespread neurodevelopmental abnormality of the central nervous system to more selective involvement of certain motor neurons, which manifests as spinal muscular atrophy," Baloh said.

He pointed out that while this molecule is responsible for some inheritable cases of spinal muscular atrophy with lower extremity predominance, the genetic mutation is absent in others. The search continues, therefore, to find other culprit genetic mutations and develop biological therapies to correct them.

"Although this is a rare form of motor neuron disease, it tells us that dynein function – the molecular motor – is crucial for the development and maintenance of motor neurons, which we hope will provide insight into the common form of spinal muscular atrophy and also amyotrophic lateral sclerosis," Baloh said. ALS (also known as Lou Gehrig’s disease) is a progressive neurodegenerative disease that affects nerve cells in the brain and spinal cord.

Baloh, an expert in treating and studying neuromuscular and neurodegenerative diseases, joined Cedars-Sinai in early 2012, working with other physicians and scientists in the Department of Neurology and the Regenerative Medicine Institute to establish one of the most comprehensive neuromuscular disease treatment and research teams in California.

Provided by Cedars-Sinai Medical Center

Source: medicalxpress.com

May 10, 2012 · 3 notes
#science #neuroscience #psychology #biology #disease
Mathematical model unlocks key to brain wiring

May 10, 2012

(Medical Xpress) — A new mathematical model predicting how nerve fibres make connections during brain development could aid understanding of how some cognitive disorders occur.

The model, constructed by scientists at the Queensland Brain Institute (QBI) and School of Mathematics and Physics at the University of Queensland (UQ), gives new insight into how changing chemical levels in nerve fibres can modify nerve wiring underpinning connections in the brain.

Professor Geoff Goodhill says that while scientists have long known that changing these chemical levels can change where nerve fibres grow, only now are they understanding why this is the case.

“Our mathematical model allows us to predict precisely how these chemical levels control the direction in which nerve fibres grow, during both neural development and regeneration after injury,” he said.

Correct brain wiring is fundamental for normal brain function.

Recent discoveries suggest that wiring problems may underpin a number of nervous system disorders including autism, dyslexia, Down syndrome, Tourette’s syndrome and Parkinson’s disease.

The new model, published in the prestigious Cell Press journal Neuron, demonstrates the important role mathematics can play in understanding how the brain develops, and perhaps ultimately in preventing such disorders.

Provided by University of Queensland 

Source: medicalxpress.com

May 10, 2012 · 7 notes
#science #neuroscience #brain #psychology
Researchers move closer to delaying dementia

May 10, 2012

(Medical Xpress) — Scientists at University of Queensland’s Brain Institute are one step closer to developing new therapies for treating dementia.

QBI’s Dr Jana Vukovic said the work was aimed at understanding the molecular mechanism that may impair learning and memory in the ageing population.

“Ageing slows the production of new nerve cells, reducing the brain’s ability to form new memories,” said Dr Vukovic, who performed the work in the laboratory of Professor Perry Bartlett, the Director of QBI at The University of Queensland.

"But our research shows for the first time that the brain cells usually responsible for mediating immunity, microglia, have an inhibitory effect on memory during ageing.

“Furthermore, they have shown that a molecule produced by nerve cells, fractalkine, can reverse this process and stimulate stem cells to produce new neurons.”

The discovery, published in The Journal of Neuroscience today, came after QBI scientists observed that the increased production of new neurons in mice that were actively running was due to the release of fractalkine in the hippocampus – the brain structure responsible for specific types of learning and memory.

Professor Bartlett said it had been known for some time that exercise increased the production of new nerve cells in the hippocampus in young and even aged mice.

“But this study found that it is fractalkine that appears to be specifically mediating this effect by making the microglia produce factors that activate the stem cells that produce new nerve cells,” he said.

“Once the cells are activated they divide and produce new cells, which underpin the animal’s ability to learn and form memories.

"This means that fractalkine may form the basis for the development of future therapies.

“The discovery is especially exciting because we have found that older animals suffering cognitive decline showed significantly lower levels of fractalkine.

“We are seeking ways of increasing fractalkine levels in patients with cognitive decline, and hoping this may be a new frontline therapy in treating dementia.”

Dr Vukovic said that until relatively recently, it was thought the adult brain was incapable of generating new neurons.

“But work from Professor Bartlett’s laboratory over the past 20 years has demonstrated that the brains of adult animals, including humans, retain the ability to make new nerve cells,” she said.

“The challenge is to find out how to stimulate this production in the aged animal and human where production has slowed.”

The latest work was a significant step toward achieving this goal, she said.

Provided by University of Queensland

Source: medicalxpress.com

May 10, 2012 · 8 notes
#science #neuroscience #brain #psychology
Think global, act local: New roles for protein synthesis at synapses

May 10, 2012

(Medical Xpress) — How do we build a memory in the brain? It is well known that in animals (and humans) new proteins are needed to establish long-term memories. During learning, information is stored at the synapses, the junctions connecting nerve cells. Synapses also require new proteins in order to show changes in their strength (synaptic plasticity). Historically, scientists have focused on the cell body as the place where the required proteins are synthesized. However, in recent years there has been increasing focus on the dendrites and axons (the compartments that meet to form synapses) as a potential site for protein synthesis.

Protein synthesis machinery has been observed there, as well as a limited number of its templates, the messenger RNA (mRNA) molecules. The limited number of mRNAs observed in dendrites and axons placed constraints on the constellation of proteins that could be synthesized to help synapses work and change. Researchers from Erin Schuman’s lab at the Max Planck Institute (MPI) for Brain Research used next-generation sequencing to directly identify a very large number (over 2,500) of mRNA molecules present in axons and dendrites. Using high-resolution imaging techniques, they were able to both quantify and visualize individual mRNA molecules. They published their findings in the latest issue of Neuron.

[Video]
Erin Schuman and her colleagues describe how they were able to detect numerous new mRNAs in the processes of neurons with unprecedented sensitivity. Video: Neuron.

Using microarray approaches and/or in situ hybridization techniques, many different groups had each identified a hundred or so mRNAs that might reside in the dendrites. By analyzing and comparing these studies, the Schuman team discovered something surprising: not a single mRNA type was found in all of the studies. This observation made the scientists at the MPI for Brain Research wonder whether the already discovered mRNAs were just the tip of the iceberg, and whether many more mRNA molecules were waiting to be discovered.

In order to find out, the researchers dissected the neuropil layer of the rat hippocampus. This layer comprises a high concentration of axons and dendrites but lacks the cell bodies of pyramidal neurons (the principal cell type in the hippocampus and other brain areas). By using sensitive high-resolution sequencing techniques, mRNAs could be detected which, due to their lower concentrations, had not been discovered before. The researchers found an impressive 2,550 unique mRNAs present in the dendrites and/or axons. To determine their relative abundance in the neuronal cells, the scientists in Erin Schuman’s lab used the NanoString nCounter, a new technique allowing the high-resolution visualization and quantification of single mRNA molecules. They found that the concentration of mRNAs in the neuronal cells varies by three orders of magnitude. Additionally, the researchers were able to classify many of the mRNAs and determine their function in synaptic plasticity. These include signaling molecules, scaffolds and the receptors for neurotransmitter molecules. In addition, many mRNAs coding for proteins implicated in diseases like autism were discovered in the dendrites and axons. Finally, by using advanced imaging techniques, the researchers could directly visualize some of the mRNAs in the neuronal dendrites, hundreds of micrometers from the cell body.

These results reveal a previously unappreciated enormous potential for the local protein synthesis machinery to supply, maintain and modify the dendritic and synaptic protein population. It seems that neurons use a local control mechanism much in the same way that modern societies have learned that the most efficient means to distribute goods to the population is to use local distribution centers.

Provided by Max Planck Society

Source: medicalxpress.com

May 10, 2012 · 6 notes
#science #neuroscience #brain #memory #psychology
Researchers say genes and vascular risk modify effects of aging on brain and cognition

May 9, 2012 

Efforts to understand how the aging process affects the brain and cognition have expanded beyond simply comparing younger and older adults.

"Everybody ages differently. By looking at genetic variations and individual differences in markers of vascular health, we begin to understand that preventable factors may affect our chances for successful aging," said Wayne State University psychology doctoral student Andrew Bender, lead author of a study supported by the National Institute on Aging of the National Institutes of Health and now in press in the journal Neuropsychologia.

The report, “Age-related Differences in Memory and Executive Functions in Healthy APOE ε4 Carriers: The Contribution of Individual Differences in Prefrontal Volumes and Systolic Blood Pressure,” focuses on carriers of the ε4 variant of the apolipoprotein E (APOE) gene, present in roughly 25 percent of the population. Compared to those who possess other forms of the APOE gene, carriers of the ε4 allele are at significantly greater risk for Alzheimer’s, dementia and cardiovascular disease.

Many studies also have shown that nondemented carriers of the APOE ε4 variant have smaller brain volumes and perform less well on cognitive tests than carriers of other gene variants. Those findings, however, are not consistent, and a possible explanation may come from examining interactions between the risky genes and other factors, such as markers of cardiovascular health. Prior research in typical samples of older adults has shown that indeed other vascular risk factors — such as elevated cholesterol, hypertension or diabetes — can exacerbate the impact of the APOE ε4 variant on brain and cognition, but it is unclear if such synergy of risks is present in healthy adults.

Thus, Wayne State researchers evaluated a group of volunteers from 19 to 77 years of age who self-reported as exceptionally healthy on a questionnaire that screened for a number of conditions, representing a “best case scenario” of healthy aging. The research project, led by Naftali Raz, Ph.D., professor of psychology and director of the Lifespan Cognitive Neuroscience Research Program at WSU’s Institute of Gerontology, tested different cognitive abilities known for their sensitivity to aging and the effects of the APOE ε4 variant. Those abilities include speed of information processing, working memory (holding and manipulating information in one’s mind) and episodic memory (memory for events).

Researchers also measured participants’ blood pressure, performed genetic testing to determine which APOE variant participants carried, and measured the volumes of several critical brain regions using a high-resolution structural magnetic resonance imaging brain scan. Bender and Raz showed that for older APOE ε4 carriers, even minor increases in systolic blood pressure (the higher of the two numbers that are reported in blood pressure measures) were linked with smaller volumes of the prefrontal cortex and prefrontal white matter, slower speed of information processing, reduced working memory capacity and worse verbal memory. Notably, they said, that pattern was not evident in those who lacked the ε4 gene variant.

The study concludes that the APOE ε4 gene may make its carriers sensitive to negative effects of relatively subtle elevations in systolic blood pressure, and that the interplay between two risk factors, genetic and physiological, is detrimental to the key brain structures and associated cognitive functions.

"Although genes play a significant role in shaping the effects of age and vascular risk on the brain and cognition, the impact of single genetic variants is relatively small, and there are quite a few of them. Thus, one’s aging should not be seen through the lens of one’s genetic profile," cautioned the study’s authors. They continued, "The negative impact of many genetic variations needs help from other risk factors, and while there isn’t much one can do about genes, a lot can be done about vascular risk factors such as blood pressure or cholesterol."

"Everybody should try to keep those in check, although people with certain genetic variants more so than others," Raz said. "Practically speaking, even with the best deck of genetic cards dealt to you, it still makes sense to reduce risk through whatever works: exercise, diet or, if those fail, medication."

Because the study is part of a longitudinal project, he and Bender said the immediate future task now is to determine how the interaction between risky genes and vascular risk factors affects the trajectory of age-related changes — not differences, as in this cross-sectional study — in brain and cognition.

Provided by Wayne State University

Source: medicalxpress.com

May 10, 2012
#science #neuroscience #brain #psychology
Chronic cocaine use triggers changes in brain's neuron structure

May 9, 2012

Chronic exposure to cocaine reduces the expression of a protein known to regulate brain plasticity, according to new, in vivo research on the molecular basis of cocaine addiction. That reduction drives structural changes in the brain, which produce greater sensitivity to the rewarding effects of cocaine.

The research, led by UB’s Dietz, suggests a potential new target for development of a treatment for cocaine addiction. Credit: Douglas Levere, UB Communications

The finding suggests a potential new target for development of a treatment for cocaine addiction. It was published last month in Nature Neuroscience by researchers at the University at Buffalo and Mount Sinai School of Medicine.

"We found that chronic cocaine exposure in mice led to a decrease in this protein’s signaling," says David Dietz, PhD, assistant professor of pharmacology and toxicology in the School of Medicine and Biomedical Sciences, who did the work while at Mt. Sinai. "The reduction of the expression of the protein, called Rac1, then set in motion a cascade of events involved in structural plasticity of the brain — the shape and growth of neuronal processes in the brain. Among the most important of these events is the large increase in the number of physical protrusions or spines that grow out from the neurons in the reward center of the brain.

"This suggests that Rac1 may control how exposure to drugs of abuse, like cocaine, may rewire the brain in a way that makes an individual more susceptible to the addicted state," says Dietz.

The growth of these spines underlies the heightened reward effect that an individual obtains from exposure to cocaine. By changing the level of Rac1 expression, Dietz and his colleagues were able to control whether or not the mice became addicted, by preventing the cocaine-induced enhancement of the brain’s reward center.

To do the experiment, Dietz and his colleagues used a novel tool in which light activation controls Rac1 expression, the first time that a light-activated protein has been used to modulate brain plasticity.

"We can now understand how proteins function in a very temporal pattern, so we could look at how regulating genes at a specific time point could affect behavior, such as drug addiction, or a disease state," says Dietz.

In his UB lab, Dietz is continuing his research on the relationship between behavior and brain plasticity, looking, for example, at how plasticity might determine how much of a drug an animal takes and how persistent the animal is in trying to get the drug.

Provided by University at Buffalo

Source: medicalxpress.com

May 10, 2012
#science #neuroscience #brain #psychology
Scientists identify neurotransmitters that lead to forgetting

May 9, 2012

While we often think of memory as a way of preserving the essential idea of who we are, little thought is given to the importance of forgetting to our wellbeing, whether what we forget belongs in the “horrible memories department” or just reflects the minutiae of day-to-day living.

Despite the fact that forgetting is normal, exactly how we forget—the molecular, cellular, and brain circuit mechanisms underlying the process—is poorly understood.

Now, in a study that appears in the May 10, 2012 issue of the journal Neuron, scientists from the Florida campus of The Scripps Research Institute have pinpointed a mechanism that is essential for forming memories in the first place and, as it turns out, is equally essential for eliminating them after memories have formed.

"This study focuses on the molecular biology of active forgetting," said Ron Davis, chair of the Scripps Research Department of Neuroscience who led the project. "Until now, the basic thought has been that forgetting is mostly a passive process. Our findings make clear that forgetting is an active process that is probably regulated."

The Two Faces of Dopamine

To better understand the mechanisms for forgetting, Davis and his colleagues studied Drosophila, or fruit flies, a key model for studying memory that has been found to be highly applicable to humans. The flies were put in situations where they learned that certain smells were associated with either a positive reinforcement, like food, or a negative one, such as a mild electric shock. The scientists then observed changes in the flies’ brains as they remembered or forgot the new information.

The results showed that a small subset of dopamine neurons actively regulate the acquisition of memories and the forgetting of these memories after learning, using a pair of dopamine receptors in the brain. Dopamine is a neurotransmitter that plays an important role in a number of processes including punishment and reward, memory, learning and cognition.

But how can a single neurotransmitter, dopamine, have two seemingly opposite roles in both forming and eliminating memories? And how can these two dopamine receptors serve acquiring memory on the one hand, and forgetting on the other?

The study suggests that when a new memory is first formed, there also exists an active, dopamine-based forgetting mechanism—ongoing dopamine neuron activity—that begins to erase those memories unless some importance is attached to them, a process known as consolidation that may shield important memories from the dopamine-driven forgetting process.

The study shows that specific neurons in the brain release dopamine to two different receptors known as dDA1 and DAMB, located on what are called mushroom bodies because of their shape; these densely packed networks of neurons are vital for memory and learning in insects. The study found the dDA1 receptor is responsible for memory acquisition, while DAMB is required for forgetting.

When dopamine neurons begin the signaling process, the dDA1 receptor becomes overstimulated and begins to form memories, an essential part of memory acquisition. Once that memory is acquired, however, these same dopamine neurons continue signaling. Except this time, the signal goes through the DAMB receptor, which triggers forgetting of those recently acquired, but not yet consolidated, memories.

Jacob Berry, a graduate student in the Davis lab who led the experimentation, showed that inhibiting the dopamine signaling after learning enhanced the flies’ memory. Hyperactivating those same neurons after learning erased memory. And, a mutation in one of the receptors, dDA1, produced flies unable to learn, while a mutation in the other, DAMB, blocked forgetting.

Intriguing Issues

While Davis was surprised by the mechanisms the study uncovered, he was not surprised that forgetting is an active process. “Biology isn’t designed to do things in a passive way,” he said. “There are active pathways for constructing things, and active ones for degrading things. Why should forgetting be any different?”

The study also brings into focus a number of intriguing issues, Davis said, such as savant syndrome.

"Savants have a high capacity for memory in some specialized areas," he said. "But maybe it isn’t memory that gives them this capacity, maybe they have a bad forgetting mechanism. This also might be a strategy for developing drugs to promote cognition and memory—what about drugs that inhibit forgetting as cognitive enhancers?"

Provided by The Scripps Research Institute

Source: medicalxpress.com

May 10, 2012
#science #neuroscience #brain #psychology
Why Do People Choke When the Stakes Are High? Loss Aversion May Be the Culprit

ScienceDaily (May 9, 2012) — In sports, on a game show, or just on the job, what causes people to choke when the stakes are high? A new study by researchers at the California Institute of Technology (Caltech) suggests that when there are high financial incentives to succeed, people can become so afraid of losing their potentially lucrative reward that their performance suffers.

In the study, each participant was asked to control this virtual object on a screen. The virtual object consisted of two weighted balls connected by a spring. The task was to place the object, which stretched and contracted as a weighted spring would in real life, into a square target within two seconds. (Credit: Image courtesy of California Institute of Technology)

It is a somewhat unexpected conclusion. After all, you would think that the more people are paid, the harder they will work, and the better they will do their jobs — until they reach the limits of their skills. That notion tends to hold true when the stakes are low, says Vikram Chib, a postdoctoral scholar at Caltech and lead author on a paper published in the May 10 issue of the journal Neuron. Previous research, however, has shown that if you pay people too much, their performance actually declines.

Some experts have attributed this decline to too much motivation: they think that, faced with the prospect of earning an extra chunk of cash, you might get so excited that you will fail to do the task properly. But now, after looking at brain-scan data of volunteers performing a specific motor task, the Caltech team says that what actually happens is that you become worried about losing your potential prize. The researchers also found that the more someone is afraid of loss, the worse they perform.

In the study, each participant was asked to control a virtual object on a screen by moving an index finger that had a tracking device attached to it. The virtual object consisted of two weighted balls connected by a spring. The task was to place the object, which stretched and contracted as a weighted spring would in real life, into a square target within two seconds.

The researchers controlled for individual skill levels by customizing the size of the target so that everyone would have the same success rate. That way, people who happened to be really good or bad at this task would not skew the data.

After a training period, the subjects were asked to perform the task while inside an fMRI machine, which measures blood flow in the brain — a proxy for brain activity, since wherever a brain is active, it needs extra oxygen, and thus a larger volume of blood. By monitoring blood flow, the researchers can pinpoint areas of the brain that turn on when a particular task is performed.

The task began with the researchers offering the participants a randomized range of rewards — from $0 to $100 — if they could successfully place the object into the square within the time limit. At the end of hundreds of trials — each with varying reward amounts — the participant was given their reward, based on the result of just one of the trials, picked at random.
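Paying out a single randomly chosen trial is a standard incentive-compatibility device: every trial's stakes stay "live" because any one of them might be the one that counts. A minimal sketch of that payout rule, with hypothetical trial records (the field names and success rate are our assumptions, not details from the study):

```python
import random

def settle_payout(trials, rng=random):
    """Pick one trial at random; pay its reward only if the
    participant succeeded on it. Every trial therefore matters
    ex ante, even though only one is ever paid."""
    chosen = rng.choice(trials)
    return chosen["reward"] if chosen["success"] else 0

# Hypothetical session: hundreds of trials, rewards $0-$100.
rng = random.Random(42)
trials = [{"reward": rng.randrange(0, 101), "success": rng.random() < 0.5}
          for _ in range(300)]
payout = settle_payout(trials, rng)
print(0 <= payout <= 100)  # True
```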

As expected, the team found that performance improved as the incentives increased — but only when the cash reward amounts were at the low end of the spectrum. Once the rewards passed a certain threshold, which depended on the individual, performance began to fall off.

Incentives are known to activate a part of your brain called the ventral striatum, Chib says; the researchers thus expected to see the ventral striatum become increasingly active as they bumped up the prizes. And if the conventional thought were correct — that the reason for the observed performance decline was over-motivation — they would expect the striatum to continue showing a lot of activation when the incentives became high enough for performance to suffer.

What they found, instead, was that when the participants were shown their potential rewards, activity in the striatum did indeed increase with rising incentives. But once the volunteers started doing the task, striatal activity decreased with rising incentives. They also noticed that the less activity they saw in a participant’s striatum, the worse that person performed on the task.

Other studies have shown that decreasing striatal activity is related to fear or aversion to loss, Chib says. “When people see the incentive that they’re being offered, they initially encode it as a gain,” he explains. “But when they’re actually doing the task, the thing that causes them to perform poorly is that they worry about losing a potential incentive they haven’t even received yet.” He adds, “We’re showing loss aversion even though there are no explicit losses anywhere in the task — that’s very strange and something you really wouldn’t expect.”

To further test their hypothesis, Chib and his colleagues decided to measure how loss-averse each participant was. They had the participants play a coin-flip game in which there was an equal chance they could win or lose varying amounts of money.

Each participant was offered varying potential win-loss amounts ($20-$20, $20-$10, $20-$5, for example), and then given the opportunity to either accept each possible gamble or decline it. The win-loss ratio at which the subjects chose to take the gamble provided a measure of how loss-averse each person was; someone willing to gamble even when they might win or lose $20 is less loss-averse than someone who is only willing to gamble if they can win $20 but only lose $5.
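The win-loss ratio at indifference maps onto the standard loss-aversion coefficient λ, the factor by which losses loom larger than gains. A toy sketch under a deliberately simplified linear value rule (an illustration of the concept, not the authors' actual analysis):

```python
def loss_aversion_coefficient(win, loss):
    """Under a linear value function v(x) = x for gains and
    v(x) = -lam * |x| for losses, a 50/50 gamble is accepted when
    0.5 * win - 0.5 * lam * loss >= 0, so indifference implies
    lam = win / loss."""
    return win / loss

# Indifference at win $20 / lose $20 implies lam = 1 (no loss
# aversion); needing win $20 vs. lose $5 to gamble implies lam = 4.
print(loss_aversion_coefficient(20, 20))  # 1.0
print(loss_aversion_coefficient(20, 5))   # 4.0
```

The larger the λ implied by a participant's choices, the lower the incentive level at which their performance peaked in the motor task.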

Once the numbers had been crunched and compared to the original experiment, it turned out that the more loss-averse a participant was, the worse they did on the task when the stakes were high. And for a particularly loss-averse person, the threshold at which their performance started to decline did not have to be very high. “If you’re more loss-averse, it really hurts you,” Chib says. “You’re going to reach peak performance at a lower incentive level, and your performance is also going to be worse for higher incentives.”

"Previously, it’s been shown that the ventral striatum is involved in mediating performance increases in response to rising incentives," says John O’Doherty, professor of psychology and coauthor of the paper. "But our study shows that changes in activity in this same region can, under certain situations, also lead to worsening performance."

While this study only involved a specific motor task and financial incentives, these results may well be universal, says Shinsuke Shimojo, the Gertrude Baltimore Professor of Experimental Psychology and another coauthor of the study. “The implications and applications can include any sort of decision making that contains high stakes and uncertainties, such as business and politics.”

These findings, the researchers say, might be used to develop new ways to motivate people to perform better or to train them to be less loss-averse. “This loss aversion can be an important way of deciding how to set up incentive mechanisms and how to figure out who’s going to perform well and who isn’t,” Chib says. “If you can train somebody to be less loss-averse, maybe you can help them avoid performing poorly in stressful situations.”

Source: Science Daily

May 10, 2012
#science #neuroscience #brain #psychology
Response to first drug treatment may signal likelihood of future seizures in people with epilepsy

May 9, 2012

How well people with newly diagnosed epilepsy respond to their first drug treatment may signal the likelihood that they will continue to have more seizures, according to a study published in the May 9, 2012, online issue of Neurology, the medical journal of the American Academy of Neurology.

"Our research shows a pattern based on how a person responds to initial treatment and specifically, to their first two courses of drug treatment," said study author Patrick Kwan, MD, PhD, with the University of Melbourne in Australia.

For the study, 1,098 people from Scotland between the ages of 9 and 93 with newly diagnosed epilepsy were followed for as long as 26 years after being given their first drug therapy. Participants were considered seizure-free if they had no seizures for at least a year without changes in their treatment. If they had further seizures, a second drug was chosen to be given alone or to be added to the first. If seizures continued, a third drug regimen was selected, and the process continued for up to nine drug regimens.

The study found that 50 percent of the people were seizure-free after the first drug tried, 13 percent were seizure-free after the second drug regimen tried and 4 percent were seizure-free after the third drug regimen tried. Less than two percent of the participants stopped having seizures on additional drug treatment courses up to the seventh one tried, and none became seizure-free after that.
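The per-regimen figures above trace a rapidly flattening cumulative curve. A minimal sketch of that arithmetic (the first three percentages are from the study; pooling regimens four through seven into a single 2 percent remainder is our simplifying assumption for illustration):

```python
# Percent of patients newly seizure-free on each successive drug
# regimen: 50% (first drug), 13% (second), 4% (third), and an
# assumed 2% pooled across the remaining regimens tried.
newly_free = [50, 13, 4, 2]

cumulative = []
total = 0
for pct in newly_free:
    total += pct
    cumulative.append(total)

print(cumulative)  # [50, 63, 67, 69]
```

The running total lands near the 68 percent of participants reported seizure-free at the end of the study, and shows how little each regimen adds after the second.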

The research also found that 37 percent of people in the study became seizure-free within six months of treatment. Another 22 percent became seizure-free after more than six months of starting treatment. Both groups continued to be seizure-free. However, 16 percent had fluctuating periods of seizure freedom and relapses, and 25 percent were never seizure-free for one year.

At the end of the study, 749 people (68 percent) were seizure-free and 678 people (62 percent) were on only one drug. The results were independent of the age when the person had the first seizure or the type of epilepsy.

"A person who doesn’t respond well to two courses of epilepsy drug treatment should be further evaluated to verify an epilepsy diagnosis and to identify whether surgery is the best next step," said Patricia E. Penovich, MD, with the Minnesota Epilepsy Group PA and the University of Minnesota School of Medicine in St. Paul, Minn., and a Fellow with the American Academy of Neurology, who wrote an accompanying editorial on the study.

Provided by American Academy of Neurology

Source: medicalxpress.com

May 9, 2012
#science #neuroscience #brain #psychology
The music of the (hemi)spheres sheds new light on schizophrenia

May 9, 2012

In 1619, the pioneering astronomer Johannes Kepler published Harmonices Mundi, in which he analyzed data on the movement of planets and asserted that the laws of nature governing planetary motion show features of harmonic relationships in music. In so doing, Kepler provided important support for the then-controversial model of the universe proposed by Copernicus.

In the latest issue of Biological Psychiatry, researchers at the University of California, San Diego suggest that careful analyses of the electrical signals of brain activity, measured using electroencephalography (EEG), may reveal important harmonic relationships in the electrical activity of brain circuits.

The underlying premise is a simple one - that brain function is expressed by circuits that fire, and therefore generate oscillating EEG signals, at different frequencies.

High frequency EEG activity called gamma, for example, might reflect the activity of fast-spiking cells which are often a subclass of inhibitory nerve cells containing parvalbumin. Represented musically, this would be a high pitch, i.e., toward the right side of the piano.

Lower frequency EEG activity, called theta, might come from cells that fire with a lower frequency.

As circuits interact with each other, one would see different “musical combinations”, like the chords of music, emerging in the EEG signal. Abnormalities in the structure and function of brain circuits would be reflected in cacophonous music, chords where the musical “voices” are firing at the wrong rate (pitch), volume (amplitude), or timing.

It is increasingly evident that schizophrenia is a disorder characterized by disturbances in the “music of the brain hemispheres.” This new report describes relationships between low- and high-frequency EEG oscillations in the human brain produced when high frequency auditory stimuli are presented to a research subject. The authors observed relatively slower oscillations and reduced cross-phase synchrony (for example, peak of theta coinciding with peak of gamma) in schizophrenia patients compared to healthy study participants.
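The "peak coinciding with peak" relationship described above is commonly quantified as an n:m phase-locking value between a slow and a fast rhythm. A toy sketch with simulated phases (not the authors' analysis pipeline; the 6 Hz theta and 42 Hz gamma frequencies and the jitter level are our illustrative assumptions):

```python
import cmath
import math
import random

def nm_plv(slow_phase, fast_phase, n, m):
    """n:m phase-locking value: 1.0 means the two rhythms hold a
    fixed phase relationship; values near 0 mean no coupling."""
    terms = [cmath.exp(1j * (m * s - n * f))
             for s, f in zip(slow_phase, fast_phase)]
    return abs(sum(terms) / len(terms))

random.seed(0)
t = [i / 1000.0 for i in range(2000)]            # 2 s sampled at 1 kHz
theta = [2 * math.pi * 6 * ti for ti in t]       # 6 Hz theta phase
locked = [7 * th for th in theta]                # 42 Hz gamma, locked 7:1
jittered = [g + random.gauss(0, 2.0) for g in locked]  # desynchronized

print(round(nm_plv(theta, locked, 1, 7), 3))     # 1.0
print(nm_plv(theta, jittered, 1, 7) < 0.5)       # True
```

Reduced cross-frequency synchrony of the kind reported in the patients would show up as a lower phase-locking value, like the jittered case here.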

Dr. John Krystal, Editor of Biological Psychiatry, commented, “The new findings highlight the importance of understanding the relationships between different circuits. It seems that cortical abnormalities in schizophrenia disturb brain function, in part, by disturbing the ‘tuning’ of brain circuits in relation to each other.”

Provided by Elsevier

Source: medicalxpress.com

May 9, 2012
#science #neuroscience #brain #psychology
Researchers Discover a New Family of Key Mitochondrial Proteins for the Function and Variability of the Brain

May 9th, 2012

This family comprises a cluster of six genes that may be altered in neurological conditions, such as Parkinson’s and Charcot-Marie-Tooth disease.

A team headed by Eduardo Soriano at the Institute for Research in Biomedicine (IRB Barcelona) has published a study in Nature Communications describing a new family of six genes whose function regulates the movement and position of mitochondria in neurons. Many neurological conditions, including Parkinson’s and various types of Charcot-Marie-Tooth disease, are caused by alterations of genes that control mitochondrial transport, a process that provides the energy required for cell function.

“We have identified a set of new genes that are highly expressed in the nervous system and have a specific function in a biological process that is crucial for the activity and viability of the nervous system”, explains Eduardo Soriano, head of the Neurobiology and Cell Regeneration group at IRB Barcelona and full professor at the University of Barcelona (UB).

By means of comparative genomic analyses, the scientists have discovered that these genes are found only in more evolved mammals, the so-called Eutheria, which are characterized by internal fertilization and development. “This finding indicates the relevance of mitochondrial biology. When the brain evolved in size, function and structure, the mitochondrial transport process also became more complex and probably required additional regulatory mechanisms”, says Soriano. “Likewise, given the origin of the gene cluster in the transition between primitive mammals, such as marsupials (kangaroos), and the remaining placental mammals, it is tempting to propose that the cluster is linked to the increased complexity of the cerebral cortex in the lineage that leads to humans”, adds UB full professor Jordi Garcia-Fernàndez, a collaborator in the study.

In the image, red indicates the localization of mitochondria in a neuron. The new proteins described help to regulate their positions in the cell. Image adapted from IRB Barcelona press release image.

Correct brain function is highly energy-demanding, and this energy must be finely distributed throughout neurons, cells whose ramifications can reach up to tens of centimetres in length, from the brain to the limbs. This cluster of genes forms part of the “wheel” machinery of mitochondria and regulates their localization within each cell on the basis of its energy requirements. “These genes would be like an extra control in cellular mitochondrial trafficking and they interact with the major proteins associated with the regulation of mitochondrial transport”, explains Soriano.

Another striking characteristic of these new proteins is that they are found both in mitochondria, the function of which has already been described, and in the cell nucleus, where their function is unknown. “They may also be involved in the regulation of gene expression, a possibility that we are now studying”. In addition to their potential involvement in brain pathologies, the researchers believe that these proteins may be related to metabolic diseases and cancer.

Source: Neuroscience News

May 9, 2012
#science #neuroscience #brain #psychology
Virtual reality allows researchers to measure brain activity during behavior at unprecedented resolution

May 9, 2012

Researchers have developed a new technique which allows them to measure brain activity in large populations of nerve cells at the resolution of individual cells. The technique, reported today in the journal Nature, has been developed in zebrafish to represent a simplified model of how brain regions work together to flexibly control behaviour.

Our thoughts and actions are the product of large populations of nerve cells, called neurons, working in harmony, often millions at a time. Measuring brain activity during behaviour at detailed resolution in these groups of cells has proved extremely challenging. Currently, scientists are restricted to measuring activity in individual brain areas of, for example, moving rats, typically in fewer than a few hundred neurons.

Dr Misha Ahrens, a Sir Henry Wellcome Postdoctoral Fellow based at Harvard University and the University of Cambridge, worked with colleagues to develop a technique which allows neuroscientists to study as many as 2,000 neurons simultaneously, anywhere in the brain of a transparent zebrafish. Their work was funded by the Wellcome Trust and the National Institutes of Health.

Dr Ahrens and colleagues created a virtual environment for zebrafish, which allowed them to measure activity in the neurons as the fish ‘moved’. In reality, the zebrafish was paralysed so that the researchers could image its brain; the fish ‘moved’ through the virtual environment by activating its motor neuron axons, the cells responsible for generating movement.

Zebrafish are often used as a simple organism to study genetics and characteristics of the nervous system that are conserved in humans. They are genetically modifiable, so by manipulating the fish’s genetic make-up, Dr Ahrens and colleagues created a fish in which all neurons contained a particular protein that increases its fluorescence when the cells are active. Because the fish are transparent, the team were able to use a laser-scanning microscope to see activity in any neuron in the brain of the fish, and up to 2,000 neurons simultaneously.

Dr Ahrens explains: “Our behaviour is determined by thousands, possibly millions, of nerve cells working in harmony. The zebrafish performs complex behaviors, with a brain of about 100,000 neurons, almost all of which are accessible to optical recording of neural activity. Our new technique will help us examine how large networks mediate behaviour, while at the same time telling us what each individual cell is doing.”

Using the technique, Dr Ahrens and colleagues asked the question: do zebrafish adapt their behaviour in response to changes in their environment? To do this, they manipulated the virtual environment to simulate the fish suddenly becoming more “muscular”. This served as a simplified version of what happens when the brain needs to adapt the way it drives behaviour, for example when water temperature changes the efficacy of the muscles, or when the fish gets injured.

Dr Ahrens adds: “The paralyzed fish in the virtual world do indeed adapt their behaviour, by adjusting the amount of impulses the brain sends to the muscles. They also ‘remember’ this change for a while. Imaging the brain everywhere during this behaviour, we identified certain brain regions that were involved, most notably the cerebellum and related structures. This technique opens the possibility that eventually, the behaviour may be used to gain insights into human motor control and motor control deficits.

"Our own motor control is continuously recalibrating itself in a similar way to the fish’s to cope with ever changing conditions of our body and environment, such as when we injure a leg, or if we’re walking on a slippery floor or carrying a heavy bag. The zebrafish’s behaviour is an ultra-simplified version of this and we have been able to gain some insight into how its brain structures drive behaviour. This might someday help us understand how damage to certain brain regions in humans affects the way in which the brain integrates sensory information to control body movements."

Understanding the brain is one of the Wellcome Trust’s five strategic challenges.

Provided by Wellcome Trust

Source: medicalxpress.com

May 9, 2012
#science #neuroscience #brain #psychology
Reduction of excess brain activity improves memory in amnestic mild cognitive impairment

May 9, 2012

Research published in the May 10 issue of the journal Neuron describes a potential new therapeutic approach for improving memory and modifying disease progression in patients with amnestic mild cognitive impairment. The study finds that excess brain activity may be doing more harm than good in some conditions that cause mild cognitive decline and memory impairment.

Elevated activity in specific parts of the hippocampus, a brain region involved in memory, is often seen in disorders associated with an increased risk for Alzheimer’s disease. Amnestic mild cognitive impairment (aMCI), where memory is worse than would be expected for a person’s age, is one such disorder. “In the case of early aMCI, it has been suggested that the increased hippocampal activation may serve a beneficial function by recruiting additional neural resources to compensate for those that are lost,” explains senior study author, Dr. Michela Gallagher, from Johns Hopkins University. “However, animal studies have raised the alternative view that this excess activation may be contributing to memory impairment.”

Dr. Gallagher and colleagues tested how a reduction of hippocampal activity would affect human patients with aMCI. The researchers used a low dose of a drug prescribed clinically to treat epilepsy, with the aim of reducing hippocampal activity in subjects with aMCI to levels similar to those seen in healthy, age-matched control subjects. They found that treatment with the drug improved performance on a memory task. These findings point to the therapeutic potential of reducing excess hippocampal activation in aMCI.

The results also have broader significance as elevated activity in the hippocampus is also observed in other conditions that are thought to precede Alzheimer’s disease, and may be one of the underlying mechanisms of neurodegeneration. “Apart from a direct role in memory impairment, there is concern that elevated activity in vulnerable neural networks could be causing additional damage and, possibly, widespread disease-related degeneration that underlies cognitive decline and the conversion to Alzheimer’s disease,” concludes Dr. Gallagher. “Therefore, reducing the elevated activity in the hippocampus may help to restore memory and protect the brain.”

Provided by Cell Press

More information: Bakker et al., “Reduction of hippocampal hyperactivity improves cognition in amnestic mild cognitive impairment,” Neuron, DOI: 10.1016/j.neuron.2012.03.023

Source: medicalxpress.com

May 9, 2012
#science #neuroscience #brain #psychology #memory
Babies’ Brains Benefit From Music Lessons

Released: 5/9/2012 11:20 AM EDT

Newswise — After completing the first study of its kind, researchers at McMaster University have discovered that very early musical training benefits children even before they can walk or talk.

They found that one-year-old babies who participate in interactive music classes with their parents smile more, communicate better and show earlier and more sophisticated brain responses to music.

The findings were published recently in the scientific journals Developmental Science and Annals of the New York Academy of Sciences.

“Many past studies of musical training have focused on older children,” says Laurel Trainor, director of the McMaster Institute for Music and the Mind. “Our results suggest that the infant brain might be particularly plastic with regard to musical exposure.”

Trainor, together with David Gerry, a music educator and graduate student, received an award from the Grammy Foundation in 2008 to study the effects of musical training in infancy. In the recent study, groups of babies and their parents spent six months participating in one of two types of weekly music instruction.

One music class involved interactive music-making and learning a small set of lullabies, nursery rhymes and songs with actions. Parents and infants worked together to learn to play percussion instruments, take turns and sing specific songs.

In the other music class, infants and parents played at various toy stations while recordings from the popular Baby Einstein series played in the background.

Before the classes began, all the babies had shown similar communication and social development and none had previously participated in other baby music classes.

“Babies who participated in the interactive music classes with their parents showed earlier sensitivity to the pitch structure in music,” says Trainor. “Specifically, they preferred to listen to a version of a piano piece that stayed in key, versus a version that included out-of-key notes. Infants who participated in the passive listening classes did not show the same preferences. Even their brains responded to music differently. Infants from the interactive music classes showed larger and/or earlier brain responses to musical tones.”

The non-musical differences between the two groups of babies were even more surprising, say researchers.

Babies from the interactive classes showed better early communication skills, like pointing at objects that are out of reach, or waving goodbye. Socially, these babies also smiled more, were easier to soothe, and showed less distress when things were unfamiliar or didn’t go their way.

While both class types included listening to music and all the infants heard a similar amount of music at home, a big difference between the classes was the interactive exposure to music.

“There are many ways that parents can connect with their babies,” says study coordinator Andrea Unrau. “The great thing about music is, everyone loves it and everyone can learn simple interactive musical games together.”

Source: newswise

May 9, 2012
#science #brain #neuroscience #psychology
Cellist Achieves Optimal Performance Through Neurofeedback

Released: 5/9/2012 11:00 AM EDT 

Newswise — “Practice makes perfect,” the saying goes. Optimal performance, however, can require more than talent, effort, and repetition. Training the brain to reduce stress through neurofeedback can remove barriers and enhance one’s innate abilities.

An article in the journal Biofeedback presents the narrative of a young cellist who was able to realize the potential of his talent and eliminate debilitating migraine headaches. This case study is part of a special section in the Spring 2012 issue focusing on optimal functioning.

Enhancing people’s performance in business, performing and visual arts, academia, and sports can be realized through biofeedback and neurofeedback training. Tools of stress reduction, mental imagery training, psychology, and psycho-physiological technology are combined to help people reach their goals.

The author and practitioner in this case study has combined her work and study in the fields of theater, social work, and neurofeedback. In her practice, she coaches clients to achieve outstanding performances. For example, a singer can better understand and interpret a musical selection, allowing that singer to better convey the emotion of the music, resulting in a noticeably improved performance.

William, the young musician, sought relief from migraine headaches that were affecting him almost daily. His therapy, however, did not take the approach of treating the headaches, but of focusing on William as a person and as a performer. By improving his functionality, working through moments of obsessiveness, self-criticism, fear, and anxiety, the headaches could also be resolved.

William’s therapist conducted neurofeedback — using sensors to read his brainwaves, analyzing these with NeuroOptimal™ software, and then giving feedback to the brain through a visual display and sound. With this information, the brain can learn to self-correct. This technology assists in getting people past that moment when they obsess over whether they have given the correct answer or hit the right note.

NeuroOptimal feedback, guided imagery, and coaching about decisions regarding his music helped William move beyond the difficulties he encountered. During his senior recital at his college, he was able to give a relaxed, confident performance that was met with a standing ovation.

Full text of the article, “William’s Story: A Case Study in Optimal Performance,” Biofeedback, Volume 40, Issue 1, Spring 2012, is available at http://www.aapb-biofeedback.com/

Source: newswise

May 9, 2012
#science #neuroscience #brain #psychology
Can new diagnostic approaches help assess brain function in unconscious, brain-injured patients?

May 9, 2012

Disorders of consciousness such as coma or a vegetative state caused by severe brain injury are poorly understood and their diagnosis has relied mainly on patient responses and measures of brain activity. However, new functional and imaging-based diagnostic tests that measure communication and signaling between different brain regions may provide valuable information about the potential for consciousness in patients unable to communicate. These innovative approaches are described and compared in a Review article in the groundbreaking neuroscience journal Brain Connectivity.

Brain Connectivity is the journal of record for researchers and clinicians interested in all aspects of brain connectivity. Credit: ©2012 Mary Ann Liebert, Inc., publishers

Mélanie Boly and coauthors from the University of Liège (Belgium), University of Milan (Italy), and University College London (UK) compare the benefits and limitations of three methods for studying the dynamics of brain communication and connectivity in response to internal and external stimulation: functional magnetic resonance imaging (fMRI); transcranial magnetic stimulation (TMS) combined with electroencephalography (EEG); and response to neuronal perturbation, measuring, for example, sensory evoked potentials (ERP). They report their findings and propose future research directions in the article “Brain Connectivity in Disorders of Consciousness.”

"In recent years, there has been a tremendous interest in gaining a better understanding of the various disorders of consciousness. A variety of methods including fMRI and PET have been used to study these disorders," says Bharat Biswal, PhD, Co-Editor-in-Chief of Brain Connectivity and Associate Professor, University of Medicine and Dentistry of New Jersey. “This article provides a comprehensive analysis using three new and innovative methods to study disorders of consciousness.”

More information: The article is available free on the Brain Connectivity website at http://online.liebertpub.com/doi/full/10.1089/brain.2011.0049

Provided by Mary Ann Liebert, Inc.

Source: medicalxpress.com

May 9, 2012
#science #neuroscience #brain #psychology #consciousness
Computer Scientists Show What Makes Movie Lines Memorable

ScienceDaily (May 8, 2012) — Whether it’s a line from a movie, an advertising slogan or a politician’s catchphrase, some statements take hold in people’s minds better than others. But why?

Cornell researchers who applied computer analysis to a database of movie scripts think they may have found the secret of what makes a line memorable.

The study suggests that memorable lines use familiar sentence structure but incorporate distinctive words or phrases, and they make general statements that could apply elsewhere. The latter may explain why lines such as, “You’re gonna need a bigger boat” or “These aren’t the droids you’re looking for” (accompanied by a hand gesture) have become standing jokes. You can use them in a different context and apply the line to your own situation.

While the analysis was based on movie quotes, it could have applications in marketing, politics, entertainment and social media, the researchers said.

"Using movie scripts allowed us to study just the language, without other factors. We needed a way of asking a question just about the language, and the movies make a very nice dataset," said graduate student Cristian Danescu-Niculescu-Mizil, first author of a paper to be presented at the 50th Annual Meeting of the Association for Computational Linguistics July 8-14 in Jeju, South Korea.

The study grows out of ongoing work on how ideas travel across networks.

"We’ve been looking at things like who talks to whom," said Jon Kleinberg, a professor of computer science who worked on the study, "but we hadn’t explored how the language in which an idea was presented might have an effect."

To address that, they collaborated with Lillian Lee, a professor of computer science who specializes in computer processing of natural human language.

They obtained scripts from about 1,000 movies, and a database of memorable quotes from those movies from the Internet Movie Database. Each quote was paired with another from the movie’s script, spoken by the same character in the same scene and about the same length, to eliminate every factor except the language itself. Obi-Wan Kenobi, for example, also said, “You don’t need to see his identification,” but you don’t hear that a lot.

They asked a group of people who had not seen the movies to choose which quote in the pairs was most memorable. Two patterns emerged to identify the memorable choice: distinctiveness and generality.

Then the researchers programmed a computer with linguistic rules reflecting these concepts. A line will be less general if it contains third-person pronouns and definite articles (which refer to people, objects or events in the scene) and uses past tense (usually referring to something that happened previously in the story). Distinctive language can be identified by comparison with a database of news stories. The computer was able to choose the memorable quote an average of 64 percent of the time.
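To make the two patterns concrete, here is a toy scorer in the spirit of the rules the study describes: it penalizes scene-specific markers (third-person pronouns, definite articles, a crude past-tense check) and rewards words that fall outside a small common-word background list. The word lists and the weight are invented for illustration only; this is not the Cornell model, which compared language against a large news corpus.

```python
import re

# Toy memorable-quote scorer inspired by the generality and distinctiveness
# patterns from the article. All word lists and weights below are
# illustrative assumptions, not the researchers' actual model.

THIRD_PERSON = {"he", "she", "him", "her", "his", "hers", "they", "them", "their"}
DEFINITE = {"the", "this", "that", "these", "those"}
# Tiny stand-in for a background corpus of common words.
COMMON_WORDS = {"you", "a", "to", "need", "see", "don't", "the", "his",
                "is", "are", "not", "what"}

def tokens(quote):
    return re.findall(r"[a-z']+", quote.lower())

def memorability_score(quote):
    words = tokens(quote)
    # Generality: penalize scene-specific markers (third-person pronouns,
    # definite articles, and a crude past-tense check on -ed endings).
    specific = sum(w in THIRD_PERSON or w in DEFINITE
                   or (w.endswith("ed") and len(w) > 4)
                   for w in words)
    # Distinctiveness: reward words outside the common-word background.
    distinctive = sum(w not in COMMON_WORDS for w in words)
    return distinctive - 2 * specific

def more_memorable(a, b):
    """Pick the quote the toy model rates as more memorable."""
    return a if memorability_score(a) >= memorability_score(b) else b
```

Applied to the article's Obi-Wan pair, even this crude sketch prefers “These aren’t the droids you’re looking for” over “You don’t need to see his identification”, in line with the human judgments the study collected.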

Later analysis also found subtle differences in sound and word choice: Memorable quotes use more sounds made in the front of the mouth, words with more syllables and fewer coordinating conjunctions.

In a further test, the researchers found that the same rules applied to popular advertising slogans.

Although teaching a computer how to write memorable dialogue is probably a long way off, applications might be developed to monitor the work of human writers and evaluate it in progress, Kleinberg suggested.

The researchers have set up a website where you can test your skill at identifying memorable movie quotes, and perhaps contribute some data to the research, at www.cs.cornell.edu/~cristian/memorability.html

Source: Science Daily

May 9, 2012
#science #neuroscience #memory #psychology #brain
Future Treatment for Nearsightedness — Compact Fluorescent Light Bulbs?

ScienceDaily (May 8, 2012) — Researchers at the University of Alabama at Birmingham hope to one day use fluorescent light bulbs to slow nearsightedness, which affects 40 percent of American adults and can cause blindness.

In an early step in that direction, results of a study found that small increases in daily artificial light slowed the development of nearsightedness by 40 percent in tree shrews, which are close relatives of primates.

The team, led by Thomas Norton, Ph.D., professor in the UAB Department of Vision Sciences, presented the study results May 8 at the 2012 Association for Research in Vision and Ophthalmology annual meeting in Ft. Lauderdale.

People can see clearly because the front part of the eye bends light and focuses it on the retina in back. Nearsightedness, also called myopia, occurs when the physical length of the eye is too long, causing light to focus in front of the retina and blurring images.

Myopia has many causes, some related to inheritance and some to the environment. Research in recent years had, for instance, suggested that children who spent more time outdoors, presumably in brighter outdoor light, had less myopia as young adults. That raised the question of whether artificial light, like sunlight, could help reduce myopia development, without the risks of prolonged sun exposure, such as skin cancer and cataracts.

"Our hope is to develop programs that reduce the rate of myopia using energy efficient, fluorescent lights for a few hours each day in homes or classrooms," said John Siegwart, Ph.D., research assistant professor in UAB Vision Sciences and co-author of the study. "Trying to prevent myopia by fixing defective genes through gene therapy or using a drug is a multi-year, multimillion-dollar effort with no guarantee of success. We hope to make a difference just with light bulbs."

Sorting through theories

Work over 25 years had shown that putting a goggle over one eye of a study animal, one that lets in light but blurs images, causes the eye to grow too long, which in turn causes myopia. Other past studies had shown that elevated light levels could reduce myopia under these conditions, whether the light was produced by halogen lamps, metal halide bulbs or daylight. The current study is the first to show that the development of myopia can be slowed by increasing daily fluorescent light levels.

One prevailing theory on myopia-related shape changes in the eye is that they are caused by the blurriness of images experienced while reading or doing other near-work tasks. Another holds that some people develop myopia because they have low levels of vitamin D, which rises with exposure to sunlight and could explain the connection between outdoor light and reduced myopia. A third theory, one reinforced by the current results, is that bright light causes an increase in levels of dopamine, a signaling molecule in the retina.

To test the theories, the team used a goggle that lets in light but no images to produce myopia in one eye of each tree shrew. They found that a group exposed to elevated fluorescent light levels for eight hours per day developed 47 percent less myopia than a control group exposed to normal indoor lighting, even though the images were neither more nor less blurry. They also found that animals fed vitamin D supplements developed myopia just like ones without the supplement. Given these results, the team is now experimenting with light levels and treatment times to see if a short, bright light treatment could be effective. They have also begun studies looking at the effect of elevated light on retinal dopamine levels as it relates to the reduction of myopia.

"If we can find the best kind of light, treatment period and light level, we’ll have the scientific justification to begin studies raising light levels in schools, for instance," said Norton. "Compact fluorescent bulbs use much less electricity than standard light bulbs, and future programs raising light levels will have more impact the less expensive they are."

Source: Science Daily

May 9, 2012
#science #neuroscience #vision
‘Blindness’ May Rapidly Enhance Other Senses

ScienceDaily (May 8, 2012) — Can blindness or other forms of visual deprivation really enhance our other senses such as hearing or touch? While this theory is widely regarded as being true, there are still many questions about the science behind it.

New findings from a Canadian research team investigating this link suggest that not only is there a real connection between vision and other senses, but that connection is important to better understand the underlying mechanisms that can quickly trigger sensory changes. This may demystify the true potential of human adaptation and, ultimately, help develop innovative and effective methods for rehabilitation following sensory loss or injury.

François Champoux, director of the University of Montreal’s Laboratory of Auditory Neuroscience Research, will present his team’s research and findings at the Acoustics 2012 meeting in Hong Kong, May 13-18, a joint meeting of the Acoustical Society of America (ASA), Acoustical Society of China, Western Pacific Acoustics Conference, and the Hong Kong Institute of Acoustics.

Studies have shown that, in terms of hearing, blind people are better at localizing sound. One study even suggested that blindness might improve the ability to differentiate between sound frequencies. “The supposedly enhanced tactile abilities have been studied to a greater degree and can be seen as early as days or even minutes following blindness,” says Champoux. “Such a rapid change in auditory ability hasn’t yet been clearly demonstrated.”

Two big questions about blindness and enhanced abilities remain unanswered: Can blindness improve more complex auditory abilities and, if so, can these changes be triggered after only a few minutes of visual deprivation, similar to those seen with tactile abilities?

"When we speak or play a musical instrument, the sounds have specific harmonic relations. In other words, if we play a certain note on a piano, that note has many related ‘layers.’ However, we don’t hear all of these layers because our brain simply associates them all together and we only hear the lowest one," Champoux explains.

It’s through this complex computation based on specific components of the sound that the brain can interpret and distinguish auditory signals coming from different people or instruments. The ability to identify harmonicity — the harmonic relation between sounds — is one of the most powerful factors involved in interpreting our auditory surroundings.

"Harmonicity can easily be evaluated using a simple task in which similar harmonic layers are set up and one of them is gradually modified until the individual notices two layers instead of one," says Champoux. "In our study, healthy individuals completed such a task while blindfolded. This task was administered twice, separated by a 90-minute interval during which the participants conversed with the experimenter in a quiet room. Half of the participants kept the blindfold on during the interval period, depriving them of all visual input, while the other half removed their blindfolds."

They found no significant differences between the two groups in their ability to differentiate harmonicity prior to visual deprivation. However, the results of the testing session following visual deprivation revealed that visually deprived individuals performed significantly better than the group that took their blindfolds off.

"Regardless of the neural basis for such an enhancement, our results suggest that the potential for change in auditory perception is much greater than previously assumed," Champoux notes.

Source: Science Daily

May 9, 2012
#science #neuroscience #brain #psychology
The Risk of Listening to Amplified Music

ScienceDaily (May 8, 2012) — Listening to amplified music for less than 1.5 hours produces measurable changes in hearing ability that may place listeners at risk of noise-induced hearing loss, new research shows. While further research is needed to firmly establish this risk, the investigation is significant because it provides the first acoustical data for a new method to assess the potential harm from a widespread cultural behavior: “leisure listening” to amplified music, whether in live environments or through headphones.

A team of Danish acoustics researchers present the results of their preliminary study at the Acoustics 2012 meeting in Hong Kong, May 13-18, a joint meeting of the Acoustical Society of America (ASA), Acoustical Society of China, Western Pacific Acoustics Conference, and the Hong Kong Institute of Acoustics. Their goal is to help develop recommendations for how sound engineers, musicians, event organizers, and the general public should safely enjoy amplified music so they are protected from hearing loss — just as workers are now protected by occupational health standards.

Explains Rodrigo Ordonez, Ph.D., lead scientist of the Danish team from Aalborg University’s Department of Electronic Systems: “Modern low-distortion, high-power loudspeaker systems and headphones make it easy for people to be exposed to potentially harmful sound levels at discotheques, concerts, or while using portable music players.”

He adds that in the realm of industrial noise and work-related sound exposures, decades of experience and personal tragedy (many workers lost hearing from factory conditions) have produced the hearing-damage risk criteria currently in use. Based on well-documented acoustical parameters, these criteria outline measurement procedures and expected impact on hearing.

"Yet when it comes to musical sound exposure — and in particular, amplified music — it is not known if the same measures used for industrial noise will accurately describe the effects on hearing and the risk these behaviors pose," Dr. Ordonez says.

To investigate the potential health risk from amplified music, the team measured sounds known as “otoacoustic emissions” as an index of auditory function. These are sounds generated within the inner ear in response to sound stimuli, and they can be measured in the ear canals of people who have healthy hearing. Research shows that otoacoustic emissions disappear when the inner ear is damaged. In this study, the researchers measured otoacoustic emissions to gauge changes in hearing ability before and after exposure to amplified music, testing this method in a live concert environment. Comparing how these two sets of measures change after a sound exposure with the acoustical parameters of the amplified music can lead to a better understanding of how our hearing is affected.

Results revealed two main findings: One is that it is possible to measure changes in hearing after exposures of relatively short duration, less than 1.5 hours. The second is that there are noticeable individual differences in sound exposure levels, as well as in the changes on otoacoustic emissions produced by similar exposure conditions.

Next steps in the team’s work include refining their measurement methods and describing the biophysical effects and mechanics that music sound levels have on individuals. Ultimately they hope to provide data and a scientific rationale on which to establish damage risk criteria for music sound exposure.

Source: Science Daily

May 9, 2012
#science #neuroscience #brain #psychology
Scientists Tuning in to How You Tune out Noise

ScienceDaily (May 8, 2012) — Although we have little awareness that we are doing it, we spend most of our lives filtering out many of the sounds that permeate our lives and acutely focusing on others — a phenomenon known as auditory selective attention. In research that could some day lead to the development of improved devices allowing users to control things like wheelchairs through thought alone, hearing scientists at the University of Washington (UW) are attempting to tease apart the process.

The work will be presented at the Acoustics 2012 meeting in Hong Kong, May 13-18, a joint meeting of the Acoustical Society of America (ASA), Acoustical Society of China, Western Pacific Acoustics Conference, and the Hong Kong Institute of Acoustics.

Auditory selective attention is extremely important in everyday life, notes UW postdoctoral researcher Ross Maddox. “In situations as mundane as ordering your morning cup of coffee, you must focus on the barista while tuning out the loud hiss of the espresso machine and the annoying cell phone conversation happening in line right behind you,” says Maddox. “However, the mechanisms behind selective attention are still not well understood.” In addition, some individuals suffer from Central Auditory Processing Disorder (CAPD), “which means they have normal hearing when tested by an audiologist,” he says, “but they are completely lost in loud settings like restaurants and airports.”

To determine how auditory selective attention works — and perhaps how it fails in people with CAPD — Maddox, along with Adrian K.C. Lee, an assistant professor of speech and hearing sciences, and colleague Willy Cheung, created laboratory situations that promoted the breakdown of the process. The researchers had 10 subjects try to focus their attention on just one target sound — a continuously repeating utterance of a single letter — among a total of 4, 6, 8, or 12 such sounds. The subjects had to determine when an “oddball” item (the letter “R,” chosen because it doesn’t rhyme with any other letter) was inserted into the target sound stream.
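The trial structure described above can be sketched in a few lines. The stream count, stream length, and random seeding below are illustrative assumptions; only the design (several repeating-letter streams, with the oddball “R” inserted into one target stream) comes from the article.

```python
import random

# Toy reconstruction of the UW trial structure: each stream repeats a
# single letter, and the oddball "R" (which rhymes with no other letter)
# is inserted once into the target stream. Parameter values are
# illustrative assumptions.

LETTERS = [c for c in "ABCDEFGHIJKLMNOPQSTUVWXYZ"]  # "R" reserved as oddball

def make_trial(n_streams=4, stream_len=12, rng=None):
    """Return (streams, target_index, oddball_position)."""
    rng = rng or random.Random(0)
    chosen = rng.sample(LETTERS, n_streams)  # a distinct letter per stream
    streams = [[c] * stream_len for c in chosen]
    target = rng.randrange(n_streams)
    pos = rng.randrange(1, stream_len)  # oddball never at the very start
    streams[target][pos] = "R"
    return streams, target, pos
```

A listener's task would then be to monitor the target stream and report when the “R” occurs; scaling `n_streams` from 4 up to 12 reproduces the difficulty manipulation in the study.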

"Most studies systematically degrade sounds and measure the effects on listeners’ performance," Maddox explains. "Here, we made the target sound as easy to distinguish from all the other sounds present as possible, and tested the upper limit on the number of sounds a listener could tune out, given all these acoustical advantages."

Unsurprisingly, it is harder to tune in to just one stream when the number of streams increases. However, study subjects did better than expected — successfully identifying the target 70 percent of the time in the most difficult conditions. Repeating letters faster did make the task harder — although with faster repetition, listeners more quickly learn what the letter they’re listening to sounds like, “so there is a tradeoff involved when deciding on repetition speed,” Maddox says.

The work, Maddox and colleagues say, is a first step toward developing an auditory brain-computer interface (BCI) — a device that reads brain activity to allow users to control computers or machines such as wheelchairs. “We hope to create a system that presents a user with an auditory ‘menu’ of sounds — similar to the letter streams here — and allows the listener to make a choice by reading their brainwaves to determine which sound they are focusing on. The more sound streams a user is able to tune out, the more menu options we can present at a single time.”

Source: Science Daily

May 9, 2012
#science #neuroscience #psychology #brain
Gestures Fulfill a Big Role in Language

ScienceDaily (May 8, 2012) — People of all ages and cultures gesture while speaking, some much more noticeably than others. But is gesturing uniquely tied to speech, or is it, rather, processed by the brain like any other manual action?

Scientists have discovered that actual actions on objects, such as physically stirring a spoon in a cup, have less of an impact on the brain’s understanding of speech than simply gesturing as if stirring a spoon in a cup. (Credit: Image courtesy of Acoustical Society of America (ASA))

A U.S.-Netherlands research collaboration delving into this tie discovered that actual actions on objects, such as physically stirring a spoon in a cup, have less of an impact on the brain’s understanding of speech than simply gesturing as if stirring a spoon in a cup. This is surprising because there is less visual information contained in gestures than in actual actions on objects. In short: Less may actually be more when it comes to gestures and actions in terms of understanding language.

Spencer Kelly, associate professor of Psychology, director of the Neuroscience program, and co-director of the Center for Language and Brain at Colgate University, and colleagues from the National Institutes of Health and Max Planck Institute for Psycholinguistics will present their research at the Acoustics 2012 meeting in Hong Kong, May 13-18, a joint meeting of the Acoustical Society of America (ASA), Acoustical Society of China, Western Pacific Acoustics Conference, and the Hong Kong Institute of Acoustics.

Among their key findings is that gestures — more than actions — appear to make people pay attention to the acoustics of speech. When we see a gesture, our auditory system expects to also hear speech. But this is not what the researchers found in the case of manual actions on objects.

Just think of all the actions you’ve seen today that occurred in the absence of speech. “This special relationship is interesting because many scientists have argued that spoken language evolved from a gestural communication system — using the entire body — in our evolutionary past,” points out Kelly. “Our results provide a glimpse into this past relationship by showing that gestures still have a tight and perhaps special coupling with speech in present-day communication. In this way, gestures are not merely add-ons to language — they may actually be a fundamental part of it.”

A better understanding of the role hand gestures play in how people understand language could lead to new audio and visual instruction techniques to help people overcome major challenges with language delays and disorders or learning a second language.

What’s next for the researchers? “We’re interested in how other types of visual inputs, such as eye gaze, mouth movements, and facial expressions, combine with hand gestures to impact speech processing. This will allow us to develop even more natural and effective ways to help people understand and learn language,” says Kelly.

Source: Science Daily

May 9, 2012
#science #neuroscience #psychology #brain
Psychologists reveal how emotion can shut down high-level mental processes without our knowledge

May 8, 2012

Psychologists at Bangor University believe that they have glimpsed, for the first time, a process that takes place deep within our unconscious brain, where primal reactions interact with higher mental processes. Writing in the Journal of Neuroscience, they identify a reaction to negative language that shuts down unconscious processing.

For the last quarter of a century, psychologists have been aware of, and fascinated by, the fact that our brain can process high-level information such as meaning outside consciousness. What the psychologists at Bangor University have discovered is the reverse: that our brain can unconsciously ‘decide’ to withhold information by preventing access to certain forms of knowledge.

The psychologists extrapolate this from their most recent findings working with bilingual people. Building on their previous discovery that bilinguals subconsciously access their first language when reading in their second, the psychologists at the School of Psychology and Centre for Research on Bilingualism have now made the surprising discovery that the brain shuts down that same unconscious access to the native language when faced with a negative word such as war, discomfort, inconvenience or unfortunate.

They believe that this provides the first demonstrated insight into a hitherto unproven process by which our unconscious mind blocks information from our conscious mind or higher mental processes.

This finding breaks new ground in our understanding of the interaction between emotion and thought in the brain. Previous work on emotion and cognition has already shown that emotion affects basic brain functions such as attention, memory, vision and motor control, but never at such a high processing level as language and understanding.

Key to this is the understanding that people react more strongly to emotional words and phrases in their first language, which is why people often speak to their infants and children in their first language despite living in a country that speaks another language and despite fluency in the second. It has been recognised for some time that anger, swearing or discussing intimate feelings carries more power in a speaker’s native language. In other words, emotional information lacks the same power in a second language as in a native language.

Dr Yan Jing Wu of the University’s School of Psychology said: “We devised this experiment to unravel the unconscious interactions between the processing of emotional content and access to the native language system. We think we’ve identified, for the first time, the mechanism by which emotion controls fundamental thought processes outside consciousness.

"Perhaps this is a process that resembles the mental repression mechanism that people have theorised about but never previously located."

So why would the brain block access to the native language at an unconscious level?

Professor Guillaume Thierry explains: “We think this is a protective mechanism. We know that in trauma, for example, people behave very differently. Surface conscious processes are modulated by a deeper emotional system in the brain. Perhaps this brain mechanism spontaneously minimises the negative impact of disturbing emotional content on our thinking, to prevent it causing anxiety or mental discomfort.”

He continues: “We were extremely surprised by our finding. We were expecting to find modulation between the different words, and perhaps a heightened reaction to the emotional word, but what we found was the exact opposite of what we expected: a cancellation of the response to the negative words.”

The psychologists made this discovery by asking native Chinese speakers fluent in English whether pairs of English words were related in meaning. Some of the word pairs were related in their Chinese translations. Although the participants did not consciously acknowledge a relation, measurements of electrical activity in the brain revealed that they were unconsciously translating the words. Strikingly, however, this activity was not observed when the English words had a negative meaning.

Provided by Bangor University

Source: medicalxpress.com

May 9, 2012
#science #neuroscience #brain #psychology
Pavlov’s Electronic Dog

May 8th, 2012

Nanotechnology scientists and memory researchers at Kiel University have recreated a basic learning process using electronic circuits.

The bell rings and the dog starts drooling. Such a reaction was part of studies performed by Ivan Pavlov, the famous Russian physiologist and winner of the Nobel Prize in Physiology or Medicine in 1904. His experiment, nowadays known as “Pavlov’s Dog”, has ever since been considered a milestone for implicit learning processes. Using specific electronic components, scientists from the Technical Faculty and the memory research group at Kiel University, together with Forschungszentrum Jülich, have now been able to mimic the behavior of Pavlov’s dog. The study, “An Electronic Version of Pavlov’s Dog”, is published in the current issue of Advanced Functional Materials.

Digital and biological information processing are based on fundamentally different principles. Modern computers can work through mathematical-logical problems at an extremely high pace, but the procedures in a computer’s central processing unit and in its storage media run serially. While digital computers have shown immense success in certain fields, they reveal weaknesses when it comes to pattern recognition and cognitive tasks. Yet recognition and cognitive tasks are essential for imitating biological information processing systems. Mammalian brains, including the brains of humans, decode information in complex neuronal networks of synapses with up to 10^14 (100 trillion) connections. This connectivity between neurons is not fixed. “Learning means that new connections between neurons are created, or existing connections are reinforced or weakened”, says PD Dr. Thorsten Bartsch of the Clinic for Neurology. This is called neuronal plasticity.

Kiel scientists teach electronic circuits to memorize reactions. Source: Kohlstedt

Is it possible to design neural circuits with electronic devices that mimic learning? At this crossroads between neurobiology, materials science and nanoelectronics, scientists from the University of Kiel are collaborating with their colleagues from the Research Center Jülich. Now, they have succeeded in electronically recreating the classic “Pavlov’s Dog” experiment. “We used memristive devices in order to mimic the associative behaviour of Pavlov’s dog in the form of an electronic circuit”, explains Professor Hermann Kohlstedt, head of the Nanoelectronics working group at the University of Kiel.

Memristors are a class of electronic circuit elements that have been available to scientists in adequate quality for only a few years. They exhibit a memory characteristic in the form of hysteretic current-voltage curves with high- and low-resistance branches; which resistance applies depends on the prior charge flow through the device. Scientists are trying to use this memory effect to create networks that resemble the synaptic connections between neurons. “In the long term, our goal is to copy synaptic plasticity onto electronic circuits. We might even be able to recreate cognitive skills electronically”, says Kohlstedt. The collaborating working groups in Kiel and Jülich have taken a small step toward this goal.

The set-up consisted of the following: two electrical impulses were linked via a memristive device to a comparator. The two pulses represent the food and the bell in Pavlov’s experiment. A comparator is a device that compares two voltages or currents and generates an output once a given level is reached; in this case, it produces the output signal (representing saliva) when the threshold value is reached. In addition, the memristive element has its own threshold voltage, defined by physical and chemical mechanisms in the nano-electronic device. Below this threshold the memristive device behaves like an ordinary linear resistor. When the threshold is exceeded, however, a hysteretic (history-dependent) current-voltage characteristic appears.

“During the experimental investigation, the food for the dog (electrical impulse 1) resulted in an output signal of the comparator, which can be interpreted as salivation. Unlike impulse 1, the ring of the bell (electrical impulse 2) was set in such a way that the comparator’s output stayed unaffected, meaning no salivation”, explains Dr. Martin Ziegler, scientist at Kiel University and first author of the publication. When both impulses were applied simultaneously to the memristive device, the threshold value was exceeded and the memristive memory function was activated. Multiple repetitions led to an associative learning process within the circuit, similar to that in Pavlov’s dogs. “From this moment on, we only had to apply electrical impulse 2 (bell) and the comparator generated an output signal, equivalent to salivation”, says a delighted Ziegler. Electrical impulse 1 (food) still triggers the same reaction as it did before the learning. The circuit thus shows a behaviour that is termed classical conditioning in psychology. Beyond that, the scientists were able to show that the circuit can unlearn the behaviour if the two impulses are no longer applied simultaneously.

Information on “Pavlov’s dog”

In behavioural psychology, Pavlov’s experiments with dogs are considered milestones in understanding implicit learning in biological systems. In the early 20th century, Ivan Pavlov showed that dogs initially reacted indifferently to the sound of a bell, while food alone made them salivate. After the two stimuli (food and bell) had been combined over multiple repetitions, the dogs associated them with each other: they now salivated on hearing the bell alone. This method is called classical conditioning and can be generalized to various combinations of stimuli.

Nanotechnology scientists and memory researchers have published research results concerning “Pavlov’s Dog”. Credit: Advanced Functional Materials

Source: Neuroscience News

May 8, 2012
#science #neuroscience #brain #psychology
How Cannabis Use During Adolescence Affects Brain Regions Associated With Schizophrenia

ScienceDaily (May 8, 2012) — New research from the Royal College of Surgeons in Ireland (RCSI), published in the Nature journal Neuropsychopharmacology, has shown that physical changes exist in specific brain areas implicated in schizophrenia following the use of cannabis during adolescence. The research shows how cannabis use during adolescence can interact with a gene, called the COMT gene, to cause physical changes in the brain.

The COMT gene provides instructions for making an enzyme that breaks down a specific chemical messenger called dopamine. Dopamine is a neurotransmitter that helps conduct signals from one nerve cell to another, particularly in the brain’s reward and pleasure centres. Adolescent cannabis use and its interaction with particular forms of the COMT gene have been shown to cause physical changes in the brain as well as to increase the risk of developing schizophrenia.

Dr Áine Behan, Department of Physiology, RCSI and lead author on the study, said: ‘This is the first study to show that the combined effects of the COMT gene with adolescent cannabis use cause physical changes in the brain regions associated with schizophrenia. It demonstrates how genetic, developmental and environmental factors interact to modulate brain function in schizophrenia and supports previous behavioural research which has shown the COMT gene to influence the effects of adolescent cannabis use on schizophrenia-related behaviours.’

The three areas of the brain assessed in this study were found to show changes in cell size, density and protein levels.

'Increased knowledge on the effects of cannabis on the brain is critical to understanding youth mental health both in terms of psychological and psychiatric well-being,' Dr Behan continued.

Source: Science Daily

May 8, 2012
#science #neuroscience #brain #psychology #schizophrenia
Fewer Suicides After Antidepressive Treatment for Schizophrenia

ScienceDaily (May 8, 2012) — Antidepressive drugs reduce the mortality rate of patients with schizophrenia, while treatment with benzodiazepines greatly increases it, especially as regards suicide. Giving several antipsychotics simultaneously, however, seems to have no effect at all. This is according to a new study examining different drug combinations administered to patients with schizophrenia.

"We weren’t aware that the beneficial effects of antidepressives were so powerful," says Jari Tiihonen, professor of clinical psychiatry at Karolinska Institutet’s Department of Clinical Neuroscience.

The study followed 2,588 Finns who had recently developed schizophrenia from the time of their initial admission to hospital for an average of four years. By accessing the Finnish registers, the researchers were then able to ascertain the effects of different drug combinations on the mortality risk within the group.

A total of 160 people died during the study. The most common causes were external, such as drowning, poisoning or violence, accounting for 57 deaths; 35 of these were suicides, making suicide and cardiovascular disease the two main causes of death.

The researchers found that while taking benzodiazepines, participants ran a 91 per cent higher risk of early death than during periods when these drugs were not used. By far the most common cause of death was suicide, and most deaths occurred among patients who had been taking benzodiazepines for longer than four weeks.

"The increased suicide risk for patients with long-standing benzodiazepine use may be partly attributable to the possible development of withdrawal symptoms when the drugs run out," says Professor Tiihonen. "These symptoms, which can include severe anxiety and insomnia, might have affected some of the patients’ decisions to commit suicide. It’s therefore extremely important that benzodiazepines are discontinued gradually, over a period of weeks or months and in consultation with a doctor, rather than abruptly."

"The temporary acute use of benzodiazepines is justifiable if the patient is suffering a great deal of anxiety," he continues. "But benzodiazepines should be discontinued within a month according to psychiatric recommendations, which doctors must start following and respecting."

During the periods when the participants took antidepressive drugs, they ran a 43 per cent lower mortality risk than during the periods when these drugs were not used. Taking several antipsychotics simultaneously had no effect on mortality.
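As a rough guide to reading these figures, a "91 per cent higher risk" corresponds to a risk ratio of 1.91 and a "43 per cent lower risk" to 0.57. The sketch below applies those ratios to a purely hypothetical baseline; the study’s actual hazard ratios, baseline risks and confidence intervals are not given in this article.

```python
def risk_ratio(pct_change):
    """Convert a reported percentage change in risk to a multiplicative ratio."""
    return 1.0 + pct_change / 100.0

rr_benzo = risk_ratio(+91)      # periods on benzodiazepines vs. off -> 1.91
rr_antidep = risk_ratio(-43)    # periods on antidepressants vs. off -> 0.57

# Applied to a hypothetical 5% baseline mortality risk over follow-up:
baseline = 0.05
print(round(baseline * rr_benzo, 4))    # 0.0955, roughly double the baseline
print(round(baseline * rr_antidep, 4))  # 0.0285, a little over half the baseline
```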

"People think that it’s dangerous to treat patients with schizophrenia with more than one antipsychotic drug, but there is nothing to back that up," says Professor Tiihonen. "I believe that most doctors prescribe several antipsychotics if their patients are not helped by just one kind, and our study finds no link between this and increased mortality during a four-year follow-up. But it does mean more adverse effects, such as the risk of weight gain, which also impacts health in the long run, so the recommended attitude is still one of restraint."

Source: Science Daily

May 8, 2012
#science #neuroscience #brain #psychology
Psychiatric Medications' Effect On Brain Structure Varies

ScienceDaily (May 8, 2012) — It is increasingly recognized that chronic psychotropic drug treatment may lead to structural remodeling of the brain. Indeed, clinical studies in humans present an intriguing picture: antipsychotics, used for the treatment of schizophrenia and psychosis, may contribute to cortical gray matter loss in patients, whereas lithium, used for the treatment of bipolar disorder and mania, may preserve gray matter in patients.

However, the clinical significance of these structural changes is not yet clear. There are many challenges in executing longitudinal, controlled, and randomized studies to evaluate this issue in humans, particularly because there are also many confounding factors, including illness severity, illness duration, and other medications, when studying patients.

It is therefore critical to develop animal models to inform the clinical research. To accomplish this, a group of researchers at King’s College London, led by Dr. Shitij Kapur, developed a rat model using clinically relevant drug exposure and matched clinical dosing in combination with longitudinal magnetic resonance imaging. They administered either lithium or haloperidol (a common antipsychotic) to rats in doses equivalent to those received by humans. The rats received this treatment daily for eight weeks, equivalent to 5 human years, and underwent brain scans both before and after treatment.

Dr. Kapur explained their findings, “Using this approach, we observed that chronic treatment with haloperidol leads to decreases in cortical gray matter, whilst lithium induced an increase, effects that were reversible after drug withdrawal.” Gray matter was decreased by 6% after haloperidol treatment, but increased by 3% after lithium treatment.

"These important observations clarify conflicting findings from clinical trials by removing many of the confounding effects," commented Dr. John Krystal, Editor of Biological Psychiatry. "Whether these changes in brain structure underlie the benefits or side effects of these medications remains to be seen. However, they point to brain effects of established medications that are not well understood, but which may hold clues to new treatment approaches."

"Whilst these intriguing findings are consistent with available clinical data, it should be noted these studies were done in normal rats, which do not capture the innate pathology of either schizophrenia or bipolar disorder," Kapur added. "Moreover, because the mechanism(s) of these drug effects remain unknown, further studies are required, and one should be cautious in drawing clinical inferences. Nevertheless, our study demonstrates a new and powerful model system for further investigation of the effects of psychotropic drug treatment on brain morphology."

Source: Science Daily

May 8, 2012
#science #neuroscience #brain #psychology
Researchers 'switch off' neurodegeneration in mice

May 8, 2012

Researchers at the Medical Research Council (MRC) Toxicology Unit at the University of Leicester have identified a major pathway leading to brain cell death in mice with neurodegenerative disease. The team was able to block the pathway, preventing brain cell death and increasing survival in the mice.

In human neurodegenerative diseases, including Alzheimer’s, Parkinson’s and prion diseases, proteins “misfold” in a variety of different ways, resulting in the build-up of misshapen proteins. These form the plaques found in Alzheimer’s and the Lewy bodies found in Parkinson’s disease.
  
The researchers studied mice with neurodegeneration caused by prion disease, as these mouse models currently provide the best animal representation of human neurodegenerative disorders, where it is known that the build-up of misshapen proteins is linked with brain cell death.
  
They found that the build-up of misfolded proteins in the brains of these mice activates a natural defence mechanism in cells, which switches off the production of new proteins. This would normally switch back ‘on’ again, but in these mice the continued build-up of misshapen protein keeps the switch turned ‘off’. This is the trigger point leading to brain cell death, as the key proteins essential for nerve cell survival are not made.
  
By injecting a protein that blocks the ‘off’ switch of the pathway, the scientists were able to restore protein production, independently of the build-up of misshapen proteins, and halt the neurodegeneration. The brain cells were protected, protein levels and synaptic transmission (the way in which brain cells signal to each other) were restored, and the mice lived longer, even though only a very small part of their brain had been treated.
  
Misshapen proteins in human neurodegenerative diseases, such as Alzheimer’s and Parkinson’s, also over-activate this fundamental pathway controlling protein synthesis in the brains of patients, making the pathway a common target underlying these different clinical conditions. The scientists’ results suggest that treatments focused on this pathway could be protective in a range of neurodegenerative diseases in which misshapen proteins are building up and causing neurons to die.
  
Professor Giovanna Mallucci, who led the team, said: “What’s exciting is the emergence of a common mechanism of brain cell death across a range of different neurodegenerative disorders and activated by the different misfolded proteins in each disease. The fact that in mice with prion disease we were able to manipulate this mechanism and protect the brain cells means we may have a way forward in how we treat other disorders. Instead of targeting individual misfolded proteins in different neurodegenerative diseases, we may be able to target the shared pathways and rescue brain cell degeneration irrespective of the underlying disease.” 
  
Professor Hugh Perry, chair of the MRC’s Neuroscience and Mental Health Board, said: “Neurodegenerative diseases such as Alzheimer’s and Parkinson’s are debilitating and largely untreatable conditions. Alzheimer’s disease and related disorders affect over seven million people in Europe, and this figure is expected to double every 20 years as the population ages across Europe. The MRC believes that research such as this, which looks at the fundamental mechanisms of these devastating diseases, is absolutely vital. Understanding the mechanism that leads to neuronal dysfunction prior to neuronal loss is a critical step in finding ways to arrest disease progression.”

Provided by Medical Research Council 

Source: medicalxpress.com

May 8, 2012
#science #neuroscience #brain #psychology
Getting a grip on memories

May 8, 2012

(Medical Xpress) — Having a fat head may not be a bad thing, according to new findings at The Johns Hopkins University. As reported in the February 9 issue of Neuron, Hopkins researchers have made a significant discovery as to how adding fat molecules to proteins can influence the brain circuitry controlling cognitive function, including learning and memory.

“When you learn something, you strengthen and inhibit certain transmissions and sculpt a particular circuit. Recall [or memory] is using that circuit again,” says Richard L. Huganir, Ph.D., professor and director of the Solomon H. Snyder Department of Neuroscience at Johns Hopkins. His team’s latest finding describes for the first time how one protein chemically alters another in this circuit strengthening process and represents another step toward understanding a key part of how memories are made and maintained within the brain, something researchers believe could provide a pathway toward treating disorders like Alzheimer’s and schizophrenia.

In studying the molecular underpinnings of learning and memory, Huganir and his team have focused on one of several processes in which a molecule is tagged by another molecule of fat. Tagging sends the molecules to a particular destination within a cell. Specifically, the team has studied DHHC5, which is known to add a fat molecule to other proteins. Until now it was not known which proteins receive this tag.

The scientists suspected a target molecule would need to bind DHHC5, which would then transfer fat onto it. To determine what DHHC5 could bind, they used it as bait in a large pool of rat brain proteins to fish for those that stuck to DHHC5. Within that pool, DHHC5 bound four different proteins, researchers found. Using a computer program, they compared these with other proteins implicated in learning and memory. All four shared similarity with the brain protein known as GRIP1, mutations of which have been linked to disorders such as autism. The scientists then tested GRIP1 and DHHC5 directly and found that they bound each other as well. Next, they put GRIP1 into human kidney cells, either by itself or with DHHC5, and analyzed each group of cells to see what happened. They found that only the GRIP1 proteins that were added to cells with DHHC5 were tagged with fat. From this they concluded that DHHC5 does indeed tag GRIP1 with fat.

The researchers then wanted to know if this process happens in a brain. However, they needed a way to look into a living cell and be able to tell apart GRIP1 that had a fat tag and GRIP1 that didn’t. They designed two distinct GRIP1 proteins: one permanently tagged with fat, and another mutated so that it could never be tagged. They added color markers to both proteins so they could track them under a microscope, and then added one type or the other to living brain cells. The fat-tagged proteins seemed to form clusters extending to the cell’s edges in a pattern resembling that of cellular recycling-center proteins. The untagged proteins, in contrast, seemed to diffuse around the center of the cell. From this, the team concluded that DHHC5 tags proteins like GRIP1 with fat to send them to be recycled.

According to Huganir, protein recycling is critical for strengthening and maintaining memory circuits. Since GRIP1 is involved with recycling, it may be important in this critical aspect of memory formation. Huganir believes some day researchers could learn how to control this mechanism and reverse the disease process for disorders like Alzheimer’s and schizophrenia.

“Some day we may be able to inhibit or activate these molecules,” Huganir says. “These molecules are involved in mediating everything in the brain, all behaviors.”

Provided by Johns Hopkins University

Source: medicalxpress.com

May 8, 2012
#science #neuroscience #brain #psychology #memory
Psychopathy Linked to Specific Structural Abnormalities in the Brain

May 7th, 2012

New research provides the strongest evidence to date that psychopathy is linked to specific structural abnormalities in the brain.

The study, published in Archives of General Psychiatry and led by researchers at King’s College London is the first to confirm that psychopathy is a distinct neuro-developmental sub-group of anti-social personality disorder (ASPD).

Most violent crimes are committed by a small group of persistent male offenders with ASPD. Approximately half of male prisoners in England and Wales will meet diagnostic criteria for ASPD. The majority of such men are not true psychopaths (ASPD-P). They are characterised by emotional instability, impulsivity and high levels of mood and anxiety disorders. They typically use aggression in a reactive way in response to a perceived threat or sense of frustration.

However, about one third of such men will meet additional diagnostic criteria for psychopathy (ASPD+P). They are characterised by a lack of empathy and remorse, and use aggression in a planned way to secure what they want (status, money etc.). Previous research has shown that psychopaths’ brains differ structurally from healthy brains, but until now, none have examined these differences within a population of violent offenders with ASPD.

Dr Nigel Blackwood from the Institute of Psychiatry at King’s and lead author of the study says: ‘Using MRI scans we found that psychopaths had structural brain abnormalities in key areas of their ‘social brains’ compared to those who just had ASPD. This adds to behavioural and developmental evidence that psychopathy is an important subgroup of ASPD with a different neurobiological basis and different treatment needs.

‘There is a clear behavioural difference amongst those diagnosed with ASPD depending on whether or not they also have psychopathy. We describe those without psychopathy as ‘hot-headed’ and those with psychopathy as ‘cold-hearted’. The ‘cold-hearted’ psychopathic group begin offending earlier, engage in a broader range and greater density of offending behaviours, and respond less well to treatment programmes in adulthood, compared to the ‘hot-headed’ group. We now know that this behavioural difference corresponds to very specific structural brain abnormalities which underpin psychopathic behaviour, such as profound deficits in empathising with the distress of others.’

The researchers used Magnetic Resonance Imaging (MRI) to scan the brains of 44 violent adult male offenders diagnosed with Anti-Social Personality Disorder (ASPD). Crimes committed included murder, rape, attempted murder and grievous bodily harm. Of these, 17 met the diagnosis for psychopathy (ASPD+P) and 27 did not (ASPD-P). They also scanned the brains of 22 healthy non-offenders.

The study found that ASPD+P offenders displayed significantly reduced grey matter volumes in the anterior rostral prefrontal cortex and temporal poles compared to ASPD-P offenders and healthy non-offenders. These areas are important in understanding other people’s emotions and intentions and are activated when people think about moral behaviour. Damage to these areas is associated with impaired empathising with other people, poor response to fear and distress and a lack of ‘self-conscious’ emotions such as guilt or embarrassment.

Dr Blackwood explains: ‘Identifying and diagnosing this sub-group of violent offenders with brain scans has important implications for treatment. Those without the syndrome of psychopathy, and the associated structural brain damage, will benefit from cognitive and behavioural treatments. Optimal treatment for the group of psychopaths is much less clear at this stage.’

Source: Neuroscience News

May 8, 2012
#science #neuroscience #psychology #brain
Greater Purpose in Life May Protect Against Harmful Changes in the Brain Associated With Alzheimer’s Disease

ScienceDaily (May 7, 2012) — Greater purpose in life may help stave off the harmful effects of plaques and tangles associated with Alzheimer’s disease, according to a new study by researchers at Rush University Medical Center.

Greater purpose in life may help stave off the harmful effects of plaques and tangles associated with Alzheimer’s disease, according to a new study. (Credit: © Nejron Photo / Fotolia)

The study is published in the May issue of the Archives of General Psychiatry.

"Our study showed that people who reported greater purpose in life exhibited better cognition than those with less purpose in life even as plaques and tangles accumulated in their brains," said Patricia A. Boyle, PhD.

"These findings suggest that purpose in life protects against the harmful effects of plaques and tangles on memory and other thinking abilities. This is encouraging and suggests that engaging in meaningful and purposeful activities promotes cognitive health in old age."

Boyle and her colleagues from the Rush Alzheimer’s Disease Center studied 246 participants from the Rush Memory and Aging Project who did not have dementia and who subsequently died and underwent brain autopsy. Participants received an annual clinical evaluation for up to approximately 10 years, which included detailed cognitive testing and neurological exams.

Participants also answered questions about purpose in life, the degree to which one derives meaning from life’s experiences and is focused and intentional. Brain plaques and tangles were quantified after death. The authors then examined whether purpose in life slowed the rate of cognitive decline even as older persons accumulated plaques and tangles.

While plaques and tangles are very common among persons who develop Alzheimer’s dementia (characterized by prominent memory loss and changes in other thinking abilities), recent data suggest that plaques and tangles accumulate in most older persons, even those without dementia. Plaques and tangles disrupt memory and other cognitive functions.

Boyle and colleagues note that much ongoing Alzheimer’s research seeks to identify ways to prevent or limit the accumulation of plaques and tangles in the brain, a task that has proven quite difficult. Until effective preventive therapies are discovered, strategies that minimize the impact of plaques and tangles on cognition are urgently needed, which is why studies such as the current one are important.

"These studies are challenging because many factors influence cognition and research studies often lack the brain specimen data needed to quantify Alzheimer’s changes in the brain," Boyle said. "Identifying factors that promote cognitive health even as plaques and tangles accumulate will help combat the already large and rapidly increasing public health challenge posed by Alzheimer’s disease."

The Rush Memory and Aging Project, which began in 1997, is a longitudinal clinical-pathological study of common chronic conditions of aging. Participants are older persons recruited from about 40 continuous care retirement communities and senior subsidized housing facilities in and around the Chicago metropolitan area. More than 1,500 older persons are currently enrolled in the study.

Source: Science Daily

May 8, 2012
#science #neuroscience #brain #psychology #alzheimer
Gene That Leads to Severe Weight Gain With Antipsychotic Treatment Discovered

ScienceDaily (May 7, 2012) — Antipsychotic medications are increasingly prescribed in the US, but they can cause serious side effects including rapid weight gain, especially in children. In the first study of its kind, researchers at Zucker Hillside Hospital and the Feinstein Institute for Medical Research identified a gene that increases weight gain in those treated with commonly used antipsychotic drugs.

These findings were published in the May issue of Archives of General Psychiatry.

Second-generation antipsychotics (SGAs), the treatment used in this study, are commonly prescribed for many psychotic and nonpsychotic disorders. However, SGAs are associated with substantial weight gain, including the development of obesity and other cardiovascular risk factors. This side effect is significant because it can reduce life expectancy by up to 30 years in those who suffer from chronic and severe mental illnesses. The weight gain also prompts some patients to stop taking the medication, adversely impacting their quality of life.

In this genome-wide association study (GWAS), researchers first evaluated a group of pediatric patients in the US being treated for the first time with antipsychotics. They then replicated the result in three independent groups of patients who were in psychiatric hospitals in the United States and Germany or participating in European antipsychotic drug trials. The gene that was identified to increase weight gain, MC4R or melanocortin 4 receptor, has been previously identified as being linked to obesity and type 2 diabetes. In the new study, it was found that patients gained up to 20 pounds when on treatment.

"This study offers the prospect of being able to identify individuals who are at greatest risk for severe weight gain following antipsychotic treatment," said Anil Malhotra, MD, investigator at the Zucker Hillside Hospital Department of Psychiatry Research and Feinstein Institute for Medical Research. "We hope that those who are at risk could receive more intensive or alternative treatment that would reduce the potential for weight gain and we are currently conducting studies to identify such treatment."

Additional Details About the Study

Researchers conducted the first GWAS of SGA-induced weight gain in patients carefully monitored for medication adherence who were undergoing initial treatment with SGAs. To confirm the results, they assessed three independent replication cohorts: 1) adult subjects undergoing their first treatment with a single SGA (clozapine), 2) adult subjects treated with the same SGAs as in the discovery sample, and 3) adult subjects in the first episode of schizophrenia enrolled in a randomized clinical trial of antipsychotic drugs. The discovery cohort consisted of 139 pediatric patients undergoing first exposure to SGAs; the three replication cohorts consisted of 73, 40, and 92 subjects. Patients in the discovery cohort were treated with SGAs for 12 weeks; the replication cohorts were treated for 6 and 12 weeks.

This GWAS yielded 20 single-nucleotide polymorphisms at a single locus exceeding a statistical threshold of P < 10⁻⁵. This locus, near the melanocortin 4 receptor (MC4R) gene, overlaps a region previously identified by large-scale GWAS of obesity in the general population. Effects were recessive, with minor allele homozygotes gaining extreme amounts of weight during the 12-week trial. These results were replicated in the 3 additional cohorts, with rs489693 demonstrating consistent recessive effects; meta-analysis revealed a genome-wide significant effect. Moreover, consistent effects on related metabolic indices, including triglyceride, leptin, and insulin levels, were observed.
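The recessive model described above can be illustrated with a short sketch. This is not the study's actual analysis (which involved genome-wide association testing with many subjects and a meta-analysis); the function names and the weight-gain numbers below are made up purely to show how genotypes are coded under a recessive model, where only minor-allele homozygotes (two copies of the minor allele) count as exposed:

```python
import statistics

def recessive_code(genotype: int) -> int:
    """Code a genotype (minor-allele count 0, 1, or 2) under a
    recessive model: only minor-allele homozygotes are 'exposed'."""
    return 1 if genotype == 2 else 0

def mean_gain_by_group(genotypes, weight_gains):
    """Split subjects by recessive coding and compare mean weight gain.
    Returns (mean gain for homozygotes, mean gain for everyone else)."""
    homo = [w for g, w in zip(genotypes, weight_gains) if recessive_code(g)]
    rest = [w for g, w in zip(genotypes, weight_gains) if not recessive_code(g)]
    return statistics.mean(homo), statistics.mean(rest)

# Illustrative (made-up) data: minor-allele counts and kg gained in 12 weeks
genos = [0, 1, 2, 1, 2, 0, 2]
gains = [2.0, 3.1, 9.5, 2.8, 11.2, 1.9, 10.1]
homo_mean, other_mean = mean_gain_by_group(genos, gains)
```

A real GWAS would fit a regression with covariates at every variant and correct for multiple testing; the sketch only conveys the genotype-coding step behind the phrase "effects were recessive."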

Source: Science Daily

May 8, 2012
#science #neuroscience #psychology
Midlife and Late-Life Depressive Symptoms Associated With Dementia

ScienceDaily (May 7, 2012) — Depressive symptoms that are present in midlife or in late life are associated with an increased risk of developing dementia, according to a report in the May issue of Archives of General Psychiatry, a JAMA Network publication.

Nearly 5.3 million individuals in the United States have Alzheimer disease (AD), and the resulting health care costs in 2010 were roughly $172 billion, according to background information in the study. “Prevalence and costs of AD and other dementias are projected to rise dramatically during the next 40 years unless a prevention or a cure can be found. Therefore, it is critical to gain a greater understanding of the key risk factors and etiologic underpinnings of dementia from a population-based perspective,” the authors write.

Deborah E. Barnes, Ph.D., M.P.H., of the University of California, San Francisco and the San Francisco Veterans Affairs Medical Center, and colleagues evaluated data from 13,535 long-term Kaiser Permanente members and examined depressive symptoms assessed in midlife (1964-1973) and in late life (1994-2000) and risks of developing dementia, Alzheimer disease (AD) and vascular dementia (VaD; dementia resulting from brain damage from impaired blood flow to the brain).

Depressive symptoms were present in 14.1 percent of study participants in midlife only, 9.2 percent in late life only and 4.2 percent in both. During six years of follow-up, 22.5 percent of patients were diagnosed with dementia; 5.5 percent with Alzheimer disease and 2.3 percent with VaD.

When examining AD and VaD separately, patients with late-life depressive symptoms had a two-fold increase in AD risk, and patients with midlife and late-life symptoms had more than a three-fold increase in VaD risk.
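To make the "two-fold" and "three-fold" figures concrete: a risk ratio compares the incidence of disease in an exposed group with that in an unexposed group. The counts below are made up purely for illustration; the study itself reports adjusted estimates from survival models rather than raw risk ratios:

```python
def relative_risk(exposed_cases, exposed_total,
                  unexposed_cases, unexposed_total):
    """Risk ratio: disease incidence among the exposed divided by
    incidence among the unexposed."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# Illustrative (made-up) counts: 20 cases per 100 exposed vs
# 10 cases per 100 unexposed gives a risk ratio near 2,
# i.e. a two-fold increase in risk.
rr = relative_risk(20, 100, 10, 100)
```

A ratio of 2 means the exposed group's risk is double the unexposed group's, not that an extra 2 percent of people become ill.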

"Our findings suggest that chronic depression during the life course may be etiologically associated with an increased risk of dementia, particularly VaD, whereas depression that occurs for the first time in late life is likely to reflect a prodromal stage of dementia, in particular AD," the authors conclude.

Source: Science Daily

May 8, 2012
#science #neuroscience #psychology #brain
Defective Carnitine Metabolism May Play Role in Autism

ScienceDaily (May 7, 2012) — The deletion of part of a gene that plays a role in the synthesis of carnitine — an amino acid derivative that helps the body use fat for energy — may play a role in milder forms of autism, said a group of researchers led by those at Baylor College of Medicine and Texas Children’s Hospital.

"This is a novel inborn error of metabolism," said Dr. Arthur Beaudet, chair of molecular and human genetics at BCM and a physician at Texas Children’s Hospital, and the senior author of the report that appears online in the Proceedings of the National Academy of Sciences. "How it is associated with the causes of autism is as yet unclear. However, it could point to a means of treatment or even prevention in some patients."

Deletion leads to imbalance

Beaudet and his international group of collaborators believe the gene deletion leads to an imbalance in carnitine in the body. Meat eaters receive about 75 percent of their carnitine from their diet. However, dietary carnitine levels are low in vegetarians and particularly in vegans. In most people, levels of carnitine are balanced by the body’s ability to manufacture its own carnitine in the liver, kidney and brain, starting with a modified form of the amino acid lysine.

Carnitine deficiency can occur when too little carnitine is absorbed from the diet or as a consequence of medical treatments such as kidney dialysis. Genetic forms of carnitine deficiency also exist, caused when too much carnitine is excreted through the kidneys.

In this new inborn error, there is a deletion in the second exon (a protein-coding portion of a gene) of the TMLHE gene (trimethyllysine hydroxylase, epsilon), which encodes trimethyllysine dioxygenase, the first enzyme in the synthesis of carnitine.

Studies in the laboratory that identified the deletion were led by Dr. Patricia B.S. Celestino-Soper, as a graduate student in Beaudet’s laboratory at BCM, and by Dr. Sara Violante, a graduate student in the laboratory of Dr. Frédéric M. Vaz of the Academic Medical Center in Amsterdam.

Frequency of deletion

To determine the frequency of the gene deletion, Beaudet and his colleagues tested male autism patients who were the only people with the disorder in their families (simplex families) from the Simons Simplex Collection, the South Carolina Early Autism Project and Houston families. In collaboration with laboratories and researchers in Nashville, Los Angeles, Paris, New York, Toronto and Cambridge (United Kingdom), they tested affected male siblings in families with more than one male case of autism (multiplex families).

When they looked at the TMLHE genes in males affected by autism and compared them to normal controls, they found that the gene alteration is a fairly common one, occurring in as many as one in 366 males unaffected by autism. It was not significantly more common in males within families in which there is only one person with autism. However, it is nearly three times more common in families with two or more boys with autism.

No syndromic form

Beaudet said most of the affected males with the deletion did not have syndromic autism, which is frequently associated with other serious diseases. Syndromic autism often affects physical as well as cognitive development, and this is reflected in facial features and other parts of the body. None of the six boys with autism for whom information was available had the syndromic form of the disease. Their intelligence quotients and cognitive scores varied, with some far below normal and others normal.

"Most of the males we identified with the TMLHE deficiency were apparently normal as adults," said Beaudet, although detailed information on learning and behavior was not available on these "control" males. "The gene deletion is neither necessary nor sufficient in itself to cause autism."

"TMLHE deficiency itself is likely to be a weak risk factor for autism, but we need to do more studies to replicate our results," Beaudet said. He estimated that at the rates found in his study, the deficiency might be a factor in about 170 males born with autism per year in the United States. This would equate to about one-half of one percent of autism cases.

The authors from Amsterdam found major increases in some carnitine-related chemicals, and an absence of others, in both urine and plasma. These metabolic alterations predicted dysfunction of the TMLHE gene and can therefore be used to identify males with this disorder.

It remains uncertain whether TMLHE deficiency is benign or causes autism by affecting the function of neurons through toxic accumulation or deficiency of a variety of chemical metabolites.

"We believe that the most attractive hypothesis at this time is that the increased risk of autism is modified by dietary intake of carnitine from birth through the first few years of life," said Beaudet.

He and his colleagues are undertaking three studies to further their understanding of TMLHE deficiency. In the first, they will attempt to replicate the findings in multiplex families. In the second, they will study carnitine levels in the cerebrospinal fluid of infants with autism, both those who have the gene deficiency and those who do not. In the third, they plan to give carnitine or a related supplement to boys under age 5 with autism and determine whether it improves the behavior of those with the TMLHE deficiency and those without.

Source: Science Daily

May 8, 2012
#science #neuroscience #psychology #brain #autism