Neuroscience

July 2013

No Link Between Mercury Exposure and Autism-like Behaviors

The potential impact of exposure to low levels of mercury on the developing brain – specifically through women consuming fish during pregnancy – has long been a source of concern, and some have argued that the chemical may be responsible for behavioral disorders such as autism. However, a new study drawing on more than 30 years of research in the Republic of Seychelles reports that there is no association between prenatal mercury exposure and autism-like behaviors.

“This study shows no evidence of a correlation between low level mercury exposure and autism spectrum-like behaviors among children whose mothers ate, on average, up to 12 meals of fish each week during pregnancy,” said Edwin van Wijngaarden, Ph.D., an associate professor in the University of Rochester Medical Center’s (URMC) Department of Public Health Sciences and lead author of the study, which appears online today in the journal Epidemiology. “These findings contribute to the growing body of literature suggesting that exposure to the chemical does not play an important role in the onset of these behaviors.”

The debate over fish consumption has long created a dilemma for expecting mothers and physicians. Fish are high in beneficial nutrients such as selenium, vitamin E, lean protein, and omega-3 fatty acids; the latter are essential to brain development. At the same time, exposure to high levels of mercury has been shown to lead to developmental problems, leading to the claim that mothers who eat fish during pregnancy are exposing their unborn children to serious neurological impairment. Although the developmental consequences of low level exposure remain unknown, some organizations, including the U.S. Food and Drug Administration, have recommended that pregnant women limit their consumption of fish.

Mercury is widespread in the environment and originates both from natural sources, such as volcanoes, and from human activity, most notably emissions from coal-fired power plants. Much of this mercury ends up deposited in the world’s oceans, where it makes its way into the food chain and eventually into fish. While the levels of mercury found in individual fish are generally low, concerns have been raised about the cumulative effect of a diet heavy in fish.

The Republic of Seychelles has proven to be an ideal location to examine the potential health impact of persistent low level mercury exposure. In this archipelago nation of 87,000 people in the Indian Ocean, fishing is both an important industry and a primary source of nutrition – the nation’s residents consume fish at a rate 10 times greater than the populations of the U.S. and Europe.

The Seychelles Child Development Study – a partnership between URMC, the Seychelles Ministries of Health and Education, and the University of Ulster in Ireland – was created in the mid-1980s to specifically study the impact of fish consumption and mercury exposure on childhood development. The program is one of the largest ongoing epidemiologic studies of its kind.

“The Seychelles study was designed to follow a population over a very long period of time and focus on relevant mercury exposure,” said Philip Davidson, Ph.D., principal investigator of the Seychelles Child Development Study and professor emeritus in Pediatrics at URMC.   “While the amount of fish consumed in the Seychelles is significantly higher than other countries in the industrialized world, it is still considered low level exposure.”

The autism study involved 1,784 children, adolescents, and young adults and their mothers. The researchers were first able to determine the level of prenatal mercury exposure by analyzing hair samples that had been collected from the mothers around the time of birth, a test which can approximate mercury levels found in the rest of the body including the growing fetus. 

The researchers then used two questionnaires to determine whether or not the study participants were exhibiting autism spectrum-like behaviors. The Social Communication Questionnaire was completed by the children’s parents and the Social Responsiveness Scale was completed by their teachers. These tests – which include questions on language skills, social communication, and repetitive behaviors – do not provide a definitive diagnosis, but they are widely used in the U.S. as an initial screening tool and may suggest the need for additional evaluation.

The mercury levels of the mothers were then matched with the test scores of their children, and the researchers found no correlation between prenatal exposure and evidence of autism spectrum-like behaviors. This mirrors the results of previous studies of the nation’s children, which measured language skills and intelligence, among other outcomes, and observed no adverse developmental effects.

The study lends further evidence to an emerging belief that the “good” may outweigh the possible “bad” when it comes to fish consumption during pregnancy. Specifically, even if mercury does adversely influence child development at these levels of exposure, the benefits of the nutrients found in fish may counteract, or perhaps even exceed, the potential negative effects of the mercury.

“This study shows no consistent association in children whose mothers had mercury levels six to ten times higher than those found in the U.S. and Europe,” said Davidson. “This is a sentinel population, and if it does not exist here, then it probably does not exist.”

“NIEHS has been a major supporter of research looking into the human health risks associated with mercury exposure,” said Cindy Lawler, Ph.D., acting branch chief at the National Institute of Environmental Health Sciences, part of the National Institutes of Health. “The studies conducted in the Seychelles Islands have provided a unique opportunity to better understand the relationship between environmental factors, such as mercury, and the role they may play in the development of diseases like autism. Although more research is needed, this study does present some good news for parents.”

Jul 24, 2013 · 115 notes
#ASD #autism #brain development #mercury exposure #neurobiology #neuroscience #science
Controlling genes with light

New technique can rapidly turn genes on and off, helping scientists better understand their function.

Although human cells have an estimated 20,000 genes, only a fraction of those are turned on at any given time, depending on the cell’s needs — which can change by the minute or hour. To find out what those genes are doing, researchers need tools that can manipulate their status on similarly short timescales.

That is now possible, thanks to a new technology developed at MIT and the Broad Institute that can rapidly start or halt the expression of any gene of interest simply by shining light on the cells.

The work is based on a technique known as optogenetics, which uses proteins that change their function in response to light. In this case, the researchers adapted the light-sensitive proteins to either stimulate or suppress the expression of a specific target gene almost immediately after the light comes on.

“Cells have very dynamic gene expression happening on a fairly short timescale, but so far the methods that are used to perturb gene expression don’t even get close to those dynamics. To understand the functional impact of those gene-expression changes better, we have to be able to match the naturally occurring dynamics as closely as possible,” says Silvana Konermann, an MIT graduate student in brain and cognitive sciences.

The ability to precisely control the timing and duration of gene expression should make it much easier to figure out the roles of particular genes, especially those involved in learning and memory. The new system can also be used to study epigenetic modifications — chemical alterations of the proteins that surround DNA — which are also believed to play an important role in learning and memory.

Konermann and Mark Brigham, a graduate student at Harvard University, are the lead authors of a paper describing the technique in the July 22 online edition of Nature. The paper’s senior author is Feng Zhang, the W.M. Keck Assistant Professor in Biomedical Engineering at MIT and a core member of the Broad Institute and MIT’s McGovern Institute for Brain Research.

Shining light on genes

The new system consists of several components that interact with each other to control the copying of DNA into messenger RNA (mRNA), which carries genetic instructions to the rest of the cell. The first is a DNA-binding protein known as a transcription activator-like effector (TALE). TALEs are modular proteins that can be strung together in a customized way to bind any DNA sequence.

Fused to the TALE protein is a light-sensitive protein called CRY2 that is naturally found in Arabidopsis thaliana, a small flowering plant. When light hits CRY2, it changes shape and binds to its natural partner protein, known as CIB1. To take advantage of this, the researchers engineered a form of CIB1 that is fused to another protein that can either activate or suppress gene copying.

After the genes for these components are delivered to a cell, the TALE protein finds its target DNA and wraps around it. When light shines on the cells, the CRY2 protein binds to CIB1, which is floating in the cell. CIB1 brings along a gene activator, which initiates transcription, or the copying of DNA into mRNA. Alternatively, CIB1 could carry a repressor, which shuts off the process.

A single pulse of light is enough to stimulate the protein binding and initiate DNA copying. The researchers found that pulses of light delivered every minute or so are the most effective way to achieve continuous transcription for the desired period of time. Within 30 minutes of light delivery, the researchers detected an uptick in the amount of mRNA being produced from the target gene. Once the pulses stop, the mRNA starts to degrade within about 30 minutes.
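
The kinetics described here, production while the light pulses continue and a roughly 30-minute decay once they stop, can be illustrated with a toy first-order model. All rate constants below are assumptions chosen to echo the reported timescales, not values measured in the paper:

```python
import math

# Toy model of light-driven transcription: mRNA is produced while light
# pulses keep the activator bound, and decays with first-order kinetics.
# The ~30-minute half-life and unit production rate are illustrative
# assumptions, not measured parameters.
def simulate(light_minutes, total_minutes, k_tx=1.0, half_life=30.0, dt=1.0):
    k_deg = math.log(2) / half_life      # first-order decay constant
    mrna, trace = 0.0, []
    for step in range(int(total_minutes / dt)):
        t = step * dt
        production = k_tx if t < light_minutes else 0.0
        mrna += (production - k_deg * mrna) * dt   # forward-Euler update
        trace.append(mrna)
    return trace

trace = simulate(light_minutes=60, total_minutes=120)
# mRNA rises while the light is pulsed, then falls to roughly half its
# peak within one half-life (~30 minutes) after stimulation ends.
```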

In this study, the researchers tried targeting nearly 30 different genes, both in neurons grown in the lab and in living animals. Depending on the gene targeted and how much it is normally expressed, the researchers were able to boost transcription by a factor of two to 200.

Karl Deisseroth, a professor of bioengineering at Stanford University and one of the inventors of optogenetics, says the most important innovation of the technique is that it allows control of genes that naturally occur in the cell, as opposed to engineered genes delivered by scientists.

“You could control, at precise times, a particular genetic locus and see how everything responds to that, with high temporal precision,” says Deisseroth, who was not part of the research team.

Epigenetic modifications

Another important element of gene-expression control is epigenetic modification. One major class of epigenetic effectors is chemical modification of the proteins, known as histones, that anchor chromosomal DNA and control access to the underlying genes. The researchers showed that they can also alter these epigenetic modifications by fusing TALE proteins with histone modifiers.

Epigenetic modifications are thought to play a key role in learning and forming memories, but this has not been very well explored because there are no good ways to disrupt the modifications, short of blocking histone modification of the entire genome. The new technique offers a much more precise way to interfere with modifications of individual genes.

“We want to allow people to prove the causal role of specific epigenetic modifications in the genome,” Zhang says.

So far, the researchers have demonstrated that some of the histone effector domains can be tethered to light-sensitive proteins; they are now trying to expand the types of histone modifiers they can incorporate into the system.

“It would be really useful to expand the number of epigenetic marks that we can control. At the moment we have a successful set of histone modifications, but there are a good deal more of them that we and others are going to want to be able to use this technology for,” Brigham says.

Jul 24, 2013 · 132 notes
#epigenetics #optogenetics #genes #genetics #neurons #memory #TALE protein #neuroscience #science
Chips that mimic the brain

Novel microchips imitate the brain’s information processing in real time. Neuroinformatics researchers from the University of Zurich and ETH Zurich together with colleagues from the EU and US demonstrate how complex cognitive abilities can be incorporated into electronic systems made with so-called neuromorphic chips: They show how to assemble and configure these electronic systems to function in a way similar to an actual brain.

No computer works as efficiently as the human brain – so much so that building an artificial brain is the goal of many scientists. Neuroinformatics researchers from the University of Zurich and ETH Zurich have now made a breakthrough in this direction by understanding how to configure so-called neuromorphic chips to imitate the brain’s information processing abilities in real time. They demonstrated this by building an artificial sensory processing system that exhibits cognitive abilities.

New approach: simulating biological neurons

Most approaches in neuroinformatics are limited to the development of neural network models on conventional computers or aim to simulate complex nerve networks on supercomputers. Few pursue the Zurich researchers’ approach to develop electronic circuits that are comparable to a real brain in terms of size, speed, and energy consumption. “Our goal is to emulate the properties of biological neurons and synapses directly on microchips,” explains Giacomo Indiveri, a professor at the Institute of Neuroinformatics (INI), of the University of Zurich and ETH Zurich.

The major challenge was to configure networks made of artificial, i.e. neuromorphic, neurons in such a way that they can perform particular tasks, which the researchers have now succeeded in doing: they developed a neuromorphic system that can carry out complex sensorimotor tasks in real time. The demonstration task requires short-term memory and context-dependent decision-making – typical traits that are necessary for cognitive tests.

In doing so, the INI team combined neuromorphic neurons into networks that implement neural processing modules equivalent to so-called “finite-state machines” – a mathematical concept used to describe logical processes or computer programs. Behavior can be formulated as a finite-state machine and thus transferred to the neuromorphic hardware in an automated manner. “The network connectivity patterns closely resemble structures that are also found in mammalian brains,” says Indiveri.
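
A finite-state machine of the kind mentioned above can be written down in a few lines. The states and stimuli here are invented for illustration (a delayed, context-dependent choice of the sort the paragraph describes), not taken from the paper:

```python
# Minimal finite-state machine: remember a context cue (short-term memory),
# then decide based on a later "go" stimulus. State and event names are
# hypothetical, chosen only to illustrate the concept.
TRANSITIONS = {
    ("wait",  "cue_A"): "saw_A",      # store which cue appeared
    ("wait",  "cue_B"): "saw_B",
    ("saw_A", "go"):    "respond_left",   # decision depends on remembered context
    ("saw_B", "go"):    "respond_right",
}

def run(events, state="wait"):
    for ev in events:
        state = TRANSITIONS.get((state, ev), state)  # ignore irrelevant events
    return state

print(run(["cue_A", "go"]))   # respond_left
print(run(["cue_B", "go"]))   # respond_right
```

The researchers' contribution is a method for compiling such state machines into the connectivity of neuromorphic neurons, rather than the state-machine formalism itself.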

Chips can be configured for any behavior modes

The scientists have thus demonstrated for the first time how to construct a real-time hardware neural-processing system whose behavior is dictated by the user. “Thanks to our method, neuromorphic chips can be configured for a large class of behavior modes. Our results are pivotal for the development of new brain-inspired technologies,” Indiveri sums up. One application, for instance, might be to combine the chips with sensory neuromorphic components, such as an artificial cochlea or retina, to create complex cognitive systems that interact with their surroundings in real time.

Literature:

E. Neftci, J. Binas, U. Rutishauser, E. Chicca, G. Indiveri, R. J. Douglas. Synthesizing cognition in neuromorphic electronic systems. PNAS. July 22, 2013.

Jul 23, 2013 · 107 notes
#AI #neuromorphic chip #ANNs #artificial brain #neuroscience #science
Breastfeeding Could Prevent ADHD

TAU research finds that breastfed children are less likely to develop ADHD later in life

We know that breastfeeding has a positive impact on child development and health — including protection against illness. Now researchers from Tel Aviv University have shown that breastfeeding could also help protect against Attention Deficit/Hyperactivity Disorder (ADHD), the most commonly diagnosed neurobehavioral disorder in children and adolescents.

Seeking to determine whether the development of ADHD was associated with lower rates of breastfeeding, Dr. Aviva Mimouni-Bloch, of Tel Aviv University’s Sackler Faculty of Medicine and Head of the Child Neurodevelopmental Center in Loewenstein Hospital, and her fellow researchers completed a retrospective study of the breastfeeding histories of three groups of children: a group that had been diagnosed with ADHD; siblings of those diagnosed with ADHD; and a control group of children without ADHD and without any genetic ties to the disorder.

The researchers found a clear link between rates of breastfeeding and the likelihood of developing ADHD, even when typical risk factors were taken into consideration. Children who were bottle-fed at three months of age were found to be three times more likely to have ADHD than those who were breastfed during the same period. These results have been published in Breastfeeding Medicine.

Understanding genetics and environment

In their study, the researchers compared the breastfeeding histories of children aged six to 12 at Schneider Children’s Medical Center in Israel. The ADHD group comprised children who had been diagnosed at the hospital, the second group included the siblings of the ADHD patients, and the control group included children without neurobehavioral issues who had been treated at the clinics for unrelated complaints.

In addition to describing their breastfeeding habits during the first year of their child’s life, parents answered a detailed questionnaire on medical and demographic data that might also have an impact on the development of ADHD, including marital status and education of the parents, problems during pregnancy such as hypertension or diabetes, birth weight of the child, and genetic links to ADHD.

Taking all risk factors into account, researchers found that children with ADHD were far less likely to be breastfed in their first year of life than the children in the other groups. At three months, only 43 percent of children in the ADHD group were breastfed compared to 69 percent of the sibling group and 73 percent of the control group. At six months, 29 percent of the ADHD group was breastfed, compared to 50 percent of the sibling group and 57 percent of the control group.
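
As a rough sanity check, the three-month figures quoted above imply an unadjusted odds ratio close to the "three times more likely" reported earlier. This is a crude back-of-envelope calculation, not the study's adjusted analysis:

```python
# Crude (unadjusted) odds ratio from the percentages in the article.
# "Exposure" here means bottle-feeding (i.e., NOT breastfed) at three months.
def odds_ratio(p_cases, p_controls):
    """Odds ratio for an exposure seen in p_cases of cases, p_controls of controls."""
    return (p_cases / (1 - p_cases)) / (p_controls / (1 - p_controls))

bottle_fed_adhd = 1 - 0.43   # 43% of the ADHD group were breastfed at 3 months
bottle_fed_ctrl = 1 - 0.73   # 73% of the control group were breastfed
print(round(odds_ratio(bottle_fed_adhd, bottle_fed_ctrl), 2))  # ≈ 3.58
```

The study's published estimate adjusts for the risk factors listed above, so it will not match this crude figure exactly, but the order of magnitude is consistent.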

One of the unique elements of the study was the inclusion of the sibling group, says Dr. Mimouni-Bloch. Although a mother will often make the same breastfeeding choices for all her children, this is not always the case. Some children’s temperaments might be more difficult than their siblings’, making it hard for the mother to breastfeed, she suggests.

Added protection

While researchers do not yet know why breastfeeding has an impact on the future development of ADHD — it could be due to the breast milk itself, or to the special bond formed between mother and baby during breastfeeding, for example — they believe this research shows that breastfeeding can have a protective effect against the development of the disorder, adding one more biological advantage to its list.

Dr. Mimouni-Bloch hopes to conduct a further study on breastfeeding and ADHD, examining children who are at high risk for ADHD from birth and following up in six-month intervals until six years of age, to obtain more data on the phenomenon.

Jul 23, 2013 · 91 notes
#ADHD #breastfeeding #neurobiology #psychology #neuroscience #science
The Love Hormone is Two-Faced

Finding shows oxytocin strengthens bad memories and can increase fear and anxiety

It turns out the love hormone oxytocin is two-faced. Oxytocin has long been known as the warm, fuzzy hormone that promotes feelings of love, social bonding and well-being. It’s even being tested as an anti-anxiety drug. But new Northwestern Medicine® research shows oxytocin also can cause emotional pain, an entirely new, darker identity for the hormone.

Oxytocin appears to be the reason stressful social situations, perhaps being bullied at school or tormented by a boss, reverberate long past the event and can trigger fear and anxiety in the future.

That’s because the hormone actually strengthens social memory in one specific region of the brain, Northwestern scientists discovered.

If a social experience is negative or stressful, the hormone activates a part of the brain that intensifies the memory. Oxytocin also increases the susceptibility to feeling fearful and anxious during stressful events going forward. 

(Presumably, oxytocin also intensifies positive social memories and, thereby, increases feelings of well being, but that research is ongoing.)

The findings are important because chronic social stress is one of the leading causes of anxiety and depression, while positive social interactions enhance emotional health. The research, which was done in mice, is particularly relevant because oxytocin currently is being tested as an anti-anxiety drug in several clinical trials.

“By understanding the oxytocin system’s dual role in triggering or reducing anxiety, depending on the social context, we can optimize oxytocin treatments that improve well-being instead of triggering negative reactions,” said Jelena Radulovic, the senior author of the study and the Dunbar Professor of Bipolar Disease at Northwestern University Feinberg School of Medicine. The paper was published July 21 in Nature Neuroscience.

This is the first study to link oxytocin to social stress and its ability to increase anxiety and fear in response to future stress. Northwestern scientists also discovered the brain region responsible for these effects — the lateral septum — and the pathway or route oxytocin uses in this area to amplify fear and anxiety.

The scientists discovered that oxytocin strengthens negative social memory and future anxiety by triggering an important signaling molecule — ERK (extracellular signal regulated kinases) — that becomes activated for six hours after a negative social experience. ERK causes enhanced fear, Radulovic believes, by stimulating the brain’s fear pathways, many of which pass through the lateral septum. The region is involved in emotional and stress responses.

The findings surprised the researchers, who were expecting oxytocin to modulate positive emotions in memory, based on its long association with love and social bonding.

“Oxytocin is usually considered a stress-reducing agent based on decades of research,” said Yomayra Guzman, a doctoral student in Radulovic’s lab and the study’s lead author. “With this novel animal model, we showed how it enhances fear rather than reducing it and where the molecular changes are occurring in our central nervous system.”

The new research follows three recent human studies with oxytocin, all of which are beginning to offer a more complicated view of the hormone’s role in emotions.

All the new experiments were done in the lateral septum. This region has the highest oxytocin levels in the brain and has high levels of oxytocin receptors across all species from mice to humans.

“This is important because the variability of oxytocin receptors in different species is huge,” Radulovic said. “We wanted the research to be relevant for humans, too.”

Experiments with mice in the study established that 1) oxytocin is essential for strengthening the memory of negative social interactions and 2) oxytocin increases fear and anxiety in future stressful situations.

Experiment 1: Oxytocin Strengthens Bad Memories

Three groups of mice were individually placed in cages with aggressive mice and experienced social defeat, a stressful experience for them. One group was missing its oxytocin receptors, essentially the plug by which the hormone accesses brain cells; the lack of receptors means oxytocin couldn’t act on the mice’s brain cells. The second group had an increased number of receptors, so their brain cells received an amplified oxytocin signal. The third, control group had a normal number of receptors.

Six hours later, the mice were returned to cages with the aggressive mice. The mice missing their oxytocin receptors didn’t appear to remember the aggressive mice and showed no fear. Conversely, when the mice with increased numbers of oxytocin receptors were reintroduced to the aggressive mice, they showed an intense fear reaction and avoided them.

Experiment 2: Oxytocin Increases Fear and Anxiety in Future Stress

Again, the three groups of mice were exposed to the stressful experience of social defeat in the cages of other more aggressive mice. This time, six hours after the social stress, the mice were put in a box in which they received a brief electric shock, which startles them but is not painful. Then 24 hours later, the mice were returned to the same box but did not receive a shock.

The mice missing their oxytocin receptors did not show any enhanced fear when they re-entered the box in which they had received the shock. The second group, which had extra oxytocin receptors, showed much greater fear in the box. The third, control group exhibited an average fear response.

“This experiment shows that after a negative social experience, oxytocin triggers anxiety and fear in a new stressful situation,” Radulovic said.

Jul 23, 2013 · 348 notes
#anxiety #social anxiety #memory #oxytocin #fear #negative emotions #psychology #neuroscience #science
Scientists identify key to learning new words

For the first time, scientists have identified how a brain pathway unique to humans allows us to learn new words.

The average adult’s vocabulary consists of about 30,000 words. This ability seems unique to humans, as even the species closest to us - chimps - manage to learn no more than 100.

It has long been believed that language learning depends on the integration of hearing and repeating words, but the neural mechanisms behind learning new words have remained unclear. Previous studies have shown that this may be related to a pathway in the brain found only in humans, and that humans can learn only words that they can articulate.

Now researchers from King’s College London Institute of Psychiatry, in collaboration with Bellvitge Biomedical Research Institute (IDIBELL) and the University of Barcelona, have mapped the neural pathways involved in word learning among humans. They found that the arcuate fasciculus, a collection of nerve fibres connecting auditory regions at the temporal lobe with the motor area located at the frontal lobe in the left hemisphere of the brain, allows the ‘sound’ of a word to be connected to the regions responsible for its articulation. Differences in the development of these auditory-motor connections may explain differences in people’s ability to learn words. 

The results of the study are published in the journal Proceedings of the National Academy of Sciences (PNAS).

Dr Marco Catani, co-author from the NatBrainLab at King’s College London Institute of Psychiatry said: “Often humans take their ability to learn words for granted. This research sheds new light on the unique ability of humans to learn a language, as this pathway is not present in other species. The implications of our findings could be wide ranging – from how language is taught in schools and rehabilitation from injury, to early detection of language disorders such as dyslexia. In addition these findings could have implications for other disorders where language is affected such as autism and schizophrenia.”

The study involved 27 healthy volunteers. Researchers used diffusion tensor imaging to image the structure of the brain before a word-learning task, and functional MRI to detect the regions of the brain that were most active during the task. They found a strong relationship between the ability to remember words and the structure of the arcuate fasciculus, which connects two brain areas: Wernicke’s area, related to auditory language decoding, and Broca’s area, which coordinates the movements associated with speech and language processing.

In participants who learned words more successfully, the arcuate fasciculus was more myelinated, i.e. the nervous tissue conducted electrical signals faster. In addition, the activity between the two regions was more coordinated in these participants.

Dr Catani concludes: “Now that we understand this is how we learn new words, our concern is that children will have less vocabulary, as much of their interaction is via screen, text and email rather than using their external prosthetic memory. This research reinforces the need for us to maintain the oral tradition of talking to our children.”

Jul 23, 2013 · 206 notes
#language #word learning #arcuate fasciculus #temporal lobe #dyslexia #diffusion tensor imaging #neuroscience #science
MS study targets damage affecting nerves

Multiple sclerosis treatments that repair damage to the brain could be developed thanks to new research.

A study has shed light on how cells are able to regenerate protective sheaths around nerve fibres in the brain.

These sheaths, made up of a substance called myelin, are critical for the quick transmission of nerve signals, enabling vision, sensation and movement, but break down in patients with multiple sclerosis (MS).

In multiple sclerosis patients, the protective layer surrounding nerve fibres is stripped away and the nerves are exposed and damaged.

-Dr Veronique Miron (MRC Centre for Regenerative Medicine at the University of Edinburgh)

Macrophages

The study, by the Universities of Edinburgh and Cambridge, found that immune cells, known as macrophages, help trigger the regeneration of myelin.

Researchers found that following loss of or damage to myelin, macrophages can release a compound called activin-A, which activates production of more myelin.

Approved therapies for multiple sclerosis work by reducing the initial myelin injury – they do not promote myelin regeneration. This study could help find new drug targets to enhance myelin regeneration and help to restore lost function in patients with multiple sclerosis.

-Dr Veronique Miron (MRC Centre for Regenerative Medicine at the University of Edinburgh)

Study

The study, which looked at myelin regeneration in human tissue samples and in mice, is published in Nature Neuroscience.

It was funded by the MS Society, the Wellcome Trust and the Multiple Sclerosis Society of Canada.

Scientists now plan to start further research to look at how activin-A works and whether its effects can be enhanced.

We urgently need therapies that can help slow the progression of MS and so we’re delighted researchers have identified a new, potential way to repair damage to myelin. We look forward to seeing this research develop further.

-Dr Susan Kohlhaas (Head of Biomedical Research at the MS Society)

We are pleased to fund MS research that may lead to treatment benefits for people living with MS. We look forward to advances in treatments that address repair specifically, so that people with MS may be able to manage the unpredictable symptoms of the disease.

-Dr Karen Lee (Vice-President, Research at the MS Society of Canada)

Jul 22, 201368 notes
#MS #macrophages #myelin #activin-A #neurobiology #neuroscience #science
For a healthy brain, don't let the trash pile up

Recycling is not only good for the environment, it’s good for the brain. A study using rat cells indicates that quickly clearing out defective proteins in the brain may prevent loss of brain cells.

image

Results of a study in Nature Chemical Biology suggest that the speed at which damaged proteins are cleared from neurons may affect cell survival and may explain why some cells are targeted for death in neurodegenerative disorders. The research was supported by the National Institute of Neurological Disorders and Stroke (NINDS), part of the National Institutes of Health.

One of the mysteries surrounding neurodegenerative diseases is why some nerve cells are marked for destruction whereas their neighbors are spared. It is especially puzzling because the protein thought to be responsible for cell death is found throughout the brain in many of these diseases, yet only certain brain areas or cell types are affected.

In Huntington’s disease and many other neurodegenerative disorders, misfolded proteins (proteins with abnormal shapes) accumulate inside and around neurons and are thought to damage and kill nearby brain cells. Normally, cells sense the presence of malformed proteins and clear them away before they do any damage. This is regulated by a process called proteostasis, which the cell uses to control protein levels and quality.

In the study, Andrey S. Tsvetkov and his colleagues from the University of California, San Francisco (UCSF) and Duke University, Durham, N.C., showed that differences in the rate of proteostasis may be the clue to understanding why certain nerve cells die in Huntington’s, a genetic brain disorder that leads to uncontrolled movements and death.

To measure how quickly proteins are cleared from cells, the researchers developed a new technique called optical pulse-labeling, which allows them to follow specific proteins in individual living cells. To test the technique, they grew brain cells in a dish and turned on Dendra2, a photoswitchable protein that switches from glowing green to glowing red when hit by a specific wavelength of light. In this way, the researchers could track newly produced Dendra2 (which glows green) and older, photoswitched Dendra2 (which glows red) until the protein was cleared from the cell.
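The "mean lifetime" that this technique yields follows from a simple first-order clearance model: if degradation is roughly exponential, the logarithm of the photoswitched (red) signal declines linearly with time, and the lifetime is the negative reciprocal of that slope. A minimal sketch of the fit, using made-up intensity traces rather than any real Dendra2 data:

```python
import numpy as np

def mean_lifetime(times, intensities):
    """Estimate a protein's mean lifetime (tau) from the decay of the
    photoswitched (red) fluorescence, assuming first-order clearance:
    I(t) = I0 * exp(-t / tau).  A log-linear least-squares fit gives the
    decay rate; tau is its negative reciprocal."""
    slope, _intercept = np.polyfit(times, np.log(intensities), 1)
    return -1.0 / slope

# Synthetic red-channel readings for two cells clearing the same protein
t = np.linspace(0, 48, 25)             # hours after photoswitching
fast = 100 * np.exp(-t / 8.0)          # tau = 8 h  (efficient proteostasis)
slow = 100 * np.exp(-t / 24.0)         # tau = 24 h (sluggish clearance)

print(round(mean_lifetime(t, fast), 1))   # → 8.0
print(round(mean_lifetime(t, slow), 1))   # → 24.0
```

A three- to fourfold spread in fitted tau across otherwise identical cells, as reported below, would indicate genuinely different proteostasis rates rather than measurement noise.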

"Before this new technique, there was no way to look at individual neurons and their capacity to handle proteins. This method provides a real-time readout of how fast proteins are turned over in neurons and gives us a look at some of the mechanisms involved," said Margaret Sutherland, Ph.D., program director at NINDS.

The researchers followed Dendra2 in a set of striatal neurons, which they obtained from rats. The striatum (where striatal neurons are located) is a brain region involved in a number of brain functions including planning movements and is most heavily affected in Huntington’s disease. They discovered that the mean lifetime of the protein (how long it remained in the cell) varied three- to fourfold, suggesting that rates of proteostasis were different among individual neurons. In other words, some cells may process an identical protein much slower than others.

Then, the researchers investigated how cells deal with different forms of huntingtin, the protein involved in Huntington’s. They fused Dendra2 to the end of a normal or mutant version of huntingtin to track how long the protein remained in cells. The mutant version of huntingtin is longer, containing a sequence of three building blocks repeated an abnormal number of times. These extra repeats cause huntingtin to misfold, eventually leading to neuron death and the symptoms of the disease. As predicted, in these experiments the mutant form of huntingtin killed more rat cells than the normal form of the protein did.

The researchers found that the amount of time the mutant protein remained in the cell predicted neuronal survival: shorter mean lifetimes of mutant huntingtin were associated with longer neuronal survival. A shorter mean lifetime indicates that a protein does not remain in the cell for a long time, and that proteostasis is working effectively to clear it away. This suggests that improving proteostasis in Huntington’s brains may improve neuronal survival.

To test this idea, the researchers activated Nrf2, a protein known to regulate protein processing. When Nrf2 was turned on, the mean lifetime of mutant huntingtin was shortened and the neurons lived longer.

"Nrf2 seems like a potentially exciting therapeutic target. It is profoundly neuroprotective in our Huntington’s model and it accelerates the clearance of mutant huntingtin," said Dr. Steven Finkbeiner, senior author of the paper.

Although both striatal and cortical neurons are affected by mutant huntingtin, striatal neurons are more susceptible to cell death. The investigators found that striatal neurons were not as effective as cortical neurons in recognizing and clearing away the mutant protein.

"One surprising finding from these experiments was the significance of single cells’ ability to clear mutant huntingtin. It turned out that this ability largely predicted their susceptibility, whether the neuron came from the most vulnerable region of the brain (the striatum) or from the less vulnerable cortex," said Dr. Finkbeiner. The findings indicate that the damaged proteins may cause neurodegeneration by interfering with the proteostasis system, affecting how quickly they are cleared from neurons.

"The results should remind us that focusing on the disease-causing proteins is only one side of the coin. To understand why some cells die and others are spared, we may need to recognize that there are major, largely unrecognized cell-specific differences in the ways that various types of neurons recognize and dispose of disease-causing proteins," continued Dr. Finkbeiner.

The researchers explored potential mechanisms behind differences in proteostasis. One way that cells normally get rid of proteins is through autophagy — a process in which proteins are packed up into spheres and then broken down. Results in this paper suggested that neurons increased the rate of autophagy when they sensed that the mutant form of huntingtin was accumulating, indicating the autophagy system may be a drug target.

"These findings provide evidence that our brains have powerful coping mechanisms to deal with disease-causing proteins. The fact that some of these diseases don’t cause symptoms we can detect until the fourth or fifth decade of life, even when the gene has been present since birth, suggests that those mechanisms are pretty good," said Dr. Finkbeiner.

Future research is needed to determine why coping mechanisms fail as brain cells age and how neurons in the healthy brain keep the proteostasis system functioning.

"New research methods that help us understand how individual neurons function will increase our understanding of central nervous system disorders and help identify new treatments. It is critical to continue working on the methods such as those described in this paper," said Dr. Sutherland.

Jul 22, 2013110 notes
#neurodegenerative diseases #brain cells #nerve cells #proteins #proteostasis #neuroscience #science
Jul 21, 2013272 notes
#science #stem cells #Oct4 gene #reprogrammed cells #chemicals #regenerative medicine #neuroscience
Stem cell research reveals clues to brain disease

The development of new drugs for improving treatment of Alzheimer’s and Parkinson’s disease is a step closer after recent research into how stem cells migrate and form circuits in the brain.

The results from a study by researchers at The University of Auckland’s Centre for Brain Research may hold important clues into why there is less plasticity in brains affected by Parkinson’s and Alzheimer’s disease, and links to insulin resistance and diabetes.

The major five-year project to understand how stem cells start and stop migrating in the brain has also helped to unlock the secrets of how stem cells migrate during development and in adulthood.

The study revealed new information on how connectivity between brain cells is improved or worsened, says senior study author, Dr Maurice Curtis who conceived and directed the research. The experiments were carried out at the Centre for Brain Research laboratories by Dr Hector Monzo. Collaborators included a director of the CBR, Distinguished Professor Richard Faull, Dr Thomas Park, Dr Birger Dieriks, Deidre Jansson and Professor Mike Dragunow.

“We have begun testing novel drug compounds that target how polysialic acid is removed from the cell, in the hope of improving neuron connectivity,” says Dr Curtis.

He explains that stem cells in the brain are immature brain cells that must migrate from their birthplace to a position in the brain where they will connect with other brain cells, turn into adult brain cells (neurons) and become part of the brain’s circuitry.

“Even once the neuron has found its location, the neuron’s tentacles (or dendrites) need to forage to find other neurons to connect with to form circuits. This would be easy except that in the adult brain the cells are surrounded by a fairly rigid matrix (extracellular matrix) and so migration or foraging becomes almost impossible in this high friction environment.”

“The way the cell overcomes this ‘friction’ is by placing large amounts of a special slippery molecule called ‘polysialic acid-neural cell adhesion molecule’ onto the cell surface,” says Dr Curtis. “This allows the cell to migrate or forage with only a fraction of the friction it once had and this also reduces the energy requirements of the cell.”

Once the cell has migrated to its destination, the slippery coating is removed and the cell becomes locked in place ready to connect with other cells. In the case of the dendritic foraging, the polysialic acid must be removed in order for the dendrite to connect with another cell (synapse formation).

“We have known for at least 20 years that this process occurs but despite extensive studies by a number of groups internationally we have been in the dark about what controls this process,” he says. “Studies in my laboratory have demonstrated what happens to the slippery molecules once the cell no longer needs them.”

There were three possibilities for this process:

  • that enzymes cut the molecules off the outside of the cell
  • that friction wears them off the cell, or
  • that the cell internalises the slippery molecules and recycles them for future use.

“For the past five years, we have systematically studied how this process is controlled,” says Dr Curtis. “Our findings have demonstrated that cells internalise the slippery molecule after receiving two specific cues.”

One of these cues is from collagen which makes up part of the rigid structure outside of the cell and the other is from a gaseous molecule called nitric oxide which triggers the outer membrane of the cell to internalise the slippery molecules.

“What we also discovered is that when there is an increased amount of insulin and insulin-like growth factor 1 (which has some similar functions to insulin) present in the culture, the cell cannot internalise the slippery molecules and instead they remain on the cell surface.”

“The key to the breakthrough was in determining that the process by which the polysialic acid is added to the cell surface was so persistent that it needed to be stopped in order to study how the polysialic acid was removed,” says Dr Curtis. “This required extensive trialling of many different cell growth conditions, enzyme concentrations and growing the cells in many different extracellular matrices.”

This is interesting because it is well known that in Parkinson’s disease and Alzheimer’s disease the brain is less sensitive to insulin, he says.

“In our studies in cells the insulin blocks the removal of polysialic acid and therefore the cell cannot connect properly and form synapses with other nearby cells.”

“This may hold major clues to why there is less plasticity in brains affected by Parkinson’s and Alzheimer’s disease in adults as well as helping to unlock the secrets of how stem cells migrate during development of the brain”, says Dr Curtis.

The Gus Fisher Postdoctoral Fellowship, the Auckland Medical Research Foundation and the Manchester Trust were the main sponsors of this research work.

The study results were published online this month in an ‘ahead of print’ version of The Journal of Neurochemistry.

Jul 21, 201392 notes
#stem cells #neurodegenerative diseases #insulin #brain cells #neurons #neuroscience #medicine #science
Technique inactivates Down-causing chromosome

Borrowing a trick from nature, researchers have switched off the extra chromosome that causes Down syndrome in cells taken from patients with the condition.

Though not a cure, the technique, reported July 17 in Nature, has already produced insights into the disorder. In the long run it might even make the flaw that causes Down syndrome correctable through gene therapy.

“Gene therapy is now on the horizon,” says Elizabeth Fisher, a molecular geneticist at University College London. “But that horizon is very far away.”

Down syndrome, also called trisomy 21, occurs when people inherit three copies of chromosome 21 instead of the usual two. It is the most common chromosomal condition, affecting around one in every 700 babies born in the United States. People with the disorder typically have both physical and cognitive complications of having an extra chromosome.

“Down syndrome has been one of those disorders where people say, ‘Oh, there’s nothing you can do about it,’ ” says Jeanne Lawrence, a chromosome biologist and genetic counselor at the University of Massachusetts Medical School in Worcester, who led the study with colleagues Lisa Hall and Jun Jiang.

The researchers decided to see whether they could shut down the extra chromosome by drawing on a biological process called X inactivation. Women have two X chromosomes and men have only one X and a Y. To halve the amount of X chromosome products, female cells shut down one copy. Cells do that using a chunk of RNA called XIST, which is made by one X chromosome but not the other. The RNA works by pulling in proteins that essentially board up the chromosome like an abandoned building. The other X stays on by making a different RNA.

Lawrence and Hall thought that if they put XIST on another chromosome, it might shut that one down too. So Jiang put the gene for XIST onto one of the three copies of chromosome 21 carried by stem cells grown from a man with Down syndrome. That copy of the chromosome got switched off.

“It’s kind of surprising that it wasn’t done before. I’m smacking my own forehead and saying, ‘duh,’ ” says Roger Reeves, a geneticist at Johns Hopkins University.

One idea about why an extra chromosome 21 causes cognitive problems is that it may slow down the growth of brain cells. Jiang grew nerve cells from the Down patient’s stem cells to see how cells with one shut-down chromosome developed compared with cells bearing three active copies. The cells with only two working chromosomes grew faster, forming clusters of neurons in a day or two, while the uncorrected cells needed four or five days.

The work is an enormous step forward in Down syndrome research, Fisher says, and “may take us much closer to understanding the molecular basis of the disorder.” The technique could allow researchers to figure out which genes are involved in Down syndrome and how extra copies affect cells and ultimately the body, she says.

Reeves wants to use the technology in animal experiments, a critical step in determining whether it could find use as gene therapy for people with Down syndrome. He plans to work with Lawrence’s group to switch off the extra chromosome in mice engineered to have a disorder that simulates some features of Down syndrome.

But Reeves doubts that scientists could use the method to switch off the extra chromosome in every cell in the body. Doing so would probably require gene therapy at a very early stage of pregnancy, something scientists don’t know how to do. “I just don’t see how we would get there from where we are today,” Reeves says.

Such universal silencing of the extra chromosome may be necessary to forestall developmental problems. But other problems associated with Down syndrome might be prevented or reversed by shutting down the extra chromosome after birth. For instance, people with Down syndrome are at high risk of developing childhood leukemia and of getting Alzheimer’s disease. Gene therapy to turn off the extra chromosome in the bone marrow or the brain might prevent those problems.

Therapeutic possibilities are still far in the future and may never pan out, says William Mobley, a neurologist and neuroscientist at the University of California, San Diego. “We have to move cautiously and deliberately and not say that a cure for Down syndrome is on the horizon,” he says. “It’s not true, but gosh is there excitement that progress is being made.”

Jul 21, 201388 notes
#down syndrome #gene therapy #trisomy #chromosome 21 #brain cells #genetics #science
Jul 20, 201370 notes
#pregnancy #alcohol #alcohol consumption #fetal development #cognitive function #neuroscience #science
Is sexual addiction the real deal?

Controversy exists over what some mental health experts call “hypersexuality,” or sexual “addiction.” Namely, is it a mental disorder at all, or something else? It failed to make the cut in the recently updated Diagnostic and Statistical Manual of Mental Disorders, or DSM-5, considered the bible for diagnosing mental disorders. Yet sex addiction has been blamed for ruining relationships, lives and careers.

Now, for the first time, UCLA researchers have measured how the brain behaves in so-called hypersexual people who have problems regulating their viewing of sexual images. The study found that the brain response of these individuals to sexual images was not related in any way to the severity of their hypersexuality but was instead tied only to their level of sexual desire.

In other words, hypersexuality did not appear to explain brain differences in sexual response any more than simply having a high libido, said senior author Nicole Prause, a researcher in the department of psychiatry at the Semel Institute for Neuroscience and Human Behavior at UCLA.

"Potentially, this is an important finding," Prause said. "It is the first time scientists have studied the brain responses specifically of people who identify as having hypersexual problems."

The study appears in the current online edition of the journal Socioaffective Neuroscience and Psychology.

A diagnosis of hypersexuality or sexual addiction is typically associated with people who have sexual urges that feel out of control, who engage frequently in sexual behavior, who have suffered consequences such as divorce or economic ruin as a result of their behaviors, and who have a poor ability to reduce those behaviors.

But, said Prause and her colleagues, such symptoms are not necessarily representative of an addiction — in fact, non-pathological, high sexual desire could also explain this cluster of problems.

One way to tease out the difference is to measure the brain’s response to sexual-image stimuli in individuals who acknowledge having sexual problems. If they indeed suffer from hypersexuality, or sexual addiction, their brain response to visual sexual stimuli could be expected to be higher, in much the same way that the brains of cocaine addicts have been shown to react to images of the drug in other studies.

The study involved 52 volunteers: 39 men and 13 women, ranging in age from 18 to 39, who reported having problems controlling their viewing of sexual images. They first filled out four questionnaires covering various topics, including sexual behaviors, sexual desire, sexual compulsions, and the possible negative cognitive and behavioral outcomes of sexual behavior. Participants had scores comparable to individuals seeking help for hypersexual problems.

While viewing the images, the volunteers were monitored using electroencephalography (EEG), a non-invasive technique that measures brain waves, the electrical activity generated by neurons when they communicate with each other. Specifically, the researchers measured event-related potentials, brain responses that are the direct result of a specific cognitive event.

"The volunteers were shown a set of photographs that were carefully chosen to evoke pleasant or unpleasant feelings," Prause said. "The pictures included images of dismembered bodies, people preparing food, people skiing — and, of course, sex. Some of the sexual images were romantic images, while others showed explicit intercourse between one man and one woman."

The researchers were most interested in the response of the brain about 300 milliseconds after each picture appeared, commonly called the “P300” response. This basic measure has been used in hundreds of neuroscience studies internationally, including studies of addiction and impulsivity, Prause said. The P300 response is higher when a person notices something new or especially interesting to them.
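As a rough illustration of how an event-related potential such as the P300 is computed (not the authors' actual pipeline; the sampling rate, windows, and toy data here are invented), one extracts stimulus-locked EEG epochs, baseline-corrects each, averages across trials, and reads off the mean voltage in a post-stimulus window:

```python
import numpy as np

def p300_amplitude(eeg, onsets, fs, window=(0.25, 0.40), baseline=0.10):
    """Average stimulus-locked epochs of a single EEG channel, subtract
    each epoch's pre-stimulus baseline, and return the mean voltage of
    the grand-average ERP inside the P300 window (seconds post-stimulus)."""
    pre, post = int(baseline * fs), int(0.6 * fs)
    epochs = []
    for onset in onsets:
        seg = eeg[onset - pre: onset + post].astype(float)
        epochs.append(seg - seg[:pre].mean())       # baseline correction
    erp = np.mean(epochs, axis=0)                   # average across trials
    lo = int((baseline + window[0]) * fs)
    hi = int((baseline + window[1]) * fs)
    return erp[lo:hi].mean()

# Toy data: background noise plus a positive deflection ~300 ms after
# each stimulus, mimicking an enhanced P300 to a salient image.
fs = 250                                            # samples per second
rng = np.random.default_rng(0)
eeg = rng.normal(0.0, 1.0, 60 * fs)                 # one minute of "EEG"
onsets = np.arange(2 * fs, 58 * fs, fs)             # one stimulus per second
for onset in onsets:
    peak = onset + int(0.30 * fs)
    eeg[peak - 5: peak + 5] += 5.0                  # simulated P300 deflection

print(round(p300_amplitude(eeg, onsets, fs), 2))    # well above the noise floor
```

Comparing this amplitude between stimulus categories (or correlating it with questionnaire scores, as in the study) is what lets researchers ask whether a group's P300 tracks a trait like sexual desire.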

The researchers expected that P300 responses to the sexual images would correspond to a person’s sexual desire level, as shown in previous studies. But they further predicted that P300 responses would relate to measures of hypersexuality. That is, in those whose problem regulating their viewing of sexual images could be characterized as an “addiction,” the P300 reaction to sexual images could be expected to spike.

Instead, the researchers found that the P300 response was not related to hypersexual measurements at all; there were no spikes or decreases tied to the severity of participants’ hypersexuality. So while there has been much speculation about the effect of sexual addiction or hypersexuality in the brain, the study provided no evidence to support any difference, Prause said.

"The brain’s response to sexual pictures was not predicted by any of the three questionnaire measures of hypersexuality," she said. "Brain response was only related to the measure of sexual desire. In other words, hypersexuality does not appear to explain brain responses to sexual images any more than just having a high libido."

But debate continues over whether sex addiction is indeed an addiction. A study published in 2012 by Prause’s colleague Rory Reid, a UCLA assistant professor of psychiatry, supported the reliability of the proposed DSM-5 diagnostic criteria for hypersexual disorder. However, Prause notes, that study was not focused on the validity of sex addiction or impulsivity, and did not use any biophysiological data in the analysis.

"If our study can be replicated," she said, "these findings would represent a major challenge to existing theories of a sex ‘addiction.’ "

Jul 20, 2013179 notes
#sexual addiction #hypersexuality #brain response #brain activity #psychology #neuroscience #science
Gene mutation in dogs offers clues for neural tube defects in humans

A gene related to neural tube defects in dogs has for the first time been identified by researchers at the University of California, Davis, and University of Iowa.

image

The researchers also found evidence that the gene may be an important risk factor for human neural tube defects, which affect more than 300,000 babies born each year around the world, according to the U.S. Centers for Disease Control and Prevention. Neural tube defects, including anencephaly and spina bifida, are caused by the incomplete closure or development of the spine and skull.

The new findings appear this week in the journal PLOS Genetics.

“The cause of neural tube defects is poorly understood but has long been thought to be associated with genetic, nutritional and environmental factors,” said Noa Safra, lead author on the study and a postdoctoral fellow in the laboratory of Professor Danika Bannasch in the UC Davis School of Veterinary Medicine.

She noted that dogs provide an excellent biomedical model because they receive medical care comparable to what humans receive, share in a home environment and develop naturally occurring diseases that are similar to those found in humans. More specifically, several conditions associated with neural-tube defects are known to occur naturally in dogs. All DNA samples used in the study were taken from household pets, rather than laboratory animals, Safra said.

She and colleagues carried out genome mapping in four Weimaraner dogs affected by spinal dysraphism, a naturally occurring spinal-cord disorder, and in 96 Weimaraners that had no neural tube defects. Spinal dysraphism, previously reported in the Weimaraner breed, causes symptoms that include impaired motor coordination or partial paralysis in the legs, abnormal gait, a crouched stance and abnormal leg or paw reflexes.

Analysis of a specific region on canine chromosome eight led the researchers to a mutation in a gene called NKX2-8, one of a group of “homeobox genes” known to regulate patterns of anatomical development in the embryo.

The researchers determined that the NKX2-8 mutation occurs in the Weimaraner breed with a frequency of 1.4 percent, or 14 mutations in every 1,000 dogs.

Additionally, they tested nearly 500 other dogs from six different breeds that had been reported to be clinically affected by neural tube defects, but did not find copies of the NKX2-8 gene mutation among the non-Weimaraner dogs.

“The data indicate that this mutation does not appear as a benign mutation in some breeds, while causing defects in other breeds,” Safra said. “Our results suggest that the NKX2-8 mutation is a ‘private’ mutation in Weimaraners that is not shared with other breeds.”

The researchers say that identification of such a breed-specific gene may help veterinarians diagnose spinal dysraphism in dogs and enable Weimaraner breeders to use DNA screening to select against the mutation when developing their breeding plans.

In an effort to investigate a potential role for the NKX2-8 mutation in cases of neural tube defects in people, the researchers also sequenced 149 unrelated samples from human patients with spina bifida. They found six cases in which the patients carried mutations of the NKX2-8 gene but stress that further studies are needed to confirm whether these mutations are responsible for the diagnosed neural tube defects.

Jul 20, 201364 notes
#neural tube defects #anencephaly #spina bifida #genetics #dogs #medicine #neuroscience #science
Jul 20, 2013176 notes
#neurons #axons #axonal conduction #neuroimaging #neuroscience #science
Jul 19, 2013179 notes
#muscle movement #motor control #prosthetic limbs #robotics #neuroscience #science
No oxytocin benefit for autism

The so-called trust hormone, oxytocin, may not improve the symptoms of children with autism, a large study led by UNSW researchers has found.

Professor Mark Dadds, of the UNSW School of Psychology, says previous research suggested that oxytocin – a hormone with powerful effects on brain activity linked to the formation of social bonds – could have benefits for children with the disorder.

“Many parents of children with autism are already obtaining and using oxytocin nasal spray with their child, and clinical trials of the spray’s effects are underway all over the world. Oxytocin has been touted as a possible new treatment, but its effects may be limited,” Professor Dadds says.

Autism is a complex condition of unknown cause in which children exhibit reduced interest in other people, impaired social communication skills and repetitive behaviours.

To determine its suitability as a general treatment, Professor Dadds’ team conducted a randomised controlled clinical trial of 38 boys with autism, aged seven to 16. Half were given a nasal spray of oxytocin on four consecutive days.

The study has been accepted for publication in the Journal of Autism and Developmental Disorders.

“We found that, compared to a placebo, oxytocin did not significantly improve emotion recognition, social interaction skills, repetitive behaviours, or general behavioural adjustment,” says Professor Dadds.

“This is in contrast to a handful of previous smaller studies which have shown some positive effects on repetitive behaviours, social memory and emotion processing.

“These studies, however, were limited by having small numbers of participants and/or by looking at the effects of single doses of oxytocin on specific behaviours or cognitive effects while the participants had the oxytocin in their system.

“The results of our much larger study suggest caution should be exercised in recommending nasal oxytocin as a general treatment for young people with autism.”

The boys in the new study were assessed twice before treatment, three times during the treatment week, immediately afterwards and three months later, with a parent present. Factors such as eye contact with the parent, responsiveness, warmth, speech, positive body language, repetitive behaviours, and recognition of facial emotions were observed.

Research in healthy people shows oxytocin can increase levels of trust and eye gazing and improve their identification of emotions in others.

One possibility is that many children with autism have impaired oxytocin receptor systems that do not respond properly, Professor Dadds says. But there may be a subgroup of children for whom oxytocin could be beneficial, and research is needed to determine who responds to it and how best to deliver it.

Jul 19, 2013110 notes
#autism #oxytocin #social interaction #social skills #psychology #neuroscience #science
Jul 19, 201378 notes
#calcium #calcium ions #brain mapping #neurotransmission #neural activity #neurons #neuroscience #science
Jul 19, 2013226 notes
#concussion #head injury #TBI #football #risk weighted cumulative exposure #neurology #neuroscience #science
Biochemical mapping helps explain who will respond to antidepressants

Duke Medicine researchers have identified biochemical changes in people taking antidepressants – but only in those whose depression improves. These changes occur in a neurotransmitter pathway connected to the pineal gland, the part of the endocrine system that controls the sleep cycle, suggesting an added link between sleep, depression and treatment outcomes. The study, published July 17, 2013, in the journal PLOS ONE, uses an emerging science called pharmacometabolomics, which measures and maps hundreds of chemicals in the blood to define the mechanisms underlying disease and to develop new treatment strategies based on a patient’s metabolic profile.

"Metabolomics is teaching us about the differences in metabolic profiles of patients who respond to medication, and those who do not," said Rima Kaddurah-Daouk, PhD, associate professor of psychiatry and behavioral sciences at Duke Medicine and leader of the Pharmacometabolomics Research Network.

"This could help us to better target the right therapies for patients suffering from depression who can benefit from treatment with certain antidepressants, and identify, early on, patients who are resistant to treatment and should be placed on different therapies."

Major depressive disorder – a form of depression characterized by a severely depressed mood that persists two weeks or more – is one of the most prevalent mental disorders in the United States, affecting 6.7% of the adult population in a given year.

Selective serotonin reuptake inhibitors (SSRIs) are the most commonly prescribed antidepressants for major depressive disorder, but only some patients benefit from SSRI treatment. Others may respond to placebo, while some may not find relief from either. This variability in response creates a dilemma for treating physicians, whose only option is to try one drug at a time and wait several weeks to determine whether a patient will respond to that particular SSRI.

Recent studies by the Duke team have used metabolomics tools to map biochemical pathways implicated in depression and have begun to distinguish which patients respond to treatment with an SSRI or placebo based on their metabolic profiles. These studies have pointed to several metabolites on the tryptophan metabolic pathway as potential contributing factors to whether patients respond to antidepressants.

Tryptophan is metabolized in different ways. One pathway leads to serotonin and subsequently to melatonin and an array of melatonin-like chemicals called methoxyindoles produced in the pineal gland. In the current study, the researchers analyzed levels of metabolites within branches of the tryptophan pathway and correlated changes with treatment outcomes.

Seventy-five patients with major depressive disorder were randomized to take sertraline (Zoloft) or placebo in the double-blind trial. After one week and four weeks of taking the SSRI or placebo, the researchers measured improvement in symptoms of depression to determine response to treatment, and blood samples were taken and analyzed using a metabolomics platform built to measure neurotransmitters.

The researchers observed that 60 percent of patients taking the SSRI responded to the treatment, and 50 percent of those taking placebo also responded. Several metabolic changes in the tryptophan pathway leading to melatonin and methoxyindoles were seen in patients taking the SSRI who responded to the treatment; these changes were not found in those who did not respond to the antidepressant.

The results suggest that serotonin metabolism in the pineal gland may play a role in the underlying cause of depression and its treatment outcomes, based on the biochemical changes that were seen to be associated with improvements in depression.

"This study revealed that the pineal gland is involved in mechanisms of recovery from a depressed state," said Kaddurah-Daouk. "We have started to map serotonin which is believed to be implicated in depression, but now realize that it may not be serotonin itself that is important in depression recovery. It could be metabolites of serotonin that are produced in the pineal gland that are implicated in sleep cycles.

"Shifting utilization of tryptophan metabolism from kynurenine to production of melatonin and other methoxyindoles seems important for treatment response but some patients do not have this regulation mechanism. We can now start to think about ways to correct this."

The identification of a metabolic signature for patients who have a milder form of depression and who can improve with use of placebo is critically important for streamlining clinical trials with antidepressants. The Duke team is the first to start to define in depth early biochemical effects of treatment with SSRI and placebo, and a molecular basis for why antidepressants take several weeks to start showing benefit.

In future studies, researchers may collect blood samples from patients during both the day and night to define how the circadian cycle, changes in sleep patterns, neurotransmitters and hormonal systems are modified in those who respond and do not respond to SSRIs and placebo. This can lead to more effective treatment strategies.

Jul 19, 2013 · 153 notes
#depression #antidepressants #serotonin #pineal gland #neurotransmitters #medicine #neuroscience #science
Nano Drug Crosses Blood-Brain Tumor Barrier, Targets Brain Tumor Cells and Blood Vessels

An experimental drug in early development for aggressive brain tumors can cross the blood-brain tumor barrier, kill tumor cells and block the growth of tumor blood vessels, according to a study led by researchers at the Ohio State University Comprehensive Cancer Center – Arthur G. James Cancer Hospital and Richard J. Solove Research Institute (OSUCCC – James).

The laboratory and animal study also shows how the agent, called SapC-DOPS, targets tumor cells and blood vessels. The findings support further development of the drug as a novel treatment for brain tumors.

Glioblastoma multiforme is the most common and aggressive form of brain cancer, with a median survival of about 15 months. A major obstacle to improving treatment for the 3,470 cases of the disease expected in the United States this year is the blood-brain barrier, the name given to the tight fit of cells that make up the blood vessels in the brain. That barrier protects the brain from toxins in the blood but also keeps drugs in the bloodstream from reaching brain tumors.

“Few drugs have the capacity to cross the tumor blood-brain barrier and specifically target tumor cells,” says principal investigator Balveen Kaur, PhD, associate professor of neurological surgery and chief of the Dardinger Laboratory of Neurosciences at the OSUCCC – James. “Our preclinical study indicates that SapC-DOPS does both and inhibits the growth of new tumor blood vessels, suggesting that this agent could one day be an important treatment for glioblastoma and other solid tumors.”

The findings were published in a recent issue of the journal Molecular Therapy.

SapC-DOPS (saposin C-dioleoylphosphatidylserine) is a nanovesicle drug that has shown activity against glioblastoma, pancreatic cancer and other solid tumors in preclinical studies. The nanovesicles fuse with tumor cells, causing them to self-destruct by apoptosis.

Key findings of the study, which used two brain-tumor models, include:

  • SapC-DOPS binds with exposed patches of the phospholipid phosphatidylserine (PtdSer) on the surface of tumor cells;
  • Blocking PtdSer on cells inhibited tumor targeting;
  • SapC-DOPS strongly inhibited brain-tumor blood-vessel growth in cell and animal models, probably because these cells also have high levels of exposed PtdSer;
  • Hypoxic cells were sensitized to killing by SapC-DOPS.

“Based on our findings, we speculate that SapC-DOPS could have a synergistic effect when combined with chemotherapy or radiation therapy, both of which are known to increase the levels of exposed PtdSer on cancer cells,” Kaur says.

Jul 18, 2013 · 100 notes
#blood-brain barrier #blood vessels #glioblastoma #brain cancer #SapC-DOPS #science
Information in brain cells' electrical activity combines memory, environment, and state of mind

The information carried by the electrical activity of neurons is a mixture of stored memories, environmental circumstances, and current state of mind, scientists have found in a study of laboratory rats. The findings, which appear in the journal PLoS Biology, offer new insights into the neurobiological processes that give rise to knowledge and memory recall.

The study was conducted by Eduard Kelemen, a former graduate student and post-doctoral associate at the State University of New York (SUNY) Downstate Medical Center, and André Fenton, a professor at New York University’s Center for Neural Science and Downstate Medical Center. Kelemen is currently a postdoctoral fellow at University of Tuebingen in Germany.

The idea that recollection is not merely a replay of our stored experiences dates back to Plato. He believed that memory retrieval was, in fact, a much more intricate process—a view commonly accepted by today’s cognitive psychologists and couched in the theory of constructive recollection. The theory posits that during memory retrieval, information across different experiences may combine during recall to form a single experience. Such a process may explain the prevalence of false memories. For example, studies have shown that people mistakenly recalled seeing a school bus in a movie if the bus was mentioned after they watched the movie.

In addition, other scholarship has shown that a subject’s mindset can also influence the retrieved information. For example, looking at a house from the perspective of a homebuyer or a burglar leads to different recollections—potential purchasers may recall the house’s leaky roof while would-be burglars may remember where the jewelry is kept.

But while the psychological contours of retrieval are well-documented, very little is known about the neural activity that underlies this process.

With this in mind, Fenton and Kelemen centered their study on the neurophysiological processes rats employ as they solve problems that require memory retrieval. To do so, they employed techniques developed during the last two decades. These involve monitoring the electrical activity of neurons in the rats’ hippocampus—the part of the brain used to encode new memories and retrieve old ones. By spotting certain types of neuronal activity, researchers have historically been able to perform what amounts to a mind reading exercise to decode what the rat is thinking and even comprehend the specifics of the rats’ memory retrieval.

In their experiments, Fenton and Kelemen tested the viability of a concept, “cross-episode retrieval”: the expression, in a given circumstance, of brain activity that had characterized a previous, distinct experience.

“Such cross-episode expression of past activity can create opportunities for generating novel associations and new information that was never directly experienced,” the authors wrote.

To test their hypotheses, rats were placed in a stable, circular arena, then in a rotating, circular arena of the same size, followed by a return to the stable arena. In the rotating arena condition, the surface turned slowly, making it necessary for the rat to think about its location either in terms of the rotating floor or in terms of the stationary room.

Overall, the results showed distinct neural activity between the stable and rotating conditions. However, during the rotating task, the researchers intermittently observed “cross-episode retrieval”—that is, at times, neurons expressed patterns of electrical activity under the rotating-arena condition that were similar to those seen in the stable-arena condition. Notably, cross-episode retrieval occurred more frequently when the rotating arena was about to complete a full rotation and return to the same angular position as the stable arena, demonstrating that retrieval is influenced by the environment.
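
The idea of a moment's activity "matching" one condition's pattern can be sketched as template comparison on population firing-rate vectors. Everything below (the cosine measure, the decision margin, the toy data) is an illustrative assumption, not the authors' actual analysis:

```python
import math

def cosine(u, v):
    """Cosine similarity between two firing-rate vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den

def retrieval_events(vectors, stable_template, rotating_template, margin=0.1):
    """Flag time bins of candidate 'cross-episode retrieval': moments during
    the rotating task whose population vector matches the stable-arena
    template better (by at least `margin`) than the rotating-arena one."""
    return [t for t, v in enumerate(vectors)
            if cosine(v, stable_template) - cosine(v, rotating_template) > margin]

# Toy demo with hypothetical 3-neuron firing-rate vectors:
stable, rotating = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]
moments = [[1.0, 0.1, 0.0], [0.0, 1.0, 0.0]]
events = retrieval_events(moments, stable, rotating)
print(events)  # bins that look like the stable-arena pattern
```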

To show that cross-episode retrieval was influenced by current state of mind, Fenton and Kelemen took advantage of an earlier finding from their experiments: during the arena rotation, neural activity switches between signaling the rat’s location in the stationary room and the rat’s location on the rotating arena floor. Cross-episode retrieval was also more likely when neuronal activity represented the position of the rat in the stationary room than when it represented positions that rotate with the arena. This showed that retrieval is influenced by internal cognitive variables that are encoded by hippocampal discharge—i.e., a state of mind.

“These experiments demonstrate novel, key features of constructive human episodic memory in rat hippocampal discharge,” explained Fenton, “and suggest a neurobiological mechanism for how experiences of different events that are separate in time can nonetheless commingle and recombine in the mind to generate new information that can sometimes amount to valuable, creative insight and knowledge.”

Jul 18, 2013 · 92 notes
#memory #memory retrieval #neurons #hippocampus #psychology #neuroscience #science
Potential neurological treatments often advance to clinical trials on shaky evidence

Clinical trials of drug treatments for neurological diseases such as Alzheimer’s and Parkinson’s often fail because the animal studies that preceded them were poorly designed or biased in their interpretation, according to a new study from an international team of researchers. More stringent requirements are needed to assess the significance of animal studies before testing the treatments in human patients, the researchers say.

The team — led by John Ioannidis, MD, DSc, a professor of medicine at the Stanford University School of Medicine and an expert in clinical trial design — assessed the results of more than 4,000 animal studies in 160 meta-analyses of potential treatments for neurological disorders, including Alzheimer’s disease, Parkinson’s disease, stroke, spinal-cord injury and a form of multiple sclerosis. (A meta-analysis is a study that compiles and assesses information and conclusions from many independent experiments of a treatment, or intervention, for a particular condition.)

They determined that only eight of the 160 studies of potential treatments yielded the statistically significant, unbiased data necessary to support advancing the treatment to clinical trials. In contrast, 108 of the treatments were deemed at least somewhat effective at the time they were published.

Ioannidis and his collaborators at the University of Edinburgh in Scotland and the University of Ioannina School of Medicine in Greece say that animal studies of potential interventions can be made more efficient and reliable by increasing average sample size, being aware of statistical bias, publishing negative results and making all the results of all experiments on the effectiveness of a particular treatment — regardless of their outcome — freely accessible to scientists.

"Some researchers have postulated that animals may not be good models for human diseases," said Ioannidis. "I don’t agree. I think animal studies can be useful and perfectly fine. The problem is more likely to be related to the selective availability of information about the studies conducted on animals." Although the researchers focused here on neurological disorders, they believe it is likely that similar bias exists in animal studies of other types of disorders.

Ioannidis, who directs the Stanford Prevention Research Center, is the senior author of the research, published online in PLoS Biology on July 16. Lecturer Konstantinos Tsilidis, PhD, and postgraduate fellow Orestis Panagiotou, MD, of the University of Ioannina share lead authorship of the study. Panagiotou is currently a researcher at the National Cancer Institute’s Division of Cancer Epidemiology and Genetics.

Ioannidis is known for his efforts to strengthen the way that research is planned, carried out and reported. He was called “one of the world’s foremost experts on the credibility of medical research” in a profile published in The Atlantic magazine in 2010. He outlined some of the problems he observed in a 2005 essay in PLoS Medicine titled, “Why most published research findings are false.” The essay is one of the most-downloaded articles in the history of the Public Library of Science, according to the journal’s media relations office.

For the new study, Ioannidis and his colleagues evaluated results in a database of the thousands of animal studies compiled over the years through the CAMARADES initiative (Collaborative Approach to Meta-Analysis and Review of Animal Data in Experimental Studies), led by professor Malcolm MacLeod, PhD, from the University of Edinburgh, who is also a co-author of the study.

The team compared the number of experiments in the meta-analyses that would have been expected to yield positive results (based on their predicted statistical power) with the actual number of experiments with published positive results. The difference was striking: 919 expected versus the 1,719 that were published, implying that either negative results were not published, or that the results of the experiments were interpreted too optimistically.
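
The arithmetic behind that comparison can be sketched as a simple excess-significance check: sum each study's predicted power to get the expected count of positive results, then ask how far the published count deviates. The power values below are made up for illustration and are not the study's data:

```python
import math

def excess_significance(powers, observed_positive):
    """Compare observed vs expected counts of 'positive' studies.

    powers: estimated statistical power of each study (probability a true
    effect reaches significance). Expected positives = sum of powers; the
    variance treats each study as an independent Bernoulli trial. A large
    positive z suggests unpublished negatives or over-optimistic analyses.
    """
    expected = sum(powers)
    variance = sum(p * (1 - p) for p in powers)  # sum of Bernoulli variances
    z = (observed_positive - expected) / math.sqrt(variance)
    return expected, z

# Hypothetical field: 40 studies, each with 50% power, 32 published positives.
expected_n, z = excess_significance([0.5] * 40, observed_positive=32)
print(round(expected_n, 1), round(z, 2))  # far more positives than power predicts
```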

"We saw that it was very common for these interventions to have published evidence that they would work," said Ioannidis. "It was extremely common to have results that suggest they would be effective in humans."

Furthermore, nearly half (46 percent) of the 160 meta-analyses showed evidence of small-study effects: the tendency of small studies, which use fewer animals, to find an intervention more effective than larger studies with many animals do.
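
One common way to quantify small-study effects (not necessarily the method this study used) is an Egger-style regression of standardized effect size on precision; an intercept far from zero indicates that imprecise studies report inflated effects. A minimal sketch with hypothetical numbers:

```python
def egger_intercept(effects, ses):
    """Egger-style check for small-study effects: regress the standardized
    effect (effect / SE) on precision (1 / SE) by ordinary least squares
    and return the intercept. An intercept well above zero suggests small,
    imprecise studies are reporting larger effects."""
    xs = [1 / s for s in ses]                    # precision
    ys = [e / s for e, s in zip(effects, ses)]   # standardized effect
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx

# Hypothetical meta-analysis where effect size grows with the standard
# error (i.e., smaller studies report bigger effects):
ses = [0.1, 0.2, 0.4, 0.8]
effects = [0.2 + 1.0 * s for s in ses]
b0 = egger_intercept(effects, ses)
print(round(b0, 2))  # intercept pulled well above zero
```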

Ioannidis speculated that a reluctance to publish negative findings (that is, those that conclude that a particular intervention did not work any better than the control treatment) and a perhaps unconscious desire on the part of researchers to find a promising treatment has colored the field of neurological research. Obscuring access to studies that conclude a particular treatment is ineffective, while also publishing positive results that are likely to be statistically flawed, tilts the perception toward the potential effectiveness of an intervention and encourages unwarranted human clinical trials.

"There are no standard rules that guide a decision to move from animal studies into human clinical trials," said Ioannidis, who also holds the C.F. Rehnborg Professorship at Stanford. "Sometimes interventions are tested in humans with very little evidence that they may be effective. Of the 160 analyses we studied, only eight had what we would call strong evidence of potential effectiveness with no hint of bias in the preliminary animal studies. And of these eight, only two have given positive results in humans."

Ioannidis believes the development of consortiums of groups of researchers studying a particular intervention, coupled with the free sharing of all data about its effectiveness, or lack thereof, is a good first step in reducing bias in animal studies.

"Under the current conditions, only a tiny proportion of interventions that have published some promising results in animals have shown to be at all effective in humans. For example, while dozens of treatments on ischemic or hemorrhagic stroke seem to work in the animal literature, almost none of them have worked in humans," said Ioannidis. "It is hard to believe we could not improve upon that translation record. If we raise the bar for moving into human trials, centralize researchers’ efforts and make all results available, it will be much easier for researchers to know whether they have a potential winner, and it would increase the efficiency of human clinical trials enormously."

Jul 17, 2013 · 51 notes
#animal studies #neurodegenerative diseases #CAMARADES initiative #medicine #neuroscience #science
Distinctive brain blood flow patterns associated with sexual dysfunction

Premenopausal women who aren’t interested in sex and are unhappy about this reality have distinctive blood flow patterns in their brains in response to explicit videos compared to women with normal sexual function, researchers report.

A study of 16 women – six with normal sexual function and 10 with clear symptoms of dysfunction – showed distinct differences in activation of brain regions involved in making and retrieving memories and in monitoring attention to one’s own response to sexual stimuli, researchers report in the journal Fertility and Sterility.

Up to 20 percent of women may have this form of sexual dysfunction, called hypoactive sexual desire disorder, for which there are no proven therapies, said Dr. Michael P. Diamond, Chairman of the Department of Obstetrics and Gynecology at the Medical College of Georgia at Georgia Regents University.

Researchers hope that a clearer understanding of physiological differences in these women will provide novel therapy targets as well as a method to objectively assess therapies, said Diamond, the study’s senior author.

"There are site-specific alterations in blood flow in the brains of individuals with hypoactive sexual disorders versus those with normal sexual function," Diamond said. "This tells me there is a physiologic means of assessing hypoactive sexual desire and that as we move forward with therapeutics, whether it’s counseling or medications, we can look to see whether changes occur in those regions."

Viagra, developed in the 1990s as a way to increase the heart rate of sick babies, was approved by the Food and Drug Administration in 1998 to also treat male impotence, a major cause of sexual dysfunction. While several more options for men have been developed since, no FDA-approved options are available for women experiencing hypoactive sexual desire, Diamond said. He notes that a possible critical flaw in developing and evaluating therapies for women may be the inability to objectively measure results, other than with a woman’s self-reporting of its impact on sexual activity.

Years ago, Diamond, a reproductive endocrinologist, became frustrated by the inability to help these women. In fact, many women did not bother discussing the issue with their physicians, possibly because it’s an awkward problem with no clear solutions, he said.

While still at Wayne State University, he and his colleagues began looking for objective measures of a woman’s sexual response, identifying sexually explicit film clips, then using functional magnetic resonance imaging, which measures real-time brain activation in response to a stimulus, to look at responses.

Their latest study links acquired hypoactive sexual desire disorder to a distinct pattern of blood flow in the brain, with significant activation of cortical structures involved in attention and reflection about emotion and mental state. Researchers noted that paying more attention to one’s response to sexual stimuli is already implicated in sexual dysfunction. They also note activation of the anterior cingulate gyrus, an area involved in a broad range of functions including homeostasis, pain, depression, and apathy. Another key area was the amygdala, which has a central role in processing emotion, learning, and memory.

Women with normal sexual function showed significantly greater activation of areas such as the right thalamus – a sort of relay station for handling sensory and motor input – that also plays a role in sexual arousal. They also experienced activation of the parahippocampal gyrus, involved in making and recalling memories. Interestingly, this area has been found to be significantly more activated in women with surgical menopause receiving hormone therapy.

Diamond notes that the official diagnosis of the sexual disorder requires distress regarding persistent disinterest in sex. Study participants were heterosexual, in stable relationships and had previously viewed sexually explicit images. Those with sexual dysfunction had a mean age of 37 versus 29 in the control group. Part of assessing blood flow patterns included also measuring baseline responses to neutral videos.

Next steps include taking these measurements in a larger number of women and beginning to use brain blood flow patterns to assess therapies, Diamond said.

Jul 17, 2013 · 75 notes
#blood flow #sexual dysfunction #hypoactive sexual desire disorder #anterior cingulate gyrus #parahippocampal gyrus #neuroscience #science
Inner Speech Speaks Volumes About the Brain

Whether you’re reading the paper or thinking through your schedule for the day, chances are that you’re hearing yourself speak even if you’re not saying words out loud. This internal speech — the monologue you “hear” inside your head — is a ubiquitous but largely unexamined phenomenon. A new study looks at a possible brain mechanism that could explain how we hear this inner voice in the absence of actual sound.

In two experiments, researcher Mark Scott of the University of British Columbia found evidence that a brain signal called corollary discharge — a signal that helps us distinguish the sensory experiences we produce ourselves from those produced by external stimuli — plays an important role in our experiences of internal speech.

The findings from the two experiments are published in Psychological Science, a journal of the Association for Psychological Science.

Corollary discharge is a kind of predictive signal generated by the brain that helps to explain, for example, why other people can tickle us but we can’t tickle ourselves. The signal predicts our own movements and effectively cancels out the tickle sensation.

And the same mechanism plays a role in how our auditory system processes speech. When we speak, an internal copy of the sound of our voice is generated in parallel with the external sound we hear.

“We spend a lot of time speaking and that can swamp our auditory system, making it difficult for us to hear other sounds when we are speaking,” Scott explains. “By attenuating the impact our own voice has on our hearing — using the ‘corollary discharge’ prediction — our hearing can remain sensitive to other sounds.”

Scott speculated that the internal copy of our voice produced by corollary discharge can be generated even when there isn’t any external sound, meaning that the sound we hear when we talk inside our heads is actually the internal prediction of the sound of our own voice.

If corollary discharge does in fact underlie our experiences of inner speech, he hypothesized, then the sensory information coming from the outside world should be cancelled out by the internal copy produced by our brains if the two sets of information match, just like when we try to tickle ourselves.

And this is precisely what the data showed. The impact of an external sound was significantly reduced when participants said a syllable in their heads that matched the external sound. Their performance was not significantly affected, however, when the syllable they said in their head didn’t match the one they heard.

These findings provide evidence that internal speech makes use of a system that is primarily involved in processing external speech, and may help shed light on certain pathological conditions.

“This work is important because this theory of internal speech is closely related to theories of the auditory hallucinations associated with schizophrenia,” Scott concludes.

Jul 17, 2013 · 230 notes
#auditory system #auditory perception #internal speech #inner voice #schizophrenia #neuroscience #science
Scientists identify neural origins of hot flashes in menopausal women

A new study from neuroscientists at the Wayne State University School of Medicine provides the first new insights in years into the neural origins of hot flashes in menopausal women. The study may inform and eventually lead to new treatments for those who experience the sudden but temporary episodes of body warmth, flushing and sweating.

The paper, “Temporal Sequencing of Brain Activations During Naturally Occurring Thermoregulatory Events,” by Robert Freedman, Ph.D., professor of psychiatry and behavioral neurosciences, founder of the Behavioral Medicine Laboratory and a member of the C.S. Mott Center for Human Growth and Development, and his collaborator, Vaibhav Diwadkar, Ph.D., associate professor of psychiatry and behavioral neurosciences, appears in the June issue of Cerebral Cortex, an Oxford University Press journal.

“The idea of understanding brain responses during thermoregulatory events has spawned many studies where thermal stimuli were applied to the skin. But hot flashes are unique because they are internally generated, so studying them presents unique challenges,” said Freedman, the study’s principal investigator. “Our participants had to lie in the MRI scanner while being heated between two body-size heating pads for up to two hours while we waited for the onset of a hot flash. They were heroic in this regard and the study could not have been conducted without their incredible level of cooperation.”

“Menopause and hot flashes are a significant women’s health issue of widespread general interest,” Diwadkar added. “However, understanding of the neural origins of hot flashes has remained poor. The question has rarely been assessed with in vivo functional neuroimaging. In part, this paucity of studies reflects the technical limitations of objectively identifying hot flashes while symptomatic women are being scanned with MRI. Nothing like this has been published because this is a very difficult study to do.”

During the course of a single year, 20 healthy, symptomatic postmenopausal women ages 47 to 58 who reported six or more hot flashes a day were scanned at the School of Medicine’s Vaitkevicius Imaging Center, located in Detroit’s Harper University Hospital.

The researchers collected skin conductance levels to identify the onset of flashes while the women were being scanned. Skin conductance is an electrical measure of sweating. The women were connected to a simple circuit passing a very small current across their chests, Diwadkar said. Changes in levels allowed researchers to identify a hot flash onset and analyze the concurrently acquired fMRI data to investigate the neural precedents and correlates of the event.
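
Onset detection from skin conductance can be sketched as a simple rise-over-window rule. A threshold rise of 2 micro-mho within 30 seconds is a convention reported in the hot-flash literature; the function name and all values here are illustrative defaults, not the authors' exact method:

```python
def flash_onsets(sc, fs=1.0, window_s=30, rise=2.0):
    """Flag candidate hot-flash onsets in a skin-conductance trace.

    sc: conductance samples (micro-mho); fs: sampling rate in Hz.
    A candidate onset is any point where conductance rises by at least
    `rise` over the following `window_s` seconds; after a detection we
    skip past the window so one event is not counted repeatedly.
    """
    w = int(window_s * fs)
    onsets, i = [], 0
    while i + w < len(sc):
        if sc[i + w] - sc[i] >= rise:
            onsets.append(i / fs)  # onset time in seconds
            i += w                 # skip past this event
        else:
            i += 1
    return onsets

# Synthetic trace: flat baseline, then an abrupt 3 micro-mho rise.
trace = [0.0] * 40 + [3.0] * 40
onsets = flash_onsets(trace)
print(onsets)
```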

The researchers focused on regions like the brain stem because its subregions, such as the medullary and dorsal raphe, are implicated in thermal regulation, while forebrain regions, such as the insula, have been implicated in the personal perception of how someone feels. They showed that activity in some brain areas, such as the brain stem, begins to rise before the actual onset of the hot flash.

“Frankly, evidence of fMRI-measured rise in the activity of the brain stem even before women experience a hot flash is a stunning result. When this finding is considered along with the fact that activity in the insula only rises after the experience of the hot flash, we gain some insight on the complexity of brain mechanisms that mediate basic regulatory functions,” Diwadkar said.

These results point to the plausible origins of hot flashes in specific brain regions. The researchers believe it is the first such demonstration in the academic literature.

They are now evaluating the network-based interactions between the brain regions by using more complex modeling of the fMRI data. “We think that our study highlights the value of using well-designed fMRI paradigms and analyses in understanding clinically relevant questions,” Diwadkar said.

The researchers also are exploring possibilities for integrating imaging with treatment to examine whether specific pharmacotherapies for menopause might alter regional brain responses.

Jul 16, 2013 · 82 notes
#aging #menopause #neuroimaging #thermal regulation #fMRI #neuroscience #science
Path of Plaque Buildup in Brain Shows Promise as Early Biomarker for Alzheimer's Disease

The trajectory of amyloid plaque buildup—clumps of abnormal proteins in the brain linked to Alzheimer’s disease—may serve as a more powerful biomarker for early detection of cognitive decline than the total amount of plaque, researchers from Penn Medicine’s Department of Radiology suggest in a new study published online July 15 in the journal Neurobiology of Aging.

Amyloid plaque that starts to accumulate relatively early in the temporal lobe, compared to other areas and in particular to the frontal lobe, was associated with cognitively declining participants, the study found. “Knowing that certain brain abnormality patterns are associated with cognitive performance could have pivotal importance for the early detection and management of Alzheimer’s,” said senior author Christos Davatzikos, PhD, professor in the Department of Radiology, the Center for Biomedical Image Computing and Analytics, at the Perelman School of Medicine at the University of Pennsylvania.

Today, memory decline and Alzheimer’s—which 5.4 million Americans currently live with—are often assessed with a variety of tools, including physical and biofluid tests and neuroimaging of total amyloid plaque in the brain. Past studies have linked higher amounts of the plaque in dementia-free people with greater risk of developing the disorder. However, it has more recently been shown that nearly a third of people with plaque on their brains never show signs of cognitive decline, raising questions about its specific role in the disease.

Now, Dr. Davatzikos and his Penn colleagues, in collaboration with a team led by Susan M. Resnick, PhD, Chief, Laboratory of Behavioral Neuroscience at the National Institute on Aging (NIA), used Pittsburgh compound B (PiB) brain scans from the Baltimore Longitudinal Study of Aging’s Imaging Study and discovered a stronger association between memory decline and spatial patterns of amyloid plaque progression than the total amyloid burden.

“It appears to be more about the spatial pattern of this plaque progression, and not so much about the total amount found in brains. We saw a difference in the spatial distribution of plaques among cognitively declining and stable patients whose cognitive function had been measured over a 12-year period. They had similar amounts of amyloid plaque, just in different spots,” Dr. Davatzikos said. “This is important because it potentially answers questions about the variability seen in clinical research among patients presenting plaque. It accumulates in different spatial patterns for different patients, and it’s that pattern growth that may determine whether your memory declines.”

The team, including first author Rachel A. Yotter, PhD, a postdoctoral researcher in the Section for Biomedical Image Analysis, retrospectively analyzed the PET PiB scans of 64 patients from the NIA’s Baltimore Longitudinal Study of Aging whose average age was 76. For the study, the researchers created a unique picture of patients’ brains by combining and analyzing PET images measuring the density and volume of amyloid plaque and its spatial distribution within the brain. The radiotracer PiB allowed investigators to see temporal changes in amyloid deposition.

Those images were then compared with participants’ scores on the California Verbal Learning Test (CVLT), among other tests, to measure longitudinal cognitive decline. The group was then divided into two subgroups: the most stable and the most declining individuals (26 participants).

Despite the lack of a significant difference in the total amount of amyloid in the brain, the spatial patterns of the two groups (stable and declining) were different, with the former showing relatively early accumulation in the frontal lobes and the latter in the temporal lobes.
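The core idea is arithmetic: two brains can carry the same total amyloid burden while distributing it very differently across regions. The toy sketch below illustrates that point with made-up regional values; the region names, numbers, and the temporal-to-frontal ratio feature are all illustrative assumptions, not the study's actual imaging pipeline.

```python
# Toy illustration (not the study's pipeline): two hypothetical participants
# with equal total amyloid burden but different regional distributions.

# Hypothetical regional PiB uptake values (arbitrary units).
stable = {"frontal": 2.0, "temporal": 0.8, "parietal": 1.0, "occipital": 0.2}
declining = {"frontal": 0.8, "temporal": 2.0, "parietal": 1.0, "occipital": 0.2}

def total_burden(participant):
    """Sum of regional values: the conventional 'global amyloid' measure."""
    return sum(participant.values())

def temporal_frontal_ratio(participant):
    """A crude spatial-pattern feature: temporal- vs. frontal-leading buildup."""
    return participant["temporal"] / participant["frontal"]

# Total burden cannot separate the two profiles...
assert total_burden(stable) == total_burden(declining)

# ...but the spatial feature can.
print(temporal_frontal_ratio(stable))     # 0.4 -> frontal-leading profile
print(temporal_frontal_ratio(declining))  # 2.5 -> temporal-leading profile
```

This is why a pattern-based biomarker can discriminate where a single global number cannot: the discriminating information lives in where the plaque accumulates, not how much of it there is.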

A particular area of the brain may be affected earlier or later depending on the amyloid trajectory, according to the authors, which in turn would affect cognitive impairment. Areas affected early by the plaque include the lateral temporal and parietal regions, while the occipital lobe and motor cortices are spared until later in disease progression.

“This finding has broad implications for our understanding of the relationship between cognitive decline and resistance and amyloid plaque location, as well as the use of amyloid imaging as a biomarker in research and the clinic,” said Dr. Davatzikos. “The next step is to investigate more individuals with mild cognitive impairment, and to further investigate the follow-up scans of these individuals via the BLSA study, which might shed further light on its relevance for early detection of Alzheimer’s.”

Jul 16, 201339 notes
#alzheimer's disease #dementia #cognitive decline #amyloid plaques #temporal lobe #neuroscience #science
When fear factors in

A little bit of learned fear is a good thing, keeping us from making risky, foolish decisions or falling over and over again into the same trap. But new research from neuroscientists and molecular biologists at USC shows that a missing brain protein may be the culprit in cases of severe over-worry, where fear persists even when there is nothing to be afraid of.

image

In a study appearing the week of July 15 in the Proceedings of the National Academy of Sciences, the researchers examined mice lacking the enzymes monoamine oxidase A and B (MAO A/B), whose genes sit next to each other in both the human and mouse genomes. Prior research has found an association between deficiencies of these enzymes in humans and developmental disabilities along the autism spectrum, such as clinical perseveration, the inability to change or modulate actions in response to social context.

“These mice may serve as an interesting model to develop interventions to these neuropsychiatric disorders,” said University Professor and senior author Jean Shih, Boyd & Elsie Welin Professor of Pharmacology and Pharmaceutical Sciences at the USC School of Pharmacy and the Keck School of Medicine of USC. “The severity of the changes in the MAO A/B knockout mice compared to MAO A knockout mice supports the idea that the severity of autistic-like features may be correlated to the amounts of monoamine levels, particularly at early developmental stages.”

Shih is a world leader in understanding the neurobiological and biochemical mechanisms behind such behaviors as aggression and anxiety. In this latest study, Shih and her co-investigators — including lead author Chanpreet Singh, a USC doctoral student at the time of the research who is now at the California Institute of Technology (Caltech), and Richard Thompson, USC University Professor Emeritus and Keck Professor of Psychology and Biological Sciences at the USC Dornsife College of Letters, Arts and Sciences — expanded their past research on MAO A/B, which regulates neurotransmitters known as monoamines, including serotonin, norepinephrine and dopamine.

Comparing mice without MAO A/B with their wild-type littermates, the researchers found significant differences in how the knockout mice processed fear and other types of learning. Knockout and wild-type mice were placed in a new, neutral environment and given a mild electric shock. All mice showed learned fear the next time they were tested in the same environment, with the MAO A/B knockout mice displaying a greater degree of fear.

But while wild-type mice continued to explore other new environments freely after the trauma, mice without the MAO A/B enzymes generalized their phobia to other contexts — their fear spilled over onto places where they should have had no reason to be afraid.

“The neural substrates processing fear in the brain are very different in these mice,” Singh said. “Enhanced learning in the wrong context is a disorder and is exemplified by these mice. Their brain is not letting them forget. In a survival issue, you need to be able to forget things.”

The mice without MAO A and MAO B also learned eye-blink conditioning much more quickly than wild-type mice, an effect that has also been noted in autistic patients but not in mice missing only one of these enzymes.

Importantly, the mice without MAO A/B did not display any differences in spatial learning or object recognition, the researchers found, “but in their ability to learn an emotional event, the [MAO A/B knockout mice] are very different than wild types,” Singh said.

He continued: “When both enzymes are missing, it significantly increases the levels of neurotransmitters, which causes developmental changes, which leads to differential expression of receptors that are very important for synaptic plasticity — a measure of learning — and to behavior that is quite similar to what we see along the autism spectrum.”

Jul 16, 201384 notes
#autism #learning #monoamines #synaptic plasticity #genetics #neuroscience #science
Jul 15, 201360 notes
#cooperation #prisoner’s dilemma #spatial model #evolutionary simulation #neuroscience #science
Jul 15, 2013122 notes
#decision making #internal noise #EEG activity #brain activity #neuroscience #science
Jul 15, 2013182 notes
#heart rate variability #music #choir singing #heart activity #heart rate #ANS #neuroscience #science
Foraging for thought – new insights into our working memory

We take it for granted that our thoughts are in constant turnover. Metaphors like “stream of consciousness” and “train of thought” imply steady, continuous motion. But is there a mechanism inside our heads that drives this? Is there something compelling our attention to move on to new ideas instead of dwelling in the same spot forever?

image

A research team led by Dr Matthew Johnson in the School of Psychology at The University of Nottingham Malaysia Campus (UNMC) may have discovered part of the answer. They have pinpointed an effect that makes people turn their attention to something new rather than dwelling on their most recent thoughts. The research, which has been published in the academic journal Psychological Science, could have implications for studying disorders like autism and ADHD.

Dr Johnson said: “We have discovered a very promising paradigm. The effect is strong and replicates easily – you could demonstrate it in any psychology lab in the world. The work is still in its early stages but I think this could turn out to be a very important part of our understanding of how and why our thoughts work the way they do.”

The paper “Foraging for Thought: An Inhibition-of-Return-Like Effect Resulting From Directing Attention Within Working Memory” sheds new light on what makes us turn our attention to things we haven’t recently thought about rather than ones we have. It was carried out in collaboration with Yale University, Princeton University, The Ohio State University, and Manhattanville College.

The “inhibition of return” effect is well established in visual attention. At certain time scales, people are slower to return their attention to a location they have just attended and much quicker to focus on a new location. Some have interpreted this effect as a “foraging facilitator,” a process that encourages organisms to visit new locations over previously visited ones when exploring a new environment or performing a visual search.

However, in this new study, the researchers weren’t focusing on visual search, but on the process of thought itself. Participants were shown either two words or two pictures, and when the items disappeared, they were instructed to turn their attention briefly to one of the items they were just shown and ignore the other. Immediately afterwards they were asked to identify either the item they had just thought about, or the one they had ignored. For both pictures and words the participants were quicker to react to the item they had ignored.
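The trial structure described above can be sketched as a small simulation. Everything below is a hedged illustration, not the published protocol: the item names, timings, number of trials, and especially the 50 ms response-time penalty for attended items are assumptions built in to mimic the reported direction of the effect (slower responses to the just-attended item).

```python
from dataclasses import dataclass
from statistics import mean
import random

# Minimal sketch of one trial: two items appear, one is cued for a brief
# thought, then a probe tests either the attended or the ignored item.
@dataclass
class Trial:
    items: tuple   # two words (or two pictures) shown briefly
    cued: int      # index of the item the participant is told to think about
    probed: int    # index of the item shown at test
    rt_ms: float   # response time to identify the probe

def probe_condition(trial: Trial) -> str:
    """Classify the probe as testing the attended or the ignored item."""
    return "attended" if trial.probed == trial.cued else "ignored"

# Generate hypothetical trials with a fixed seed so the sketch is reproducible.
random.seed(0)
trials = [
    Trial(("cat", "shoe"),
          cued=random.randint(0, 1),
          probed=random.randint(0, 1),
          rt_ms=0.0)
    for _ in range(200)
]

# Assumed effect: responses to the just-attended item are ~50 ms slower,
# mirroring the inhibition-of-return-like pattern the study reports.
for t in trials:
    base = 650 if probe_condition(t) == "attended" else 600
    t.rt_ms = base + random.gauss(0, 30)

by_cond = {"attended": [], "ignored": []}
for t in trials:
    by_cond[probe_condition(t)].append(t.rt_ms)

print({cond: round(mean(rts)) for cond, rts in by_cond.items()})
```

The counterintuitive part of the real result is exactly the direction hard-coded here: identification is faster for the ignored item, the opposite of what simple priming would predict.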

Dr Johnson said: “The effect was shocking. When we began we expected to find the exact opposite – that thinking about something will make it easier to identify. We were initially disappointed – but when the effect was replicated over multiple experiments we realised we were onto something new and exciting.”

Critically, the effect is temporary; on a later memory test participants remembered attended items better than ignored ones.

Dr Johnson said: “That’s important. If thinking about things made us worse at remembering them long-term, it would make no sense for real-world survival. That’s why we think we’ve tapped into something fundamental about how we think in the moment – a possible mechanism keeping our thoughts moving onto new things, and not getting stuck.”

The researchers have more experiments planned to explore this effect. They say the new task could have implications for studying disorders like autism and ADHD, where attention may persist too long or move on too easily, as well as conditions with more general cognitive impairments, such as schizophrenia and ageing-related dementia.

Future studies planned also include applying cognitive neuroscience techniques to determine the effect’s underlying neural foundations.

Jul 15, 201388 notes
#working memory #autism #ADHD #attention #psychology #neuroscience #science
Jul 14, 2013664 notes
#science #flatworms #regeneration #memory RNA #memory #epigenetics #neuroscience