Researchers at the University Department of Neurology at MedUni Vienna have identified a gene behind an epilepsy syndrome that could also play an important role in other idiopathic (genetically caused) epilepsies. Using so-called “next-generation sequencing”, which can identify genetic changes within a few days, the team established that the CNTN2 gene is defective in this type of epilepsy.

The finding comes from a team led by Elisabeth Stögmann, working in collaboration with Cairo’s Ain Shams University and the Helmholtz Centre Munich, who studied a particular Egyptian family in which five affected children were born from the marriage of a healthy man to his likewise healthy second cousin. The children suffer from a specific epilepsy syndrome in which several different types of epileptic seizures occur. This constellation has the “advantage”, according to Stögmann, that both alleles of the gene (the two copies each person carries) exhibit the defect: “As a result the defect becomes symptomatic and identifiable.”
Some 20,000 to 25,000 genes, including all the protein-coding ones, were sequenced for the study, and a mutation was found in the CNTN2 gene. CNTN2 performs an important function in anchoring potassium channels to the synapses. The mutation makes it impossible to produce this protein, and as a consequence the potassium channels no longer remain fixed at the synapses. The researchers suspect that the epilepsy in this family is triggered by the resulting altered function of the potassium channels.
This discovery, now published in the leading journal “Brain”, is providing the stimulus for further research to investigate this particular gene in other epilepsy patients as well. Approximately one percent of the population suffers from active epilepsy, with regularly recurring epileptic seizures. The lifetime risk of suffering a single epileptic seizure is approximately four to five percent. Genetic factors play a major part in the occurrence of epilepsies.
A team led by Dr. Alex Parker, a professor of pathology and cellular biology and a researcher at the University of Montreal Hospital Research Centre (CRCHUM), has identified an important therapeutic target for alleviating the symptoms of Lou Gehrig’s disease, also known as amyotrophic lateral sclerosis (ALS), and other related neurodegenerative disorders such as Alzheimer’s disease, Parkinson’s disease and Huntington’s disease.

In a study published in the online version of Neurobiology of Disease, the team both confirmed the importance of this new target and identified a series of compounds that can be used to attenuate the dysregulation of one of the important cellular processes leading to neuronal dysfunction and ultimately to brain cell death.
Although scientists are unclear about the causes of ALS, they have made headway in identifying the cellular processes potentially implicated in disease onset and progression. One such process, which has attracted researchers’ interest, involves the endoplasmic reticulum (ER), a component of cells that plays an important role in maintaining cell health. In collaboration with Dr. Pierre Drapeau at the University of Montreal, and using worm and zebrafish models of ALS, Parker’s team not only confirmed that an incapacitated ER leads to the motor neuron death typical of ALS, but also identified a series of compounds that alleviate the fatal consequences of a defective ER.
“Since Riluzole, the only approved compound for treating ALS, has just a modest effect on slowing disease progression, we set out to test a number of other compounds, and in so doing we discovered that they work by compensating for the defective ER,” explains Dr. Parker. The compounds in question, Methylene blue, Salubrinal, Guanabenz and Phenazine, were each tested individually and in different combinations.
With the exception of Phenazine, these compounds have known benefits for treating neurodegenerative diseases. Parker and his team showed that each of these compounds reduces paralysis and neurodegeneration and that each acts on different parts of the ER pathway to achieve neuroprotection. More importantly, the researchers found that using these compounds in different combinations can enhance their therapeutic effects.
“These results are quite encouraging,” says Dr Parker, “and have given us a much better understanding of ER’s role in ALS as well as showing the way for improved treatments”. Parker’s team plans to test and confirm these findings with more complex animal models, a necessary step in developing medication that can be of benefit to human beings.
Research has implications for understanding memory and imagination
While studying rats’ ability to navigate familiar territory, Johns Hopkins scientists found that one particular brain structure uses remembered spatial information to imagine routes the rats then follow. Their discovery has implications for understanding why damage to that structure, called the hippocampus, disrupts specific types of memory and learning in people with Alzheimer’s disease and age-related cognitive decline. And because these mental trajectories guide the rats’ behavior, the research model the scientists developed may be useful in future studies on higher-level tasks, such as decision-making.
The details of their work were published online in the journal Nature on April 17.

“For the first time, we believe we have evidence that before a rat returns to an important place, it actually plans out its path,” says David Foster, Ph.D., assistant professor of neuroscience at the Johns Hopkins University School of Medicine. “The rat finds that location in its mind’s eye and knows how to get there.”
Foster and his team found that, at least for the purposes of navigation, the “mind’s eye” is located in the hippocampus, which is composed of two banana-shaped segments under the cerebral cortex on both sides of the brain. It is best known for creating memories. In people with Alzheimer’s, it is one of the first parts of the brain to sustain damage.
The Foster lab experiments focused on a group of neurons in the hippocampus called place cells because they are known to fire when animals are at a given location within a given environment. What was not known, Foster says, was how and when the brain uses that information.
By miniaturizing an existing technology, Foster and a postdoc in his lab, Brad Pfeiffer, Ph.D., were able to implant 20 microwires into each side of the hippocampus of four rats. The tiny wires let them record electrical activity from as many as 250 individual place cells at the same time, more than ever achieved before.
Over a two-week training period, the rats became familiar with the testing area, which was surrounded by a variety of objects so that the rats could tell where they were in relation to them. The space was 2 meters square, with 36 tiny “dishes” placed at regular intervals in a grid. One dish at a time would be filled with the rats’ reward: liquid chocolate.
The rats’ navigation tests involved as many as 40 sets of alternating “odd” and “even” trials per day. The odd trials required the rats to “forage” through the arena to find a chocolate-filled dish in a random location; the even trials required the rats to return each time to a “home” dish to receive their reward. While the rats fulfilled their tasks, the researchers recorded the firing of their place cells.
They found that as a rat travels randomly through the box without knowing where it needs to go, different combinations of place cells fire at each location along its path. The same set of cells fires every time the rat passes through the same spot. These unique combinations of firings “mark” each spot in the rat’s brain and can be reconstructed, when needed, into what amounts to a virtual map.
When a rat is about to go to a specific location, e.g., “home,” place cells in its hippocampus fire in a sequence that creates a predictive path, which the rat then follows, somewhat like Hansel and Gretel following an imagined bread crumb trail.
Foster says that “unlike a Hansel and Gretel bread crumb trail, which only allows you to leave by the same route by which you entered, the rats’ memories of their surroundings are flexible and can be reconstructed in a way that allows them to ‘picture’ how to quickly get from point A to point B.” In order to do this, he says, the rats must already be familiar with the terrain between point A and point B, but, like a GPS, they don’t have to have previously started at point A with the goal of reaching point B.
Foster says the elderly can get lost easily, and research on aged mice shows that their place cells can fail to distinguish between different environments. His team’s research suggests that defective place cells would also affect a person’s ability to “look ahead” in their imaginations to predict a way home. Similarly, he says, higher-order brain functions, like problem solving, also require people to “look ahead” and imagine themselves in a different scenario.
“The hippocampus seems to be directing the movement of the rats, making decisions for them in real time,” says Foster. “Our model allows us to see this happening in a way that’s not been possible before. Our next question is, what will these place cells do when we put obstacles in the rats’ paths?”
Close family members of people with Alzheimer’s disease are more than twice as likely as those without a family history to develop silent buildup of brain plaques associated with Alzheimer’s disease, according to researchers at Duke Medicine.
The study, published online in the journal PLOS ONE on April 17, 2013, confirms earlier findings on a known genetic variation that increases one’s risk for Alzheimer’s, and raises new questions about other genetic factors involved in the disease that have yet to be identified.
An estimated 25 million people worldwide have Alzheimer’s disease, and the number is expected to triple by 2050. More than 95 percent of these individuals have late-onset Alzheimer’s, which usually occurs after the age of 65. Research has shown that Alzheimer’s begins years to decades before it is diagnosed, with changes to the brain measurable through a variety of tests.
Family history is a known risk factor and predictor of late-onset Alzheimer’s disease, and studies suggest a two- to four-fold greater risk for Alzheimer’s in individuals with a mother, father, brother or sister who develops the disease. These first-degree relatives share roughly 50 percent of their genes with the affected family member. Common genetic variations, including changes to the APOE gene, account for around 50 percent of the heritability of Alzheimer’s, but the disease’s other genetic roots remain unexplained.
“In this study, we sought to understand whether simply having a positive family history, in otherwise normal or mildly forgetful people, was enough to trigger silent buildup of Alzheimer’s plaques and shrinkage of memory centers,” said senior author P. Murali Doraiswamy, professor of psychiatry and medicine at Duke.
Duke neuroscience research trainee Erika J. Lampert, Doraiswamy and colleagues analyzed data from 257 adults, ages 55 to 89, both cognitively healthy and with varying levels of impairment. The participants were part of the Alzheimer’s Disease Neuroimaging Initiative, a national study working to define the progression of Alzheimer’s through biomarkers.
The researchers looked at participants’ age, gender and family history of the disease, with a positive family history defined as having a parent or sibling with Alzheimer’s. This information was compared with cognitive assessments and other biological tests, including APOE genotyping, MRI scans measuring hippocampal volume, and studies of three different pathologic markers (Aβ42, t-tau, and t-tau/Aβ42 ratio) found in cerebrospinal fluid.
As expected, the researchers found that a variation in the APOE gene associated with a greater risk and earlier onset of Alzheimer’s was overrepresented in participants with a family history of the disease. However, other biological differences were also seen in those with a family history, suggesting that unidentified genetic factors may influence the disease’s development before the onset of dementia.
Nearly half of all healthy people with a positive family history would have met the criteria for preclinical Alzheimer’s disease based on measurements of their cerebrospinal fluid, but only about 20 percent of those without a family history would have met such criteria.
“We already knew that family history increases one’s risk for developing Alzheimer’s, but we now are showing that people with a positive family history may also have higher levels of Alzheimer’s pathology earlier, which could be a reason why they experience a faster cognitive decline than those without a family history,” Lampert said.
The findings may influence the design of future studies developing new diagnostic tests for Alzheimer’s, as researchers may choose to exclude those with a positive family history – a group that has historically volunteered to participate in studies to better understand the disease – as healthy controls, given that they are more likely to develop Alzheimer’s pathology.
“Our study shows the power of a simple one-minute questionnaire about family history to predict silent brain changes,” Doraiswamy said. “In the absence of full understanding of all genetic risks for late-onset Alzheimer’s, family history information can serve as a risk stratification tool for prevention research and personalizing care.” He encouraged those with a known positive family history to seek out clinical trials specific to preventing the disease.
Cigarette smoking is the leading cause of preventable deaths globally. Unfortunately smoking cessation is difficult, with more than 90% of attempts to quit resulting in relapse.

There are a growing number of available methods that can be tried in the effort to reduce smoking, including medications, behavioral therapies, hypnosis, and even acupuncture. All attempt to alter brain function or behavior in some way.
A new study published in Biological Psychiatry now reports that a single 15-minute session of high frequency transcranial magnetic stimulation (TMS) over the prefrontal cortex temporarily reduced cue-induced smoking craving in nicotine-dependent individuals.
Nicotine activates the dopamine system and reward-related regions in the brain. Nicotine withdrawal naturally results in decreased activity of these regions, which has been closely associated with craving, relapse, and continued nicotine consumption.
One of the critical reward-related regions is the dorsolateral prefrontal cortex, which can be targeted using a brain stimulation technology called transcranial magnetic stimulation. Transcranial magnetic stimulation is a non-invasive procedure that uses magnetic fields to stimulate nerve cells. It does not require sedation or anesthesia and so patients remain awake, reclined in a chair, while treatment is administered through coils placed near the forehead.
Dr. Xingbao Li and colleagues at Medical University of South Carolina examined cravings triggered by smoking cues in 16 nicotine-dependent volunteers who received one session each of high frequency or sham repetitive transcranial magnetic stimulation applied over the dorsolateral prefrontal cortex. This design allowed the researchers to ferret out the effects of the real versus the sham stimulation, similar to how placebo pills are used in evaluating the effectiveness and safety of new medications.
They found that craving induced by smoking cues was reduced after participants received real stimulation. They also report that the reduction in cue-induced craving was positively correlated with level of nicotine dependence; in other words, the TMS-induced craving reductions were greater in those with higher levels of nicotine use.
Dr. John Krystal, Editor of Biological Psychiatry, commented, “One of the elegant aspects of this study is that it suggests that specific manipulations of particular brain circuits may help to protect smokers and possibly people with other addictions from relapsing.”
“While this was only a temporary effect, it raises the possibility that repeated TMS sessions might ultimately be used to help smokers quit smoking. TMS as used in this study is safe and is already FDA approved for treating depression. This finding opens the way for further exploration of the use of brain stimulation techniques in smoking cessation treatment,” said Li.
University of British Columbia researchers have found a new potential use for the over-the-counter pain drug Tylenol. Typically known to relieve physical pain, the study suggests the drug may also reduce the psychological effects of fear and anxiety over the human condition, or existential dread.
Published in the Association for Psychological Science journal Psychological Science, the study advances our understanding of how the human brain processes different kinds of pain.
“Pain exists in many forms, including the distress that people feel when exposed to thoughts of existential uncertainty and death,” says lead author Daniel Randles, UBC Dept. of Psychology. “Our study suggests these anxieties may be processed as ‘pain’ by the brain – but Tylenol seems to inhibit the signal telling the brain that something is wrong.”
The study builds on recent American research that found acetaminophen – the generic form of Tylenol – can successfully reduce the non-physical pain of being ostracized from friends. The UBC team sought to determine whether the drug had similar effects on other unpleasant experiences – in this case, existential dread.

In the study, participants took acetaminophen or a placebo while performing tasks designed to evoke this kind of anxiety – including writing about death or watching a surreal David Lynch video – and then assigned fines to different types of crimes, including public rioting and prostitution.
The researchers found that, compared to the placebo group, people taking acetaminophen were significantly more lenient in judging the acts of the criminals and rioters – and better able to cope with the troubling ideas. The results suggest that participants’ existential suffering was “treated” by the headache drug.
“That a drug used primarily to alleviate headaches may also numb people to the worry of thoughts of their deaths, or to the uneasiness of watching a surrealist film, is a surprising and very interesting finding,” says Randles, a PhD candidate who authored the study with Prof. Steve Heine and Nathan Santos.
While the findings suggest that acetaminophen can help to reduce anxiety, the researchers caution that further research – and clinical trials – must occur before acetaminophen should be considered a safe or effective treatment for anxiety.
Brain scans are increasingly able to reveal whether or not you believe you remember some person or event in your life. In a new study presented at a cognitive neuroscience meeting today, researchers used fMRI brain scans to detect whether a person recognized scenes from their own lives, as captured in some 45,000 images by digital cameras. The study is seeking to test the capabilities and limits of brain-based technology for detecting memories, a technique being considered for use in legal settings.
“The advancement and falling costs of fMRI, EEG, and other techniques will one day make it more practical for this type of evidence to show up in court,” says Francis Shen of the University of Minnesota Law School, who is chairing a session on neuroscience and the law at a meeting of the Cognitive Neuroscience Society (CNS) in San Francisco this week. “But technological advancement on its own doesn’t necessarily lead to use in the law.” Still, as the technology has advanced and as the legal system seeks more empirical evidence, neuroscience and the law are intersecting more often than in previous decades.
In U.S. courts, neuroscientific evidence has been used largely in cases involving brain injury litigation or questions of impaired ability. In some cases outside the United States, however, courts have used brain-based evidence to check whether a person has memories of legally relevant events, such as a crime. New companies also are claiming to use brain scans to detect lies – although judges have not yet admitted this evidence in U.S. courts. These developments have rallied some in the neuroscience community to take a critical look at the promise and perils of such technology in addressing legal questions – working in partnership with legal scholars through efforts such as the MacArthur Foundation Research Network on Law and Neuroscience.
Recognizing your own memories
What inspired Anthony Wagner, a cognitive neuroscientist at Stanford University, to test fMRI uses for memory detection was a case in June 2008 in Mumbai, India, in which a judge cited EEG evidence as indicating that a murder suspect held knowledge about the crime that only the killer could possess. “It appeared that the brain data held considerable sway,” says Wagner, who points out that the methods used in that case have not been subject to extensive peer review.
Since then, Wagner and colleagues have conducted a number of experiments to test whether brain scans can be used to discriminate between stimuli that people perceive as old or new and, more objectively, whether or not they have previously encountered a particular person, place, or thing. To date, Wagner and colleagues have had success in the lab using fMRI-based analyses to determine whether someone recognizes a person or perceives them as unfamiliar, but not in determining whether they have in fact seen that person before.
In a new study presented today, his team sought to take the experiments out of the lab and into the real world by outfitting participants with digital cameras around their necks that automatically took photos of the participants’ everyday experiences. Over a multi-week period, the cameras yielded 45,000 photos per participant.
Wagner’s team then took brief photo sequences of individual events from the participants’ lives and showed them to the participants in the fMRI scanner, along with photo sequences from other subjects as control stimuli. The researchers analyzed the participants’ brain patterns to determine whether or not they were recognizing the sequences as their own. “We did quite well with most subjects, with a mean accuracy of 91% in discriminating between event sequences that the participant recognized as old and those that the participant perceived as unfamiliar,” Wagner says. “These findings indicate that distributed patterns of brain activity, as measured with fMRI, carry considerable information about an individual’s subjective memory experience – that is, whether or not they are remembering the event.”
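The general idea behind this kind of pattern-based readout can be sketched with a minimal nearest-centroid classifier. This is a deliberately simplified, hypothetical stand-in with made-up numbers, not the multivariate method Wagner’s group actually used:

```python
# Minimal nearest-centroid sketch of multivariate pattern classification:
# average the activity patterns from known "remembered" and "unfamiliar"
# trials, then label a new pattern by whichever centroid it sits closer to.
# (Synthetic three-"voxel" patterns; real fMRI decoding uses thousands of
# voxels and more sophisticated classifiers.)

def centroid(patterns):
    """Element-wise mean of a list of equal-length activity patterns."""
    n = len(patterns)
    return [sum(vals) / n for vals in zip(*patterns)]

def distance(a, b):
    """Euclidean distance between two patterns."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def classify(pattern, remembered, unfamiliar):
    d_old = distance(pattern, centroid(remembered))
    d_new = distance(pattern, centroid(unfamiliar))
    return "remembered" if d_old < d_new else "unfamiliar"

# Toy training trials.
remembered = [[0.9, 0.2, 0.7], [1.0, 0.1, 0.8]]
unfamiliar = [[0.2, 0.8, 0.1], [0.3, 0.9, 0.2]]

print(classify([0.85, 0.15, 0.75], remembered, unfamiliar))  # "remembered"
```

The reported 91% accuracy refers to how often a (far richer) classifier of this general kind assigned the correct subjective label to held-out trials.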
In another new study, Wagner and colleagues tested whether people can “beat the technology” by using countermeasures to alter their brain patterns. Back in the lab, the researchers showed participants individual faces and later asked them whether the faces were old or new. “Halfway through the memory test, we stopped and told them ‘What we are actually trying to do is read out from your brain patterns whether or not you are recognizing the face or perceiving it as novel, and we’ve been successful with other subjects in doing this in the past. Now we want you to try to beat the system by altering your neural responses.’” The researchers instructed the participants to think about a familiar person or experience when presented with a new face, and to focus on a novel feature of the face when presented with a previously encountered face.
“In the first half of the test, during which participants were just making memory decisions, we were well above chance in decoding from brain patterns whether they recognized the face or perceived it as novel. However, in the second half of the test, we were unable to classify either whether they recognized the face or whether the face was objectively old or new,” Wagner says. Within a forensic setting, Wagner says, it is conceivable that a suspect could use such countermeasures to try to mask the brain patterns associated with memory.
Wagner says that his work to date suggests that the technology may have some utility in reading out brain patterns in cooperative individuals but that the uses are much more uncertain with uncooperative individuals. However, Wagner stresses that the method currently does not distinguish well between whether a person’s memory reflects true or false recognition. He says that it is premature to consider such evidence in the courts because many additional factors await future testing, including the effects of stress, practice, and time between the experience and the memory test.
Overgeneralizing the adolescent brain
A general challenge to the use of neuroscientific evidence in legal settings, Wagner says, is that most studies are at the group rather than the individual level. “The law cares about a particular individual in a particular situation right in front of them,” he says, and the science often cannot speak to that specificity.
Shen cites the challenge of making individualized inferences from group-based data as one of the major issues facing the use of neuroscience evidence in the courts. “This issue has come up in the context of juvenile justice, where the adolescent brain development data confirms behavioral data that on average 17-year-olds are more impulsive than adults, but does not tell us whether a particular 17-year-old, namely the one on trial, was less able to control his/her actions on the day and in the manner in question,” he says.
Indeed, B.J. Casey of the Weill Medical College of Cornell University says that too often we overgeneralize adolescents’ lack of self-control. Although adolescents as a group do show poor self-control, some situations and some individuals are more prone to this breakdown than others.
“It is not that teens can’t make decisions – they can, and they can do so efficiently,” Casey says. “It is when they must make decisions in the heat of the moment – in the presence of potential or perceived threats, among peers – that the court should consider diminished responsibility of teens while still holding them accountable for their behavior.” Research suggests that this diminished ability is due to the immature development of the circuitry involved in processing negative or positive cues in the environment in the subcortical limbic regions and then in regulating responses to those cues in the prefrontal cortex.
The body of research to date is at the group-level, however, and is not yet able to comment on the neurobiological maturity of an individual adolescent. To help provide more guidance on this issue in legal settings, Casey and colleagues are working alongside legal scholars on a developmental imaging study, funded by the MacArthur Foundation, that is examining behaviors relevant to juvenile criminal behavior, including impulsivity and peer influence.
Making real-world connections
The same type of work – connecting brain imaging to particular behaviors in the real world – is ongoing in a number of other areas, including fMRI-based lie detection and linking negligence to specific mental states. “It’s a big leap to go from a laboratory setting, in which impulse control may be measured by one’s ability to not press a button in response to a stimulus, to the real world, where the question is whether someone had the requisite self-control not to tie up an innocent person and throw them off a bridge,” Shen says. “I don’t see neuroscience solving these big problems anytime soon, and so the question for law becomes: What do we do with this uncertainty? I think this is where we’re at right now, and where we’ll be for some time.”
“With a few notable exceptions such as death penalty cases, cases where a juvenile is facing a very stiff sentence, and litigating brain injury claims, ‘law and neuroscience’ is not familiar to most lawyers,” Shen says. “But this might change – and soon.” The ongoing work is vital, he says, for laying a foundation for a future that’s yet to come, and he hopes that more neuroscientists will increasingly collaborate with legal scholars.
A brain-training task that increases the number of items an individual can remember over a short period of time may boost performance in other problem-solving tasks by enhancing communication between different brain areas. The new study being presented this week in San Francisco is one of a growing number of experiments on how working-memory training can measurably improve a range of skills – from multiplying in your head to reading a complex paragraph.

“Working memory is believed to be a core cognitive function on which many types of high-level cognition rely, including language comprehension and production, problem solving, and decision making,” says Brad Postle of the University of Wisconsin-Madison, who is co-chairing a session on working-memory training at the Cognitive Neuroscience Society (CNS) annual meeting today in San Francisco. Work by various neuroscientists to document the brain’s “plasticity” – changes brought about by experience – along with technical advances in using electromagnetic techniques to stimulate the brain and measure the resulting changes, has enabled researchers to explore the potential of working-memory training like never before, he says.
The cornerstone brain-training exercise in this field has been the “n-back” task, a challenging working-memory task that requires an individual to mentally juggle several items simultaneously. Participants must remember not only the most recent stimulus but also an increasing number of stimuli before it (e.g., the stimulus “1-back”, “2-back”, and so on). These tasks can be adapted to include an audio component or to require remembering more than one attribute of each stimulus over time – for example, both the color and location of a shape.
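The core mechanic of the task can be sketched in a few lines. This is a hypothetical illustration of the matching rule, not the software used in these studies:

```python
# Sketch of the n-back mechanic: for each stimulus, the participant must
# say whether it matches the one shown n presentations earlier.
# (Hypothetical letter stimuli; lab versions add timing, feedback,
# scoring, and adaptive difficulty.)

def nback_targets(stimuli, n):
    """Return, per position, whether the stimulus matches the one n back.
    The first n positions have no n-back stimulus to compare against."""
    return [i >= n and stimuli[i] == stimuli[i - n]
            for i in range(len(stimuli))]

stimuli = ["A", "B", "A", "B", "B", "A"]
print(nback_targets(stimuli, 2))
# [False, False, True, True, False, False]
```

Difficulty rises steeply with n because every new stimulus must be held in mind while the older ones are continually compared and discarded.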
Through a number of experiments over the past decade, Susanne Jaeggi of the University of Maryland, College Park, and others have found that participants who train with n-back tasks over the course of approximately a month for about 20 minutes per day not only get better at the n-back task itself, but also experience “transfer” to other cognitive tasks on which they did not train. “The effects generalize to important domains such as attentional control, reasoning, reading, or mathematical skills,” Jaeggi says. “Many of these improvements remain over the course of several months, suggesting that the benefits of the training are long lasting.”
As yet unresolved and controversial, however, has been understanding which factors determine whether working-memory training will generalize to other domains, as well as how the brain changes in response to the training. Work by Postle’s group using a new technique of applying electromagnetic stimulation on the brains of people undergoing working-memory training addresses some of these questions.
Training increases connectivity
Bornali Kundu of the University of Wisconsin-Madison, who works in Postle’s laboratory, used transcranial magnetic stimulation (TMS) with electroencephalography (EEG) to measure activity in specific brain circuits before and after training with an n-back task. “Our main finding was that training on the n-back task increased the number of items an individual could remember over a short period of time,” explains Kundu, who is presenting these new results today. “This increase in short-term memory performance was associated with enhanced communication between distant brain areas, in particular between the parietal and frontal brain areas.”
In the n-back task, Kundu’s team presented stimuli one at a time on a computer screen and asked participants to decide whether the current stimulus matched both the color and location of the stimulus presented a certain number of presentations earlier. The color varied among seven primary colors, and the location varied among eight possible positions arranged in a square formation. The control task was playing the video game Tetris, which involves moving colored shapes to different locations but does not require participants to remember anything. Before and after the training, the researchers administered a range of cognitive tasks on which subjects did not receive training, and simultaneously delivered TMS while recording EEG to measure communication between brain areas during task performance.
After practicing the n-back task for 5 hours a day, 5 days per week, over 5 weeks, subjects were able to remember more items over short periods of time. Importantly, for those whose working memory improved, communication between the dorsolateral prefrontal cortex (DLPFC) and parietal cortex also improved. “This is in comparison to the control group, who showed no such differences in neural communication after practicing Tetris for 5 weeks,” Kundu says.
Working-memory training also produced improvement on untrained cognitive tasks that are likewise believed to rely on communication between the parietal cortex and DLPFC. For two of these tasks – the ability to detect a change in a briefly presented array of squares, and the ability to detect a red letter “C” embedded in a field of distracting rotated red “C”s and blue “C”s – those who had trained on the n-back task also showed a decrease in task-related EEG activity, mirroring the decrease seen on the trained task itself. “The overall picture seems to be that the extent of transfer of training to untrained tasks depends on the overlap of neural circuits recruited by the two,” Kundu says.
Developing future therapies
Moving forward, many cognitive neuroscientists are working to see how working-memory training may specifically help clinical populations, such as patients with ADHD. “If we can learn the ‘rules’ that govern how, why, and when cognitive training can produce improvements that generalize to untrained tasks, it may be that therapies can be developed for patients suffering from neurological or psychiatric disease,” Postle says.
Both Jaeggi’s team and Torkel Klingberg of the Karolinska Institute in Sweden, who is also presenting at the symposium today in San Francisco, have had success with such training for children with ADHD, decreasing the symptoms of inattention. “Here, the reason working-memory training may transfer to tests of fluid intelligence, as well as to a reduction in ADHD-associated hyperactivity symptoms, may be because both of those complex behaviors use some of the same brain circuits also used in performing the working-memory training tasks,” Kundu says.
“Individual differences in working memory performance have been related to individual differences in numerous real world skills such as reading comprehension, performance on standardized tests, and much more,” she adds. “I would not expect the same sorts of transfer effects that have been seen with working-memory training to happen if an individual practiced a task that used a minimally overlapping network, such as, for example, shooting three-pointers – which presumably uses different brain areas like primary and secondary motor cortex and the cerebellum.”
Jaeggi says that it is important to understand that cognitive abilities are not as unchangeable as some might think. “Even though there is certainly a hereditary component to mental abilities, that does not mean that there are not also components that are malleable and respond to experience and practice,” she says. “Whereas we try to strengthen participants’ working memory skills in our research, there are other routes that are possible as well, such as for example physical or musical training, meditation, nutrition, or even sleep.”
Despite all the promising research, Jaeggi says, researchers still need to understand many aspects of this work, such as “individual differences that influence training and transfer effects, the question of how long the effects last, and whether and how the effects translate into more real-world settings and ultimately, academic achievement.”
Even a mild injury to the brain can have long-lasting consequences, including increased risk of cognitive impairment later in life. While it is not yet known how brain injury increases risk for dementia, there are indications that chronic, long-lasting inflammation in the brain may be important. A new paper by researchers at the University of Kentucky Sanders-Brown Center on Aging (SBCoA), appearing in the Journal of Neuroscience, offers the latest information concerning a “switch” that turns inflammation in the brain “on” and “off” after trauma.
A team of researchers led by Linda Van Eldik, director of SBCoA, used a mouse model to study the role of p38α MAPK in trauma-induced injury responses in microglia, the brain’s resident immune cells.
"The p38α MAPK protein is an important switch that drives abnormal inflammatory responses in peripheral tissue inflammatory disorders, including chronic debilitating diseases like rheumatoid arthritis," said Van Eldik.
"However, less is known about the potential importance of p38α MAPK in controlling inflammatory responses in the brain. Our work supports p38α MAPK as a promising clinical target for the treatment of CNS disorders associated with uncontrolled brain inflammation, including trauma, and potentially others like Alzheimer’s disease. We are excited by our findings, and are actively working to develop drugs targeting p38a MAPK designed specifically for diseases of the brain."
Lead author of the paper, Adam D. Bachstetter, said, “I was surprised when I looked under the microscope at the brain tissue of mice that had a diffuse brain injury. Microglia normally look like a small spider, but after suffering a brain injury the microglia become like angry spiders from a horror movie. In brain-injured mice that lack p38α MAPK there were no angry-looking microglia, only the normal small spider-like cells. When I started the study I never expected the results to be so clear and striking. I believe that p38α MAPK is a promising clinical target for the treatment of CNS disorders with dysregulated inflammatory responses, but we are still a long way from development of CNS-active p38 inhibitor drugs.”
Making decisions involves a gradual accumulation of facts that support one choice or another. A person choosing a college might weigh factors such as course selection, institutional reputation and the quality of future job prospects.
But if the wrong choice is made, Princeton University researchers have found that it might be the information rather than the brain’s decision-making process that is to blame. The researchers report in the journal Science that erroneous decisions tend to arise from errors, or “noise,” in the information coming into the brain rather than errors in how the brain accumulates information.
These findings address a fundamental question among neuroscientists about whether bad decisions result from noise in the external information — or sensory input — or because the brain made mistakes when tallying that information. In the example of choosing a college, the question might be whether a person made a poor choice because of misleading or confusing course descriptions, or because the brain failed to remember which college had the best ratings.

Previous measurements of brain neurons have indicated that brain functions are inherently noisy. The Princeton research, however, separated sensory inputs from the internal mental process to show that the former can be noisy while the latter is remarkably reliable, said senior investigator Carlos Brody, a Princeton associate professor of molecular biology and the Princeton Neuroscience Institute (PNI), and a Howard Hughes Medical Institute Investigator.
"To our great surprise, the internal mental process was perfectly noiseless. All of the imperfections came from noise in the sensory processes," Brody said. Brody worked with first author Bingni Brunton, now a postdoctoral research associate in the departments of biology and applied mathematics at the University of Washington; and Matthew Botvinick, a Princeton associate professor of psychology and PNI.
The research subjects — four college-age volunteers and 19 laboratory rats — listened to streams of randomly timed clicks coming into both the left ear and the right ear. After listening to a stream, the subjects had to choose the side from which more clicks originated. The rats had been trained to turn their noses in the direction from which more clicks originated.
The test subjects mostly chose the correct side but occasionally made errors. By comparing various patterns of clicks with the volunteers’ responses, researchers found that all of the errors arose when two clicks overlapped, and not from any observable noise in the brain system that tallied the clicks. This was true in experiment after experiment utilizing different click patterns, in humans and rats.
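The logic of the finding can be illustrated with a toy simulation: an accumulator that counts clicks without any internal noise, whose only errors come from the sensory stage, when a left and a right click overlap in time and are registered as a single click of ambiguous origin. The merge window and the random-attribution rule below are illustrative assumptions, not the authors' fitted model.

```python
import random

def decide(left_clicks, right_clicks, merge_window=0.01, seed=0):
    """Noiseless accumulator with noisy sensory input: +1 per right click,
    -1 per left click. When a left and right click fall within
    `merge_window` seconds of each other, the pair is heard as ONE click
    attributed to a random side (the illustrative sensory error)."""
    rng = random.Random(seed)
    evidence = 0
    right = sorted(right_clicks)
    for t in sorted(left_clicks):
        overlap = next((u for u in right if abs(u - t) < merge_window), None)
        if overlap is None:
            evidence -= 1                     # left click, counted exactly
        else:
            right.remove(overlap)             # overlapping pair heard as one click...
            evidence += rng.choice([-1, +1])  # ...assigned to a random side
    evidence += len(right)                    # remaining right clicks, counted exactly
    return "right" if evidence > 0 else "left"

# With well-separated clicks, the noiseless accumulator never errs:
print(decide([0.10, 0.25, 0.40], [0.70]))  # → left
```

In a model like this, every error traces back to an overlapping click pair, which mirrors the paper's conclusion that the mistakes arose in the sensory input rather than in the tallying process itself.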
The researchers used the timing of the clicks and the decision-making behavior of the test subjects to create computer models that can be used to indicate what happens in the brain during decision-making. The models provide a clear window into the brain during the “mulling over” period of decision-making, the time when a person is accumulating information but has yet to choose, Brody said.
"Before we conducted this study, we did not have a way of looking at this process without inserting electrodes into the brain," Brody said. "Now thanks to our model, we have an estimation of what is going on at each moment in time during the formation of the decision."
The study suggests that information represented and processed in the brain’s neurons must be robust to noise, Brody said. “In other words, the ‘neural code’ may have a mechanism for inherent error correction,” he said.
"The new work from the Brody lab is important for a few reasons," said Anne Churchland, an assistant professor of biological sciences at Cold Spring Harbor Laboratory who studies decision-making and was not involved in the study. "First, the work was very innovative because the researchers were able to study carefully controlled decision-making behavior in rodents. This is surprising in that one might have guessed rodents were incapable of producing stable, reliable decisions that are based on complex sensory stimuli.
"This work exposed some unexpected features of why animals, including humans, sometimes make incorrect decisions," Churchland said. "Specifically, the researchers found that errors are mostly driven by the inability to accurately encode sensory information. Alternative possibilities, which the authors ruled out, included noise associated with holding the stimulus in mind, or memory noise, and noise associated with a bias toward one alternative or the other."
Scientists at CWRU School of Medicine Discover New Technique that Holds Promise for the Treatment of Multiple Sclerosis and Cerebral Palsy
Researchers at Case Western Reserve School of Medicine have discovered a technique that directly converts skin cells to the type of brain cells destroyed in patients with multiple sclerosis, cerebral palsy and other so-called myelin disorders.
This discovery appears today in the journal Nature Biotechnology.
This breakthrough now enables “on demand” production of myelinating cells, which provide a vital sheath of insulation that protects neurons and enables the delivery of brain impulses to the rest of the body. In patients with multiple sclerosis (MS), cerebral palsy (CP), and rare genetic disorders called leukodystrophies, myelinating cells are destroyed and cannot be replaced.
The new technique involves directly converting fibroblasts - an abundant structural cell present in the skin and most organs - into oligodendrocytes, the type of cell responsible for myelinating the neurons of the brain.
“It’s ‘cellular alchemy,’” explained Paul Tesar, PhD, assistant professor of genetics and genome sciences at Case Western Reserve School of Medicine and senior author of the study. “We are taking a readily accessible and abundant cell and completely switching its identity to become a highly valuable cell for therapy.”
In a process termed “cellular reprogramming,” researchers manipulated the levels of three naturally occurring proteins to induce fibroblast cells to become precursors to oligodendrocytes (called oligodendrocyte progenitor cells, or OPCs).
Tesar’s team, led by Case Western Reserve researchers and co-first authors Fadi Najm and Angela Lager, rapidly generated billions of these induced OPCs (called iOPCs). Even more important, they showed that iOPCs could regenerate new myelin coatings around nerves after being transplanted to mice—a result that offers hope the technique might be used to treat human myelin disorders.
When oligodendrocytes are damaged or become dysfunctional in myelinating diseases, the insulating myelin coating that normally coats nerves is lost. A cure requires the myelin coating to be regenerated by replacement oligodendrocytes.
Until now, OPCs and oligodendrocytes could only be obtained from fetal tissue or pluripotent stem cells. These techniques have been valuable, but with limitations.
“The myelin repair field has been hampered by an inability to rapidly generate safe and effective sources of functional oligodendrocytes,” explained co-author and myelin expert Robert Miller, PhD, professor of neurosciences at the Case Western Reserve School of Medicine and the university’s vice president for research. “The new technique may overcome all of these issues by providing a rapid and streamlined way to directly generate functional myelin producing cells.”
This initial study used mouse cells. The critical next step is to demonstrate feasibility and safety using human cells in a lab setting. If successful, the technique could have widespread therapeutic application to human myelin disorders.
“The progression of stem cell biology is providing opportunities for clinical translation that a decade ago would not have been possible,” said Stanton Gerson, MD, professor of Medicine-Hematology/Oncology at the School of Medicine and director of the National Center for Regenerative Medicine and the UH Case Medical Center Seidman Cancer Center. “It is a real breakthrough.”
The St. Jude Children’s Research Hospital – Washington University Pediatric Cancer Genome Project has identified mutations responsible for more than half of a subtype of childhood brain tumor that takes a high toll on patients. Researchers also found evidence the tumors are susceptible to drugs already in development.
The study focused on a family of brain tumors known as low-grade gliomas (LGGs). These slow-growing cancers are found in about 700 children annually in the U.S., making them the most common childhood tumors of the brain and spinal cord. For patients whose tumors cannot be surgically removed, the long-term outlook remains bleak due to complications from the disease and its ongoing treatment. Nationwide, surgery alone cures only about one-third of patients.
Using whole genome sequencing, researchers identified genetic alterations in two genes that occurred almost exclusively in a subtype of LGG termed diffuse LGG. This subtype cannot be cured surgically because the tumor cells invade the healthy brain. Together, the mutations accounted for 53 percent of the diffuse LGG in this study. Researchers also demonstrated that one of the mutations, which had not previously been linked to brain tumors, caused tumors when introduced into the glial brain cells of mice.
The findings appear in the April 14 advance online edition of the scientific journal Nature Genetics.
“This subtype of low-grade glioma can be a nasty chronic disease, yet prior to this study we knew almost nothing about its genetic alterations,” said David Ellison, M.D., Ph.D., chair of the St. Jude Department of Pathology and the study’s corresponding author. The first author is Jinghui Zhang, Ph.D., an associate member of the St. Jude Department of Computational Biology.
The Pediatric Cancer Genome Project is using next-generation whole genome sequencing to determine the complete normal and cancer genomes of children and adolescents with some of the least understood and most difficult to treat cancers. Scientists believe that studying differences in the 3 billion chemical bases that make up the human genome will provide the scientific foundation for the next generation of cancer care.
“We were surprised to find that many of these tumors could be traced to a single genetic alteration,” said co-author Richard K. Wilson, Ph.D., director of The Genome Institute at Washington University School of Medicine in St. Louis. “This is a major pathway through which low-grade gliomas develop and it provides new clues to explore as we search for better treatments.”
The study involved whole genome sequencing of 39 paired tumor and normal tissue samples from 38 children and adolescents with different subtypes of LGG and related tumors called low-grade glioneuronal tumors (LGGNTs). Although many cancers develop following multiple genetic abnormalities, 62 percent of the 39 tumors in this study stemmed from a single genetic alteration.
Previous studies have linked LGGs to abnormal activation of the MAPK/ERK pathway. The pathway is involved in regulating cell division and other processes that are often disrupted in cancer. Until now, however, the genetic alterations involved in driving this pathway were unknown for some types of LGG and LGGNT.
This study linked activation in the pathway to duplication of a key segment of the FGFR1 gene, which investigators discovered in brain tumors for the first time. The segment is called a tyrosine kinase domain. It functions like an on-off switch for several cell signaling pathways, including the MAPK/ERK pathway. Investigators also demonstrated that experimental drugs designed to block activity along two altered pathways worked in cells with the FGFR1 tyrosine kinase domain duplication. “The finding suggests a potential opportunity for using targeted therapies in patients whose tumors cannot be surgically removed,” Ellison said.
Researchers also showed that the FGFR1 abnormality triggered an aggressive brain tumor in glial cells from mice that lacked the tumor suppressor gene Trp53.
Whole-genome sequencing found previously undiscovered rearrangements in the MYB and MYBL1 genes in diffuse LGGs. These newly identified abnormalities were also implicated in switching on the MAPK/ERK pathway.
Researchers checked an additional 100 LGGs and LGGNTs for the same FGFR1, MYB and MYBL1 mutations. Overall, MYB was altered in 25 percent of the diffuse LGGs, and 24 percent had alterations in FGFR1. Researchers also turned up numerous other mutations that occurred in just a few tumors. The affected genes included BRAF, RAF1, H3F3A, ATRX, EP300, WHSC1 and CHD2.
“The Pediatric Cancer Genome Project has provided a remarkable opportunity to look at the genomic landscape of this disease and really put the alterations responsible on the map. We can now account for the genetic errors responsible for more than 90 percent of low-grade gliomas,” Ellison said. “The discovery that FGFR1 and MYB play a central role in childhood diffuse LGG also serves to distinguish the pediatric and adult forms of the disease.”
Some breast tumor circulating cells in the bloodstream are marked by a constellation of biomarkers that identify them as those destined to seed the brain with a deadly spread of cancer, said researchers led by those at Baylor College of Medicine in a report that appears online in the journal Science Translational Medicine.
"What prompted us to initiate this study was our desire to understand the characteristics of these cells," said Dr. Dario Marchetti, professor of pathology at BCM, director of the CTC (circulating tumor cell) Core Facility at BCM and a member of the NCI-designated Dan L. Duncan Cancer Center at BCM. Often, he said, circulating tumor cells (CTCs) from breast cancer patients which spread or metastasize to the brain are not identified by the current method for identifying such cells approved by the U.S. Food and Drug Administration (CellSearch® platform).
While this system is based on the detection of antibodies that target the epithelial cell adhesion molecule (EpCAM), the biomarkers identified by Marchetti and his colleagues include human epidermal growth factor receptor 2 (HER2+), epidermal growth factor receptor (EGFR), heparanase (HPSE) and Notch1 - and not EpCAM. Together, said Marchetti, these four proteins, previously known to be associated with cancer metastasis, spell out the signature of circulating tumors cells that travel to the brain.
Marchetti, using sophisticated techniques to test samples provided by Dr. Morris D. Groves of The University of Texas MD Anderson Cancer Center, also found this same pattern of proteins in the tissue taken from brain metastases of animals injected with breast cancer circulating tumor cells (CTCs).
They tested these special circulating tumor cells in laboratory models and found that they are highly invasive and capable of spreading in live animals. They also found cells with this signature in the metastatic tumors of animals with breast cancer.
"We were able to grow these cells in vitro (in the laboratory in culture) for the first time ever," said Marchetti.
Circulating tumor cells are a promising method of identifying and monitoring solid tumors and could replace tumor biopsies in some cases. However, the promise is still being studied by experts such as Marchetti. In this case, he has identified a new signature for such cells - one that directs their activities toward spreading cancer to brain - an outcome with frequently fatal consequences.
The study not only identifies a novel signature of circulating tumor cells, it shows the limitations of currently approved platforms used to identify cancer in this way. Understanding such cells can help scientists understand how the disease spreads - an initial step in developing new methods of treating metastatic disease.
"We don’t claim that these biomarkers are the only important ones," said Marchetti. "We hope to find novel markers in brain metastasis that will make diagnosis and monitoring even more targeted."
They are also trying to find ways to link these circulating tumor cells back to the signature of the original or primary tumor.
Tapeworm infection in the brain that can trigger seizures is a growing health concern, doctors say.

But the infection, which leads to swelling in the brain, is usually treatable with medication, according to a leading association of neurologists.
Estimated cases of neurocysticercosis, as the tapeworm infection is called, range from 40,000 to 160,000 each year in the United States, said Dr. Peter Hotez, dean of the National School of Tropical Medicine at Baylor College of Medicine in Houston. “It’s been around a long time, affecting people living in severe poverty, but the disease is not well-studied or understood,” Hotez said.
Texas is one area of the country with many cases. “The disease has now become a leading cause of epilepsy in Houston,” Hotez said. “Every [week], we have patients come into our tropical medicine clinic with it.”
Concerns about an apparent increase of neurocysticercosis within the United States led the American Academy of Neurology to issue treatment guidelines for doctors and patients in the April 9 issue of the journal Neurology.
The recommendations are based on a review of 10 studies published between 1980 and 2010 that evaluated so-called cysticidal drugs for treatment of tapeworm infections. The infection involves infestation of the brain with the larvae of the Taenia solium tapeworm. In severe cases, it can cause death.
Tapeworm infection is common in Third World countries because of inadequate sanitation and hygiene, and an estimated 2 million people worldwide have epilepsy as a result. The good news is that good hygiene and food preparation can prevent it.
People develop the tapeworm infection when they consume improperly cooked meat, such as pork, or any food or drink that contains the tapeworm eggs or larvae (also known as cysts). Touching the fecal matter of an infected person is another means of transmission. The larvae then transform into full-sized tapeworms, which can grow to several feet, Hotez said.
In pigs, tapeworm larvae travel to the brain and await transmission to another animal (a human, for instance) when the pigs are eaten, he said. The parasites do the same thing in humans, but there’s nowhere to go from the human brain. Ultimately, the larvae die, and that’s when the trouble begins.
As the larvae die, they lose the ability to hide from the body’s immune system. The immune system responds by causing inflammation, which leads to epileptic seizures and brain swelling, Hotez said.
The guidelines for children and adults recommend using the medication albendazole to kill the cysts if they’re alive and treating brain swelling with corticosteroid drugs that dampen the immune system. The study found that albendazole (Albenza), used with or without the corticosteroids, reduced seizure frequency and the number of brain lesions seen in imaging scans. Not enough data was available to evaluate another drug, praziquantel, the researchers said.
Only limited evidence exists to support specific treatment approaches, however, and the treatments may produce side effects, such as abdominal complaints, according to the guidelines. It’s also unclear whether anti-epileptic medications may help prevent the seizures caused by the inflammation.
For now, the key is physician awareness, said Dr. Karen Roos, a professor of neurology at the Indiana University School of Medicine and lead author of the guidelines. “Physicians from areas of the world where this infection is endemic are very knowledgeable about this infection,” she said. “They know more than U.S. physicians.”
Infection with the tapeworm is preventable through proper sanitation, good hygiene and thorough cooking of meat.
Obesity, heart disease, and high blood pressure (hypertension) are all related, but understanding the molecular pathways that underlie cause and effect is complicated.
A new University of Iowa study identifies a protein within certain brain cells as a communications hub for controlling blood pressure, and suggests that abnormal activation of this protein may be a mechanism that links cardiovascular disease and obesity to elevated blood pressure.

"Cardiovascular diseases are the leading cause of death worldwide, and hypertension is a major cardiovascular risk factor," says Kamal Rahmouni, UI associate professor of pharmacology and internal medicine, and senior study author. "Our study identifies the protein called mTORC1 in the hypothalamus as a key player in the control of blood pressure. Targeting mTORC1 pathways may, therefore, be a promising strategy for the management of cardiovascular risk factors."
The hypothalamus is a small region of the brain that is responsible for maintaining normal function for numerous bodily processes, including blood pressure, body temperature, and glucose levels. Signaling of mTORC1 protein in the hypothalamus has previously been shown to affect food intake and body weight.
The new study, which was published April 2 in the journal Cell Metabolism, shows that the mTORC1 protein is activated by small molecules and hormones that are associated with obesity and cardiovascular disease, and this activation leads to dramatic increases in blood pressure.
Leucine is an amino acid that we get from food, which is known to activate mTORC1. The UI researchers showed that activating mTORC1 in rat brains with leucine increased activity in the nerves that connect the brain to the kidney, an important organ in blood pressure control. The increased nerve activity was accompanied by a rise in blood pressure. Conversely, blocking this mTORC1 activation significantly blunted leucine’s blood pressure-raising effect.
This finding may have direct clinical relevance as elevated levels of leucine have been correlated with an increased risk of high blood pressure in patients with cardiovascular disease.
"Our new study suggests a mechanism by which leucine in the bloodstream might increase blood pressure,” Rahmouni says.
Previous work has also suggested that mTORC1 is a signaling hub for leptin, a hormone produced by fat cells, which has been implicated in obesity-related hypertension.
Rahmouni and his colleagues showed that leptin activates mTORC1 in a specific part of the hypothalamus causing increased nerve activity and a rise in blood pressure. These effects are blocked by inhibiting activation of mTORC1.
“Our study shows that when this protein is either activated or inhibited in a very specific manner, it can cause dramatic changes in blood pressure,” Rahmouni says. “Given the importance of this protein for the control of blood pressure, any abnormality in its activity might explain the hypertension associated with certain conditions like obesity and cardiovascular disease.”
Rahmouni and his team hope that uncovering the details of the pathways linking mTORC1 activation and high blood pressure might lead to better treatments for high blood pressure in patients with cardiovascular disease and obesity.
New research has questioned the reliability of neuroscience studies, saying that conclusions could be misleading due to small sample sizes.

A team led by academics from the University of Bristol reviewed 48 neuroscience meta-analysis articles published in 2011 and concluded that the studies they covered had an average statistical power of around 20 per cent – a figure which means the chance of the average study detecting the effect being investigated is only one in five.
The paper, being published in Nature Reviews Neuroscience, reveals that small, low-powered studies are ‘endemic’ in neuroscience, producing unreliable research which is inefficient and wasteful.
It focuses on how low statistical power – caused by low sample size of studies, small effects being investigated, or both – can be misleading and produce more false scientific claims than high-powered studies.
It also illustrates how low power reduces a study’s ability to detect any effects and shows that when discoveries are claimed, they are more likely to be false or misleading.
The paper claims there is substantial evidence that a large proportion of research published in scientific literature may be unreliable as a consequence.
Another consequence is that the findings are overestimated because smaller studies consistently give more positive results than larger studies. This was found to be the case for studies using a diverse range of methods, including brain imaging, genetics and animal studies.
Kate Button, from the School of Social and Community Medicine, and Marcus Munafò, from the School of Experimental Psychology, led a team of researchers from Stanford University, the University of Virginia and the University of Oxford.
Button said: “There’s a lot of interest at the moment in improving the reliability of science. We looked at neuroscience literature and found that, on average, studies had only around a 20 per cent chance of detecting the effects they were investigating, even if the effects are real. This has two important implications - many studies lack the ability to give definitive answers to the questions they are testing, and many claimed findings are likely to be incorrect or unreliable.”
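The arithmetic behind the claim that low-powered studies produce more false findings can be made concrete with a standard positive-predictive-value calculation. The significance threshold and prior probability below are conventional illustrative values, not figures from the Bristol paper.

```python
def ppv(power, alpha=0.05, prior=0.2):
    """Positive predictive value: the probability that a statistically
    significant finding reflects a real effect, given the study's power,
    the significance threshold alpha, and the prior probability that the
    tested effect exists. The alpha and prior are illustrative assumptions."""
    true_pos = power * prior          # real effects correctly detected
    false_pos = alpha * (1 - prior)   # null effects crossing the threshold
    return true_pos / (true_pos + false_pos)

# At 20% power only half of "significant" findings would reflect true
# effects; at a conventional 80% power, four in five would.
print(round(ppv(0.20), 2), round(ppv(0.80), 2))  # → 0.5 0.8
```

The calculation shows why low power cuts both ways: it reduces the chance of detecting real effects, and it simultaneously lowers the proportion of claimed discoveries that are true.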
The study concludes that improving the standard of results in neuroscience, and enabling them to be more easily reproduced, is a key priority and requires attention to well-established methodological principles.
It recommends that existing scientific practices can be improved with small changes or additions to methodologies, such as acknowledging any limitations in the interpretation of results; disclosing methods and findings transparently; and working collaboratively to increase the total sample size and power.
The eyes sometimes have it, beating out the tongue, nose and brain in the emotional and biochemical balloting that determines the taste and allure of food, a scientist said here today. Speaking at the 245th National Meeting & Exposition of the American Chemical Society (ACS), the world’s largest scientific society, he described how people sometimes “see” flavors in foods and beverages before actually tasting them.
“There have been important new insights into how people perceive food flavors,” said Terry E. Acree, Ph.D. “Years ago, taste was a table with two legs — taste and odor. Now we are beginning to understand that flavor depends on parts of the brain that involve taste, odor, touch and vision. The sum total of these signals, plus our emotions and past experiences, result in perception of flavors, and determine whether we like or dislike specific foods.”

Acree said that people actually can see the flavor of foods, and the eyes have such a powerful role that they can trump the tongue and the nose. The popular Sauvignon Blanc white wine, for instance, gets its flavor from scores of natural chemicals, including chemicals with the flavor of banana, passion fruit, bell pepper and boxwood. But when served a glass of Sauvignon Blanc tinted to the deep red of a merlot or cabernet, people report tasting the flavors characteristic of those red wines.
The sense of smell likewise can trump the taste buds in determining how things taste, said Acree, who is with Cornell University. In a test that people can do at home, psychologists have asked volunteers to smell caramel, strawberry or other sweet foods and then take a sip of plain water; the water will taste sweet. But smell bread, meat, fish or other non-sweet foods, and water will not taste sweet.
While the appearance of foods probably is important, other factors can override it. Acree pointed out that hashes, chilies, stews and cooked sausages have an unpleasant look, like vomit or feces. However, people savor these dishes based on the memory of eating and enjoying them in the past. The human desire for novelty and new experiences also is a factor in the human tendency to ignore what the eyes may be tasting and listen to the tongue and nose instead, he added.
Acree said that understanding the interactions among smell, vision, taste and other sensory cues will open the door to developing healthful foods that look and smell more appealing to finicky kids or adults.
Why do some memories last a lifetime while others disappear quickly?

A new study suggests that rehearsing memories, during either sleep or waking, influences memory consolidation and what is remembered later.
The new Northwestern University study shows that when the information that makes up a memory has a high value (associated with, for example, making more money), the memory is more likely to be rehearsed and consolidated during sleep and, thus, be remembered later.
Also, through the use of a direct manipulation of sleep, the research demonstrated a way to encourage the reactivation of low-value memories so they too were remembered later.
Delphine Oudiette, a postdoctoral fellow in the department of psychology at Northwestern and lead author of the study, designed the experiment to study how participants remembered locations of objects on a computer screen. A value assigned to each object informed participants how much money they could make if they remembered it later on the test.
"The pay-off was much higher for some of the objects than for others," explained Ken Paller, professor of psychology at Northwestern and co-author of the study. "In other words, we manipulated the value of the memories — some were valuable memories and others not so much, just as the things we experience each day vary in the extent to which we’d like to be able to remember them later."
When each object was shown, it was accompanied by a characteristic sound. For example, a tea kettle would appear with a whistling sound. During both states of wakefulness and sleep, some of the sounds were played alone, quite softly, essentially reminding participants of the low-value items.
Participants remembered the low-value associations better when the sound presentations occurred during sleep.
"We think that what’s happening during sleep is basically the reactivation of that information," Oudiette said. "We can provoke the reactivation by presenting those sounds, therefore energizing the low-value memories so they get stored better."
“The research poses provocative implications about the role memory reactivation during sleep could play in improving memory storage,” said Paller, director of the Cognitive Neuroscience Program at Northwestern. “Whatever makes you rehearse during sleep is going to determine what you remember later, and conversely, what you’re going to forget.”
Many memories that are stored during the day are not remembered.
"We think one of the reasons for that is that we have to rehearse memories in order to keep them. When you practice and rehearse, you increase the likelihood of later remembering," Oudiette said. "And a lot of our rehearsal happens when we don’t even realize it — while we’re asleep."
Paller said selectivity of memory consolidation is not well understood. Most efforts in memory research have focused on what happens when you first form a memory and on what happens when you retrieve a memory.
"The in-between time is what we want to learn more about, because a fascinating aspect of memory storage is that it is not static," Paller said. "Memories in our brain are changing all of the time. Sometimes you improve memory storage by rehearsing all the details, so maybe later you remember better — or maybe worse if you’ve embellished too much.
"The fact that this critical memory reactivation transpires during sleep has mostly been hidden from us, from humanity, because we don’t realize so much of what’s happening while we’re asleep," he said.
Protein spheres in the nucleus give wrong signal for cell division

RUB researchers develop new hypothesis for the degeneration of nerve cells
Researchers in Bochum have developed a new hypothesis for how Alzheimer’s disease could arise. They analysed the interaction of the proteins FE65 and BLM, which regulate cell division. In a cell culture model, they discovered spherical structures in the nucleus that contained FE65 and BLM. The interaction of the two proteins triggered an erroneous signal for cell division, which may explain the degeneration and death of nerve cells in Alzheimer’s patients. The team led by Dr. Thorsten Müller and Prof. Dr. Katrin Marcus from the Department of Functional Proteomics, in cooperation with the RUB’s Medical Proteome Centre headed by Prof. Helmut E. Meyer, reported the results in the “Journal of Cell Science”.
Components of spherical structures in the nucleus identified
The so-called amyloid precursor protein APP is central to Alzheimer’s disease. It spans the cell membrane, and its cleavage products are linked to protein deposits that form in Alzheimer patients outside the nerve cells. APP anchors the protein FE65 to the membrane, which was the focus of the current study. FE65 can migrate into the nucleus, where it plays a role in DNA replication and repair. Based on cells grown in the laboratory, the team led by Dr. Müller established that FE65 can unite with other proteins in the cell nucleus to form spherical structures, so-called “nuclear spheres”. Video microscopy showed that these ring-like structures merge with each other and can thus grow. “By using a special cell culture model, we were able to identify additional components of these spheres”, says Andreas Schrötter, PhD student in the working group Morbus Alzheimer at the Institute for Functional Proteomics. Among other things, the scientists found the protein BLM, which is known from Bloom’s syndrome – an extremely rare hereditary disease, which is associated with dwarfism, immunodeficiency, and an increased risk of cancer. BLM is involved in DNA replication and repair in the nucleus.
The amount of FE65 determines the amount of BLM in the cell nucleus
Müller’s team took a closer look at the function of FE65. By means of genetic manipulation, the researchers generated cell cultures in which FE65 production was reduced. A smaller amount of FE65 resulted in a smaller amount of the protein BLM in the nucleus; instead, BLM collected in another area of the cell, the endoplasmic reticulum. In addition, the researchers found a lower rate of DNA replication in the genetically modified cells. In this way, FE65 influences the replication of the genetic material via the BLM protein. When the researchers increased FE65 production again, the amount of BLM in the nucleus also rose.
FE65 as a possible trigger for Alzheimer’s
In patients with Alzheimer’s disease, the protein APP, an interaction partner of FE65, is altered. The interaction of the two molecules is important for the transport of FE65 into the nucleus, where it regulates cell division in combination with BLM. Müller’s team assumes that the altered APP-FE65 interaction mistakenly sends the cells the signal to divide. Since nerve cells normally cannot divide, they degenerate instead and die. “This hypothesis, which we pursue in the working group Morbus Alzheimer, also provides new starting points for potential therapies, which are urgently needed for Alzheimer’s disease,” says Dr. Müller. In the future, the team will also investigate whether and how the amount of BLM is altered in Alzheimer’s patients compared to healthy subjects.
We’ve all been there: You’re at work deeply immersed in a project when suddenly you start thinking about your weekend plans. It happens because behind the scenes, parts of your brain are battling for control.

Now, University of Florida researchers and their colleagues are using a new technique that allows them to examine how parts of the brain battle for dominance when a person tries to concentrate on a task. Addressing these fluctuations in attention may help scientists better understand many neurological disorders such as autism, depression and mild cognitive impairment.
Mingzhou Ding, a professor of biomedical engineering, and Xiaotong Wen, an assistant research scientist of biomedical engineering, both of the University of Florida; Yijun Liu of the McKnight Brain Institute of the University of Florida and Peking University, Beijing; and Li Yao of Beijing Normal University, report their findings in the current issue of The Journal of Neuroscience.
Scientists know different networks within the brain have distinct functions. Ding, Wen and their colleagues used a brain imaging technique called functional magnetic resonance imaging and biostatistical methods to examine interactions between a set of areas they call the task control network and another set of areas known as the default mode network.
The task control network regulates attention to surroundings, controlling concentration on a task such as doing homework, or listening for emotional cues during a conversation. The default mode network is thought to regulate self-reflection and emotion, and often becomes active when a person seems to be doing nothing else.
“We knew that the default mode network decreases in activity when a task is being performed, but we didn’t know why or how,” said Ding, a professor of biomedical engineering in the J. Crayton Pruitt department of biomedical engineering. “We also wanted to know what is driving that activity decrease.
“For a long time, the questions we are asking could not be answered.”
In the past, researchers could not distinguish between directions of interactions between regions of the brain, and could come up with only one number to represent an average of the back-and-forth interactions. Ding and his colleagues used a new technique to untangle the interactions in each direction to show how the different brain regions interact with one another.
In their study, the researchers used fMRI to examine the brains of people performing a task that required concentration. With fMRI, the scientists can see the activity in certain areas of the brain while a person performs a given task, observe which parts of the brain are active and which are not, and correlate this with how successful the person is at the task. They then applied the Granger causality technique to the fMRI data. Named for Nobel Prize-winning economist Clive Granger, this technique allows scientists to examine how one variable affects another; in this case, how one region of the brain influences another.
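The idea behind Granger causality is simple to state: region x "Granger-causes" region y if x's past improves the prediction of y beyond what y's own past provides. A minimal one-lag sketch on synthetic data illustrates the directed comparison (a toy illustration only, not the authors' fMRI pipeline; the coupling strengths are invented):

```python
import random

def granger_stat(cause, effect):
    """One-lag Granger statistic: does `cause`'s past improve prediction of
    `effect` beyond `effect`'s own past? Larger values = stronger influence."""
    y, yl, xl = effect[1:], effect[:-1], cause[:-1]
    n = len(y)
    center = lambda v: [u - sum(v) / len(v) for u in v]
    y, yl, xl = center(y), center(yl), center(xl)
    dot = lambda a, b: sum(p * q for p, q in zip(a, b))
    # Restricted model: y_t ~ y_{t-1}
    b = dot(yl, y) / dot(yl, yl)
    rss_r = sum((yi - b * yli) ** 2 for yi, yli in zip(y, yl))
    # Full model: y_t ~ y_{t-1} + x_{t-1}, solved via 2x2 normal equations
    a11, a12, a22 = dot(yl, yl), dot(yl, xl), dot(xl, xl)
    c1, c2 = dot(yl, y), dot(xl, y)
    det = a11 * a22 - a12 * a12
    b1 = (c1 * a22 - c2 * a12) / det
    b2 = (a11 * c2 - a12 * c1) / det
    rss_f = sum((yi - b1 * yli - b2 * xli) ** 2
                for yi, yli, xli in zip(y, yl, xl))
    # F-like ratio: improvement gained by adding the other region's past
    return (rss_r - rss_f) / (rss_f / (n - 3))

# Synthetic "regions": x drives y at lag 1, but not the other way round.
random.seed(0)
x, y = [0.0], [0.0]
for _ in range(500):
    x.append(0.5 * x[-1] + random.gauss(0, 1))
    y.append(0.5 * y[-1] + 0.8 * x[-2] + random.gauss(0, 1))

forward = granger_stat(x, y)   # x -> y: large
reverse = granger_stat(y, x)   # y -> x: near chance level
print(f"x->y: {forward:.1f}   y->x: {reverse:.1f}")
```

Here `forward` comes out far larger than `reverse`, recovering the built-in direction of influence. The fMRI version fits the same kind of directed comparison to the BOLD time series of the task control and default mode networks, which is what lets the authors separate the two directions rather than averaging them into a single number.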
“People have hypothesized different functions for signals going in different directions,” Ding said. “We show that when the task control network suppresses the default mode network, the person can do the task better and faster. The better the default mode network is shut down, the better a person performs.”
However, when the default mode network is not sufficiently suppressed, it sends signals to the task control network that effectively distract the person, causing his or her performance to drop. So while the task control network suppresses the default mode network, the default mode network also interferes with the task control network.
“Your brain is a constant seesaw back and forth,” even when trying to concentrate on a task, Ding said.
The Granger causality technique may help researchers learn more about how neurological disorders work. Researchers have found that the default mode network remains unchanged in people with autism whether they are performing a task or interacting with the environment, which could explain symptoms such as difficulty reading social cues or being easily overwhelmed by sensory stimulation. Scientists have made similar findings with depression and mild cognitive impairment. However, until now no one has been able to address what areas of the brain might be regulating the default mode network and which might be interfering with that regulation.
“Now we are able to address these questions,” Ding said.
Using a miniature electronic device implanted in the brain, scientists have tapped into the internal reward system of mice, prodding neurons to release dopamine, a chemical associated with pleasure.

The researchers, at Washington University School of Medicine in St. Louis and the University of Illinois at Urbana-Champaign, developed tiny devices containing light-emitting diodes (LEDs) the size of individual neurons. The devices activate brain cells with light. The scientists report their findings April 12 in the journal Science.
“This strategy should allow us to identify and map brain circuits involved in complex behaviors related to sleep, depression, addiction and anxiety,” says co-principal investigator Michael R. Bruchas, PhD, assistant professor of anesthesiology at Washington University. “Understanding which populations of neurons are involved in these complex behaviors may allow us to target specific brain cells that malfunction in depression, pain, addiction and other disorders.”
For the study, Washington University neuroscientists teamed with engineers at the University of Illinois to design microscale LED devices thinner than a human hair. This was the first application of the devices in optogenetics, an area of neuroscience that uses light to stimulate targeted pathways in the brain. The scientists implanted the devices into the brains of mice that had been genetically engineered so that some of their brain cells could be activated and controlled with light.
Although a number of important pathways in the brain can be studied with optogenetics, many neuroscientists have struggled with the engineering challenge of delivering light to precise locations deep in the brain. Most methods have tethered animals to lasers with fiber optic cables, limiting their movement and altering natural behaviors.
But with the new devices, the mice freely moved about and were able to explore a maze or scamper on a wheel. The electronic LEDs are housed in a tiny fiber implanted deep in the brain. That’s important to the device’s ability to activate the proper neurons, according to John A. Rogers, PhD, professor of materials science and engineering at the University of Illinois.
“You want to be able to deliver the light down into the depth of the brain,” Rogers says. “We think we’ve come up with some powerful strategies that involve ultra-miniaturized devices that can deliver light signals deep into the brain and into other organs in the future.”
Using light from the cellular-scale LEDs to stimulate dopamine-producing cells in the brain, the investigators taught the mice to poke their noses through a specific hole in a maze. Each time a mouse poked its nose through the hole, the system wirelessly activated the LEDs in the implanted device, which then emitted light, causing neurons to release dopamine, a chemical central to the brain’s natural reward system.
“We used the LED devices to activate networks of brain cells that are influenced by the things you would find rewarding in life, like sex or chocolate,” says co-first author Jordan G. McCall, a neuroscience graduate student in Washington University’s Division of Biology and Biomedical Sciences. “When the brain cells were activated to release dopamine, the mice quickly learned to poke their noses through the hole even though they didn’t receive any food as a reward. They also developed an associated preference for the area near the hole, and they tended to hang around that part of the maze.”
The researchers believe the LED implants may be useful in other types of neuroscience studies or may even be applied to different organs. Related devices already are being used to stimulate peripheral nerves for pain management. Other devices with LEDs of multiple colors may be able to activate and control several neural circuits at once. In addition to the tiny LEDs, the devices also carry miniaturized sensors for detecting temperature and electrical activity within the brain.
Bruchas and his colleagues already have begun other studies of mice, using the LED devices to manipulate neural circuits that are involved in social behaviors. This could help scientists better understand what goes on in the brain in disorders such as depression and anxiety.
“We believe these devices will allow us to study complex stress and social interaction behaviors,” Bruchas explains. “This technology enables us to map neural circuits with respect to things like stress and pain much more effectively.”
The wireless, microLED implant devices represent the combined efforts of Bruchas and Rogers. Last year, along with Robert W. Gereau IV, PhD, professor of anesthesiology, they were awarded an NIH Director’s Transformative Research Project award to develop and conduct studies using novel device development and optogenetics, which involves activating or inhibiting brain cells with light.