Researchers from The University of Manchester have discovered a new mechanism that governs how body clocks react to changes in the environment.

The discovery, which is published in Current Biology, could point the way to alleviating the detrimental effects of chronic shift work and jet lag.
The team’s findings reveal that the enzyme casein kinase 1epsilon (CK1epsilon) controls how easily the body’s clockwork can be adjusted or reset by environmental cues such as light and temperature.
Internal biological timers (circadian clocks) are found in almost every species on the planet. In mammals including humans, circadian clocks are found in most cells and tissues of the body, and orchestrate daily rhythms in our physiology, including our sleep/wake patterns and metabolism.
Dr David Bechtold, who led The University of Manchester’s research team, said: “At the heart of these clocks are a complex set of molecules whose interaction provides robust and precise 24-hour timing. Importantly, our clocks are kept in synchrony with the environment by being responsive to light and dark information.”
This work, funded by the Biotechnology and Biological Sciences Research Council, was undertaken by a team from The University of Manchester in collaboration with scientists from Pfizer led by Dr Travis Wager.
The research identifies a new mechanism through which our clocks respond to these light inputs. During the study, mice lacking CK1epsilon, a component of the clock, adjusted to a new light-dark environment (much like that experienced in shift work or long-haul air travel) far faster than normal mice.
The research team went on to show that drugs that inhibit CK1epsilon were able to speed up shift responses of normal mice, and critically, that faster adaptation to the new environment minimised metabolic disturbances caused by the time shift.
Dr Bechtold said: “We already know that modern society poses many challenges to our health and wellbeing. Things that are viewed as commonplace, such as shift work, sleep deprivation and jet lag, disrupt our body’s clocks. It is now becoming clear that clock disruption is increasing the incidence and severity of diseases including obesity and diabetes.
“We are not genetically predisposed to quickly adapt to shift work or long-haul flights, and so our bodies’ clocks are built to resist such rapid changes. Unfortunately, we must deal with these issues today, and there is very clear evidence that disruption of our body clocks has real and negative consequences for our health.”
He continued: “As this work progresses in clinical terms, we may be able to enhance the clock’s ability to deal with shift work and, importantly, understand how maladaptation of the clock contributes to diseases such as diabetes and chronic inflammation.”
University of Bonn psychologists prove genetic variation is underlying factor in higher incidence of forgetfulness
Misplaced your keys? Can’t remember someone’s name? Didn’t notice the stop sign? Those who frequently experience such cognitive lapses now have an explanation. Psychologists from the University of Bonn have found a connection between such everyday lapses and the DRD2 gene. Those who have a certain variant of this gene are more easily distracted and experience a significantly higher incidence of lapses due to a lack of attention. The team reports its results in the May issue of “Neuroscience Letters,” which is already available online.

Most of us are familiar with such everyday lapses: you can’t find your keys, again! Or you walk into another room but forget what you actually went there for. Or you are on the phone with someone and cannot remember their name. “Such short-term memory lapses are very common, but some people experience them particularly often,” said Prof. Dr. Martin Reuter from the department for Differential and Biological Psychology at the University of Bonn. Mistakes occurring due to such short-term lapses can become a hazard when, for example, a person overlooks a stop sign at an intersection. In the workplace, too, a lack of attention can become a problem, for example when it results in forgetting to save essential data.
A gene “directing” your brain
"A familial clustering of such lapses suggests that they are subject to genetic effects," explained Dr. Sebastian Markett, the principal author and a member of Prof. Reuter’s team. In lab experiments, the group of scientists had already found indications earlier that the so-called dopamine D2 receptor gene (DRD2) plays a part in forgetfulness. DRD2 has an essential function in signal transmission within the frontal lobes. "This structure can be compared to a director coordinating the brain like an orchestra," Dr. Markett added. In this simile, the DRD2 gene would correspond to the baton, because it plays a part in dopamine transmission in the brain. If the baton skips a beat, the orchestra gets confused.
The psychologists from the University of Bonn tested a total of 500 women and men by taking a saliva sample and examining it using methods from molecular biology. All humans carry the DRD2 gene, which comes in two variants distinguished by only one letter of the genetic code: one variant has cytosine (C) at a particular locus, while the other has thymine (T) in its place. According to the research team’s analyses, about a quarter of the subjects exclusively carried the DRD2 gene with the cytosine nucleobase, while three quarters carried at least one copy with the thymine base.
The scientists then wanted to find out whether this difference in the genetic code also had an effect on everyday behavior. By means of a self-assessment survey, they asked the subjects to state how frequently they experience such lapses, such as how often they forget names or misplace their keys. The survey also included questions regarding certain impulsivity-related factors, such as how easily a subject was distracted from the task at hand, and how long they were able to maintain their concentration.
Lapses can clearly be tied to the gene variant
The scientists used statistical methods to check whether the forgetfulness symptoms elicited by the surveys could be associated with one of the DRD2 gene variants. The results showed that functions such as attention and memory are less pronounced in persons who carry the thymine variant of the gene than in those with the cytosine type. “The connection is obvious; such lapses can partially be attributed to this gene variant,” reported Dr. Markett. According to their own statements, the subjects with the thymine DRD2 variant more frequently “fall victim” to forgetfulness or attention deficits. And vice versa, the cytosine type seems to be protected from that. “This result matches the results of other studies very well,” added Dr. Markett.
Carriers of the gene variant linked to forgetfulness may now find solace in the fact that they are not responsible for their genes, and that this is simply their fate. But Dr. Markett doesn’t agree: “There are things you can do to compensate for forgetfulness, such as writing yourself notes or making more of an effort to put your keys down in a specific location, and not just anywhere.” Those who develop such strategies for the different areas of their lives are better able to handle their deficit.
You wouldn’t hear the mating song of the male fruit fly as you reached for the infested bananas in your kitchen. Yet, the neural activity behind the insect’s amorous call could help scientists understand how you made the quick decision to pull your hand back from the tiny swarm.

Male fruit flies base the pitch and tempo of their mating song on the movement and behavior of their desired female, Princeton University researchers have discovered. In the animal kingdom, lusty warblers such as birds typically have a mating song with a stereotyped pattern. A fruit fly’s song, however, is an unordered series of loud purrs and soft drones made by wing vibrations, the researchers reported in the journal Nature. A male adjusts his song in reaction to his specific environment, which in this case is the distance and speed of a female — the faster and farther away she’s moving, the louder he “sings.”
While the actors are small, the implications of these findings could be substantial for understanding rapid decision-making, explained corresponding author Mala Murthy, a Princeton assistant professor of molecular biology and the Princeton Neuroscience Institute. Fruit flies are a common model for studying the systems of more advanced beings such as humans, and have the basic components of more complex nervous systems, she said.
The researchers have provided a possible tool for studying the neural pathways behind how an organism engaged in a task adjusts its behavior to sudden changes, be it a leopard chasing a zigzagging gazelle, or a commuter navigating stop-and-go traffic, Murthy said. She and her co-authors created a model that could predict a fly’s choice of song in response to its changing environment, and identified the neural pathways involved in these decisions.
"Here we have natural courtship behavior and we have this discovery that males are using information about their sensory environment in real time to shape their song. That makes the fly system a unique model to study decision-making in a natural context," Murthy said.
"You can imagine that if a fly can integrate visual information quickly to modulate his song, the way in which it does that is probably a very basic equivalent of how a more complicated animal solves a similar problem," she said. "To figure out at the level of individual neurons how flies perform sensory-motor integration will give us insight into how a mammalian brain does it and, ultimately, maybe how a human brain does it."
Why do neurodegenerative diseases such as Alzheimer’s affect only the elderly? Why do some people live to be over 100 with intact cognitive function while others develop dementia decades earlier?

Image: A new study shows that a gene regulator called REST, dormant in the brains of young people (left), switches on in normal aging brains (center) to protect against various stresses, including abnormal proteins associated with neurodegenerative diseases. REST is lost in critical brain regions of people with Alzheimer’s (right). Credit: Yankner Lab
More than a century of research into the causes of dementia has focused on the clumps and tangles of abnormal proteins that appear in the brains of people with neurodegenerative diseases. However, scientists know that at least one piece of the puzzle has been missing because some people with these abnormal protein clumps show few or no signs of cognitive decline.
A new study offers an explanation for these longstanding mysteries. Researchers have discovered that a gene regulator active during fetal brain development, called REST, switches back on later in life to protect aging neurons from various stresses, including the toxic effects of abnormal proteins. The researchers also showed that REST is lost in critical brain regions of people with Alzheimer’s and mild cognitive impairment.
A new study in animals shows that using a compound to block the body’s immune response greatly reduces disability after a stroke.

The study by scientists from the University of Wisconsin School of Medicine and Public Health also showed that particular immune cells, CD4+ T-cells, produce a mediator called interleukin-21 (IL-21) that can cause further damage in stroke tissue.
In the study, mice that would ordinarily be killed or disabled by an ischemic stroke were given an injection of a compound that blocks the action of IL-21. Brain scans and brain sections showed that the treated mice suffered little or no stroke damage.
“This is very exciting because we haven’t had a new drug for stroke in decades, and this suggests a target for such a drug,” says lead author Dr. Zsuzsanna Fabry, professor of pathology and laboratory medicine.
Stroke is the fourth-leading killer in the world and an important cause of permanent disability. In an ischemic stroke, a clot blocks the flow of oxygen-rich blood to the brain. But Fabry explains that much of the damage to brain cells occurs after the clot is removed or dissolved by medicine. Blood rushes back into the brain tissue, bringing with it immune cells called T-cells, which flock to the source of an injury.
The study shows that after a stroke, the injured brain cells provoke the CD4+ T-cells to produce a substance, IL-21, that kills the neurons in the blood-deprived tissue of the brain. The study gives new insight into how stroke induces neural injury.
Similar Findings in Humans
Fabry’s co-author Dr. Matyas Sandor, professor of pathology and laboratory medicine, says that the final part of the study looked at brain tissue from people who had died following ischemic strokes. It found that CD4+ T-cells and their protein, IL-21, are present in high concentrations in areas of the brain damaged by the stroke.
Sandor says the similarity suggests that the protein that blocks IL-21 could become a treatment for stroke, and would likely be administered at the same time as the current blood-clot dissolving drugs.
“We don’t have proof that it will work in humans,” he says, “but similar accumulation of IL-21 producing cells suggests that it might.”
The paper was published this week in the Journal of Experimental Medicine.
A novel protein may explain how biological clocks regulate human sleep cycles

In a series of experiments sparked by fruit flies that couldn’t sleep, Johns Hopkins researchers say they have identified a mutant gene — dubbed “Wide Awake” — that sabotages how the biological clock sets the timing for sleep. The finding also led them to the protein made by a normal copy of the gene that promotes sleep early in the night and properly regulates sleep cycles.
Because genes and the proteins they code for are often highly conserved across species, the researchers suspect their discoveries — boosted by preliminary studies in mice — could lead to new treatments for people whose insomnia or off-hours work schedules keep them awake long after their heads hit the pillow.
“We know that the timing of sleep is regulated by the body’s internal biological clock, but just how this occurs has been a mystery,” says study leader Mark N. Wu, M.D., Ph.D., an assistant professor of neurology, medicine, genetic medicine and neuroscience at the Johns Hopkins University School of Medicine. “We have now found the first protein ever identified that translates timing information from the body’s circadian clock and uses it to regulate sleep.”
A report on the work was published online March 13 in the journal Neuron.
In their hunt for the molecular roots of sleep regulation, Wu and his colleagues studied thousands of fruit fly colonies, each with a different set of genetic mutations, and analyzed their sleep patterns. They found that one group of flies, with a mutation in the gene they would later call Wide Awake (or Wake for short), had trouble falling asleep at night, a malady that looked a lot like sleep-onset insomnia in humans. The investigators say Wake appears to be the messenger from the circadian clock to the brain, telling it that it’s time to shut down and sleep.
After isolating the gene, Wu’s team determined that when working properly, Wake helps shut down clock neurons of the brain that control arousal by making them more responsive to signals from the inhibitory neurotransmitter called GABA. Wake does this specifically in the early evening, thus promoting sleep at the right time. Levels of Wake cycle during the day, peaking near dusk in good sleepers.
Flies with the mutated Wake gene that couldn’t get to sleep were not receiving enough GABA signaling to quiet their arousal circuits at night, leaving them agitated.
The researchers found the same gene in every animal they studied: humans, mice, rabbits, chickens, even worms.
Importantly, when Wu’s team looked to see where Wake was located in the mouse brain, they found that it was expressed in the suprachiasmatic nucleus (SCN), the master clock in mammals. Wu says the fact that the Wake protein was expressed in high concentrations in the SCN of mice is significant.
“Sometimes we discover things in flies that have no direct relevance in higher order animals,” Wu says. “In this case, because we found the protein in a location where it likely plays a role in circadian rhythms and sleep, we are encouraged that this protein may do the same thing in mice and people.”
The hope is that someday, by manipulating Wake, possibly with a medication, shift workers, military personnel and sleep-onset insomniacs could sleep better.
“This novel pathway may be a place where we can intervene,” Wu says.
Your brain’s ability to instantly link what you see with what you do is down to a dedicated information ‘highway’, suggests new UCL-led research.

For the first time, researchers from UCL and Cambridge University have found evidence of a specialised mechanism for spatial self-awareness that combines visual cues with body motion.
Standard visual processing is prone to distractions, as it requires us to pay attention to objects of interest and filter out others. The new study has shown that our brains have separate ‘hard-wired’ systems to visually track our own bodies, even if we are not paying attention to them. In fact, the newly-discovered network triggers reactions even before the conscious brain has time to process them.
The researchers discovered the new mechanism by testing 52 healthy adults in a series of three experiments. In all experiments, participants used robotic arms to control cursors on two-dimensional displays, where cursor motion was directly linked to hand movement. Their eyes were kept fixed on a mark at the centre of the screen, confirmed with eye tracking.
In the first experiment, participants controlled two separate cursors with their left and right hands, both equally close to the centre. The goal was to guide each cursor to a corresponding target at the top of the screen. Occasionally the cursor or target on one side would jump left or right, requiring participants to take corrective action. Each jump was ‘cued’ with a flash on one side, but the cue side was random, so it did not always correspond to the side about to change.
Unsurprisingly, people reacted faster to target jumps when their attention was drawn to the ‘correct’ side by the cue. However, reactions to cursor jumps were fast regardless of cuing, suggesting that a separate mechanism independent of attention is responsible for tracking our own movements.
“The first experiment showed us that we react very quickly to changes relating to objects directly under our own control, even when we are not paying attention to them,” explains Dr Alexandra Reichenbach of the UCL Institute of Cognitive Neuroscience, lead author of the study. “This provides strong evidence for a dedicated neural pathway linking motor control to visual information, independently of the standard visual systems that are dependent on attention.”
The second experiment was similar to the first, but also introduced changes in brightness to demonstrate the attention effect on the visual perception system. In the third experiment, participants had to guide one cursor to its target in the presence of up to four dummy targets and cursors, ‘distractors’, alongside the real ones. In this experiment, responses to cursor jumps were less affected by distractors than responses to target jumps. Reactions to cursor jumps remained vigorous with one or two distractors, but were significantly decreased when there were four.
“These results provide further evidence of a dedicated ‘visuomotor binding’ mechanism that is less prone to distractions than standard visual processing,” says Dr Reichenbach. “It looks like the specialised system has a higher tolerance for distractions, but in the end it is still affected. Exactly why we evolved a separate mechanism remains to be seen, but the need to react rapidly to different visual cues about ourselves and the environment may have been enough to necessitate a specialised pathway.”
The newly-discovered system could explain why some schizophrenia patients feel like their actions are controlled by someone else.
“Schizophrenia often manifests as delusion of control, and a dysfunction in the visuomotor mechanism identified in this study might be a cause for this symptom,” explains Dr Reichenbach. “If someone does not automatically link corresponding visual cues with body motion, then they might have the feeling that they are not controlling their movements. We would need further research to confirm this, and it would be fascinating to see how schizophrenia patients perform in these experiments.”
These findings could also explain why people with even the most advanced prosthetic limbs can have trouble coordinating movements.
“People often describe their prosthetic limbs as feeling ‘other’, not a true extension of their body,” says Dr Reichenbach. “Even on the best prosthetic hands, if the observed movement of the fingers is not exactly what you would expect, then it will not feel like you are in direct control. These small details might have a big effect on how people perceive prostheses.”
People who try to quit smoking often say that kicking the habit makes the inner voice telling them to light up even louder, but why people succumb to those cravings so often has never been fully understood. Now, a new brain imaging study in this week’s JAMA Psychiatry from scientists at Penn Medicine and the National Institute on Drug Abuse (NIDA) Intramural Research Program shows how smokers suffering from nicotine withdrawal may have more trouble shifting out of a key brain network, the default mode network, which is active when people are in an “introspective” or “self-referential” state, and into the executive control network, which could help them exert more conscious self-control over cravings and focus on quitting for good.

The findings help validate a neurobiological basis behind why so many people trying to quit end up relapsing—up to 80 percent, depending on the type of treatment—and may lead to new ways to identify smokers at high risk for relapse who need more intensive smoking cessation therapy.
The brain imaging study was led by researchers at University of Pennsylvania’s new Brain and Behavior Change Program, led by Caryn Lerman, PhD, who is also the deputy director of Penn’s Abramson Cancer Center, and Elliot Stein, PhD, and collaborators at NIDA. They found that smokers who abstained from cigarettes showed weakened interconnectivity between certain large-scale networks in their brains: the default mode network, the executive control network, and the salience network. They posit that this weakened connectivity reduces smokers’ ability to shift into or maintain greater influence from the executive control network, which may ultimately help maintain their quitting attempt.
“What we believe this means is that smokers who just quit have a more difficult time shifting gears from inward thoughts about how they feel to an outward focus on the tasks at hand,” said Lerman, who also serves as the Mary W. Calkins professor in the Department of Psychiatry. “It’s very important for people who are trying to quit to be able to maintain activity within the control network— to be able to shift from thinking about yourself and your inner state to focus on your more immediate goals and plan.”
Prior studies have looked at the effects of nicotine on brain interconnectivity in the resting state, that is, in the absence of any specific goal directed activity. This is the first study, however, to compare resting brain connectivity in an abstinent state and when people are smoking as usual, and then relate those changes to symptoms of craving and mental performance.
For the study, researchers conducted brain scans on 37 healthy smokers (those who smoke more than 10 cigarettes a day) ages 19 to 61 using functional magnetic resonance imaging (fMRI) in two different sessions: 24 hours after biochemically confirmed abstinence and after smoking as usual.
Imaging showed a significantly weaker connectivity between the salience network and default mode network during abstinence, compared with their sated state. Also, weakened connectivity during abstinence was linked with increases in smoking urges, negative mood, and withdrawal symptoms, suggesting that this weaker internetwork connectivity may make it more difficult for people to quit.
Establishing the strength of the connectivity between these large-scale brain networks will be important in predicting people’s ability to quit and stay quit, the authors write. Also, such connectivity could serve as a clinical biomarker to identify smokers who are most likely to respond to a particular treatment.
“Symptoms of withdrawal are related to changes in smokers’ brains, as they adjust to being off of nicotine, and this study validates those experiences as having a biological basis,” said Lerman. “The next step will be to identify in advance those smokers who will have more difficultly quitting and target more intensive treatments, based on brain activity and network connectivity.”
Research from McGill University reveals that the brain’s motor network helps people remember and recognize music that they have performed in the past better than music they have only heard. A recent study by Prof. Caroline Palmer of the Department of Psychology sheds new light on how humans perceive and produce sounds, and may pave the way for investigations into whether motor learning could improve memory or protect against cognitive impairment in aging populations. The research is published in the journal Cerebral Cortex.
“The memory benefit that comes from performing a melody rather than just listening to it, or saying a word out loud rather than just hearing or reading it, is known as the ‘production effect’ on memory,” says Prof. Palmer, a Canada Research Chair in Cognitive Neuroscience of Performance. “Scientists have debated whether the production effect is due to motor memories, such as knowing the feel of a particular sequence of finger movements on piano keys, or simply due to strengthened auditory memories, such as knowing how the melody tones should sound. Our paper provides new evidence that motor memories play a role in improving listeners’ recognition of tones they have previously performed.”

For the study, researchers recruited twenty skilled pianists from Lyon, France. The group was asked to learn simple melodies by either hearing them several times or performing them several times on a piano. Pianists then heard all of the melodies they had learned, some of which contained wrong notes, while their brain electric signals were measured using electroencephalography (EEG).
“We found that pianists were better at recognizing pitch changes in melodies they had performed earlier,” said the study’s first author, Brian Mathias, a McGill PhD student who conducted the work at the Lyon Neuroscience Research Centre in France with additional collaborators Drs. Barbara Tillmann and Fabien Perrin.
The team found that EEG measurements revealed larger changes in brain waves and increased motor activity about 200 milliseconds after the wrong notes for previously performed melodies than for merely heard melodies. This reveals that the brain quickly compares incoming auditory information with motor information stored in memory, allowing us to recognize whether a sound is familiar.
“This paper helps us understand ‘experiential learning’, or ‘learning by doing’, and offers pedagogical and clinical implications,” said Mathias. “The role of the motor system in recognizing music, and perhaps also speech, could inform education theory by providing strategies for memory enhancement for students and teachers.”
Alzheimer’s disease is the most widespread degenerative neurological disorder in the world. Over five million Americans live with it, and one in three senior citizens will die with the disease or a similar form of dementia. While memory loss is a common symptom of Alzheimer’s, other behavioral manifestations — depression, loss of inhibition, delusions, agitation, anxiety, and aggression — can be even more challenging for victims and their families to live with.

Now Prof. Daniel Offen and Dr. Adi Shruster of Tel Aviv University’s Sackler School of Medicine have discovered that by reestablishing a population of new cells in the part of the brain associated with behavior, some symptoms of Alzheimer’s disease significantly decreased or were reversed altogether.
The research, published in the journal Behavioural Brain Research, was conducted on mouse models; it provides a promising target for Alzheimer’s symptoms in human beings as well.
"Until 15 years ago, the common belief was that you were born with a finite number of neurons. You would lose them as you aged or as the result of injury or disease," said Prof. Offen, who also serves as Chief Scientific Officer at BrainStorm, a biotech company at the forefront of innovative stem cell research. "We now know that stem cells can be used to regenerate areas of the brain."
Speeding up recovery
After introducing stem cells into brain tissue in the laboratory and seeing promising results, Prof. Offen extended the study to mice with Alzheimer’s disease-like symptoms. A gene (Wnt3a) was introduced into the part of the mouse brain that controls behavior, specifically fear and anxiety, in the hope that it would spur the formation of new brain cells.
According to Prof. Offen, untreated Alzheimer’s mice would run heedlessly into an unfamiliar and dangerous area of their habitats instead of assessing potential threats, as healthy mice do. Once treated with the gene, which increased the population of new neurons, the mice reverted to assessing their new surroundings first.
"Normal mice will recognize the danger and avoid it. Mice with the disease, just like human patients, lose their sense of space and reality," said Prof. Offen. "We first succeeded in showing that new neuronal cells were produced in the areas injected with the gene. Then we succeeded in showing diminished symptoms as a result of this neuron repopulation."
"The loss of inhibition is a cause of great embarrassment for most patients and relatives of patients with Alzheimer’s," said Prof. Offen. "Often, patients take off their pants in public, having no sense of their surroundings. We saw parallel behavior in animal models with Alzheimer’s."
Next: Memory
After concluding that increased stem cell production in a certain area of the brain had a positive effect on the behavioral deficits of Alzheimer’s, Prof. Offen has moved on to study the area of the brain that controls memory. He and his team are currently exploring it in the laboratory and are confident that the results of the new study will be similar.
"Although there are many questions to answer before this research produces practical therapies, we are very optimistic about the results and feel this is a promising direction for Alzheimer’s research," said Prof. Offen.
New research from Karolinska Institutet and Umeå University in Sweden demonstrates for the first time that there is a close relationship between body perception and the ability to remember. For us to be able to store new memories from our lives, we need to feel that we are in our own body. According to researchers, the results could be of major importance in understanding the memory problems that psychiatric patients often exhibit.
The memories of what happened on the first day of school are an example of an episodic memory. How these memories are created, and what role the perception of one’s own body plays in storing them, has long been unclear. Swedish researchers can now demonstrate that volunteers who experience an exciting event whilst perceiving an illusion of being outside their own body exhibit a form of memory loss.
“It is already evident that people who have suffered psychiatric conditions in which they felt that they were not in their own body have fragmentary memories of what actually occurred”, says Loretxu Bergouignan, principal author of the current study. “We wanted to see how this manifests itself in healthy subjects.”
The study, which is published in the scientific journal PNAS, involved a total of 84 students who each underwent four oral questioning sessions. To make these sessions extra memorable, an actor (Peter Bergared) took on the role of examiner – a (fictional) very eccentric professor at Karolinska Institutet. Two of the interrogations were experienced in the usual way, from a first-person perspective within the participants’ own bodies, while in the other two sessions the participants experienced an induced illusion of being outside their own body. In both cases, the participants wore virtual reality goggles and earphones. One week later, they either underwent memory testing in which they had to recall the events and provide details about what had happened, in which order, and what they felt, or they tried to remember the events while undergoing brain imaging with functional magnetic resonance imaging (fMRI).
It turned out that the participants remembered the ‘out-of-body’ interrogations significantly worse than those experienced from the normal ‘in-body’ perspective. This was the case even though they answered the questions from each situation equally well and reported experiencing the same level of emotion. The fMRI scans further revealed a crucial difference in activity in the hippocampus, the portion of the temporal lobe known to be central to episodic memory.
“When they tried to remember what happened during the interrogations experienced out-of-body, activity in the hippocampus was eliminated, unlike when they remembered the other situations. However, we could see activity in the frontal lobe cortex, so they were really making an effort to remember”, says professor Henrik Ehrsson, the research group leader behind the study.
The researchers’ interpretation of the results is that there is a close relationship between body experience and memory. Our brain constantly creates the experience of one’s own body in space by combining information from multiple senses: sight, hearing, touch, and more. When a memory is created, it is the task of the hippocampus to link all the information found in the cerebral cortex into a unified memory for further long-term storage. During the experience of being outside one’s body, this memory storage process is disturbed, whereupon the brain creates fragmentary memories instead.
“We believe that this new knowledge may be important for future research on memory disorders in a number of psychiatric conditions such as post-traumatic stress disorder, borderline personality disorder and certain psychoses where patients have dissociative experiences,” says Loretxu Bergouignan.
Researchers at the School of Medicine have identified a subset of nerve cells that mediates a form of chronic, touch-evoked pain called tactile allodynia, a condition that is resistant to conventional pain medication.
The discovery could point researchers to more fruitful efforts to develop effective drugs for the condition.
Touch-evoked pain occurs as part of a larger neuropathic pain condition arising from damage or disruption of nerve-cell circuits or signals caused by disorders such as alcoholism, diabetes, shingles and AIDS, or procedures such as spine surgery and chemotherapy. For patients with tactile allodynia, the slightest touch — a gentle caress or the brush of shirt against skin — can cause excruciating pain because changes in nerve-cell signals or networks trick the brain into mistaking touch for pain.
The study, published online Feb. 27 in Neuron, found that these “touch” neurons are different from the usual “pain” neurons that respond to stimuli such as cuts or bruises.
Unlike pain caused by such wounds, neuropathic pain is difficult to manage because little can be done to repair nerve damage. Managing it may require strong painkillers or combinations of treatments.
Common painkillers such as morphine have little effect on touch-evoked pain, possibly because they don’t target the touch neurons, the authors say. Morphine binds to specific protein-binding sites on pain neurons called mu opioid receptors, or MORs, and cuts off their signals so that the brain can no longer sense pain.
However, the touch neurons do not carry MORs, which is why morphine cannot bind to them and block the pain. Instead, they carry delta opioid receptors, or DORs, whose role in pain control has been unclear until recently.
"That’s been the problem so far; any type of severe pain you have, you go into the clinic and very likely you will be treated with morphine-like opioids," said Gregory Scherrer, PharmD, PhD, the senior author of the study and an assistant professor of anesthesia. "You can give some of these patients as much morphine as you want; it won’t work if the mu opioid receptor is not present on the neurons that underlie that type of pain."
There are currently no Food and Drug Administration-approved pain-control drugs that target DORs. Previous attempts at developing DOR-targeting drugs haven’t succeeded because researchers didn’t know what type of pain such drugs would be useful for, Scherrer said.
Two DOR-binding drugs developed for knee pain by Adolor Corp., a biotechnology firm, for instance, probably failed because there was no compelling evidence that DOR was present or involved. AstraZeneca, another pharmaceutical firm, also had a DOR program but recently stopped its research efforts, Scherrer added.
"Now that we have provided a rationale and mechanism supporting the utility of DOR agonists for cutaneous pain and tactile allodynia, these companies will be able to design trials more carefully to evaluate specifically the drugs’ efficacy against touch-evoked pain," he said.
Earlier studies by Scherrer and others hinted at the presence of special nerve fibers on the skin that might contribute to touch-evoked pain.
In the current study, Scherrer and colleagues used fluorescent mouse models to isolate these neurons and identify how they control touch-evoked pain. They found that DOR can play an inhibitory role in these neurons: When proteins bind to DOR, they cut off communication to the spinal cord, through which sensory signals travel to the brain.
DOR-carrying “touch” neurons pervade the skin and could easily be targeted by drugs in the form of skin patches or topical creams, Scherrer suggested.
"By contrast, most MOR-carrying neurons penetrate internal organs," he said. "That’s why morphine is effective in treating post-surgery pain, for example."
Scherrer and fellow researchers tested two different DOR-binding compounds individually on mice and found that both reduced the mice’s sensitivity to touch-evoked pain.
Preliminary studies also indicate that DOR-targeting drugs might not cause dramatic side effects like morphine does, especially if they can be used topically, Scherrer said.
"Morphine and other MOR-targeting drugs have myriad deleterious side effects — including addiction, respiratory depression, constipation, nausea and vomiting — that further limits their utility for chronic pain management," he said.
The next step is to determine whether DOR could be a target for other types of pain, such as arthritis pain, pain from bone cancer and muscle pain, Scherrer added.
The findings also suggest that the body’s opioid system — normally associated with pain and addiction — may also respond to other stimuli such as touch.
"We may have underestimated the importance of the opioid system and what can be achieved with drugs targeting other subtypes of opioid receptors," Scherrer said.