Posts tagged neuroscience

Strategic or Random? How the Brain Chooses
Many of the choices we make are informed by experiences we’ve had in the past. But occasionally we’re better off abandoning those lessons and exploring a new situation unfettered by past experiences. Scientists at the Howard Hughes Medical Institute’s Janelia Research Campus have shown that the brain can temporarily disconnect information about past experience from decision-making circuits, thereby triggering random behavior.
In the study, rats playing a game for a food reward usually acted strategically, but switched to random behavior when they confronted a particularly unpredictable and hard-to-beat competitor. The animals sometimes got stuck in a random-behavior mode, but the researchers, led by Janelia lab head Alla Karpova and postdoctoral fellow Gowan Tervo, found that they could restore normal behavior by manipulating activity in a specific region of the brain. Because the behavior of animals stuck in this random mode bears some resemblance to that of patients affected by a psychological condition called learned helplessness, the findings may help explain that condition and suggest strategies for treating it. Karpova, Tervo and their colleagues published their findings in the September 25, 2014, issue of the journal Cell.
The brain excels at integrating information from past experiences to guide decision-making in new situations. But in certain circumstances, random behavior may be preferable. An animal might have the best chance of avoiding a predator if it moves unpredictably, for example. And in a new environment, unrestricted exploration might make more sense than relying on an internal model developed elsewhere. So scientists have long speculated that the brain may have a way to switch off the influence of past experiences so that behavior can proceed randomly, Karpova says. But others disagreed. “They argue that it’s inefficient, and that it would be at odds with what some people call one of the most central operating principles of the brain – to use our past experience and knowledge to optimize behavioral choices,” she notes.
Karpova and her colleagues wanted to see if they could create a situation that would force animals to switch into this random mode of behavior. “We tried to create a setting that would push the need to create behavioral variability and unpredictability to its extreme,” she says. They did this by placing rats in a competitive setting in which a computer-simulated competitor determined which of two holes in a wall would provide a sugary reward. The virtual competitor, whose sophistication was varied by the experimenters, analyzed the rats’ behavior to predict their future choices.
“We thought if we came up with very sophisticated competitors, then the animals would eventually be unable to figure out how to outcompete them, and be forced to either give up or switch into this [random] mode, if such a mode exists,” Karpova says. And that’s exactly what happened: When faced with a weak competitor, the animals made strategic choices based on the outcomes of previous trials. But when a sophisticated competitor made strong predictions, the rats ignored past experience and made random selections in search of a reward.
Now that they had evidence that the brain could generate both strategic and random behavior, Karpova and her colleagues wanted to know how it switched between modes. Since that switch determines whether or not an animal’s internal model of the world influences its behavior, the scientists suspected it might involve a brain region called the anterior cingulate cortex, where that internal model is likely encoded.
They found that they could cause animals to switch between random and strategic behavior by manipulating the level of a stress hormone called norepinephrine in the anterior cingulate cortex. Increasing norepinephrine in the region activated random behavior and suppressed the strategic mode. Inhibiting release of the hormone had the opposite effect.
Karpova’s team observed that animals in their experiments sometimes continued to behave randomly, even when such behavior was no longer advantageous. “If all they’ve experienced is this really sophisticated competitor for several sessions that thwarts their attempts at strategic, model-based counter-prediction, they go into this [random mode], and they can get stuck in it for quite some time after that competitor is gone,” she says. This, she says, resembles the condition of learned helplessness, in which strategic decision-making is impaired following an experience in which a person finds they are unable to control their environment.
The scientists could release the animals from this “stuck” state by suppressing the release of norepinephrine in the anterior cingulate cortex. “Just by manipulating a single neuromodulatory input into one brain area, you can dramatically enhance the strategic mode. The effect is strong enough to rescue animals out of the random mode and successfully transform them into strategic decision makers,” Karpova says. “We think this might shed light on what has gone wrong in conditions such as learned helplessness, and possibly how we can help alleviate them.”
Karpova says that now that her team has uncovered a mechanism that switches the brain between random and strategic behavior, she would like to understand how those behaviors are controlled in more natural settings. “We normally try to use all of our knowledge to think strategically, but sometimes we still need to explore,” she says. In most cases, that probably means brief bouts of random behavior during times when we are uncertain that past experience is relevant, followed by a return to more strategic behavior – a more subtle balance that Karpova intends to investigate at the level of changes in activity in individual neural circuits.
Inattention, hyperactivity, and impulsive behavior in children with ADHD can lead to social problems, and these children are often excluded from peer activities. They have also been found to have impaired recognition of emotional expressions in others’ faces. The research group of Professor Ryusuke Kakigi of the National Institute for Physiological Sciences, National Institutes of Natural Sciences, in collaboration with Professor Masami K. Yamaguchi and Assistant Professor Hiroko Ichikawa of Chuo University, identified for the first time the characteristics of facial expression recognition in children with ADHD by measuring hemodynamic responses in the brain, and showed that the neural basis for the recognition of facial expressions may differ from that of typically developing children. The findings are discussed in Neuropsychologia (available online on Aug. 23, 2014).

The research group showed images of a happy expression or an angry expression to 13 children with ADHD and 13 typically developing children and identified the regions of the brain activated at the time. They used non-invasive near-infrared spectroscopy to measure brain activity: near-infrared light, which passes readily through body tissue, was projected through the skull, and the absorbed and scattered light was measured. The amount of light absorbed depends on the concentration of oxyhemoglobin, which delivers oxygen to actively working nerve cells. The result was that typically developing children showed a significant hemodynamic response to both the happy and the angry expression in the right hemisphere of the brain. Children with ADHD, on the other hand, showed a significant hemodynamic response only to the happy expression; no brain activity specific to the angry expression was observed. This difference in the neural basis for the recognition of facial expressions might be responsible for impairments in social recognition and in the establishment of peer relationships.
(Source: eurekalert.org)
New findings on how brain handles tactile sensations
The traditional understanding in neuroscience is that tactile sensations from the skin are only assembled to form a complete experience in the cerebral cortex, the most advanced part of the brain. However, this is challenged by new research findings from Lund University in Sweden that suggest both that other levels in the brain play a greater role than previously thought, and that a larger proportion of the brain’s different structures are involved in the perception of touch.
“It was believed that a tactile sensation, such as touching a simple object, only activated a very small part of the cerebral cortex. However, our findings show that a much larger part is probably activated. The assembly of sensations actually starts in the brainstem”, said neuroscience researcher Henrik Jörntell at Lund University.
According to his colleague Fredrik Bengtsson, who also participated in the research, this is the first study to show how complex tactile sensations from the skin are coded at the cellular level in the brain.
“Our findings have given us a new key to understanding how the perception of touch in the skin is processed and communicated to the brain”, he said.
The Lund researchers have worked in collaboration with researchers in Paris to study how individual nerve cells receive information from the skin. They used a ‘haptic interface’, which created controlled sensations of rolling and slipping movements and of contact initiating and ceasing. Movements proved decisive for the perception of touch – something that was not previously technically possible to study.
The findings of the Swedish-French research group have been published in the distinguished journal Neuron. The work is based on animal experiments and is first and foremost basic research, which aims to increase knowledge of the function of the brain. However, there are also possible areas of application.
“Normal hand and arm prostheses do not give any feedback and therefore no sensation of being a ‘real’ hand or arm. However, there are new, advanced prostheses with sensors that can supply information to the amputated arm. Our research could contribute to the further development of such sensors”, said Henrik Jörntell.
The new findings could also have a bearing on psychiatric illness and brain diseases such as stroke and Parkinson’s disease. Detailed knowledge of how the brain and its various parts process information and create a picture of a tactile experience is important to understanding these conditions.
“If we know how a healthy brain operates, we can compare it with the situation in different diseases. Then perhaps we can help patients’ brains to function more normally”, said Henrik Jörntell.
The first animal model for ALS dementia, a form of ALS that also damages the brain, has been developed by Northwestern Medicine scientists. The advance will allow researchers to directly see the brains of living mice, under anesthesia, at the microscopic level. This will allow direct monitoring of test drugs to determine if they work.

This is one of the latest research findings since the ALS Ice Bucket Challenge heightened interest in the disease and the need for expanded research and funding.
“This new model will allow rapid testing and direct monitoring of drugs in real time,” said Northwestern scientist and study senior author Teepu Siddique, MD. “This will allow scientists to move quickly and accelerate the testing of drug therapies.”
The new mouse model has the pathological hallmarks of the disease in humans with mutations in the genes for UBQLN2 (ubiquilin 2) and SQSTM1 (p62) that Siddique and colleagues identified in 2011. That pathology was linked to all forms of ALS and ALS/dementia.
Dr. Siddique and Han-Xiang Deng, MD, the corresponding authors on the paper, said they have reproduced behavioral, neurophysiological and pathological changes in a mouse that mimic this form of dementia associated with ALS (amyotrophic lateral sclerosis).
Dr. Siddique is the Les Turner ALS Foundation/Herbert C. Wenske Professor of Neurology at Northwestern University Feinberg School of Medicine and a neurologist at Northwestern Memorial Hospital. Dr. Deng is a research professor in Neurology at Feinberg.
The study was published Sept. 22 in the Proceedings of the National Academy of Sciences.
It’s been difficult for scientists to reproduce the genetic mutations of ALS, especially ALS/dementia in animal models, Dr. Siddique noted, which has hampered drug therapy testing.
In five percent or more of cases of ALS, also known as Lou Gehrig’s disease, patients also develop ALS/dementia.
“ALS with dementia is an even more vicious disease than ALS alone because it attacks the brain, causing changes in behavior and language as well as causing paralysis,” Dr. Siddique said.
ALS affects an estimated 350,000 people worldwide, with an average survival of three years. In this progressive neurological disorder, the degeneration of neurons leads to muscle weakness and impaired speaking, swallowing and breathing, eventually causing paralysis and death. The associated dementia affects behavior and may affect decision-making, judgment, insight and language.
(Source: feinberg.northwestern.edu)
Simultaneously using mobile phones, laptops and other media devices could be changing the structure of our brains, according to new University of Sussex research.

A study published today (24 September) in PLOS ONE reveals that people who frequently use several media devices at the same time have lower grey-matter density in one particular region of the brain compared to those who use just one device occasionally.
The research supports earlier studies showing connections between high media-multitasking activity and poor attention in the face of distractions, along with emotional problems such as depression and anxiety.
But neuroscientists Kep Kee Loh and Dr Ryota Kanai point out that their study reveals a link rather than causality and that a long-term study needs to be carried out to understand whether high concurrent media usage leads to changes in the brain structure, or whether those with less-dense grey matter are more attracted to media multitasking.
The researchers at the University of Sussex’s Sackler Centre for Consciousness Science used functional magnetic resonance imaging (fMRI) to look at the brain structures of 75 adults, who had all answered a questionnaire regarding their use and consumption of media devices, including mobile phones and computers, as well as television and print media.
They found that, independent of individual personality traits, people who used a higher number of media devices concurrently also had lower grey-matter density in the part of the brain known as the anterior cingulate cortex (ACC), the region notably responsible for cognitive and emotional control functions.
Kep Kee Loh says: “Media multitasking is becoming more prevalent in our lives today and there is increasing concern about its impacts on our cognition and social-emotional well-being. Our study was the first to reveal links between media multitasking and brain structure.”
Scientists have previously demonstrated that brain structure can be altered by prolonged exposure to novel environments and experiences. Neural pathways and synapses can change in response to our behaviours, environment, and emotions; this plasticity can occur at the cellular level (as in learning and memory) or through cortical re-mapping, whereby the functions of a damaged brain region are taken over by a remaining intact region.
Other studies have shown that training (such as learning to juggle, or taxi drivers learning the map of London) can increase grey-matter densities in certain parts of the brain.
“The exact mechanisms of these changes are still unclear,” says Kep Kee Loh. “Although it is conceivable that individuals with small ACC are more susceptible to multitasking situations due to weaker ability in cognitive control or socio-emotional regulation, it is equally plausible that higher levels of exposure to multitasking situations leads to structural changes in the ACC. A longitudinal study is required to unambiguously determine the direction of causation.”
(Source: sussex.ac.uk)
New research by scientists at the University of Kentucky’s Sanders-Brown Center on Aging suggests that people who notice their memory is slipping may be on to something.

The research, led by Richard Kryscio, Ph.D., chair of the Department of Biostatistics and associate director of the Alzheimer’s Disease Center at UK, appears to confirm that self-reported memory complaints are strong predictors of clinical memory impairment later in life.
Kryscio and his group asked 531 people with an average age of 73 and free of dementia if they had noticed any changes in their memory in the prior year. The participants were also given annual memory and thinking tests for an average of 10 years. After death, participants’ brains were examined for evidence of Alzheimer’s disease.
During the study, 56 percent of the participants reported changes in their memory, at an average age of 82. Participants who reported changes in their memory were nearly three times more likely to develop memory and thinking problems. About one in six participants developed dementia during the study, and 80 percent of those had first reported memory changes.
"What’s notable about our study is the time it took for the transition from self-reported memory complaint to dementia or clinical impairment — about 12 years for dementia and nine years for clinical impairment — after the memory complaints began," Kryscio said. "That suggests that there may be a significant window of opportunity for intervention before a diagnosable problem shows up."
Kryscio points out that while these findings add to a growing body of evidence that self-reported memory complaints can be predictive of cognitive impairment later in life, there isn’t cause for immediate alarm if you can’t remember where you left your keys.
"Certainly, someone with memory issues should report it to their doctor so they can be followed. Unfortunately, however, we do not yet have preventative therapies for Alzheimer’s disease or other illnesses that cause memory problems."
The research, which was supported by grants from the National Institutes of Health, the National Institute on Aging, and the National Center for Advancing Translational Sciences, was published in the Sept. 24, 2014, online issue of Neurology.
(Source: uknow.uky.edu)
(Image caption: This is a coronal view of the hippocampus brain region of a patient with Alzheimer’s disease. Image courtesy of Daniel Tranel’s Laboratory at the UI’s Department of Neurology.)
Alzheimer’s patients can still feel the emotion long after the memories have vanished
A new University of Iowa study further supports an inescapable message: caregivers have a profound influence—good or bad—on the emotional state of individuals with Alzheimer’s disease. Patients may not remember a recent visit by a loved one or having been neglected by staff at a nursing home, but those actions can have a lasting impact on how they feel.
The findings of this study are published in the September 2014 issue of the journal Cognitive and Behavioral Neurology.
UI researchers showed individuals with Alzheimer’s disease clips of sad and happy movies. The patients experienced sustained states of sadness and happiness despite not being able to remember the movies.
“This confirms that the emotional life of an Alzheimer’s patient is alive and well,” says lead author Edmarie Guzmán-Vélez, a doctoral student in clinical psychology, a Dean’s Graduate Research Fellow, and a National Science Foundation Graduate Research Fellow.
Guzmán-Vélez conducted the study with Daniel Tranel, UI professor of neurology and psychology, and Justin Feinstein, assistant professor at the University of Tulsa and the Laureate Institute for Brain Research.
Tranel and Feinstein published a paper in 2010 that predicted the importance of attending to the emotional needs of people with Alzheimer’s, which is expected to affect as many as 16 million people in the United States by 2050 and cost an estimated $1.2 trillion.
“It’s extremely important to see data that support our previous prediction,” Tranel says. “Edmarie’s research has immediate implications for how we treat patients and how we teach caregivers.”
Despite the considerable amount of research aimed at finding new treatments for Alzheimer’s, no drug has succeeded at either preventing or substantially influencing the disease’s progression. Against this foreboding backdrop, the results of this study highlight the need to develop new caregiving techniques aimed at improving the well-being and minimizing the suffering for the millions of individuals afflicted with Alzheimer’s.
For this behavioral study, Guzmán-Vélez and her colleagues invited 17 patients with Alzheimer’s disease and 17 healthy comparison participants to view 20 minutes of sad and then happy movies. These movie clips triggered the expected emotion: sorrow and tears during the sad films and laughter during the happy ones.
About five minutes after watching the movies, the researchers gave participants a memory test to see if they could recall what they had just seen. As expected, the patients with Alzheimer’s disease retained significantly less information about both the sad and happy films than the healthy people. In fact, four patients were unable to recall any factual information about the films, and one patient didn’t even remember watching any movies.
Before and after seeing the films, participants answered questions to gauge their feelings. Patients with Alzheimer’s disease reported elevated levels of either sadness or happiness for up to 30 minutes after viewing the films despite having little or no recollection of the movies.
Quite strikingly, the less the patients remembered about the films, the longer their sadness lasted. While sadness tended to last a little longer than happiness, both emotions far outlasted the memory of the films.
The fact that forgotten events can continue to exert a profound influence on a patient’s emotional life highlights the need for caregivers to avoid causing negative feelings and to try to induce positive feelings.
“Our findings should empower caregivers by showing them that their actions toward patients really do matter,” Guzmán-Vélez says. “Frequent visits and social interactions, exercise, music, dance, jokes, and serving patients their favorite foods are all simple things that can have a lasting emotional impact on a patient’s quality of life and subjective well-being.”
A new, easy-to-use EEG electrode set for the measurement of the electrical activity of the brain was developed in a recent study completed at the University of Eastern Finland. The solutions developed in the PhD study of Pasi Lepola, MSc, make it possible to attach the electrode set on the patient quickly, resulting in reliable results without any special treatment of the skin. As EEG measurements in emergency care are often performed in challenging conditions, the design of the electrode set pays particular attention to the reduction of electromagnetic interference from external sources.
EEG measurements can be used to detect such abnormalities in the electrical activity of the brain that require immediate treatment. These abnormalities are often indications of severe brain damage, cerebral infarction, cerebral haemorrhage, poisoning, or unspecified disturbed levels of consciousness. One of the most serious brain function abnormalities is a prolonged epileptic seizure, status epilepticus, which is impossible to diagnose without an EEG measurement. In many cases, a rapidly performed EEG measurement and the start of a proper treatment significantly reduces the need for aftercare and rehabilitation. This, in turn, drastically improves the cost-effectiveness of the treatment chain.
Although the benefits of EEG measurements are indisputable, they remain underused in acute and emergency care. A significant reason for this is the fact that the electrode sets available on the market are difficult to attach to the patient, and their use requires special skills and constant training. This new type of electrode set is expected to make EEG measurements feasible at as early a stage as possible.

The EEG electrode set was produced using screen printing technology, in which silver ink was used to print the conductors and measurement electrodes on a flexible polyester film. The EEG electrode set consists of 16 hydrogel-coated electrodes which, unlike in the traditional method, are placed on the hair-free areas of the patient’s head, making the set easy to attach. The new EEG electrode set significantly speeds up the measurement process because there is no need to scrape the patient’s skin or to use any separate gels. Because the electrode set is a single flexible piece, the electrodes automatically settle into their correct positions. Furthermore, there is no need to move the patient’s head when putting on the EEG electrode set, which is especially important for patients who may be suffering from a neck or skull injury. Because the disposable electrode set is quick and easy to use, it is particularly well-suited to emergency care, ambulances and even field conditions. Thanks to the materials used, the electrode set does not interfere with any magnetic resonance or computed tomography imaging the patient may undergo.
The performance of the electrode set was tested by using various electrical tests, on several volunteers, and in real patient cases. The results were compared to those obtained by traditional EEG methods.

The PhD study also focused on the use of screen printing technology solutions to protect electrodes against electromagnetic interference. The silver or graphite shielding layer printed to the outer edge of the electrode set was discovered to significantly reduce external interference on the EEG signal. This shielding layer can be easily and cost-efficiently introduced to all measurement electrodes produced with similar methods. Protecting the electrode with a shielding layer is beneficial when measuring weak signals in conditions that contain external interference.
(Source: uef.fi)
Areas of the brain that respond to reward and pleasure are linked to the ability of a drug known as butorphanol to relieve itch, according to new research led by Gil Yosipovitch, MD, Professor and Chair of the Department of Dermatology at Temple University School of Medicine (TUSM), and Director of the Temple Itch Center. The findings point to the involvement of the brain’s opioid receptors—widely known for their roles in pain, reward, and addiction—in itch relief, potentially opening up new avenues to the development of treatments for chronic itch.

The article, published online September 11 in the Journal of Investigative Dermatology, is the first to show precisely where in the brain butorphanol works to relieve itch. In identifying those areas, the study helps to explain why butorphanol works better for chronic itching mediated by histamine, a small molecule involved in allergic reactions, than for nonhistamine-related types of itch.
"The research allows us to assess butorphanol’s effects," Dr. Yosipovitch said. "We can now identify better targets in the brain that drugs can work on to relieve itch."
The research marks an important step toward the development of itch-specific agents. As Dr. Yosipovitch explained, chronic itching, which affects roughly 12 percent of the population, comprises not just one disease, but many—ranging from atopic eczema and psoriasis to systemic diseases such as lymphoma and chronic liver failure. Biochemically, each of those diseases induces itching via one of two main pathways: one that is mediated by histamine and one that is not. Most pathological itching originates along nonhistaminergic pathways.
Working with Alexandru D. P. Papoiu, MD, PhD, at Wake Forest University School of Medicine, Dr. Yosipovitch experimentally induced itch in human volunteers using either histamine or cowhage, which incites nonhistaminergic itching. Study volunteers were then treated with either butorphanol or a placebo and subjected to functional magnetic resonance imaging (fMRI) to analyze brain activity and assess the effects of butorphanol (or placebo). When volunteers returned seven days later, they received the other treatment and again underwent fMRI.
Butorphanol suppressed histamine itching in all cases and reduced cowhage itching in 35 percent of subjects. The drug’s suppression of histamine itching was associated specifically with the activation of brain areas known as the nucleus accumbens and septal nuclei—areas located deep at the base of the forebrain. The regions are notably rich in so-called kappa (κ)-opioid receptors, on which butorphanol acts. By contrast, the relief of cowhage itch by butorphanol was linked to effects in other brain areas.
The findings suggest that butorphanol works primarily on κ-opioid receptors to suppress the itch sensation induced by histamine. But the drug also has important effects on an itch pathway that does not involve histamine, where the demand for new treatments is greatest.
How nonhistaminergic itching is reduced through the involvement of opioid receptors remains unclear. Opioid receptors modulate the transmission of information about itch in the brain and occur at high levels in the areas of the brain that house neural pathways associated with reward. Reward pathways are known particularly for their response to pleasurable stimuli. Dr. Yosipovitch and Dr. Papoiu have shown in previous work that the activation of reward circuits is correlated with pleasurability and the degree of itch relief derived from self-scratching.
The new study, which Yosipovitch carried out at Wake Forest University prior to joining the TUSM faculty in 2013, further illustrates the power of applying imaging technologies to basic questions in itch research. At Temple’s Itch Center, Yosipovitch is continuing to explore those applications.
"We are in a position now to better understand the itch-scratch cycle," he said. "To break the cycle from the top down, knowing where to target receptors in the brain, would be a major achievement."
(Source: temple.edu)
Infant Cooing, Babbling Linked to Hearing Ability
Infants’ vocalizations throughout the first year follow a set of predictable steps from crying and cooing to forming syllables and first words. However, previous research had not addressed how the amount of vocalizations may differ between hearing and deaf infants. Now, University of Missouri research shows that infant vocalizations are primarily motivated by infants’ ability to hear their own babbling. Additionally, infants with profound hearing loss who received cochlear implants to help correct their hearing soon reached the vocalization levels of their hearing peers, putting them on track for language development.
“Hearing is a critical aspect of infants’ motivation to make early sounds,” said Mary Fagan, an assistant professor in the Department of Communication Science and Disorders in the MU School of Health Professions. “This study shows babies are interested in speech-like sounds and that they increase their babbling when they can hear.”
Fagan studied the vocalizations of 27 hearing infants and 16 infants with profound hearing loss who were candidates for cochlear implants, which are small electronic devices embedded into the bone behind the ear that replace some functions of the damaged inner ear. She found that infants with profound hearing loss vocalized significantly less than hearing infants. However, when the infants with profound hearing loss received cochlear implants, the infants’ vocalizations increased to the same levels as their hearing peers within four months of receiving the implants.
“After the infants received their cochlear implants, the significant difference in overall vocalization quantity was no longer evident,” Fagan said. “These findings support the importance of early hearing screenings and early cochlear implantation.”
Fagan found that non-speech-like sounds such as crying, laughing and raspberry sounds were not affected by infants’ hearing ability. This finding, she says, highlights that babies are more interested in speech-like sounds, since they increase their production of sounds such as babbling when they can hear.
“Babies learn so much through sound in the first year of their lives,” Fagan said. “We know learning from others is important to infants’ development, but hearing allows infants to explore their own vocalizations and learn through their own capacity to produce sounds.”
In future research, Fagan hopes to study whether infants explore the sounds of objects such as musical toys to the same degree they explore vocalization.
Fagan’s research, “Frequency of vocalization before and after cochlear implantation: Dynamic effect of auditory feedback on infant behavior,” was published in the Journal of Experimental Child Psychology.