Posts tagged learning

Researcher shows how stress hormones promote brain’s building of negative memories
When a person experiences a devastating loss or tragic event, why does every detail seem burned into memory whereas a host of positive experiences simply fade away?
It’s a bit more complicated than scientists originally thought, according to a study recently published in the journal Neuroscience by ASU researcher Sabrina Segal.
When people experience a traumatic event, the body releases two major stress hormones: norepinephrine and cortisol. Norepinephrine boosts heart rate and controls the fight-or-flight response, commonly rising when individuals feel threatened or experience highly emotional reactions. It is chemically similar to the hormone epinephrine – better known as adrenaline.
In the brain, norepinephrine in turn functions as a powerful neurotransmitter or chemical messenger that can enhance memory.
Research on cortisol has demonstrated that this hormone can also have a powerful effect on strengthening memories. However, studies in humans have until now been inconclusive, with cortisol sometimes enhancing memory and at other times having no effect.
A key factor in whether cortisol strengthens certain memories may be the activation of norepinephrine during learning, a finding previously reported in studies with rats.
In her study, Segal, an assistant research professor at the Institute for Interdisciplinary Salivary Bioscience Research at ASU, and her colleagues at the University of California-Irvine showed that human memory enhancement functions in a similar way.
Conducted in the laboratory of Larry Cahill at UC Irvine, Segal’s study included 39 women who viewed 144 images from the International Affective Picture Set, a standardized picture set used by researchers to elicit responses ranging from neutral to strongly emotional.
Segal and her colleagues gave each of the study’s subjects either a dose of hydrocortisone – to simulate stress – or a placebo just prior to viewing the picture set. Each woman then rated her feelings at the time she was viewing the image, in addition to giving saliva samples before and after. One week later, a surprise recall test was administered.
What Segal’s team found was that “negative experiences are more readily remembered when an event is traumatic enough to release cortisol after the event, and only if norepinephrine is released during or shortly after the event.”
“This study provides a key component to better understanding how traumatic memories may be strengthened in women,” Segal added, “because it suggests that if we can lower norepinephrine levels immediately following a traumatic event, we may be able to prevent this memory enhancing mechanism from occurring, regardless of how much cortisol is released following a traumatic event.”
Further studies are needed to explore to what extent the relationship between these two stress hormones differs between males and females, particularly because women are twice as likely to develop disorders from stress and trauma that affect memory, such as posttraumatic stress disorder (PTSD). In the meantime, the team’s findings are a first step toward a better understanding of the neurobiological mechanisms that underlie traumatic disorders such as PTSD.
(Image: Wikimedia Commons)
When it comes to learning languages, adults and children have different strengths. Adults excel at absorbing the vocabulary needed to navigate a grocery store or order food in a restaurant, but children have an uncanny ability to pick up on subtle nuances of language that often elude adults. Within months of living in a foreign country, a young child may speak a second language like a native speaker.
Brain structure plays an important role in this “sensitive period” for learning language, which is believed to end around adolescence. The young brain is equipped with neural circuits that can analyze sounds and build a coherent set of rules for constructing words and sentences out of those sounds. Once these language structures are established, it’s difficult to build another one for a new language.
In a new study, a team of neuroscientists and psychologists led by Amy Finn, a postdoc at MIT’s McGovern Institute for Brain Research, has found evidence for another factor that contributes to adults’ language difficulties: When learning certain elements of language, adults’ more highly developed cognitive skills actually get in the way. The researchers discovered that the harder adults tried to learn an artificial language, the worse they were at deciphering the language’s morphology — the structure and deployment of linguistic units such as root words, suffixes, and prefixes.
“We found that effort helps you in most situations, for things like figuring out what the units of language that you need to know are, and basic ordering of elements. But when trying to learn morphology, at least in this artificial language we created, it’s actually worse when you try,” Finn says.
Finn and colleagues from the University of California at Santa Barbara, Stanford University, and the University of British Columbia describe their findings in the July 21 issue of PLOS ONE. Carla Hudson Kam, an associate professor of linguistics at British Columbia, is the paper’s senior author.
Too much brainpower
Linguists have known for decades that children are skilled at absorbing certain tricky elements of language, such as irregular past participles (examples of which, in English, include “gone” and “been”) or complicated verb tenses like the subjunctive.
“Children will ultimately perform better than adults in terms of their command of the grammar and the structural components of language — some of the more idiosyncratic, difficult-to-articulate aspects of language that even most native speakers don’t have conscious awareness of,” Finn says.
In 1990, linguist Elissa Newport hypothesized that adults have trouble learning those nuances because they try to analyze too much information at once. Adults have a much more highly developed prefrontal cortex than children, and they tend to throw all of that brainpower at learning a second language. This high-powered processing may actually interfere with certain elements of learning language.
“It’s an idea that’s been around for a long time, but there hasn’t been any data that experimentally show that it’s true,” Finn says.
Finn and her colleagues designed an experiment to test whether exerting more effort would help or hinder success. First, they created nine nonsense words, each with two syllables. Each word fell into one of three categories (A, B, and C), defined by the order of consonant and vowel sounds.
Study subjects listened to the artificial language for about 10 minutes. One group of subjects was told not to overanalyze what they heard, but not to tune it out either. To help them not overthink the language, they were given the option of completing a puzzle or coloring while they listened. The other group was told to try to identify the words they were hearing.
Each group heard the same recording, which was a series of three-word sequences — first a word from category A, then one from category B, then category C — with no pauses between words. Previous studies have shown that adults, babies, and even monkeys can parse this kind of information into word units, a task known as word segmentation.
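Statistical word segmentation of this kind is easy to sketch. The toy learner below is a hedged illustration, not the study’s actual materials: the nonsense words are hypothetical stand-ins, but the cue it exposes is the one such experiments rely on: syllable-to-syllable transition probabilities are high inside a word and lower across word boundaries in a pauseless A-B-C stream.

```python
import random
from collections import defaultdict

random.seed(0)

# Hypothetical nonsense words (the study's actual items are not given):
# three two-syllable words per category A, B, and C.
categories = {
    "A": ["bapu", "dilo", "gemo"],
    "B": ["kasi", "tupe", "nofa"],
    "C": ["ruvi", "sode", "lima"],
}

def syllables(word):
    # Each word is two consonant-vowel syllables, so split at the midpoint.
    return [word[:2], word[2:]]

# Build a continuous stream of A-B-C triplets with no pauses between words.
stream = []
for _ in range(500):
    for cat in ("A", "B", "C"):
        stream.extend(syllables(random.choice(categories[cat])))

# Transitional probability P(next syllable | current syllable).
pair_counts = defaultdict(int)
syl_counts = defaultdict(int)
for s1, s2 in zip(stream, stream[1:]):
    pair_counts[(s1, s2)] += 1
    syl_counts[s1] += 1

def tp(s1, s2):
    return pair_counts[(s1, s2)] / syl_counts[s1]

# Within-word transitions are deterministic (TP = 1.0), while a transition
# across a word boundary is diluted over three possible next words (TP ~ 1/3).
print(tp("ba", "pu"))   # within the word "bapu": 1.0
print(tp("pu", "ka"))   # boundary from "bapu" into category B: about 1/3
```

Dips in transitional probability mark the word boundaries, which is how a listener (or this toy learner) can carve an unbroken syllable stream into word units.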
Subjects from both groups were successful at word segmentation, although the group that tried harder performed a little better. Both groups also performed well in a task called word ordering, which required subjects to choose between a correct word sequence (ABC) and an incorrect sequence (such as ACB) of words they had previously heard.
The final test measured skill in identifying the language’s morphology. The researchers played a three-word sequence that included a word the subjects had not heard before, but which fit into one of the three categories. When asked to judge whether this new word was in the correct location, the subjects who had been asked to pay closer attention to the original word stream performed much worse than those who had listened more passively.
“This research is exciting because it provides evidence indicating that effortful learning leads to different results depending upon the kind of information learners are trying to master,” says Michael Ramscar, a professor of linguistics at the University of Tübingen who was not part of the research team. “The results indicate that learning to identify relatively simple parts of language, such as words, is facilitated by effortful learning, whereas learning more complex aspects of language, such as grammatical features, is impeded by effortful learning.”
Turning off effort
The findings support a theory of language acquisition that suggests that some parts of language are learned through procedural memory, while others are learned through declarative memory. Under this theory, declarative memory, which stores knowledge and facts, would be more useful for learning vocabulary and certain rules of grammar. Procedural memory, which guides tasks we perform without conscious awareness of how we learned them, would be more useful for learning subtle rules related to language morphology.
“It’s likely to be the procedural memory system that’s really important for learning these difficult morphological aspects of language. In fact, when you use the declarative memory system, it doesn’t help you, it harms you,” Finn says.
Still unresolved is the question of whether adults can overcome this language-learning obstacle. Finn says she does not have a good answer yet but she is now testing the effects of “turning off” the adult prefrontal cortex using a technique called transcranial magnetic stimulation. Other interventions she plans to study include distracting the prefrontal cortex by forcing it to perform other tasks while language is heard, and treating subjects with drugs that impair activity in that brain region.

How the brain stabilizes its connections in order to learn better
Throughout our lives, our brains adapt to what we learn and memorise. The brain is indeed made up of complex networks of neurons and synapses that are constantly re-configured. However, in order for learning to leave a trace, connections must be stabilized. A team at the University of Geneva (UNIGE) discovered a new cellular mechanism involved in the long-term stabilization of neuron connections, in which non-neuronal cells, called astrocytes, play a role unidentified until now. These results, published in Current Biology, will lead to a better understanding of neurodegenerative and neurodevelopmental diseases.
Excitatory synapses in the central nervous system – points of contact between neurons that allow them to transmit signals – are highly dynamic structures that continuously form and dissolve. They are surrounded by non-neuronal glial cells, which include the distinctively star-shaped astrocytes. These cells form complex structures around synapses and play a role in the transmission of cerebral information that until now was largely unknown.
Plasticity and Stability
By increasing neuronal activity through whisker stimulation in adult mice, the scientists observed, in both the somatosensory cortex and the hippocampus, that this increased neuronal activity provokes an increase in astrocyte movement around synapses. The synapses, surrounded by astrocytes, re-organise their architecture, which protects them and increases their longevity. The team of researchers led by Dominique Muller, Professor in the Department of Fundamental Neuroscience of the Faculty of Medicine at UNIGE, developed new techniques that allowed them to specifically “control” the different synaptic structures and to show that the phenomenon took place exclusively in the connections between neurons involved in learning. “In summary, the more the astrocytes surround the synapses, the longer the synapses last, thus allowing learning to leave a mark on memory,” explained Yann Bernardinelli, the lead author of the study.
This study identifies a new, two-way interaction between neurons and astrocytes, in which the learning process regulates the structural plasticity of astrocytes, which in turn determine the fate of the synapses. This mechanism indicates that astrocytes play an important role in the processes of learning and memory, processes that are abnormal in various neurodegenerative and neurodevelopmental diseases, including Alzheimer’s disease, autism and Fragile X syndrome.
This discovery highlights the previously underestimated importance of cells that, despite being non-neuronal, participate in a crucial way in the cerebral mechanisms that allow us to learn and retain memories of what we have learned.
A new study from the Gladstone Institutes has revealed a way to alleviate the learning and memory deficits caused by apoE4, the most important genetic risk factor for Alzheimer’s disease, improving cognition to normal levels in aged mice.
In the study, which was conducted in collaboration with researchers at UC San Francisco and published today in the Journal of Neuroscience, scientists transplanted inhibitory neuron progenitors—early-stage brain cells that have the capacity to develop into mature inhibitory neurons—into two mouse models of Alzheimer’s disease, apoE4 or apoE4 with accumulation of amyloid beta, another major contributor to Alzheimer’s. The transplants helped to replenish the brain by replacing cells lost due to apoE4, regulating brain activity and improving learning and memory abilities.
“This is the first time transplantation of inhibitory neuron progenitors has been used in aged Alzheimer’s disease models,” said first author Leslie Tong, a graduate student at the Gladstone Institutes and UCSF. “Working with older animals can be challenging from a technical standpoint, and it was amazing to see that the cells not only survived but affected activity and behavior.”
The success of the treatment in older mice, which corresponded to late adulthood in humans, is particularly important, as this would be the age that would be targeted were this method ever to be used therapeutically in people.
“This is a very important proof of concept study,” said senior author Yadong Huang, MD, PhD, an associate investigator at Gladstone Institutes and associate professor of neurology and pathology at UCSF. “The fact that we see a functional integration of these cells into the hippocampal circuitry and a complete rescue of learning and memory deficits in an aged model of Alzheimer’s disease is very exciting.”
A balance of excitatory and inhibitory activity in the brain is essential for normal function. However, in the apoE4 model of Alzheimer’s disease—a genetic risk factor that is carried by approximately 25% of the population and is involved in 60-75% of all Alzheimer’s cases—this balance gets disrupted due to a decline in inhibitory regulator cells that are essential in maintaining normal brain activity. The hippocampus, an important memory center in the brain, is particularly affected by this loss of inhibitory neurons, resulting in an increase in network activation that is thought to contribute to the learning and memory deficits characteristic of Alzheimer’s disease. The accumulation of amyloid beta in the brain has also been linked to this imbalance between excitatory and inhibitory activity in the brain.
In the current study, the researchers hoped that by grafting inhibitory neuron progenitors into the hippocampus of aged apoE4 mice, they would be able to combat these effects, replacing the lost cells and restoring normal function to the area. Remarkably, these new inhibitory neurons survived in the hippocampus, enhancing inhibitory signaling and rescuing impairments in learning and memory.
In addition, when these inhibitory progenitor cells were transplanted into apoE4 mice with an accumulation of amyloid beta, prior deficits were alleviated. However, the new inhibitory neurons did not affect amyloid beta levels, suggesting that the cognitive enhancement did not occur as a result of amyloid clearance, and amyloid did not impair the integration of the transplant.
According to Dr. Huang, the potential implications for these findings extend beyond the current methods used. “Stem cell therapy in humans is still a long way off. However, this study tells us that if there is any way we can enhance inhibitory neuron function in the hippocampus, like through the development of small molecule compounds, it may be beneficial for Alzheimer disease patients.”
(Source: gladstoneinstitutes.org)
A recent study shows that assimilation of L2 vowels to L1 phonemes governs language learning in adulthood; researchers urge development of novel methods of second language teaching.

The behavioral and neural evidence was gathered by researchers at Aalto University in Finland and at the University of Salento in Italy. The study is the first to identify the neural mechanisms underlying the learning of second-language (L2) sounds in adulthood. Overall, this and earlier studies support the hypothesis that students in a foreign-language classroom benefit most from learning environments in which they receive a focused amount of high-quality input from native L2 teachers, use the L2 pervasively to achieve functional and communicative goals, and receive intensive training (including the use of multimedia systems) in the perception and production of L2 sounds, in order to reactivate the neuroplasticity of the auditory cortex.
Learning the sounds of a second language (L2) in adulthood means assimilating them to the phonemes of the native language (L1).
In the study, two samples of Italian students, attending first-year and fifth-year classes of an English Language curriculum, were invited to the behavioral and electroencephalography (EEG) lab. Dr. Brattico, the study’s senior author from Aalto University, explains: “The discrimination skills were measured by crossing two methodologies: on one hand, perception tests in which the students listened to pairs of English sounds that I synthesized and had to judge how similar or different they were, and on the other hand, EEG recordings with a 64-electrode cap, while the students were presented with the same pairs of sounds and watched a silenced movie.”
The EEG recordings were used to extract the auditory event-related potential, namely the succession of neural events, originating in the auditory cortex, that is necessary for the processing and representation of sound.
“When we hear linguistic sounds that are part of our native tongue, in a few milliseconds the brain is able to decipher the acoustic signal, extract the peculiar characteristics of each sound and produce a mental representation of it: thus we are able to discern one sound from another and assemble first the syllables, then the words and so on”, adds the first author, Professor Grimaldi, University of Salento.
“We compared the neural responses of the auditory cortex of the two groups of university students with one another and with a control group with a low level of education (third year of junior secondary school),” explains Grimaldi. “We started with this hypothesis: if during their academic studies the students had developed new perceptual abilities, we would find different neural responses for the three groups.” The results did not confirm the hypothesis, but instead showed that, neurally, the L2 sounds were assimilated to L1 phonemes in all the groups.
Grimaldi adds: “Consider, for example, what happens when we watch a movie or listen to a song in a language we do not know: we can perceive acoustic differences, but we cannot ‘extract’ the words from the acoustic stream and access their meaning. This is what happened with our groups of students.” Previous behavioral studies that observed L2 learners with different native languages in an educational context (German, Finnish, Japanese, Turkish and other learners of English) never produced results favorable for the teachers. “This study confirms and extends such results, proving by means of neurophysiological data that the quantity and quality of the stimuli received by university students are not enough to form long-term traces of L2 sounds in the auditory cortex,” confirms Brattico.
The results were published online in Frontiers in Human Neuroscience.
(Source: web.aalto.fi)
New research: teaching the brain to reduce pain
People can be conditioned to feel less pain when they hear a neutral sound, new research from the University of Luxembourg has found. This lends weight to the idea that we can learn to use mind over matter to beat pain. The scientific article was published recently in the online journal PLOS ONE.
Scientists have known for many years that on-going pain in one part of the body is reduced when a new pain is inflicted to another part of the body. This pain blocking is a physiological reaction by the nervous system to help the body deal with a potentially more relevant novel threat.
To explore this “pain inhibits pain” phenomenon, painful electric pulses were first administered to a subject’s foot (first pain) and the resulting pain intensity was then measured. Then the subject was asked to put their hand in a bucket of ice water (novel stimulus causing pain reduction), and as they did so, a telephone ringtone sounded in headphones. After this procedure had been repeated several times, it was observed that the pain felt from the electrical stimulation was reduced simply when the ring tone sounded.
The brain had been conditioned to the ringtone being a signal to trigger the body’s physical pain blocking mechanism. The people being tested not only felt significantly less pain, but there were also fewer objective signs of pain, such as activity in the muscles used in the facial expression of pain (frowning). In total, 32 people were tested.
“We have shown that just as the physiological reaction of saliva secretion was provoked in Pavlov’s dogs by the ringing of a bell, an analogous effect occurs regarding the ability to mask pain in humans,” said Fernand Anton, Professor of Biological Psychology at the University of Luxembourg. “Conversely, similar learning effects may be involved in the enhancement and maintenance of pain in some patients,” added Raymonde Scheuren, lead researcher in this study.
Study finds that learning by repetition impairs recall of details
When learning, practice doesn’t always make perfect.
UC Irvine neurobiologists Zachariah Reagh and Michael Yassa have found that while repetition enhances the factual content of memories, it can reduce the amount of detail stored with those memories. This means that with repeated recall, nuanced aspects may fade away.
In the study, which appears this month in Learning & Memory, student participants were asked to look at pictures either once or three times. They were then tested on their memories of those images. The researchers found that multiple views increased factual recall but actually hindered subjects’ ability to reject similar “imposter” pictures. This suggests that the details of those memories may have been shaken loose by repetition.
This discovery supports Reagh and Yassa’s Competitive Trace Theory – published last year in Frontiers in Behavioral Neuroscience – which posits that the details of a memory become more subjective the more they’re recalled and can compete with bits of other, similar memories. The scientists hypothesize that this may even lead to false memories, akin to a brain version of the telephone game.
Yassa, an assistant professor of neurobiology & behavior, said that these findings do not discredit the practice of repetitive learning. However, he noted, pure repetition alone has limitations. For a more enriching and lasting learning experience through which nuance and detail are readily recalled, other memory techniques should be used to complement repetition.
(Image caption: Brain scans show high activity in the medial prefrontal cortex (top) and striatum (bottom) while playing a competitive game. UC Berkeley and UIUC researchers have now found genetic variations in dopamine-regulating genes in the prefrontal cortex and striatum associated with differences in belief learning and reinforcement learning, respectively. Credit: Ming Hsu)
Your genes affect your betting behavior
Investors and gamblers take note: your betting decisions and strategy are determined, in part, by your genes.
Researchers from the University of California, Berkeley, National University of Singapore and University of Illinois at Urbana-Champaign (UIUC) have shown that betting decisions in a simple competitive game are influenced by the specific variants of dopamine-regulating genes in a person’s brain.
Dopamine is a neurotransmitter – a chemical released by brain cells to signal other brain cells – that is a key part of the brain’s reward and pleasure-seeking system. Dopamine deficiency leads to Parkinson’s disease, while disruption of the dopamine network is linked to numerous psychiatric and neurodegenerative disorders, including schizophrenia, depression and dementia.
While previous studies have shown the important role of the neurotransmitter dopamine in social interactions, this is the first study tying these interactions to specific genes that govern dopamine functioning.
“This study shows that genes influence complex social behavior, in this case strategic behavior,” said study leader Ming Hsu, an assistant professor of marketing in UC Berkeley’s Haas School of Business and a member of the Helen Wills Neuroscience Institute. “We now have some clues about the neural mechanisms through which our genes affect behavior.”
The implications for business are potentially vast but unclear, Hsu said, though one possibility is training workforces to be more strategic. But the findings could significantly affect our understanding of diseases involving dopamine, such as schizophrenia, as well as disorders of social interaction, such as autism.
“When people talk about dopamine dysfunction, schizophrenia is one of the first diseases that come to mind,” Hsu said, noting that the disease involves a very complex pattern of social and decision making deficits. “To the degree that we can better understand ubiquitous social interactions in strategic settings, it may help us understand how to characterize and eventually treat the social deficits that are symptoms of diseases like schizophrenia.”
Hsu, UIUC graduate student Eric Set and their colleagues, including Richard P. Ebstein and Soo Hong Chew from the National University of Singapore, will publish their findings the week of June 16 in the online early edition of the Proceedings of the National Academy of Sciences.
Two brain areas involved in competition
Hsu established two years ago that when people engage in competitive social interactions, such as betting games, they primarily call upon two areas of the brain: the medial prefrontal cortex, which is the executive part of the brain, and the striatum, which deals with motivation and is crucial for learning to acquire rewards. Functional magnetic resonance imaging (fMRI) scans showed that people playing these games displayed intense activity in these areas.
“If you think of the brain as a computing machine, these are areas that take inputs, crank them through an algorithm, and translate them into behavioral outputs,” Hsu said. “What is really interesting about these areas is that both are innervated by neurons that use dopamine.”
Hsu and Set of UIUC’s Department of Economics wanted to determine which genes involved in regulating dopamine concentrations in these brain areas were associated with strategic thinking, so they enlisted as subjects a group of 217 undergraduates at the National University of Singapore, all of whom had had their genomes scanned for some 700,000 genetic variants. The researchers focused on only 143 variants within 12 genes involved in regulating dopamine. Some of the 12 are primarily involved in regulating dopamine in the prefrontal cortex, while others primarily regulate dopamine in the striatum.
The competition was a game called patent race, commonly used by social scientists to study social interactions. It involves one person betting, via computer, with an anonymous opponent.
“We know from brain imaging studies that when people compete against one another, they actually engage in two distinct types of learning processes,” Set said, referring to Hsu’s 2012 study. “One type involves learning purely from the consequences of your own actions, called reinforcement learning. The other is a bit more sophisticated, called belief learning, where people try to make a mental model of the other players, in order to anticipate and respond to their actions.”
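The two learning processes Set describes can be illustrated with a minimal sketch. This is not the study’s fitted model: the two-action matching game below is a hypothetical stand-in for the patent race, and the update rules are textbook reinforcement learning and fictitious play (a simple form of belief learning).

```python
import random

random.seed(1)

def payoff(my_action, opp_action):
    # Win 1 if actions match, else 0 (a stand-in for the real game's payoffs).
    return 1.0 if my_action == opp_action else 0.0

# --- Reinforcement learning: value each action by one's own past payoffs ---
q = [0.0, 0.0]
alpha = 0.2  # learning rate: how quickly older experience is forgotten

def rl_choose():
    return 0 if q[0] > q[1] else 1

def rl_update(action, reward):
    q[action] += alpha * (reward - q[action])

# --- Belief learning (fictitious play): model the opponent, best-respond ---
opp_counts = [1.0, 1.0]  # pseudo-counts of the opponent's past actions

def belief_choose():
    p_opp_0 = opp_counts[0] / sum(opp_counts)
    # Expected payoff of playing 0 is p_opp_0; of playing 1, it is 1 - p_opp_0.
    return 0 if p_opp_0 >= 0.5 else 1

def belief_update(opp_action):
    opp_counts[opp_action] += 1.0

# Play a short match against an opponent biased toward action 0.
for _ in range(50):
    opp = 0 if random.random() < 0.8 else 1
    a_rl = rl_choose()
    rl_update(a_rl, payoff(a_rl, opp))
    belief_update(opp)

print(belief_choose())  # best response to an opponent who mostly plays 0
```

The contrast is the point: the reinforcement learner only tracks how rewarding its own actions have been, while the belief learner builds an explicit model of the opponent and responds strategically to it.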
Trial-and-error learning vs belief learning
Using a mathematical model of brain function during competitive social interactions, Hsu and Set correlated performance in reinforcement learning and belief learning with different variants or mutations of the 12 dopamine-related genes, and discovered a distinct difference.
They found that differences in belief learning – the degree to which players were able to anticipate and respond to the actions of others, or to imagine what their competitor is thinking and respond strategically – were associated with variation in three genes that primarily affect dopamine functioning in the medial prefrontal cortex.
In contrast, differences in trial-and-error reinforcement learning – how quickly players forget past experiences and how quickly they change strategy – were associated with variation in two genes that primarily affect striatal dopamine.
Hsu said that the findings correlate well with previous brain studies showing that the prefrontal cortex is involved in belief learning, while the striatum is involved in reinforcement learning.
“We were surprised by the degree of overlap, but it hints at the power of studying the neural and genetic levels under a single mathematical framework, which is only beginning in this area,” he said.
Hsu is currently collaborating with other scientists to correlate career achievements in older adults with genes and performance on competitive games, to see which brain regions and types of learning are most important for different kinds of success in life.
Synchronized brain waves enable rapid learning
The human mind can rapidly absorb and analyze new information as it flits from thought to thought. These quickly changing brain states may be encoded by synchronization of brain waves across different brain regions, according to a new study from MIT neuroscientists.
The researchers found that as monkeys learn to categorize different patterns of dots, two brain areas involved in learning — the prefrontal cortex and the striatum — synchronize their brain waves to form new communication circuits.
“We’re seeing direct evidence for the interactions between these two systems during learning, which hasn’t been seen before. Category-learning results in new functional circuits between these two areas, and these functional circuits are rhythm-based, which is key because that’s a relatively new concept in systems neuroscience,” says Earl Miller, the Picower Professor of Neuroscience at MIT and senior author of the study, which appears in the June 12 issue of Neuron.
There are millions of neurons in the brain, each producing its own electrical signals. These combined signals generate oscillations known as brain waves, which can be measured by electroencephalography (EEG). The research team focused on EEG patterns from the prefrontal cortex —the seat of the brain’s executive control system — and the striatum, which controls habit formation.
The phenomenon of brain-wave synchronization likely precedes the changes in synapses, or connections between neurons, believed to underlie learning and long-term memory formation, Miller says. That process, known as synaptic plasticity, is too time-consuming to account for the human mind’s flexibility, he believes.
“If you can change your thoughts from moment to moment, you can’t be doing it by constantly making new connections and breaking them apart in your brain. Plasticity doesn’t happen on that kind of time scale,” says Miller, who is a member of MIT’s Picower Institute for Learning and Memory. “There’s got to be some way of dynamically establishing circuits to correspond to the thoughts we’re having in this moment, and then if we change our minds a moment later, those circuits break apart somehow. We think synchronized brain waves may be the way the brain does it.”
The paper’s lead author is former Picower Institute postdoc Evan Antzoulatos, who is now at the University of California at Davis.
Humming together
Miller’s lab has previously shown that during category-learning, neurons in the striatum become active early, followed by slower activation of neurons in the prefrontal cortex. “The striatum learns very simple things really quickly, and then its output trains the prefrontal cortex to gradually pick up on the bigger picture,” Miller says. “The striatum learns the pieces of the puzzle, and then the prefrontal cortex puts the pieces of the puzzle together.”
In the new study, the researchers wanted to investigate whether this activity pattern actually reflects communication between the prefrontal cortex and striatum, or if each region is working independently. To do this, they measured EEG signals as monkeys learned to assign patterns of dots into one of two categories.
At first, the animals were shown just two different examples, or “exemplars,” from each category. After each round, the number of exemplars was doubled. In the early stages, the animals could simply memorize which exemplars belonged to each category. However, the number of exemplars eventually became too large for the animals to memorize all of them, and they began to learn the general traits that characterized each category.
By the end of the experiment, when the researchers were showing 256 novel exemplars, the monkeys were able to categorize all of them correctly.
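The shift the task is designed to force — from memorizing individual exemplars to learning each category’s general traits — can be illustrated with a minimal sketch. This is not the study’s actual stimuli or analysis: the dot patterns are reduced to hypothetical 2-D points scattered around two invented category centers, and the “learner” simply classifies a novel point by its nearest category mean. The point is that such a learner generalizes to exemplars it has never seen, which pure memorization cannot do once the set doubles past what can be stored.

```python
import math
import random

random.seed(0)

# Hypothetical category centers standing in for the two dot-pattern categories
PROTOTYPES = {"A": (0.0, 0.0), "B": (5.0, 5.0)}

def make_exemplar(label, noise=1.0):
    """A dot pattern reduced to a 2-D point: the prototype plus random distortion."""
    px, py = PROTOTYPES[label]
    return (px + random.gauss(0, noise), py + random.gauss(0, noise)), label

def train_means(training):
    """Category learning: estimate each category's central tendency from examples."""
    sums = {}
    for (x, y), label in training:
        sx, sy, n = sums.get(label, (0.0, 0.0, 0))
        sums[label] = (sx + x, sy + y, n + 1)
    return {lab: (sx / n, sy / n) for lab, (sx, sy, n) in sums.items()}

def classify(point, means):
    """Assign a novel exemplar to the category with the nearest learned mean."""
    return min(means, key=lambda lab: math.dist(point, means[lab]))

# Doubling rounds, as in the experiment: 2, 4, ..., 256 exemplars per category
for n in [2, 4, 8, 16, 32, 64, 128, 256]:
    training = [make_exemplar(lab) for lab in PROTOTYPES for _ in range(n)]
    means = train_means(training)
    novel = [make_exemplar(lab) for lab in PROTOTYPES for _ in range(100)]
    correct = sum(classify(p, means) == lab for p, lab in novel)
    print(f"{n:3d} exemplars/category -> {correct / len(novel):.0%} novel accuracy")
```

Because the learner stores only category means rather than individual exemplars, its accuracy on novel items stays high even as the exemplar count grows — the behavioral signature the monkeys showed once they stopped memorizing.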
As the monkeys shifted from rote memorization to learning the categories, the researchers saw a corresponding shift in EEG patterns. Brain waves known as “beta bands,” produced independently by the prefrontal cortex and the striatum, began to synchronize with each other. This suggests that a communication circuit is forming between the two regions, Miller says.
“There is some unknown mechanism that allows these resonance patterns to form, and these circuits start humming together,” he says. “That humming may then foster subsequent long-term plasticity changes in the brain, so real anatomical circuits can form. But the first thing that happens is they start humming together.”
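One standard way neuroscientists quantify the kind of rhythmic coupling described here is the phase-locking value: take the phase difference between two signals at many time points and measure how tightly those differences cluster. The sketch below is an illustration of that measure on synthetic phases, not the paper’s analysis pipeline; the jitter level and phase lag are invented for the demonstration.

```python
import cmath
import math
import random

random.seed(1)

def plv(phases_a, phases_b):
    """Phase-locking value: magnitude of the mean unit vector at the phase
    difference. Near 1.0 = consistently synchronized, near 0 = independent."""
    vecs = [cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b)]
    return abs(sum(vecs) / len(vecs))

n = 1000
# "Humming together": a constant phase lag plus small jitter between regions
base = [random.uniform(0, 2 * math.pi) for _ in range(n)]
locked = [p + 0.5 + random.gauss(0, 0.2) for p in base]
# No coupling: unrelated phases in each region
independent = [random.uniform(0, 2 * math.pi) for _ in range(n)]

print(f"locked PLV      ~ {plv(base, locked):.2f}")
print(f"independent PLV ~ {plv(base, independent):.2f}")
```

Note that a high PLV tolerates a fixed lag between regions — only the *consistency* of the phase relationship matters, which is why it suits the idea of two areas forming a stable rhythmic circuit.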
A little later, as an animal nailed down the two categories, two separate circuits formed between the striatum and prefrontal cortex, each corresponding to one of the categories.
“This is the first paper that provides data suggesting that coupling in the beta-band between prefrontal cortex and striatum may play a key role in category-formation. In addition to revealing a novel mechanism involved in category-learning, the results also contribute to better understanding of the significance of coupled beta-band oscillations in the brain,” says Andreas Engel, a professor of physiology at the University Medical Center Hamburg-Eppendorf in Germany.
“Expanding your knowledge”
Previous studies have shown that during cognitively demanding tasks, there is increased synchrony between the frontal cortex and visual cortex, but Miller’s lab is the first to show specific patterns of synchrony linked to specific thoughts.
Miller and Antzoulatos also showed that once the prefrontal cortex learns the categories and sends them to the striatum, the categories undergo further modification there as new information comes in, allowing more expansive learning to take place. This iteration can occur over and over.
“That’s how you get the open-ended nature of human thought. You keep expanding your knowledge,” Miller says. “The prefrontal cortex learning the categories isn’t the end of the game. The cortex is learning these new categories and then forming circuits that can send the categories down to the striatum as if it’s just brand-new material for the brain to elaborate on.”
In follow-up studies, the researchers are now looking at how the brain learns more abstract categories, and how activity in the striatum and prefrontal cortex might reflect that type of abstraction.
Sleep After Learning Strengthens Connections Between Brain Cells and Enhances Memory
In a study published today in Science, researchers at NYU Langone Medical Center show for the first time that sleep after learning encourages the growth of dendritic spines, the tiny protrusions from brain cells that connect to other brain cells and facilitate the passage of information across synapses, the junctions at which brain cells meet. Moreover, the activity of brain cells during deep sleep, or slow-wave sleep, after learning is critical for such growth.
The findings, in mice, provide important physical evidence in support of the hypothesis that sleep helps consolidate and strengthen new memories, and show for the first time how learning and sleep cause physical changes in the motor cortex, a brain region responsible for voluntary movements.
“We’ve known for a long time that sleep plays an important role in learning and memory. If you don’t sleep well you won’t learn well,” says senior investigator Wen-Biao Gan, PhD, professor of neuroscience and physiology and a member of the Skirball Institute of Biomolecular Medicine at NYU Langone Medical Center. “But what’s the underlying physical mechanism responsible for this phenomenon? Here we’ve shown how sleep helps neurons form very specific connections on dendritic branches that may facilitate long-term memory. We also show how different types of learning form synapses on different branches of the same neurons, suggesting that learning causes very specific structural changes in the brain.”
On the cellular level, sleep is anything but restful: Brain cells that spark as we digest new information during waking hours replay during deep sleep, also known as slow-wave sleep, when brain waves slow down and rapid eye movement, as well as dreaming, stops. Scientists have long believed that this nocturnal replay helps us form and recall new memories, yet the structural changes underpinning this process have remained poorly understood.
To shed light on this process, Dr. Gan and colleagues employed mice genetically engineered to express a fluorescent protein in neurons. Using a special laser-scanning microscope that illuminates the glowing fluorescent proteins in the motor cortex, the scientists were then able to track and image the growth of dendritic spines along individual branches of dendrites before and after mice learned to balance on a spinning rod. Over time the mice learned how to balance on the rod as it gradually spun faster. “It’s like learning to ride a bike,” says Dr. Gan. “Once you learn it, you never forget.”
After documenting that mice do, in fact, sprout new spines along dendritic branches within six hours of training on the spinning rod, the researchers set out to understand how sleep would affect this physical growth. They trained two sets of mice: one trained on the spinning rod for an hour and then slept for 7 hours; the second trained for the same period of time on the rod but stayed awake for 7 hours. The scientists found that the sleep-deprived mice experienced significantly less dendritic spine growth than the well-rested mice. Furthermore, they found that the type of task learned determined which dendritic branches sprouted spines.
Running forward on the spinning rod, for instance, produced spine growth on different dendritic branches than running backward on the rod, suggesting that learning specific tasks causes specific structural changes in the brain.
“Now we know that when we learn something new, a neuron will grow new connections on a specific branch,” says Dr. Gan. “Imagine a tree that grows leaves (spines) on one branch but not another branch. When we learn something new, it’s like we’re sprouting leaves on a specific branch.”
Finally, the scientists showed that brain cells in the motor cortex that activate when mice learn a task reactivate during slow-wave deep sleep. Disrupting this process, they found, prevents dendritic spine growth. Their findings offer an important insight into the functional role of neuronal replay—the process by which the sleeping brain rehearses tasks learned during the day—observed in the motor cortex.
“Our data suggest that neuronal reactivation during sleep is quite important for growing specific connections within the motor cortex,” Dr. Gan adds.