Posts tagged emotion
Finnish and Danish researchers have developed a new method that performs decoding, or brain-reading, during continuous listening to real music. Based on recorded brain responses, the method predicts how certain features related to tone color and rhythm of the music change over time, and recognizes which piece of music is being listened to. The method also allows pinpointing the areas in the brain that are most crucial for the processing of music. The study was published in the journal NeuroImage.
Using functional magnetic resonance imaging (fMRI), the research team at the Finnish Centre of Excellence in Interdisciplinary Music Research at the Universities of Jyväskylä and Helsinki, and the Center for Functionally Integrative Neuroscience at Aarhus University, Denmark, recorded the brain responses of participants while they listened to a 16-minute excerpt of the Beatles’ album Abbey Road. They then used computational algorithms to extract a collection of musical features from the recording. Next, they employed machine-learning methods to train a computer model that predicts how the features of the music change over time. Finally, they developed a classifier that predicts which part of the music the participant was listening to at each moment.
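A decoding pipeline of this general shape can be sketched in a few lines. This is a toy illustration with synthetic data, not the study's actual method; the dimensions, the ridge-regression step and the correlation-based segment classifier are all assumptions.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Hypothetical dimensions: 400 fMRI time points, 50 voxels, 3 musical
# features (e.g. brightness, pulse clarity, RMS energy).
n_time, n_voxels, n_feats = 400, 50, 3

# Synthetic stand-in data: the features drive voxel responses plus noise.
features = rng.standard_normal((n_time, n_feats))
weights = rng.standard_normal((n_feats, n_voxels))
bold = features @ weights + 0.5 * rng.standard_normal((n_time, n_voxels))

# Train on the first half of the recording, decode the second half.
split = n_time // 2
model = Ridge(alpha=1.0).fit(bold[:split], features[:split])
predicted = model.predict(bold[split:])

# Identify which segment was heard: correlate each predicted feature
# time course against every candidate segment's true time course.
seg_len = 40
true_segs = features[split:].reshape(-1, seg_len * n_feats)
pred_segs = predicted.reshape(-1, seg_len * n_feats)
n_segs = len(pred_segs)
sim = np.corrcoef(pred_segs, true_segs)[:n_segs, n_segs:]
identified = sim.argmax(axis=1)
accuracy = float((identified == np.arange(n_segs)).mean())
```

With clean synthetic data the classifier identifies segments far above the chance level of 1/5; real brain data is, of course, much noisier, which is where the between-participant differences reported below come in.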
The researchers found that most of the musical features included in the study could be reliably predicted from the brain data. They also found that the piece being listened to could be identified significantly better than chance, although fairly large differences in prediction accuracy were found between participants. An interesting finding was that areas outside the auditory cortex, including motor, limbic, and frontal areas, had to be included in the models to obtain reliable predictions, thus providing evidence for the important role of these areas in the processing of musical features.
"We believe that decoding provides a method that complements other existing methods to obtain more reliable information about the complex processing of music in the brain", says Professor Petri Toiviainen from the University of Jyväskylä. "Our results provide additional evidence for the important involvement of emotional and motor areas in music processing."
TAU researchers find unresponsive patients’ brains may recognize photographs of their family and friends
Patients in a vegetative state are awake, breathe on their own, and seem to go in and out of sleep. But they do not respond to what is happening around them and exhibit no signs of conscious awareness. With communication impossible, friends and family are left wondering if the patients even know they are there.
Now, using functional magnetic resonance imaging (fMRI), Dr. Haggai Sharon and Dr. Yotam Pasternak of Tel Aviv University’s Functional Brain Center and Sackler Faculty of Medicine and the Tel Aviv Sourasky Medical Center have shown that the brains of patients in a vegetative state emotionally react to photographs of people they know personally as though they recognize them.
"We showed that patients in a vegetative state can react differently to different stimuli in the environment depending on their emotional value," said Dr. Sharon. "It’s not a generic thing; it’s personal and autobiographical. We engaged the person, the individual, inside the patient."
The findings, published in PLOS ONE, deepen our understanding of the vegetative state and may offer hope for better care and the development of novel treatments. Researchers from TAU’s School of Psychological Sciences, Department of Neurology, and Sagol School of Neuroscience and the Loewenstein Hospital in Ra’anana contributed to the research.
Talking to the brain
For many years, patients in a vegetative state were believed to have no awareness of self or environment. But in recent years, doctors have made use of fMRI to examine brain activity in such patients. They have found that some patients in a vegetative state can perform complex cognitive tasks on command, like imagining a physical activity such as playing tennis, or, in one case, even answering yes-or-no questions. But these cases are rare and don’t provide any indication as to whether patients are having personal emotional experiences in such a state.
To gain insight into “what it feels like to be in a vegetative state,” the researchers worked with four patients in a persistent (defined as “month-long”) or permanent (persisting for more than three months) vegetative state. They showed them photographs of people they did and did not personally know, then gauged the patients’ reactions using fMRI, which measures blood flow in the brain to detect areas of neurological activity in real time. In response to all the photographs, a region specific to facial recognition was activated in the patients’ brains, indicating that their brains had correctly identified that they were looking at faces.
But in response to the photographs of close family members and friends, brain regions involved in emotional significance and autobiographical information were also activated in the patients’ brains. In other words, the patients reacted with activations of brain centers involved in processing emotion, as though they knew the people in the photographs. The results suggest patients in a vegetative state can register and categorize complex visual information and connect it to memories – a groundbreaking finding.
The ghost in the machine
However, the researchers could not be sure if the patients were conscious of their emotions or just reacting spontaneously. So they then verbally asked the patients to imagine their parents’ faces. Surprisingly, one patient, a 60-year-old kindergarten teacher who was hit by a car while crossing the street, exhibited complex brain activity in the face- and emotion-specific brain regions, identical to brain activity seen in healthy people. The researchers say her response is the strongest evidence yet that vegetative-state patients can be “emotionally aware.” A second patient, a 23-year-old woman, exhibited activity just in the emotion-specific brain regions. (Significantly, both patients woke up within two months of the tests. They did not remember being in a vegetative state.)
"This experiment, a first of its kind, demonstrates that some vegetative patients may not only possess emotional awareness of the environment but also experience emotional awareness driven by internal processes, such as images," said Dr. Sharon.
Research focused on the “emotional awareness” of patients in a vegetative state is only a few years old. The researchers hope their work will eventually contribute to improved care and treatment. They have also begun working with patients in a minimally conscious state to better understand how regions of the brain interact in response to familiar cues. Emotions, they say, could help unlock the secrets of consciousness.
Scientists have discovered that playing computer games can bring players’ emotional responses and brain activity into unison.
By measuring the activity of facial muscles and imaging the brain while gaming, the group found out that people go through similar emotions and display matching brainwaves. The study of Helsinki Institute for Information Technology HIIT researchers is now published in PLOS ONE.
– It’s well known that people who communicate face-to-face will start to imitate each other. People adopt each other’s poses and gestures, much like infectious yawning. What is less known is that the very physiology of interacting people shows a type of mimicry – which we call synchrony or linkage, explains Michiel Sovijärvi-Spapé.
In the study, test participants played a computer game called Hedgewars, in which they manage their own team of animated hedgehogs and take turns shooting the opposing team with ballistic artillery. The goal is to destroy the opposing team’s hedgehogs. The research team varied the amount of competitiveness in the gaming situation: players teamed up against the computer and were also pitted directly against each other.
The players were measured for facial muscle reactions with facial electromyography, or fEMG, and their brainwaves were measured with electroencephalography, EEG.
– Replicating previous studies, we found linkage in the fEMG: the two players displayed similar emotional expressions at similar times. We further observed linkage in the brainwaves measured with EEG, says Sovijärvi-Spapé.
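The linkage measure itself can be operationalised very simply. The sketch below uses synthetic signals; the sampling rate, window length and the use of windowed Pearson correlation are assumptions for illustration, not the study's actual analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 60 s of fEMG from two players sampled at 100 Hz,
# sharing a common emotional component plus individual noise.
fs, seconds = 100, 60
shared = rng.standard_normal(fs * seconds)
player_a = shared + 0.8 * rng.standard_normal(fs * seconds)
player_b = shared + 0.8 * rng.standard_normal(fs * seconds)

def linkage(x, y, fs, win_s=5):
    """Mean windowed Pearson correlation between two signals, a simple
    stand-in for physiological synchrony (linkage)."""
    win = fs * win_s
    rs = []
    for start in range(0, len(x) - win + 1, win):
        xs, ys = x[start:start + win], y[start:start + win]
        rs.append(np.corrcoef(xs, ys)[0, 1])
    return float(np.mean(rs))

sync = linkage(player_a, player_b, fs)
# Surrogate baseline: shuffling one player's samples breaks the temporal
# alignment, so any remaining correlation is what chance alone produces.
baseline = linkage(player_a, rng.permutation(player_b), fs)
```

Comparing the observed linkage against a shuffled surrogate is a standard way to check that the synchrony is not an artefact of the signals' overall statistics.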
Strikingly, the researchers further found that the more competitive the gaming became, the more in sync the players’ emotional responses were. The test subjects also reported their emotions themselves, and negative emotions were associated with the linkage effect.
– Although counterintuitive, the discovered effect increases as a game becomes more competitive. And the more competitive it gets, the more the players’ positive emotions begin to reflect each other. All the while their experiences of negative emotions increase.
The results present promising upshots for further study.
– Feeling others’ emotions could be particularly beneficial in competitive settings: the linkage may enable one to better anticipate the actions of opponents.
Another interpretation suggested by the group is that the physiological linkage of emotion may work to compensate for a possibly faltering social bond while competing in a gaming setting.
– Since our participants were all friends before the game, we can speculate that the linkage is most prominent when a friendship is ‘threatened’ while competing against each other, ponders Sovijärvi-Spapé.
Was the evolution of high-quality vision in our ancestors driven by the threat of snakes? Work by neuroscientists in Japan and Brazil is supporting the theory originally put forward by Lynne Isbell, professor of anthropology at the University of California, Davis.
In a paper published Oct. 28 in the journal Proceedings of the National Academy of Sciences, Isbell; Hisao Nishijo and Quan Van Le of Toyama University, Japan; Rafael Maior and Carlos Tomaz of the University of Brasilia, Brazil; and colleagues show that there are specific nerve cells in the brains of rhesus macaque monkeys that respond to images of snakes.
The snake-sensitive neurons were more numerous, and responded more strongly and rapidly, than other nerve cells that fired in response to images of macaque faces or hands, or to geometric shapes. Isbell said she was surprised that more neurons responded to snakes than to faces, given that primates are highly social animals.
"We’re finding results consistent with the idea that snakes have exerted strong selective pressure on primates," Isbell said.
Isbell originally published her hypothesis in 2006, following up with a book, “The Fruit, the Tree and the Serpent” (Harvard University Press, 2009) in which she argued that our primate ancestors evolved good, close-range vision primarily to spot and avoid dangerous snakes.
Modern mammals and snakes big enough to eat them evolved at about the same time, 100 million years ago. Venomous snakes are thought to have appeared about 60 million years ago — “ambush predators” that have shared the trees and grasslands with primates.
Nishijo’s laboratory studies the neural mechanisms responsible for emotion and fear in rhesus macaque monkeys, especially instinctive responses that occur without learning or memory. Previous researchers have used snakes to provoke fear in monkeys, he noted. When Nishijo heard of Isbell’s theory, he thought it might explain why monkeys are so afraid of snakes.
"The results show that the brain has special neural circuits to detect snakes, and this suggests that the neural circuits to detect snakes have been genetically encoded," Nishijo said.
The monkeys tested in the experiment were reared in a walled colony and neither had previously encountered a real snake.
"I don’t see another way to explain the sensitivity of these neurons to snakes except through an evolutionary path," Isbell said.
Isbell said she’s pleased to be able to collaborate with neuroscientists.
"I don’t do neuroscience and they don’t do evolution, but we can put our brains together and I think it brings a wider perspective to neuroscience and new insights for evolution," she said.
Counterintuitive findings from a new USC study show that the part of the brain that is associated with empathizing with the pain of others is activated more strongly by watching the suffering of hateful people as opposed to likable people.
While one might assume that we would empathize more with people we like, the study may indicate that the human brain focuses more greatly on the need to monitor enemies closely, especially when they are suffering.
“When you watch an action movie and the bad guy appears to be defeated, the moment of his demise draws our focus intensely,” said Lisa Aziz-Zadeh of the Brain and Creativity Institute of the USC Dornsife College of Letters, Arts and Sciences. “We watch him closely to see whether he’s really down for the count because it’s critical for predicting his potential for retribution in the future.”
Aziz-Zadeh, who has a joint appointment with the USC Division of Occupational Science and Occupational Therapy, collaborated with lead author Glenn Fox, a PhD candidate at USC, and Mona Sobhani, formerly a graduate student at USC and who is now a postdoctoral researcher at Vanderbilt University, on a study that appears this month in Frontiers in Psychology.
The study examined activity in the so-called “pain matrix” of the brain, a network that includes the insula cortex, the anterior cingulate and the somatosensory cortices — regions known to activate when an individual watches another person suffer.
The pain matrix is thought to be related to empathy — allowing us to understand another’s pain. However, this study indicates that the pain matrix may be more involved in processing pain in general and not necessarily tied to empathic processing.
Participants — all of them white, male and Jewish — first watched videos of hateful, anti-Semitic individuals in pain and then other videos of tolerant, nonhateful individuals in pain. Their brains were scanned with functional magnetic resonance imaging (fMRI) to show activity levels in the pain matrix.
Surprisingly, the participants’ pain matrices were more activated by watching the anti-Semites suffer compared to the tolerant individuals.
“The results further revealed the brain’s flexibility in processing complex social situations,” Fox said. “The brain uses the complete context of the situation to mount an appropriate response. In this case, the brain’s response is likely tied to the relative increase in the need to attend to and understand the pain of the hateful person.”
A possible next step for the researchers will be to try to understand how regulating one’s emotional reaction to stimuli such as these alters the resulting patterns of brain activity.
Egoism and narcissism appear to be on the rise in our society, while empathy is on the decline. And yet, the ability to put ourselves in other people’s shoes is extremely important for our coexistence. A research team headed by Tania Singer from the Max Planck Institute for Human Cognitive and Brain Sciences has discovered that our own feelings can distort our capacity for empathy. This emotionally driven egocentricity is recognised and corrected by the brain. When, however, the right supramarginal gyrus doesn’t function properly or when we have to make particularly quick decisions, our empathy is severely limited.
When assessing the world around us and our fellow humans, we use ourselves as a yardstick and tend to project our own emotional state onto others. While cognition research has already studied this phenomenon in detail, nothing is known about how it works on an emotional level. It was assumed that our own emotional state can distort our understanding of other people’s emotions, in particular if these are completely different to our own. But this emotional egocentricity had not been measured before now.
This is precisely what the Max Planck researchers have accomplished in a complex marathon of experiments and tests. They also discovered the area of the brain responsible for this function, which helps us to distinguish our own emotional state from that of other people. The area in question is the supramarginal gyrus, a convolution of the cerebral cortex located approximately at the junction of the parietal, temporal and frontal lobes. “This was unexpected, as we had the temporo-parietal junction in our sights. This is located more towards the front of the brain,” explains Claus Lamm, one of the publication’s authors.
On the empathy trail with toy slime and synthetic fur
Using a perception experiment, the researchers began by showing that our own feelings actually do influence our capacity for empathy, and that this egocentricity can also be measured. The participants, who worked in teams of two, were exposed to either pleasant or unpleasant simultaneous visual and tactile stimuli.
While participant 1, for example, could see a picture of maggots and feel slime with her hand, participant 2 saw a picture of a puppy and could feel soft, fleecy fur on her skin. “It was important to combine the two stimuli. Without the tactile stimulus, the participants would only have evaluated the situation ‘with their heads’ and their feelings would have been excluded,” explains Claus Lamm. The participants could also see the stimulus to which their team partners were exposed at the same time.
The two participants were then asked to evaluate either their own emotions or those of their partners. As long as both participants were exposed to the same type of positive or negative stimuli, they found it easy to assess their partner’s emotions. The participant who was confronted with a stinkbug could easily imagine how unpleasant the sight and feeling of a spider must be for her partner.
Differences only arose during the test runs in which one partner was confronted with pleasant stimuli and the other with unpleasant ones. Their capacity for empathy suddenly plummeted. The participants’ own emotions distorted their assessment of the other person’s feelings. The participants who were feeling good themselves assessed their partners’ negative experiences as less severe than they actually were. In contrast, those who had just had an unpleasant experience assessed their partners’ good experiences less positively.
Particularly quick decisions cause a decline in empathy
The researchers pinpointed the area of the brain responsible for this phenomenon with the help of functional magnetic resonance imaging, generally referred to as brain scanning. The right supramarginal gyrus ensures that we can decouple our perception of ourselves from that of others. When the neurons in this part of the brain were disrupted in the course of this task, the participants found it difficult not to project their own feelings onto others. The participants’ assessments were also less accurate when they were forced to make particularly quick decisions.
Up to now, the social neuroscience models have assumed that we mainly draw on our own emotions as a reference for empathy. This only works, however, if we are in a neutral state or the same state as our counterpart – otherwise, the brain must counteract and correct.
Since Leonardo da Vinci painted the Mona Lisa, much has been said about what lies behind her smile. Now, Spanish researchers have discovered how far this attention-grabbing expression confuses our emotion recognition, making us perceive a face as happy even when it is not.
Human beings deduce others’ state of mind from their facial expressions. “Fear, anger, sadness, displeasure and surprise are quickly inferred in this way,” David Beltrán Guerrero, researcher at the University of La Laguna, explains to SINC. But some emotions are more difficult to perceive.
“There is a wide range of more ambiguous expressions, from which it is difficult to deduce the underlying emotional state. A typical example is the expression of happiness,” says Beltrán, who is part of a group of experts at the Canarian institution who have analyzed, in three scientific articles, the smile’s capacity to distort people’s innate deductive ability.
“The smile plays a key role in recognizing others’ happiness. But, as we know, we are not really happy every time we smile,” he adds. In some cases, a smile merely expresses politeness or affiliation. In others, it may even be a way of hiding negative feelings and intentions, such as dominance, sarcasm, nervousness or embarrassment.
To develop this line of research, the authors created faces comprising smiling mouths and eyes expressing non-happy emotions, and compared them with faces in which both mouths and eyes expressed the same type of emotional state.
The main objective was to discover how far the smile skews the recognition of ambiguous expressions, making us identify them with happiness even though they are accompanied by eyes which clearly express a different feeling.
The power of a smile
“The influence of the smile is highly dependent on the type of task given to participants and, therefore, on the type of activity we are involved in when we come across this type of expression,” Beltrán notes.
Thus, when the task is purely perceptual – like the detection of facial features – the smile has a very strong influence, to the extent that ambiguous expressions (happy mouth and non-happy eyes) and genuinely happy expressions (happy mouth and eyes) are not distinguished.
On the other hand, when the task involves categorizing expressions – that is, recognizing whether they are happy, sad or express any other emotion – the influence of the smile weakens, although it remains important: 40% of the time, participants identify ambiguous expressions as genuinely happy.
However, the influence of the smile disappears in emotional assessment, that is, when someone is asked to judge whether a facial expression is positive or negative: “A smile can cause us to interpret a non-happy expression as happy, except when we are involved in the emotional assessment of said expression,” he highlights.
A stimulus which is difficult to assess
According to the authors, the reason why a smile sometimes leads to the incorrect categorization of an expression is related to its high visual “salience”– its attention-grabbing capacity – and its almost exclusive association with the emotional state of happiness.
In a recent study, it was found that the smile dominates many of the initial stages of the brain processing of faces, to the extent that it prompts similar electrical activity in the brain for genuinely happy expressions and ambiguous expressions with smiles and non-happy eyes.
By measuring eye movements, it was observed that an ambiguous expression is confused and categorized as happy if the first gaze falls on the area of the smiling mouth, rather than the area of the eyes.
However, curiously the influence of the smile in these assessments is not the same for everyone. “Another study showed that people with social anxiety tend to confuse ambiguous expressions with genuinely happy expressions less frequently,” Beltrán concludes.
Manuel G. Calvo, Hipólito Marrero and David Beltrán. “When does the brain distinguish between genuine and ambiguous smiles? An ERP study”. Brain and Cognition, 81 (2013), 237–246.
Manuel G. Calvo, Andrés Fernández-Martín and Lauri Nummenmaa. “Perceptual, categorical, and affective processing of ambiguous smiling facial expressions”. Cognition, 125 (2012), 373–393.
Manuel G. Calvo, Aida Gutiérrez-García, Pedro Avero and Daniel Lundqvist. “Attentional Mechanisms in Judging Genuine and Fake Smiles: Eye-Movement Patterns”. Emotion, 13 (2013), No. 4, 792–802.
A RIKEN research team has discovered an enzyme called Rines that regulates MAO-A, a major brain protein controlling emotion and mood. The enzyme is a potentially promising drug target for treating diseases associated with emotions such as depression.
Monoamine oxidase A (MAO-A) is an enzyme that breaks down serotonin, norepinephrine and dopamine, neurotransmitters well-known for their influence on emotion and mood. Nicknamed the “warrior gene”, a variant of the MAOA gene has been associated with increased risk of violent and anti-social behavior.
While evidence points to a link between MAO-A levels and various emotional patterns, the mechanism controlling MAO-A levels in the brain has remained unknown.
Now, a research team headed by Jun Aruga at the RIKEN Brain Science Institute has shown for the first time that a ligase named Rines (RING finger-type E3 ubiquitin ligase) regulates these levels. Their research shows that mice without the Rines gene exhibit impaired stress responses and enhanced anxiety, controlled in part through the regulation of MAO-A levels. The study is published today in the Journal of Neuroscience.
As the first study to demonstrate regulation of MAO-A protein via the ubiquitin proteasomal system, this research presents a promising new avenue for analyzing the role of MAO-A in brain function. Further research promises insights into the treatment of anxiety, stress-related disorders and impaired social functions.
Miyuki Kabayama, Kazuto Sakoori, Kazuyuki Yamada, Veravej G. Ornthanalai, Maya Ota, Naoko Morimura, Kei-ichi Katayama, Niall P. Murphy, and Jun Aruga. “Rines E3 Ubiquitin Ligase Regulates MAO-A Levels and Emotional Responses.” The Journal of Neuroscience, 2013.
New published research from psychologists at the universities of Kent and Witten/Herdecke has shown that mindfulness meditation has the ability to temporarily alter practitioners’ perceptions of time – a finding that has wider implications for the use of mindfulness both as an everyday practice, and in clinical treatments and interventions.
Led by Dr Robin Kramer from Kent’s School of Psychology, the research team hypothesised that, given mindfulness’ emphasis on moment-to-moment awareness, mindfulness meditation would slow down time and produce the feeling that short periods of time lasted longer.
To test this hypothesis, they used a temporal bisection task, which allows researchers to measure where each individual subjectively splits a period of time in half. Participants’ responses to this task were collected twice, once before and once after a listening task. Participants were split into two groups: for ten minutes, one group listened to an audiobook while the other listened to a meditation exercise designed to focus their attention on the movement of breath in the body. The results showed that the control (audiobook) group’s responses did not change after the listening task compared with before. Meditation, however, led to a relative overestimation of durations, i.e. periods of time felt longer than they had before.
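The bisection-point calculation behind such a task can be illustrated with a toy example. All numbers below are hypothetical, chosen only to show how a relative overestimation of durations appears as a leftward shift of the bisection point.

```python
import numpy as np

# Hypothetical temporal bisection data: probe durations (ms) between the
# anchors 400 ms ("short") and 1600 ms ("long"), and the proportion of
# "long" responses at each probe before and after the listening task.
durations = np.array([400, 600, 800, 1000, 1200, 1400, 1600])
p_long_before = np.array([0.02, 0.10, 0.30, 0.50, 0.72, 0.90, 0.98])
p_long_after = np.array([0.05, 0.20, 0.45, 0.65, 0.85, 0.95, 0.99])

def bisection_point(durs, p_long):
    """Duration judged 'long' half the time, by linear interpolation.
    A lower value means durations feel longer (relative overestimation)."""
    return float(np.interp(0.5, p_long, durs))

bp_before = bisection_point(durations, p_long_before)
bp_after = bisection_point(durations, p_long_after)
shift = bp_before - bp_after  # positive shift: time felt longer after
```

In this made-up data the bisection point moves from 1000 ms down to 850 ms after the task, i.e. shorter physical durations are already judged "long", which is the signature of subjective time slowing down.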
Dr Kramer and his team interpret this as the result of attentional changes: either improved attentional resources that allow increased attention to the processing of time, or a shift to internally oriented attention that would have the same effect.
Dr Kramer said: ‘Our findings represent some of the first to demonstrate how mindfulness meditation can alter the perception of time. Given the increasing popularity of mindfulness in everyday practice, its relationship with time perception may provide an important step in our understanding of this pervasive, ancient practice in our modern world.’
Dr Kramer also explained that the benefits of mindfulness and mindfulness-based therapies in a variety of domains are now being identified. These include decreases in rumination, improvements in cognitive flexibility, working memory capacity and sustained attention, and reductions in reactivity, anxiety and depressive symptoms. Mindfulness-based treatments also appear to provide broad antidepressant and antianxiety effects, as well as decreases in general psychological distress. As such, these interventions have been applied with a variety of patients, including those suffering from fibromyalgia, psoriasis, cancer, binge eating and chronic pain.
Dr Dinkar Sharma, Senior Lecturer in Psychology at Kent, commented: ‘Demonstrating that mindfulness has an effect on time perception is important because it opens up the opportunity that mindfulness could be used to alter psychological disorders that are associated with a range of distortions in the perception of time - such as disorders of memory, emotion and addiction.’
Dr Ulrich Weger, of Witten/Herdecke’s Department of Psychology and Psychotherapy, concluded by stating that ‘the impact of a brief mindfulness exercise on elementary processes such as time perception is remarkable’.
MUSIC is not tangible. You can’t eat it, drink it or mate with it. It doesn’t protect against the rain, wind or cold. It doesn’t vanquish predators or mend broken bones. And yet humans have always prized music — more than prized it, loved it.
In the modern age we spend great sums of money to attend concerts, download music files, play instruments and listen to our favorite artists whether we’re in a subway or salon. But even in Paleolithic times, people invested significant time and effort to create music, as the discovery of flutes carved from animal bones would suggest.
So why does this thingless “thing” — at its core, a mere sequence of sounds — hold such potentially enormous intrinsic value?
The quick and easy explanation is that music brings a unique pleasure to humans. Of course, that still leaves the question of why. But for that, neuroscience is starting to provide some answers.
More than a decade ago, our research team used brain imaging to show that music that people described as highly emotional engaged the reward system deep in their brains — activating subcortical nuclei known to be important in reward, motivation and emotion. Subsequently we found that listening to what might be called “peak emotional moments” in music — that moment when you feel a “chill” of pleasure to a musical passage — causes the release of the neurotransmitter dopamine, an essential signaling molecule in the brain.
When pleasurable music is heard, dopamine is released in the striatum — an ancient part of the brain found in other vertebrates as well — which is known to respond to naturally rewarding stimuli like food and sex and which is artificially targeted by drugs like cocaine and amphetamine.
But what may be most interesting here is when this neurotransmitter is released: not only when the music rises to a peak emotional moment, but also several seconds before, during what we might call the anticipation phase.
The idea that reward is partly related to anticipation (or the prediction of a desired outcome) has a long history in neuroscience. Making good predictions about the outcome of one’s actions would seem to be essential in the context of survival, after all. And dopamine neurons, both in humans and other animals, play a role in recording which of our predictions turn out to be correct.
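The logic of prediction-error learning can be captured with a textbook Rescorla-Wagner-style update rule. This is a generic sketch, not the authors' model; the learning rate and reward values are arbitrary.

```python
def learn(rewards, alpha=0.2):
    """Track an expected reward v; the dopamine-like signal is the
    prediction error delta, which shrinks as predictions improve."""
    v, errors = 0.0, []
    for r in rewards:
        delta = r - v           # prediction error: received minus expected
        errors.append(delta)
        v += alpha * delta      # nudge the expectation toward the outcome
    return v, errors

# A reward of 1.0 delivered 20 times: early deliveries are surprising,
# later ones are well predicted, so the error signal fades.
value, errors = learn([1.0] * 20)
```

The same machinery explains the anticipatory release described above: once the expectation is learned, the signal moves to the cue that predicts the reward rather than the reward itself.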
To dig deeper into how music engages the brain’s reward system, we designed a study to mimic online music purchasing. Our goal was to determine what goes on in the brain when someone hears a new piece of music and decides he likes it enough to buy it.
We used music-recommendation programs to customize the selections to our listeners’ preferences, which turned out to be indie and electronic music, matching Montreal’s hip music scene. And we found that neural activity within the striatum — the reward-related structure — was directly proportional to the amount of money people were willing to spend.
But more interesting still was the cross talk between this structure and the auditory cortex, which also increased for songs that were ultimately purchased compared with those that were not.
Why the auditory cortex? Some 50 years ago, Wilder Penfield, the famed neurosurgeon and the founder of the Montreal Neurological Institute, reported that when neurosurgical patients received electrical stimulation to the auditory cortex while they were awake, they would sometimes report hearing music. Dr. Penfield’s observations, along with those of many others, suggest that musical information is likely to be represented in these brain regions.
The auditory cortex is also active when we imagine a tune: think of the first four notes of Beethoven’s Fifth Symphony — your cortex is abuzz! This ability allows us not only to experience music even when it’s physically absent, but also to invent new compositions and to reimagine how a piece might sound with a different tempo or instrumentation.
We also know that these areas of the brain encode the abstract relationships between sounds — for instance, the particular sound pattern that makes a major chord major, regardless of the key or instrument. Other studies show distinctive neural responses from similar regions when there is an unexpected break in a repetitive pattern of sounds, or in a chord progression. This is akin to what happens if you hear someone play a wrong note — easily noticeable even in an unfamiliar piece of music.
These cortical circuits allow us to make predictions about coming events on the basis of past events. They are thought to accumulate musical information over our lifetime, creating templates of the statistical regularities that are present in the music of our culture and enabling us to understand the music we hear in relation to our stored mental representations of the music we’ve heard.
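A minimal version of such a "template of statistical regularities" is a transition-probability model. The sketch below uses an entirely hypothetical note sequence and shows how a note that violates learned regularities carries high surprisal, the "wrong note" effect.

```python
import math
from collections import Counter, defaultdict

# Hypothetical note sequence (scale degrees) standing in for a lifetime
# of listening; a bigram model is a minimal template of its regularities.
corpus = [0, 2, 4, 5, 4, 2, 0, 2, 4, 5, 7, 5, 4, 2, 0] * 20

transitions = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    transitions[a][b] += 1

def surprisal(prev, nxt):
    """-log2 P(next | prev): high for notes the model does not expect."""
    counts = transitions[prev]
    total = sum(counts.values())
    p = counts[nxt] / total if total else 0.0
    return math.inf if p == 0 else -math.log2(p)

expected = surprisal(0, 2)     # a transition the model has seen often
unexpected = surprisal(0, 11)  # a note never heard after 0
```

Models of musical expectation are, of course, far richer than a bigram table, but the principle is the same: frequently heard continuations become predictions, and violations of them stand out.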
So each act of listening to music may be thought of as both recapitulating the past and predicting the future. When we listen to music, these brain networks actively create expectations based on our stored knowledge.
Composers and performers intuitively understand this: they manipulate these prediction mechanisms to give us what we want — or to surprise us, perhaps even with something better.
In the cross talk between our cortical systems, which analyze patterns and yield expectations, and our ancient reward and motivational systems may lie the answer to the question: why does a particular piece of music move us?
When that answer is yes, there is little — in those moments of listening, at least — that we value more.