Posts tagged psychology

Bad language could be good for you, a new study shows. For the first time, psychologists have found that swearing may serve an important function in relieving pain.

The study, published in the journal NeuroReport, measured how long college students could keep their hands immersed in cold water. During the chilly exercise, they could repeat an expletive of their choice or chant a neutral word. When swearing, the 67 student volunteers reported less pain and on average endured about 40 seconds longer.
Although cursing is routinely decried in public debate, researchers are now beginning to question the idea that the phenomenon is all bad. “Swearing is such a common response to pain that there has to be an underlying reason why we do it,” says psychologist Richard Stephens of Keele University in England, who led the study. And indeed, the findings point to one possible benefit: “I would advise people, if they hurt themselves, to swear,” he adds.
How swearing achieves its physical effects is unclear, but the researchers speculate that brain circuitry linked to emotion is involved. Earlier studies have shown that unlike normal language, which relies on the outer few millimeters in the left hemisphere of the brain, expletives hinge on evolutionarily ancient structures buried deep inside the right half.
One such structure is the amygdala, an almond-shaped group of neurons that can trigger a fight-or-flight response in which our heart rate climbs and we become less sensitive to pain. Indeed, the students’ heart rates rose when they swore, a fact the researchers say suggests that the amygdala was activated.
That explanation is backed by other experts in the field. Psychologist Steven Pinker of Harvard University, whose book The Stuff of Thought (Viking Adult, 2007) includes a detailed analysis of swearing, compared the situation with what happens in the brain of a cat that somebody accidentally sits on. “I suspect that swearing taps into a defensive reflex in which an animal that is suddenly injured or confined erupts in a furious struggle, accompanied by an angry vocalization, to startle and intimidate an attacker,” he says.
But cursing is more than just aggression, explains Timothy Jay, a psychologist at the Massachusetts College of Liberal Arts who has studied our use of profanities for the past 35 years. “It allows us to vent or express anger, joy, surprise, happiness,” he remarks. “It’s like the horn on your car, you can do a lot of things with that, it’s built into you.”
In extreme cases, the hotline to the brain’s emotional system can make swearing harmful, as when road rage escalates into physical violence. But when the hammer slips, some well-chosen swearwords might help dull the pain.
There is a catch, though: The more we swear, the less emotionally potent the words become, Stephens cautions. And without emotion, all that is left of a swearword is the word itself, unlikely to soothe anyone’s pain.
(Source: scientificamerican.com)
This Is How Your Brain Becomes Addicted to Caffeine
Within 24 hours of quitting the drug, your withdrawal symptoms begin. Initially, they’re subtle: The first thing you notice is that you feel mentally foggy, and lack alertness. Your muscles are fatigued, even when you haven’t done anything strenuous, and you suspect that you’re more irritable than usual.
Over time, an unmistakable throbbing headache sets in, making it difficult to concentrate on anything. Eventually, as your body protests having the drug taken away, you might even feel dull muscle pains, nausea and other flu-like symptoms.
This isn’t heroin, tobacco or even alcohol withdrawal. We’re talking about quitting caffeine, a substance consumed so widely (the FDA reports that more than 80 percent of American adults drink it daily) and in such mundane settings (say, at an office meeting or in your car) that we often forget it’s a drug—and by far the world’s most popular psychoactive one.
Like many drugs, caffeine is chemically addictive, a fact that scientists established back in 1994. This past May, with the publication of the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), caffeine withdrawal was included as a mental disorder for the first time, though the symptoms that earned it a place are ones that regular coffee drinkers have long known well from the times they’ve gone off the drug for a day or more.
Why, exactly, is caffeine addictive? The reason stems from the way the drug affects the human brain, producing the alert feeling that caffeine drinkers crave.
Soon after you drink (or eat) something containing caffeine, it’s absorbed through the small intestine and dissolved into the bloodstream. Because the chemical is both water- and fat-soluble (meaning that it can dissolve in water-based solutions—think blood—as well as fat-based substances, such as our cell membranes), it’s able to penetrate the blood-brain barrier and enter the brain.
Structurally, caffeine closely resembles a molecule that’s naturally present in our brain, called adenosine (which is a byproduct of many cellular processes, including cellular respiration)—so much so, in fact, that caffeine can fit neatly into our brain cells’ receptors for adenosine, effectively blocking them off. Normally, the adenosine produced over time locks into these receptors and produces a feeling of tiredness.
When caffeine molecules are blocking those receptors, they prevent this from occurring, thereby generating a sense of alertness and energy for a few hours. Additionally, some of the brain’s own natural stimulants (such as dopamine) work more effectively when the adenosine receptors are blocked, and all the surplus adenosine floating around in the brain cues the adrenal glands to secrete adrenaline, another stimulant.
For this reason, caffeine isn’t technically a stimulant on its own, says Stephen R. Braun, the author of Buzzed: The Science and Lore of Caffeine and Alcohol, but a stimulant enabler: a substance that lets our natural stimulants run wild. Ingesting caffeine, he writes, is akin to “putting a block of wood under one of the brain’s primary brake pedals.” This block stays in place for anywhere from four to six hours, depending on the person’s age, size and other factors, until the caffeine is eventually metabolized by the body.
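The four-to-six-hour window can be pictured as simple first-order elimination. Here is a minimal sketch, assuming an illustrative five-hour half-life; real caffeine half-lives vary widely with age, pregnancy, smoking and medication, and none of these numbers come from the article itself:

```python
def caffeine_remaining(dose_mg: float, hours: float, half_life_h: float = 5.0) -> float:
    """First-order (exponential) elimination: the circulating dose
    halves once every half-life. The 5-hour default is illustrative."""
    return dose_mg * 0.5 ** (hours / half_life_h)

# A 200 mg coffee at 9 a.m.: how much is still circulating at 11 p.m. (14 h later)?
print(round(caffeine_remaining(200, 14), 1))  # -> 28.7 (mg)
```

The point of the sketch is simply that a late-afternoon cup leaves a substantial fraction of the dose still "holding the block of wood under the brake pedal" at bedtime.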
In people who take advantage of this process on a daily basis (i.e., coffee, tea, soda or energy drink addicts), the brain’s chemistry and physical characteristics actually change over time as a result. The most notable change is that brain cells grow more adenosine receptors: the brain’s attempt to maintain equilibrium in the face of a constant onslaught of caffeine that so regularly plugs its receptors. (Studies indicate that the brain also responds by decreasing the number of receptors for norepinephrine, a stimulant.) This explains why regular coffee drinkers build up a tolerance over time: with more adenosine receptors, it takes more caffeine to block a significant proportion of them and achieve the desired effect.
This also explains why suddenly giving up caffeine entirely can trigger a range of withdrawal effects. The underlying chemistry is complex and not fully understood, but the principle is that your brain is used to operating in one set of conditions (with an artificially inflated number of adenosine receptors, and a decreased number of norepinephrine receptors) that depend upon regular ingestion of caffeine. Suddenly, without the drug, the altered brain chemistry causes all sorts of problems, including the dreaded caffeine withdrawal headache.
The good news is that, compared to many drug addictions, the effects are relatively short-term. To kick the thing, you only need to get through about 7-12 days of symptoms without drinking any caffeine. During that period, your brain will naturally decrease the number of adenosine receptors on each cell, responding to the sudden lack of caffeine ingestion. If you can make it that long without a cup of joe or a spot of tea, the levels of adenosine receptors in your brain reset to their baseline levels, and your addiction will be broken.
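The 7-to-12-day reset can be caricatured as the surplus receptors decaying exponentially back toward baseline. This is a toy model, not the article's data: the 30 percent upregulation and the three-day time constant are invented for illustration, chosen only so that the surplus is largely gone within the window the article describes.

```python
import math

def receptors_after_quitting(days: float, upregulated: float = 1.3,
                             baseline: float = 1.0, tau_days: float = 3.0) -> float:
    """Toy model: receptor count (relative to baseline 1.0) decays
    exponentially from an upregulated level back to baseline.
    All parameter values are illustrative assumptions."""
    return baseline + (upregulated - baseline) * math.exp(-days / tau_days)

for day in (0, 3, 7, 12):
    print(day, round(receptors_after_quitting(day), 3))
```

Under these assumptions the surplus has shrunk to about a tenth of its starting size by day 7 and is essentially gone by day 12, consistent with the rough timetable in the text.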
Autism affects different parts of the brain in women and men
Autism affects different parts of the brain in females than in males, a new study reveals. The research is published today in the journal Brain as an open-access article.
Scientists at the Autism Research Centre at the University of Cambridge used magnetic resonance imaging to examine whether autism affects the brain of males and females in a similar or different way. They found that the anatomy of the brain of someone with autism substantially depends on whether an individual is male or female, with brain areas that were atypical in adult females with autism being similar to areas that differ between typically developing males and females. This was not seen in men with autism.
“One of our new findings is that females with autism show neuroanatomical ‘masculinization’,” said Professor Simon Baron-Cohen, senior author of the paper. “This may implicate physiological mechanisms that drive sexual dimorphism, such as prenatal sex hormones and sex-linked genetic mechanisms.”
Autism affects 1% of the general population and is more prevalent in males. Most studies have therefore focused on male-dominant samples. As a result, our understanding of the neurobiology of autism is male-biased.
“This is one of the largest brain imaging studies of sex/gender differences yet conducted in autism. Females with autism have long been under-recognized and probably misunderstood,” said Dr Meng-Chuan Lai, who led the research project. “The findings suggest that we should not blindly assume that everything found in males with autism applies to females. This is an important example of the diversity within the ‘spectrum’.”
Dr Michael Lombardo, who co-led the study, added that although autism manifests itself in many different ways, grouping by gender may help provide a better understanding of this condition.
He said: “Autism as a whole is complex and vastly diverse, or heterogeneous, and this new study indicates that there are ways to subgroup the autism spectrum, such as whether an individual is male or female. Reducing heterogeneity via subgrouping will allow research to make significant progress towards understanding the mechanisms that cause autism.”
Self-perceived social status predicts hippocampal function and stress hormones
A mother’s perceived social status predicts her child’s brain development and stress indicators, finds a study at Boston Children’s Hospital. While previous studies going back to the 1950s have linked objective socioeconomic factors — such as parental income or education — to child health, achievement and brain function, the new study is the first to link brain function to maternal self-perception.
In the study, children whose mothers saw themselves as having a low social status were more likely to have increased cortisol levels, an indicator of stress, and less activation of their hippocampus, a structure in the brain responsible for long-term memory formation (required for learning) and reducing stress responses.
Findings were published online August 6th by the journal Developmental Science, and will be part of a special issue devoted to the effects of socioeconomic status on brain development.
"We know that there are big disparities among people in income and education," says Margaret Sheridan, PhD, of the Labs of Cognitive Neuroscience at Boston Children’s Hospital, the study’s first author. "Our results indicate that a mother’s perception of her social status ‘lives’ biologically in her children."
Sheridan, senior investigator Charles Nelson, PhD, of Boston Children’s Hospital and colleagues studied 38 children aged 8.3 to 11.8 years. The children gave saliva samples to measure levels of cortisol, and 19 also underwent functional MRI of the brain, focusing on the hippocampus.
Mothers, meanwhile, rated their social standing on a 10-rung ladder, comparing themselves with others in the United States.
The findings suggest that while actual socioeconomic status varies, how people perceive and adapt to their situation is an important factor in child development. Some of this may be culturally determined, Sheridan notes. She is currently participating in a much larger international study of childhood poverty, the Young Lives Project, that is looking at objective and subjective measures of social status along with health measures and cognitive function. The study will capture much wider extremes of socioeconomic status than would a U.S.-based study.
What the current study didn’t find was evidence that stress itself alters hippocampal function: no relationship was found between cortisol and hippocampal function, as has been seen in animals, perhaps because of the small number of children who had brain fMRIs. “This needs further exploration,” says Sheridan. “There may be more than one pathway leading to differences in long-term memory, or there may be an effect of stress on the hippocampus that comes out only in adulthood.”
(Source: eurekalert.org)
An innovative series of experiments could help to unlock the mysteries of how the brain makes sense of the hustle and bustle of human activity we see around us every day.

Very little is known about the psychological processes which enable us to pick out a potential mugger from a busy street or to spot an old friend approaching us across a crowded room. Such judgements of social intention, which we make countless times each day, enable us to respond in appropriate ways to the dynamic and complex world around us.
George Mather, Professor of Vision Science at the University of Lincoln, UK, and one of the world’s foremost experts on human visual perception, will lead a new research project investigating the mechanisms behind this crucial ability to perceive and interpret the intentions of other people from the way they move.
Numerous experiments have explored the way we use visual signals to extract meaning from our environment, but most have been based on static images, such as photos of different facial expressions.
Other studies into the perception of moving images have relied on very simple animated scenes, like moving patterns of regularly-spaced lines or random dots, devoid of the richness and nuances of scenes from the ‘real world’.
There remains limited scientific understanding of how the human visual system makes sense of the flurry of movement we see around us in modern societies: for example, whether a person approaching us is sprinting or strolling, whether that means they are angry or calm, and how we should react in response.
Professor Mather aims to bridge this gap in the academic literature through a series of world-first experiments. He has been awarded a grant of £287,000 by the UK’s Economic & Social Research Council (ESRC) for a three-year study. The aim is to shed new light on the process by which the human visual system identifies and decodes ‘dynamic cues of social intention’.
Professor Mather said: “It’s true that actions speak louder than words. Perception of movement is fundamental to many of our everyday social interactions. But simply judging speed is in itself a very complex task. When you see somebody walking across your field of view, how do you know how fast they are going? That information can be very useful because it might tell you something about their intentions but it’s surprisingly difficult to make an accurate judgement. A basic problem is that the further away a moving object is, the slower it moves in the image received by the eye. We don’t really understand at the moment how the human visual system is able to compensate for different viewing conditions.”
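The “basic problem” Professor Mather describes is geometric: for a fixed walking speed, the angular speed of the image on the retina falls off with viewing distance. A minimal sketch of that relationship, using the small-angle approximation and illustrative numbers not taken from the study:

```python
import math

def angular_speed_deg_per_s(speed_m_s: float, distance_m: float) -> float:
    """Angular speed of an object moving perpendicular to the line of sight,
    using the small-angle approximation: omega = v / d (radians per second)."""
    return math.degrees(speed_m_s / distance_m)

# The same 1.5 m/s walker, seen at 5 m versus 50 m:
print(round(angular_speed_deg_per_s(1.5, 5), 1))   # -> 17.2 (deg/s)
print(round(angular_speed_deg_per_s(1.5, 50), 2))  # -> 1.72 (deg/s)
```

A tenfold increase in distance cuts the retinal motion signal tenfold, which is why the visual system must somehow compensate for viewing conditions before it can judge a person's actual speed.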
Motion perception has been a consistent theme of Professor Mather’s research career. In previous studies he has shown that the brain can deduce socially meaningful information from very simple depictions of human movement, such as collections of dots denoting the major joints of the body.
The research in this latest project will answer fundamental questions about how the brain combines ‘low-level’ information about image motion with ‘high level’ knowledge of the social world to make meaningful assessments of the speed and nature of human movements.
(Source: lincoln.ac.uk)
Study Reveals That Overthinking Can Be Detrimental to Human Performance
Trying to explain riding a bike is difficult because it is an implicit memory. The body knows what to do, but thinking about the process can often interfere. So why is it that under certain circumstances paying full attention and trying hard can actually impede performance? A new UC Santa Barbara study, published today in the Journal of Neuroscience, reveals part of the answer.
There are two kinds of memory: implicit, a form of long-term memory not requiring conscious thought and expressed by means other than words; and explicit, another kind of long-term memory formed consciously that can be described in words. Scientists consider these distinct areas of function both behaviorally and in the brain.
Long-term memory is supported by various regions in the prefrontal cortex, the newest part of the brain in terms of evolution and the part of the brain responsible for planning, executive function, and working memory. “A lot of people think the reason we’re human is because we have the most advanced prefrontal cortex,” said the study’s lead author, Taraz Lee, a postdoctoral scholar working in UCSB’s Action Lab.
Two previous brain studies have shown that taxing explicit memory resources improved recognition memory without awareness. The results suggest that implicit perceptual memory can aid performance on recognition tests. So Lee and his colleagues decided to test whether the effects of the attentional control processes associated with explicit memory could directly interfere with implicit memory.
Lee’s study used continuous theta-burst transcranial magnetic stimulation (TMS) to temporarily disrupt the function of two different parts of the prefrontal cortex, the dorsolateral and ventrolateral. The dorsal and ventral regions are close to each other but have slightly different functions. Disrupting function in two distinct areas provided a direct causal test of whether explicit memory processing exerts control over sensory resources (in this case, visual information processing) and in doing so indirectly harms implicit memory processes.
Participants were shown a series of kaleidoscopic images for about a minute, then had a one-minute break before being given memory tests containing two different kaleidoscopic images. They were then asked to distinguish images they had seen previously from the new ones. “After they gave us that answer, we asked whether they remembered a lot of rich details, whether they had a vague impression, or whether they were blindly guessing,” explains Lee. “And the participants only did better when they said they were guessing.”
The results of disrupting the function of the dorsolateral prefrontal cortex shed light on why paying attention can be a distraction and affect performance outcomes. “If we ramped down activity in the dorsolateral prefrontal cortex, people remembered the images better,” said Lee.
When the researchers disrupted the ventral area of the prefrontal cortex, participants’ memory was just slightly worse. “They would shift from saying that they could remember a lot of rich details about the image to being vaguely familiar with the images,” Lee said. “It didn’t actually make them better at the task.”
Lee’s fascination with the effect of attentional processes on memory stems from his extensive sports background. As he pointed out, there are always examples of professional golfers who have the lead on the 18th hole, but when it comes down to one easy shot, they fall apart. “That should be the time when it all comes out the best, but you just can’t think about that sort of thing,” he said. “It just doesn’t help you.”
His continuing studies at UCSB’s Action Lab will focus on dissecting the process of choking under pressure. Lee’s work will use brain scans to examine why people who are highly incentivized to do well often succumb to pressure and how the prefrontal cortex and these attentional processes interfere with performance.
"I think most researchers who look at prefrontal cortex function are trying to figure out what it does to help you and how that explains how the brain works and how we act," said Lee. "I look at it at the opposite. If we can figure out the ways in which activity in this part of the brain hurts you, then this also informs how your brain works and can give us some clues to what’s actually going on."

This is your brain on Vivaldi and Beatles
Listening to music activates large networks in the brain, but different kinds of music are processed differently. A team of researchers from Finland, Denmark and the UK has developed a new method for studying music processing in the brain during a realistic listening situation. Using a combination of brain imaging and computer modeling, they found areas in the auditory, motor, and limbic regions to be activated during free listening to music. They were furthermore able to pinpoint differences in the processing between vocal and instrumental music. The new method helps us to better understand the complex dynamics of brain networks and the processing of lyrics in music. The study was published in the journal NeuroImage.
Using functional magnetic resonance imaging (fMRI), the research team, led by Dr. Vinoo Alluri from the University of Jyväskylä, Finland, recorded the brain responses of individuals while they were listening to music from different genres, including pieces by Antonio Vivaldi, Miles Davis, Booker T. & the M.G.’s, The Shadows, Astor Piazzolla, and The Beatles. Following this, they analyzed the musical content of the pieces using sophisticated computer algorithms to extract musical features related to timbre, rhythm and tonality. Using a novel cross-validation method, they subsequently located activated brain areas that were common across the different musical stimuli.
The study revealed that several areas belonging to the auditory, limbic, and motor regions were activated by all musical pieces. Notably, areas in the medial orbitofrontal region and the anterior cingulate cortex, which are relevant for self-referential appraisal and aesthetic judgments, were found to be activated during listening. A further interesting finding was that vocal and instrumental music were processed differently. In particular, the presence of lyrics was found to shift the processing of musical features toward the right auditory cortex, which suggests a left-hemispheric dominance in the processing of the lyrics. This result is in line with previous research, but has now for the first time been observed during continuous listening to music.
"The new method provides a powerful means to predict brain responses to music, speech, and soundscapes across a variety of contexts", says Dr. Vinoo Alluri.
What Color is Your Night Light? It May Affect Your Mood
Study Finds Red Light Least Harmful, While Blue Light is Worst
When it comes to some of the health hazards of light at night, a new study suggests that the color of the light can make a big difference.
In a study involving hamsters, researchers found that blue light had the worst effects on mood-related measures, followed closely by white light.
But hamsters exposed to red light at night had significantly less evidence of depressive-like symptoms and changes in the brain linked to depression, compared to those that experienced blue or white light.
The only hamsters that fared better than those exposed to red light were those that had total darkness at night.
The findings may have important implications for humans, particularly those whose work on night shifts makes them susceptible to mood disorders, said Randy Nelson, co-author of the study and professor of neuroscience and psychology at The Ohio State University.
“Our findings suggest that if we could use red light when appropriate for night-shift workers, it may not have some of the negative effects on their health that white light does,” Nelson said.
The study appears in the Aug. 7, 2013, issue of The Journal of Neuroscience.
The research examined the role of specialized photosensitive cells in the retina — called ipRGCs — that don’t have a major role in vision, but detect light and send messages to a part of the brain that helps regulate the body’s circadian clock. This is the body’s master clock that helps determine when people feel sleepy and awake.
Other research suggests these light-sensitive cells also send messages to parts of the brain that play a role in mood and emotion.
“Light at night may result in parts of the brain regulating mood receiving signals during times of the day when they shouldn’t,” said co-author Tracy Bedrosian, a former graduate student at Ohio State who is now a postdoctoral researcher at the Salk Institute. “This may be why light at night seems to be linked to depression in some people.”
What people experience as different colors of light are actually lights of different wavelengths. The ipRGCs don’t appear to react to light of different wavelengths in the same way.
“These cells are most sensitive to blue wavelengths and least sensitive to red wavelengths,” Nelson said. “We wanted to see how exposure to these different color wavelengths affected the hamsters.”
In one experiment, the researchers exposed adult female Siberian hamsters to four weeks each of nighttime conditions with no light, dim red light, dim white light (similar to that found in normal light bulbs) or dim blue light.
They then did several tests with the hamsters that are used to check for depressive-like symptoms. For example, if the hamsters drink less-than-normal amounts of sugar water — a treat they normally enjoy — that is seen as evidence of a mood problem.
Results showed that hamsters that were kept in the dark at night drank the most sugar water, followed closely by those exposed to red light. Those that lived with dim white or blue light at night drank significantly less of the sugar water than the others.
After the testing, the researchers then examined the hippocampus regions of the brains of the hamsters.
Hamsters that spent the night in dim blue or white light had a significantly reduced density of dendritic spines compared to those that lived in total darkness or that were exposed to only red light. Dendritic spines are hairlike growths on brain cells that are used to send chemical messages from one cell to another.
A lowered density of these dendritic spines has been linked to depression, Nelson said.
“The behavior tests and changes in brain structure in hamsters both suggest that the color of lights may play a key role in mood,” he said.
“In nearly every measure we had, hamsters exposed to blue light were the worst off, followed by those exposed to white light,” he said. “While total darkness was best, red light was not nearly as bad as the other wavelengths we studied.”
Nelson and Bedrosian said they believe these results may be applicable to humans.
In addition to shift workers, others may benefit from limiting their light at night from computers, televisions and other electronic devices, they said. And, if light is needed, the color may matter.
“If you need a night light in the bathroom or bedroom, it may be better to have one that gives off red light rather than white light,” Bedrosian said.
Not only does practice make perfect, it also makes for more efficient generation of neuronal activity in the primary motor cortex, the area of the brain that plans and executes movement, according to researchers from the University of Pittsburgh School of Medicine. Their findings, published online today in Nature Neuroscience, showed that practice leads to decreased metabolic activity for internally generated movements, but not for visually guided motor tasks, and suggest the motor cortex is “plastic” and a potential site for the storage of motor skills.

The hand area of the primary motor cortex is known to be larger among professional pianists than in amateur ones. This observation has suggested that extensive practice and the development of expert performance induces changes in the primary motor cortex, said senior investigator Peter L. Strick, Ph.D., Distinguished Professor and chair, Department of Neurobiology, Pitt School of Medicine.
Prior imaging studies have shown that markers of synaptic activity, meaning the input signals to neurons, decrease in the primary motor cortex as repeated actions become routine and an individual develops expertise at a motor skill. The researchers found that markers of synaptic activity also display a marked decrease in monkeys trained to perform sequences of movements that are guided from memory — an internally generated task — rather than from vision. They wondered whether the change in synaptic activity indicated that neuron firing also declined. To examine this issue they recorded neuron activity and sampled metabolic activity, a measure of synaptic activity, in the same animals.
All the monkeys were trained on two tasks and were rewarded when they reached out to touch an object in front of them. In the visually guided task, a visual target showed the monkeys where to reach and the end point was randomly switched from trial to trial. In the internally generated task the monkeys were trained to perform short sequences of movements without visual cues. They practiced the sequences until they achieved a level of skill comparable to an expert typist.
The researchers found neuron activity was comparable between monkeys that performed visually guided and internally generated tasks. However, metabolic activity was high for the visually guided task, but only modest during the internally generated task.
“This tells us that practicing a skilled movement and the development of expertise leads to more efficient generation of neuron activity in the primary motor cortex to produce the movement. The increase in efficiency could be created by a number of factors such as more effective synapses, greater synchrony in inputs and more finely tuned inputs,” Dr. Strick noted. “What is really important is that our results indicate that practice changes the primary motor cortex so that it can become an important substrate for the storage of motor skills. Thus, the motor cortex is adaptable, or plastic.”
(Source: upmc.com)
Monogamy’s Boost to Human Evolution
“Monogamy is a problem,” said Dieter Lukas of the University of Cambridge in a telephone news conference this week. As Dr. Lukas explained to reporters, he and other biologists consider monogamy an evolutionary puzzle.
In 9 percent of all mammal species, males and females will share a common territory for more than one breeding season, and in some cases bond for life. This is a problem — a scientific one — because male mammals could theoretically have more offspring by giving up on monogamy and mating with lots of females.
In a new study, Dr. Lukas and his colleague Tim Clutton-Brock suggest that monogamy evolves when females spread out, making it hard for a male to travel around and fend off competing males.
On the same day, Kit Opie of University College London and his colleagues published a similar study on primates, which are especially monogamous — males and females bond in over a quarter of primate species. The London scientists came to a different conclusion: that the threat of infanticide leads males to stick with only one female, protecting her from other males.
Even with the scientific problem far from resolved, research like this inevitably turns us into narcissists. It’s all well and good to understand why the gray-handed night monkey became monogamous. But we want to know: What does this say about men and women?
As with all things concerning the human heart, it’s complicated.
“The human mating system is extremely flexible,” Bernard Chapais of the University of Montreal wrote in a recent review in Evolutionary Anthropology. Only 17 percent of human cultures are strictly monogamous. The vast majority of human societies embrace a mix of marriage types, with some people practicing monogamy and others polygamy. (Most people in these cultures are in monogamous marriages, though.)
There are even some societies where a woman may marry several men. And some men and women have secret relationships that last for years while they’re married to other people, a kind of dual monogamy. Same-sex marriages acknowledge commitments that in many cases existed long before they won legal recognition.
Each species faces its own special challenges — the climate where it lives, or the food it depends on, or the predators that stalk it — and certain conditions may favor monogamy despite its drawbacks. One source of clues to the origin of human mating lies in our closest relatives, chimpanzees and bonobos. They live in large groups where the females mate with lots of males when they’re ovulating. Male chimpanzees will fight with each other for the chance to mate, and they’ve evolved to produce extra sperm to increase their chances that they get to father a female’s young.
Our own ancestors split off from the ancestors of chimpanzees about seven million years ago. Fossils may offer us some clues to how our mating systems evolved after that parting of ways. The hormone levels that course through monogamous primates are different from those of other species, possibly because the males aren’t in constant battle for females.
That difference in hormones influences how primates grow in some remarkable ways. For example, the ratio of their finger lengths is different.
In 2011, Emma Nelson of the University of Liverpool and her colleagues looked at the finger bones of ancient hominid fossils. From what they found, they concluded that hominids 4.4 million years ago mated with many females. By about 3.5 million years ago, however, the finger-length ratio indicated that hominids had shifted more toward monogamy.
Our lineage never evolved to be strictly monogamous. But even in polygamous relationships, individual men and women formed long-term bonds — a far cry from the arrangement in chimpanzees.
While the two new studies published last week disagree about the force driving the evolution of monogamy, they do agree on something important. “Once monogamy has evolved, then male care is far more likely,” Dr. Opie said.
Once a monogamous primate father starts to stick around, he has the opportunity to raise the odds that his offspring will survive. He can carry them, groom their fur and protect them from attacks.
In our own lineage, however, fathers went further. They had evolved the ability to hunt and scavenge meat, and they were supplying some of that food to their children. “They may have gone beyond what is normal for monogamous primates,” said Dr. Opie.
The extra supply of protein and calories that human children started to receive is widely considered a watershed moment in our evolution. It could explain why we have brains far bigger than other mammals.
Brains are hungry organs, demanding 20 times more calories than a similar piece of muscle. Only with a steady supply of energy-rich meat, Dr. Opie suggests, were we able to evolve big brains — and all the mental capacities that come with them.
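The “hungry organ” claim is easy to sanity-check with back-of-the-envelope arithmetic. Assuming an illustrative 1.4 kg brain in a 70 kg body, taking the article's 20-fold per-gram figure at face value, and (a deliberate oversimplification) letting the rest of the body burn energy at the "muscle" rate:

```python
body_kg = 70.0
brain_kg = 1.4          # roughly 2% of body mass; illustrative adult figures
per_gram_ratio = 20.0   # brain burns ~20x the calories of an equal mass of muscle

# Brain's share of the total energy budget under these crude assumptions:
brain_share = (brain_kg * per_gram_ratio) / (
    brain_kg * per_gram_ratio + (body_kg - brain_kg)
)
print(f"brain's share of energy budget: {brain_share:.0%}")  # -> 29%
```

Crude as it is, the estimate lands in the same ballpark as the commonly cited figure of roughly a fifth of resting energy going to the brain, which is what makes a reliable calorie supply such a plausible constraint on brain size.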
Because of monogamy, Dr. Opie said, “This could be how humans were able to push through a ceiling in terms of brain size.”