Neuroscience

Articles and news from the latest research reports.

Peptides helping researchers in search for Parkinson’s disease treatment
Australian researchers have taken the first step in using bioactive peptides as the building blocks to help ‘build a new brain’ to treat degenerative brain disease.
Deakin University biomedical scientist Dr Richard Williams is working in a team with Dr David Nisbet from the Australian National University and Dr Clare Parish at the Florey Neuroscience Institute to develop a way to repair the damaged parts of the brain that cause Parkinson’s disease.
Parkinson’s disease develops when the brain cells (or neurons) that produce the chemical dopamine die or are damaged. Dopamine is a chemical messenger that helps the brain transmit the signals that control muscles and movement. When these cells die or are damaged, the result is the shaking and muscle stiffness that are among the most common symptoms of the disease.
"We are looking at a way of helping the brain to regenerate the dead or damaged cells that transport dopamine throughout the body," Dr Williams said. "Peptides help the body heal itself, providing many positive benefits for health, particularly in regenerative medicine; this is why the athletes caught up in the recent doping scandal were using them to recover more quickly."
Peptides are both the building blocks and the messengers of the body; the team has used them to mimic the normal brain environment and provide the chemical signals needed to help the brain function.
"Peptides stick together like Lego blocks, so in the first stage of the project we have been able to make a three dimensional material or tissue scaffold that provides the networks cells need to grow; but the peptides also carry instructions in the form of chemical signals which tell the cells to grow into new neurons," Dr Williams explained.
"Importantly, this material has the same consistency as the brain, does not cause chronic inflammation and is non-toxic to the body.
"Our aim is to use this scaffold material to support the patient’s own stem cells that could be turned into dopamine neurons and implanted back into the brain. We expect that when implanted the material and stem cells would be accepted by the brain as normal tissue and grow to replace the damaged or dead cells."
While the research is not yet complete, Dr Williams is excited by the possibilities this work offers to the treatment of degenerative conditions.
"It is no secret that we are living longer, and with this we are seeing an increase in many conditions that come about because of ageing, such as Parkinson’s. By developing biomaterials like the ones we are working on, it could be possible to help the body regenerate and provide an improved quality of life to the older members of our community," he said.
"This work can also be adapted to other parts of the body which struggle to repair themselves, such as new cartilage for joints, muscle and heart cells, bones and teeth. Ultimately, it will be like taking your car to the garage to have new parts fitted to replace the worn out ones."
The results of the first stage of this Australian Research Council funded project will be published in the international journal Soft Matter.

Filed under parkinson's disease degenerative diseases peptides brain cells dopamine neuroscience science

Wireless, implanted sensor broadens range of brain research
A compact, self-contained sensor recorded and transmitted brain activity data wirelessly for more than a year in early-stage animal tests, according to a study funded by the National Institutes of Health. In addition to allowing for more natural studies of brain activity in moving subjects, this implantable device represents a potential major step toward cord-free control of advanced prosthetics that move with the power of thought. The report is in the April 2013 issue of the Journal of Neural Engineering.
“For people who have sustained paralysis or limb amputation, rehabilitation can be slow and frustrating because they have to learn a new way of doing things that the rest of us do without actively thinking about it,” said Grace Peng, Ph.D., who oversees the Rehabilitation Engineering Program of the National Institute of Biomedical Imaging and Bioengineering (NIBIB), part of NIH. “Brain-computer interfaces harness existing brain circuitry, which may offer a more intuitive rehab experience, and ultimately, a better quality of life for people who have already faced serious challenges.”
Recent advances in brain-computer interfaces (BCI) have shown that it is possible for a person to control a robotic arm through implanted brain sensors linked to powerful external computers. However, such devices have relied on wired connections, which pose infection risks and restrict movement, or were wireless but had very limited computing power.
Building on this line of research, David Borton, Ph.D., and Ming Yin, Ph.D., of Brown University, Providence, R.I., and colleagues surmounted several major barriers in developing their sensor. To be fully implantable within the brain, the device needed to be very small and completely sealed off to protect the delicate machinery inside the device and the even more delicate tissue surrounding it. At the same time, it had to be powerful enough to convert the brain’s subtle electrical activity into digital signals that could be used by a computer, and then boost those signals to a level that could be detected by a wireless receiver located some distance outside the body. Like all cordless machines, the device had to be rechargeable, but in the case of an implanted brain sensor, recharging must also be done wirelessly.
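Rough arithmetic makes clear why the power and bandwidth requirements pull in opposite directions. The sketch below is illustrative only; the channel count, sampling rate, and bit depth are assumptions chosen for the example, not the device's published specifications.

```python
# Back-of-envelope data rate for a multichannel neural sensor.
# All parameter values here are illustrative assumptions.

def raw_bitrate(channels: int, sample_rate_hz: int, bits_per_sample: int) -> float:
    """Raw uncompressed data rate in megabits per second."""
    return channels * sample_rate_hz * bits_per_sample / 1e6

# e.g. 100 electrodes sampled at 20 kHz with 12-bit resolution
rate = raw_bitrate(100, 20_000, 12)
print(f"{rate:.0f} Mbps")  # prints "24 Mbps"
```

Tens of megabits per second is broadband-class throughput, which is why a sealed, battery-powered implant that must also stay cool is such a demanding design problem.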
The researchers consulted with brain surgeons on the shape and size of the sensor, which they built out of titanium, commonly used in joint replacements and other medical implants. They also fitted the device with a window made of sapphire, which electromagnetic signals pass through more easily than other materials, to assist with wireless transmission and inductive charging, a method of recharging also used in electronic toothbrushes. Inside, the device was densely packed with the electronics specifically designed to function on low power to reduce the amount of heat generated by the device and to extend the time it could work on battery power.
Testing the device in animal models — two pigs and two rhesus macaques — the researchers were able to receive and record data from the implanted sensors in real time over a broadband wireless connection. The sensors could transmit signals more than three feet and have continued to perform for over a year with little degradation in quality or performance.
The ability to remotely record brain activity data as an animal interacts naturally with its environment may help inform studies on muscle control and the movement-related brain circuits, the researchers say. While testing of the current devices continues, the researchers plan to refine the sensor for better heat management and data transmission, with use in human medical care as the goal.
“Clinical applications may include thought-controlled prostheses for severely neurologically impaired patients, wireless access to motorized wheelchairs or other assistive technologies, and diagnostic monitoring such as in epilepsy, where patients currently are tethered to the bedside during assessment,” said Borton.

Filed under brain activity implants prosthetics limb amputation BCI animal model neuroscience science

Face of the future rears its head
Meet Zoe: a digital talking head which can express human emotions on demand with “unprecedented realism” and could herald a new era of human-computer interaction.
A virtual “talking head” which can express a full range of human emotions and could be used as a digital personal assistant, or to replace texting with “face messaging”, has been developed by researchers.
The lifelike face can display emotions such as happiness, anger, and fear, and changes its voice to suit any feeling the user wants it to simulate. Users can type in any message, specifying the requisite emotion as well, and the face recites the text. According to its designers, it is the most expressive controllable avatar ever created, replicating human emotions with unprecedented realism.
The system, called “Zoe”, is the result of a collaboration between researchers at Toshiba’s Cambridge Research Lab and the University of Cambridge’s Department of Engineering. Students have already spotted a striking resemblance between the disembodied head and Holly, the ship’s computer in the British sci-fi comedy, Red Dwarf.
Appropriately enough, the face is actually that of Zoe Lister, an actress perhaps best-known as Zoe Carpenter in the Channel 4 series, Hollyoaks. To recreate her face and voice, researchers spent several days recording Zoe’s speech and facial expressions. The result is a system that is light enough to work in mobile technology, and could be used as a personal assistant in smartphones, or to “face message” friends.
The framework behind “Zoe” is also a template that, before long, could enable people to upload their own faces and voices - but in a matter of seconds, rather than days. That means that in the future, users will be able to customise and personalise their own, emotionally realistic, digital assistants.
If this can be developed, then a user could, for example, text the message “I’m going to be late” and ask it to set the emotion to “frustrated”. Their friend would then receive a “face message” that looked like the sender, repeating the message in a frustrated way.
The team who created Zoe are currently looking for applications, and are also working with a school for autistic and deaf children, where the technology could be used to help pupils to “read” emotions and lip-read. Ultimately, the system could have multiple uses – including in gaming, in audio-visual books, as a means of delivering online lectures, and in other user interfaces.
“This technology could be the start of a whole new generation of interfaces which make interacting with a computer much more like talking to another human being,” Professor Roberto Cipolla, from the Department of Engineering, University of Cambridge, said.

Filed under human-computer interaction talking head emotions emotional combinations technology neuroscience science

Study indicates reverse impulses clear useless information, prime brain for learning
When the mind is at rest, the electrical signals by which brain cells communicate appear to travel in reverse, wiping out unimportant information in the process, but sensitizing the cells for future sensory learning, according to a study of rats conducted by researchers at the National Institutes of Health.
The finding has implications not only for studies seeking to help people learn more efficiently, but also for attempts to understand and treat post-traumatic stress disorder—in which the mind has difficulty moving beyond a disturbing experience.
During waking hours, brain cells, or neurons, communicate via high-speed electrical signals that travel the length of the cell. These communications are the foundation for learning. As learning progresses, these signals travel across groups of neurons with increasing rapidity, forming circuits that work together to recall a memory.
It was previously known that, during sleep, these impulses were reversed, arising from waves of electrical activity originating deep within the brain. In the current study, the researchers found that these reverse signals weakened circuits formed during waking hours, apparently so that unimportant information could be erased from the brain. But the reverse signals also appeared to prime the brain to relearn at least some of the forgotten information. If the animals encountered the same information upon awakening, the circuits re-formed much more rapidly than when they originally encountered the information.
"The brain doesn’t store all the information it encounters, so there must be a mechanism for discarding what isn’t important," said senior author R. Douglas Fields, Ph.D., head of the Section on Nervous System Development and Plasticity at the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), the NIH institute where the research was conducted. "These reverse brain signals appear to be the mechanism by which the brain clears itself of unimportant information."
Their findings appear in the Proceedings of the National Academy of Sciences.
The researchers studied the activity of rats’ brain cells from the hippocampus, a tube-like structure deep in the brain. The hippocampus relays information to and from many other regions of the brain. It plays an important role in memory, orientation, and navigation.
The classic understanding of brain cell activity is that electrical signals travel from dendrites—antenna-like projections at one end of the cell—through the cell body. From the cell body, they then travel the length of the axon, a single long projection at the other end of the cell. This electrical signal stimulates the release of chemicals at the end of the axon, which bind to dendrites on adjacent cells, stimulating these recipient cells to fire electrical signals, and so on. When groups of cells repeatedly fire in this way, the electrical signals increase in intensity.
Dr. Olena Bukalo, the study’s first author, and her team examined electrical signals that traveled in reverse—from the cell’s axon, to the cell body, and out its many dendrites. This reverse firing happens during sleep and at rest, appearing to reset the cell, the researchers found.
After first stimulating the cells with reverse electrical impulses, the researchers next stimulated the dendrites again with electrical impulses traveling in the forward direction. In response, the neurons generated a stronger signal, with the connections appearing to strengthen with repeated electrical stimulation.
This pattern appears to underlie the formation of new memories. A connection that is reset but never stimulated again may simply fade from use over time, Dr. Bukalo explained. But when a cell is stimulated again, it fires a stronger signal and may be more easily synchronized to the reinforced signals of other brain cells, all of which act in concert over time.
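A toy numerical sketch can make the reset-and-prime idea concrete. This is not the study's model; the weights, decay factor, and learning rates below are invented purely for illustration of the qualitative pattern described above.

```python
# Toy synapse: reverse (rest) activity weakens a connection but "primes"
# it, so it re-strengthens faster on re-stimulation than a naive synapse
# starting from the same low weight. All numbers are illustrative.

def stimulate(weight: float, primed: bool, lr: float = 0.1) -> float:
    """One forward stimulation; a primed synapse potentiates twice as fast."""
    gain = 2.0 if primed else 1.0
    return min(1.0, weight + gain * lr * (1.0 - weight))

def reverse_reset(weight: float, decay: float = 0.5) -> float:
    """Reverse (rest/sleep) activity weakens the connection."""
    return weight * decay

w = 0.8                       # circuit strengthened during waking
w = reverse_reset(w)          # rest: weight drops, synapse is primed
relearned = stimulate(w, primed=True)
naive = stimulate(w, primed=False)
assert relearned > naive      # primed synapse re-forms faster
```

A reset connection that is never stimulated again simply stays weak, matching the fade-from-use behavior Dr. Bukalo describes.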

Filed under brain cells PTSD memory learning hippocampus memory formation neuroscience science

Brain-mapping increases understanding of alcohol’s effects on college freshmen
A research team that includes several Penn State scientists has completed a first-of-its-kind longitudinal pilot study aimed at better understanding how the neural processes that underlie responses to alcohol-related cues change during students’ first year of college.
Anecdotal evidence abounds attesting to the many negative social and physical effects of the dramatic increase in alcohol use that often comes with many students’ first year of college. The behavioral changes that accompany those effects indicate underlying changes in the brain. Yet in contrast to alcohol’s numerous other effects, its effect on the brain’s continuing development from adolescence into early adulthood — which includes the transition from high school to college — is not well known.
Penn State psychology graduate student Adriene Beltz, with a team of additional researchers, investigated the changes that occurred to alcohol-related neural processes in the brains of a small group of first-year students.
Using functional magnetic resonance imaging (fMRI) and a data analysis technique known as effective connectivity mapping, the researchers collected and analyzed data from 11 students, who participated in a series of three fMRI sessions beginning just before the start of classes and concluding part-way through the second semester.
"We wanted to know if and how brain responses to alcohol cues — pictures of alcoholic beverages in this case — changed across the first year of college," said Beltz, "and how these potential changes related to alcohol use. Moreover, we wanted our analysis approach to take advantage of the richness of fMRI data."
Analysis of the data collected from the study participants revealed signs in their brains’ emotion processing networks of habituation to alcohol-related stimuli, and noticeable alterations in their cognitive control networks.
Recent studies have indicated that young adults’ cognitive development continues into the mid-20s, particularly in those regions of the brain responsible for decision-making and judgment — the sort of cognitive “fine tuning” that potentially shapes who we are (and will be) as much as any other stage of our overall development.
Other recent studies suggest that binge drinking during late adolescence may damage the brain in ways that could last into adulthood.
Beltz’s study indicates that connections among brain regions involved in emotion processing and cognitive control may change with increased exposure to alcohol and alcohol-related cues. Those connections also may influence other parts of the brain, such as those still-developing regions responsible for students’ decision-making and judgment abilities.
"The brain is a complex network," Beltz said. "We know that connections among different brain regions are important for behavior, and we know that many of these connections are still developing into early adulthood. Thus, alcohol could have far-reaching consequences on a maturing brain, directly influencing some brain regions and indirectly influencing others by disrupting neural connectivity."
While in an fMRI scanner at the Penn State Social, Life and Engineering Sciences Imaging Center, students participating in the study completed a task: responding as quickly as possible, by pressing a button on a grip device, to an image of either an alcoholic beverage or a non-alcoholic beverage when both types of images were displayed consecutively on a screen. From the resulting data, effective connectivity maps were created for each individual and for the group.
Examining the final maps, the researchers found that brain regions involved in emotion-processing showed less connectivity when the students responded to alcohol cues than when they responded to non-alcohol cues, and that brain regions involved in cognitive control showed the most connectivity during the first semester of college. The findings suggest that the students needed to heavily recruit brain regions involved in cognitive control in order to overcome the alcohol-associated stimuli when instructed to respond to the non-alcohol cues.
"Connectivity among brain regions implicated in cognitive control spiked from the summer before college to the first semester of college," said Beltz. "This was particularly interesting because the spike coincided with increases in the participants’ alcohol use and increases in their exposure to alcohol cues in the college environment. From the first semester to the second semester, levels of alcohol use and cue exposure remained steady, but connectivity among cognitive control brain regions decreased. From this, we concluded that changes in alcohol use and cue exposure — not absolute levels — were reflected by the underlying neural processes."
Although the immediate implications of the pilot study for first-year students are fairly clear, there are still a number of unanswered questions related to alcohol’s longer-term effects on development, for college students after their first year and for those same individuals later in life.
To begin exploring those potential long-term effects, Beltz has planned a follow-up study to track a larger number of participants over a greater length of time.

Brain-mapping increases understanding of alcohol’s effects on college freshmen

A research team that includes several Penn State scientists has completed a first-of-its-kind longitudinal pilot study aimed at better understanding how the neural processes that underlie responses to alcohol-related cues change during students’ first year of college.

Anecdotal evidence abounds attesting to the many negative social and physical effects of the dramatic increase in alcohol use that often comes with many students’ first year of college. The behavioral changes that accompany those effects indicate underlying changes in the brain. Yet in contrast to alcohol’s numerous other effects, its effect on the brain’s continuing development from adolescence into early adulthood — which includes the transition from high school to college — is not well known.

Penn State psychology graduate student Adriene Beltz, with a team of additional researchers, investigated the changes that occurred to alcohol-related neural processes in the brains of a small group of first-year students.

Using functional magnetic resonance imaging (fMRI) and a data analysis technique known as effective connectivity mapping, the researchers collected and analyzed data from 11 students, who participated in a series of three fMRI sessions beginning just before the start of classes and concluding part-way through the second semester.

"We wanted to know if and how brain responses to alcohol cues — pictures of alcoholic beverages in this case — changed across the first year of college," said Beltz, "and how these potential changes related to alcohol use. Moreover, we wanted our analysis approach to take advantage of the richness of fMRI data."

Analysis of the data collected from the study participants revealed signs of habituation to alcohol-related stimuli in their brains’ emotion processing networks, along with noticeable alterations in their cognitive control networks.

Recent studies have indicated that young adults’ cognitive development continues into the mid-20s, particularly in the brain regions responsible for decision-making and judgment — the sort of cognitive “fine tuning” that potentially makes us, in some senses, as much who we are (and will be) as any other stage of our development.

Other recent studies suggest that binge drinking during late adolescence may damage the brain in ways that could last into adulthood.

Beltz’s study indicates that connections among brain regions involved in emotion processing and cognitive control may change with increased exposure to alcohol and alcohol-related cues. Those connections also may influence other parts of the brain, such as those still-developing regions responsible for students’ decision-making and judgment abilities.

"The brain is a complex network," Beltz said. "We know that connections among different brain regions are important for behavior, and we know that many of these connections are still developing into early adulthood. Thus, alcohol could have far-reaching consequences on a maturing brain, directly influencing some brain regions and indirectly influencing others by disrupting neural connectivity."

While in an fMRI scanner at the Penn State Social, Life and Engineering Sciences Imaging Center, students participating in the study completed a task: responding as quickly as possible, by pressing a button on a grip device, to an image of either an alcoholic beverage or a non-alcoholic beverage when both types of images were displayed consecutively on a screen. From the resulting data, effective connectivity maps were created for each individual and for the group.
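The directed-influence idea behind those maps can be sketched with a toy example. The snippet below is not the study's actual pipeline — it invents two synthetic ROI time series, with region B partly driven by region A at a one-scan lag, and recovers that directed influence with a simple lagged regression; real effective connectivity methods generalise this to many regions and paths at once.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic BOLD-like time series for regions of interest (ROIs).
# Region B is partly driven by region A at a lag of one scan, which is
# the kind of directed (effective) influence connectivity mapping aims
# to recover, as opposed to mere undirected correlation.
n_scans = 200
region_a = rng.normal(size=n_scans)
region_b = np.empty(n_scans)
region_b[0] = rng.normal()  # t=0 has no predecessor; excluded from the fit
region_b[1:] = 0.6 * region_a[:-1] + rng.normal(scale=0.5, size=n_scans - 1)

# Estimate the directed A -> B influence with a lagged least-squares fit:
#   region_b[t] ~ beta * region_a[t-1] + intercept
beta, intercept = np.polyfit(region_a[:-1], region_b[1:], 1)
print(f"estimated A -> B lagged influence: {beta:.2f}")  # ~0.6 by construction
```

Undirected correlation alone could not say which region drives which; the lag is what gives the estimated edge its direction, and a connectivity map is built from many such directed edges.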

Examining the final maps, the researchers found that brain regions involved in emotion-processing showed less connectivity when the students responded to alcohol cues than when they responded to non-alcohol cues, and that brain regions involved in cognitive control showed the most connectivity during the first semester of college. The findings suggest that the students needed to heavily recruit brain regions involved in cognitive control in order to overcome the alcohol-associated stimuli when instructed to respond to the non-alcohol cues.

"Connectivity among brain regions implicated in cognitive control spiked from the summer before college to the first semester of college," said Beltz. "This was particularly interesting because the spike coincided with increases in the participants’ alcohol use and increases in their exposure to alcohol cues in the college environment. From the first semester to the second semester, levels of alcohol use and cue exposure remained steady, but connectivity among cognitive control brain regions decreased. From this, we concluded that changes in alcohol use and cue exposure — not absolute levels — were reflected by the underlying neural processes."

Although the immediate implications of the pilot study for first-year students are fairly clear, there are still a number of unanswered questions related to alcohol’s longer-term effects on development, for college students after their first year and for those same individuals later in life.

To begin exploring those potential long-term effects, Beltz has planned a follow-up study to track a larger number of participants over a greater length of time.

Filed under alcohol brain mapping effective connectivity mapping fMRI brain responses neuroscience science

36 notes

Atypical brain circuits may cause slower gaze shifting in infants who later develop autism
Infants at 7 months of age who go on to develop autism are slower to reorient their gaze and attention from one object to another when compared to 7-month-olds who do not develop autism, and this behavioral pattern is in part explained by atypical brain circuits.
Those are the findings of a new study led by University of North Carolina School of Medicine researchers and published online March 20 by the American Journal of Psychiatry.
"These findings suggest that 7-month-olds who go on to develop autism show subtle, yet overt, behavioral differences prior to the emergence of the disorder. They also implicate a specific neural circuit, the splenium of the corpus callosum, which may not be functioning as it does in typically developing infants, who show more rapid orienting to visual stimuli," said Jed T. Elison, PhD, first author of the study.
Elison worked on the study, conducted as part of the Infant Brain Imaging Study (IBIS) Network, for his doctoral dissertation at UNC. He now is a postdoctoral fellow at the California Institute of Technology. The study’s senior author is Joseph Piven, MD, professor of psychiatry, director of the Carolina Institute for Developmental Disabilities at UNC, and the principal investigator of the IBIS Network.
The IBIS Network consists of research sites at UNC, Children’s Hospital of Philadelphia, Washington University in St. Louis, the University of Washington in Seattle, the University of Utah in Salt Lake City, and the Montreal Neurological Institute at McGill University. These sites, along with the University of Alberta, are currently recruiting younger siblings of children with autism and their families for ongoing research.
"Difficulty in shifting gaze and attention that we found in 7-month-olds may be a fundamental problem in autism," Piven said. "Our hope is that this finding may help lead us to early detection and interventions that could improve outcomes for individuals with autism and their families."
The study included 97 infants: 16 high-risk infants later classified with an autism spectrum disorder (ASD), 40 high-risk infants not meeting ASD criteria (i.e., high-risk-negative) and 41 low-risk infants. For this study, infants participated in an eye-tracking test and a brain scan at 7 months of age, and a clinical assessment at 25 months of age.
The results showed that the high-risk infants later found to have ASD were slower to orient or shift their gaze (by approximately 50 milliseconds) than both high-risk-negative and low-risk infants. In addition, visual orienting ability in low-risk infants was uniquely associated with a specific neural circuit in the brain: the splenium of the corpus callosum. This association was not found in infants later classified with ASD.
The study concluded that atypical visual orienting is an early feature of later emerging ASD and is associated with a deficit in a specific neural circuit in the brain.
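As a rough illustration of the kind of group contrast behind that roughly 50-millisecond result, the snippet below compares invented latency samples with a Welch's t statistic; the group sizes echo the study, but the distributions, means and the specific test are assumptions, not the paper's actual analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical saccadic-latency samples in milliseconds, loosely echoing
# the reported ~50 ms group difference. The group sizes mirror the study
# (16 ASD-outcome vs. 41 low-risk infants); the distributions are invented.
latency_asd = rng.normal(loc=300, scale=40, size=16)
latency_low_risk = rng.normal(loc=250, scale=40, size=41)

diff = latency_asd.mean() - latency_low_risk.mean()

# Welch's t statistic (unequal variances), written out with NumPy only.
va, vb = latency_asd.var(ddof=1), latency_low_risk.var(ddof=1)
na, nb = latency_asd.size, latency_low_risk.size
t_stat = diff / np.sqrt(va / na + vb / nb)
print(f"mean difference: {diff:.1f} ms, Welch t = {t_stat:.2f}")
```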

Filed under brain brain circuits neural circuit infants autism corpus callosum visual orienting ASD neuroscience science

187 notes

Sleep study reveals how the adolescent brain makes the transition to mature thinking
A new study conducted by monitoring the brain waves of sleeping adolescents has found that remarkable changes occur in the brain as it prunes away neuronal connections and makes the major transition from childhood to adulthood.
“We’ve provided the first long-term, longitudinal description of developmental changes that take place in the brains of youngsters as they sleep,” said Irwin Feinberg, professor emeritus of psychiatry and behavioral sciences and director of the UC Davis Sleep Laboratory. “Our outcome confirms that the brain goes through a remarkable amount of reorganization during puberty that is necessary for complex thinking.”
The research, published in the February 15 issue of American Journal of Physiology: Regulatory, Integrative and Comparative Physiology, also confirms that electroencephalogram, or EEG, is a powerful tool for tracking brain changes during different phases of life, and that it could potentially be used to help diagnose age-related mental illnesses. It is the final component in a three-part series of studies carried out over 10 years and involving more than 3,500 all-night EEG recordings. The data provide an overall picture of the brain’s electrical behavior during the first two decades of life.
Feinberg explained that scientists have generally assumed that a vast number of synapses are needed early in life to recover from injury and adapt to changing environments. These multiple connections, however, impair the efficient problem solving and logical thinking required later in life. His study is the first to show how this shift can be detected by measuring the brain’s electrical activity in the same children over the course of time.
Two earlier studies by Feinberg and his colleagues showed that EEG fluctuations during the deepest (delta or slow wave) phase of sleep, when the brain is most recuperative, consistently declined for 9- to 18-year-olds. The most rapid decline occurred between the ages of 12 and 16-1/2. This led the team to conclude that the streamlining of brain activity — or “neuronal pruning” — required for adult cognition occurs together with the timing of reproductive maturity.
Questions remained, though, about electrical activity patterns in the brains of younger children.
For the current study, Feinberg and his research team monitored 28 healthy, sleeping children between the ages of 6 and 10, recording for two nights every six months. The new findings show that synaptic density in the cerebral cortex reaches its peak at age 8 and then begins a slow decline. They also confirm that the steepest decline occurs between the ages of 12 and 16-1/2 years, after which the drop markedly slows.
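The delta activity these studies track is, at heart, a band-power measure on the sleep EEG. A minimal sketch with entirely synthetic data: the sampling rate, epoch length and the attenuated slow-wave component of the "older" epoch below are all invented to mimic the adolescent decline described above, not taken from the study's recordings.

```python
import numpy as np

rng = np.random.default_rng(2)

def delta_power(eeg, fs):
    """Fraction of total spectral power in the delta band (1-4 Hz)."""
    freqs = np.fft.rfftfreq(len(eeg), d=1 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2
    band = (freqs >= 1) & (freqs <= 4)
    return psd[band].sum() / psd.sum()

fs = 128                      # Hz, an assumed sleep-EEG sampling rate
t = np.arange(30 * fs) / fs   # one 30-second scoring epoch

# Two synthetic epochs: a "younger" sleeper with a strong 2 Hz slow-wave
# component, and an "older" one where that component has been attenuated.
noise = rng.normal(scale=0.5, size=t.size)
young = 2.0 * np.sin(2 * np.pi * 2 * t) + noise
older = 0.5 * np.sin(2 * np.pi * 2 * t) + noise

print(f"delta fraction, younger epoch: {delta_power(young, fs):.2f}")
print(f"delta fraction, older epoch:   {delta_power(older, fs):.2f}")
```

Tracking this fraction in the same children across years is, in simplified form, what produces the decline curves the team reports.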
“Discovering that such extensive neuronal remodeling occurs within this 4-1/2 year timeframe during late adolescence and the early teen years confirms our view that the sleep EEG indexes a crucial aspect of the timing of brain development,” said Feinberg.
The latest study also confirms that EEG sleep analysis is a powerful approach for evaluating adolescent brain maturation, according to Feinberg. Besides being a relatively simple, accessible technology for measuring the brain’s electrical activity, it is more accurate than more cumbersome and expensive options.
“Structural MRI, for instance, has not been able to identify the adolescent accelerations and decelerations that are easily and reliably captured by sleep EEG,” said Feinberg. “We hope our data can aid the search for the unknown genetic and hormonal biomarkers that drive those fluctuations. Our data also provide a baseline for seeking errors in brain development that signify the onset of diseases such as schizophrenia, which typically first become apparent during adolescence. Once these underlying processes have been identified, it may become possible to influence adolescent brain changes in ways that promote normal development and correct emerging abnormalities.”
(Image: iStockphoto)

Filed under adolescent brain brainwaves brain development developmental changes EEG neuroscience psychology science

54 notes

Origins of teamwork found in our nearest relative the chimpanzee
Teamwork has been fundamental to humanity’s greatest achievements, but scientists have found that working together has its evolutionary roots in our nearest primate relatives – chimpanzees.
A series of trials by scientists found that chimpanzees not only coordinate actions with each other but also understand the need to help a partner perform their role to achieve a common goal.
Pairs of chimpanzees were given tools to get grapes out of a box. They had to work together with a tool each to get the food out. Scientists found that the chimpanzees would solve the problem together, even swapping tools, to pull the food out.
The study, published in Biology Letters, by scientists from Warwick Business School, UK, and the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, sought to find out if there were any evolutionary roots to humans’ ability to cooperate and coordinate actions.
Dr Alicia Melis, Assistant Professor of Behavioural Science at Warwick Business School, said: “We want to find out where humans’ ability to cooperate and work together has come from and whether it is unique to us.
“Many animal species cooperate to achieve mutually beneficial goals like defending their territories or hunting prey. However, the level of intentional coordination underlying these group actions is often unclear, and success could be due to independent but simultaneous actions towards the same goal.
“This study provides the first evidence that one of our closest primate relatives, the chimpanzees, not only intentionally coordinate actions with each other but that they even understand the necessity to help a partner performing her role in order to achieve the common goal.
“These are skills shared by both chimpanzees and humans, so such skills may have been present in their common ancestor before humans evolved their own complex forms of collaboration.”

Filed under primates evolution teamwork intentional coordination psychology neuroscience science

105 notes

Skulls of early humans carry telltale signs of inbreeding
Buried for 100,000 years at Xujiayao in the Nihewan Basin of northern China, the recovered skull pieces of an early human exhibit a now-rare congenital deformation that indicates inbreeding might well have been common among our ancestors, new research from the Chinese Academy of Sciences and Washington University in St. Louis suggests.
The skull, known as Xujiayao 11, has an unusual perforation through the top of the brain case — an enlarged parietal foramen (EPF) or “hole in the skull” — that is consistent with modern humans diagnosed with a rare genetic mutation in the homeobox genes ALX4 on chromosome 11 and MSX2 on chromosome 5.
These specific genetic mutations interfere with bone formation and prevent the closure of small holes in the back of the prenatal braincase, a process that is normally completed within the first five months of fetal development. It occurs in about one out of every 25,000 modern human births.
Although this genetic abnormality is sometimes associated with cognitive deficits, the older adult age of Xujiayao 11 suggests that any such deficits in this individual were minor.
Traces of genetic abnormalities, such as EPF, are seen unusually often in the skulls of Pleistocene humans, from early Homo erectus to the end of the Paleolithic.
"The probability of finding one of these abnormalities in the small available sample of human fossils is very low, and the cumulative probability of finding so many is exceedingly small," suggests study co-author Erik Trinkaus, the Mary Tileston Hemenway Professor of Anthropology in Arts & Sciences at Washington University in St. Louis.
"The presence of the Xujiayao and other Pleistocene human abnormalities therefore suggests unusual population dynamics, most likely from high levels of inbreeding and local population instability." It therefore provides a background for understanding populational and cultural dynamics through much of human evolution.
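Trinkaus's probability argument can be made concrete with rough numbers. The 1-in-25,000 EPF rate is quoted above; the fossil sample size and the count of independent rare traits below are invented purely for illustration.

```python
# Illustrative arithmetic only: how unlikely is it to see rare congenital
# traits this often in a small fossil record, absent unusual population
# dynamics such as inbreeding?
p = 1 / 25_000   # EPF rate in modern births, quoted above
n = 200          # invented: crania in a hypothetical fossil sample

# Chance of observing at least one case of a single 1-in-25,000 trait:
p_at_least_one = 1 - (1 - p) ** n
print(f"P(at least one case in {n} crania) = {p_at_least_one:.4f}")

# If five different, independent traits each this rare were all observed,
# the joint probability would be vanishingly small:
p_joint = p_at_least_one ** 5
print(f"joint probability of five such finds ~ {p_joint:.2e}")
```

Under these assumed numbers, even one such find is a sub-1% event and several together are essentially impossible by chance, which is the "cumulative probability" intuition behind the quote.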

Filed under skulls inbreeding congenital deformation Xujiayao 11 genetic mutations cognitive deficits evolution neuroscience science

131 notes

Neanderthal brains focussed on vision and movement
Neanderthal brains were adapted to allow them to see better and maintain larger bodies, according to new research by the University of Oxford and the Natural History Museum, London.
Although Neanderthals’ brains were similar in size to their contemporary modern human counterparts, fresh analysis of fossil data suggests that their brain structure was rather different. Results imply that larger areas of the Neanderthal brain, compared to the modern human brain, were given over to vision and movement, leaving less room for the higher-level thinking required to form large social groups.
The analysis was conducted by Eiluned Pearce and Professor Robin Dunbar at the University of Oxford and Professor Chris Stringer at the Natural History Museum, London, and is published in the online version of the journal, Proceedings of the Royal Society B.
Looking at data from 27,000–75,000-year-old fossils, mostly from Europe and the Near East, they compared the skulls of 32 anatomically modern humans and 13 Neanderthals to examine brain size and organisation. In a subset of these fossils, they found that Neanderthals had significantly larger eye sockets, and therefore eyes, than modern humans.
The researchers standardised fossil brain size for body mass and visual processing requirements. Once the differences in body and visual system size were taken into account, they could compare how much of the brain was left over for other cognitive functions.
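That adjustment is essentially a regression step. A minimal sketch with entirely invented measurements (the paper's actual values and model differ): fit brain volume against body mass and orbit size in the modern-human sample, then ask how much brain each Neanderthal has "left over" relative to that expectation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented stand-ins for the fossil data (the study compared 32 modern-human
# and 13 Neanderthal crania). Neanderthals get larger bodies and orbits but
# a similar overall brain volume, as the article describes.
body_mh = rng.normal(65, 6, 32)       # kg, modern humans
orbit_mh = rng.normal(25, 1.5, 32)    # mm, orbital height as a visual proxy
brain_mh = 600 + 8 * body_mh + 6 * orbit_mh + rng.normal(0, 30, 32)

body_ne = rng.normal(76, 6, 13)       # kg, Neanderthals: bigger bodies
orbit_ne = rng.normal(29, 1.5, 13)    # mm, and bigger orbits
brain_ne = rng.normal(brain_mh.mean(), 30, 13)  # but similar total volume

# Fit brain volume ~ body mass + orbit size on the modern-human sample.
X_mh = np.column_stack([np.ones(32), body_mh, orbit_mh])
coef, *_ = np.linalg.lstsq(X_mh, brain_mh, rcond=None)

# Residual = brain volume left over once body and visual demands are removed.
residual_mh = brain_mh - X_mh @ coef
X_ne = np.column_stack([np.ones(13), body_ne, orbit_ne])
residual_ne = brain_ne - X_ne @ coef

print(f"mean residual, modern humans: {residual_mh.mean():.1f}")
print(f"mean residual, Neanderthals:  {residual_ne.mean():.1f}")
```

With these toy numbers the Neanderthal residual comes out clearly negative: given their bodies and eyes, less brain remains for other functions, which mirrors the study's inference.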
Previous research by the Oxford scientists shows that modern humans living at higher latitudes evolved bigger vision areas in the brain to cope with the low light levels. This latest study builds on that research, suggesting that Neanderthals probably had larger eyes than contemporary humans because they evolved in Europe, whereas contemporary humans had only recently emerged from lower latitude Africa.
'Since Neanderthals evolved at higher latitudes and also have bigger bodies than modern humans, more of the Neanderthal brain would have been dedicated to vision and body control, leaving less brain to deal with other functions like social networking,' explains lead author Eiluned Pearce from the Institute of Cognitive and Evolutionary Anthropology at the University of Oxford.
‘Smaller social groups might have made Neanderthals less able to cope with the difficulties of their harsh Eurasian environments because they would have had fewer friends to help them out in times of need. Overall, differences in brain organisation and social cognition may go a long way towards explaining why Neanderthals went extinct whereas modern humans survived.’
'The large brains of Neanderthals have been a source of debate from the time of the first fossil discoveries of this group, but getting any real idea of the “quality” of their brains has been very problematic,' says Professor Chris Stringer, Research Leader in Human Origins at the Natural History Museum and co-author on the paper. 'Hence discussion has centred on their material culture and supposed way of life as indirect signs of the level of complexity of their brains in comparison with ours.
'Our study provides a more direct approach by estimating how much of their brain was allocated to cognitive functions, including the regulation of social group size; a smaller size for the latter would have had implications for their level of social complexity and their ability to create, conserve and build on innovations.'
Professor Robin Dunbar observes: ‘Having less brain available to manage the social world has profound implications for the Neanderthals’ ability to maintain extended trading networks, and are likely also to have resulted in less well developed material culture – which, between them, may have left them more exposed than modern humans when facing the ecological challenges of the Ice Ages.’
The relationship between absolute brain size and higher cognitive abilities has long been controversial, and this new study could explain why Neanderthal culture appears less developed than that of early modern humans, for example in relation to symbolism, ornamentation and art.

Neanderthal brains focussed on vision and movement

Neanderthal brains were adapted to allow them to see better and maintain larger bodies, according to new research by the University of Oxford and the Natural History Museum, London.

Although Neanderthals’ brains were similar in size to their contemporary modern human counterparts, fresh analysis of fossil data suggests that their brain structure was rather different. Results imply that larger areas of the Neanderthal brain, compared to the modern human brain, were given over to vision and movement and this left less room for the higher level thinking required to form large social groups.

The analysis was conducted by Eiluned Pearce and Professor Robin Dunbar at the University of Oxford and Professor Chris Stringer at the Natural History Museum, London, and is published in the online version of the journal Proceedings of the Royal Society B.

Looking at data from 27,000–75,000-year-old fossils, mostly from Europe and the Near East, they compared the skulls of 32 anatomically modern humans and 13 Neanderthals to examine brain size and organisation. In a subset of these fossils, they found that Neanderthals had significantly larger eye sockets, and therefore eyes, than modern humans.

The researchers standardised the fossil brain sizes to account for body mass and visual processing requirements. Once these differences in body and visual system size were taken into account, they could compare how much of each brain was left over for other cognitive functions.
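The logic of this correction can be sketched as a simple regression: predict brain volume from body mass and eye (orbit) size, and treat the residual as the brain volume "left over" for other functions. This is only an illustrative sketch of that kind of analysis, not the paper's actual method or data; all numbers below are invented for demonstration.

```python
import numpy as np

# Hypothetical, illustrative values only (NOT the study's data):
# endocranial volume (cc), body mass (kg), orbit height (mm)
brain = np.array([1450.0, 1500.0, 1480.0, 1520.0, 1490.0])
mass = np.array([62.0, 76.0, 64.0, 78.0, 65.0])
orbit = np.array([34.0, 38.0, 35.0, 39.0, 35.5])

# Regress brain volume on body mass and orbit size; the residual is
# the brain volume unexplained by somatic and visual demands.
X = np.column_stack([np.ones_like(mass), mass, orbit])
coef, *_ = np.linalg.lstsq(X, brain, rcond=None)
residual = brain - X @ coef

# A positive residual means more brain than expected for that body and
# visual system; groups (e.g. Neanderthal vs modern human skulls) can
# then be compared on this residual rather than on raw brain size.
print(np.round(residual, 1))
```

Comparing residuals rather than raw volumes is what lets two groups with similar absolute brain sizes nonetheless differ in the capacity available for functions such as social cognition.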

Previous research by the Oxford scientists shows that modern humans living at higher latitudes evolved bigger vision areas in the brain to cope with the low light levels. This latest study builds on that research, suggesting that Neanderthals probably had larger eyes than contemporary humans because they evolved in Europe, whereas contemporary humans had only recently emerged from lower latitude Africa.

‘Since Neanderthals evolved at higher latitudes and also have bigger bodies than modern humans, more of the Neanderthal brain would have been dedicated to vision and body control, leaving less brain to deal with other functions like social networking,’ explains lead author Eiluned Pearce from the Institute of Cognitive and Evolutionary Anthropology at the University of Oxford.

‘Smaller social groups might have made Neanderthals less able to cope with the difficulties of their harsh Eurasian environments because they would have had fewer friends to help them out in times of need. Overall, differences in brain organisation and social cognition may go a long way towards explaining why Neanderthals went extinct whereas modern humans survived.’

‘The large brains of Neanderthals have been a source of debate from the time of the first fossil discoveries of this group, but getting any real idea of the “quality” of their brains has been very problematic,’ says Professor Chris Stringer, Research Leader in Human Origins at the Natural History Museum and co-author on the paper. ‘Hence discussion has centred on their material culture and supposed way of life as indirect signs of the level of complexity of their brains in comparison with ours.

‘Our study provides a more direct approach by estimating how much of their brain was allocated to cognitive functions, including the regulation of social group size; a smaller size for the latter would have had implications for their level of social complexity and their ability to create, conserve and build on innovations.’

Professor Robin Dunbar observes: ‘Having less brain available to manage the social world has profound implications for the Neanderthals’ ability to maintain extended trading networks, and is likely also to have resulted in less well developed material culture – which, between them, may have left them more exposed than modern humans when facing the ecological challenges of the Ice Ages.’

The relationship between absolute brain size and higher cognitive abilities has long been controversial, and this new study could explain why Neanderthal culture appears less developed than that of early modern humans, for example in relation to symbolism, ornamentation and art.

Filed under brain Neanderthals brain structure cognitive functions visual system neuroscience psychology evolution science
