Neuroscience

Articles and news from the latest research reports.

(Image caption: This image depicts the injection sites and the expression of the viral constructs in the two areas of the brain studied: the Dentate Gyrus of the hippocampus (middle) and the Basolateral Amygdala (bottom corners). Image courtesy of the researchers)
Neuroscientists reverse memories’ emotional associations
Most memories have some kind of emotion associated with them: Recalling the week you just spent at the beach probably makes you feel happy, while reflecting on being bullied provokes more negative feelings.
A new study from MIT neuroscientists reveals the brain circuit that controls how memories become linked with positive or negative emotions. Furthermore, the researchers found that they could reverse the emotional association of specific memories by manipulating brain cells with optogenetics — a technique that uses light to control neuron activity.
The findings, described in the Aug. 27 issue of Nature, demonstrated that a neuronal circuit connecting the hippocampus and the amygdala plays a critical role in associating emotion with memory. This circuit could offer a target for new drugs to help treat conditions such as post-traumatic stress disorder, the researchers say.
“In the future, one may be able to develop methods that help people to remember positive memories more strongly than negative ones,” says Susumu Tonegawa, the Picower Professor of Biology and Neuroscience, director of the RIKEN-MIT Center for Neural Circuit Genetics at MIT’s Picower Institute for Learning and Memory, and senior author of the paper.
The paper’s lead authors are Roger Redondo, a Howard Hughes Medical Institute postdoc at MIT, and Joshua Kim, a graduate student in MIT’s Department of Biology.
Shifting memories
Memories are made of many elements, which are stored in different parts of the brain. A memory’s context, including information about the location where the event took place, is stored in cells of the hippocampus, while emotions linked to that memory are found in the amygdala.
Previous research has shown that many aspects of memory, including emotional associations, are malleable. Psychotherapists have taken advantage of this to help patients suffering from depression and post-traumatic stress disorder, but the neural circuitry underlying such malleability is not known.
In this study, the researchers set out to explore that malleability with an experimental technique they recently devised that allows them to tag neurons that encode a specific memory, or engram. To achieve this, they label hippocampal cells that are turned on during memory formation with a light-sensitive protein called channelrhodopsin. From that point on, any time those cells are activated with light, the mice recall the memory encoded by that group of cells.
Last year, Tonegawa’s lab used this technique to implant, or “incept,” false memories in mice by reactivating engrams while the mice were undergoing a different experience. In the new study, the researchers wanted to investigate how the context of a memory becomes linked to a particular emotion. First, they used their engram-labeling protocol to tag neurons associated with either a rewarding experience (for male mice, socializing with a female mouse) or an unpleasant experience (a mild electrical shock). In this first set of experiments, the researchers labeled memory cells in a part of the hippocampus called the dentate gyrus.
Two days later, the mice were placed into a large rectangular arena. For three minutes, the researchers recorded which half of the arena the mice naturally preferred. Then, for mice that had received the fear conditioning, the researchers stimulated the labeled cells in the dentate gyrus with light whenever the mice went into the preferred side. The mice soon began avoiding that area, showing that the reactivation of the fear memory had been successful.
The reward memory could also be reactivated: For mice that were reward-conditioned, the researchers stimulated them with light whenever they went into the less-preferred side, and they soon began to spend more time there, recalling the pleasant memory.
A couple of days later, the researchers tried to reverse the mice’s emotional responses. For male mice that had originally received the fear conditioning, they activated the memory cells involved in the fear memory with light for 12 minutes while the mice spent time with female mice. For mice that had initially received the reward conditioning, memory cells were activated while they received mild electric shocks.
Next, the researchers again put the mice in the large two-zone arena. This time, the mice that had originally been conditioned with fear and had avoided the side of the chamber where their hippocampal cells were activated by the laser now began to spend more time in that side when their hippocampal cells were activated, showing that a pleasant association had replaced the fearful one. This reversal also took place in mice that went from reward to fear conditioning.
Altered connections
The researchers then performed the same set of experiments but labeled memory cells in the basolateral amygdala, a region involved in processing emotions. This time, they could not induce a switch by reactivating those cells — the mice continued to behave as they had been conditioned when the memory cells were first labeled.
This suggests that emotional associations, also called valences, are encoded somewhere in the neural circuitry that connects the dentate gyrus to the amygdala, the researchers say. A fearful experience strengthens the connections between the hippocampal engram and fear-encoding cells in the amygdala, but that connection can be weakened later on as new connections are formed between the hippocampus and amygdala cells that encode positive associations.
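The proposed mechanism — strengthening the pathway from the hippocampal engram to one amygdala population while weakening the pathway to the other — can be sketched as a toy weight-update model. This is purely illustrative: the populations, learning rate, and weight values are invented for the sketch, not taken from the study.

```python
# Toy model of valence switching at the DG -> amygdala connections.
# A single hippocampal engram projects to a fear-encoding population and
# a reward-encoding population in the amygdala; pairing engram
# reactivation with a new outcome strengthens one pathway and weakens
# the other (a Hebbian-style rule). All numbers are illustrative.

def recondition(weights, outcome, rate=0.3, steps=10):
    """Repeatedly pair engram reactivation with a new outcome."""
    w = dict(weights)
    for _ in range(steps):
        for target in w:
            if target == outcome:
                w[target] += rate * (1.0 - w[target])  # potentiate paired pathway
            else:
                w[target] -= rate * w[target]          # depress unpaired pathway
    return w

# Engram tagged during fear conditioning: the fear pathway starts strong.
w0 = {"fear": 0.9, "reward": 0.1}

# Reactivate the engram while the mouse has a rewarding experience.
w1 = recondition(w0, outcome="reward")

assert w1["reward"] > w1["fear"]  # the valence has flipped
```

In this sketch the reversal works because the amygdala targets are fixed ("precommitted") and only the connection strengths change — matching the paper's finding that reactivating amygdala cells directly could not switch the behavior.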
“That plasticity of the connection between the hippocampus and the amygdala plays a crucial role in the switching of the valence of the memory,” Tonegawa says.
These results indicate that while dentate gyrus cells are neutral with respect to emotion, individual amygdala cells are precommitted to encode fear or reward memory. The researchers are now trying to discover molecular signatures of these two types of amygdala cells. They are also investigating whether reactivating pleasant memories has any effect on depression, in hopes of identifying new targets for drugs to treat depression and post-traumatic stress disorder.
David Anderson, a professor of biology at the California Institute of Technology, says the study makes an important contribution to neuroscientists’ fundamental understanding of the brain and also has potential implications for treating mental illness.
“This is a tour de force of modern molecular-biology-based methods for analyzing processes, such as learning and memory, at the neural-circuitry level. It’s one of the most sophisticated studies of this type that I’ve seen,” he says.


Stop and Listen: Study Shows How Movement Affects Hearing
When we want to listen carefully to someone, the first thing we do is stop talking. The second thing we do is stop moving altogether. This strategy helps us hear better by preventing unwanted sounds generated by our own movements.
This interplay between movement and hearing also has a counterpart deep in the brain. Indeed, indirect evidence has long suggested that the brain’s motor cortex, which controls movement, somehow influences the auditory cortex, which gives rise to our conscious perception of sound.
A new Duke study, appearing online August 27 in Nature, combines cutting-edge methods in electrophysiology, optogenetics and behavioral analysis to reveal exactly how the motor cortex, seemingly in anticipation of movement, can tweak the volume control in the auditory cortex.
The new lab methods allowed the group to “get beyond a century’s worth of very powerful but largely correlative observations, and develop a new, and really a harder, causality-driven view of how the brain works,” said the study’s senior author, Richard Mooney, Ph.D., a professor of neurobiology at Duke University School of Medicine and a member of the Duke Institute for Brain Sciences.
The findings contribute to the basic knowledge of how communication between the brain’s motor and auditory cortexes might affect hearing during speech or musical performance. Disruptions to the same circuitry may give rise to auditory hallucinations in people with schizophrenia.
In 2013, researchers led by Mooney first characterized the connections between motor and auditory areas in mouse brain slices as well as in anesthetized mice. The new study answers the critical question of how those connections operate in an awake, moving mouse.
“This is a major step forward in that we’ve now interrogated the system in an animal that’s freely behaving,” said David Schneider, a postdoctoral associate in Mooney’s lab.
Mooney suspects that the motor cortex learns how to mute responses in the auditory cortex to sounds that are expected to arise from one’s own movements while heightening sensitivity to other, unexpected sounds. The group is testing this idea.
“Our first step will be to start making more realistic situations where the animal needs to ignore the sounds that its movements are making in order to detect things that are happening in the world,” Schneider said.
In the latest study, the team recorded electrical activity of individual neurons in the brain’s auditory cortex. Whenever the mice moved — walking, grooming, or making high-pitched squeaks — neurons in their auditory cortex were dampened in response to tones played to the animals, compared to when they were at rest.
To find out whether movement was directly influencing the auditory cortex, researchers conducted a series of experiments in awake animals using optogenetics, a powerful method that uses light to control the activity of select populations of neurons that have been genetically sensitized to light. Like the game of telephone, sounds that enter the ear pass through six or more relays in the brain before reaching the auditory cortex.
"Optogenetics can be used to activate a specific relay in the network, in this case the penultimate node that relays signals to the auditory cortex," Mooney said.
About half of the suppression during movement was found to originate within the auditory cortex itself. “That says a lot of modulation is going on in the auditory cortex, and not just at earlier relays in the auditory system,” Mooney said.
More specifically, the team found that movement stimulates inhibitory neurons that in turn suppress the response of the auditory cortex to tones.
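The circuit described here — a motor signal driving local inhibitory interneurons that subtract from the tone response — can be sketched as a minimal rate model. The parameters are illustrative assumptions, not values fitted to the study's recordings.

```python
# Minimal rate-model sketch of movement-related suppression in auditory
# cortex: an M2 "I'm moving" signal drives local inhibitory
# interneurons, which subtract from the excitatory response to a tone.
# The inhibitory weight (0.5) is an illustrative assumption.

def tone_response(tone_drive, moving, w_inhib=0.5):
    """Excitatory response to a tone, gated by a movement signal."""
    m2_signal = 1.0 if moving else 0.0        # motor cortex (M2) input
    inhibition = w_inhib * m2_signal          # interneurons driven by M2
    return max(0.0, tone_drive - inhibition)  # rectified firing rate

rest = tone_response(1.0, moving=False)  # full response at rest
run = tone_response(1.0, moving=True)    # dampened during movement
assert run < rest
```

In the same spirit, the optogenetic experiments below amount to forcing `m2_signal` to 1 in a resting animal (reproducing the suppression) or to 0 in a moving one (restoring the resting response).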
The researchers then wondered what turns on the inhibitory neurons. The suspects were many. “The auditory cortex is like this giant switching station where all these different inputs come through and say, ‘Okay, I want to have access to these interneurons,’” Mooney said. “The question we wanted to answer is who gets access to them during movement?”
The team knew from previous experiments that neuronal projections from the secondary motor cortex (M2) modulate the auditory cortex. But to isolate M2’s relative contribution — something not possible with traditional electrophysiology — the researchers again used optogenetics, this time to switch on and off the M2’s inputs to the inhibitory neurons.
Turning on M2 inputs reproduced a sense of movement in the auditory cortex, even in mice that were resting, the group found. “We were sending a ‘Hey I’m moving’ signal to the auditory cortex,” Schneider said. Then the effect of playing a tone on the auditory cortex was much the same as if the animal had actually been moving — a result that confirmed the importance of M2 in modulating the auditory cortex. On the other hand, turning off M2 simulated rest in the auditory cortex, even when the animals were still moving.
“I couldn’t contain my excitement when we first saw that result,” said Anders Nelson, a neurobiology graduate student in Mooney’s group.


Researchers discover fever’s origin
Fever is a response to inflammation, triggered by the release of signaling substances called prostaglandins. Researchers at Linköping University can now see precisely where these substances are produced – a discovery that paves the way for smarter drugs.
When you take an aspirin, all production of prostaglandins in the body is suppressed. All symptoms of inflammation are eased simultaneously, including fever, pain and loss of appetite. But it might not always be desirable to get rid of all symptoms – there is a reason why they appear.
“Perhaps you want to inhibit loss of appetite but retain fever. In the case of serious infections, fever can be a good thing,” says David Engblom, senior lecturer in neurobiology at Linköping University.
Eleven years ago he had his first breakthrough as a researcher when he uncovered the mechanism behind the formation of prostaglandin E2 during fever. These signaling molecules cannot pass the blood-brain barrier, the purpose of which is to protect the brain from hazardous substances. Engblom showed that instead, they could be synthesised from two enzymes in the blood vessels on the inside of the brain, before moving to the hypothalamus, where the body’s thermostat is located.
Previous work from the research team described a very simple mechanism, but there was not yet proof that it was important in real life. The study, to be published in The Journal of Neuroscience with David Engblom and his doctoral student Daniel Wilhelms as lead authors, is based on tests with mice that lack the enzymes COX-2 and mPGES-1 in the brain’s blood vessels. When the mice were infected with bacterial toxins, the fever did not appear, while other signs of inflammation were not affected.
“This shows that those prostaglandins which cause fever are formed in the blood-brain barrier – nowhere else. Now it will be interesting to investigate the other inflammation symptoms. Knowledge of this type can be useful when developing drugs that ease certain symptoms, but not all of them,” explains David Engblom.
For many years there has been debate as to where the fever signaling originates. Three alternative ideas have been proposed: firstly, that it comes from prostaglandins circulating in the blood; secondly, that it comes from immune cells in the brain; and thirdly, Engblom’s theory, which stresses the importance of the brain’s blood vessels. The third proposal can now be considered confirmed.


How the Brain Makes Sense of Spaces, Large and Small
When an animal encounters a new environment, the neurons in its brain that are responsible for mapping out the space are ready for anything. So says a new study in which scientists at the Howard Hughes Medical Institute’s Janelia Research Campus examined neuronal activity in rats as they explored an unusually large maze for the first time.
The researchers found that neurons in the brain’s hippocampus, where information about people, places, and events is stored, each contribute to an animal’s mental map at their own rate. Some neurons begin to associate themselves with the new space immediately, while others hold back, contributing only if the space expands beyond a size that can be represented by the first-line neurons. Similar mechanisms may be at play as the human brain records a new experience, says Janelia group leader Albert Lee, who led the study. Lee, graduate student Dylan Rich, and Hua-Peng Liaw, a technician in Lee’s lab, published their findings in the August 15, 2014, issue of the journal Science.
“The hippocampus has to represent arbitrary things,” Lee says. “When a new experience begins, we don’t know how long it’s going to last, and the brain has to form a new representation on the fly. This mechanism means that the hippocampus doesn’t have to adjust its representation if an environment is larger than predicted, or if an experience goes on longer than expected.”
As an animal explores a new environment, cells in its hippocampus fire to mark new places that it encounters. The cells, called place cells, fire randomly, but become associated with the shapes, smells, and other sensory cues present in that location. In humans, analogous cells store memories of people, places, facts, and events.
In rodents, about a third of the cells in the region of the hippocampus devoted to spatial learning participate in mapping a typical laboratory-sized maze. Different mazes are represented by different but overlapping sets of neurons. The differences between those sets allow the brain to distinguish between memories of different environments.
But what happens when an animal finds itself in an environment larger than a five-meter laboratory maze? In the wild, rats can traverse territories as long as 50 meters. Lee wanted to know how the hippocampus kept track of environments that placed greater demands on its neurons.
If cells continued to mark off space at the rate that scientists had observed in more confined environments, the animal’s mental map would quickly lose its uniqueness. “If every cell is active in the representation of a single space, then you can’t use this mechanism to distinguish memories of different things,” Lee points out. 
So Lee and his team stocked up on supplies from the hardware store and built their own maze, far larger than any that had been used previously to track place cell activity. The 48-meter maze wouldn’t fit inside Lee’s lab, so Lee, Rich, and Liaw set it up in a large cage-cleaning room at Janelia.
The room was busy during the week, so the team did their experiments on weekends. For multiple weekends over the course of about two years, Janelia’s vivarium staff would clear the room for them, and then the team would reassemble the maze and set up video cameras and electrophysiology equipment. The team recorded the activity of individual cells in the hippocampus as rats explored the maze for the first time. They first introduced the animals to a small portion of the maze, then gradually increased the territory to which the rats had access, monitoring how the brain added new information to its spatial map.
When the scientists analyzed their data, they discovered that from the time the rats entered the maze, their brains were ready to represent an environment of any size. “Instead of the hippocampus having to adjust in time as the animal notices that the maze gets larger, it anticipates all different sizes of mazes from the beginning,” Lee says. “It does this by dividing up its population of neurons so that certain ones are ready to represent smaller mazes, others are ready to represent medium-size mazes, and others, large ones.”
All of the neurons acted independently, firing randomly to mark off places in the maze. But some neurons had a greater propensity to mark off space than others, Lee explains. Some neurons mark space quickly and become associated with many places in the maze, whereas others are less likely to fire. These, Lee says, are reserved for mapping larger spaces.
In small environments, a subset of the cells that are most likely to mark off space – those that have a chance to fire while the animal explores – form the map on their own. In larger mazes, all of the neurons with a high propensity to mark space are recruited to the mapping effort, meaning they cannot be used to distinguish the representation of one large maze from another. That’s when the neurons with a lower tendency to fire step in, randomly marking space in a distinct, identifying set.
“There’s always a set of neurons that is just at the edge, where they are equally likely to represent any given environment versus not, regardless of what its size is,” Lee says. “Those are the neurons the brain can actually use to distinguish which environment its in.”
The system means the brain never has to adjust its representation of an environment as it is being created, Lee says. “All neurons are marking space at their own preferred rate, so there doesn’t have to be a mechanism to say, ‘you should fire because this maze is large or this maze is small.’ The hippocampus is ready for anything at any moment.”
Cells in the human brain may record events in a similar way, marking off time as an event unfolds without knowing how long it will continue, Lee says.

How the Brain Makes Sense of Spaces, Large and Small

When an animal encounters a new environment, the neurons in its brain that are responsible for mapping out the space are ready for anything. So says a new study in which scientists at the Howard Hughes Medical Institute’s Janelia Research Campus examined neuronal activity in rats as they explored an unusually large maze for the first time.

The researchers found that neurons in the brain’s hippocampus, where information about people, places, and events is stored, each contribute to an animal’s mental map at their own rate. Some neurons begin to associate themselves with the new space immediately, while others hold back, contributing only if the space expands beyond a size that can be represented by the first-line neurons. Similar mechanisms may be at play as the human brain records a new experience, says Janelia group leader Albert Lee, who led the study. Lee, graduate student Dylan Rich, and Hua-Peng Liaw, a technician in Lee’s lab, published their findings in the August 15, 2014, issue of the journal Science.

“The hippocampus has to represent arbitrary things,” Lee says. “When a new experience begins, we don’t know how long it’s going to last, and the brain has to form a new representation on the fly. This mechanism means that the hippocampus doesn’t have to adjust its representation if an environment is larger than predicted, or if an experience goes on longer than expected.”

As an animal explores a new environment, cells in its hippocampus fire to mark new places that it encounters. The cells, called place cells, fire randomly, but become associated with the shapes, smells, and other sensory cues present in that location. In humans, analogous cells store memories of people, places, facts, and events.

In rodents, about a third of the cells in the region of the hippocampus devoted to spatial learning participate in mapping a typical laboratory-sized maze. Different mazes are represented by different but overlapping sets of neurons. The differences between those sets allow the brain to distinguish between memories of different environments.

But what happens when an animal finds itself in an environment larger than a five-meter laboratory maze? In the wild, rats can traverse territories as long as 50 meters. Lee wanted to know how the hippocampus kept track of environments that placed greater demands on its neurons.

If cells continued to mark off space at the rate that scientists had observed in more confined environments, the animal’s mental map would quickly lose its uniqueness. “If every cell is active in the representation of a single space, then you can’t use this mechanism to distinguish memories of different things,” Lee points out. 

So Lee and his team stocked up on supplies from the hardware store and built their own maze, far larger than any that had been used previously to track place cell activity. The 48-meter maze wouldn’t fit inside Lee’s lab, so Lee, Rich, and Liaw set it up in a large cage-cleaning room at Janelia.

The room was busy during the week, so the team did their experiments on weekends. For multiple weekends over the course of about two years, Janelia’s vivarium staff would clear the room for them, and then the team would reassemble the maze and set up video cameras and electrophysiology equipment. The team recorded the activity of individual cells in the hippocampus as rats explored the maze for the first time. They first introduced the animals to a small portion of the maze, then gradually increased the territory to which the rats had access, monitoring how the brain added new information to its spatial map.

When the scientists analyzed their data, they discovered that from the time the rats entered the maze, their brains were ready to represent an environment of any size. “Instead of the hippocampus having to adjust in time as the animal notices that the maze gets larger, it anticipates all different sizes of mazes from the beginning,” Lee says. “It does this by dividing up its population of neurons so that certain ones are ready to represent smaller mazes, others are ready to represent medium-size mazes, and others, large ones.”

All of the neurons acted independently, firing randomly to mark off places in the maze. But some neurons had a greater propensity to mark off space than others, Lee explains. Some neurons mark space quickly and become associated with many places in the maze, whereas others are less likely to fire. These, Lee says, are reserved for mapping larger spaces.

In small environments, a subset of the cells that are most likely to mark off space – those that have a chance to fire while the animal explores – form the map on their own. In larger mazes, all of the neurons with a high propensity to mark space are recruited to the mapping effort, meaning they cannot be used to distinguish the representation of one large maze from another. That’s when the neurons with a lower tendency to fire step in, randomly marking space in a distinct, identifying set.

“There’s always a set of neurons that is just at the edge, where they are equally likely to represent any given environment versus not, regardless of what its size is,” Lee says. “Those are the neurons the brain can actually use to distinguish which environment it’s in.”

The system means the brain never has to adjust its representation of an environment as it is being created, Lee says. “All neurons are marking space at their own preferred rate, so there doesn’t have to be a mechanism to say, ‘you should fire because this maze is large or this maze is small.’ The hippocampus is ready for anything at any moment.”
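The division of labor Lee describes can be illustrated with a toy simulation. This is entirely our own sketch, not the study’s model: the log-normal distribution of per-metre recruitment rates and the Poisson-style recruitment rule are assumptions made for illustration. Each cell gets a fixed rate; its chance of joining a map rises with maze length, so high-rate cells saturate in a 48-meter maze while a different band of cells sits “at the edge” for each maze size:

```python
import numpy as np

# Toy model of propensity-based place-cell recruitment (illustrative only;
# the rate distribution and Poisson assumption are ours, not the study's).
rng = np.random.default_rng(0)
rates = rng.lognormal(mean=-3.0, sigma=1.5, size=1000)  # per-metre rates

def p_active(rates, maze_len_m):
    # Probability a cell fires somewhere in the maze, treating each cell
    # as an independent Poisson process along the track.
    return 1.0 - np.exp(-rates * maze_len_m)

small = p_active(rates, 5.0)    # a typical laboratory maze
large = p_active(rates, 48.0)   # the Janelia maze
top = rates > np.quantile(rates, 0.9)  # highest-propensity cells

# High-propensity cells are nearly all active in the large maze, so they
# can no longer distinguish one large environment from another...
print(f"top-decile cells active in 48 m maze: {large[top].mean():.2f}")

# ...while the cells near 50% participation ("at the edge") are a
# different band of cells for each maze size.
edge_small = ((small > 0.3) & (small < 0.7)).sum()
edge_large = ((large > 0.3) & (large < 0.7)).sum()
print(f"edge cells: {edge_small} (5 m) vs {edge_large} (48 m)")
```

Under these assumptions, the cells near the edge for the 5-meter maze are fully saturated by 48 meters, and a previously quiet band takes over the distinguishing role, mirroring Lee’s description of a population that is “ready for anything.”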

Cells in the human brain may record events in a similar way, marking off time as an event unfolds without knowing how long it will continue, Lee says.

Filed under hippocampus neural activity place cells neurons memory neuroscience science

102 notes

Brain Benefits From Weight Loss Following Bariatric Surgery
Weight loss surgery can curb alterations in brain activity associated with obesity and improve cognitive function involved in planning, strategizing and organizing, according to a new study published in the Endocrine Society’s Journal of Clinical Endocrinology & Metabolism (JCEM).
Obesity can tax the brain as well as other organs. Obese individuals face a 35 percent higher risk of developing Alzheimer’s disease compared to normal weight people.
Bariatric surgery is used to help people who are dangerously obese lose weight. Bariatric procedures are designed either to restrict the amount of food you can eat before feeling full, by reducing the stomach’s size, or to limit the absorption of nutrients, by removing part of the small intestine from the path food takes through the digestive tract. Some procedures, such as Roux-en-Y gastric bypass (RYGB) surgery, use a combination of these methods. This study was the first to assess brain activity in women before and after bariatric surgery.
“When we studied obese women prior to bariatric surgery, we found some areas of their brains metabolized sugars at a higher rate than normal weight women,” said one of the study’s authors, Cintia Cercato, MD, PhD, of the University of São Paulo in São Paulo, Brazil. “In particular, obesity led to altered activity in a part of the brain linked to the development of Alzheimer’s disease – the posterior cingulate gyrus. Since bariatric surgery reversed this activity, we suspect the procedure may contribute to a reduced risk of Alzheimer’s disease and other forms of dementia.”
The longitudinal study examined the effect of RYGB surgery on the brain function of 17 obese women. Researchers used positron emission tomography (PET) scans and neuropsychological tests to assess brain function and activity in the participants prior to surgery and six months after the procedure. The same tests also were run once on a control group of 16 lean women.
Before they underwent surgery, the obese women had higher rates of metabolism in certain areas of the brain, including the posterior cingulate gyrus. Following surgery, there was no evidence of this exacerbated brain activity. Their brain metabolism rates were comparable to the activity seen in normal weight women.
After surgery, the obese women also performed better on a test measuring executive function – the brain’s ability to connect past experience and present action – than they did before the procedures. Executive function is used in planning, organizing and strategizing. Five other neuropsychological tests measuring various aspects of memory and cognitive function showed no change following the surgery.
“Our findings suggest the brain is another organ that benefits from weight loss induced by surgery,” Cercato said. “The increased brain activity the obese women exhibited before undergoing surgery did not result in improved cognitive performance, which suggests obesity may force the brain to work harder to achieve the same level of cognition.”
(Image: Getty)


Filed under brain activity cognitive function obesity weight loss neuroscience science

85 notes

Focus on naturally occurring protein to tackle dementia

Scientists at the University of Warwick have provided the first evidence that the lack of a naturally occurring protein is linked to early signs of dementia.

Published in Nature Communications, the research found that the absence of the protein MK2/3 promotes structural and physiological changes to cells in the nervous system. These changes were shown to have a significant correlation with early signs of dementia, including restricted learning and memory formation capabilities.

An absence of MK2/3, in spite of the brain cells (neurons) having significant structural abnormalities, did not prevent memories being formed, but did prevent these memories from being altered.

The results have led the researchers to call for greater attention to be paid to studying MK2/3.

Lead researcher and author Dr Sonia Corrêa says that “Understanding how the brain functions from the sub-cellular to systems level is vital if we are to be able to develop ways to counteract changes that occur with ageing.

“By demonstrating for the first time that the MK2/3 protein, which is essential for neuron communication, is required to fine-tune memory formation this study provides new insight into how molecular mechanisms regulate cognition”.

Neurons can adapt memories and make them more relevant to current situations by changing the way they communicate with other cells.

Information in the brain is transferred between neurons at synapses using chemicals (neurotransmitters) released from one (presynaptic) neuron which then act on receptors in the next (postsynaptic) neuron in the chain.

MK2/3 regulates the shape of spines in properly functioning postsynaptic neurons. Postsynaptic neurons with MK2/3 feature wider, shorter spines (Fig. 1) than those without (Fig. 2).

The researchers found that this change in spine shape, caused by MK2/3’s absence, restricts the ability of neurons to communicate with each other, leading to alterations in the ability to acquire new memories.

“Deterioration of brain function commonly occurs as we get older but, as a result of dementia or other neurodegenerative diseases, it can occur earlier in people’s lives”, says Dr Corrêa. “For those who develop the early signs of dementia it becomes more difficult for them to adapt to changes in their life, including performing routine tasks.

“For example, washing the dishes; if you have washed them by hand your whole life and then buy a dishwasher it can be difficult for those people who are older or have dementia to acquire the new memories necessary to learn how to use the machine and mentally replace the old method of washing dishes with the new. The change in shape of the postsynaptic neuron due to absence of MK2/3 is strongly correlated with this inability to acquire the new memories”.

Dr Corrêa argues that “Given their vital role in memory formation, MK2/3 pathways are important potential pharmaceutical targets for the treatment of cognitive deficits associated with ageing and dementia.”

Filed under aging dementia learning MK2/3 memory formation synaptic plasticity neuroscience science

153 notes

Wii Balance Board Induces Changes in the Brains of MS Patients
A balance board accessory for a popular video game console can help people with multiple sclerosis (MS) reduce their risk of accidental falls, according to new research published online in the journal Radiology. Magnetic resonance imaging (MRI) scans showed that use of the Nintendo Wii Balance Board system appears to induce favorable changes in brain connections associated with balance and movement.
Balance impairment is one of the most common and disabling symptoms of MS, a disease of the central nervous system in which the body’s immune system attacks the protective sheath around nerve fibers. Physical rehabilitation is often used to preserve balance, and one of the more promising new tools is the Wii Balance Board System, a battery-powered device about the size and shape of a bathroom scale. Users stand on the board and shift their weight as they follow the action on the television screen during games like slalom skiing.
While Wii balance board rehabilitation has been reported as effective in patients with MS, little is known about the underlying physiological basis for any improvements in balance.
Researchers recently used an MRI technique called diffusion tensor imaging (DTI) to study changes in the brains of 27 MS patients who underwent a 12-week intervention using Wii balance board-based visual feedback training. DTI is a non-conventional MRI technique that allows detailed analysis of the white matter tracts that transmit nervous signals through the brain and body.
MRI scans of the MS patients showed significant effects in nerve tracts that are important in balance and movement. The changes seen on MRI correlated with improvements in balance as measured by an assessment technique called posturography.
These brain changes in MS patients are likely a manifestation of neural plasticity, or the ability of the brain to adapt and form new connections throughout life, according to lead author Luca Prosperini, M.D., Ph.D., from Sapienza University in Rome, Italy.
"The most important finding in this study is that a task-oriented and repetitive training aimed at managing a specific symptom is highly effective and induces brain plasticity," he said. "More specifically, the improvements promoted by the Wii balance board can reduce the risk of accidental falls in patients with MS, thereby reducing the risk of fall-related comorbidities like trauma and fractures."
Dr. Prosperini noted that similar plasticity has been described in persons who play video games, but the exact mechanisms behind the phenomenon are still unknown. He hypothesized that changes can occur at the cellular level within the brain and may be related to myelination, the process of building the protective sheath around the nerves.
The rehabilitation-induced improvements did not persist after the patients discontinued the training protocol, Dr. Prosperini said, most likely because certain skills related to structural changes to the brain after an injury need to be maintained through training.
"This finding should have an important impact on the rehabilitation process of patients, suggesting that they need ongoing exercises to maintain good performance in daily living activities," Dr. Prosperini said.


Filed under MS diffusion tensor imaging myelination balance white matter posturography neuroscience science

402 notes

A long childhood feeds the hungry human brain
A five-year-old’s brain is an energy monster. It uses twice as much glucose (the energy that fuels the brain) as that of a full-grown adult, a new study led by Northwestern University anthropologists has found.
The study helps to solve the long-standing mystery of why human children grow so slowly compared with our closest animal relatives.
It shows that energy funneled to the brain dominates the human body’s metabolism early in life and is likely the reason why humans grow at a pace more typical of a reptile than a mammal during childhood.
Results of the study will be published the week of Aug. 25 in the journal Proceedings of the National Academy of Sciences.
"Our findings suggest that our bodies can’t afford to grow faster during the toddler and childhood years because a huge quantity of resources is required to fuel the developing human brain," said Christopher Kuzawa, first author of the study and a professor of anthropology at Northwestern’s Weinberg College of Arts and Sciences. "As humans we have so much to learn, and that learning requires a complex and energy-hungry brain."
Kuzawa also is a faculty fellow at the Institute for Policy Research at Northwestern.
The study is the first to pool existing PET and MRI brain scan data — which measure glucose uptake and brain volume, respectively — to show that the ages when the brain gobbles the most resources are also the ages when body growth is slowest. At 4 years of age, when this “brain drain” is at its peak and body growth slows to its minimum, the brain burns through resources at a rate equivalent to 66 percent of what the entire body uses at rest.
The findings support a long-standing hypothesis in anthropology that children grow so slowly, and are dependent for so long, because the human body needs to shunt a huge fraction of its resources to the brain during childhood, leaving little to be devoted to body growth. It also helps explain some common observations that many parents may have.
"After a certain age it becomes difficult to guess a toddler or young child’s age by their size," Kuzawa said. "Instead you have to listen to their speech and watch their behavior. Our study suggests that this is no accident. Body growth grinds nearly to a halt at the ages when brain development is happening at a lightning pace, because the brain is sapping up the available resources."
It was previously believed that the brain’s resource burden on the body was largest at birth, when the size of the brain relative to the body is greatest. The researchers found instead that the brain maxes out its glucose use at age 5. At age 4 the brain consumes glucose at a rate comparable to 66 percent of the body’s resting metabolic rate (or more than 40 percent of the body’s total energy expenditure).
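The two percentages quoted above also bound how much energy a young child’s body spends overall. A quick check with the article’s round numbers (our arithmetic, assuming both shares describe the same brain energy budget):

```python
# The article's figures: at about age 4 the brain uses glucose at a rate
# equal to 66% of resting metabolism (RMR) and "more than 40%" of total
# energy expenditure (TEE).
brain_vs_rmr = 0.66
brain_vs_tee = 0.40  # lower bound, from "more than 40 percent"

# brain = 0.66 * RMR and brain > 0.40 * TEE  =>  TEE < 0.66/0.40 * RMR
tee_over_rmr_max = brain_vs_rmr / brain_vs_tee
print(f"TEE is at most {tee_over_rmr_max:.2f}x resting metabolism")
```

A total expenditure of at most about 1.65 times resting metabolism is low for a mammal, which fits Leonard’s point below that children at this age are comparatively inactive.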
"The mid-childhood peak in brain costs has to do with the fact that synapses, connections in the brain, max out at this age, when we learn so many of the things we need to know to be successful humans," Kuzawa said.
"At its peak in childhood, the brain burns through two-thirds of the calories the entire body uses at rest, much more than other primate species," said William Leonard, co-author of the study. "To compensate for these heavy energy demands of our big brains, children grow more slowly and are less physically active during this age range. Our findings strongly suggest that humans evolved to grow slowly during this time in order to free up fuel for our expensive, busy childhood brains."


Filed under brain development childhood glucose neuroimaging plasticity neuroscience science

127 notes

SA’s Taung Child’s skull and brain not human-like in expansion

The Taung Child, South Africa’s premier hominin fossil, discovered 90 years ago by Wits University Professor Raymond Dart, never ceases to reshape the search for our collective origins.

By subjecting the skull of the first australopith ever discovered to the latest technologies in the Wits University Microfocus X-ray Computed Tomography (CT) facility, researchers are now casting doubt on theories that Australopithecus africanus shows the same cranial adaptations found in modern human infants and toddlers – in effect undermining support for the idea that this early hominin shows infant brain development in the prefrontal region similar to that of modern humans.

The results were published online in the prestigious journal Proceedings of the National Academy of Sciences (PNAS) on Monday, 25 August 2014 at 21:00 SAST (15:00 EST), in an article titled: New high resolution CT data of the Taung partial cranium and endocast and their bearing on metopism and hominin brain evolution.

The Taung Child has historical and scientific importance in the fossil record as the first and best example of early hominin brain evolution, and theories have been put forward that it exhibits key cranial adaptations found in modern human infants and toddlers.

To test the ancientness of this evolutionary adaptation, Dr Kristian J. Carlson, Senior Researcher at the Evolutionary Studies Institute at the University of the Witwatersrand, and colleagues Professor Ralph L. Holloway from Columbia University and Douglas C. Broadfield from Florida Atlantic University performed an in silico dissection of the Taung fossil using high-resolution computed tomography.

"A recent study has described the roughly 3 million-year-old fossil, thought to have belonged to a 3 to 4-year-old, as having a persistent metopic suture and open anterior fontanelle, two features that facilitate post-natal brain growth in human infants when their disappearance is delayed," said Carlson.

Comparisons with the existing hominin fossil record and with chimpanzee variation do not support this evolutionary scenario.

Citing deficiencies in how the Taung fossil material has recently been assessed, the researchers argue that the physical evidence does not incontrovertibly link features of the Taung skull, or its endocast, to early expansion of the prefrontal lobe, a brain region implicated in many human behaviors.

The authors also question the previously offered theoretical basis for this adaptation in A. africanus. By refuting the presence of these features in the Taung Child, the researchers dispute whether these structures were selectively advantageous in hominin evolution, particularly in australopiths.

Thus, results of the new study show that there is still no evidence for this kind of skull adaptation evolving before Homo, nor for a link between such skull characteristics and the proposed accompanying early prefrontal lobe expansion, Carlson said.

Filed under taung child hominin evolution prefrontal cortex brain development neuroscience science

113 notes

Scientists Uncover Navigation System Used by Cancer, Nerve Cells

Duke University researchers have found a “roving detection system” on the surface of cells that may point to new ways of treating diseases like cancer, Parkinson’s disease and amyotrophic lateral sclerosis (ALS).

These invasive cells, studied in nematode worms, can break through normal tissue boundaries and burrow into other tissues and organs — a crucial step in many normal developmental processes, from embryonic development and wound healing to the formation of new blood vessels.

But sometimes the process goes awry. Such is the case with metastatic cancer, in which cancer cells spread unchecked from where they originated and form tumors in other parts of the body.

“Cell invasion is one of the most clinically relevant yet least understood aspects of cancer progression,” said David Sherwood, an associate professor of biology at Duke.

Sherwood is leading a team that is investigating the molecular mechanisms that control cell invasion in both normal development and cancer, using a one-millimeter worm known as C. elegans.

At one point in C. elegans development, a specialized cell called the anchor cell breaches the dense, sheet-like membrane that separates the worm’s uterus from its vulva, opening up the worm’s reproductive tract.

Anchor cells can’t see, so they need some kind of signal to tell them where to break through. In a 2009 study, Sherwood and colleagues discovered that an extracellular cue called netrin orients the anchor cell so that it invades in the right direction.

In a new study appearing Aug. 25 in the Journal of Cell Biology, the team shows how receptors on the invasive cells essentially rove around the cell membrane “hunting” for the missing netrin signal that will guide the cell to the correct location.

The researchers used a video camera attached to a powerful microscope to take time-lapse movies of the slow movement of the C. elegans anchor cell during its invasion (Figure 1, Figure 2).

Their time-lapse analyses reveal that when netrin production is blocked, netrin receptors on the surface of the anchor cell periodically cluster, disperse and reassemble in a different region of the cell membrane. The receptors cluster alongside patches of actin filaments — thin, flexible fibers that help cells change shape and form invasive protrusions — that pop up in each new spot.

“It’s kind of like a missile detection system,” Sherwood said.

Rather than the whole cell having to move, its receptors rove around the outside of the cell until they pick up a signal. Once the receptors locate the netrin signal, they stabilize in the region of the cell membrane closest to its source.
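The hunt-and-stabilize behavior described above can be caricatured in a few lines of code. This is a deliberately crude sketch — a one-dimensional “membrane”, random reassembly sites, and a fixed detection threshold are all assumptions for illustration, not details from the paper; it only captures the search-until-signal logic:

```python
import random

# Toy sketch of the "roving detection" idea (not a biophysical model):
# a receptor cluster repeatedly disperses and reassembles at random sites
# on the membrane until one site lands close enough to the netrin source
# to stabilize there.
def rove_for_signal(source_pos, threshold, max_cycles=1000, seed=0):
    rng = random.Random(seed)
    for cycle in range(1, max_cycles + 1):
        cluster_pos = rng.uniform(0.0, 1.0)  # new reassembly site on the membrane
        if abs(cluster_pos - source_pos) < threshold:
            return cycle, cluster_pos  # cluster stabilizes near the signal
    return None, None  # signal never detected within the budget

cycles, pos = rove_for_signal(source_pos=0.25, threshold=0.05)
print(f"stabilized after {cycles} reassembly cycles at position {pos:.2f}")
```

The point of the sketch is the contrast the researchers draw: the search happens by relocating the detector, not the cell.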

The findings redefine decades-old ideas about how the cell’s navigation system works. “Cells don’t just passively respond to the netrin signal — they’re actively searching for it,” Sherwood said.

Given that netrin has been found to promote cell invasion in some of the most lethal cancers, the findings could lead to new treatment strategies. Disrupting the cell’s netrin detection system, for example, could prevent cancer cells from finding their way to the bloodstream or the lymphatic system and stop them from metastasizing, or becoming invasive and spreading throughout the body.

“One of the things we’re gearing up to do next are drug screens with our collaborators to see if we can block this detection system during invasion,” Sherwood said.

Scientists have also known for years that netrin plays a key role in wiring the brain and nervous system by guiding developing nerve cells as they grow and form connections.

This means the results could also point to new ways of treating neurological disorders like Parkinson’s and ALS and recovering from spinal cord injuries.

Tinkering with the cell’s netrin detection machinery, for example, may make it possible to encourage damaged cells in the central nervous system — which normally have limited ability to regenerate — to regrow.

(Source: today.duke.edu)

Filed under C. elegans netrin cancer cells nerve cells neuroscience science
