Neuroscience

Articles and news from the latest research reports.

Brain Connectivity Can Predict Epilepsy Surgery Outcomes

A discovery from Case Western Reserve and Cleveland Clinic researchers could give epilepsy patients invaluable advance guidance about their chances of improving symptoms through surgery.

Assistant Professor of Neurosciences Roberto Fernández Galán, PhD, and his collaborators have identified a new, far more accurate way to determine precisely which portions of the brain suffer from the disease. This approach can help patients and physicians judge whether temporal lobe surgery will provide the results they seek.

“Our analysis of neuronal activity in the temporal lobe allows us to determine whether it is diseased, and therefore, whether removing it with surgery will be beneficial for the patient,” said Galán, the paper’s senior author. “In terms of accuracy and efficiency, our analysis method is a significant improvement relative to current approaches.”

The findings appear in research published October 30 in the open access journal PLOS ONE.

About one-third of patients with temporal lobe epilepsy do not respond to medical treatment and opt for lobectomies to alleviate their symptoms. Yet the surgery’s success rate is only 60 to 70 percent because of the difficulty of identifying the diseased brain tissue prior to the procedure.

Galán and investigators from Cleveland Clinic determined that using intracranial electroencephalography (iEEG) to measure patients’ functional neural connectivity – that is, the communication from one brain region to another – identified the epileptic lobe with 87 percent accuracy. An iEEG records electrical activity with electrodes implanted in the brain. Key indicators of a diseased lobe are connections that are both weak and unusually similar to one another.
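The connectivity measure itself can be illustrated simply. The sketch below is a minimal toy version, not the authors’ pipeline: it approximates functional connectivity as pairwise correlations between iEEG channels and reduces the two indicators above – weak connections and similar connections – to the mean and spread of those correlations.

```python
import numpy as np

def connectivity_summary(ieeg):
    """Summarize functional connectivity for one set of electrodes.

    ieeg : array of shape (n_channels, n_samples), spike-free iEEG.
    Returns the mean and standard deviation of all pairwise channel
    correlations. Per the study's indicators, a diseased lobe should
    show connections that are weak (low mean) and similar (low spread).
    """
    corr = np.corrcoef(ieeg)                       # channel-by-channel correlations
    pairs = corr[np.triu_indices_from(corr, k=1)]  # each pair once, no diagonal
    return pairs.mean(), pairs.std()

# Hypothetical recording: 8 electrodes, 10 seconds at 1 kHz
rng = np.random.default_rng(0)
data = rng.standard_normal((8, 10_000))
strength, spread = connectivity_summary(data)
print(f"mean connectivity {strength:.3f}, spread {spread:.3f}")
```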

In the retrospective study, Galán and Arun Antony, MD, formerly a senior clinical fellow in the Epilepsy Center at Cleveland Clinic and now an assistant professor of neurology at the University of Pittsburgh, examined data from 23 patients with temporal lobe epilepsy who had all or part of their temporal lobes removed after iEEG evaluations performed at Cleveland Clinic. The researchers examined the results of patients’ preoperative iEEG to determine the degree of functional connectivity that was associated with successful surgical outcomes.

“The concept of functional connectivity has been extensively studied by basic science researchers, but has not yet found a way into the realm of clinical epilepsy treatment,” said Antony, the paper’s first author. “Our discovery is another step towards the use of measures of functional connectivity in making clinical decisions in the treatment of epilepsy.”

As a standard preoperative test for lobectomy surgery, physicians analyze iEEG traces looking for simultaneous discharges of neurons that appear as spikes in the recordings, which indicate epileptic activity. The new PLOS ONE study evaluates the data differently, examining normal brain activity in the absence of spikes and inferring connectivity from it.

(Source: newswise.com)

Filed under epilepsy brain activity lobectomy intracranial electroencephalography neuroscience science

Gene Found To Foster Synapse Formation In The Brain

Researchers at Johns Hopkins say they have found that a gene already implicated in human speech disorders and epilepsy is also needed for vocalizations and synapse formation in mice. The finding, they say, adds to scientific understanding of how language develops, as well as the way synapses — the connections among brain cells that enable us to think — are formed. A description of their experiments appears in Science Express on Oct. 31.

A group led by Richard Huganir, Ph.D., director of the Solomon H. Snyder Department of Neuroscience and a Howard Hughes Medical Institute investigator, set out to investigate genes involved in synapse formation. Gek-Ming Sia, Ph.D., a research associate in Huganir’s laboratory, first screened hundreds of human genes for their effects on lab-grown mouse brain cells. When one gene, SRPX2, was turned up higher than normal, it caused the brain cells to erupt with new synapses, Sia found.

When Huganir’s team injected fetal mice with an SRPX2-blocking compound, the mice showed fewer synapses than normal mice even as adults, the researchers found. In addition, when SRPX2-deficient mouse pups were separated from their mothers, they did not emit high-pitched distress calls as other pups do, indicating they lacked the rodent equivalent of early language ability.

Other researchers’ analyses of the human genome have found that mutations in SRPX2 are associated with language disorders and epilepsy. When Huganir’s team introduced human SRPX2 carrying those same mutations into fetal mice, the resulting pups also showed vocalization deficits.

Another research group at Institut de Neurobiologie de la Méditerranée in France had previously shown that SRPX2 interacts with FoxP2, a gene that has gained wide attention for its apparently crucial role in language ability.

Huganir’s team confirmed this, showing that FoxP2 controls how much protein the SRPX2 gene makes and may affect language in this way. “FoxP2 is famous for its role in language, but it’s actually involved in other functions as well,” Huganir comments. “SRPX2 appears to be more specialized to language ability.” Huganir suspects that the gene may also be involved in autism, since autistic patients often have language impairments, and the condition has been linked to defects in synapse formation.

This study is only the beginning of teasing out how SRPX2 acts on the brain, Sia says. “We’d like to find out what other proteins it acts on, and how exactly it regulates synapses and enables language development.”

Filed under synapses language development autism epilepsy genetics neuroscience science

Exposure to Cortisol-Like Medications Before Birth May Contribute to Emotional Problems and Brain Changes

Neonatologists seem to perform miracles in the fight to support the survival of babies born prematurely.

To promote their survival, cortisol-like drugs called glucocorticoids are frequently administered to women in preterm labor to accelerate their babies’ lung maturation prior to birth. Cortisol is a substance naturally released by the body when stressed. But the levels of glucocorticoids administered to promote lung development are higher than those produced by typical stress, perhaps mirrored only in the body’s reaction to extreme stress.

The benefit of glucocorticoids is undisputed, and the treatment has certainly saved the lives of countless babies, but the exposure may also have some negative consequences. Indeed, excessive glucocorticoid levels may affect brain development, perhaps contributing to emotional problems later in life.

In this issue of Biological Psychiatry, Dr. Elysia Davis at the University of Denver and her colleagues report new findings on the effects of synthetic glucocorticoid on human brain development. Their study focused on healthy children who were born full-term, avoiding the confounding effects of premature birth.

The investigators conducted brain imaging sessions with, and carefully assessed, 54 children aged 6 to 10 years. The mothers of the participating children also completed reports on their child’s behavior. The researchers then divided the children into two groups: those who were exposed to glucocorticoids prenatally and those who were not.

In this study, children with fetal glucocorticoid exposure showed significant cortical thinning, and a thinner cortex also predicted more emotional problems. One particularly affected region, the rostral anterior cingulate cortex, was 8 to 9 percent thinner among children exposed to glucocorticoids. Interestingly, other studies have shown that this region of the brain is affected in individuals diagnosed with mood and anxiety disorders.
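For readers who want the shape of such a group comparison, here is a minimal sketch. The thickness values are invented for illustration, not data from the study; the code simply shows how a between-group difference and the percent thinning would be computed.

```python
import numpy as np
from scipy import stats

# Hypothetical per-child mean cortical thickness (mm) in the rostral
# anterior cingulate cortex; values are illustrative only.
exposed   = np.array([2.51, 2.48, 2.60, 2.42, 2.55, 2.47])
unexposed = np.array([2.74, 2.69, 2.81, 2.66, 2.77, 2.72])

# Two-sample t-test for a group difference in thickness
t, p = stats.ttest_ind(exposed, unexposed)

# Percent thinning of the exposed group relative to the unexposed group
thinning = 100 * (1 - exposed.mean() / unexposed.mean())
print(f"t = {t:.2f}, p = {p:.4f}, thinning = {thinning:.1f}%")
```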

"Fetal exposure to a frequently administered stress hormone is associated with consequences for child brain development that persist for at least 6 to 10 years. These neurological changes are associated with increased risk for stress and emotional problems," Davis explained of their findings. "Importantly, these findings were observed among healthy children born full term."

Although such a finding does not indicate that glucocorticoids ‘caused’ these changes, the researchers did determine that the findings cannot be explained by any obvious confounding differences between the groups. The two groups did not differ on weight or gestational age at birth, Apgar scores, maternal factors, or any other basic demographics. Thus, the findings do suggest that glucocorticoid administration may somehow alter the trajectory of brain development in exposed children.

"This study provides evidence that prenatal exposure to stress hormones shapes the construction of the fetal nervous system with consequences for the developing brain that persist into the preadolescent period," she added.

"This study highlights potential links between early cortisol exposure, cortical thinning and mood symptoms in children. It may provide important insights into the development of the brain and the long-term impact of maternal stress," commented Dr. John Krystal, Editor of Biological Psychiatry.

(Source: elsevier.com)

Filed under stress glucocorticoids cortisol brain development psychology neuroscience science

Babies can learn their first lullabies in the womb

An infant can recognise a lullaby heard in the womb for several months after birth, potentially supporting later speech development. This is indicated in a new study at the University of Helsinki.

The study focused on 24 women during the final trimester of their pregnancies. Half of the women played the melody of Twinkle Twinkle Little Star to their fetuses five days a week for the final stages of their pregnancies. The brains of the babies who heard the melody in utero reacted more strongly to the familiar melody both immediately and four months after birth when compared with the control group. These results show that fetuses can recognise and remember sounds from the outside world.
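The “stronger reaction” reported here is typically quantified from event-related potentials: EEG epochs locked to note onsets are averaged so that activity unrelated to the stimulus cancels out, and the groups are compared on response amplitude. Below is a minimal sketch with invented data; the sampling rate, time window, and array shapes are assumptions for the example, not details from the paper.

```python
import numpy as np

def erp_amplitude(epochs, fs=500, window=(0.2, 0.5)):
    """Mean event-related potential amplitude in a time window.

    epochs : array (n_trials, n_samples) of EEG segments locked to
    note onsets. Averaging over trials cancels activity that is not
    time-locked to the melody.
    """
    erp = epochs.mean(axis=0)
    start, stop = (int(t * fs) for t in window)
    return np.abs(erp[start:stop]).mean()

# Hypothetical one-second epochs at 500 Hz for one baby per group;
# the constant offset stands in for a genuine stimulus response.
rng = np.random.default_rng(1)
melody_group  = rng.standard_normal((100, 500)) + 0.3
control_group = rng.standard_normal((100, 500))
print(erp_amplitude(melody_group), erp_amplitude(control_group))
```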

This is significant for early rehabilitation, since rehabilitation aims at long-term changes in the brain.

“Even though our earlier research indicated that fetuses could learn minor details of speech, we did not know how long they could retain the information. These results show that babies are capable of learning at a very young age, and that the effects of the learning remain apparent in the brain for a long time,” explains Eino Partanen, who is currently finishing his dissertation at the Cognitive Brain Research Unit.

“This is the first study to track how long fetal memories remain in the brain. The results are significant, as studying the responses in the brain let us focus on the foundations of fetal memory. The early mechanisms of memory are currently unknown,” points out Dr Minna Huotilainen, principal investigator.

The researchers believe that song and speech are most beneficial for the fetus in terms of speech development. According to the current understanding, the processing of singing and speech in babies’ brains is partly based on shared mechanisms, so hearing a song can support a baby’s speech development. However, little is known about the possible detrimental effects that workplace noise can have on a fetus during the final trimester. An extensive research project on this topic is underway at the Finnish Institute of Occupational Health.

Filed under infants speech development memory learning psychology neuroscience science

Critical Gene in Retinal Development and Motion Sensing Identified

Our vision depends on exquisitely organized layers of cells within the eye’s retina, each with a distinct role in perception. Johns Hopkins researchers say they have taken an important step toward understanding how those cells are organized to produce what the brain “sees.” Specifically, they report identification of a gene that guides the separation of two types of motion-sensing cells, offering insight into how cellular layering develops in the retina, with possible implications for the brain’s cerebral cortex. A report on the discovery is published in the Nov. 1 issue of the journal Science.

“The separation of different types of cells into layers is critical to their ability to form the precise sets of connections with each other — the circuitry — that lets us process visual information,” says Alex Kolodkin, Ph.D., a professor in the Johns Hopkins University School of Medicine’s Solomon H. Snyder Department of Neuroscience and an investigator at the Howard Hughes Medical Institute. “There is still much to learn about how that separation happens during development, but we’ve identified for the first time proteins that enable two very similar types of cells to segregate into their own distinct neuronal layers.”

Kolodkin’s research group specializes in studying how circuitry forms among neurons (brain and nerve cells). Past experiments revealed that two types of proteins, called semaphorins and plexins, help guide this process. In the current study, Lu Sun, a graduate student in Kolodkin’s laboratory, focused on the genes that carry the blueprint for these proteins in two of the 10 layers of cells in the mammalian retina.

Those two layers are made up of so-called starburst amacrine cells (SACs). One type of SAC, known as “Off,” detects motion by sensing decreases in the amount of light hitting the retina, while the other type, “On,” detects increases in light. Sun examined the amounts of several semaphorin and plexin proteins being made by each type of cell, and found that only the “On” SACs were making a semaphorin called Sema6A. Sema6A can only work in the retina by interacting with its receptor, a plexin called PlexA2, but Sun found both types of SAC were churning out roughly equal amounts of PlexA2.

Reasoning that Sema6A might be the key difference that enabled the “On” and “Off” SACs to segregate from one another, Kolodkin’s team analyzed mice in which the genes for either Sema6A, PlexA2 or both could be switched off, and looked at the effects of this manipulation on their retinas. “Knocking out” either gene during development led the “On” and “Off” layers to run together, the team found, and caused abnormalities in the “On” SACs’ tree-like extensions. However, the “Off” SACs, which hadn’t been using their Sema6A gene in the first place, still looked and functioned normally.

“When signaling between Sema6A and PlexA2 was lost, not only was layering compromised, but the ‘On’ SACs lost both their distinctive symmetrical appearance, and, importantly, their motion-detecting ability,” Sun says. “This is evidence that the beautiful symmetric shape that gives starburst amacrine cells their name is necessary for their function.”

Adds Kolodkin, “We hope that learning how layering occurs in these very specific cell types will help us begin sorting out how connections are made not just in the retina, but also in neurons throughout the nervous system. Layering also occurs in the cerebral cortex, for example, which is responsible for thought and consciousness, and we really want to know how this is organized during neural development.”

(Source: newswise.com)

Filed under retinal development retina nerve cells amacrine cells cerebral cortex neuroscience science

Patient in ‘vegetative state’ not just aware, but paying attention

Research raises possibility of devices in the future to help some patients in a vegetative state interact with the outside world.

A patient in a seemingly vegetative state, unable to move or speak, showed signs of attentive awareness that had not been detected before, a new study reveals. This patient was able to focus on words signalled by the experimenters as auditory targets as successfully as healthy individuals. If this ability can be developed consistently in certain patients who are vegetative, it could open the door to specialised devices in the future and enable them to interact with the outside world.

The research, by scientists at the Medical Research Council Cognition and Brain Sciences Unit (MRC CBSU) and the University of Cambridge, is published today, 31 October, in the journal NeuroImage: Clinical.

For the study, the researchers used electroencephalography (EEG), which non-invasively measures electrical activity over the scalp, to test 21 patients diagnosed as vegetative or minimally conscious, and eight healthy volunteers. Participants heard a series of different words – one word a second, in 90-second runs – while asked to alternately attend to either the word ‘yes’ or the word ‘no’, each of which appeared 15% of the time. (Some examples of the other words used include moss, moth, worm and toad.) This was repeated several times over a period of 30 minutes to detect whether the patients were able to attend to the correct target word.
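The logic of such a paradigm can be sketched in a few lines. Under the standard assumption that an attended target evokes a larger late response (such as a P3) than an ignored one, comparing average epochs for the two target words gives a simple read-out of where attention was directed. This toy version is an illustration under those assumptions, not the authors’ analysis code.

```python
import numpy as np

def attended_target(yes_epochs, no_epochs, fs=250, window=(0.3, 0.6)):
    """Guess which target word ('yes' or 'no') was attended.

    Each input is an array (n_trials, n_samples) of EEG epochs locked
    to presentations of that word. The attended target should evoke
    the larger mean amplitude in a late time window.
    """
    start, stop = (int(t * fs) for t in window)
    def score(epochs):
        return np.abs(epochs.mean(axis=0)[start:stop]).mean()
    return "yes" if score(yes_epochs) > score(no_epochs) else "no"

# Hypothetical block: ~90 words, 15% each target word; the offset on
# the 'yes' epochs stands in for an attention-enhanced response.
rng = np.random.default_rng(2)
yes_epochs = rng.standard_normal((14, 250)) + 0.4
no_epochs  = rng.standard_normal((14, 250))
print(attended_target(yes_epochs, no_epochs))  # expected: 'yes'
```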

They found that one of the vegetative patients was able to filter out unimportant information and home in on relevant words they were being asked to pay attention to. Using brain imaging (fMRI), the scientists also discovered that this patient could follow simple commands to imagine playing tennis. They also found that three other minimally conscious patients reacted to novel but irrelevant words, but were unable to selectively pay attention to the target word.

These findings suggest that some patients in a vegetative or minimally conscious state might in fact be able to direct attention to the sounds in the world around them.

Dr Srivas Chennu of the University of Cambridge said: “Not only did we find the patient had the ability to pay attention, we also found independent evidence of their ability to follow commands – information which could enable the development of future technology to help patients in a vegetative state communicate with the outside world.

“In order to try and assess the true level of brain function and awareness that survives in the vegetative and minimally conscious states, we are progressively building up a fuller picture of the sensory, perceptual and cognitive abilities in patients. This study has added a key piece to that puzzle, and provided a tremendous amount of insight into the ability of these patients to pay attention.”

Dr Tristan Bekinschtein at the MRC Cognition and Brain Sciences Unit said: “Our attention can be drawn to something by its strangeness or novelty, or we can consciously decide to pay attention to it. A lot of cognitive neuroscience research tells us that we have distinct patterns in the brain for both forms of attention, which we can measure even when the individual is unable to speak. These findings mean that, in certain cases of individuals who are vegetative, we might be able to enhance this ability and improve their level of communication with the outside world.”

This study builds on a joint programme of research at the University of Cambridge and MRC CBSU where a team of researchers have been developing a series of diagnostic and prognostic tools based on brain imaging techniques since 1998. Famously, in 2006 the group was able to use fMRI imaging techniques to establish that a patient in a vegetative state could respond to yes or no questions by indicating different, distinct patterns of brain activity.

Filed under consciousness vegetative state neuroimaging attention brain mapping neuroscience science

Incurable Brain Cancer Gene Is Silenced

Gene regulation technology increases survival rates in mice with glioblastoma

Glioblastoma multiforme (GBM), the brain cancer that killed Sen. Edward Kennedy and kills approximately 13,000 Americans a year, is aggressive and incurable. Now a Northwestern University research team is the first to demonstrate delivery of a drug that turns off a critical gene in this complex cancer, increasing survival rates significantly in animals with the deadly disease.

Image: Researchers combined gold nanoparticles (in yellow) with small interfering RNAs (in green) to knock down an oncogene that is overexpressed in glioblastoma.

The novel therapeutic, which is based on nanotechnology, is small and nimble enough to cross the blood-brain barrier and get to where it is needed — the brain tumor. Designed to target a specific cancer-causing gene in cells, the drug simply flips the switch of the troublesome oncogene to “off,” silencing the gene. This knocks out the proteins that keep cancer cells immortal.

In a study of mice, the nontoxic drug was delivered by intravenous injection. In animals with GBM, the survival rate increased nearly 20 percent, and tumor size was reduced three- to fourfold compared with the control group. The results are published today (Oct. 30) in Science Translational Medicine.

“This is a beautiful marriage of a new technology with the genes of a terrible disease,” said Chad A. Mirkin, a nanomedicine expert and a senior co-author of the study. “Using highly adaptable spherical nucleic acids, we specifically targeted a gene associated with GBM and turned it off in vivo. This proof-of-concept further establishes a broad platform for treating a wide range of diseases, from lung and colon cancers to rheumatoid arthritis and psoriasis.”

Mirkin is the George B. Rathmann Professor of Chemistry in the Weinberg College of Arts and Sciences and professor of medicine, chemical and biological engineering, biomedical engineering and materials science and engineering.

Glioblastoma expert Alexander H. Stegh came to Northwestern University in 2009, attracted by the University’s reputation for interdisciplinary research, and within weeks was paired up with Mirkin to tackle the difficult problem of developing better treatments for glioblastoma. 

Help is critical for patients with GBM: median survival is 14 to 16 months, and approximately 16,000 new cases are reported in the U.S. every year.

In their research partnership, Mirkin had the perfect tool to tackle the deadly cancer: spherical nucleic acids (SNAs), new globular forms of DNA and RNA, which he had invented at Northwestern in 1996, and which are nontoxic to humans. The nucleic acid sequence is designed to match the target gene.

And Stegh had the gene: In 2007, he and colleagues identified the gene Bcl2Like12 as one that is overexpressed in glioblastoma tumors and related to glioblastoma’s resistance to conventional therapies.

“My research group is working to uncover the secrets of cancer and, more importantly, how to stop it,” said Stegh, a senior co-author of the study. “Glioblastoma is a very challenging cancer, and most chemotherapeutic drugs fail in the clinic. The beauty of the gene we silenced in this study is that it plays many different roles in therapy resistance. Taking the gene out of the picture should allow conventional therapies to be more effective.”

Stegh is an assistant professor in the Ken and Ruth Davee Department of Neurology at the Northwestern University Feinberg School of Medicine and an investigator in the Northwestern Brain Tumor Institute.

The power of gene regulation technology is that a disease with a genetic basis can be attacked and treated if scientists have the right tools. Thanks to the Human Genome Project and genomics research over the last two decades, there is an enormous number of genetic targets; having the right therapeutic agents and delivery materials has been the challenge.

“The RNA interference-based SNAs are a completely novel approach to thinking about cancer therapy,” Stegh said. “One of the problems is that we have large lists of genes that are somehow dysregulated in glioblastoma, but we have absolutely no way of targeting all of them using standard pharmacological approaches. That’s where we think nanomaterials can play a fundamental role in allowing us to implement the concept of personalized medicine in cancer therapy.”

Stegh and Mirkin’s drug for GBM is specially designed to target the Bcl2Like12 gene in cancer cells. Key is the nanostructure’s spherical shape and nucleic acid density. Normal (linear) nucleic acids cannot get into cells, but these spherical nucleic acids can. Small interfering RNA (siRNA) surrounds a gold nanoparticle like a shell; the nucleic acids are highly oriented, densely packed and form a tiny sphere. (The gold nanoparticle core is only 13 nanometers in diameter.) The RNA’s sequence is programmed to silence the disease-causing gene.

“The problems posed by glioblastoma and many other diseases are simply too big for one research group to handle,” said Mirkin, who also is the director of Northwestern’s International Institute for Nanotechnology. “This work highlights the power of scientists and engineers from different fields coming together to address a difficult medical issue.”

Mirkin first developed the nanostructure platform used in this study in 1996 at Northwestern, and the technology now is the basis of powerful commercialized and FDA-cleared medical diagnostic tools. This new development, however, is the first realization that the nanostructures injected into an animal naturally find their target in the brain and can deliver an effective payload of therapeutics.

The next step for the therapeutic will be to test it in clinical trials.

The nanostructures used in this study were developed in Mirkin’s lab on the Evanston campus and then used in cell and animal studies in Stegh’s lab on the Chicago campus.

(Source: northwestern.edu)

Filed under glioblastoma brain tumors brain cancer medicine science

Baby brains are tuned to the specific actions of others

Imitation may be the sincerest form of flattery for adults, but for babies it’s their foremost tool for learning. As renowned people-watchers, babies often observe others demonstrate how to do things and then copy those body movements. It’s how little ones know, usually without explicit instructions, to hold a toy phone to the ear or guide a spoon to the mouth.

Now researchers from the University of Washington and Temple University have found the first evidence revealing a key aspect of the brain processing that occurs in babies to allow this learning by observation.

The findings, published online Oct. 30 by PLOS ONE, are the first to show that babies’ brains display specific activation patterns when an adult performs a task with different parts of her body. When 14-month-old babies simply watched an adult use her hand to touch a toy, the hand area of the baby’s brain lit up. When another group of infants watched an adult touch the toy using only her foot, the foot area of the baby’s brain showed more activity.

"Babies are exquisitely careful people-watchers, and they’re primed to learn from others," said Andrew Meltzoff, co-author and co-director of the UW Institute for Learning & Brain Sciences. "And now we see that when babies watch someone else, it activates their own brains. This study is a first step in understanding the neuroscience of how babies learn through imitation."

The study took advantage of how the brain is organized. The sensory and motor area of the cortex, the outer portion of the brain known for its creased appearance, is arranged by body part with each area of the body represented in identifiable neural real estate. Prick your finger, stick out your tongue, or kick a ball and distinct areas of the brain light up according to a somatotopic map.

Other studies have shown that adults display this somatotopic brain activation while watching someone else use different body parts, suggesting that adults understand the actions of others in relation to their own bodies. The researchers wondered whether the same would be true in babies.

The 70 infants in the study wore electroencephalogram, or EEG, caps with embedded sensors that detected brain activity in the regions of the cortex that respond to movement or touch of the feet and hands. Sitting on a parent’s lap, each baby watched as an experimenter touched a toy placed on a low table between the baby and the experimenter.
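In infant EEG studies of this kind, “activation” of a sensorimotor region is often quantified as mu-rhythm desynchronization: a drop in band power over that region during observation relative to baseline. The sketch below illustrates only that computation; the band limits, sampling rate, and signals are assumptions for the example, not parameters from the paper.

```python
import numpy as np

def band_power(signal, fs, band=(6.0, 9.0)):
    """Power of `signal` in a frequency band, via the periodogram."""
    freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].sum()

def mu_suppression(baseline, observation, fs=250):
    """Percent drop in mu-band power from baseline to observation;
    a larger drop suggests stronger activation of that region."""
    return 100 * (1 - band_power(observation, fs) / band_power(baseline, fs))

# Hypothetical 2-second signals from a "hand area" electrode while a
# baby watches a hand action: mu power drops relative to baseline.
rng = np.random.default_rng(3)
t = np.arange(0, 2, 1 / 250)
baseline = np.sin(2 * np.pi * 7.5 * t) + 0.2 * rng.standard_normal(t.size)
watching = 0.5 * np.sin(2 * np.pi * 7.5 * t) + 0.2 * rng.standard_normal(t.size)
print(f"{mu_suppression(baseline, watching):.0f}% mu suppression")
```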

The toy had a clear plastic dome and was mounted on a sturdy base. When the experimenter pressed the dome with her hand or foot, music played and confetti in the dome spun. The experimenter repeated the action – taking breaks after every four presses – until the baby lost interest.

"Our findings show that when babies see others produce actions with a particular body part, their brains are activated in a corresponding way," said Joni Saby, lead author and a psychology graduate student at Temple University in Philadelphia. "This mapping may facilitate imitation and could play a role in the baby’s ability to then produce the same actions themselves."

One of the basics for babies to learn is how to copy what they see adults do. In other words, they must first know that it is indeed their hand and not their foot, mouth or other body part that is needed.

The new study shows that babies’ brains are organized in a somatotopic way that helps crack the interpersonal code. The connection between doing and seeing actions maps hand to hand, foot to foot, all before they can name those body parts through language.

"The reason this is exciting is that it gives insight into a crucial aspect of imitation," said co-author Peter Marshall, an associate psychology professor at Temple University. "To imitate the action of another person, babies first need to register what body part the other person used. Our findings suggest that babies do this in a particular way by mapping the actions of the other person onto their own body."

Meltzoff added, “The neural system of babies directly connects them to other people, which jump-starts imitation and social-emotional connectedness and bonding. Babies look at you and see themselves.”

Filed under motor cortex learning brain mapping brain activity infants psychology neuroscience science

Research Finds Pain In Infancy Alters Response To Stress, Anxiety Later In Life

Early life pain alters neural circuits in the brain that regulate stress, a research team led by Dr. Anne Murphy, associate director of the Neuroscience Institute at Georgia State University, has discovered. The finding suggests that pain experienced by infants in neonatal intensive care, who often do not receive analgesics while undergoing tests and treatment, may permanently alter their responses to anxiety, stress and pain in adulthood.

An estimated 12 percent of live births in the U.S. are considered premature, researchers said. These infants often spend an average of 25 days in neonatal intensive care, where they endure 10 to 18 painful and inflammatory procedures each day, including insertion of feeding tubes and intravenous lines, intubation and repeated heel lance. Despite evidence that pain and stress circuitry in the brain are established and functional in preterm infants, about 65 percent of these procedures are performed without benefit of analgesia. Some clinical studies suggest early life pain has an immediate and long-term impact on responses to stress- and anxiety-provoking events.

The Georgia State study examined whether a single painful inflammatory procedure performed on male and female rat pups on the day of birth alters specific brain receptors that affect behavioral sensitivity to stress, anxiety and pain in adulthood. The findings demonstrated that such an experience is associated with site-specific changes in the brain that regulate how the pups responded to stressful situations. Alterations in how these receptors function have also been associated with mood disorders.

The study findings mirror what is now being reported clinically. Children who experienced unresolved pain following birth show reduced responsiveness to pain and stress.

“While a dampened response to painful and stressful situations may seem advantageous at first, the ability to respond appropriately to a potentially harmful stimulus is necessary in the long term,” Dr. Murphy said.

“The fact that less than 35 percent of infants undergoing painful and invasive procedures receive any sort of pre- or post-operative pain relief needs to be re-evaluated in order to reduce physical and mental health complications associated with preterm birth.”

Filed under infants premature babies anxiety stress pain psychology neuroscience science

Scientists shed light on brain computations

University of Queensland (UQ) scientists have made a fundamental breakthrough into how the brain decodes the visual world.

Using advanced electrical recording techniques, researchers at UQ’s Queensland Brain Institute (QBI) have discovered how output cells of the eyeball’s retina compute the direction of a moving object.

QBI’s Dr Ben Sivyer and Associate Professor Stephen Williams have found that dendrites – the branching processes of a neuron that conduct impulses toward the cell body – play a critical role in decoding images.

“In the past decade our research shows that dendrites provide neurons with powerful processing capabilities,” Associate Professor Williams said.

“However the function of dendritic processing in the real-time operation of neuronal networks has remained elusive.”

To gain further insight, the group measured electrical activity from multiple sites in retinal ganglion cells when visual stimuli moved through space.

“The retina, a thin neuronal network at the posterior part of the eyeball, is ideal for investigating the role of active dendritic integration in neuronal circuit function,” he said.

“This is because this network can be maintained intact in a dish and retains its responsiveness to natural stimuli.”

He said that while it had long been known that the retinal network extracted and signalled specific aspects of visual stimuli, the new work shows how such responses are computed.

“We found that retinal ganglion cells compute the direction of light stimuli through exquisitely controlled local integration compartments in the dendritic tree, a finding which highlights the key function that dendrites play in brain computations,” said Associate Professor Williams.
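Direction selectivity of the kind described here is conventionally summarized with a direction selectivity index, which compares a cell’s responses to motion in its preferred direction and the opposite (null) direction. A minimal sketch with invented spike counts follows; the index is a standard measure in the field, not the paper’s specific analysis.

```python
def direction_selectivity_index(preferred, null):
    """DSI = (preferred - null) / (preferred + null).

    `preferred` and `null` are mean responses (e.g. spike counts) to
    motion in the cell's preferred direction and the opposite
    direction. A DSI near 1 means strongly direction-selective;
    near 0 means no directional preference.
    """
    return (preferred - null) / (preferred + null)

# Hypothetical spike counts for one direction-selective ganglion cell
print(direction_selectivity_index(preferred=42.0, null=6.0))  # 0.75
```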

QBI Director Professor Perry Bartlett said this new insight was vital to brain research.

“Discovering how nerve cells process information is fundamental to understanding how we learn, and to developing new strategies to enhance learning in education and in disease processes in the brain,” he said.

Queensland Minister for Science and Innovation Ian Walker congratulated Dr Sivyer and Associate Professor Williams on their internationally significant findings.

“This is another example of Queensland leading the world in health and medical research,” he said.

“Dendrite research also has flow-on implications for brain-function studies in a range of areas.

“While all of these areas are important, I will be particularly interested to see its application to dementia research, which has been a major focus for recent Queensland Government support.”

The paper, “Direction selectivity is computed by active dendritic integration in retinal ganglion cells,” is published in the prestigious journal Nature Neuroscience.

Filed under retina retinal ganglion cells neurons dendrites neuroscience science
