Neuroscience

Articles and news from the latest research reports.

Posts tagged neuroscience

60 notes

New chemical probe provides tool to investigate role of malignant brain tumor domains

In an article published as the cover story of the March 2013 issue of Nature Chemical Biology, Lindsey James, PhD, research assistant professor in the lab of Stephen Frye, Fred Eshelman Distinguished Professor in the UNC School of Pharmacy and member of the UNC Lineberger Comprehensive Cancer Center, announced the discovery of a chemical probe that can be used to investigate the L3MBTL3 methyl-lysine reader domain. The probe, named UNC1215, will provide researchers with a powerful tool to investigate the function of malignant brain tumor (MBT) domain proteins in biology and disease.

“Before this there were no known chemical probes for the more than 200 domains in the human genome that recognize methyl lysine. In that regard, it is a first in class compound. The goal is to use the chemical probe to understand the biology of the proteins that it targets,” said Dr. James.

Chromatin regulatory pathways play a fundamental role in gene expression and disease development, especially in the case of cancer. While many chemical probes work through the inhibition of enzyme activity, L3MBTL3 functions as a mediator of protein-protein interactions, which have historically been difficult to target with small, drug-like molecules.

“Many people believe that protein-protein interactions are difficult to target. Often they have a large surface area, so it is hard for small molecules to go in and intervene,” said Dr. James.

Almost 40 percent of the genes that drive cancer can be mapped to dysfunction within signaling pathways. In the last five years, chemical probe development has allowed researchers to make fundamental observations of the role of these pathways in cancer development, as well as pointing to potential targets for new therapies. Each of the complex interactions within the signaling pathways represents a potential point where a therapy can be applied, and the probes allow researchers to interact with these processes at the molecular level and observe the overall effect of their perturbation on the disease state.

In a 2008 Nature Chemical Biology commentary, Dr. Frye outlined the qualities that make a good chemical probe. To Frye, a good chemical probe must be highly selective to enable specific questions to be asked, and it must function as well in a cell as in the test tube, providing clear quantitative data with a well-understood mechanism of action in either setting. It must also be available to all academic researchers without restrictions on its use, a criterion the L3MBTL3 probe fulfills: the Frye lab provides the probe free of charge on request, and UNC1215 is already available through commercial vendors as well.

Filed under brain tumor brain cancer gene expression proteins medicine neuroscience science

207 notes

Malign environmental combination favours schizophrenia

The interplay between an infection during pregnancy and stress in puberty plays a key role in the development of schizophrenia, as behaviourists from ETH Zurich demonstrate in a mouse model. However, there is no need to panic.

Around one per cent of the population suffers from schizophrenia, a serious mental disorder that usually does not develop until adulthood and is incurable. Psychiatrists and neuroscientists have long suspected that adverse environmental factors may play an important role in the development of schizophrenia. Prenatal infections such as toxoplasmosis or influenza, psychological stress, and family history have all come into question as risk factors. Until now, however, researchers were unable to identify how the individual factors linked to this serious mental disease interact.

However, a research group headed by Urs Meyer, a senior scientist at the Laboratory of Physiology & Behaviour at ETH Zurich, has now made a breakthrough: for the first time, they found clear evidence that the combination of two environmental factors contributes significantly to the development of schizophrenia-relevant brain changes, and identified the stages in a person’s life at which these factors must occur for the disorder to emerge. The researchers developed a special mouse model with which they were able to simulate the corresponding processes in humans virtually in fast forward. The study has just been published in the journal Science.

Filed under schizophrenia animal model infection puberty pregnancy stress environment neuroscience science

66 notes

Study looks to distinguish cognitive functioning in centenarians

As life expectancy continues to increase, more and more people will reach and surpass the century mark in age. But even as greater numbers reach and surpass the 100-year milestone, little is known about what constitutes normal levels of cognitive function in the second century of life.

Led by Adam Davey, associate professor in Temple’s Department of Public Health, a group of researchers used a new method called factor mixture analysis — a statistical technique for identifying different groups within a population — to identify the prevalence of cognitive impairment in centenarians and try to understand the cognitive changes that are part of extreme aging. They published their findings, “Profiles of Cognitive Functioning in a Population-Based Sample of Centenarians Using Factor Mixture Analysis,” in the journal Experimental Aging Research.
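
Factor mixture analysis itself combines a factor model with a latent-class model and requires specialized software, but its core step — letting the data reveal distinct latent groups rather than imposing a single cutoff — can be sketched with a simple two-component Gaussian mixture fitted by expectation-maximization. The scores below are simulated for illustration only; they are not the study’s data, and the one-third/two-thirds split merely echoes the proportions reported later.

```python
import math
import random

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def fit_two_group_mixture(scores, iters=200):
    """Fit a two-component one-dimensional Gaussian mixture by EM,
    returning (weights, means, standard deviations)."""
    mu = [min(scores), max(scores)]                 # start the components far apart
    spread = (max(scores) - min(scores)) / 4 or 1.0
    sigma = [spread, spread]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: how responsible is each component for each score?
        resp = []
        for x in scores:
            p = [w[k] * normal_pdf(x, mu[k], sigma[k]) for k in range(2)]
            total = sum(p)
            resp.append([pk / total for pk in p])
        # M-step: re-estimate weights, means and spreads from responsibilities
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(scores)
            mu[k] = sum(r[k] * x for r, x in zip(resp, scores)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, scores)) / nk
            sigma[k] = max(math.sqrt(var), 1e-6)
    return w, mu, sigma

# Simulated cognitive scores: a smaller lower-performing group and a
# larger higher-performing group (all values are arbitrary).
random.seed(0)
scores = [random.gauss(10, 2) for _ in range(80)] + [random.gauss(22, 3) for _ in range(160)]
weights, means, sds = fit_two_group_mixture(scores)
```

The model recovers the two latent groups and their relative sizes directly from the score distribution, which is the sense in which such an approach can separate “normal” aging from probable impairment without a fixed threshold.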

“One of the motivations for studying centenarians is that they are very close to the upper limit of human life expectancy right now,” said Davey. “By looking at their cognitive functioning we can learn a lot in terms of how common or prevalent cognitive impairment is among that age group.”

Using voter registration lists and nursing home records in 44 counties in northern Georgia, the researchers identified 244 people between the ages of 98 and 108 — approximately 20 percent of all centenarians living in that region — who participated in the study. Participants were assessed based on a series of standard tests used to measure cognitive functioning.

“As people get into later life and the prevalence of cognitive impairment becomes relatively high, we need some way of distinguishing between those people who are aging normally and the people who have cognitive impairment, which could indicate dementia,” said Davey.

The researchers found that even though approximately two-thirds of centenarians were at or below the threshold for cognitive impairment by one commonly used measure, only one-third of centenarians were identified as cognitively impaired using their new approach.

“That’s consistent with the level of cognitive impairment found in another study that looked at people up to the age of 85-plus,” said Davey. “But even the normal folks have had cognitive declines to the point that they are functioning at a level that would indicate impairment at younger ages.”

The researchers found that characteristics such as age, race and educational attainment can help to distinguish those in the lower cognitive performance group.

“This is the first study that I’m aware of that allows us to distinguish between these two groups of centenarians, so that we can start to develop benchmarks for what is normal cognitive functioning among members of this age group,” said Davey. “These people have lived so long that even their normal cognitive function could be mistaken for a form of dementia if a physician were to treat them as they would someone who was merely old.”

(Image credit: Krissy_77)

Filed under brain cognitive function cognitive impairment centenarians aging psychology neuroscience science

76 notes

New model could lead to improved treatment for early stage Alzheimer’s

Researchers at the University of Florida and The Johns Hopkins University have developed a line of genetically altered mice that model the earliest stages of Alzheimer’s disease. This model may help scientists identify new therapies to provide relief to patients who are beginning to experience symptoms.

The researchers report their findings in the current issue of The Journal of Neuroscience.

“The development of this model could help scientists identify new ways to enhance brain function in patients in the early stages of the disease,” said David Borchelt, UF professor of neuroscience in the Evelyn F. and William L. McKnight Brain Institute and director of the SantaFe HealthCare Alzheimer’s Disease Research Center. “Such therapies could preserve brain function longer and delay the appearance of more severe symptoms that leave patients unable to care for themselves.”

In the early stages of Alzheimer’s disease, people struggle with and fail to learn new games, rules or technologies because their cognitive flexibility decreases. The degenerative disease continues with memory loss and the decline of other brain functions.

The researchers worked with mice carrying specially designed gene fragments, derived from bacteria and from humans, that allowed the investigators to control the production of a small peptide. The peptide, called amyloid beta, is a short chain of amino acids. Accumulations of this peptide in the brain, as lesions called plaques, occur early in the progression of Alzheimer’s disease and seem to trigger the early memory problems.

The team regulated the expression of the peptide using antibiotics — when the animals stopped taking the antibiotic, the peptide-producing gene turned on and caused the mice to develop the plaques found in Alzheimer’s patients. After the mice had developed the Alzheimer pathology, the researchers turned the gene back off and observed that the mice showed persistent memory problems that resemble the early stages of the disease.
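
The logic of this switchable design can be captured in a toy simulation — purely illustrative, not a quantitative model of the biology: while the transgene is expressed, pathology accumulates; switching it back off halts further accumulation but does not erase what is already there, which is why the memory deficits persist.

```python
def simulate_switchable_transgene(weeks_on, weeks_off, rate=1.0):
    """Toy model of an antibiotic-regulated transgene: plaque burden
    grows while the gene is expressed and merely persists (here it
    simply stays constant) after the gene is shut back off."""
    burden = 0.0
    history = []
    for week in range(weeks_on + weeks_off):
        gene_on = week < weeks_on       # antibiotic withdrawn -> gene on
        if gene_on:
            burden += rate              # pathology accumulates
        history.append(burden)          # no clearance after switch-off
    return history

# Burden rises while the gene is on, then plateaus once it is off.
h = simulate_switchable_transgene(weeks_on=10, weeks_off=10)
```

The plateau after switch-off mirrors the paper’s key observation: turning the gene off did not reverse the memory problems the plaques had already caused.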

“This model may be useful to researchers to test drugs that could help with symptoms of early stage Alzheimer’s disease,” Borchelt said.

This research is funded by the National Institute of Neurological Disorders and Stroke of the National Institutes of Health, and by the SantaFe HealthCare Alzheimer’s Disease Research Center of the University of Florida.

Filed under alzheimer's disease brain function memory loss cognitive impairment amyloid beta animal model neuroscience science

117 notes

Nut-cracking monkeys use shapes to strategize their use of tools

Bearded capuchin monkeys deliberately place palm nuts in a stable position on a surface before trying to crack them open, revealing their capacity to use tactile information to improve tool use. The results are published February 27 in the open access journal PLOS ONE by Dorothy Fragaszy and colleagues from the University of Georgia.

The researchers analyzed the monkeys’ tool-use skills by videotaping adult monkeys cracking palm nuts on a surface they used frequently for the purpose. They found that monkeys positioned the nuts flat side down more frequently than expected by random chance. When placing the nuts, the monkeys knocked the nuts on the surface a few times before releasing them, after which the nuts very rarely moved. The researchers suggest that the monkeys may have learned to optimize this tool-use strategy by repeatedly knocking the nut to achieve the stable position prior to cracking it. They conclude that the monkeys’ strategic placement of the nut reveals that the monkeys pay attention to the fit between the nut and the surface each time they place the nut, and adjust their actions accordingly.
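
The “more frequently than expected by random chance” claim is the kind of thing an exact binomial test settles. The counts below are hypothetical, not the paper’s data, and treating chance as 50% is a deliberate simplification (a real nut has more than two resting positions).

```python
from math import comb

def binomial_p_at_least(k, n, p=0.5):
    """Exact one-sided binomial test: the probability of seeing k or
    more successes in n independent trials when each trial succeeds
    with probability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical counts: 45 of 60 nut placements landed flat side down.
# If placement were random (flat side down half the time), how
# surprising would that be?
p_value = binomial_p_at_least(45, 60)
```

A p-value this small would indicate the placements are very unlikely to be chance, which is the statistical shape of the argument the researchers make from their video data.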

In a parallel experiment, the scientists asked blindfolded people to perform the same action, positioning palm nuts on an anvil as if to crack them with a stone or hammer. Like the monkeys, the human participants also followed tactile cues to place the nut flat-side down on the anvil.

Filed under primates tool use animal behavior haptic perception psychology neuroscience science

169 notes

Action video games boost reading skills

Much to the chagrin of parents who think their kids should spend less time playing video games and more time studying, time spent playing action video games can actually make dyslexic children read better. In fact, 12 hours of video game play did more for reading skills than is normally achieved with a year of spontaneous reading development or demanding traditional reading treatments.

The evidence, appearing in the Cell Press journal Current Biology on February 28, follows from earlier work by the same team linking dyslexia to early problems with visual attention rather than language skills.

"Action video games enhance many aspects of visual attention, mainly improving the extraction of information from the environment," said Andrea Facoetti of the University of Padua and the Scientific Institute Medea of Bosisio Parini in Italy. "Dyslexic children learned to orient and focus their attention more efficiently to extract the relevant information of a written word more rapidly."

The findings come as further support for the notion that visual attention deficits are at the root of dyslexia, a condition that makes reading extremely difficult for one out of every ten children, Facoetti added. He emphasized that there is, as of now, no approved treatment for dyslexia that includes video games.

Facoetti’s team, including Sandro Franceschini, Simone Gori, Milena Ruffino, Simona Viola, and Massimo Molteni, tested the reading, phonological, and attentional skills of two groups of children with dyslexia before and after they played action or non-action video games for nine 80-minute sessions. The action video gamers were able to read faster without losing accuracy. They also showed gains in other tests of attention.
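
The “12 hours” figure quoted earlier is simply this session schedule totalled up:

```python
sessions = 9
minutes_per_session = 80
total_minutes = sessions * minutes_per_session   # 720 minutes of play
total_hours = total_minutes / 60                 # i.e. 12 hours in all
```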

"These results are very important in order to understand the brain mechanisms underlying dyslexia, but they don’t put us in a position to recommend playing video games without any control or supervision," Facoetti said.

Still, there is great hope for early interventions that could be applied in low-resource settings. “Our study paves the way for new remediation programs, based on scientific results, that can reduce the dyslexia symptoms and even prevent dyslexia when applied to children at risk for dyslexia before they learn to read.”

And, guess what? Those kids will also be having fun.

Filed under reading reading development dyslexia visual attention video games neuroscience psychology science

113 notes

Authors: Develop digital games to improve brain function and well-being

Neuroscientists should help to develop compelling digital games that boost brain function and improve well-being, say two professors specializing in the field in a commentary article published in the science journal Nature.

In the Feb. 28 issue, the two — Daphne Bavelier of the University of Rochester and Richard J. Davidson of the University of Wisconsin-Madison — urge game designers and brain scientists to work together to design new games that train the brain, producing positive effects on behavior, such as decreasing anxiety, sharpening attention and improving empathy. Already, some video games are designed to treat depression and to encourage cancer patients to stick with treatment, the authors note.

Davidson is founder and chair of the Center for Investigating Healthy Minds at the UW’s Waisman Center. Bavelier is a professor in the Department of Brain and Cognitive Sciences at Rochester.

Video game usage, which continues to rise among American children, has been associated with a number of negative outcomes, such as obesity, aggressiveness, antisocial behavior and, in extreme cases, addiction. “At the same time, evidence is mounting that playing games can have beneficial effects on the brain,” the authors write.

Last year, Bavelier and Davidson presided over a meeting at the White House in which neuroscientists met with entertainment media experts to discuss ways of using interactive technology such as video games to further understanding of brain functions, as well as to provide new, engaging tools for boosting attention and well-being.

Bavelier’s work is focused on how humans learn and how the brain adapts to changes in experience, either by nature (as in deafness) or by training (such as playing video games). Her lab investigates how new media, including video games, can be leveraged to foster learning and brain plasticity.

Davidson, who studies emotion and the brain, is leading a project in collaboration with UW-Madison’s Games + Learning + Society to develop two video games designed to help middle school students develop social and emotional skills, such as empathy, cooperation, mental focus and self-regulation.

"Gradually, this work will begin to document the burning social question of how technology is having an impact on our brains and our lives, and enable us to make evidence-based choices about the technologies of the future, to produce a new set of tools to cultivate positive habits of mind," the authors conclude.

Filed under brain brain function gaming digital games anxiety empathy neuroscience science

98 notes

Ectopic Eyes Function Without Connection to Brain
For the first time, scientists have shown that transplanted eyes located far outside the head in a vertebrate animal model can confer vision without a direct neural connection to the brain.
Biologists at Tufts University School of Arts and Sciences used a frog model to shed new light – literally – on one of the major questions in regenerative medicine, bioengineering, and sensory augmentation research.
"One of the big challenges is to understand how the brain and body adapt to large changes in organization," says Douglas J. Blackiston, Ph.D., first author of the paper "Ectopic Eyes Outside the Head in Xenopus Tadpoles Provide Sensory Data For Light-Mediated Learning," in the February 27 issue of the Journal of Experimental Biology. “Here, our research reveals the brain’s remarkable ability, or plasticity, to process visual data coming from misplaced eyes, even when they are located far from the head.”
Blackiston is a post-doctoral associate in the laboratory of co-author Michael Levin, Ph.D., professor of biology and director of the Center for Regenerative and Developmental Biology at Tufts University.
Levin notes, “A primary goal in medicine is to one day be able to restore the function of damaged or missing sensory structures through the use of biological or artificial replacement components. There are many implications of this study, but the primary one from a medical standpoint is that we may not need to make specific connections to the brain when treating sensory disorders such as blindness.”
In this experiment, the team surgically removed donor embryo eye primordia, marked with fluorescent proteins, and grafted them into the posterior region of recipient embryos. This induced the growth of ectopic eyes. The recipients’ natural eyes were removed, leaving only the ectopic eyes.
Fluorescence microscopy revealed various innervation patterns but none of the animals developed nerves that connected the ectopic eyes to the brain or cranial region.
To determine if the ectopic eyes conveyed visual information, the team developed a computer-controlled visual training system in which quadrants of water were illuminated by either red or blue LED lights. The system could administer a mild electric shock to tadpoles swimming in a particular quadrant. A motion tracking system outfitted with a camera and a computer program allowed the scientists to monitor and record the tadpoles’ motion and speed.
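The closed-loop logic of such a training system is simple to sketch. The quadrant layout, pairing of red light with shock, and function names below are illustrative assumptions, not the authors' actual apparatus or software:

```python
import random

QUADRANTS = ["NW", "NE", "SW", "SE"]

def light_for(quadrant, shock_quadrant):
    """Illuminate the shock-paired quadrant red, the others blue (assumed pairing)."""
    return "red" if quadrant == shock_quadrant else "blue"

def training_step(tadpole_quadrant, shock_quadrant, log):
    """Record the light the tadpole sees and whether it receives a shock."""
    color = light_for(tadpole_quadrant, shock_quadrant)
    shocked = tadpole_quadrant == shock_quadrant
    log.append((tadpole_quadrant, color, shocked))
    return shocked

# Simulate a short training session with randomly placed tadpoles.
random.seed(0)
log = []
for _ in range(10):
    training_step(random.choice(QUADRANTS), "NE", log)

# The protocol's invariant: the red light is always paired with the shock.
assert all(shocked == (color == "red") for _, color, shocked in log)
```

A learned response, in this scheme, is a tadpole that avoids the red-lit quadrant before any shock is delivered.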
Eyes See Without Wiring to Brain
The team made exciting discoveries: Just over 19 percent of the animals with optic nerves that connected to the spine demonstrated learned responses to the lights. They swam away from the red light while the blue light stimulated natural movement.
Their response to the lights elicited during the experiments was no different from that of a control group of tadpoles with natural eyes intact. Furthermore, this response was not demonstrated by eyeless tadpoles or tadpoles that did not receive any electrical shock.
"This has never been shown before," says Levin. "No one would have guessed that eyes on the flank of a tadpole could see, especially when wired only to the spinal cord and not the brain."
The findings suggest a remarkable plasticity in the brain’s ability to incorporate signals from various body regions into behavioral programs that had evolved with a specific and different body plan.
"Ectopic eyes performed visual function," says Blackiston. "The brain recognized visual data from eyes that impinged on the spinal cord. We still need to determine if this plasticity in vertebrate brains extends to different ectopic organs or organs appropriate in different species."
One of the most fascinating areas for future investigation, according to Blackiston and Levin, is the question of exactly how the brain recognizes that the electrical signals coming from tissue near the gut are to be interpreted as visual data.
In computer engineering, notes Levin, who majored in computer science and biology as a Tufts undergraduate, this problem is usually solved by a “header”—a piece of metadata attached to a packet of information that indicates its source and type. Whether electric signals from eyes impinging on the spinal cord carry such an identifier of their origin remains a hypothesis to be tested.
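Levin's networking analogy is easy to make concrete. The field names and routing rule below are invented for illustration; they are not a claim about how neural signals are actually labeled:

```python
def make_packet(source, data_type, payload):
    """Attach a small metadata header to a payload, as network protocols do."""
    return {"header": {"source": source, "type": data_type}, "payload": payload}

def route(packet):
    """A receiver can dispatch on the header alone, without inspecting the payload."""
    if packet["header"]["type"] == "visual":
        return "visual processing"
    return "unknown"

# A hypothetical "visual" packet from an ectopic eye is routed by its header,
# regardless of where in the body it originated.
p = make_packet(source="ectopic_eye", data_type="visual", payload=[0.1, 0.9, 0.4])
assert route(p) == "visual processing"
```

The open biological question is whether signals from a misplaced eye carry any such self-describing tag, or whether the brain infers their type from their statistics alone.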

Filed under animal model visual system brain plasticity ectopic eyes regenerative medicine neuroscience science

88 notes

Researchers identify brain pathway triggering impulsive eating
New research from the University of Georgia has identified the neural pathways in an insect brain tied to eating for pleasure, a discovery that sheds light on parallel impulsive eating pathways in the human brain.
"We know when insects are hungry, they eat more, become aggressive and are willing to do more work to get the food," said Ping Shen, a UGA associate professor of cellular biology in the Franklin College of Arts and Sciences. "Little is known about the other half, the reward-driven feeding behavior, when the animal is not so hungry but they still get excited about food when they smell something great.
The fact that a relatively lower animal, a fly larva, actually does this impulsive feeding based on a rewarding cue was a surprise.”
The research team led by Shen, who also is a member of the Biomedical and Health Sciences Institute, found that presenting fed fruit fly larvae with appetizing odors caused impulsive feeding of sugar-rich foods. The findings, published Feb. 28 in a Cell Press journal, suggest eating for pleasure is an ancient behavior and that fly larvae can be used in studying neurobiology and the evolution of olfactory reward-driven impulses.
To test reward-driven behaviors in flies, Shen introduced appetizing odors to groups of well-fed larvae. In every case, the fed larvae consumed about 30 percent more food when surrounded by the attractive odors.
But when the insects were offered a substandard meal, they refused to eat it.
"They have expectations," he said. "If we reduce the concentration of sugar below a threshold, they do not respond anymore. Similar to what you see in humans, if you approach a beautiful piece of cake and you taste it and determine it is old and horrible, you are no longer interested."
Shen’s team also tried to further define this phenomenon, the connection between excitement and expectation. He found that when the larvae were presented with a brief odor, they were willing to act on the impulse for about 15 minutes.
"After 15 minutes, they revert back to normal. You get excited, but you can’t stay excited forever, so there is a mechanism to shut it down," he said.
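The two behavioral rules reported above, a quality threshold on the food and a roughly 15-minute impulse window, can be captured in a toy decision rule. The threshold value and function names are illustrative assumptions, not the authors' model:

```python
SUGAR_THRESHOLD = 0.5      # arbitrary units; assumed cutoff for "good" food
IMPULSE_WINDOW_MIN = 15.0  # minutes, the window reported in the article

def will_feed(sugar_concentration, minutes_since_odor):
    """Feed only on above-threshold food, and only while the odor-induced
    impulse is still active."""
    return (sugar_concentration > SUGAR_THRESHOLD
            and minutes_since_odor <= IMPULSE_WINDOW_MIN)

assert will_feed(0.8, 5.0)        # fresh odor, rich food: feeds
assert not will_feed(0.2, 5.0)    # substandard food: refuses despite the odor
assert not will_feed(0.8, 20.0)   # impulse expired: back to normal
```

Both conditions must hold, which matches the observation that neither an appetizing odor alone nor rich food alone is enough to trigger the impulsive feeding.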
His work also suggests the neuropeptides, or brain chemicals acting as signaling molecules triggering impulsive eating, are consistent between flies and humans. Neurons receive stimuli and convert them into signals that are then relayed to the downstream machinery telling the animal to act. These signaling molecules are required for this impulse, suggesting the molecular details of these functions are evolutionarily conserved between flies and humans.
"There are hyper-rewarding cues that humans and flies have evolved to perceive, and they connect this perception with behavior performance," Shen said. "As long as this is activated, the animal will eat food. In this way, the brain is stupid: It does not know how it gets activated. In this case, the fly says ‘I smell something, I want to do this.’ This kind of connection has been established very early on, probably before the divergence of fly and human. That is why we both have it."
Impulsive and reward-driven behaviors are still poorly understood, partially due to the complex systems at work in human brains. Fly larvae nervous systems, in terms of scheme and organization, are very similar to those of adult flies and of mammals, but with fewer neurons and less complex wiring.
"A particular function in the brain of mammals may require a large cluster of neurons," he said. "In flies, it may be only one or four. They are simpler in number but not principle."
In the fly model, four neurons are responsible for relaying signals from the olfactory center to the brain to stimulate action. Each odor and receptor translates the response slightly differently. Human triggers are obviously more diverse, but Shen thinks the mechanism to appreciate the combination is likely the same. He is now working with Tianming Liu, assistant professor of computer science at UGA and member of the Bioimaging Research Center and Institute of Bioinformatics, on a computer model to determine how these odors are interpreted as stimuli.
"Dieting is difficult, especially in the environment of these beautiful foods," Shen said. "It is very hard to control this impulsive urge. So, if we understand how this compulsive eating behavior comes about, we maybe can devise a way, at least for the behavioral aspect, to prevent it. We can modulate our behaviors better or use chemical interventions to calm down these cues."

Filed under brain fly larva impulsive eating insects neuropeptides evolution neuroscience science

124 notes

‘Rain Man’-like Brains Mapped with Network Analysis

Innovative Technique Sheds Light on Abnormal Brain Connectivity Responsible for Common Genetic Cause of Autism

A group of researchers at UC San Francisco and UC Berkeley has mapped the three-dimensional global connections within the brains of seven adults who have genetic malformations that leave them without the corpus callosum, which connects the left and right sides of the brain.

These “structural connectome” maps, which combine hospital MRIs with the mathematical tool known as network analysis, are described in the upcoming April 15 issue of the journal Neuroimage. They reveal new details about the condition known as agenesis of the corpus callosum, which is one of the top genetic causes of autism. The condition was part of the mysterious brain physiology of Laurence Kim Peek, the remarkable savant portrayed by Dustin Hoffman in the 1987 movie “Rain Man.”

While some people born with agenesis of the corpus callosum are of normal intelligence and do not have any obvious signs of neurologic disease, approximately 40 percent of people with the condition are at high risk for autism. Given this, the work is a step toward finding better ways to image the brains of people with the condition, said Pratik Mukherjee, MD, PhD, a professor of radiology and biomedical imaging at UCSF who was the co-senior author of the research.

Understanding how brain connectivity varies from person to person may help researchers identify imaging biomarkers for autism to help diagnose it and manage care for individuals. Currently autism is diagnosed and assessed based on cognitive tests, such as those involving stacking blocks and looking at pictures on flip cards.

While the new work falls short of a quantitative measure doctors could use instead of cognitive testing, it does offer a proof-of-principle that this novel technique may shed light on neurodevelopment disorders.

“Because you are looking at the whole brain at the network level, you can do new types of analysis to find what’s abnormal,” Mukherjee said.

The Connection between the Brain Hemispheres and Autism

Agenesis of the corpus callosum can arise if individuals are born missing DNA from chromosome 16 and often leads to autism.

Scientists have long puzzled over the link between this disorder and the autistic brain, said co-senior author of the paper Elliott Sherr, MD, PhD, professor of neurology and genetics, especially since not all people with this malformation develop autism.

Doctors believe this is because the brain has a rich capacity for rewiring in alternative ways.

Pursuing this question, Mukherjee and Sherr turned to MRI and the mathematical technique of network analysis, which has long supported fields like civil engineering, helping urban planners optimize the timing of traffic lights to speed traffic. This is the first time network analysis has been applied to brain mapping for a genetic cause of autism.

The brain poses a far more complicated challenge for analysis because, unlike the streets of a given city, the brain has tens of billions of neurons, many of which make tens of thousands of connections to each other, making its level of connectivity highly complex.

By comparing the seven “Rain Man”-like brains to those of 11 people without this malformation, the scientists found that particular structures called cingulate bundles were smaller and that the neurons within these bundles were less connected to others in the brain. They also found that the network topology of the brain was more variable in people with agenesis of the corpus callosum than in people without the malformation.
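A minimal, hand-rolled version of one metric used in this kind of structural-connectome network analysis is connection density, the fraction of possible region-to-region links that are actually present. The region names and edge sets below are invented for illustration; the study worked with whole-brain, MRI-derived networks:

```python
from itertools import combinations

def density(nodes, edges):
    """Fraction of possible region-to-region connections that are present."""
    possible = len(list(combinations(nodes, 2)))
    return len(edges) / possible if possible else 0.0

regions = ["cingulate_L", "cingulate_R", "frontal_L", "frontal_R"]

# Hypothetical edge sets: a control network and one with fewer
# cingulate connections, loosely mimicking the reported finding.
control = {("cingulate_L", "cingulate_R"), ("cingulate_L", "frontal_L"),
           ("cingulate_R", "frontal_R"), ("frontal_L", "frontal_R")}
agcc = {("cingulate_L", "frontal_L"), ("cingulate_R", "frontal_R")}

# Fewer connections yield a lower density, one simple way such maps
# can differ quantitatively between groups.
assert density(regions, agcc) < density(regions, control)
```

Real connectome studies compute many such graph measures (clustering, path length, modularity) per subject and compare their distributions across groups, which is what allows topology variability itself to be quantified.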

Filed under brain AgCC corpus callosum connectome autism Kim Peek network analysis neuroscience science

free counters