Neuroscience

Articles and news from the latest research reports.

Motion perception revisited: High Phi effect challenges established motion perception assumptions

Optical illusions abound in human visual perception, as demonstrated by many well-known examples. Although many are static illusions, motion illusions also occur. Recently, scientists at Université Paris Descartes and Centre National de la Recherche Scientifique in Paris, the University of Reading in the United Kingdom, and Kyushu University in Japan discovered and investigated a new illusory motion effect, termed high phi by the authors, in which we perceive conspicuous, large illusory jumps when presentation of motion signals is followed by brief visual stimuli free of detectable motion signals. The researchers found that the size of the illusory jump does not depend on the speed of the motion signals presented, but rather on spatial frequency and transient duration, while jump direction depends on motion signal duration. The study’s authors conclude that their findings demonstrate that existing explanations for this illusion – namely, the loss of coherent motion perception above an upper limit and the preference for minimal motion – are incomplete at best.

Lead researcher Mark Wexler describes some of the challenges he and his colleagues – Andrew Glennerster, Patrick Cavanagh, Hiroyuki Ito, and Takeharu Seno – encountered in conducting their study. “We had the idea that these illusory jumps are related to dmax, the supposed upper limit on the step size that leads to motion perception, which varies between individuals and must be measured using random textures,” Wexler tells Medical Xpress. “For displacements below dmax you’re supposed to see the motion more or less correctly,” he explains, “while for displacements above dmax you’re just supposed to see noise – and the latter also turns out to be false.” (These illusory jumps are demonstrated in an online supplement to the paper.)

Interestingly, the researchers discovered the illusion as a bug in a computer program whose purpose was to do something else. “The easy thing to do in those kinds of circumstances is to correct the bug and move on,” Wexler comments, “and ignore how strange the effect of the bug actually is. Our key insight was not to move on.”

According to Wexler, the one finding in motion perception that everyone agrees with, for at least 100 years, is the minimal-motion principle. “The minimal-motion principle states that whenever a stimulus is ambiguous and compatible with more than one motion – as it nearly always is – the brain is supposed to prefer the smallest, slowest motion, including stand-still, that is compatible with the stimulus,” he explains. (In fact, he illustrates, many computer vision systems are built around this principle, and neuroscientists have verified it by recording signals from primate neurons.)
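The minimal-motion principle is straightforward to express in code. The sketch below is a toy one-dimensional matcher, not taken from the study; the function name and parameters are invented for illustration. Candidate displacements are tested in order of increasing magnitude, so when several displacements fit the stimulus equally well, the smallest one wins.

```python
import numpy as np

def minimal_motion_estimate(frame_a, frame_b, max_shift=8):
    """Toy 1-D matcher illustrating the minimal-motion principle:
    candidate shifts are searched in order of increasing magnitude,
    and a larger shift only wins if it matches strictly better, so
    among equally compatible motions the smallest one is chosen."""
    best_err, best_shift = np.inf, 0
    for mag in range(max_shift + 1):
        for shift in ([0] if mag == 0 else [-mag, mag]):
            a = frame_a[max_shift : len(frame_a) - max_shift]
            b = frame_b[max_shift + shift : len(frame_b) - max_shift + shift]
            err = float(np.sum((a - b) ** 2))
            if err < best_err:  # strict comparison: ties keep the smaller shift
                best_err, best_shift = err, shift
    return best_shift
```

With a periodic stimulus, shifts of 1, 5, -3, and so on are all equally compatible with the input; the search order ensures the estimate is the slowest one, just as the principle predicts.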

“However,” Wexler points out, “one consequence of the high phi effect is that the minimal-motion principle can be violated! When the stimulus is incompatible with any globally coherent motion, and therefore equally compatible with any motion, people perceive not only a large jump, but the largest possible jump that they can perceive. This maximum jump is the one that steps by dmax, which acts as the speed limit on motion perception.”

Another principle that seems to be violated by the high phi illusion, according to Wexler, is dmax itself. “Below dmax, steps should be more or less seen as what they are – and as can be seen in demonstration three, this is what happens.” On the other hand, says Wexler, “above dmax you’re supposed to perceive noise, not motion – but this is not what actually occurs.” Rather, you perceive the high phi jump, as can also be seen in demonstration three. “In one of our experiments,” Wexler adds, “we showed that the amplitude of the jump is very closely correlated with the dmax limit, so that people who have higher dmax limits also see a larger high phi jump.”

It’s known, Wexler points out, that the dmax limit depends on spatial frequency: the lower the frequency (that is, the larger the features in the stimulus) the higher the limit. “And indeed,” he notes, “we found that the magnitude of the illusory jump depends on spatial frequency in exactly the same way: the lower the frequency, the farther the jump.” This can be verified, Wexler adds, by viewing high phi demonstration six and demonstration seven.

Discussing the finding that the direction of the jump depends on the duration of the inducing motion signals, Wexler notes, “We think that the preceding – that is, inducing – motion acts like a seed. For brief inducers, the motion itself acts as the seed, and the jump is experienced forwards with respect to the inducer. For longer inducers, vision begins to adapt to the motion – a result known as the motion aftereffect.” Also known as the waterfall illusion, the motion aftereffect occurs when, after you view a moving object for an extended period of time and that object then becomes stationary, it appears to drift slowly in the opposite direction. “Many people initially think that what we’ve found is a consequence of adaptation to motion or the motion aftereffect,” he says. “If so, then it’s the fastest motion aftereffect known. We’ve measured that the illusory motion is 10-100 times faster than the inducing motion! We think that for longer inducers, the adaptation acts as the seed of the fast, backward jump.” (Brief and long inducers can be compared directly in demonstration nine.)

Wexler also describes how their findings relate to the activity of neurons in the primary visual cortex that respond to lines of a certain angle moving in one direction, as first described by Hubel and Wiesel (1959). “In the brain, motion detectors are sensitive to motion in a particular place – the receptive field – a particular direction, and usually a particular speed,” Wexler notes. “When faced with our stimulus, there can be many accidental matches at the local level. In one image there is a dark spot, for example, and in the next, uncorrelated image there happens to be a dark spot just next to it. In that case, a local motion detector will react to this false match – so our stimulus actually activates many local motion detectors, but incoherently, in that all of these motion detectors are signaling different motions. The main point is that in all this incoherent mess the brain finally prefers the largest possible motion.”
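This local false-match account can be illustrated with a toy simulation. The sketch below is hypothetical, not the authors' model: a bank of detectors, one per candidate displacement, each counting accidental dark-spot coincidences between two statistically independent frames.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two statistically independent "frames" - no coherent motion links them.
# True marks a dark spot (the feature a local detector would match on).
frame1 = rng.random((64, 64)) < 0.2
frame2 = rng.random((64, 64)) < 0.2

# A bank of toy local motion detectors, one per candidate displacement:
# each counts accidental pairings of a dark spot at p in frame1 with a
# dark spot at p + (dx, dy) in frame2 - the false matches described above.
responses = {}
for dx in range(-3, 4):
    for dy in range(-3, 4):
        shifted = np.roll(frame2, shift=(dy, dx), axis=(0, 1))
        responses[(dx, dy)] = int(np.sum(frame1 & shifted))
```

Every detector in the bank fires, and no single displacement dominates: the motion evidence is incoherent across the bank, which is the “incoherent mess” Wexler describes.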

Commenting on other areas of research that might benefit from their study, Wexler cites computer vision. “The minimal-motion principle is enshrined in a lot of algorithms for extracting motion,” he concludes. “Our study shows that this principle can be violated. Can we find a different way to extract motion?”

“The dependence on transient duration – which can be clearly seen in demonstration five – is, to be completely honest, a mystery, but a very interesting one,” Wexler continues. “The amplitude of the jump is a very linear function of transient duration, at least for small durations. If some perceptual process goes linearly farther for longer durations, then something in the brain must be effectively rotating at constant speed. I have no idea what that something may be, but it’s an interesting challenge for the future.”
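The claimed linearity is the kind of relationship a least-squares fit would capture. The numbers below are invented for illustration, not taken from the paper; the sketch just shows how amplitude-versus-duration data would be reduced to a slope and residuals.

```python
import numpy as np

# Hypothetical measurements: jump amplitude (deg) vs. transient duration (ms).
# These values are made up for illustration, not from the study.
durations = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
amplitudes = np.array([0.9, 2.1, 2.9, 4.2, 5.0])

# Least-squares line: small, patternless residuals are what "amplitude is
# a very linear function of transient duration" would look like in data.
slope, intercept = np.polyfit(durations, amplitudes, 1)
residuals = amplitudes - (slope * durations + intercept)
```

A near-constant slope with residuals much smaller than the amplitudes themselves is the signature of the linear dependence described in the quote.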


Scientists unpack testosterone’s role in schizophrenia

Testosterone may trigger a brain chemical process linked to schizophrenia, but the same sex hormone can also improve cognitive thinking skills in men with the disorder, two new studies show.

Scientists have long suspected testosterone plays an important role in schizophrenia, which affects more men than women. Men are also more likely to develop psychosis in adolescence, previous research has shown.

A new study on lab rodents by researchers from Neuroscience Research Australia analysed the impact increased testosterone had on levels of dopamine, a brain chemical linked to psychotic symptoms of schizophrenia.

The researchers found that testosterone boosted dopamine sensitivity in adolescent male rodents.

“From these rodent studies, we hypothesise that adolescent increases in circulating testosterone may be a driver of increased dopamine activity in the brains of individuals susceptible to psychosis and schizophrenia,” said senior Neuroscience Research Australia researcher and author of the study, Dr Tertia Purves-Tyson, who is presenting her work at the International Congress on Schizophrenia Research in Florida this week.

Dr Philip Mitchell, Scientia Professor and Head of the School of Psychiatry at the University of NSW, said the research was very interesting.

“The relationship between sex steroids, such as testosterone, and psychiatric disorders has long intrigued researchers. For example, we have known for many years that schizophrenia presents earlier in males than females, but the biological mechanism for this has been poorly understood,” said Dr Mitchell, who was not involved in the study.

“The rodent study by Professor Shannon Weickert from the School of Psychiatry at UNSW and NeuRA is therefore of particular interest. This study suggests an important interplay between circulating testosterone levels and the brain’s sensitivity to dopamine – a neurochemical which has been long implicated in the cause of schizophrenia,” said Dr Mitchell.

“This study suggests that it is the interplay between testosterone and dopamine which is critical. This is an important observation which may very well throw an important light on solving the puzzle of the biological causes of schizophrenia.”

Cognitive thinking

A separate study by Dr Thomas Weickert at Neuroscience Research Australia examined the role testosterone plays in the cognitive thinking skills of men with schizophrenia.

The researchers examined testosterone levels in a group of 29 chronically ill men with schizophrenia or schizoaffective disorder, and a control group of 20 healthy men and asked both groups to take a series of cognition tests.

“Circulating testosterone levels significantly predicted performance on verbal memory, processing speed, and working memory in men with schizophrenia … such that increased normal levels of testosterone were beneficial to thought processing in men with schizophrenia but circulating sex steroid levels did not appear to be related to cognitive function in healthy men,” the researchers reported.

“The results suggest that circulating sex steroids may influence thought processes in men with schizophrenia.”

Dr Melanie McDowall, a researcher at the University of Adelaide’s Robinson Institute, said the study added to a large body of evidence demonstrating a link between testosterone and schizophrenia.

“This is not surprising, given the link between testosterone and dopamine,” she said, adding that symptoms of schizophrenia predominantly began after puberty.

“However, as with most endocrine and mental illnesses, schizophrenia is multifaceted (genetic, environmental, etc.), hence this may not be the be-all and end-all.”

(Source: theconversation.com)


E-tattoo monitors brainwaves and baby bump

Mind reading can be as simple as slapping a sticker on your forehead. An “electronic tattoo” containing flexible electronic circuits can now record some complex brain activity as accurately as an EEG. The tattoo could also provide a cheap way to monitor a developing fetus.

The first electronic tattoo appeared in 2011, when Todd Coleman at the University of California, San Diego, and colleagues designed a transparent patch containing electronic circuits as thin as a human hair. Applied to skin like a temporary tattoo, these could be used to monitor electrophysiological signals associated with the heart and muscles, as well as rudimentary brain activity.

To improve its usefulness, Coleman’s group has now optimised the placement of the electrodes to pick up more complex brainwaves. They have demonstrated this by monitoring so-called P300 signals in the forebrain. These appear when you pay attention to a stimulus. The team showed volunteers a series of images and asked them to keep track of how many times a certain object appeared. Whenever volunteers noticed the object, the tattoo registered a blip in the P300 signal.
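The P300 measurement described here rests on time-locked averaging: single trials are dominated by noise, but averaging many epochs per stimulus class leaves the stimulus-locked deflection. The sketch below uses synthetic EEG with assumed numbers (sampling rate, amplitudes, trial counts are not from the study) to show the idea.

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 250                       # assumed sampling rate in Hz
t = np.arange(0, 0.8, 1 / fs)  # one 800 ms epoch per stimulus

def epoch(is_target):
    """Synthetic single-trial EEG: Gaussian noise, plus a positive
    deflection near 300 ms after onset on attended (target) trials."""
    signal = rng.standard_normal(t.size)
    if is_target:
        signal += 3.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
    return signal

# Oddball-style analysis: average many epochs time-locked to each
# stimulus class; the P300 survives averaging, the noise does not.
target_avg = np.mean([epoch(True) for _ in range(80)], axis=0)
standard_avg = np.mean([epoch(False) for _ in range(80)], axis=0)
peak_time = t[np.argmax(target_avg)]
```

The averaged target epochs show a clear peak near 300 ms that the standard epochs lack, which is the “blip in the P300 signal” the tattoo registers whenever a volunteer notices the object.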

The tattoo was as good as conventional EEG at telling whether a person was looking at the target image or another stimulus, the team told a recent Cognitive Neuroscience Society meeting in San Francisco.

The team is now modifying the tattoo to transmit data wirelessly to a smartphone, Coleman says. Eventually, he hopes the device could identify other complex patterns of brain activity, such as those that might be used to control a prosthetic limb.

For now, the group is focusing on optimising the tattoo for use in conditions such as depression and Alzheimer’s disease, each of which has characteristic patterns of neural activity. People with depression could wear the tattoo for an extended period, allowing it to help gauge whether medication is working. “The number one advantage is the medical ease of application,” says Michael Pitts of Reed College in Portland, Oregon.

Because its electronic components are already mass-produced, the tattoo can also be made very cheaply.

That means it might also lend itself to pregnancy monitoring in developing countries. With help from the Bill & Melinda Gates Foundation, Coleman’s group is working on an unobtrusive version of the tattoo that monitors signals such as maternal contractions and fetal heart rate.


Hitting ‘reset’ in protein synthesis restores myelination, suggests new treatment for misfolded protein diseases, such as CMT, Alzheimer’s

Neuroscientists at UB’s Hunter James Kelly Research Institute show how turning down synthesis of a protein improves nerve, muscle function in common neuropathy.

A potential new treatment strategy for patients with Charcot-Marie-Tooth disease is on the horizon, thanks to research by neuroscientists now at the University at Buffalo’s Hunter James Kelly Research Institute and their colleagues in Italy and England.

The institute is the research arm of the Hunter’s Hope Foundation, established in 1997 by Jim Kelly, Buffalo Bills Hall of Fame quarterback, and his wife, Jill, after their infant son Hunter was diagnosed with Krabbe Leukodystrophy, an inherited fatal disorder of the nervous system. Hunter died in 2005 at the age of eight. The institute conducts research on myelin and its related diseases with the goal of developing new ways of understanding and treating conditions such as Krabbe disease and other leukodystrophies.

Charcot-Marie-Tooth or CMT disease, which affects the peripheral nerves, is among the most common of hereditary neurological disorders; it is a disease of myelin and it results from misfolded proteins in cells that produce myelin.

The new findings were published online earlier this month in The Journal of Experimental Medicine.

They may have relevance for other diseases that result from misfolded proteins, including Alzheimer’s disease, Parkinson’s, multiple sclerosis, Type 1 diabetes, cancer and mad cow disease.

The paper shows that missteps in translational homeostasis, the process of regulating new protein production so that cells maintain a precise balance between lipids and proteins, may be how some genetic mutations in CMT cause neuropathy.

CMT neuropathies are common, hereditary and progressive; in severe cases, patients end up in wheelchairs. These diseases significantly affect quality of life but not longevity, taking a major toll on patients, families and society, the researchers note.

“It’s possible that our finding could lead to the development of an effective treatment not just for CMT neuropathies but also for other diseases related to misfolded proteins,” says Lawrence Wrabetz, MD, director of the institute and professor of neurology and biochemistry in UB’s School of Medicine and Biomedical Sciences and senior author on the paper. Maurizio D’Antonio, of the Division of Genetics and Cell Biology of the San Raffaele Scientific Institute in Milan is first author; Wrabetz did most of this research while he was at San Raffaele, prior to coming to UB.

The research finding centers around the synthesis of misfolded proteins in Schwann cells, which make myelin in nerves. Myelin is the crucial fatty material that wraps the axons of neurons and allows them to signal effectively. Many CMT neuropathies are associated with mutations in a gene known as P0, which glues the wraps of myelin together. Wrabetz has previously shown in experiments with transgenic mice that those mutations cause the myelin to break down, which in turn, causes degeneration of peripheral nerves and wasting of muscles.

When cells recognize that the misfolded proteins are being synthesized, cells respond by severely reducing protein production in an effort to correct the problem, Wrabetz explains. The cells commence protein synthesis again when a protein called Gadd34 gets involved.

“After cells have reacted to, and corrected, misfolding of proteins, the job of Gadd34 is to turn protein synthesis back on,” says Wrabetz. “What we have shown is that once Gadd34 is turned back on, it activates synthesis of proteins at a level that’s too high—that’s what causes more problems in myelination.

“We have provided proof of principle that Gadd34 causes a problem with translational homeostasis and that’s what causes some neuropathies,” says Wrabetz. “We’ve shown that if we just reduce Gadd34, we actually get better myelination. So, leaving protein synthesis turned partially off is better than turning it back on, completely.”

In both cultures and a transgenic mouse model of CMT neuropathies, the researchers improved myelin by reducing Gadd34 with salubrinal, a small molecule research drug. While salubrinal is not appropriate for human use, Wrabetz and colleagues at UB and elsewhere are working to develop derivatives that are appropriate.

“If we can demonstrate that a new version of this molecule is safe and effective, then it could be part of a new therapeutic strategy for CMT and possibly other misfolded protein diseases as well,” says Wrabetz.

And while CMT is the focus of this particular research, the work is helping scientists at the Hunter James Kelly Research Institute enrich their understanding of myelin disorders in general.

“What we learn in one disease, such as CMT, may inform how we think about toxins for others, such as Krabbe’s,” Wrabetz says. “We’d like to build a foundation and answer basic questions about where and when toxicity in diseases begins.”

The misfolded protein diseases are an interesting and challenging group of diseases to study, he continues. “CMT, for example, is caused by mutations in more than 40 different genes,” he says. “When there are so many different genes involved and so many different mechanisms, you have to find a unifying mechanism: this problem of Gadd34 turning protein synthesis on at too high a level could be one unifying mechanism. The hope is that this proof of principle applies to more than just CMT and may lead to improved treatments for Alzheimer’s, Parkinson’s, Type 1 diabetes and the other diseases caused by misfolded proteins.”

(Source: buffalo.edu)


Pathway Competition Affects Early Differentiation of Higher Brain Structures

Sand-dwelling and rock-dwelling cichlids living in East Africa’s Lake Malawi share a nearly identical genome, but have very different personalities. The territorial rock-dwellers live in communities where social interactions are important, while the sand-dwellers are itinerant and less aggressive.

Those behavioral differences likely arise from a complex region of the brain known as the telencephalon, which governs communication, emotion, movement and memory in vertebrates – including humans, where a major portion of the telencephalon is known as the cerebral cortex. A study published this week in the journal Nature Communications shows how the strength and timing of competing molecular signals during brain development have generated natural and presumably adaptive differences in the telencephalon much earlier than scientists had previously believed.

In the study, researchers first identified key differences in gene expression between rock- and sand-dweller brains during development, and then used small molecules to manipulate developmental pathways to mimic natural diversity.

“We have shown that the evolutionary changes in the brains of these fishes occur really early in development,” said Todd Streelman, an associate professor in the School of Biology and the Petit Institute for Bioengineering and Biosciences at the Georgia Institute of Technology. “It’s generally been thought that early development of the brain must be strongly buffered against change. Our data suggest that rock-dweller brains differ from sand-dweller brains – before there is a brain.”

For humans, the research could lead scientists to look for subtle changes in brain structures earlier in the development process. This could provide a better understanding of how disorders such as autism and schizophrenia could arise during very early brain development.

The research was supported by the National Science Foundation and published online April 23 by the journal.

“We want to understand how the telencephalon evolves by looking at genetics and developmental pathways in closely related species from natural populations,” said Jonathan Sylvester, a postdoctoral researcher in the Georgia Tech School of Biology and lead author of the paper. “Adult cichlids have a tremendous amount of variation within the telencephalon, and we investigated the timing and cause of these differences. Unlike many previous studies in laboratory model organisms that focus on large, qualitative effects from knocking out single genes, we demonstrated that brain diversity evolves through quantitative tuning of multiple pathways.”

In examining the fish from embryos to adulthood, the researchers found that the mbuna, or rock-dwellers, tended to exhibit a larger ventral portion of the telencephalon, called the subpallium – while the sand-dwellers tended to have a larger version of the dorsal structure known as the pallium. These structures seem to have evolved differently over time to meet the behavioral and ecological needs of the fishes. The team showed that early variation in the activity of developmental signals expressed as complementary dorsal-ventral gradients, known technically as “Wingless” and “Hedgehog,” is involved in creating those differences during the neural plate stage, as a single sheet of neural tissue folds to form the neural tube.

To specifically manipulate those two pathways, Sylvester removed clutches of between 20 and 40 eggs from brooding female cichlids, which normally incubate fertilized eggs in their mouths. At about 36 to 48 hours after fertilization, groups of eggs were exposed to small-molecule chemicals that either strengthened or weakened the Hedgehog signal, or strengthened or weakened the Wingless signal. The chemical treatment came while the structures that would become the brain were little more than a sheet of cells. After treatment, water containing the chemicals was replaced with fresh water, and the embryos were allowed to continue their development.

“We were able to artificially manipulate these pathways in a way that we think evolution might have worked to shift the process of rock-dweller telencephalon development to sand-dweller development, and vice-versa. Treatment with small molecules allows us incredible temporal and dose precision in manipulating natural development,” Sylvester explained. “We then followed the development of the embryos until we were able to measure the anatomical structures – the size of the pallium and subpallium – to see that we had transformed one to the other.”

The two different brain regions, the dorsal pallium and ventral subpallium, give rise to excitatory and inhibitory neurons in the forebrain. Altering the relative sizes of these regions might change the balance between these neuronal types, ultimately producing behavioral changes in the adult fish.

“Evolution has fine-tuned some of these developmental mechanisms to produce diversity,” Streelman said. “In this study, we have figured out which ones.”

The researchers studied six different species of East African cichlids, and also worked with collaborators at King’s College in London to apply similar techniques in the zebrafish.

As a next step, the researchers would like to follow the embryos through to adulthood to see if the changes seen in embryonic and juvenile brain structures actually do change behavior of adults. It’s possible, said Streelman, that later developmental events could compensate for the early differences.

The results could be of interest to scientists investigating human neurological disorders that result from an imbalance between excitatory and inhibitory neurons. Those disorders include autism and schizophrenia. “We think it is particularly interesting that there may be some adaptive variation in the natural proportions of excitatory versus inhibitory neurons in the species we study, correlated with their natural behavioral differences,” said Streelman.

Pathway Competition Affects Early Differentiation of Higher Brain Structures

Sand-dwelling and rock-dwelling cichlids living in East Africa’s Lake Malawi share a nearly identical genome, but have very different personalities. The territorial rock-dwellers live in communities where social interactions are important, while the sand-dwellers are itinerant and less aggressive.

Those behavioral differences likely arise from a complex region of the brain known as the telencephalon, which governs communication, emotion, movement and memory in vertebrates – including humans, where a major portion of the telencephalon is known as the cerebral cortex. A study published this week in the journal Nature Communications shows how the strength and timing of competing molecular signals during brain development have generated natural and presumably adaptive differences in the telencephalon much earlier than scientists had previously believed.

In the study, researchers first identified key differences in gene expression between rock- and sand-dweller brains during development, and then used small molecules to manipulate developmental pathways to mimic natural diversity.

“We have shown that the evolutionary changes in the brains of these fishes occur really early in development,” said Todd Streelman, an associate professor in the School of Biology and the Petit Institute for Bioengineering and Biosciences at the Georgia Institute of Technology. “It’s generally been thought that early development of the brain must be strongly buffered against change. Our data suggest that rock-dweller brains differ from sand-dweller brains – before there is a brain.”

For humans, the research could lead scientists to look for subtle changes in brain structures earlier in the development process. This could provide a better understanding of how disorders such as autism and schizophrenia could arise during very early brain development.

The research was supported by the National Science Foundation and published online April 23 by the journal.

“We want to understand how the telencephalon evolves by looking at genetics and developmental pathways in closely-related species from natural populations,” said Jonathan Sylvester, a postdoctoral researcher in the Georgia Tech School of Biology and lead author of the paper. “Adult cichlids have a tremendous amount of variation within the telencephalon, and we investigated the timing and cause of these differences. Unlike many previous studies in laboratory model organisms that focus on large, qualitative effects from knocking out single genes, we demonstrated that brain diversity evolves through quantitative tuning of multiple pathways.”

In examining the fish from embryos to adulthood, the researchers found that the mbuna, or rock-dwellers, tended to exhibit a larger ventral portion of the telencephalon, called the subpallium – while the sand-dwellers tended to have a larger dorsal structure, known as the pallium. These structures seem to have evolved differently over time to meet the behavioral and ecological needs of the fishes. The team showed that early variation in the activity of developmental signals expressed as complementary dorsal-ventral gradients – known technically as “Wingless” and “Hedgehog” – is involved in creating those differences during the neural plate stage, when a single sheet of neural tissue folds to form the neural tube.

To specifically manipulate those two pathways, Sylvester removed clutches of between 20 and 40 eggs from brooding female cichlids, which normally incubate fertilized eggs in their mouths. At about 36 to 48 hours after fertilization, groups of eggs were exposed to small-molecule chemicals that either strengthened or weakened the Hedgehog signal, or strengthened or weakened the Wingless signal. The chemical treatment came while the structures that would become the brain were little more than a sheet of cells. After treatment, water containing the chemicals was replaced with fresh water, and the embryos were allowed to continue their development.
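
The logic of that gradient competition can be sketched as a toy model – not the study’s actual analysis, and with gradient shapes, treatment strengths, and the baseline split invented purely for illustration:

```python
import numpy as np

def fate_fractions(hh_strength=1.0, wnt_strength=1.0, n=100):
    """Toy model of complementary morphogen gradients along the
    dorsal-ventral axis of the neural plate.  A cell adopts the
    dorsal (pallial) fate where Wingless signaling exceeds Hedgehog,
    and the ventral (subpallial) fate otherwise.  All numbers are
    illustrative, not measured values."""
    x = np.linspace(0.0, 1.0, n)        # position: 0 = ventral, 1 = dorsal
    hedgehog = hh_strength * (1.0 - x)  # high ventrally, fades dorsally
    wingless = wnt_strength * x         # high dorsally, fades ventrally
    pallium_fraction = float(np.mean(wingless > hedgehog))
    return pallium_fraction, 1.0 - pallium_fraction

# Balanced gradients: roughly even pallium/subpallium split.
p0, s0 = fate_fractions()

# A hypothetical Hedgehog "agonist" enlarges the subpallium
# (rock-dweller direction); a hypothetical antagonist enlarges
# the pallium instead (sand-dweller direction).
p_agonist, s_agonist = fate_fractions(hh_strength=2.0)
p_antagonist, s_antagonist = fate_fractions(hh_strength=0.5)
```

Strengthening one gradient simply shifts the fate boundary along the axis, shrinking one territory and enlarging the other – the qualitative outcome the small-molecule treatments were designed to produce.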

“We were able to artificially manipulate these pathways in a way that we think evolution might have worked to shift the process of rock-dweller telencephalon development to sand-dweller development, and vice-versa. Treatment with small molecules allows us incredible temporal and dose precision in manipulating natural development,” Sylvester explained. “We then followed the development of the embryos until we were able to measure the anatomical structures – the size of the pallium and subpallium – to see that we had transformed one to the other.”

The two different brain regions, the dorsal pallium and ventral subpallium, give rise to excitatory and inhibitory neurons in the forebrain. Altering the relative sizes of these regions might change the balance between these neuronal types, ultimately producing behavioral changes in the adult fish.

“Evolution has fine-tuned some of these developmental mechanisms to produce diversity,” Streelman said. “In this study, we have figured out which ones.”

The researchers studied six different species of East African cichlids, and also worked with collaborators at King’s College in London to apply similar techniques in the zebrafish.

As a next step, the researchers would like to follow the embryos through to adulthood to see if the changes seen in embryonic and juvenile brain structures actually do change behavior of adults. It’s possible, said Streelman, that later developmental events could compensate for the early differences.

The results could be of interest to scientists investigating human neurological disorders that result from an imbalance between excitatory and inhibitory neurons. Those disorders include autism and schizophrenia. “We think it is particularly interesting that there may be some adaptive variation in the natural proportions of excitatory versus inhibitory neurons in the species we study, correlated with their natural behavioral differences,” said Streelman.

Filed under brain development cichlids gene expression evolution telencephalon cerebral cortex neuroscience science

91 notes

How the brain folds to fit

During fetal development of the mammalian brain, the cerebral cortex undergoes a marked expansion in surface area in some species; in those with the greatest expansion in neuron number and surface area, this growth is accommodated by folding of the tissue. Researchers have now identified a key regulator of this crucial process.


Different regions of the mammalian brain are devoted to the performance of specific tasks. This in turn imposes particular demands on their development and structural organization. In the vertebrate forebrain, for instance, the cerebral cortex – which is responsible for cognitive functions – is remarkably expanded and extensively folded exclusively in mammalian species. The greater the degree of folding and the more furrows present, the larger the surface area available for the reception and processing of neural information. In humans, the exterior of the developing brain remains smooth until about the sixth month of gestation. Only then do superficial folds begin to appear, and they ultimately come to dominate the entire brain. Mice, by contrast, have a much smaller and smooth cerebral cortex.

“The mechanisms that control the expansion and folding of the brain during fetal development have so far been mysterious,” says Professor Magdalena Götz, a professor at the Institute of Physiology at LMU and Director of the Institute for Stem Cell Research at the Helmholtz Center Munich. Götz and her team have now pinpointed a major player in the molecular process that drives cortical expansion in the mouse. They were able to show that a novel nuclear protein called Trnp1 triggers the enormous increase in the number of nerve cells that forces the cortex to undergo a complex series of folds. Indeed, although the normal mouse brain has a smooth appearance, dynamic regulation of Trnp1 activates all the processes necessary for the formation of a much enlarged and folded cerebral cortex.

Levels of Trnp1 control expansion and folding
“Trnp1 is critical for the expansion and folding of the cerebral cortex, and its expression level is dynamically controlled during development,” says Götz. In the early embryo, Trnp1 is locally expressed at high concentrations. This promotes the proliferation of self-renewing multipotent neural stem cells and supports tangential expansion of the cerebral cortex. The subsequent fall in Trnp1 levels is associated with an increase in the numbers of various intermediate progenitors and basal radial glial cells. This results in the ordered formation and migration of a much larger number of neurons, which form folds in the growing cortex.
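
One way to picture this two-phase logic is a deliberately crude sketch in which a high Trnp1 level expands the stem-cell sheet and a low level converts it into neurons; the levels, threshold, and amplification factor below are invented for illustration and are not from the paper:

```python
def cortical_growth(trnp1_levels, stem_cells=100):
    """Toy two-phase reading of Trnp1 action.  While Trnp1 is high,
    self-renewing divisions expand the stem-cell sheet tangentially;
    once Trnp1 falls, intermediate progenitors amplify and produce
    neurons – the surplus that would force the cortex to fold.
    Thresholds and multipliers are illustrative only."""
    neurons = 0
    for level in trnp1_levels:
        if level > 0.5:               # high Trnp1: symmetric self-renewal
            stem_cells *= 2
        else:                         # low Trnp1: progenitor amplification
            neurons += stem_cells * 4
    return stem_cells, neurons

# High-then-low schedule (as in normal development): a large sheet
# AND a late burst of neurons.
stem, neurons = cortical_growth([0.9, 0.9, 0.2, 0.2])

# Permanently high Trnp1: the sheet keeps expanding but no neurons form.
stem_hi, neurons_hi = cortical_growth([0.9, 0.9, 0.9, 0.9])
```

In this toy picture, the timing of the fall in Trnp1 – not a separate folding signal – determines how many neurons pile up in the growing cortex.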

The findings are particularly striking because they imply that the same molecule – Trnp1 – controls both the expansion and the folding of the cerebral cortex and is even sufficient to induce folding in a normally smooth cerebral cortex. Trnp1 therefore serves as an ideal starting point from which to dissect the complex network of cellular and molecular interactions that underpin the whole process. Götz and her colleagues are now embarking on the next step in this exciting journey – determination of the molecular function of this novel nuclear protein Trnp1 and how it is regulated. (Cell 2013)

(Source: en.uni-muenchen.de)

Filed under mammalian brain cerebral cortex fetal development cognitive functioning neuroscience science

95 notes

Longer Days Bring ‘Winter Blues’—For Rats, Not Humans

Most of us are familiar with the “winter blues,” the depression-like symptoms known as “seasonal affective disorder,” or SAD, which occur when the shorter days of winter limit our exposure to natural light and make us more lethargic, irritable and anxious. But for rats it’s just the opposite.

Biologists at UC San Diego have found that rats experience more anxiety and depression when the days grow longer. More importantly, they discovered that the rat’s brain cells adopt a new chemical code when subjected to large changes in the day and night cycle, flipping a switch to allow an entirely different neurotransmitter to stimulate the same part of the brain.

Their surprising discovery, detailed in the April 26 issue of Science, demonstrates that the adult mammalian brain is much more malleable than neurobiologists once thought. Because rat brains are very similar to human brains, the finding also provides greater insight into the light-linked behavioral changes in our own brains. And it opens the door to new ways of treating brain disorders such as Parkinson’s disease, which is caused by the death of dopamine-generating cells in the brain.

The neuroscientists discovered that rats exposed for one week to 19 hours of darkness and five hours of light every day had more nerve cells making dopamine, which made them less stressed and anxious when measured using standardized behavioral tests. Meanwhile, rats exposed for a week to the reverse—19 hours of light and five hours of darkness—had more neurons synthesizing the neurotransmitter somatostatin, making them more stressed and anxious.
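
The reported relationship can be condensed into a toy decision rule – a restatement of the article’s observations, not a model from the paper; the 12-hour threshold is an assumed simplification:

```python
def predicted_transmitter(light_hours, nocturnal=True):
    """Toy restatement of the observed switch: long days stress a
    nocturnal animal (somatostatin gain), while long nights calm it
    (dopamine gain); for a diurnal animal the pattern inverts.
    The 12-hour cutoff is an assumption for illustration."""
    long_days = light_hours > 12
    stressed = long_days if nocturnal else not long_days
    return "somatostatin" if stressed else "dopamine"
```

So a rat on the 19-hour-light schedule lands on the somatostatin side, while a diurnal human facing short winter days lands on the stressed side as well – the mirror image the researchers describe.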

“We’re diurnal and rats are nocturnal,” said Nicholas Spitzer, a professor of biology at UC San Diego and director of the Kavli Institute for Brain and Mind. “So for a rat, it’s the longer days that produce stress, while for us it’s the longer nights that create stress.”

Because rats explore and search for food at night, while humans evolved as creatures who hunt and forage during the daylight hours, such differences in brain chemistry and behavior make sense. Evolutionary changes presumably favored humans who were more active gatherers of food during the longer days of summer and saved their energy during the shorter days of winter.

“Light is what wakes us up and if we feel depressed we go for a walk outside,” said Davide Dulcis, a research scientist in Spitzer’s laboratory and the first author of the study. “When it’s spring, I feel more motivation to do the things I like to do because the days are longer. But for the rat, it’s just the opposite. Because rats are nocturnal, they’re less stressed at night, which is good because that’s when they can spend more time foraging or eating.”

But how did our brains change when humans evolved millions of years ago from small nocturnal rodents to diurnal creatures to accommodate those behavioral changes?

“We think that somewhere in the brain there’s been a change,” said Spitzer. “Sometime in the evolution from rat to human there’s been an evolutionary adjustment of circuitry to allow switching of neurotransmitters in the opposite direction in response to the same exposure to a balance of light and dark.”

A study published earlier this month in the American Journal of Preventive Medicine found a seasonal pattern in human stress that echoes the rats’ sensitivity to the light-dark cycle, at least when it comes to people searching the internet for information about mental illness in the winter versus the summer. Using Google’s search data from 2006 to 2010, a team of researchers led by John Ayers of San Diego State University found that mental health searches on Google were, in general, 14 percent higher in the winter in the United States and 11 percent higher in the Australian winter.

“Now that we know that day length can switch transmitters and change behavior, there may be a connection,” said Spitzer.

In their rat experiments, the UC San Diego neuroscientists found that the switch in transmitter synthesis in the rat’s brain cells from dopamine to somatostatin or back again was not due to the growth of new neurons, but to the ability of the same neurons to produce different neurotransmitters.

Rats exposed to 19 hours of darkness every 24 hours during the week showed higher numbers of dopamine neurons within their brains and were more likely, the researchers found, to explore the open end of an elevated maze, a behavioral test showing they were less anxious. These rats were also more willing to swim, another laboratory test that showed they were less stressed.

“Because rats are nocturnal animals, they like to explore during the night and dopamine is a key part of our and their reward system,” said Spitzer. “It’s part of what allows them to be confident and reduce anxiety.”

The researchers said they don’t know precisely how this neurotransmitter switch works. Nor do they know what proportion of light and darkness or stress triggers this switch in brain chemistry. “Is it 50-50? Or 80 percent light versus dark and 20 percent stress? We don’t know,” added Spitzer. “If we just stressed the animal and didn’t change their photoperiod, would that lead to changes in transmitter identity? We don’t know, but those are all doable experiments.”

But as they learn more about this trigger mechanism, they said one promising avenue for human application might be to use this neurotransmitter switch to deliver dopamine effectively to parts of the brain that no longer receive dopamine in Parkinson’s patients.

“We could switch to a parallel pathway to put dopamine where it’s needed with fewer side effects than pharmacological agents,” said Dulcis.

Filed under seasonal affective disorder SAD rats neurotransmitters dopamine neurons somatostatin neuroscience science

103 notes

Scientists Create Novel Approach to Find RNAs Involved in Long-term Memory Storage

Despite decades of research, relatively little is known about the identity of RNA molecules that are transported as part of the molecular process underpinning learning and memory.

Now, working together, scientists from the Florida campus of The Scripps Research Institute (TSRI), Columbia University and the University of Florida, Gainesville, have developed a novel strategy for isolating and characterizing a substantial number of RNAs transported from the cell body of a neuron (nerve cell) to the synapse, the small gap separating neurons that enables cell-to-cell communication.

Using this new method, the scientists were able to identify nearly 6,000 transcripts (RNA sequences) from the genome of Aplysia, a sea slug widely used in scientific investigation.

The scientists’ target is known as the synaptic transcriptome—roughly the complete set of RNA molecules transported from the neuronal cell body to the synapse.

In the study, published recently in the journal Proceedings of the National Academy of Sciences, the scientists focused on the RNA transport complexes that interact with the molecular motor kinesin; kinesin proteins move along filaments known as microtubules in the cell and carry various gene products during the early stage of memory storage.

While neurons use active transport mechanisms such as kinesin to deliver RNA cargos to synapses, once the cargos arrive at their synaptic destination that delivery service ends and other, more localized mechanisms take over – in much the same way that a traveler’s bags get handed off to the hotel doorman once the taxi has dropped the traveler at the entrance.

The scientists identified thousands of these unique sequences of both coding and noncoding RNAs. As it turned out, several of these RNAs play key roles in the maintenance of synaptic function and growth.

The scientists also uncovered several antisense RNAs (complementary strands that can inhibit gene expression), although what their function at the synapse might be remains unknown.

“Our analyses suggest that the transported RNAs are surprisingly diverse,” said Sathya Puthanveettil, a TSRI assistant professor who designed the study. “It also brings up an important question of why so many different RNAs are transported to synapses. One reason may be that they are stored there to be used later to help maintain long-term memories.”

The team’s new approach offers the advantage of avoiding the dissection of neuronal processes to identify synaptically localized RNAs by focusing on transport complexes instead, Puthanveettil said. This new approach should help in better understanding changes in localized RNAs and their role in local translation as molecular substrates, not only in memory storage, but also in a variety of other physiological conditions, including development.

“New protein synthesis is a prerequisite for maintaining long-term memory,” he said, “but you don’t need this kind of transport forever, so it raises many questions that we want to answer. What molecules need to be synthesized to maintain memory? How long is this collection of RNAs stored? What localized mechanisms come into play for memory maintenance?”

(Source: scripps.edu)

Filed under memory LTM RNA molecules aplysia synaptic transcriptome neuroscience science

127 notes

Decision Making: From Neuroscience to Psychiatry

Adaptive behaviors increase the likelihood of survival and reproduction and improve the quality of life. However, it is often difficult to identify optimal behaviors in real life due to the complexity of the decision maker’s environment and social dynamics. As a result, although many different brain areas and circuits are involved in decision making, evolutionary and learning solutions adopted by individual decision makers sometimes produce suboptimal outcomes. Although these problems are exacerbated in numerous neurological and psychiatric disorders, their underlying neurobiological causes remain incompletely understood. In this review, theoretical frameworks in economics and machine learning and their applications in recent behavioral and neurobiological studies are summarized. Examples of such applications in clinical domains are also discussed for substance abuse, Parkinson’s disease, attention-deficit/hyperactivity disorder, schizophrenia, mood disorders, and autism. Findings from these studies have begun to lay the foundations necessary to improve diagnostics and treatment for various neurological and psychiatric disorders.
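
As one concrete example of the machine-learning frameworks such reviews draw on – a generic reinforcement-learning sketch, not taken from this review – an epsilon-greedy learner shows how value learning from reward feedback can still settle on suboptimal choices:

```python
import random

def epsilon_greedy_bandit(p_rewards, epsilon=0.1, trials=1000, seed=0):
    """Minimal reinforcement-learning example: estimate each option's
    value from reward feedback, usually exploit the best estimate,
    occasionally explore.  Returns the fraction of trials on which
    the objectively best option was chosen."""
    rng = random.Random(seed)
    n = len(p_rewards)
    values, counts = [0.0] * n, [0] * n
    best = max(range(n), key=lambda a: p_rewards[a])
    best_picks = 0
    for _ in range(trials):
        if rng.random() < epsilon:            # explore at random
            a = rng.randrange(n)
        else:                                 # exploit current estimates
            a = max(range(n), key=lambda i: values[i])
        reward = 1.0 if rng.random() < p_rewards[a] else 0.0
        counts[a] += 1
        values[a] += (reward - values[a]) / counts[a]  # incremental mean
        best_picks += (a == best)
    return best_picks / trials

# With some exploration the learner mostly finds the better option,
# but exploration itself guarantees occasional suboptimal picks.
share = epsilon_greedy_bandit([0.3, 0.7])

# With no exploration the learner can lock onto the worse option forever.
stuck = epsilon_greedy_bandit([0.3, 0.7], epsilon=0.0)
```

Models of this family are what such studies fit to human and animal choice data, reading quantities like learning rate and exploration out of behavior and comparing them across clinical groups.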

Filed under decision making psychiatric disorders neurobiology neuroscience neurological disorders science

59 notes

Melatonin delays ALS symptom onset and death in mice

Melatonin injections delayed symptom onset and reduced mortality in a mouse model of the neurodegenerative condition amyotrophic lateral sclerosis (ALS), or Lou Gehrig’s disease, according to a new study by researchers at the University of Pittsburgh School of Medicine. In a report published online ahead of print in the journal Neurobiology of Disease, the team revealed that receptors for melatonin are found in the nerve cells, a finding that could launch novel therapeutic approaches.

Annually about 5,000 people are diagnosed with ALS, which is characterized by progressive muscle weakness and eventual death due to the failure of respiratory muscles, said senior investigator Robert Friedlander, M.D., UPMC Endowed Professor of neurosurgery and neurobiology and chair, Department of Neurological Surgery, Pitt School of Medicine. But the causes of the condition are not well understood, thwarting development of a cure or even effective treatments.

Melatonin is a naturally occurring hormone that is best known for its role in sleep regulation. After screening more than a thousand FDA-approved drugs several years ago, the research team determined that melatonin is a powerful antioxidant that blocks the release of enzymes that activate apoptosis, or programmed cell death.

“Our experiments show for the first time that a lack of melatonin and melatonin receptor 1, or MT1, is associated with the progression of ALS,” Dr. Friedlander said. “We saw similar results in a Huntington’s disease model in an earlier project, suggesting similar biochemical pathways are disrupted in these challenging neurologic diseases.”

Hoping to stop neuron death in ALS just as they did in Huntington’s, the research team treated mice bred to have an ALS-like disease with injections of melatonin or with a placebo. Compared with the placebo-treated animals, the melatonin group developed symptoms later, survived longer, and had less degeneration of motor neurons in the spinal cord.

“Much more work has to be done to unravel these mechanisms before human trials of melatonin or a drug akin to it can be conducted to determine its usefulness as an ALS treatment,” Dr. Friedlander said. “I suspect that a combination of agents that act on these pathways will be needed to make headway with this devastating disease.”

(Source: eurekalert.org)

Filed under ALS Lou Gehrig's disease nerve cells melatonin cell death neuroscience science
