Neuroscience

Articles and news from the latest research reports.

Posts tagged science

58 notes

Hypnosis study unlocks secrets of unexplained paralysis

Hypnosis has begun to attract renewed interest from neuroscientists interested in using hypnotic suggestion to test predictions about normal cognitive functioning.

To demonstrate the future potential of this growing field, guest editors Professor Peter Halligan of the School of Psychology at Cardiff University and David A. Oakley of University College London brought together leading researchers from cognitive neuroscience and hypnosis to contribute to this month’s special issue of the international journal Cortex.


The issue illustrates how methodological and theoretical advances, using hypnotic suggestion, can return novel and experimentally verifiable insights for the neuroscience of consciousness and motor control. The research also includes novel brain imaging studies, which address sceptics’ concerns regarding the subjective reality and comparability of hypnotically suggested phenomena that previously depended on subjects’ largely unverifiable report and behaviour.

Halligan and Oakley also contribute to a new and revealing brain imaging study in the special issue that explores the brain systems involved in hypnotic paralysis. This research follows their earlier pioneering work on hypnotic leg paralysis reported in the Lancet in 2000.

Patients with “functional” or “psychogenic” conversion disorders present with symptoms, such as paralysis, that are clinically challenging. They comprise between 30 and 40% of patients attending neurology outpatient clinics and place a huge strain on public health services.

Professor Halligan of Cardiff University’s School of Psychology said: “This new study, working with colleagues at the Institute of Psychiatry in London, suggests that hypnosis can provide insights into the brain systems involved in patients who display symptoms of neurological illness, but without evidence of brain damage. New insights show that symptoms experienced by patients with functional or dissociative conversion disorders (e.g. medically unexplained paralysis) can be simulated using targeted hypnotic suggestion.

"In this study we monitored brain activations of healthy volunteers with hypnosis induction who experienced paralysis-like experiences which could be turned ‘on’ and ‘off’. The suggestion resulted in subjects being unable to move a joystick together with a realistic and compelling experience of being unable to move and control their left hand despite trying.

"When compared to the completed movements, the suggested paralysis condition revealed increased activity in brain regions know to be active during motor planning and intention to move – and also brain areas involved in response selection and inhibition."

Comparing symptoms reported by conversion disorder patients with those produced by ‘paralysis’ suggestions in hypnosis has revealed similar patterns of brain activation associated with attempted movement of the affected limb.

These findings could inform future studies of the brain mechanisms underpinning limb paralysis in patients with conversion disorders. More importantly they could lead to effective treatments.

(Source: cardiff.ac.uk)

Filed under brain cognitive function hypnosis hypnotic paralysis brain activation neuroscience science

38 notes

More Than Just Looking – A Role of Tiny Eye Movements Explained
Tübingen researcher learns how the brain keeps an eye on the periphery even when focusing on one object.
Have you ever wondered whether it’s possible to look at two places at once? Because our eyes have a specialized central region with high visual acuity and good color vision, we must always focus on one spot at a time in order to see our environment. As a result, our eyes constantly jump back and forth as we look around.
But what if – when you are looking at an object – your brain also allowed you to “look” somewhere else at the same time, out of the corner of your eye, as it were? Now, a scientist at the Werner Reichardt Centre for Integrative Neuroscience (CIN), which is funded by the German Excellence initiative at Tübingen University, has found a possible explanation for how this might happen.
Ziad Hafed, the leader of the Physiology of Active Vision Junior Research Group at CIN, wondered about the role of a type of tiny eye movement, called a microsaccade, that occurs when we fix our gaze on something. “Microsaccades are sort of enigmatic,” Hafed says. They are movements of the eye which occur at exactly the moment when we are trying to look at something steadily – i.e., when we are trying to prevent our eyes from moving.
It was long thought that microsaccades were nothing but random, inconsequential tics, but Hafed wondered whether the mere unconscious preparation to generate these tiny eye movements can alter visual perception and effectively allow you to “see” out of the corner of your eye. He found that before generating a microsaccade, the brain reorganizes its visual processing to alter how you perceive things. “Imagine that you are the coach of a football team,” Hafed says. “You would normally ask your defenders to spread out across the field in order to provide good coverage during match play. However, in preparation for an upcoming corner kick by your opposing team, you would reorganize your defenders, assigning two of them to become temporary goalkeepers and protect the goal. What I found was evidence for a similar strategy in the visual brain before microsaccades,” says Hafed. That is, in preparation for generating a microsaccade, the brain – the “coach” – causes a subtle reorganization of the visual system, and thus alters how you might see out of the corner of your eye (see diagram).
Using a series of experiments on human participants, coupled with computational modeling of the human visual system, Hafed asked participants to fix their attention on a spot that appeared on a screen in front of them, while he carefully measured their tiny eye movements. Hafed then probed the participants’ ability to look at two places at once by testing their peripheral vision. He found that in preparation to generate a microsaccade, the participants demonstrated remarkable changes in their ability to process visual inputs. In the periphery, these movements effectively improved the capacity to direct visual input – from around where gaze is fixed – towards the brain. Hafed’s results, described in the leading science journal Neuron, thus demonstrate an important functional role for these tiny, “enigmatic” movements of the eye in helping us to perceive our environment.
Hafed’s results not only help us understand a previously puzzling phenomenon; there are also potentially wide-ranging applications arising from this work. In particular, this work can affect how we design computer and machine user interfaces. For example, using knowledge about the whole range of eye movements we constantly make, including microscopic ones, our future “smart user interfaces” can ensure that things likely to attract our attention are not displayed in places where they can be distracting. Conversely, if we need to locate something that should attract our attention – a warning light in a control room, for instance – this same approach will also be useful. As Hafed put it, “eye movements would essentially be a window on our minds.”


Filed under visual perception microsaccades eye movements peripheral vision neuroscience science

105 notes

First snaps made of fetal brains wiring themselves up

The first images have been captured of the fetal brain at different stages of its development. The work gives a glimpse of how the brain’s neural connections form in the womb, and could one day lead to prenatal diagnosis and treatment of conditions such as autism and schizophrenia.

We know little about how the fetal brain grows and functions – not only because it is so small, says Moriah Thomason of Wayne State University in Detroit, but also because “a fetus is doing backflips as we scan it”, making it tricky to get a usable result.

Undeterred, Thomason’s team made a series of functional magnetic resonance imaging (fMRI) scans of the brains of 25 fetuses between 24 and 38 weeks old. Each scan lasted just over 10 minutes, and the team kept only the images taken when the fetus was relatively still.

The researchers used the scans to look at two well-understood features of the developing brain: the spacing of neural connections and the time at which they developed. As expected, the two halves of the fetal brain formed denser and more numerous connections between themselves from one week to the next. The earliest connections tended to appear in the middle of the brain and spread outward as the brain continued to develop.

Thomason says that the team is now scanning up to 100 fetuses at different stages of development. These scans might allow them to start to see variation between individuals. They are also applying algorithms to the scanning program that will help correct for the fetus’s movements, so fewer scans will be needed in future.

Once they understand what a normal fetal brain looks like, the researchers hope to study brains that are forming abnormal connections. Disorders such as schizophrenia or autism, for instance, are believed to start during development and might be due to faulty brain connections. Understanding the patterns that characterise these diseases might one day allow physicians to spot early warning signs and intervene sooner. Just as importantly, such images might improve our understanding of how these conditions develop in the first place, Thomason says.

Emi Takahashi of Boston Children’s Hospital says that one way to do this would be to follow a large group of children after they are born, and look back at the prenatal scans of those who later develop a brain disorder. Although she says the study is a very good first step, understanding the miswiring of the brain is so difficult that it may be some time before the results of such work become useful in clinical settings.

(Source: newscientist.com)

Filed under brain brain development fetal brain neuroimaging neural connections neuroscience science

74 notes

Scientists identify molecular system that could help develop potential treatments for conditions such as Alzheimer’s disease
Scientists from the University of Southampton have identified the molecular system that contributes to the harmful inflammatory reaction in the brain during neurodegenerative diseases.
An important aspect of chronic neurodegenerative diseases, such as Alzheimer’s, Parkinson’s, Huntington’s or prion disease, is the generation of an innate inflammatory reaction within the brain.
Results from the study open new avenues for the regulation of the inflammatory reaction and provide new insights into the understanding of the biology of microglial cells, which play a leading role in the development and maintenance of this reaction.
Dr Diego Gomez-Nicola, from the CNS Inflammation group at the University of Southampton and lead author of the paper, says: “The understanding of microglial biology during neurodegenerative diseases is crucial for the development of potential therapeutic approaches to control the harmful inflammatory reaction. These potential interventions could modify or arrest neurodegenerative diseases like Alzheimer disease.
“The future potential outcomes of this line of research would be rapidly translated into the clinics of neuropathology, and would improve the quality of life of patients with these diseases.”
Microglial cells multiply during different neurodegenerative conditions, although little is known about the extent to which this accounts for the expansion of the microglial population during the development of the disease, or how it is regulated.
Writing in The Journal of Neuroscience, scientists from the University of Southampton describe how they used a laboratory model of neurodegeneration (murine prion disease) to understand the brain’s response to microglial proliferation and to dissect the molecules regulating this process. They found that signalling through a receptor called CSF1R is key to the expansion of the microglial population, and that drugs could therefore target it.
Dr Diego Gomez-Nicola adds: “We have been able to identify that this molecular system is active in human Alzheimer’s disease and variant Creutzfeldt–Jakob disease, pointing to this mechanism being universal for controlling microglial proliferation during neurodegeneration. By means of targeting CSF1R with selective inhibitors we have been able to delay the clinical symptoms of experimental prion disease, also preventing the loss of neurons.”


Filed under neurodegenerative diseases microglial cells inflammatory reaction alzheimer's disease neuroscience science

129 notes

Circadian clock linked to obesity, diabetes and heart attacks
Disruption in the body’s circadian rhythm can lead not only to obesity, but can also increase the risk of diabetes and heart disease.
That is the conclusion of the first study to show definitively that insulin activity is controlled by the body’s circadian biological clock. The study, which was published on Feb. 21 in the journal Current Biology, helps explain why not only what you eat, but when you eat, matters.
The research was conducted by a team of Vanderbilt scientists directed by Professor of Biological Sciences Carl Johnson and Professors of Molecular Physiology and Biophysics Owen McGuinness and David Wasserman.
“Our study confirms that it is not only what you eat and how much you eat that is important for a healthy lifestyle, but when you eat is also very important,” said postdoctoral fellow Shu-qun Shi, who performed the experiment with research assistant Tasneem Ansari in the Vanderbilt University Medical Center’s Mouse Metabolic Phenotyping Center.
In recent years, a number of studies in both mice and men have found a variety of links between the operation of the body’s biological clock and various aspects of its metabolism, the physical and chemical processes that provide energy and produce, maintain and destroy tissue. It was generally assumed that these variations were caused in response to insulin, which is one of the most potent metabolic hormones. However, no one had actually determined that insulin action follows a 24-hour cycle or what happens when the body’s circadian clock is disrupted.
Because they are nocturnal, mice have a circadian rhythm that is the mirror image of that of humans: They are active during the night and sleep during the day. Otherwise, scientists have found that the internal timekeeping systems of the two species operate in nearly the same way at the molecular level. Most types of cells contain their own molecular clocks, all of which are controlled by a master circadian clock in the suprachiasmatic nucleus of the brain.
“People have suspected that our cells’ response to insulin had a circadian cycle, but we are the first to have actually measured it,” said McGuinness. “The master clock in the central nervous system drives the cycle and insulin response follows.”


Filed under circadian clock biological clock suprachiasmatic nucleus insulin insulin resistance obesity medicine science

39 notes

Cooling may prevent trauma-induced epilepsy
In the weeks, months and years after a severe head injury, patients often experience epileptic seizures that are difficult to control. A new study in rats suggests that gently cooling the brain after injury may prevent these seizures.
“Traumatic head injury is the leading cause of acquired epilepsy in young adults, and in many cases the seizures can’t be controlled with medication,” says senior author Matthew Smyth, MD, associate professor of neurological surgery and of pediatrics at Washington University School of Medicine in St. Louis. “If we can confirm cooling’s effectiveness in human trials, this approach may give us a safe and relatively simple way to prevent epilepsy in these patients.”
The researchers reported their findings in Annals of Neurology.
Cooling the brain to protect it from injury is not a new concept. Cooling slows down the metabolic activity of nerve cells, and scientists think this may make it easier for brain cells to survive the stresses of an injury. 
Doctors currently cool infants whose brains may have had inadequate access to blood or oxygen during birth. They also cool some heart attack patients to reduce peripheral brain damage when the heart stops beating.
Smyth has been exploring the possibility of using cooling to prevent seizures or reduce their severity.
“Warmer brain cells seem to be more electrically active, and that may increase the likelihood of abnormal electrical discharges that can coalesce to form a seizure,” Smyth says. “Cooling should have the opposite effect.”
Smyth and colleagues at the University of Washington and the University of Minnesota test potential therapies in a rat model of brain injury. These rats develop chronic seizures weeks after the injury.
Researchers devised a headset that cools the rat brain. They were originally testing its ability to stop seizures when they noticed that cooling seemed to be not only stopping but also preventing seizures.
The scientists then redesigned the study to focus on prevention. Under the new protocols, they fitted some of the rats with headsets that cooled their brains by less than 4 degrees Fahrenheit. Another group of rats wore headsets that did nothing. Scientists who were unaware of which rats they were observing monitored them for seizures during treatment and after the headsets were removed.
Rats that wore the inactive headsets had progressively longer and more severe seizures in the weeks after the injury, whereas rats whose brains had been cooled experienced only a few very brief seizures as long as four months after injury.
Brain injury also tends to reduce cell activity at the site of the trauma, but the cooling headsets restored the normal activity levels of these cells.
The study is the first to reduce injury-related seizures without drugs, according to Smyth, who is director of the Pediatric Epilepsy Surgery program at St. Louis Children’s Hospital.
“Our results show that the brain changes that cause this type of epilepsy happen in the days and weeks after injury, not at the moment of injury or when the symptoms of epilepsy begin,” says Smyth. “If clinical trials confirm that cooling has similar effects in humans, it could change the way we treat patients with head injuries, and for the first time reduce the chance of developing epilepsy after brain injury.”
Smyth and his colleagues have been testing cooling devices in humans in the operating room, and are planning a multi-institutional trial of an implanted focal brain cooling device to evaluate the efficacy of cooling on established seizures.


Filed under brain injury brain damage seizures brain cells nerve cells metabolic activity animal model neuroscience science

124 notes

Where does our head come from?

A research group at the Sars Centre in Bergen has shed new light on the evolutionary origin of the head. In a study published in the journal PLoS Biology they show that in a simple, brainless sea anemone, the same genes that control head development in higher animals regulate the development of the front end of the swimming larvae.

In many animals, the brain is located in a specific structure, the head, together with sensory organs and often together with the mouth. However, there are even more distantly related animals, such as sea anemones and corals, which have a nervous system but no brain.
In this study a research group led by Fabian Rentzsch used the sea anemone Nematostella vectensis to find out if one of the ends of the sea anemone corresponds to the head of higher animals. To do this they studied the function of genes that control head development in higher animals during the embryonic development of the starlet sea anemone.

“Despite looking completely different, it has become clear over the last decade that all animals have a similar repertoire of genes, including those that are required to make the head of higher animals”, says first author and PhD student Chiara Sinigaglia.

Stands on its head
When the sea anemone is in the larval stage, it swims. As an adult, it stands with one end on the sea floor and uses the long tentacles on its upper end to catch small animals, which it stuffs into the only body opening in the middle of the ring of tentacles.

“Based on the appearance of the adult animals, the lower end of these animals has traditionally been called the foot and the upper end the head”, explains Rentzsch.
What the research group found was that in the sea anemone the “head gene” function is located at the end that corresponds to the “foot” of the adult animal. The key was to study the larvae of the sea anemones while they still move around.

“The larvae swim with the ‘foot’ end forward, and this end carries their main sense organ, so at this stage it looks more like this might be their head”, says Rentzsch. And indeed, the “head genes” function on this side of the animals.
Sea anemones and all higher animals, including humans, share a common brainless ancestor which lived between 600 and 700 million years ago.

“By revealing the function of “head genes” in Nematostella, we now understand better how and from where the head and brain of higher animals evolved”, Sinigaglia and Rentzsch explain.

Filed under head development head genes sea anemones nematostella genes evolution science

30 notes

Clues to Fetal Alcohol Risk: Molecular switch promises new targets for diagnosis and therapy

Fetal alcohol syndrome is the leading preventable cause of developmental disorders in developed countries. And fetal alcohol spectrum disorder (FASD), a range of alcohol-related birth defects that includes fetal alcohol syndrome, is thought to affect as many as 1 in 100 children born in the United States.

Any amount of alcohol consumed by the mother during pregnancy poses a risk of FASD, a condition that can include the distinct pattern of facial features and growth retardation associated with fetal alcohol syndrome as well as intellectual disabilities, speech and language delays, and poor social skills. But drinking can have radically different outcomes for different women and their babies. While twin studies have suggested a genetic component to susceptibility to FASD, researchers have had little success identifying who is at greatest risk or what genes are at play.

Research from Harvard Medical School and Veterans Affairs Boston Healthcare System sheds new light on this question, identifying for the first time a signaling pathway that might determine genetic susceptibility for the development of FASD. The study was published online Feb. 19 in the journal Proceedings of the National Academy of Sciences.

“Our work points to candidate genes for FASD susceptibility and identifies a path for the rational development of drugs that prevent ethanol neurotoxicity,” said Michael Charness, chief of staff at VA Boston Healthcare System and HMS professor of neurology. “And importantly, identifying those mothers whose fetuses are most at risk could help providers better target intensive efforts at reducing drinking during pregnancy.”

The discovery also solves a riddle that had intrigued Charness and other researchers for nearly two decades. In 1996, Charness and colleagues discovered that alcohol disrupted the work of a human protein critical to fetal neural development—a major clue to the biological processes of FASD. The protein, L1, projects through the surface of a cell to help it adhere to its neighbors. When Charness and his team introduced the protein to a culture of mouse fibroblast cells, L1 increased cell adhesion. Tellingly, the effect was erased in the presence of ethanol (beverage alcohol).

Charness and his team went on to develop multiple cell lines from that first culture, and that’s where they encountered the riddle: In some of those lines, alcohol disrupted L1’s adhesive effect, while in others it did not.

“How could it be possible that a cell that expresses L1 is completely sensitive to alcohol, and others that express it are completely insensitive?” asked Charness, who is also faculty associate dean for veterans hospital programs at HMS and assistant dean at Boston University School of Medicine.

Clearly, something else was affecting the protein’s sensitivity to alcohol — but what? Studies of twins provided one clue: Identical twins are more likely than fraternal twins to have the same diagnosis, positive or negative, for FASD. “That concordance suggests that there are modifying genes, susceptibility genes, that predispose to this condition,” Charness said.

Filed under fetal alcohol syndrome FASD brain development neural development birth defects proteins neuroscience science

50 notes

First signals from brain nerve cells with ultrathin nanowires

Electrodes implanted in the brain are today used in research and to treat diseases such as Parkinson’s. However, their use has been limited by their size. At Lund University in Sweden, researchers have, for the first time, succeeded in implanting an ultrathin nanowire-based electrode and capturing signals from nerve cells in the brain of a laboratory animal.

The researchers work at Lund University’s Neuronano Research Centre in an interdisciplinary collaboration between experts in subjects including neurophysiology, biomaterials, electrical measurements and nanotechnology. Their electrode is composed of a group of nanowires, each of which measures only 200 nanometres (billionths of a metre) in diameter.
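
To give a sense of scale, the cross-sectional area of a 200-nanometre nanowire can be compared with that of a conventional microwire electrode. The nanowire diameter comes from the article; the 50-micrometre microwire diameter is an assumed, typical figure for comparison, not one the researchers state.

```python
import math

def cross_section_area(diameter_m):
    """Cross-sectional area of a cylindrical wire, given its diameter in metres."""
    return math.pi * (diameter_m / 2) ** 2

nanowire = cross_section_area(200e-9)   # 200 nm nanowire (from the article)
microwire = cross_section_area(50e-6)   # assumed 50 µm conventional microwire
ratio = microwire / nanowire
print(round(ratio))  # → 62500
```

The assumed microwire occupies over 60,000 times the cross-section of a single nanowire, which is why such thin electrodes disturb so much less tissue.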

Such thin electrodes have previously only been used in experiments with cell cultures.

“Carrying out experiments on a living animal is much more difficult. We are pleased that we have succeeded in developing a functioning nano-electrode, getting it into place and capturing signals from nerve cells”, says Professor Jens Schouenborg, who is head of the Neuronano Research Centre.

He sees this as a real breakthrough, but also as only a step on the way. The research group has already worked for several years to develop electrodes that are thin and flexible enough not to disturb the brain tissue, and with material that does not irritate the cells nearby. They now have the first evidence that it is possible to obtain useful nerve signals from nanometre-sized electrodes.

The research will now take a number of directions. The researchers want to try and reduce the size of the base to which the nanowires are attached, improve the connection between the electrode and the electronics that receive the signals from the nerve cells, and experiment with the surface structure of the electrodes to see what produces the best signals without damaging the brain cells.

“In the future, we hope to be able to make electrodes with nanostructured surfaces that are adapted to the various parts of the nerve cells – parts that are no bigger than a few billionths of a metre. Then we could tailor-make each electrode based on where it is going to be placed and what signals it is to capture or emit”, says Jens Schouenborg.

When an electrode is inserted into the brain of a patient or a laboratory animal, it is generally anchored to the skull. This means that it doesn’t move smoothly with the brain, which floats inside the skull, but rather rubs against the surrounding tissue, which in the long term causes the signals to deteriorate. The Lund group’s electrodes will instead be anchored by their surface structure.

“With the right pattern on the surface, they will stay in place yet still move with the body – and the brain – thereby opening the way for long-term monitoring of neurones”, explains Jens Schouenborg.

He praises the collaboration between medics, physicists and others at the Neuronano Research Centre, and mentions physicist Dmitry B. Suyatin in particular, the principal author of the article the researchers have now published in the international journal PLOS ONE.

The overall goal of the Neuronano Research Centre is to develop electrodes that can be inserted into the brain to study learning, pain and other mechanisms, and, in the long term, to treat conditions such as chronic pain, depression and Parkinson’s disease.

(Source: lunduniversity.lu.se)

Filed under nerve signals nerve cells brain tissue electrodes cell cultures neuroscience science

122 notes

Bilingual children have a better “working memory” than monolingual children

A study conducted at the University of Granada and York University in Toronto, Canada, has revealed that bilingual children develop a better working memory –which holds, processes and updates information over short periods of time– than monolingual children. Working memory plays a major role in the execution of a wide range of activities, such as mental calculation (since we have to remember numbers and operate with them) or reading comprehension (given that it requires associating the successive concepts in a text).

The objective of this study –which was published in the latest issue of the Journal of Experimental Child Psychology– was to examine how multilingualism influences the development of working memory and to investigate the association between working memory and the cognitive advantage of bilingual people found in previous studies.

Executive Functions

Working memory includes the structures and processes associated with the storage and processing of information over short periods of time. It is one of the components of the so-called “executive functions”: a set of mechanisms involved in the planning and self-regulation of human behavior. Although working memory develops in the first years of life, it can be trained and improved with experience.
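
Laboratory studies commonly measure this capacity with n-back tasks, where a person must decide whether the current item matches one seen a few steps earlier — holding, updating and discarding items continuously. As an illustration only (the paper’s specific tasks are not described here), a scorer for the classic 2-back variant might look like this:

```python
def two_back_hits(sequence):
    """Return the indices where the current item matches the item seen
    two steps earlier -- the targets in a 2-back working-memory task."""
    return [i for i in range(2, len(sequence)) if sequence[i] == sequence[i - 2]]

# Example stream of stimuli: positions 2, 4 and 5 match two steps back.
print(two_back_hits(["A", "B", "A", "C", "A", "C"]))  # → [2, 4, 5]
```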

According to the principal investigator of this study, Julia Morales Castillo, of the Department of Experimental Psychology of the University of Granada, this study contributes to a better understanding of cognitive development in bilingual and monolingual children. “Other studies have demonstrated that bilingual children are better at planning and cognitive control (i.e. tasks involving ignoring irrelevant information or requiring a dominant response). But, to date, there was no evidence on the influence of bilingualism on working memory.”

The study sample included bilingual children between 5 and 7 years of age (a critical period in the development of working memory). The researchers found that bilingual children performed better than monolingual children in working memory tasks; indeed, the more complex the tasks, the better their relative performance. “The results of this study suggest that bilingualism does not improve working memory in an isolated way but affects the global development of the executive functions, especially when they have to interact with each other”, Morales Castillo states.

Music Education

According to the researcher, the results of this study “contribute to the growing number of studies on the role of experience in cognitive development”. Other studies have demonstrated that children engaged in activities such as music education have better cognitive capacities. “However, we cannot determine to what extent children perform these activities due to other factors, such as talent or personal interest.”

“The children in our study, by contrast, were bilingual for family reasons rather than out of an interest in languages.”

Filed under children cognitive development bilingualism working memory neuroscience psychology science
