Neuroscience

Articles and news from the latest research reports.

55 notes

Problem-solving governs how we process sensory stimuli
Various areas of the brain process our sensory experiences. How the areas of the cerebral cortex communicate with each other and process sensory information has long puzzled neuroscientists. Exploring the sense of touch in mice, brain researchers from the University of Zurich now demonstrate that the transmission of sensory information from one cortical area to connected areas depends on the specific task to be solved and on the animal’s goal-directed behavior. These findings can serve as a basis for an improved understanding of cognitive disorders.
In the mammalian brain, the cerebral cortex plays a crucial role in processing sensory inputs. The cortex can be subdivided into different areas, each handling distinct aspects of perception, decision-making or action. The somatosensory cortex, for instance, comprises the part of the cerebral cortex that primarily processes haptic sensations. The different areas of the cerebral cortex are interconnected and communicate with each other. A central, unanswered question of neuroscience is how exactly these brain areas communicate to process sensory stimuli and produce appropriate behavior. A team of researchers headed by Professor Fritjof Helmchen at the University of Zurich’s Brain Research Institute now provides an answer: The processing of sensory information depends on what you want to achieve. The brain researchers observed that nerve cells in the sensory cortex that connect to distinct brain areas are activated differentially depending on the task to be solved.
Goal-directed processing of sensory information
In their publication in Nature, the researchers studied how mice use their facial whiskers to explore their environment, much like we do in the dark with our hands and fingers. One mouse group was trained to distinguish coarse and fine sandpapers using their whiskers in order to obtain a reward. Another group had to work out the angle at which an object – a metal rod – was located relative to their snout. The neuroscientists measured the activity of neurons in the primary somatosensory cortex using a special microscopy technique. Using simultaneous anatomical staining, they also identified which of these neurons sent their projections to the more remote secondary somatosensory area and to the motor cortex, respectively.
The primary somatosensory neurons with projections to the secondary somatosensory cortex predominantly became active when the mice had to distinguish the surface texture of the sandpaper. Neurons with projections to the motor cortex, on the other hand, were more involved when mice needed to localize the metal rod. These different activity patterns were not evident when mice passively touched sandpaper or metal rods without having been set a task – in other words, when their actions were not motivated by a reward. Thus, the sensory stimuli alone were not sufficient to explain the different pattern of information transfer to the remote brain areas.
Impaired communication in the brain
According to Fritjof Helmchen, the activity in a cortical area can be transmitted to remote areas in a targeted fashion if we have to extract (‘filter’) specific information from the environment to solve a problem. In cognitive disorders such as Alzheimer’s disease, autism and schizophrenia, this communication between brain areas is often disrupted. “A better understanding of how these long-range, interconnected networks in the brain operate might help to develop therapies that re-establish this specific cortical communication,” says Helmchen. The aim would be to thereby improve the impaired cognitive abilities of patients.

Filed under somatosensory cortex haptic sensation neurons cerebral cortex cognitive disorders neuroscience science

45 notes

Turn up the volume? A better way to broadcast over the noise
Traffic, aircraft, mobile devices and personal music equipment are not the only sources of noise pollution. Public address systems have become part of the escalating problem, which, according to the World Health Organization, costs Europeans alone the equivalent of 654,000 years of healthy life annually.
But researchers at Stockholm’s KTH Royal Institute of Technology have developed a way to bring down the volume on loud public announcements while preserving their clarity in noisy environments.
“By manipulating speech before it is sent to the loudspeakers, we can enhance the speech signal and adapt it to the surrounding noise,” says Gustav Eje Henter, a PhD student in Communication Theory at KTH. “This makes it possible to communicate at much lower volume levels than before.”
Earlier approaches to the problem focused on making the speech more prominent, while the KTH researchers are paying attention to what is actually said. They do this by working with computer and machine speech recognition, which is modeled on human hearing. By creating speech that is easier for computers to recognise, people should benefit as well, the researchers say.
“Our manipulation, which is suited for a computer speech recogniser, also makes it easier for people to hear the right thing,” says Petko Petkov, also a PhD student in Communication Theory. “The modified words sound more distinct from each other, making it easier to distinguish them in the noise.”
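The article does not spell out the manipulation itself, but the general idea it describes (reshaping the speech signal according to the surrounding noise without raising the overall volume) can be sketched. The following is a hypothetical toy version, not the actual LISTA method; the band powers, the inverse-noise weighting, and the function name are illustrative assumptions.

```python
import numpy as np

def reweight_speech(speech_band_power, noise_band_power):
    """Shift speech power toward frequency bands where noise masks it least,
    while keeping total output power unchanged (no extra loudness)."""
    speech = np.asarray(speech_band_power, dtype=float)
    noise = np.asarray(noise_band_power, dtype=float)
    # Favor bands with little noise (illustrative inverse-noise weighting).
    gains = 1.0 / (noise + 1e-9)
    shaped = speech * gains
    # Renormalize so total power matches the original signal.
    shaped *= speech.sum() / shaped.sum()
    return shaped

speech = [4.0, 4.0, 4.0]   # equal power in three bands
noise = [8.0, 1.0, 1.0]    # heavy noise in the lowest band
out = reweight_speech(speech, noise)
# Energy moves out of the noisy band; total power is preserved.
print(out[0] < speech[0], abs(out.sum() - sum(speech)) < 1e-6)
```

The equal-power constraint is the point: clarity improves by redistribution rather than by turning the volume up.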
Petkov and Henter have developed their method together with Professor Bastiaan Kleijn as part of the European collaborative LISTA (Listening Talker) project.
A recent global evaluation by the LISTA Consortium at the University of Edinburgh showed significant increases in the number of words identified correctly in manipulated speech signals over unaltered speech. The results of the LISTA evaluation are expected to be published later this year.
In some cases, the improvement in understanding is equivalent to turning the speech volume down by more than 5 decibels, which is similar to the difference in strength between car and truck noise, while still being able to hear what is said just as clearly.
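For context, the standard decibel conversions show what a 5 dB reduction amounts to: roughly a threefold cut in signal power, or nearly halving the amplitude.

```python
def db_to_power_ratio(db):
    """Power ratio corresponding to a level difference in decibels."""
    return 10 ** (db / 10)

def db_to_amplitude_ratio(db):
    """Amplitude (sound pressure) ratio for a level difference in decibels."""
    return 10 ** (db / 20)

# A 5 dB reduction cuts signal power by about a factor of 3.2,
# and amplitude by about a factor of 1.8.
print(round(db_to_power_ratio(5), 2), round(db_to_amplitude_ratio(5), 2))
```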
“This enables communication in conditions where speech normally would be impossible to understand,” says Henter.
The LISTA project is funded by the European Union’s Future and Emerging Technology framework programme, and involves scientists from Spain, Greece, Sweden and the UK. The techniques developed within the project involve both natural and synthetic speech in different types of noise. In addition to public address systems, the project could benefit a wide range of devices that produce speech output – such as mobile phones, radios and in-car navigation systems.

Filed under LISTA project speech signal noisy environment human hearing speech neuroscience science

37 notes

Symptoms of Prader-Willi syndrome associated with interference in circadian, metabolic genes

Researchers with the UC Davis MIND Institute and Agilent Laboratories have found that Prader-Willi syndrome — a genetic disorder best known for causing an insatiable appetite that can lead to morbid obesity — is associated with the loss of non-coding RNAs, resulting in the dysregulation of circadian and metabolic genes, accelerated energy expenditure and metabolic differences during sleep.

The research was led by Janine LaSalle, a professor in the UC Davis Department of Medical Microbiology and Immunology who is affiliated with the MIND Institute. It is published online in Human Molecular Genetics.

“Prader-Willi syndrome children do not sleep as well at night and have daytime sleepiness,” LaSalle said. “Parents have to lock up their pantries because the kids are rummaging for food in the middle of the night, even breaking into their neighbors’ houses to eat.”

The study found that these behaviors are rooted in the loss of a long non-coding RNA that functions to balance energy expenditure in the brain during sleep. The finding could have a profound effect on how clinicians treat children with Prader-Willi, as well as point the way to new, innovative therapies, LaSalle said.

The leading cause of morbid obesity among children in the United States, Prader-Willi involves a complex, and sometimes contradictory, array of symptoms. Shortly after birth, children with Prader-Willi experience failure to thrive. Yet after they begin to feed themselves, they have difficulty sleeping and insatiable appetites that lead to obesity if their diets are not carefully monitored.

The current study was conducted in a mouse model of Prader-Willi syndrome. It found that mice engineered with the loss of a long non-coding RNA showed altered energy use and metabolic differences during sleep.

Prader-Willi has been traced to a specific region on chromosome 15 (SNORD116), which produces RNAs that regulate gene expression, rather than coding for proteins. When functioning normally, SNORD116 produces small nucleolar (sno) RNAs and a long non-coding RNA (116HG), as well as a third non-coding RNA implicated in a related disorder, Angelman syndrome. The 116HG long non-coding RNA forms a cloud inside neuronal nuclei that associates with proteins and genes regulating diurnal metabolism in the brain, LaSalle said.

“We thought the cloud would be activating transcription, but in fact it was doing the opposite,” she said. “Most of the genes were dampened by the cloud. This long non-coding RNA was acting as a decoy, pulling the active transcription factors away from genes and keeping them from being expressed.”

As a result, losing the snoRNAs and 116HG causes a chain reaction: eliminating the RNA cloud allows circadian and metabolic genes to get turned on during sleep periods, when they should be dampened down. In normal mice, the RNA cloud grows during sleep periods (daytime for nocturnal mice), turning down genes associated with energy use, and recedes during waking periods, allowing these genes to be expressed. Mice without the 116HG gene lacked the benefit of this neuronal cloud and expended more energy during sleep.

The researchers said that the work provides a clearer picture of why children with Prader-Willi syndrome can’t sleep or feel satiated and may change therapeutic approaches. For example, many such children have been treated with growth hormone because of short stature, but this may actually aggravate other aspects of the disease.

“People had thought the kids weren’t sleeping at night because of the sleep apnea caused by obesity,” said LaSalle. “What this study shows is that the diurnal metabolism is central to the disorder, and that the obesity may be as a result of that. If you can work with that, you could improve therapies, for example figuring out the best times to administer medications.”

(Source: ucdmc.ucdavis.edu)

Filed under circadian rhythms metabolism obesity Prader-Willi syndrome genetics neuroscience science

134 notes

Past Brain Activation Revealed in Scans
Weizmann Institute scientists discover that spontaneously emerging brain activity patterns preserve traces of previous cognitive activity
What if experts could dig into the brain, like archaeologists, and uncover the history of past experiences? This ability might reveal what makes each of us a unique individual, and it could enable the objective diagnosis of a wide range of neuropsychological diseases. New research at the Weizmann Institute hints that such a scenario is within the realm of possibility: It shows that spontaneous waves of neuronal activity in the brain bear the imprints of earlier events for at least 24 hours after the experience has taken place.
The new research stems from earlier findings in the lab of Prof. Rafi Malach of the Institute’s Neurobiology Department and others that the brain never rests, even when its owner is resting. When a person is resting with closed eyes – that is, no visual stimulus is entering the brain – the normal bursts of nerve cell activity associated with incoming information are replaced by ultra-slow patterns of neuronal activity. Such spontaneous or “resting” waves travel in a highly organized and reproducible manner through the brain’s outer layer – the cortex – and the patterns they create are complex, yet periodic and symmetrical.
Like hieroglyphics, it seemed that these patterns might have some meaning, and research student Tal Harmelech, under the guidance of Malach and Dr. Son Preminger, set out to uncover their significance. Their idea was that the patterns of resting brain waves may constitute “archives” for earlier experiences. As we add new experiences, the activation of our brain’s networks leads to long-term changes in the links between brain cells, a capacity referred to as plasticity. As our experiences become embedded in these connections, they create “expectations” that come into play before we perform any type of mental task, enabling us to anticipate the result. The researchers hypothesized that information about earlier experiences would thus be incorporated into the links between networks of nerve cells in the cortex, and these would show up in the brain’s spontaneously emerging wave patterns.
In the experiment, the researchers had volunteers undertake a training exercise that would strongly activate a well-defined network of nerve cells in the frontal lobes. While undergoing scans of their brain activity in the Institute’s functional magnetic resonance imaging (fMRI) scanner, the subjects were asked to imagine a situation in which they had to make rapid decisions. The subjects received auditory feedback in real time, based on the information obtained directly from their frontal lobe, which indicated the level of neuronal activity in the trained network. This “neurofeedback” strategy proved highly successful in activating the frontal network – a part of the brain that is notoriously difficult to activate under controlled conditions.
To test whether the connections created in the brain during this exercise would leave their traces in the patterns formed by the resting brain waves, the researchers performed fMRI scans on the resting subjects before the exercise, immediately afterward, and 24 hours later. Their findings, which appeared in the Journal of Neuroscience, showed that the activation of the specific areas in the cortex did indeed remodel the resting brain wave patterns. Surprisingly, the new patterns not only remained the next day, they were significantly strengthened. These observations fit in with the classic learning principles proposed by Donald Hebb in the mid-20th century, in which the co-activation of two linked nerve cells leads to long-term strengthening of their link, while activity that is not coordinated weakens this link. The fMRI images of the resting brain waves showed that brain areas that were activated together during the training sessions exhibited an increase in their functional link a day after the training, while those areas that were deactivated by the training showed a weakened functional connectivity.
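The Hebbian principle invoked above can be written as a simple update rule. This is a minimal sketch of the general idea only, not the study's model; the learning rate and decay values are arbitrary illustrative assumptions.

```python
def hebbian_update(w, pre, post, lr=0.1, decay=0.05):
    """One update step: strengthen the link w in proportion to correlated
    pre/post activity, with a decay term that weakens uncoordinated links."""
    return w + lr * pre * post - decay * w

w = 0.5
for _ in range(20):                      # co-activation: both units fire together
    w = hebbian_update(w, pre=1.0, post=1.0)
w_coactive = w                           # the link has strengthened
for _ in range(20):                      # uncoordinated: the post unit stays silent
    w = hebbian_update(w, pre=1.0, post=0.0)
print(w_coactive > 0.5, w < w_coactive)  # strengthened by co-activation, then weakened
```

The two phases mirror the fMRI observation: jointly activated areas increase their functional link, while uncoordinated ones weaken.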
This research suggests a number of future possibilities for exploring the brain. For example, spontaneously emerging brain patterns could be used as a “mapping tool” for unearthing cognitive events from an individual’s recent past. Or, on a wider scale, each person’s unique spontaneously emerging activity patterns might eventually reveal a sort of personal profile – highlighting each individual’s abilities, shortcomings, biases, learning skills, etc. “Today, we are discovering more and more of the common principles of brain activity, but we have not been able to account for the differences between individuals,” says Malach. “In the future, spontaneous brain patterns could be the key to obtaining unbiased individual profiles.” Such profiles could be especially useful in diagnosing or learning the brain pathologies associated with a wide array of cognitive disabilities.

Filed under brain mapping brain activity cognitive function Hebbian learning neuroimaging plasticity neuroscience science

136 notes

Protein Linked to Cognitive Decline in Alzheimer’s Identified
Researchers at Columbia University Medical Center (CUMC) have demonstrated that a protein called caspase-2 is a key regulator of a signaling pathway that leads to cognitive decline in Alzheimer’s disease. The findings, made in a mouse model of Alzheimer’s, suggest that inhibiting this protein could prevent the neuronal damage and subsequent cognitive decline associated with the disease. The study was published this month in the online journal Nature Communications.
One of the earliest events in Alzheimer’s is disruption of the brain’s synapses (the small gaps across which nerve impulses are passed), which can lead to neuronal death. Although what drives this process has not been clear, studies have indicated that caspase-2 might be involved, according to senior author Michael Shelanski, MD, PhD, the Delafield Professor of Pathology & Cell Biology, chair of the Department of Pathology and Cell Biology, and co-director of the Taub Institute for Research on Alzheimer’s Disease and the Aging Brain at CUMC.
Several years ago, in tissue culture studies of mouse neurons, Dr. Shelanski found that caspase-2 plays a critical role in the death of neurons in the presence of amyloid beta, the protein that accumulates in the neurons of people with Alzheimer’s. Other researchers have shown that caspase-2 also contributes to the maintenance of normal synaptic functions.
Dr. Shelanski and his team hypothesized that aberrant activation of caspase-2 may cause synaptic changes in Alzheimer’s disease. To test this hypothesis, the researchers crossed J20 transgenic mice (a common mouse model of Alzheimer’s) with caspase-2 null mice (mice that lack caspase-2). They compared the animals’ ability to negotiate a radial-arm water maze, a standard test of cognitive ability, with that of regular J20 mice and of normal mice at 4, 9, and 14 months of age.
The results for the three groups of mice were similar at the first two intervals. At 14 months, however, the J20/caspase-2 null mice did significantly better in the water maze test than the J20 mice and similarly to the normal mice. “We showed that removing caspase-2 from J20 mice prevented memory impairment — without significant changes in the level of soluble amyloid beta,” said co-lead author Roger Lefort, PhD, associate research scientist at CUMC.
Analysis of the neurons showed that the J20/caspase-2 null mice had a higher density of dendritic spines than the J20 mice. The more spines a neuron has, the more impulses it can transmit.
“The J20/caspase-2 null mice showed the same dendritic spine density and morphology as the normal mice—as opposed to the deficits in the J20 mice,” said co-lead author Julio Pozueta, PhD. “This strongly suggests that caspase-2 is a critical regulator in the memory decline associated with beta-amyloid in Alzheimer’s disease.”
The researchers further validated the results in studies of rat neurons in tissue culture.
Finally, the researchers found that caspase-2 interacts with RhoA, a critical regulator of the morphology (form and structure) of dendritic spines. “It appears that in normal neurons, caspase-2 and RhoA form an inactive complex outside the dendritic spines,” said Dr. Lefort. “When the complex is exposed to amyloid beta, it breaks apart, activating the two components.” Once activated, caspase-2 and RhoA enter the dendritic spines and contribute to their demise, possibly by interacting with a third molecule, the enzyme ROCK-II.
“This raises the possibility that if you can inhibit one or all of these molecules, especially early in the course of Alzheimer’s, you might be able to protect neurons and slow down the cognitive effects of the disease,” said Dr. Lefort.

Protein Linked to Cognitive Decline in Alzheimer’s Identified

Researchers at Columbia University Medical Center (CUMC) have demonstrated that a protein called caspase-2 is a key regulator of a signaling pathway that leads to cognitive decline in Alzheimer’s disease. The findings, made in a mouse model of Alzheimer’s, suggest that inhibiting this protein could prevent the neuronal damage and subsequent cognitive decline associated with the disease. The study was published this month in the online journal Nature Communications.

One of the earliest events in Alzheimer’s is disruption of the brain’s synapses (the small gaps across which nerve impulses are passed), which can lead to neuronal death. Although what drives this process has not been clear, studies have indicated that caspace-2 might be involved, according to senior author Michael Shelanski, MD, PhD, the Delafield Professor of Pathology & Cell Biology, chair of the Department of Pathology and Cell Biology, and co-director of the Taub Institute for Research on Alzheimer’s Disease and the Aging Brain at CUMC.

Several years ago, in tissue culture studies of mouse neurons, Dr. Shelanski found that caspase-2 plays a critical role in the death of neurons in the presence of amyloid beta, the protein that accumulates in the neurons of people with Alzheimer’s. Other researchers have shown that caspase-2 also contributes to the maintenance of normal synaptic functions.

Dr. Shelanski and his team hypothesized that aberrant activation of caspase-2 may cause synaptic changes in Alzheimer’s disease. To test this hypothesis, the researchers crossed J20 transgenic mice (a common mouse model of Alzheimer’s) with caspase-2 null mice (mice that lack caspase-2). They compared the animals’ ability to negotiate a radial-arm water maze, a standard test of cognitive ability, with that of regular J20 mice and of normal mice at 4, 9, and 14 months of age.

The results for the three groups of mice were similar at the first two intervals. At 14 months, however, the J20/caspase-2 null mice did significantly better in the water maze test than the J20 mice and similarly to the normal mice. “We showed that removing caspase-2 from J20 mice prevented memory impairment — without significant changes in the level of soluble amyloid beta,” said co-lead author Roger Lefort, PhD, associate research scientist at CUMC.
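The group comparison reported here can be caricatured in a few lines of code. All numbers below are invented for illustration; only the rescue pattern, the knockout group performing like normal mice rather than like J20 mice, mirrors the finding.

```python
# Toy sketch of the 14-month water-maze comparison: hypothetical escape
# times (seconds) per mouse. All numbers are invented, not from the study.
from statistics import mean

latencies = {
    "normal":             [12, 15, 11, 14, 13],
    "J20":                [34, 41, 38, 36, 40],  # impaired memory
    "J20/caspase-2 null": [13, 16, 12, 15, 14],  # performs like normal mice
}

means = {group: mean(times) for group, times in latencies.items()}

# "Rescue" here means the knockout group is closer to normal than to J20.
rescued = abs(means["J20/caspase-2 null"] - means["normal"]) < \
          abs(means["J20"] - means["normal"])
print(means, rescued)
```

With these illustrative values the J20 mice average roughly three times the escape latency of the other two groups, which is the shape of the result described above.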

Analysis of the neurons showed that the J20/caspase-2 null mice had a higher density of dendritic spines than the J20 mice. The more spines a neuron has, the more impulses it can transmit.

“The J20/caspase-2 null mice showed the same dendritic spine density and morphology as the normal mice—as opposed to the deficits in the J20 mice,” said co-lead author Julio Pozueta, PhD. “This strongly suggests that caspase-2 is a critical regulator in the memory decline associated with beta-amyloid in Alzheimer’s disease.”

The researchers further validated the results in studies of rat neurons in tissue culture.

Finally, the researchers found that caspase-2 interacts with RhoA, a critical regulator of the morphology (form and structure) of dendritic spines. “It appears that in normal neurons, caspase-2 and RhoA form an inactive complex outside the dendritic spines,” said Dr. Lefort. “When the complex is exposed to amyloid beta, it breaks apart, activating the two components.” Once activated, caspase-2 and RhoA enter the dendritic spines and contribute to their demise, possibly by interacting with a third molecule, the enzyme ROCK-II.

“This raises the possibility that if you can inhibit one or all of these molecules, especially early in the course of Alzheimer’s, you might be able to protect neurons and slow down the cognitive effects of the disease,” said Dr. Lefort.

Filed under alzheimer's disease beta amyloid dementia cognitive decline neurotransmission neuroscience science

52 notes

'Singing' rats show hope for older humans with age-related voice problems
A new study shows that the vocal training of older rats reduces some of the voice problems related to their aging, such as the loss of vocal intensity that accompanies changes in the muscles of the larynx. This is an animal model of a vocal pathology that many humans face as they age. The researchers hope that in the future, voice therapy in aging humans will help improve their quality of life.
The research appears in The Journals of Gerontology.
University of Illinois speech and hearing science professor Aaron Johnson, who led the new study along with his colleagues at the University of Wisconsin, said that aging can cause the muscles of the larynx, the organ that contains the vocal folds, to atrophy. This condition, called presbyphonia, may be treatable with vocal training, he said.
Johnson said in a healthy, young larynx the vocal folds completely close and open during vibration. This creates little puffs of air we hear as sound. In people with presbyphonia, however, the atrophied vocal folds do not close properly, resulting in a gap during vocal fold vibration.
Degradation of the neuromuscular junction, or the interface between the nerve that signals the vocal muscle to work and the muscle itself, also contributes to the symptoms of presbyphonia, Johnson said. In a healthy human, when the signal reaches the neuromuscular junction, it triggers a release of chemicals that signal the muscle to contract. But an age-related decline in the neuromuscular junction can cause weakness and fatigue in the muscle, and may result in a breathy or weak voice and in fatigue from the extra effort needed to communicate.
Surgery and injections may help correct the gap between the vocal folds seen in presbyphonia, but these invasive procedures are often not viable in the elderly population, Johnson said.
His previous experience working with the elderly as a former classical singer and voice teacher propelled Johnson to “become interested in what we can do as we get older to keep our voices healthy and strong.”
“We know exercise strengthens the limb musculature, but we wanted to know if vocal exercise can strengthen the muscles of the voice,” Johnson said.
To find out if vocal training could have an effect on the strength and physiology of the vocal muscles, Johnson turned to a rat model. Rats make ultrasonic vocalizations that are above the range of human hearing, but special recording equipment and a computer that lowers the frequency of the rat calls allow humans to perceive them. (They sound a bit like bird calls.)
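One common way to shift ultrasonic calls into the audible range is simply to replay the recording more slowly, which divides every frequency by the slowdown factor. This is a sketch of that idea, not a description of the study's actual equipment, and all parameters are illustrative.

```python
# A 22 kHz "call" sampled at 250 kHz, then replayed ten times slower:
# every frequency component drops by the same factor of ten.
import numpy as np

fs = 250_000          # original sampling rate (Hz), fast enough for rat calls
f_call = 22_000       # a typical rat ultrasonic call frequency (Hz)
n = 25_000            # 0.1 s of signal
t = np.arange(n) / fs
call = np.sin(2 * np.pi * f_call * t)

slowdown = 10
fs_playback = fs // slowdown          # replay the same samples at 25 kHz

# Dominant frequency as heard at the slower playback rate:
spectrum = np.abs(np.fft.rfft(call))
freqs = np.fft.rfftfreq(len(call), d=1 / fs_playback)
heard = freqs[np.argmax(spectrum)]
print(heard)  # ~2200 Hz, well within human hearing
```

Slowed playback also stretches the calls in time; systems that need to preserve timing use pitch-shifting or heterodyne detectors instead.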
Because rats and humans utilize similar neuromuscular mechanisms to vocalize, the rats make ideal subjects for the study of human vocal characteristics, Johnson said.
Both the treatment and control groups contained old and young male rats. In the treatment group, a female rat was placed into a cage with a male rat. When the male expressed interest in her, the female was removed from the cage, causing the male rat to vocalize. The male was rewarded with food for these vocalizations, and after eight weeks of this operant conditioning in which rewards were only given for certain responses, all of the rats in the treatment group had been trained to increase their number of vocalizations during a training session.
At the end of the eight-week period, the researchers measured the intensity of the rats’ vocalizations and analyzed the animals’ larynges to see whether the training had any effect on the condition of their neuromuscular junctions. 
The researchers found the trained old and young rats had similar average vocal intensities, but the untrained older rats had lower average intensities than both the trained rats and the young rats that had not been trained. They also found several age-related differences within the groups’ neuromuscular mechanisms.
“Other research has found that in the elderly, there is a dispersion, or breaking apart, of the neuromuscular junction at the side that is on the muscle itself,” Johnson said. “We found that in the older rats that received training, it wasn’t as dispersed.”
These “singing rats” are the “first evidence that vocal use and vocal training can change the neuromuscular system of the larynx,” Johnson said. 
“While this isn’t a human study, I think this tells us that we can train ourselves to use our voices and not only reduce the effects of age on the muscles of our voices, but actually improve voices that have degraded,” Johnson said.

Filed under aging neuromuscular junction presbyphonia vocal intensity voice neuroscience science

117 notes

Brain Cancer: Hunger for Amino Acids Makes It More Aggressive
An enzyme that facilitates the breakdown of specific amino acids makes brain cancers particularly aggressive. Scientists from the German Cancer Research Center (DKFZ) discovered this in an attempt to find new targets for therapies against this dangerous disease. They have reported their findings in the journal “Nature Medicine”. 
To fuel phases of fast and aggressive growth, tumors need higher-than-normal amounts of energy and the molecular building blocks needed to build new cellular components. Cancer cells therefore consume a lot of sugar (glucose). A number of tumors are also able to catabolize the amino acid glutamine, an important building block of proteins. A key enzyme in amino acid decomposition is isocitrate dehydrogenase (IDH). Several years ago, scientists discovered mutations in the gene coding for IDH in numerous types of brain cancer. Very malignant brain tumors called primary glioblastomas carry an intact IDH gene, whereas those that grow more slowly usually have a defective form.
“The study of the IDH gene currently is one of the most important diagnostic criteria for differentiating glioblastomas from other brain cancers that grow more slowly,” says Dr. Bernhard Radlwimmer from the German Cancer Research Center (Deutsches Krebsforschungszentrum, DKFZ). “We wanted to find out what spurs the aggressive growth of glioblastomas.” In collaboration with scientists from other institutes including Heidelberg University Hospital, Dr. Martje Tönjes and Dr. Sebastian Barbus from Radlwimmer’s team compared gene activity profiles from several hundred brain tumors. They aimed to find out whether tumors with mutated or intact IDH show further specific genetic characteristics that might help explain the aggressiveness of the disease.
The researchers found a significant difference between the two groups in the highly increased activity of the gene for the BCAT1 enzyme, which in normal brain tissue is responsible for breaking down so-called branched-chain amino acids. However, Radlwimmer’s team discovered, only those tumor cells whose IDH gene is not mutated produce BCAT1. “This is not surprising, because as IDH breaks down amino acids, it produces ketoglutarate – a molecule which BCAT1 needs. This explains why BCAT1 is produced only in tumor cells carrying intact IDH. The two enzymes seem to form a kind of functional unit in amino acid catabolism,” says Bernhard Radlwimmer.
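A screen like this, comparing per-gene activity between two tumor groups, can be sketched in a few lines. The gene names are real but the expression values are invented; only the direction of the BCAT1 difference mirrors the finding.

```python
# Toy differential-expression screen: mean expression per gene in
# IDH-wild-type vs. IDH-mutant tumors, ranked by difference.
# All values are invented for illustration.
from statistics import mean

expression = {
    #          IDH wild-type tumors    IDH-mutant tumors
    "BCAT1": ([8.1, 7.9, 8.4, 8.0],  [2.1, 1.8, 2.3, 2.0]),
    "GAPDH": ([5.0, 5.1, 4.9, 5.0],  [5.1, 5.0, 4.9, 5.0]),
    "EGFR":  ([6.2, 6.0, 6.5, 6.1],  [5.8, 6.1, 5.9, 6.0]),
}

diffs = {gene: mean(wt) - mean(mut) for gene, (wt, mut) in expression.items()}
top = max(diffs, key=diffs.get)
print(top, round(diffs[top], 2))  # BCAT1 stands out in this toy data
```

A real analysis would of course use hundreds of tumors, thousands of genes, and proper statistics with multiple-testing correction; this only shows the shape of the comparison.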
Glioblastomas are particularly dreaded because they aggressively invade the healthy brain tissue that surrounds them. When the researchers used a pharmacological substance to block BCAT1’s effects, the tumor cells lost their invasive capacity. In addition, the cells released less of the glutamate neurotransmitter. High glutamate release is responsible for severe neurological symptoms such as epileptic seizures, which are frequently associated with the disease. When transferred to mice, glioblastoma cells in which the BCAT1 gene had been blocked no longer grew into tumors.
“Altogether, we can see that overexpression of BCAT1 contributes to the aggressiveness of glioblastoma cells,” Radlwimmer says. The study suggests that the two enzymes, BCAT1 and IDH, cooperate in the decomposition of branched-chain amino acids. These protein building blocks appear to act as a “food source” that increases the cancer cells’ aggressiveness. Branched-chain amino acids also play a significant role in metabolic diseases such as diabetes. This is the first time that scientists have been able to show the role of these amino acids in the growth of malignant tumors.
“The good news,” sums up Radlwimmer, “is that we have found another target for therapies in BCAT1. In collaboration with Bayer Healthcare, we have already started searching for agents that might be specifically directed against this enzyme.” The researchers also plan to investigate whether BCAT1 expression may serve as an additional marker to diagnose the malignancy of brain cancer.

Filed under brain cancer amino acids cancer cells glioblastoma brain tumors genetics neuroscience science

96 notes

Genes Involved in Birth Defects May Also Lead to Mental Illness
Gene mutations that lead to major birth defects may also cause subtle disruptions in the brain that contribute to psychiatric disorders such as schizophrenia, autism, and bipolar disorder, according to new research by UC San Francisco scientists.
Over the past several years, researchers in the laboratory of psychiatrist Benjamin Cheyette, MD, PhD, have shown that mutations in a gene called Dact1 cause cell signaling networks to go awry during embryonic development. Researchers observed that mice with Dact1 mutations were born with a range of severe malformations, including some reminiscent of spina bifida in humans.
This new study was designed to explore whether Dact1 mutations exert more nuanced effects in the brain that may lead to mental illness. In doing so, Cheyette, John Rubenstein, MD, PhD, and colleagues in UCSF’s Nina Ireland Laboratory of Developmental Neurobiology used a genetic technique in adult mice to selectively delete the Dact1 protein only in interneurons, a group of brain cells that regulates activity in the cerebral cortex, including cognitive and sensory processes. Poor function of interneurons has been implicated in a range of psychiatric conditions.
As reported in the June 24 online issue of PLOS ONE, researchers found that the genetically altered interneurons appeared relatively normal and had managed to find their proper position in the brain’s circuitry during development. But the cells had significantly fewer synapses, the sites where communication with neighboring neurons takes place. In additional observations not included in the new paper, the team also noted that the cells’ dendrites – fine extensions that normally form bushy arbors studded with synapses – were poorly developed and sparsely branched.
“When you delete this gene function after initial, early development – just eliminating it in neurons after they’ve formed – they migrate to the right place and their numbers are correct, but their morphology is a little off,” Cheyette said. “And that’s very much in line with the kinds of pathology that people have been able to identify in psychiatric illness.
“Neurological illnesses tend to be focal, with lesions that you can identify or pathology you can see on an imaging study,” Cheyette explained. “Psychiatric illnesses? Not so much. The differences are really subtle and hard to see.”
Key Gene’s Role in Development of Human Nervous System
The Dact1 protein is part of a fundamental biological system known as the Wnt (pronounced “wint”) signaling pathway. Interactions among proteins in the Wnt pathway orchestrate many processes essential to life in animals as diverse as fruit flies, mice and humans, including the proper development of the immensely complex human nervous system from a single fertilized egg cell.
One way the Wnt pathway manages this task is by maintaining the “polarity” of cells during development, said Cheyette, “a process of sequestering, increasing the concentration of one set of proteins on one side of the cell and a different set of proteins on the other side of the cell.” Polarity is particularly important as precursor cells transform into nerve cells, Cheyette said, because neurons are “the most polarized cells in the body,” with specialized input and output zones that must wind up in the proper spots if the cells are to function normally.
Cheyette said his group is now conducting behavioral experiments with the mice analyzed in the new PLOS ONE paper and with genetically related mouse lines to test whether these mice have behavioral abnormalities in sociability, sensory perception, anxiety or motivation that resemble symptoms in major psychiatric disorders.
He also hopes to collaborate with UCSF colleagues on follow-up experiments to determine whether the activity of neurons lacking Dact1 is impaired in addition to the structural flaws identified in the new study and prior published work from his lab.
Meanwhile, as-yet-unpublished findings from human genetics research conducted by Cheyette’s group suggest that individuals with autism are significantly more likely than healthy comparison subjects to carry mutations in a Wnt pathway gene called WNT1.
“Just because a gene plays an important role in the embryo doesn’t mean it isn’t also important in the brain later, and might be involved in psychiatric pathology,” said Cheyette. “When these genes are mutated, someone may look fine, develop fine and have no obvious medical problems at birth, but they may also develop autism in childhood or have a psychotic break in adulthood and develop schizophrenia.”

Filed under autism genetic mutations mental health schizophrenia neural circuitry neurons neuroscience science

147 notes

'Out-of-body' virtual experience could help social anxiety

New virtual imaging technology could be used as part of therapy to help people overcome social anxiety, according to new research from the University of East Anglia (UEA).

Research published today investigated for the first time whether people with social anxiety could benefit from seeing themselves interacting in social situations via video capture.

The experiment gave participants the chance to experience social interaction in the safety of a virtual environment by seeing their own life-size image projected into specially scripted real-time video scenes.

UEA researchers, led by Dr Lina Gega from UEA’s Norwich Medical School and MHCO’s Northumberland Talking Therapies, worked with Xenodu Virtual Environments to create more than 100 different social scenarios – such as using public transport, buying a drink at a bar, socialising at a party, shopping, and talking to a stranger in an art gallery.

The researchers tested whether this sort of experience could become a valuable part of Cognitive Behavioural Therapy (CBT) by including an hour-long session midway through a 12-week CBT course.

Dr Gega said: “People with social anxiety are afraid that they will draw attention to themselves and be negatively judged by others in social situations. Many will either avoid public places and social gatherings altogether, or use safety behaviours to cope – such as not making eye contact and being guarded or hyper-vigilant towards others.

“Paradoxically, this sort of behaviour draws attention to people with social anxiety and feeds into their beliefs that they don’t fit in.

“We wanted to see whether practising social situations in a virtual environment could help.”

Paul Strickland from Xenodu, the company behind the virtual environment system, said: “Our system uses video capture to project a user’s life-size image on screen so that they can watch themselves interacting with custom-scripted and digitally edited video clips.

“It isn’t a head-mounted display – which anxious people may find uncomfortable,” he added. “Instead, the user observes from an out-of-body perspective. They can then simultaneously view themselves and interact with the characters of the film.”
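The article doesn't specify how the user's captured image is merged with the scripted video, but a standard technique for this kind of compositing is chroma keying against a backdrop colour. The sketch below is an assumption about the approach, with tiny arrays standing in for video frames and an invented key colour and tolerance.

```python
# Minimal chroma-key compositing sketch: pixels close to the backdrop
# colour in the captured frame are replaced by the scene behind them.
import numpy as np

def composite(user_frame, scene, key=(0, 255, 0), tol=60.0):
    """Merge a captured frame into a scene by keying out the backdrop."""
    dist = np.linalg.norm(user_frame.astype(float) - np.array(key, float),
                          axis=-1)
    mask = dist < tol                 # True where the backdrop shows through
    out = user_frame.copy()
    out[mask] = scene[mask]
    return out

# 2x2 "frames": left column is green backdrop, right column is the user.
user = np.array([[[0, 255, 0], [200, 50, 50]],
                 [[5, 250, 0], [10, 10, 10]]], dtype=np.uint8)
scene = np.full((2, 2, 3), (30, 60, 90), dtype=np.uint8)

merged = composite(user, scene)
```

A real system would run this per video frame at full resolution, typically with edge smoothing around the keyed mask.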

Dr Gega’s project focused on six young men recovering from psychosis who also have debilitating social anxiety. The participants engaged with a range of scenarios, some of which were designed to feature rude and hostile people. The virtual environments encouraged participants to practice small-talk, maintain eye contact, test beliefs that they wouldn’t know what to say, and resist safety behaviour such as looking at the floor or being hyper-vigilant.

The main benefit of using these virtual environments in therapy was that they helped participants notice and change anxious behaviours in a safe, controlled setting that could be rehearsed over and over again. Participants were found to drop safety behaviours and take greater social risks. And while the staged scenarios were realistic to an extent, their ‘fake’ feeling in itself proved to be a virtue.

“It helped the participants question their interpretation of social cues,” said Dr Gega. “For example, if they thought that one of the characters was looking at them ‘funny’ they could immediately see that there must be an alternative explanation because the scenarios were artificial.

“Another useful aspect of the system is that it can be tailored to address specific fears in social situations - for example a fear of performance, intimacy, or crowds,” she added.

“Two of the patients said that the system felt ‘weird and surreal’, so the element of having an out-of-body experience is something to study further in future – particularly because psychosis itself is defined by a distorted perception of reality.

“This research explored the feasibility and potential added value of using virtual environments as part of CBT. The next stage would be to carry out a randomised, controlled comparison of CBT with and without the virtual environment system to test whether using the system as a therapy tool leads to greater or quicker symptom improvement.”

Mr Strickland added: “I hope our technology can help make a difference to the lives of people experiencing social anxiety and other specific anxiety conditions for which controlled exposure to feared situations is part of therapy. It is particularly versatile because it doesn’t need technical expertise to set up and use. And the library of scenarios can be built on to capture different types of exposure environments needed in day-to-day clinical practice.”

‘Virtual Environments Using Video Capture for Social Phobia with Psychosis’ is published in the journal Cyberpsychology, Behavior, and Social Networking.

Filed under social anxiety virtual environment CBT technology psychology neuroscience science


Pleasure Response from Chocolate: You Can See it in the Eyes
The brain’s pleasure response to tasting food can be measured through the eyes using a common, low-cost ophthalmological tool, according to a study just published in the journal Obesity. If validated, this method could be useful for research and clinical applications in food addiction and obesity prevention.
Dr. Jennifer Nasser, an associate professor in the department of Nutrition Sciences in Drexel University’s College of Nursing and Health Professions, led the study testing the use of electroretinography (ERG) to indicate increases in the neurotransmitter dopamine in the retina.
Dopamine is associated with a variety of pleasure-related effects in the brain, including the expectation of reward. In the eye’s retina, dopamine is released when the optic nerve activates in response to light exposure.
Nasser and her colleagues found that electrical signals in the retina spiked high in response to a flash of light when a food stimulus (a small piece of chocolate brownie) was placed in participants’ mouths. The increase was as great as that seen when participants had received the stimulant drug methylphenidate to induce a strong dopamine response. These responses in the presence of food and drug stimuli were each significantly greater than the response to light when participants ingested a control substance, water.
“What makes this so exciting is that the eye’s dopamine system was considered separate from the rest of the brain’s dopamine system,” Nasser said. “So most people – and indeed many retinography experts told me this – would say that tasting a food that stimulates the brain’s dopamine system wouldn’t have an effect on the eye’s dopamine system.”
This study was a small-scale demonstration of the concept, with only nine participants. Most participants were overweight but none had eating disorders. All fasted for four hours before testing with the food stimulus.
If this technique is validated through additional and larger studies, Nasser said she and other researchers can use ERG for studies of food addiction and food science.
“My research takes a pharmacology approach to the brain’s response to food,” Nasser said. “Food is both a nutrient delivery system and a pleasure delivery system, and a ‘side effect’ is excess calories. I want to maximize the pleasure and nutritional value of food but minimize the side effects. We need more user-friendly tools to do that.”
The low cost and ease of performing electroretinography make it an appealing method, according to Nasser. The Medicare reimbursement cost for clinical use of ERG is about $150 per session, and each session generates 200 scans in just two minutes. Procedures to measure dopamine responses directly from the brain are more expensive and invasive. For example, PET scanning costs about $2,000 per session and takes more than an hour to generate a scan.
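Taking the article’s figures at face value, the cost gap per scan can be made concrete with a quick back-of-envelope calculation (all numbers are the approximations quoted above, not official pricing):

```python
# Rough per-scan cost comparison using the figures quoted in the article.
erg_cost_per_session = 150    # USD, approximate Medicare reimbursement for clinical ERG
erg_scans_per_session = 200   # scans generated in about two minutes
pet_cost_per_session = 2000   # USD, approximate cost of a PET session
pet_scans_per_session = 1     # one scan, taking more than an hour

erg_cost_per_scan = erg_cost_per_session / erg_scans_per_session
pet_cost_per_scan = pet_cost_per_session / pet_scans_per_session

print(f"ERG: ${erg_cost_per_scan:.2f} per scan")   # $0.75 per scan
print(f"PET: ${pet_cost_per_scan:.2f} per scan")   # $2000.00 per scan
```

On these numbers, ERG works out to well under a dollar per scan, several orders of magnitude cheaper than a PET scan, which is a large part of its appeal as a screening tool.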
(Image: Scott Thornburg)

Filed under chocolate dopamine food addiction optic nerve electroretinography neuroscience science
