Posts tagged science

Zebrafish study paves the way for new treatments for genetic disorder
Scientists from the University of Sheffield have paved the way for new treatments for a common genetic disorder thanks to pioneering research on zebrafish – an animal capable of mending its own heart.
Charcot-Marie-Tooth disease (CMT) is the most common genetic disorder affecting the nervous system. More than 20,000 people in the UK suffer from CMT, which typically causes progressive weakness and long-term pain in the feet, leading to walking difficulties. There is currently no cure for CMT.
A research project conducted at the Sheffield Institute for Translational Neuroscience (SITraN) and the MRC Centre for Developmental and Biomedical Genetics (CDBG) by Dr Andrew Grierson and his team has revealed that zebrafish could hold the key to finding new therapeutic approaches to treat the condition.
Dr Grierson said: “We have studied zebrafish with a genetic defect that causes CMT in humans. The fish develop normally, but once they reach adulthood they start to develop difficulties swimming.
"By looking at the muscles of these fish we have discovered that the problem lies with the connections between motor neurons and muscle, which are known to be essential for walking in humans and also swimming in fish."
CMT represents a group of neurodegenerative disorders typically characterised by demyelination (CMT1), a process which damages the myelin sheaths that surround our neurons, or by distal axon degeneration (CMT2) of motor and sensory neurons. The distal axon is the nerve terminal where packages of neurotransmitter within neurons are docked.
The majority of CMT2 cases are caused by mutations in mitofusin 2 (MFN2), which is an essential gene encoding a protein responsible for fusion of the mitochondrial outer membrane. Mitochondria are known as the cellular power plants because they generate most of the supply of adenosine triphosphate (ATP), which is used as a source of chemical energy.
Dr Grierson said: “Previous work on this disorder using mammalian models such as mice has been problematic, because the mitofusin genes are essential for embryonic development. Using zebrafish we were able to develop a model with an adult onset, progressive phenotype with predominant symptoms of motor dysfunction similar to CMT2.
"Motor neurons are the largest cells in our bodies, and as such they are highly dependent on a cellular transport system to deliver molecules through the long nerve cell processes which connect the spinal cord to our muscles. We already know that defects in the cellular transport system occur early in the development of diseases such as Alzheimer’s disease, Motor Neuron Disease and spastic paraplegia. Using our zebrafish model we have found that similar defects in transport are also a key part of the disease process in CMT."
Dr Grierson and his team are now seeking funding to identify new treatments for CMT using the zebrafish model. Because of their size and unique biology, zebrafish are ideal to be used in drug screens for the identification of new therapies for untreatable human conditions.
An SDSU research team has discovered that autism in children affects not only social abilities, but also a broad range of sensory and motor skills.
A group of investigators from San Diego State University’s Brain Development Imaging Laboratory is shedding new light on the effects of autism on the brain.
The team has identified that connectivity between the thalamus, a deep brain structure crucial for sensory and motor functions, and the cerebral cortex, the brain’s outer layer, is impaired in children with autism spectrum disorders (ASD).
Led by Aarti Nair, a student in the SDSU/UCSD Joint Doctoral Program in Clinical Psychology, the study is the first of its kind, combining functional and anatomical magnetic resonance imaging (MRI) techniques with diffusion tensor imaging (DTI) to examine connections between the cerebral cortex and the thalamus.
Nair and Dr. Ralph-Axel Müller, an SDSU professor of psychology who was senior investigator of the study, examined more than 50 children, both with autism and without.
Brain communication
The thalamus is a crucial brain structure for many functions, such as vision, hearing, movement control and attention. In the children with autism, the pathways connecting the cerebral cortex and thalamus were found to be affected, indicating that these two parts of the brain do not communicate well with each other.
“This impaired connectivity suggests that autism is not simply a disorder of social and communicative abilities, but also affects a broad range of sensory and motor systems,” Müller said.
Disturbances in the development of both the structure and function of the thalamus may play a role in the emergence of social and communicative impairments, which are among the most prominent and distressing symptoms of autism.
While the findings reported in this study are novel, they are consistent with growing evidence on sensory and motor abnormalities in autism. They suggest that the diagnostic criteria for autism, which emphasize social and communicative impairment, may fail to consider the broad spectrum of problems children with autism experience.
The study was supported with funding from the National Institutes of Health and an Autism Speaks Dennis Weatherstone Predoctoral Fellowship. It was published in the June issue of the journal Brain.
Breastfeeding not only boosts children’s chances of climbing the social ladder, but it also reduces the chances of downwards mobility, suggests a large study published online in the Archives of Disease in Childhood.

The findings are based on changes in the social class of two groups of individuals born in 1958 (17,419 people) and in 1970 (16,771 people).
The researchers asked each of the children’s mums, when their child was five or seven years old, whether they had breastfed him/her.
They then compared people’s social class as children - based on the social class of their father when they were 10 or 11 - with their social class as adults, measured when they were 33 or 34.
Social class was categorised on a four-point scale ranging from unskilled/semi-skilled manual to professional/managerial.
The research also took account of a wide range of other potentially influential factors, derived from regular follow-ups every few years. These included children’s brain (cognitive) development and stress scores, which were assessed using validated tests at the ages of 10-11.
Significantly fewer children were breastfed in 1970 than in 1958. More than two-thirds (68%) of mothers breastfed their children in 1958, compared with just over one in three (36%) in 1970.
Social mobility also changed over time, with those born in 1970 more likely to be upwardly mobile, and less likely to be downwardly mobile, than those born in 1958.
Nonetheless, when background factors were accounted for, children who had been breastfed were consistently more likely to have climbed the social ladder than those who had not been breastfed. This was true of those born in both 1958 and 1970.
What’s more, the size of the “breastfeeding effect” was the same in both time periods. Breastfeeding increased the odds of upwards mobility by 24% and reduced the odds of downward mobility by around 20% for both groups.
Intellect and stress accounted for around a third (36%) of the total impact of breastfeeding: breastfeeding enhances brain development, which boosts intellect, which in turn increases upwards social mobility. Breastfed children also showed fewer signs of stress.
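The reported figures are odds ratios, which are easy to misread as percentage-point changes in probability. The minimal sketch below shows how an odds ratio maps onto a probability; the 30% baseline mobility rate is purely illustrative and is not a figure from the study.

```python
def apply_odds_ratio(p_baseline: float, odds_ratio: float) -> float:
    """Convert a baseline probability to odds, scale by the odds ratio,
    and convert the result back to a probability."""
    odds = p_baseline / (1.0 - p_baseline)
    new_odds = odds * odds_ratio
    return new_odds / (1.0 + new_odds)

# Hypothetical baseline: suppose 30% of non-breastfed children change
# social class (an assumption for illustration only).
p_up = apply_odds_ratio(0.30, 1.24)    # odds of upward mobility +24% -> ~0.347
p_down = apply_odds_ratio(0.30, 0.80)  # odds of downward mobility -20% -> ~0.255
print(round(p_up, 3), round(p_down, 3))
```

Note that a 24% increase in odds raises this hypothetical probability by only about five percentage points, which is why odds ratios and probabilities should not be conflated.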
The evidence suggests that breastfeeding confers a range of long-term health, developmental, and behavioural advantages to children, which persist into adulthood, say the authors.
They note that it is difficult to pinpoint which affords the greatest benefit to the child - the nutrients found in breast milk or the skin to skin contact and associated bonding during breastfeeding.
“Perhaps the combination of physical contact and the most appropriate nutrients required for growth and brain development is implicated in the better neurocognitive and adult outcomes of breastfed infants,” they suggest.
Hunger affects decision making and perception of risk
Hungry people are often difficult to deal with. A good meal can affect more than our mood; it can also influence our willingness to take risks. This phenomenon is apparent across a very diverse range of species in the animal kingdom. Experiments conducted on the fruit fly Drosophila by scientists at the Max Planck Institute of Neurobiology in Martinsried have shown that hunger not only modifies behaviour, but also changes pathways in the brain.
Animal behaviour is radically affected by the availability and amount of food. Studies show that the willingness of many animals to take risks increases or declines depending on whether the animal is hungry or full. For example, a predator only hunts more dangerous prey when it is close to starvation. This behaviour has also been documented in humans in recent years: one study showed that hungry subjects took significantly more financial risks than their sated colleagues.
The fruit fly Drosophila also changes its behaviour depending on its nutritional state. The animals usually perceive even low quantities of carbon dioxide as a sign of danger and opt to take flight. However, rotting fruit and plants – the flies’ main sources of food – also release carbon dioxide. Taking advantage of the fly as a powerful genetic model organism for circuit neuroscience, neurobiologists in Martinsried have now discovered how the brain deals with this constant conflict in deciding between a hazardous substance and a potential food source.
In various experiments, the scientists presented the flies with environments containing carbon dioxide or a mix of carbon dioxide and the smell of food. It emerged that hungry flies overcame their aversion to carbon dioxide significantly faster than fed flies – if there was a smell of food in the environment at the same time. Facing the prospect of food, hungry animals are therefore significantly more willing to take risks than sated flies. But how does the brain manage to decide between these options?
Avoiding carbon dioxide is an innate behaviour and should therefore be generated outside the mushroom body in the fly’s brain: previously, the nerve cells in the mushroom body were linked only with learning and behaviour patterns that are based on learned associations. However, when the scientists temporarily disabled these nerve cells, hungry flies no longer showed any reaction whatsoever to carbon dioxide. The behaviour of fed flies, on the other hand, remained the same: they avoided the carbon dioxide.
In further studies, the researchers identified a projection neuron which transports the carbon dioxide information to the mushroom body. This nerve cell is crucial in triggering a flight response in hungry, but not in fed animals. “In fed flies, nerve cells outside the mushroom body are enough for flies to flee from the carbon dioxide. In hungry animals, however, the nerve cells are in the mushroom body and the projection neuron, which carries the carbon dioxide information there, is essential for the flight response. If mushroom body or projection neuron activity is blocked, only hungry flies are no longer concerned about the carbon dioxide,” explains Ilona Grunwald-Kadow, who headed the study.
The results show that the innate flight response to carbon dioxide in fruit flies is controlled by two parallel neural circuits, depending on how satiated the animals are. “If the fly is hungry, it will no longer rely on the ‘direct line’ but will use brain centres to gauge internal and external signals and reach a balanced decision,” explains Grunwald-Kadow. “It is fascinating to see the extent to which metabolic processes and hunger affect the processing systems in the brain,” she adds.
Problem-solving governs how we process sensory stimuli
Various areas of the brain process our sensory experiences. How the areas of the cerebral cortex communicate with each other and process sensory information has long puzzled neuroscientists. Exploring the sense of touch in mice, brain researchers from the University of Zurich now demonstrate that the transmission of sensory information from one cortical area to connected areas depends on the specific task being solved and on goal-directed behavior. These findings can serve as a basis for an improved understanding of cognitive disorders.
In the mammalian brain, the cerebral cortex plays a crucial role in processing sensory inputs. The cortex can be subdivided into different areas, each handling distinct aspects of perception, decision-making or action. The somatosensory cortex, for instance, comprises the part of the cerebral cortex that primarily processes haptic sensations. The different areas of the cerebral cortex are interconnected and communicate with each other. A central, unanswered question of neuroscience is how exactly these brain areas communicate to process sensory stimuli and produce appropriate behavior. A team of researchers headed by Professor Fritjof Helmchen at the University of Zurich’s Brain Research Institute now provides an answer: the processing of sensory information depends on what you want to achieve. The brain researchers observed that nerve cells in the sensory cortex that connect to distinct brain areas are activated differentially depending on the task to be solved.
Goal-directed processing of sensory information
In their publication in Nature, the researchers studied how mice use their facial whiskers to explore their environment, much like we do in the dark with our hands and fingers. One mouse group was trained to distinguish coarse and fine sandpapers using their whiskers in order to obtain a reward. Another group had to work out the angle at which an object – a metal rod – was located relative to their snout. The neuroscientists measured the activity of neurons in the primary somatosensory cortex using a special microscopy technique. With simultaneous anatomical staining, they also identified which of these neurons sent their projections to the more remote secondary somatosensory area and the motor cortex, respectively.
The primary somatosensory neurons with projections to the secondary somatosensory cortex predominantly became active when the mice had to distinguish the surface texture of the sandpaper. Neurons with projections to the motor cortex, on the other hand, were more involved when mice needed to localize the metal rod. These different activity patterns were not evident when mice passively touched sandpaper or metal rods without having been set a task – in other words, when their actions were not motivated by a reward. Thus, the sensory stimuli alone were not sufficient to explain the different pattern of information transfer to the remote brain areas.
Impaired communication in the brain
According to Fritjof Helmchen, the activity in a cortical area can be transmitted to remote areas in a targeted fashion if we have to extract (‘filter’) specific information from the environment to solve a problem. In cognitive disorders such as Alzheimer’s disease, autism and schizophrenia, this communication between brain areas is often disrupted. “A better understanding of how these long-range, interconnected networks in the brain operate might help to develop therapies that re-establish this specific cortical communication,” says Helmchen. The aim would be to thereby improve the impaired cognitive abilities of patients.
Turn up the volume? A better way to broadcast over the noise
Traffic, aircraft, mobile devices and personal music equipment are not the only sources of noise pollution. Public address systems have become part of the escalating problem, which according to the World Health Organization, costs Europeans alone the equivalent of 654,000 years of healthy life annually.
But researchers at Stockholm’s KTH Royal Institute of Technology have developed a way to bring down the volume on loud public announcements while preserving their clarity in noisy environments.
“By manipulating speech before it is sent to the loudspeakers, we can enhance the speech signal and adapt it to the surrounding noise,” says Gustav Eje Henter, a PhD student in Communication Theory at KTH. “This makes it possible to communicate at much lower volume levels than before.”
Earlier approaches to the problem focused on making the speech more prominent, while the KTH researchers are paying attention to what is actually said. They do this by working with computer and machine speech recognition, which is modeled on human hearing. By creating speech that is easier for computers to recognise, people should benefit as well, the researchers say.
“Our manipulation, which is suited for a computer speech recogniser, also makes it easier for people to hear the right thing,” says Petko Petkov, also a PhD student in Communication Theory. “The modified words sound more distinct from each other, making it easier to distinguish them in the noise.”
Petkov and Henter have developed their method together with Professor Bastiaan Kleijn as part of the European collaborative LISTA – or Listening Talker – project.
A recent global evaluation by the LISTA Consortium at the University of Edinburgh showed significant increases in the number of words identified correctly in manipulated speech signals, compared with unaltered speech. The results of the LISTA evaluation are expected to be published later this year.
In some cases, the improvement in understanding is equivalent to turning down the speech volume by more than 5 decibels – roughly the difference in level between car noise and truck noise – while what is said remains just as clear.
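For readers unfamiliar with the scale, decibels are logarithmic: a level difference of d dB corresponds to a power ratio of 10^(d/10), so a 5 dB reduction is a bit more than a threefold drop in acoustic power. A quick sketch of the conversion:

```python
import math

def db_to_power_ratio(db: float) -> float:
    """A level difference of `db` decibels equals a power ratio of 10^(db/10)."""
    return 10.0 ** (db / 10.0)

def power_ratio_to_db(ratio: float) -> float:
    """Inverse mapping: 10 * log10 of the power ratio."""
    return 10.0 * math.log10(ratio)

print(round(db_to_power_ratio(5.0), 2))  # a 5 dB drop ~= 3.16x less power
```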
“This enables communication in conditions where speech normally would be impossible to understand,” says Henter.
The LISTA project is funded by the European Union’s Future and Emerging Technology framework programme, and involves scientists from Spain, Greece, Sweden and the UK. The techniques developed within the project involve both natural and synthetic speech in different types of noise. In addition to public address systems, the project could benefit a wide range of devices that produce speech output – such as mobile phones, radios and in-car navigation systems.
Researchers with the UC Davis MIND Institute and Agilent Laboratories have found that Prader-Willi syndrome — a genetic disorder best known for causing an insatiable appetite that can lead to morbid obesity — is associated with the loss of non-coding RNAs, resulting in the dysregulation of circadian and metabolic genes, accelerated energy expenditure and metabolic differences during sleep.
The research was led by Janine LaSalle, a professor in the UC Davis Department of Medical Microbiology and Immunology who is affiliated with the MIND Institute. It is published online in Human Molecular Genetics.
“Prader-Willi syndrome children do not sleep as well at night and have daytime sleepiness,” LaSalle said. “Parents have to lock up their pantries because the kids are rummaging for food in the middle of the night, even breaking into their neighbors’ houses to eat.”
The study found that these behaviors are rooted in the loss of a long non-coding RNA that functions to balance energy expenditure in the brain during sleep. The finding could have a profound effect on how clinicians treat children with Prader-Willi, as well as point the way to new, innovative therapies, LaSalle said.
The leading genetic cause of morbid obesity among children in the United States, Prader-Willi involves a complex, and sometimes contradictory, array of symptoms. Shortly after birth, children with Prader-Willi experience failure to thrive. Yet after they begin to feed themselves, they have difficulty sleeping and insatiable appetites that lead to obesity if their diets are not carefully monitored.
The current study was conducted in a mouse model of Prader-Willi syndrome. It found that mice engineered with the loss of a long non-coding RNA showed altered energy use and metabolic differences during sleep.
Prader-Willi has been traced to a specific region on chromosome 15 (SNORD116), which produces RNAs that regulate gene expression, rather than coding for proteins. When functioning normally, SNORD116 produces small nucleolar (sno) RNAs and a long non-coding RNA (116HG), as well as a third non-coding RNA implicated in a related disorder, Angelman syndrome. The 116HG long non-coding RNA forms a cloud inside neuronal nuclei that associates with proteins and genes regulating diurnal metabolism in the brain, LaSalle said.
“We thought the cloud would be activating transcription, but in fact it was doing the opposite,” she said. “Most of the genes were dampened by the cloud. This long non-coding RNA was acting as a decoy, pulling the active transcription factors away from genes and keeping them from being expressed.”
As a result, losing the snoRNAs and 116HG causes a chain reaction, eliminating the RNA cloud and allowing circadian and metabolic genes to be turned on during sleep periods, when they should be dampened down. Normally, the RNA cloud grows during sleep periods (daytime for nocturnal mice), turning down genes associated with energy use, and recedes during waking periods, allowing these genes to be expressed. Mice without the 116HG gene lack the benefit of this neuronal cloud, resulting in greater energy expenditure during sleep.
The researchers said that the work provides a clearer picture of why children with Prader-Willi syndrome can’t sleep or feel satiated and may change therapeutic approaches. For example, many such children have been treated with growth hormone because of short stature, but this actually may boost other aspects of the disease.
“People had thought the kids weren’t sleeping at night because of the sleep apnea caused by obesity,” said LaSalle. “What this study shows is that the diurnal metabolism is central to the disorder, and that the obesity may be a result of that. If you can work with that, you could improve therapies, for example figuring out the best times to administer medications.”
(Source: ucdmc.ucdavis.edu)
Past Brain Activation Revealed in Scans
Weizmann Institute scientists discover that spontaneously emerging brain activity patterns preserve traces of previous cognitive activity
What if experts could dig into the brain, like archaeologists, and uncover the history of past experiences? This ability might reveal what makes each of us a unique individual, and it could enable the objective diagnosis of a wide range of neuropsychological diseases. New research at the Weizmann Institute hints that such a scenario is within the realm of possibility: It shows that spontaneous waves of neuronal activity in the brain bear the imprints of earlier events for at least 24 hours after the experience has taken place.
The new research stems from earlier findings in the lab of Prof. Rafi Malach of the Institute’s Neurobiology Department and others that the brain never rests, even when its owner is resting. When a person is resting with closed eyes – that is, no visual stimulus is entering the brain – the normal bursts of nerve cell activity associated with incoming information are replaced by ultra-slow patterns of neuronal activity. Such spontaneous or “resting” waves travel in a highly organized and reproducible manner through the brain’s outer layer – the cortex – and the patterns they create are complex, yet periodic and symmetrical.
Like hieroglyphics, it seemed that these patterns might have some meaning, and research student Tal Harmelech, under the guidance of Malach and Dr. Son Preminger, set out to uncover their significance. Their idea was that the patterns of resting brain waves may constitute “archives” of earlier experiences. As we add new experiences, the activation of our brain’s networks leads to long-term changes in the links between brain cells, a capacity referred to as plasticity. As our experiences become embedded in these connections, they create “expectations” that come into play before we perform any type of mental task, enabling us to anticipate the result. The researchers hypothesized that information about earlier experiences would thus be incorporated into the links between networks of nerve cells in the cortex, and that these would show up in the brain’s spontaneously emerging wave patterns.
In the experiment, the researchers had volunteers undertake a training exercise that would strongly activate a well-defined network of nerve cells in the frontal lobes. While undergoing scans of their brain activity in the Institute’s functional magnetic resonance imaging (fMRI) scanner, the subjects were asked to imagine a situation in which they had to make rapid decisions. The subjects received auditory feedback in real time, based on the information obtained directly from their frontal lobe, which indicated the level of neuronal activity in the trained network. This “neurofeedback” strategy proved highly successful in activating the frontal network – a part of the brain that is notoriously difficult to activate under controlled conditions.
To test whether the connections created in the brain during this exercise would leave their traces in the patterns formed by the resting brain waves, the researchers performed fMRI scans on the resting subjects before the exercise, immediately afterward, and 24 hours later. Their findings, which appeared in the Journal of Neuroscience, showed that the activation of the specific areas in the cortex did indeed remodel the resting brain wave patterns. Surprisingly, the new patterns not only remained the next day, they were significantly strengthened. These observations fit in with the classic learning principles proposed by Donald Hebb in the mid-20th century, in which the co-activation of two linked nerve cells leads to long-term strengthening of their link, while activity that is not coordinated weakens this link. The fMRI images of the resting brain waves showed that brain areas that were activated together during the training sessions exhibited an increase in their functional link a day after the training, while those areas that were deactivated by the training showed a weakened functional connectivity.
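The Hebbian principle invoked here is often summarised as a simple update rule. The toy sketch below is only an illustration of that principle, not the analysis used in the study; representing activity as ±1 makes co-activation strengthen a link and uncoordinated activity weaken it:

```python
def hebbian_update(w: float, pre: float, post: float, lr: float = 0.1) -> float:
    """One Hebbian step: the weight grows when pre- and post-synaptic
    activity agree (+1, +1) and shrinks when they disagree (+1, -1)."""
    return w + lr * pre * post

w = 0.5
w_coactive = hebbian_update(w, 1.0, 1.0)   # co-activation: 0.5 -> 0.6
w_uncoord = hebbian_update(w, 1.0, -1.0)   # uncoordinated: 0.5 -> 0.4
```

The same qualitative pattern, links strengthening between co-activated areas and weakening elsewhere, is what the fMRI connectivity measurements showed at the level of whole brain regions.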
This research suggests a number of future possibilities for exploring the brain. For example, spontaneously emerging brain patterns could be used as a “mapping tool” for unearthing cognitive events from an individual’s recent past. Or, on a wider scale, each person’s unique spontaneously emerging activity patterns might eventually reveal a sort of personal profile – highlighting each individual’s abilities, shortcomings, biases, learning skills, etc. “Today, we are discovering more and more of the common principles of brain activity, but we have not been able to account for the differences between individuals,” says Malach. “In the future, spontaneous brain patterns could be the key to obtaining unbiased individual profiles.” Such profiles could be especially useful in diagnosing or learning the brain pathologies associated with a wide array of cognitive disabilities.
Protein Linked to Cognitive Decline in Alzheimer’s Identified
Researchers at Columbia University Medical Center (CUMC) have demonstrated that a protein called caspase-2 is a key regulator of a signaling pathway that leads to cognitive decline in Alzheimer’s disease. The findings, made in a mouse model of Alzheimer’s, suggest that inhibiting this protein could prevent the neuronal damage and subsequent cognitive decline associated with the disease. The study was published this month in the online journal Nature Communications.
One of the earliest events in Alzheimer’s is disruption of the brain’s synapses (the small gaps across which nerve impulses are passed), which can lead to neuronal death. Although what drives this process has not been clear, studies have indicated that caspase-2 might be involved, according to senior author Michael Shelanski, MD, PhD, the Delafield Professor of Pathology & Cell Biology, chair of the Department of Pathology and Cell Biology, and co-director of the Taub Institute for Research on Alzheimer’s Disease and the Aging Brain at CUMC.
Several years ago, in tissue culture studies of mouse neurons, Dr. Shelanski found that caspase-2 plays a critical role in the death of neurons in the presence of amyloid beta, the protein that accumulates in the neurons of people with Alzheimer’s. Other researchers have shown that caspase-2 also contributes to the maintenance of normal synaptic functions.
Dr. Shelanski and his team hypothesized that aberrant activation of caspase-2 may cause synaptic changes in Alzheimer’s disease. To test this hypothesis, the researchers crossed J20 transgenic mice (a common mouse model of Alzheimer’s) with caspase-2 null mice (mice that lack caspase-2). They compared the animals’ ability to negotiate a radial-arm water maze, a standard test of cognitive ability, with that of regular J20 mice and of normal mice at 4, 9, and 14 months of age.
The results for the three groups of mice were similar at the first two intervals. At 14 months, however, the J20/caspase-2 null mice performed significantly better in the water maze test than the J20 mice, and similarly to the normal mice. “We showed that removing caspase-2 from J20 mice prevented memory impairment — without significant changes in the level of soluble amyloid beta,” said co-lead author Roger Lefort, PhD, associate research scientist at CUMC.
Analysis of the neurons showed that the J20/caspase-2 null mice had a higher density of dendritic spines than the J20 mice. The more spines a neuron has, the more impulses it can transmit.
“The J20/caspase-2 null mice showed the same dendritic spine density and morphology as the normal mice—as opposed to the deficits in the J20 mice,” said co-lead author Julio Pozueta, PhD. “This strongly suggests that caspase-2 is a critical regulator in the memory decline associated with beta-amyloid in Alzheimer’s disease.”
The researchers further validated the results in studies of rat neurons in tissue culture.
Finally, the researchers found that caspase-2 interacts with RhoA, a critical regulator of the morphology (form and structure) of dendritic spines. “It appears that in normal neurons, caspase-2 and RhoA form an inactive complex outside the dendritic spines,” said Dr. Lefort. “When the complex is exposed to amyloid beta, it breaks apart, activating the two components.” Once activated, caspase-2 and RhoA enter the dendritic spines and contribute to their demise, possibly by interacting with a third molecule, the enzyme ROCK-II.
“This raises the possibility that if you can inhibit one or all of these molecules, especially early in the course of Alzheimer’s, you might be able to protect neurons and slow down the cognitive effects of the disease,” said Dr. Lefort.
'Singing' rats show hope for older humans with age-related voice problems
A new study shows that the vocal training of older rats reduces some of the voice problems related to their aging, such as the loss of vocal intensity that accompanies changes in the muscles of the larynx. This is an animal model of a vocal pathology that many humans face as they age. The researchers hope that in the future, voice therapy in aging humans will help improve their quality of life.
The research appears in The Journals of Gerontology.
University of Illinois speech and hearing science professor Aaron Johnson, who led the new study along with his colleagues at the University of Wisconsin, said that aging can cause the muscles of the larynx, the organ that contains the vocal folds, to atrophy. This condition, called presbyphonia, may be treatable with vocal training, he said.
In a healthy, young larynx, Johnson said, the vocal folds completely close and open during vibration, creating little puffs of air that we hear as sound. In people with presbyphonia, however, the atrophied vocal folds do not close properly, resulting in a gap during vocal fold vibration.
Degradation of the neuromuscular junction, or the interface between the nerve that signals the vocal muscle to work and the muscle itself, also contributes to the symptoms of presbyphonia, Johnson said. In a healthy human, when the signal reaches the neuromuscular junction, it triggers a release of chemicals that signal the muscle to contract. But an age-related decline in the neuromuscular junction can cause weakness and fatigue in the muscle, and may result in a person having a breathy or weak voice and becoming fatigued from the extra effort needed to communicate.
Surgery and injections may help correct the gap between the vocal folds seen in presbyphonia, but these invasive procedures are often not viable in the elderly population, Johnson said.
His previous experience working with the elderly as a former classical singer and voice teacher propelled Johnson to “become interested in what we can do as we get older to keep our voices healthy and strong.”
“We know exercise strengthens the limb musculature, but we wanted to know if vocal exercise can strengthen the muscles of the voice,” Johnson said.
To find out if vocal training could have an effect on the strength and physiology of the vocal muscles, Johnson turned to a rat model. Rats make ultrasonic vocalizations that are above the range of human hearing, but special recording equipment and a computer that lowers the frequency of the rat calls allow humans to perceive them. (They sound a bit like bird calls.)
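One standard way to bring ultrasonic calls into the audible range, and a plausible reading of the setup described here, is slowed playback: replaying the samples at a fraction of the recording rate divides every frequency component by the same factor. The sampling rate and slowdown factor below are assumptions for illustration, not values from the study.

```python
RECORD_RATE_HZ = 250_000  # assumed ultrasonic sampling rate (not stated in the article)
SLOWDOWN = 10             # assumed playback slowdown factor

def shifted_frequency(f_hz: float, slowdown: int = SLOWDOWN) -> float:
    """Playing a recording back `slowdown` times slower divides every
    frequency component by `slowdown`."""
    return f_hz / slowdown

playback_rate = RECORD_RATE_HZ // SLOWDOWN  # samples replayed at 25 kHz
print(shifted_frequency(50_000))  # a 50 kHz rat call becomes an audible 5 kHz tone
```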
Because rats and humans utilize similar neuromuscular mechanisms to vocalize, the rats make ideal subjects for the study of human vocal characteristics, Johnson said.
Both the treatment and control groups contained old and young male rats. In the treatment group, a female rat was placed into a cage with a male rat. When the male expressed interest in her, the female was removed from the cage, causing the male rat to vocalize. The male was rewarded with food for these vocalizations, and after eight weeks of this operant conditioning in which rewards were only given for certain responses, all of the rats in the treatment group had been trained to increase their number of vocalizations during a training session.
At the end of the eight-week period, the researchers measured the intensity of the rats’ vocalizations and analyzed the animals’ larynges to see whether the training had any effect on the condition of their neuromuscular junctions.
The researchers found the trained old and young rats had similar average vocal intensities, but the untrained older rats had lower average intensities than both the trained rats and the young rats that had not been trained. They also found several age-related differences within the groups’ neuromuscular mechanisms.
“Other research has found that in the elderly, there is a dispersion, or breaking apart, of the neuromuscular junction at the side that is on the muscle itself,” Johnson said. “We found that in the older rats that received training, it wasn’t as dispersed.”
These “singing rats” are the “first evidence that vocal use and vocal training can change the neuromuscular system of the larynx,” Johnson said.
“While this isn’t a human study, I think this tells us that we can train ourselves to use our voices and not only reduce the effects of age on the muscles of our voices, but actually improve voices that have degraded,” Johnson said.