Posts tagged science

May 23, 2012
Researchers at New York University and Albert Einstein College of Medicine of Yeshiva University have discovered new ways neurons work together to ease the transition between sleep and wakefulness. Their findings, which appear in the journal Neuron, provide additional insights into sleep-wake patterns and offer methods to explore what may disrupt them.
Their study explored the biological, or circadian, clocks of Drosophila fruit flies, which are commonly used for research in this area. This is because it is relatively easy to find mutants with malfunctioning biological clocks and then to identify the genes underlying the altered behavior. Such studies in fruit flies have allowed the identification of similar “clock genes” in mammals, which function in largely the same manner as they do in a fly’s clock.
In the Neuron study, the researchers moved up a level to study how pacemaker clock neurons—which express clock genes—interact with each other. Specifically, they looked at the relationship between master pacemaker neurons, which control the overall pace of the circadian system, and non-master pacemaker neurons, whose role in circadian rhythms has been less clear.
To do so, they examined flies with normally functioning master and non-master clock neurons and compared them with mutant flies in which the signaling of these neurons was either increased or decreased. These comparisons allowed the researchers to isolate the individual roles of these neurons and, in particular, to understand how master and non-master pacemaker neurons work together to control circadian rhythms.
Their results revealed a previously unknown role for non-master pacemaker neurons. Specifically, these neurons employ a neurotransmitter, glutamate, which suppresses signaling of the master pacemaker neurons during the evening. Artificially increasing this suppression by the non-master clock neurons in the morning made it much harder for flies to wake up. So in normal flies, these non-master pacemaker neurons have to stand aside at dawn, allowing the master pacemaker neurons to fire to wake up the fly. The authors concluded that the balance between signaling of these two groups of clock neurons helps to set the precise time of the transition between sleep and wakefulness.
"Our work shifts the emphasis away from clock genes and starts to address how clock neurons function in a neural network to regulate behavior," explained Justin Blau, an associate professor in NYU’s Department of Biology and one of the study’s co-authors. "And it shows the importance of studying individual groups of clock neurons, since different subsets can have opposite effects on animal behavior.”
"This work helps to elucidate the neurotransmitters and receptors that facilitate communication between specific groups of nerve cells that regulate circadian rhythm," said co-author Myles Akabas, professor of Physiology & Biophysics and of Neuroscience at Albert Einstein College of Medicine. "It demonstrates the power of collaborative interdisciplinary research to address the molecular and cellular basis for behavior."
Provided by New York University
Source: medicalxpress.com
May 23, 2012 by R. Alan Leo
For decades, neurologists have known that a diet high in fat and extremely low in carbohydrates can reduce epileptic seizures that resist drug therapy. But how the diet worked, and why, was a mystery—so much so that in 2010, The New York Times Magazine called it “Epilepsy’s Big, Fat Miracle.”
Now, researchers at Dana-Farber Cancer Institute and Harvard Medical School have proposed an answer, linking resistance to seizures to a protein that modifies cellular metabolism in the brain. The research, to be published in the May 24th issue of the journal Neuron, may lead to the development of new treatments for epilepsy.
The research was led jointly by Nika Danial, HMS assistant professor of cell biology at Dana-Farber Cancer Institute, and Gary Yellen, professor of neurobiology at Harvard Medical School. The first author was Alfredo Giménez-Cassina, a research fellow in Danial’s lab.
Epilepsy is a neurological disorder characterized by repeated seizures, an electrical storm in the brain that can manifest as convulsions, loss of motor control, or loss of consciousness. Some cases of epilepsy can be improved by a diet that drastically reduces sugar intake, triggering neurons to switch from their customary fuel of glucose to fat byproducts called ketone bodies. The so-called ketogenic diet, which mimics effects of starvation, was described more than 80 years ago and received renewed interest in the 1990s. Recent studies corroborate that it works, but shed little light on how.
"The connection between metabolism and epilepsy has been such a puzzle," said Yellen, who was introduced to the ketogenic diet by his wife, Elizabeth Thiele, an HMS professor of neurology who directs the Pediatric Epilepsy Program at MassGeneral Hospital for Children but was not directly involved in the study. "I’ve met a lot of kids whose lives are completely changed by this diet," Yellen said. "It’s amazingly effective, and it works for many kids for whom drugs don’t work."
"We knew we needed to come at this link between metabolism and epilepsy from a new angle," said Danial, who had previously discovered a surprising double duty for a protein known for its role in apoptosis: The protein, BCL-2-associated Agonist of Cell Death, or BAD, also regulated glucose metabolism.
Giménez-Cassina further discovered that certain modifications in BAD switched metabolism in brain cells from glucose to ketone bodies. “It was then that we realized we had come upon a metabolic switch to do what the ketogenic diet does to the brain without any actual dietary therapy,” said Giménez-Cassina, who went on to show that these same BAD modifications protect against seizures in experimental models of epilepsy. Still, it wasn’t clear exactly how.
Yellen suspected the solution involved potassium ion channels. While sodium and calcium ion channels tend to excite cells, including neurons, potassium channels tend to suppress cell electrical activity. His lab had previously linked ketone bodies to the activation of ATP-sensitive potassium (KATP) channels in neurons. Yellen had hypothesized that the ketogenic diet worked because ketone bodies provide neurons enough fuel for normal function, but when the electrical and energy storm of an epileptic seizure threatens, the activated KATP channels can shut the storm down. But the effects of diets are broad and complex, so it was impossible to say for sure.
The effects that Danial’s lab had discovered—BAD’s ability to alter metabolism and seizures—offered a new avenue for studying the therapeutic effects of altered metabolism. Together, the researchers decided to investigate whether Danial’s switch governed Yellen’s pathway, and whether they could reverse engineer the seizure protection of a ketogenic diet.
They could. Working in genetically altered mice, the researchers modified the BAD protein to reduce glucose metabolism and increase ketone body metabolism in the brain. Seizures decreased, but the benefit was erased when they knocked out the KATP channel—strong evidence that a BAD-KATP pathway conferred resistance to epileptic seizures. Further experiments suggested that it was indeed BAD’s role in metabolism, not cell death, that mattered. The findings make the BAD protein a promising target for new epilepsy drugs.
"Diet sounds like this wholesome way to treat seizures, but it’s very hard. I mean, diets in general are hard, and this diet is really hard," said Yellen, whose wife’s Center for Dietary Therapy in Epilepsy hosts a candy-free Halloween party for its many patients on the ketogenic diet. “So finding a pharmacological substitute for this would make lots of people really happy.”
Provided by Harvard Medical School
Source: medicalxpress.com
May 23, 2012
A new study finds that transplanting embryonic cells into adult mouse spinal cord can alleviate persistent pain. The research, published by Cell Press in the May 24th issue of the journal Neuron, suggests that reduced pain results from successful integration of the embryonic cells into the host spinal cord. The findings open avenues for clinical strategies aimed not just at treating the symptoms of chronic debilitating pain, but correcting the underlying disease pathology.
There are two major classes of chronic pain: inflammatory pain that results from injury to tissue, such as muscle and bone, and neuropathic pain from injury to nerves, for example, in the limbs or face. Damage to nerves can occur after physical trauma and from chemotherapy drugs. With neuropathic pain, the pain occurs in the absence of stimulation, and there is hypersensitivity and exacerbated pain to stimuli that would not normally cause pain. Neuropathic pain is thought to involve the loss of inhibitory neurons that release the chemical GABA, which is an inhibitory neurotransmitter that controls the excitability of neurons, including neurons that transmit pain information.
"Pharmacological approaches to managing neuropathic pain enhance GABA-mediated inhibition. However, some patients do not respond to these therapies and there are significant adverse side effects," explains senior study author, Dr. Allan Basbaum from the University of California, San Francisco. "Therefore, new therapeutic approaches for neuropathic pain are essential." Dr. Basbaum and colleagues explored whether replacement of the damaged inhibitory neurons might be useful for reducing neuropathic pain.
The researchers transplanted immature GABA neurons from mouse fetal brain into the spinal cord of mice with nerve injury-induced pain, a model for human neuropathic pain. The transplanted cells not only survived, but made connections with appropriate targets and integrated into the host spinal cord circuitry. This resulted in an almost complete reversal of the mechanical hypersensitivity generated in a nerve injury model of neuropathic pain. In contrast, the transplant procedure was not effective at reducing pain in a mouse model of inflammatory pain, which is induced by tissue injury.
Taken together, the findings have exciting implications for a cell-based treatment of neuropathic pain in humans. “Our strategy not only ameliorates the symptoms of neuropathic pain but, importantly, is also potentially disease modifying,” concludes Dr. Basbaum. “It is worth considering whether transplants such as these might have clinical utility in humans, a great advantage being that the adverse side effects associated with drug administration can be avoided.”
Provided by Cell Press
Source: medicalxpress.com
May 23, 2012
(Medical Xpress) — Our ability to imagine and plan our future depends on brain regions that store general knowledge, new research shows.
Dr. Muireann Irish from Neuroscience Research Australia (NeuRA) found that dementia patients who can no longer recall general knowledge – for example, the names of famous people or popular songs – are also unable to imagine themselves in the future.
"We already know that if memory of past events is compromised, as is the case in Alzheimer’s disease, then the ability to imagine future scenarios is also impaired,” says Dr. Irish.
"We have now discovered that damage to parts of the brain that store knowledge of facts and meanings can also produce the same effect," she says.
Thinking about the future is an important ability because it helps us to plan and anticipate the consequences of our actions.
"For example, a person with dementia may leave the oven on, partly because they forget the appropriate action, but also because they cannot project forward in time to anticipate the dangerous consequences this might have," says Dr. Irish.
Dr. Irish and colleagues used MRI to study people with Alzheimer’s disease (memories of past experiences are lost) as well as patients with semantic dementia who have lost the ability to remember facts (semantic memory) but have little problem remembering past experiences.
Surprisingly, she found that the semantic dementia group was as impaired as the Alzheimer’s group when imagining future events, even though their memory of past experiences was relatively intact.
"This is an important finding, as it points to multiple regions in the brain that are responsible for our ability to imagine and plan for the future,” she says.
Provided by Neuroscience Research Australia
Source: medicalxpress.com
ScienceDaily (May 22, 2012) — In recent studies, researchers at Barrow Neurological Institute at St. Joseph’s Hospital and Medical Center have unveiled how and why the public perceives some magic tricks, findings that could have real-world implications in military tactics, marketing and sports.

(Credit: © luzitanija / Fotolia)
Susana Martinez-Conde, PhD, of Barrow’s Laboratory of Visual Neuroscience, and Stephen Macknik, PhD, of Barrow’s Laboratory of Behavioral Neurophysiology are well known for their research into magic and illusions. Their most recent original research projects, published in Frontiers in Human Neuroscience, offer additional insight into perception and cognition.
One of the studies was initiated by professional magician Apollo Robbins, who believed that audience members directed their attention differently depending on the type of hand motion used. Robbins believed that if he moved his hand in a straight line while performing a trick the audience would focus on the beginning and end points of the motion, but not in between. In contrast, he believed if he moved his hand in a curved motion the audience would follow his hand’s trajectory from beginning to end.
By studying the eye movements of individuals as they watched Robbins perform, Barrow researchers confirmed Robbins’ theory. Perhaps more importantly, they also found that the different types of hand motion triggered two different types of eye movement. The researchers discovered that curved motion engaged smooth pursuit eye movements (in which the eye follows a moving object smoothly), whereas straight motion led to saccadic eye movements (in which the eye jumps from one point of interest to another).
"Not only is this discovery important for magicians, but the knowledge that curved motion attracts attention differently from straight motion could have wide-reaching implications — for example, in predator-prey evasion techniques in the natural world, military tactics, sports strategies and marketing," says Martinez-Conde. This finding is believed to be the first discovery in the neuroscientific literature initiated by a magician, rather than a scientist.
In another study, the researchers worked with professional magician Mac King to investigate magicians’ use of social cues — like the position of their gaze — to misdirect observers.
They studied a popular coin-vanishing trick, in which King tosses a coin up and down in his right hand before “tossing” it to his left hand, where it subsequently disappears. In reality, the magician only simulates tossing the coin to the left hand, an implied motion that essentially tricks the neurons into responding as they would have if the coin had actually been thrown.
The Barrow researchers discovered that social misdirection does not always help magic. By presenting two different videos of King — one in which the audience could see his face and another in which his face was hidden — they found that social misdirection did not play a role in this particular trick.
"We wondered if the observer’s perception of magic was going to be different if they could see the magician’s head and eye position. To our surprise, it didn’t matter," says Martinez-Conde. "This indicates that social misdirection in magic is more complicated than previously believed, and not necessary for the perception of all magic tricks."
Source: Science Daily
ScienceDaily (May 22, 2012) — When brain cells start oozing too much of the amyloid protein that is the hallmark of Alzheimer’s disease, the astrocytes that normally nourish and protect them deliver a suicide package instead, researchers report.

Drs. Michael Dinkins (from left), Guanghu Wang and Erhard Bieberich. (Credit: Image courtesy of Georgia Health Sciences University)
Amyloid is excreted by all neurons, but rates increase with aging and dramatically accelerate in Alzheimer’s. Astrocytes, which deliver blood, oxygen and nutrients to neurons in addition to hauling off some of their garbage, get activated and inflamed by excessive amyloid.
Now researchers have shown that another way astrocytes respond is by packaging the lipid ceramide with the protein PAR-4; each can do damage independently, but together they are a more “deadly duo,” said Dr. Erhard Bieberich, biochemist at the Medical College of Georgia at Georgia Health Sciences University.
"If the neuron makes something toxic and dumps it at your door, what would you do?" said Bieberich, corresponding author of the study published in the Journal of Biological Chemistry. “You would probably do something to defend yourself.”
The researchers hypothesize that this lipid-coated package ultimately kills them both, which could help explain the brain-cell death and shrinkage that occurs in Alzheimer’s. “If the astrocytes die, the neurons die,” Bieberich said, noting studies suggest that excess amyloid alone does not kill brain cells. “There must be a secondary process toxifying the amyloid; otherwise the neuron would self-intoxicate before it made a big plaque,” he said. “The neuron would die first.”
One of many avenues for future pursuit is whether a ceramide antibody could be a viable Alzheimer’s treatment. In the researchers’ studies of brain cells of humans with Alzheimer’s as well as an animal model of the disease, antibodies to ceramide and Par-4 prevented astrocytes’ amyloid-induced death.
Ceramide and Par-4 get packaged in lipid-coated vesicles called exosomes; all cells secrete thousands of these vesicles but scientists are only beginning to understand their normal function. When exosomes become deadly, they are called apoxosomes.
Ceramide and Par-4 typically reside not in a vesicle but in two distinct parts of the cell. Ceramide appears to take the lead in bringing the two together when confronted with amyloid. Bieberich and colleagues at the University of Georgia reported in 2003 that the deadly duo helps eliminate duplicate brain cells that occur early in brain development, when their survival could result in a malformed brain. They suspected then that the duo might also have a role in Alzheimer’s.
Risk factors for Alzheimer’s include aging, family history and genetics, according to the Alzheimer’s Association. Increasing evidence suggests that Alzheimer’s also shares many of the same risk factors for cardiovascular disease, such as high cholesterol, high blood pressure and inactivity.
Source: Science Daily
May 22, 2012
(Medical Xpress) — Researchers at the Institut Pasteur and the CNRS have recently identified in mice the role played by neo-neurons formed in the adult brain. By using selective stimulation the researchers were able to show that these neo-neurons increase the ability to learn and memorize difficult cognitive tasks. This newly discovered characteristic of neo-neurons to assimilate complex information could open up new avenues in the treatment of some neurodegenerative diseases. This publication is available online on the Nature Neuroscience journal’s website.

Section of a mouse brain observed using a fluorescence microscope. The green filaments represent neo-neurons in an organized network. Credit: Institut Pasteur
The discovery that new neurons could be formed in the adult brain created quite a stir in 2003 by debunking the age-old belief that a person is born with a set number of neurons and that any loss of neurons is irreversible. This discovery was all the more incredible considering that the function of these new neurons remained undetermined. That is, until today.
Using mouse models, the team working under Pierre-Marie Lledo, head of the Laboratory for Perception and Memory (Institut Pasteur/CNRS), recently revealed the role of these neo-neurons formed in the adult brain in learning and memory. With the help of an optogenetics-based experimental approach, developed by this same team and published in December 2010, the researchers were able to show that when stimulated by a brief flash of light, these neo-neurons facilitate both learning and the memorization of complex tasks. The mice were able to memorize information given during the learning activity more quickly and to remember exercises even 50 days after experimentation had ended. The study also shows that neo-neurons generated just after birth hold no added advantage for either learning or memory; it is only the neurons produced by the adult brain that have any considerable significance.
“This study shows that the activity of just a few neurons produced in the adult brain can still have considerable effects on cognitive processes and behavior. Moreover, this work helps to illustrate how the brain assimilates new stimulations seeing as normally electrical activity (which we mimic using flashes of light) is produced within the brain’s attention centers”, explains the study’s director Pierre-Marie Lledo.
Beyond simply discovering the functional contribution of these neo-neurons, the study has also reaffirmed the clear link between “mood” (defined here by a specific pattern of stimulation) and cerebral activity. It has been shown that curiosity, attentiveness and pleasure all promote the formation of neo-neurons and consequently the acquisition of new cognitive abilities. Conversely, a state of depression is detrimental to the production of new neurons and triggers a vicious cycle which prolongs this state of despondency. These results, and the optogenetics technologies that enabled this study, may prove very useful for devising therapeutic protocols which aim to counter the development of neurologic or psychiatric diseases.
Provided by CNRS
Source: medicalxpress.com
May 22, 2012
University of Georgia researchers have developed a map of the human brain that shows great promise as a new guide to the inner workings of the body’s most complex and critical organ.
With this map, researchers hope to create a next-generation brain atlas, an alternative to the atlas created by German anatomist Korbinian Brodmann more than 100 years ago, which is still commonly used in clinical and research settings.
Tianming Liu, assistant professor of computer science in the UGA Franklin College of Arts and Sciences, and his students Dajiang Zhu and Kaiming Li identified 358 landmarks throughout the brain related to memory, vision, language, arousal regulation and many other fundamental bodily operations. Their findings were published in the April issue of Cerebral Cortex.
The landmarks were discovered using diffusion tensor imaging, a sophisticated neuroimaging technique that allows scientists to visualize nerve fiber connections throughout the brain. Unlike many other neuroimaging studies, their map does not focus only on one section of the brain but rather the whole cerebral cortex.
"Previously, researchers would examine at most three or four small brain networks," Liu said. "We want to examine the whole brain connection, and this is the so-called connectome."
The new map provides a clearer picture of how different areas of the brain are physically connected and how these connections relate to basic brain function. Liu and his team examined hundreds of healthy young adults to establish the landmarks, which they call dense individualized and common connectivity-based cortical landmarks, or DICCCOL.
After extensive testing and comparison, the team determined that these nodes are present in every normal brain, meaning they can be used as a basis of comparison for those with damaged brain tissue or altered brain function.
"DICCCOL is very similar to a GPS system," Zhu said, "only it’s a GPS map of the human brain."
Now, thanks in part to a five-year, $1.6 million grant from the National Institutes of Health, Liu and collaborators Xiaoping Hu and Claire Coles at Emory University are preparing to test their brain map by comparing healthy brains with those of children whose brains were damaged by exposure to cocaine while in the womb.
Prenatal cocaine exposure, or PCE, can cause serious damage to brain networks. Because of this, analysis of the damage provides Liu and his team with an excellent opportunity to evaluate the usefulness of their map.
After comparing the PCE brains to those of healthy individuals, they hope to determine the segments of the brain responsible for physical or mental disabilities observed in children exposed to cocaine.
"The PCE brain is disrupted in a systematic way; the whole brain is wrongly wired," Liu said. "We want to test our map in one of the worst cases, and then we will know if it will work in other cases."
Once the robustness of their map is established, Liu and his team hope that it may prove useful in the evaluation of many other brain disorders, such as Alzheimer’s disease, Parkinson’s disease or stroke.
"This really is a fundamental technology," Liu said. "When we establish these DICCCOLS, we can very easily extend this project to other populations, to other brain diseases."
More information: Liu’s team published their DICCCOL data sets, which include the source code and diffusion tensor images, at http://dicccol.cs.uga.edu so other researchers may use the findings in their own experiments.
The article, “DICCCOL: Dense Individualized and Common Connectivity-Based Cortical Landmarks,” is available at http://cercor.oxfordjournals.org/content/early/2012/04/05/cercor.bhs072.short
Provided by University of Georgia
Source: medicalxpress.com
ScienceDaily (May 21, 2012) — Seventy-two percent of teenagers participating in a study experienced reduced hearing ability following exposure to a pop rock performance by a popular female singer.

(Credit: © DWP / Fotolia)
M. Jennifer Derebery, MD, a House Clinic physician, along with the House Research Institute, tested teens’ hearing before and after a concert and presented the findings at the American Otologic Society meeting on April 21, 2012. The study has been accepted for publication in an upcoming issue of Otology & Neurotology.
The hearing loss that may be experienced after a pop rock concert is not generally believed to be permanent. It is called a temporary threshold shift and usually disappears within 16-48 hours, after which a person’s hearing returns to previous levels.
“Teenagers need to understand that a single exposure to loud noise, either from a concert or a personal listening device, can lead to hearing loss,” said Derebery, the study’s lead author. “With multiple exposures to noise over 85 decibels, the tiny hair cells may stop functioning and the hearing loss may be permanent.”
In the study, twenty-nine teenagers were given free tickets to a rock concert. To ensure a similar level of noise exposure, the teens were seated in two blocks of seats within close range of each other, located in front of the stage at the far end of the venue, approximately 15-18 rows up from the floor.
Parental consent was obtained for all of the underage study participants. The importance of using hearing protection was explained to the teenagers. Researchers then offered hearing protection to the subjects and encouraged them to use the foam ear plugs. However, only three teenagers chose to do so.
Three adult researchers sat with the teenagers. Using a calibrated sound pressure meter, they recorded 1,645 measurements of sound level in A-weighted decibels (dBA) across the 26 songs played during the three-hour concert. The sound levels ranged from 82-110 dBA, with an average of 98.5 dBA. The mean level was greater than 100 dBA for 10 of the 26 songs.
The decibel levels at the concert exceeded what is allowable in the workplace under Occupational Safety and Health Administration (OSHA) guidelines, which set time limits for exposure to sound levels of 85 dB and greater. At the volumes recorded, those limits would have been exceeded in less than 30 minutes. In fact, one third of the teen listeners showed a temporary threshold shift that would not be acceptable in adult workplace environments.
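OSHA's workplace limit follows a simple rule: 8 hours of exposure are permitted at 90 dBA, and the permissible time halves for every 5 dBA above that (29 CFR 1910.95). As a rough illustration only (this calculation is not part of the study), the levels reported here can be plugged into that formula:

```python
def osha_allowable_hours(level_dba: float) -> float:
    """Permissible exposure time under OSHA 29 CFR 1910.95:
    8 hours at 90 dBA, halved for every 5 dBA increase."""
    return 8.0 / (2.0 ** ((level_dba - 90.0) / 5.0))

# Levels reported in the study: the OSHA action level, the concert's
# average, and the loudest measurement.
for level in (85.0, 98.5, 110.0):
    print(f"{level:5.1f} dBA -> {osha_allowable_hours(level):.2f} h allowed")

# A three-hour concert at the 98.5 dBA average alone already exceeds
# 100% of the permissible daily noise dose:
dose_percent = 3.0 / osha_allowable_hours(98.5) * 100
print(f"dose at 98.5 dBA for 3 h: {dose_percent:.0f}%")  # ~122%
```

A real OSHA dose calculation would sum the time spent at each measured level rather than use the average, so this sketch understates exposure during the songs that exceeded 100 dBA.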
Following the concert, the majority of the study participants also were found to have a significant reduction in the Distortion Product Otoacoustic Emissions (OAE) test. This test checks the function of the tiny outer hair cells in the inner ear that are believed to be the most vulnerable to damage from prolonged noise exposure, and are crucial to normal hearing, the ability to hear soft (low-level) sounds, and the ability to understand speech, especially in noisy environments. With exposure to loud noise, the outer hair cells show a reduction in their ability to function, which may later recover. However, it is known that with repeated exposure to loud noise, the tiny hair cells may become permanently damaged. Recent animal research suggests that a single exposure to loud noise may result in permanent damage to the hearing nerve connections themselves that are necessary to hear sound.
Following the concert, 53.6 percent of the teens said they did not think they were hearing as well after the concert. Twenty-five percent reported they were experiencing tinnitus or ringing in their ears, which they did not have before the concert.
Researchers are especially concerned because the most recent government survey on health in the United States, the National Health and Nutrition Examination Survey (NHANES) 2005-2006, found that 20% of adolescents had at least slight hearing loss, a 31% increase over a similar survey conducted from 1988-1994.
The findings of the study clearly indicate that more research is necessary to determine whether the guidelines for noise exposure need to be revised for teenagers, and whether teenagers’ ears are more sensitive to noise than adults’.
“It also means we definitely need to be doing more to ensure the sound levels at concerts are not so loud as to cause hearing loss and neurological damage in teenagers, as well as adults,” said Derebery. “Only 3 of our 29 teens chose to use ear protection, even when it was given to them and they were encouraged to do so. We have to assume this is typical behavior for most teen listeners, so we have the responsibility to get the sound levels down to safer levels.”
Researchers recommend that teenagers and young adults take an active role in protecting their hearing by using one of the many sound meter ‘apps’ available for smart phones. These meters give a rough estimate of the noise level, allowing listeners to take the necessary steps to protect their hearing, such as wearing ear plugs at a concert.
In addition, Derebery and the study co-authors would like to see concert promoters and the musicians themselves take steps to lower sound levels as well as encourage young concert goers to use hearing protection.
Source: Science Daily
ScienceDaily (May 21, 2012) — Turns out it’s not bad being top dog, or in this case, top baboon.

Wounded baboon. (Credit: Image courtesy of University of Notre Dame)
A new study by University of Notre Dame biologist Beth Archie and colleagues from Princeton and Duke Universities finds that high-ranking male baboons recover more quickly from injuries and are less likely to become ill than other males.
Archie, Jeanne Altmann of Princeton and Susan Alberts of Duke examined health records from the Amboseli Baboon Research Project in Kenya. They found that high rank is associated with faster wound healing. The finding is somewhat surprising, given that top-ranked males also experience high stress, which should suppress immune responses. They also found that social status is a better predictor of wound healing than age.
"In humans and animals, it has always been a big debate whether the stress of being on top is better or worse than the stress of being on the bottom," said Archie, lead researcher on the study. "Our results suggest that, while animals in both positions experience stress, several factors that go along with high rank might serve to protect males from the negative effects of stress."
"The power of this study is in identifying the biological mechanisms that may confer health benefits to high-ranking members of society," said George Gilchrist, program director in the National Science Foundation (NSF)’s Division of Biology, which funded the research. "We know that humans have such benefits, but it took meticulous long-term research on baboon society to tease out the specific mechanisms. The question remains of causation: Is one a society leader because of stronger immune function or vice versa?"
The researchers examined 27 years of data on naturally occurring illness and injuries in wild male baboons, a notably large data set. Although research on health and disease in animals in laboratory settings has been quite extensive, this study is one of the most comprehensive ever conducted on animals in a natural setting.
The research team investigated how differences in age, physical condition, stress, reproductive effort and testosterone levels contribute to status-related differences in immune functions. Previous research found that high testosterone levels and intense reproductive efforts can suppress immune function and are highest among high-ranking males.
However, Archie and her colleagues found that high-ranking males were less likely to become ill and recovered faster from injuries and illnesses than low-ranking males. The authors suggest that chronic stress, old age and poor physical condition associated with low rank may suppress immune function in low-ranking males.
"The complex interplay among social context, physiology and immune system-mediated health costs and benefits illustrates the power of interdisciplinary research," said Carolyn Ehardt, NSF program director for biological anthropology, which co-funded the research. "This research begins to tease apart the trade-offs in both high and low status in primates, including ourselves, which may lead to understanding the effects of social status on death and disease — not inconsequential for society as a whole."
Source: Science Daily