Posts tagged science

For many animals, making sense of the clutter of sensory stimuli is often a matter of literal life and death.
Exactly how animals separate objects of interest, such as food sources or the scent of predators, from background information, however, remains largely unknown. Even the extent to which animals can make such distinctions, and how differences between scents might affect the process, was largely a mystery – until now.
In a new study, described in an August 3 paper in Nature Neuroscience, a team of researchers led by Venkatesh Murthy, Professor of Molecular and Cellular Biology, showed that while mice can be trained to detect specific odorants embedded in random mixtures, their performance drops steadily as the number of background components increases. The team included Dan Rokni, Vikrant Kapoor and Vivian Hemmelder, all of Harvard University.
"There is a continuous stream of information constantly arriving at our senses, coming from many different sources," Murthy said. "The classic example would be a cocktail party – though it may be noisy, and there may be many people talking, we are able to focus our attention on one person, while ignoring the background noise.
"Is the same also true for smells?" he continued. "We are bombarded with many smells all jumbled up. Can we pick out one smell "object" – the smell of jasmine, for example, amidst a riot of other smells? Our experience tells us indeed we can, but how do we pick out the ones that we need to pay attention to, and what are the limitations?"
To find answers to those, and other, questions, Murthy and colleagues turned to mice.
After training mice to detect specific scents, the researchers presented the animals with a combination of smells – sometimes including the “target” scent, sometimes not. Though previous studies had suggested animals are poor at picking out individual smells, instead perceiving a mixture as a single scent, their findings showed that mice were able to identify when a target scent was present with 85 percent accuracy or better.
"Although the mice do well overall, they perform progressively poorer when the number of background odors increases," Murthy explained.
Understanding why, however, meant first overcoming a problem particular to olfaction.
While the relationship between visual stimuli is relatively easy to understand – differences in color can be easily described as differences in the wavelength of light – no such system exists to describe how two odors relate to each other. Instead, the researchers sought to describe scents according to how they activated neurons in the brain.
Using fluorescent proteins, they created images that show how each of 14 different odors stimulated neurons in the olfactory bulb. What they found, Murthy said, was that the ability of mice to identify a particular smell was markedly diminished if background smells activated the same neurons as the target odor.
"Each odor gives rise to a particular spatial pattern of neural responses," Murthy said. "When the spatial pattern of the background odors overlapped with the target odor, the mice did much more poorly at detecting the target. The difficulty of picking out a particular smell among a jumble of other odors therefore depends on how much the background interferes with your target smell. So, we were able to give a neural explanation for how well you can solve the cocktail party problem.
"This study is interesting because it first shows that smells are not always perceived as one whole object – they can be broken down into their pieces," he added. "This is perhaps not a surprise – there are in fact coffee or wine specialists that can detect faint whiffs of particular elements within the complex mixture of flavors in each coffee or wine. But by doing these studies in mice, we can now get a better understanding of how the brain does this. One can also imagine that understanding how this is done may also allow us to build artificial olfactory systems that can detect specific chemicals in the air that are buried amidst a plethora of other odors."
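The idea that detection difficulty scales with how much background odors overlap the target's neural activation pattern can be made concrete with a toy calculation. The sketch below is purely illustrative – the patterns, glomerulus counts and odor numbers are made up, not data from the study – but it shows one simple way to quantify overlap between binary activation maps:

```python
import numpy as np

rng = np.random.default_rng(0)

def activation_pattern(n_glomeruli=100, n_active=15):
    """Hypothetical binary map of which olfactory-bulb units an odor activates."""
    pattern = np.zeros(n_glomeruli, dtype=bool)
    pattern[rng.choice(n_glomeruli, size=n_active, replace=False)] = True
    return pattern

def overlap(target, background):
    """Fraction of the target's active units also driven by the background."""
    return (target & background).sum() / target.sum()

target = activation_pattern()

# A background mixture is modeled as the union of several distractor patterns;
# adding more distractors can only increase the overlap with the target.
background = np.zeros_like(target)
for _ in range(8):
    background |= activation_pattern()

print(f"target-background overlap: {overlap(target, background):.2f}")
```

On this toy model, the more background odors are mixed in, the larger the overlap score tends to grow – mirroring the finding that mice perform progressively worse as background components accumulate.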
A little video gaming ‘produces well-adjusted children’
Playing video games for a short period each day could have a small but positive impact on child development, a study by Oxford University suggests.
Scientists found young people who spent less than an hour a day engaged in video games were better adjusted than those who did not play at all.
But children who used consoles for more than three hours reported lower satisfaction with their lives overall.
The research is published in the journal Pediatrics.
The tiny addition of a chemical mark atop a gene that is well known for its involvement in clinical depression and posttraumatic stress disorder can affect the way a person’s brain responds to threats, according to a new study by Duke University researchers.
The results, which appear online August 3 in Nature Neuroscience, go beyond genetics to help explain why some individuals may be more vulnerable than others to stress and stress-related psychiatric disorders.
The study focused on the serotonin transporter, a molecule that regulates the amount of serotonin signaling between brain cells and is a major target for treatment of depression and mood disorders. In the 1990s, scientists discovered that differences in the DNA sequence of the serotonin transporter gene seemed to give some individuals exaggerated responses to stress, including the development of depression.

(Image caption: An artist’s conception shows how molecules called methyl groups attach to a specific stretch of DNA, changing expression of the serotonin transporter gene in a way that ultimately shapes individual differences in the brain’s reactivity to threat. The methyl groups in this diagram are overlaid on the amygdala of the brain, where threat perception occurs. Credit: Annchen Knodt, Duke University)
Sitting on top of the serotonin transporter’s DNA (and studding the entire genome) are chemical marks called methyl groups that help regulate where and when a gene is active, or expressed. DNA methylation is one form of epigenetic modification being studied by scientists trying to understand how the same genetic code can produce so many different cells and tissues as well as differences between individuals as closely related as twins.
In looking for methylation differences, “we decided to start with the serotonin transporter because we know a lot about it biologically, pharmacologically, behaviorally, and it’s one of the best characterized genes in neuroscience,” said senior author Ahmad Hariri, a professor of psychology and neuroscience and member of the Duke Institute for Brain Sciences.
"If we’re going to make claims about the importance of epigenetics in the human brain, we wanted to start with a gene that we have a fairly good understanding of," Hariri said.
This work is part of the ongoing Duke Neurogenetics Study (DNS), a comprehensive study linking genes, brain activity and other biological markers to risk for mental illness in young adults.
The group performed non-invasive brain imaging in the first 80 college-aged participants of the DNS, showing them pictures of angry or fearful faces and watching the responses of a deep brain region called the amygdala, which helps shape our behavioral and biological responses to threat and stress.
The team also measured the amount of methylation on serotonin transporter DNA isolated from the participants’ saliva, in collaboration with Karestan Koenen at Columbia University’s Mailman School of Public Health in New York.
The greater the methylation of an individual’s serotonin transporter gene, the greater the reactivity of the amygdala, the study found. Increased amygdala reactivity may in turn contribute to an exaggerated stress response and vulnerability to stress-related disorders.
To the group’s surprise, even small methylation variations between individuals were sufficient to create differences in individuals’ amygdala reactivity, said lead author Yuliya Nikolova, a graduate student in Hariri’s group. The amount of methylation was a better predictor of amygdala activity than DNA sequence variation, which had previously been associated with risk for depression and anxiety.
The team was excited about the discovery but also cautious, Hariri said, because there have been many findings in genetics that were never replicated.
That’s why they jumped at the chance to look for the same pattern in a different set of participants, this time in the Teen Alcohol Outcomes Study (TAOS) at the University of Texas Health Science Center at San Antonio.
Working with TAOS director Douglas Williamson, the group again measured amygdala reactivity to angry and fearful faces as well as methylation of the serotonin transporter gene isolated from blood in 96 adolescents between 11 and 15 years old. The analyses revealed an even stronger link between methylation and amygdala reactivity.
"Now over 10 percent of the differences in amygdala function mapped onto these small differences in methylation," Hariri said. The DNS study had found just under 7 percent.
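The "10 percent of the differences" figure is a statement about variance explained (R²) – the share of between-person variation in amygdala reactivity that a linear fit on methylation accounts for. The sketch below illustrates the calculation on made-up numbers; the effect size, sample size and units are hypothetical, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: methylation level and amygdala reactivity for 96 participants
n = 96
methylation = rng.normal(loc=5.0, scale=1.0, size=n)
reactivity = 0.35 * methylation + rng.normal(scale=1.0, size=n)

def r_squared(x, y):
    """Fraction of variance in y explained by a least-squares linear fit on x."""
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    return 1.0 - residuals.var() / y.var()

print(f"variance explained: {r_squared(methylation, reactivity):.1%}")
```

An R² of 0.10 would correspond to "over 10 percent of the differences" mapping onto methylation, against just under 7 percent in the earlier DNS sample.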
Taking the study one step further, the group also analyzed patterns of methylation in postmortem human brains, in collaboration with Etienne Sibille at the University of Pittsburgh, now at the Centre for Addiction and Mental Health in Toronto.
Once again, they saw that methylation of a single spot in the serotonin transporter gene was associated with lower levels of serotonin transporter expression in the amygdala.
"That’s when we thought, ‘Alright, this is pretty awesome,’" Hariri said.
Hariri said the work reveals a compelling mechanistic link: Higher methylation is generally associated with less reading of the gene, and that’s what they saw. He said methylation dampens expression of the gene, which then affects amygdala reactivity, presumably by altering serotonin signaling.
The researchers would now like to see how methylation of this specific bit of DNA affects the brain. In particular, this region of the gene might serve as a landing place for cellular machinery that binds to the DNA and reads it, Nikolova said.
The group also plans to look at methylation patterns of other genes in the serotonin system that may contribute to the brain’s response to threatening stimuli.
The fact that serotonin transporter methylation patterns were similar in saliva, blood and brain also suggests that these patterns may be passed down through generations rather than acquired by individuals based on their own experiences.
Hariri said he hopes that other researchers looking for biomarkers of mental illness will begin to consider methylation above and beyond DNA sequence-based variation and across different tissues.
(Source: eurekalert.org)

(Image caption: Brain image showing activity in the amygdala, the area of the brain involved with emotion. The amygdala was more active during the graphic scenarios only when the harm being described was intentional. Credit: Marois Lab / Vanderbilt)
Fault trumps gruesome evidence when it comes to meting out punishment
Issues of crime and punishment, vengeance and justice date back to the dawn of human history, but it is only in the last few years that scientists have begun exploring the basic nature of the complex neural processes in the brain that underlie these fundamental behaviors.
Now a new brain imaging study – published online Aug. 3 by the journal Nature Neuroscience – has identified the brain mechanisms that underlie our judgment of how severely a person who has harmed another should be punished. Specifically, the study determined how the area of the brain that determines whether such an act was intentional or unintentional trumps the emotional urge to punish the person, however gruesome the harm may be.
“A fundamental aspect of the human experience is the desire to punish harmful acts, even when the victim is a perfect stranger. Equally important, however, is our ability to put the brakes on this impulse when we realize the harm was done unintentionally,” said Rene Marois, the Vanderbilt University professor of psychology who headed the research team. “This study helps us begin to elucidate the neural circuitry that permits this type of regulation.”
The study
In the experiment, the brains of 30 volunteers (20 male, 10 female, average age 23 years) were imaged using functional MRI (fMRI) while they read a series of brief scenarios that described how the actions of a protagonist named John brought harm to either Steve or Mary. The scenarios depicted four different levels of harm: death, maiming, physical assault and property damage. In half of them, the harm was clearly identified as intentional and in half it was clearly identified as unintentional.
Two versions of each scenario were created: one with a factual description of the harm and the other with a graphic description. For example, in a mountain climbing scenario where John cuts Steve’s rope, the factual version states, “Steve falls 100 feet to the ground below. Steve experiences significant bodily harm from the fall and he dies from his injuries shortly after impact.” And the graphic version reads, “Steve plummets to the rocks below. Nearly every bone in his body is broken upon impact. Steve’s screams are muffled by thick, foamy blood flowing from his mouth as he bleeds to death.”
After reading each scenario, the participants were asked to rate how much punishment John deserved on a scale from zero (no punishment) to nine (the most severe punishment the subject endorsed).
Analysis of the responses
When the responses were analyzed, the researchers found that the manner in which the harmful consequences of an action are described significantly influences the level of punishment that people consider appropriate: When the harm was described in a graphic or lurid fashion then people set the punishment level higher than when it was described matter-of-factly. However, this higher punishment level only applied when the participants considered the resulting harm to be intentional. When they considered it to be unintentional, the way it was described didn’t have any effect.
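The pattern described here is a 2×2 interaction: the language effect (graphic minus factual) appears in the intentional condition but vanishes in the unintentional one. The sketch below illustrates how such an interaction would show up in condition means – all ratings are simulated placeholders, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical punishment ratings (0-9 scale) for the four scenario conditions,
# 30 simulated participants per cell
ratings = {
    ("intentional", "graphic"):   rng.normal(6.5, 1.0, 30).clip(0, 9),
    ("intentional", "factual"):   rng.normal(5.5, 1.0, 30).clip(0, 9),
    ("unintentional", "graphic"): rng.normal(2.0, 1.0, 30).clip(0, 9),
    ("unintentional", "factual"): rng.normal(2.0, 1.0, 30).clip(0, 9),
}

def language_effect(intent):
    """Mean rating difference, graphic minus factual, within one intent condition."""
    return ratings[(intent, "graphic")].mean() - ratings[(intent, "factual")].mean()

print(f"intentional:   graphic - factual = {language_effect('intentional'):+.2f}")
print(f"unintentional: graphic - factual = {language_effect('unintentional'):+.2f}")
```

A positive difference for intentional harm alongside a near-zero difference for unintentional harm is the signature of the interaction the researchers report.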
“What we’ve shown is that manipulations of gruesome language leads to harsher punishment, but only in cases where the harm was intentional. Language had no effect when the harm was caused unintentionally,” summarized Michael Treadway, a post-doctoral fellow at Harvard Medical School and lead author of the study.
According to the researchers, the fact that the mere presence of graphic language could cause participants to ratchet up the severity of the punishments suggests that photographs, video and other graphic materials sampled from a crime scene are likely to have an even stronger impact on an individual’s desire to punish.
“Although the underlying scientific basis of this effect wasn’t known until now, the legal system recognized it a long time ago and made provisions to counteract it,” said Treadway. “Judges are permitted to exclude relevant evidence from a trial if they decide that its probative value is substantially outweighed by its prejudicial nature.”
Underlying neuroanatomy
The fMRI scans revealed the areas of the brain that are involved in this complex process. They found that the amygdala, an almond-shaped set of neurons that plays a key role in processing emotions, responded most strongly to the graphic language condition. Like the punishment ratings themselves, however, this effect in the amygdala was only present when harm was done intentionally. Moreover, in this situation the researchers found that the amygdala showed stronger communication with the dorsolateral prefrontal cortex (dlPFC), an area that is critical for punishment decision-making. When the harm was done unintentionally, however, a different regulatory network – one involved in decoding the mental states of other people – became more active and appeared to suppress amygdala responses to the graphic language, thereby preventing the amygdala from affecting decision-making areas in dlPFC.
“This is basically a reassuring finding,” said Marois. “It indicates that, when the harm is not intended, we don’t simply shunt aside the emotional impulse to punish. Instead, it appears that the brain down-regulates the impulse so we don’t feel it as strongly. That is preferable because the urge to punish is less likely to resurface at a future date.”
Do we really only use 10% of our brain?
As the new film Lucy, starring Scarlett Johansson and Morgan Freeman, is set to be released in cinemas this week, I feel I should attempt to dispel the unfounded premise of the film – that we only use 10% of our brains. Let me state that there is no scientific evidence supporting this claim; it is simply a myth.
The concept behind the film is that, through the administration of a new cognitive-enhancing drug, our female lead character, Lucy, becomes able to harness powerful mental capabilities and enhanced physical abilities. These include telekinesis, mental time travel and the ability to absorb information instantaneously. Viewed this way, the human brain should be essentially capable of these feats; we simply fail to push our capacity. So if we could unlock the “unused” 90% of the brain, could we too be geniuses with super powers?
Bad maths grades, poor participation in class, no interest in arithmetic. Preterm children often suffer from dyscalculia – at least according to some scientific studies. A misunderstanding, claims developmental psychologist Dr Julia Jäkel, who has been studying the performance of preterm children.
Thanks to modern medicine, the percentage of preterm infants who survive is constantly increasing. On the cognitive level, these children frequently have long-term problems such as poor arithmetic skills and difficulty concentrating. For a long time, research focused on high-risk children, born before 32 weeks gestational age or weighing less than 1,500 grams. Recent studies, however, show that this approach is too narrow.
Dr Julia Jäkel from the Department of Developmental Psychology has analysed cognitive abilities of children born between 23 and 41 weeks gestation. In doing so, she covered the entire spectrum, ranging from extremely preterm to healthy term born infants. For this purpose, she used data of the Bavarian Longitudinal Study, which has been following a birth cohort from the late 80s until today. “Having access to such a comprehensive long-term study is a dream come true for every developmental psychologist,” says the Bochum researcher. Over the course of the study, all children underwent a whole battery of tests that assessed their cognitive and educational abilities, and their parents were interviewed in depth.
The RUB researcher has so far mainly focused on data collected at preschool and early school age. For different test tasks, she assessed their cognitive workload, a criterion for the complexity of a given task. The data showed that preterm children had greater difficulties with tasks that demanded higher working memory resources. Moreover, results revealed that not only high-risk children had significant difficulties. On average, the more preterm a child had been born, the poorer were his or her abilities to solve complex tasks.
But what exactly is the nature of these difficulties? It has been frequently suggested that preterm children suffer from dyscalculia – a phenomenon that Julia Jäkel examined more closely. “Mathematical deficiencies, maths learning disorder, dyscalculia, innumeracy – these terms’ definitions vary slightly,” she explains, but there are no standardised, internationally consistent diagnostic criteria. In order to assess specific maths deficiencies, children in Germany are given a number of tests. If their results fall below a certain cut-off value in maths while their cognitive skills (IQ) are in the normal range, they are diagnosed with “maths learning disorder” or “dyscalculia”.
“The problem with preterm children, however, is that they often have general cognitive deficits,” Julia Jäkel points out. “According to current criteria, these children can’t be diagnosed.” Together with Dieter Wolke from the University of Warwick, UK, she compared different diagnostic criteria for dyscalculia in her analysis. The aim of the study was to identify specific maths deficiencies in preterm children that were independent of general cognitive impairments. With surprising results: “There is no specific maths deficit in preterm children if their general IQ is factored in,” says the researcher.
This means that preterm children do not suffer from dyscalculia more often than term children. However, they often have maths difficulties and these may not be recognized. This is because the current criteria make it impossible to diagnose dyscalculia if a child also has general cognitive deficits. Thus, these children do not receive specific help in maths although they may be in urgent need. “We need reliable and consistent diagnostic criteria,” demands Julia Jäkel. “And we’ve got to find ways to actually deliver support in schools.”
Together with her British team, the psychologist compared the results of the Bavarian Longitudinal Study with “EPICure” data, a similar study that commenced in the UK in the 1990s, following a cohort of extremely preterm children. The researchers focus on mathematical and educational performance. British preterm children had similar cognitive and basic numerical skills as German preterm children. In terms of maths achievement, however, they showed significantly better results. “We explain this with the fact that, unlike in Germany, in the UK it has not been possible for children to delay school entry,” explains Julia Jäkel. “In addition, special schools are attended by only a small percentage of extremely disabled children. All other children are integrated into normal classes in regular schools and receive targeted support there.”
The developmental psychologist has already demonstrated that assistance at primary-school age can really make a difference. Parents who support their preterm children with sensitive scaffolding can compensate for the negative cognitive effects of preterm birth. It is helpful, for example, if parents give their children appropriate feedback on homework tasks and suggest potential solutions, rather than solving the tasks for the child. However, Julia Jäkel believes that a lot of research is yet to be done as far as intervention is concerned: “A large percentage of parents are very dedicated and have the resources to help their children,” she says. “But research has not yet produced anything that would ensure successful results in the long term.” Together with colleagues from the university hospital in Essen, the RUB researcher plans to investigate the benefits of computer-aided working memory training – an approach that has already been applied successfully internationally – for preterm children’s school success.
It would also be helpful if findings from related disciplines, such as developmental psychology, educational research, and neonatal medicine were better integrated. This is, for example, because neonatal medical treatment can significantly affect later cognitive performance. Together with her interdisciplinary team, Julia Jäkel used a comprehensive model to analyse to what extent different neonatal medical indicators affect cognitive development at age 20 months, attention abilities at age six, and maths abilities at age eight years. In her analyses, she factored in child sex and socio-economic status.
Results showed that neonatal medical variables, e.g., the duration of mechanical ventilation, predicted cognitive abilities at age 20 months. Both factors together predicted attention regulation at age six years. And all those precursors, in turn, affected long-term general maths abilities.
Subsequently, Julia Jäkel analysed the data once again from a different perspective, in order to predict specific maths skills that were independent of the child’s IQ. In that model, only two variables had direct impact: the duration of mechanical ventilation and hospitalisation after birth. In the 1980s, when children participating in the Bavarian Longitudinal Study were born, German doctors often used invasive ventilation methods. Today, less invasive methods are available, but to what extent they may affect long-term cognitive performance has not yet been investigated.
“Both too high and too low oxygen concentrations are harmful to brain development,” explains Julia Jäkel. “The neonatologist in charge is faced with the great challenge of determining the right dose for each infant, depending on individually changing situations.” This is why it is so important to integrate psychological models with neonatal intensive care research. The joint objective is to offer preterm children the chance of a successful school career, high quality of life and social participation.
Clues to curbing obesity found in neuronal ‘sweet spot’
Preventing weight gain, obesity, and ultimately diabetes could be as simple as keeping a nuclear receptor from being activated in a small part of the brain, according to a new study by Yale School of Medicine researchers.
Published in the Aug. 1 issue of The Journal of Clinical Investigation (JCI), the study showed that when the researchers blocked the effects of the nuclear receptor PPARgamma in a small number of brain cells in mice, the animals ate less and became resistant to a high-fat diet.
“These animals ate fat and sugar, and did not gain weight, while their control littermates did,” said lead author Sabrina Diano, professor in the Department of Obstetrics, Gynecology & Reproductive Sciences at Yale School of Medicine. “We showed that the PPARgamma receptor in neurons that produce POMC could control responses to a high-fat diet without resulting in obesity.”
POMC neurons are found in the hypothalamus and regulate food intake. They are the neurons that when activated make you feel full and curb appetite. PPARgamma regulates the activation of these neurons.
Diano and her team studied transgenic mice that were genetically engineered to delete the PPARgamma receptor from POMC neurons. They wanted to see if they could prevent the obesity associated with a high-fat, high-sugar diet.
“When we blocked PPARgamma in these hypothalamic cells, we found an increased level of free radical formation in POMC neurons, and they were more active,” said Diano, who is also professor of comparative medicine and neurobiology at Yale and director of the Reproductive Neurosciences Group.
The findings also have key implications for diabetes. PPARgamma is a target of thiazolidinediones (TZDs), a class of drugs used to treat type 2 diabetes. They lower blood-glucose levels; however, patients gain weight on these medications.
“Our study suggests that the increased weight gain in diabetic patients treated with TZD could be due to the effect of this drug in the brain, therefore, targeting peripheral PPARgamma to treat type 2 diabetes should be done by developing TZD compounds that can’t penetrate the brain,” said Diano. “We could keep the benefits of TZD without the side-effects of weight gain. Our next steps in this research are to test this theory in diabetes mouse models.”
(Figure 1: Axons grow and turn in response to guidance cues (arrows), which regulate endocytosis and exocytosis at the tips of growing axons. Credit: © 2014 T. Tojima et al.)
Steering the filaments of the developing brain
During brain development, nerve fibers grow and extend to form brain circuits. This growth is guided by molecular cues (Fig. 1), but exactly how these cues guide axon extension has been unclear. Takuro Tojima and colleagues from the RIKEN Brain Science Institute have now uncovered the signaling pathways responsible for turning growing nerve fibers, or axons, toward or away from guidance cues.
The researchers previously showed that axon-repelling cues act by inducing the removal of cell membrane—a process called endocytosis—from the side of the axon closest to the repulsive cue. The enzyme PIPKIγ90 is known to be involved in endocytosis in axons during certain types of synaptic activity, so the researchers investigated whether PIPKIγ90 also played a role in endocytosis during axon turning. By examining the developing brains of chicken embryos expressing an inactive form of PIPKIγ90, the researchers found that cues normally inducing endocytosis were no longer effective in repelling axon growth.
Cues that normally attract axons do so by driving membrane addition—exocytosis—on the side of the axon closest to the cue and also by suppressing endocytosis. Tojima’s team found that axons continued to be attracted to such cues even in the absence of PIPKIγ90, suggesting that PIPKIγ90 signaling is not involved in axon attraction.
The activity of PIPKIγ90 is known to be regulated by an enzyme called CDK5, a subunit of which binds to the protein kinase CaMKII. The researchers found that by inhibiting CDK5 or CaMKII, and thereby blocking the regulation of PIPKIγ90 that is needed to suppress endocytosis, endocytosis could occur in response to attractive cues.
They also found, however, that blocking CDK5 or CaMKII did not have any effect on endocytosis if the neurons expressed a mutant version of PIPKIγ90 that was unaffected by CDK5 and CaMKII signaling. As inhibitors of CDK5 or CaMKII did not alter endocytosis in response to repulsive cues, the team’s findings indicate that different signaling pathways are responsible for turning axons toward or away from guidance cues.
Additionally, Tojima and his colleagues showed that they could induce the attraction of axons toward drugs that inhibit endocytosis, suggesting that being able to control the direction of axon growth has potential therapeutic applications. “We hope our findings will aid in the development of future therapeutic strategies for rewiring neuronal networks after spinal cord injury and neurodegenerative diseases,” explains Tojima.
New research at Washington University School of Medicine in St. Louis helps explain why brain tumors occur more often in males and frequently are more harmful than similar tumors in females. For example, glioblastomas, the most common malignant brain tumors, are diagnosed twice as often in males, who suffer greater cognitive impairments than females and do not survive as long.

The researchers found that retinoblastoma protein (RB), a protein known to reduce cancer risk, is significantly less active in male brain cells than in female brain cells.
The study appears Aug. 1 in The Journal of Clinical Investigation.
“This is the first time anyone ever has identified a sex-linked difference that affects tumor risk and is intrinsic to cells, and that’s very exciting,” said senior author Joshua Rubin, MD, PhD. “These results suggest we need to go back and look at multiple pathways linked to cancer, checking for sex differences. Sex-based distinctions at the level of the cell may not only influence cancer risk but also the effectiveness of treatments.”
Rubin noted that RB is the target of drugs now being evaluated in clinical trials. Trial organizers hope the drugs trigger the protein’s anti-tumor effects and help cancer patients survive longer.
“In clinical trials, we typically examine data from male and female patients together, and that could be masking positive or negative responses that are limited to one sex,” said Rubin, who is an associate professor of pediatrics, neurology and anatomy and neurobiology. “At the very least, we should think about analyzing data for males and females separately in clinical trials.”
Scientists have identified many sex-linked diseases that either occur at different rates in males and females or cause different symptoms based on sex. These distinctions often are linked to sex hormones, which create and maintain many but not all of the biological differences between the sexes.
However, Rubin and his colleagues knew that sex hormones could not account for the differences in brain tumor risk.
“Male brain tumor risk remains higher throughout life despite major age-linked shifts in sex hormone production in males and females,” he said. “If the sex hormones were causing this effect, we’d see major changes in the relative rates of brain tumors in males and females at puberty. But they don’t happen then or later in life when menopause changes female sex hormone production.”
Rubin used a cell model of glioblastoma to show that it is easier to make male brain cells turn cancerous. After a series of genetic alterations and exposure to a growth factor, male brain cells became cancerous faster and more often than female brain cells.
In experiments designed to identify the reasons for the differences in the male and female cells, the team evaluated three genes to see if they were naturally less active in male brain cells. The genes they studied — neurofibromin, p53 and RB — normally suppress cell division and cell survival. They are mutated and disabled in many cancers.
The scientists found RB was more likely to be inactivated in male brain cells than in female brain cells. When they disabled the RB protein in female brain cells, those cells became just as susceptible as male cells to becoming cancerous.
“There are other types of tumors that occur at different rates based on sex, such as some liver cancers, which occur more often in males,” Rubin said. “Knowing more about why cancer rates differ between males and females will help us understand basic mechanisms in cancer, seek more effective therapies and perform more informative clinical trials.”
(Source: news.wustl.edu)
Parkinson’s disease affects neurons in the substantia nigra brain region – their mitochondrial activity ceases and the cells die. Researchers at the Max Planck Institute of Molecular Cell Biology and Genetics show that supplying D-lactate or glycolate, two products of the enzyme encoded by the gene DJ-1, can stop and even counteract this process: adding the substances to cultured HeLa cells and to cells of the nematode C. elegans restored the activity of mitochondria and prevented the degeneration of neurons. The two substances also counteracted the toxic effects of the weed killer Paraquat: cells that had been treated with this herbicide, which is known to cause Parkinson’s-like damage to mitochondria, recovered after the addition of the two substances. Both glycolic and D-lactic acids occur naturally in unripe fruits and certain kinds of yoghurt.

(Image caption: Inactivation of the DJ-1 gene results in mitochondrial dysfunction (left), which can be restored by glycolate or D-lactate (right). Active mitochondria are shown in red, DNA is shown in blue. Credit: © MPI-CBG)
Teymuras Kurzchalia and Tony Hyman both have labs at the Max Planck Institute of Molecular Cell Biology and Genetics with rather different research programs – but both happened to stumble upon the gene DJ-1 and joined forces. This gene, originally thought of as an oncogene, has been linked to Parkinson’s disease since 2003. Recent studies showed that DJ-1 belongs to a novel glyoxalase family. The major function of these enzymes is assumed to be the detoxification of aggressive aldehyde by-products of mitochondrial metabolism. The Dresden research team has now shown that the products of DJ-1, D-lactate and glycolate, are actually required to maintain the high mitochondrial potential and can thus prevent the degeneration of neurons implicated in Parkinson’s disease.
Their experiments showed that both substances are lifesavers for neurons: adding them to affected cells – that is, cells treated with the environmental poison Paraquat or with a down-regulated DJ-1 gene – decreased the toxic effect of the herbicide, restored the activity of the mitochondria and thus ensured the survival of the neurons.
“We do not yet understand how exactly D-lactate and glycolate achieve this curative and preventive effect, but the next step will be to investigate the molecular mechanism underlying this process,” say Hyman and Kurzchalia. In addition to further molecular investigation, they also have more concrete plans for the future. As Kurzchalia says: “We can develop a yoghurt enriched with D-lactate: it could serve as a protection against Parkinson’s and is actually very tasty at the same time!” This is why the researchers have filed a patent for their finding.
Many diseases, not only Parkinson’s, are associated with a decline in mitochondrial activity. The researchers therefore believe that the DJ-1 products could play a general role in protecting cells from such decline.
(Source: mpg.de)