Posts tagged brain

ScienceDaily (May 16, 2012) — In a new study analyzing Internet usage among college students, researchers at Missouri University of Science and Technology have found that students who show signs of depression tend to use the Internet differently than those who show no symptoms of depression.
Using actual Internet usage data collected from the university’s network, the researchers identified nine fine-grained patterns of Internet usage that may indicate depression. For example, students showing signs of depression tend to use file-sharing services more than their counterparts, and also use the Internet in a more random manner, frequently switching among several applications.
The researchers’ findings provide new insights on the association between Internet use and depression compared to existing studies, says Dr. Sriram Chellappan, an assistant professor of computer science at Missouri S&T and the lead researcher in the study.
"The study is believed to be the first that uses actual Internet data, collected unobtrusively and anonymously, to associate Internet usage with signs of depression," Chellappan says. Previous research on Internet usage has relied on surveys, which are "a far less accurate way" of assessing how people use the Internet, he says.
"This is because when students themselves report their volume and type of Internet activity, the amount of Internet usage data is limited because people’s memories fade with time," Chellappan says. "There may be errors and social desirability bias when students report their own Internet usage." Social desirability bias refers to the tendency of survey respondents to answer questions in a manner that will be viewed favorably by others.
Chellappan and his fellow researchers collected a month’s worth of Internet data for 216 Missouri S&T undergraduate students. The data was collected anonymously and unobtrusively, and students involved in the study were assigned pseudonyms to keep their identities hidden from the researchers.
Before the researchers collected the usage data from the campus network, the students were tested to determine whether they showed signs of depression. The researchers then analyzed the usage data of the study participants. They found that students who showed signs of depression used the Internet much differently than the other study participants.
Chellappan and his colleagues found that depressed students tended to use file-sharing services, send email and chat online more than the other students. Depressed students also used high “packets per flow” applications, the high-bandwidth applications often associated with online videos and games, more than their counterparts did.
Students who showed signs of depression also tended to use the Internet in a more “random” manner — frequently switching among applications, perhaps from chat rooms to games to email. Chellappan thinks that randomness may indicate trouble concentrating, a characteristic associated with depression.
The randomness stood out to Chellappan after his graduate student, Raghavendra Kotikalapudi, examined the “flow duration entropy” of students’ online usage. Flow duration entropy refers to the consistency of Internet use during certain periods of time. The lower the flow duration entropy, the more consistent the Internet use.
"Students showing signs of depression had high flow duration entropy, which means that the duration of Internet flows of these students is highly inconsistent," Chellappan says.
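The article does not give the paper's exact formula, but "flow duration entropy" can be sketched as the Shannon entropy of a student's distribution of flow durations: a student whose flows all last about the same length of time gets low entropy, while erratic, highly varied flow lengths yield high entropy. The 30-second binning below is an illustrative assumption, not the study's own parameter.

```python
import math
from collections import Counter

def flow_duration_entropy(durations_sec, bin_width=30):
    """Shannon entropy (in bits) of binned flow durations.

    A hypothetical sketch: the study's exact binning scheme is not
    specified in the article, so 30-second bins are an assumption.
    """
    bins = Counter(int(d // bin_width) for d in durations_sec)
    total = sum(bins.values())
    return -sum((n / total) * math.log2(n / total) for n in bins.values())

# Consistent use (all flows of similar length) -> low entropy
consistent = [60, 62, 58, 61, 59, 60]
# Erratic use (wildly varying flow lengths) -> higher entropy
erratic = [5, 300, 45, 1200, 80, 10]

assert flow_duration_entropy(consistent) < flow_duration_entropy(erratic)
```

Under this reading, the depressed students' "high flow duration entropy" simply means their flow lengths were spread across many bins rather than clustered around a typical value.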
At the beginning of the study, the 216 participating students were tested to determine whether they exhibited symptoms of depression. Based on the Center for Epidemiologic Studies-Depression (CES-D) scale, about 30 percent of the students in the study met the minimum criteria for depression. Nationally, previous studies show that between 10 percent and 40 percent of all American students suffer from depression.
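For context, the CES-D is a 20-item self-report scale in which each item is scored 0-3 (total 0-60), and a total of 16 or more is the conventional cutoff for possible depression. The study's exact criterion is not stated in the article, so the standard cutoff is assumed in this minimal scoring sketch:

```python
def ces_d_flag(item_scores, cutoff=16):
    """Sum 20 CES-D item scores (each 0-3) and flag possible depression.

    Assumes the four reverse-scored items (4, 8, 12, 16) have already
    been recoded, and uses the conventional cutoff of 16; the study's
    exact criterion is an assumption here.
    """
    if len(item_scores) != 20 or not all(0 <= s <= 3 for s in item_scores):
        raise ValueError("expected 20 item scores in the range 0-3")
    total = sum(item_scores)
    return total, total >= cutoff

total, flagged = ces_d_flag([1] * 20)  # every item scored 1 -> total 20
print(total, flagged)                  # 20 True
```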
To ensure that participants were not identified during the study, each participant was assigned a pseudonym. The campus information technology department then provided the on-campus Internet usage data for each participant from the month of February 2011.
The researchers’ analysis of the month’s worth of data led Chellappan and his colleagues to conclude that students who were identified as exhibiting symptoms of depression used the Internet differently than the other students in the study.
Chellappan’s research has been accepted for publication in a forthcoming issue of IEEE Technology and Society Magazine.
The chief author of the paper is Kotikalapudi, who received his master of science degree in computer science from Missouri S&T in December 2011. His co-authors are Chellappan; Dr. Frances Montgomery, Curators’ Teaching Professor of psychological science; Dr. Donald C. Wunsch, the M.K. Finley Missouri Distinguished Professor of Computer Engineering; and Karl F. Lutzen, information security officer for Missouri S&T’s IT department.
Chellappan is now interested in using these findings to develop software that could be installed on home computers to help individuals determine whether their Internet usage patterns may indicate depression. The software would unobtrusively monitor Internet usage and alert individuals if their usage patterns indicate symptoms of depression.
"The software would be a cost-effective and an in-home tool that could proactively prompt users to seek medical help if their Internet usage patterns indicate possible depression," Chellappan says. "The software could also be installed on campus networks to notify counselors of students whose Internet usage patterns are indicative of depressive behavior."
Chellappan also believes the method used to connect Internet use and depression could help diagnose other mental disorders such as anorexia, bulimia, attention deficit hyperactivity disorder and schizophrenia.
"We could also investigate associations between other Internet features like visits to social networking sites, late night Internet use and randomness in time of Internet use with depressive symptoms," he says. "Applications of this study to diagnose and treat mental disorders for other vulnerable groups like the elderly and military veterans are also significant."
Source: Science Daily
ScienceDaily (May 16, 2012) — A new study suggests that head impacts experienced during contact sports such as football and hockey may worsen some college athletes’ ability to acquire new information. The research is published in the May 16, 2012, online issue of Neurology®, the medical journal of the American Academy of Neurology.

A new study suggests that head impacts experienced during contact sports such as football and hockey may worsen some college athletes’ ability to acquire new information. (Credit: © modestil / Fotolia)
The study involved college athletes at three Division I schools and compared 214 athletes in contact sports to 45 athletes in non-contact sports such as track, crew and Nordic skiing at the beginning and at the end of their seasons. The contact sport athletes wore special helmets that recorded acceleration and other data at the time of any head impact.
The contact sport athletes experienced an average of 469 head impacts during the season. Athletes were not included in the study if they were diagnosed with a concussion during the season.
All of the athletes took tests of thinking and memory skills before and after the season. A total of 45 contact sport athletes and 55 non-contact sport athletes from one of the schools also took an additional set of tests of concentration, working memory and other skills.
"The good news is that overall there were few differences in the test results between the athletes in contact sports and the athletes in non-contact sports," said study author Thomas W. McAllister, MD, of The Geisel School of Medicine at Dartmouth in Lebanon, N.H. "But we did find that a higher percentage of the contact sport athletes had lower scores than would have been predicted after the season on a measure of new learning than the non-contact sport athletes."
A total of 22 percent of the contact sport athletes performed worse than expected on the test of new learning, compared to four percent of the non-contact sport athletes.
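The reported gap (22 percent versus 4 percent) can be sanity-checked with a standard two-proportion z-test. This is a generic sketch, not the study's own analysis, and the counts plugged in below are back-calculated from the quoted percentages and group sizes, which is an assumption about which denominators the percentages refer to.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z-statistic for comparing two proportions, using the pooled estimate."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # standard error under H0
    return (p1 - p2) / se

# Illustrative counts only: roughly 22% of 214 contact athletes vs.
# roughly 4% of 45 non-contact athletes (the article does not state
# exactly which denominators the percentages refer to).
z = two_proportion_z(round(0.22 * 214), 214, round(0.04 * 45), 45)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests p < 0.05 (two-sided)
```

Even with these rough counts, the statistic comfortably clears the conventional 1.96 threshold, consistent with the article treating the difference as meaningful.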
McAllister noted that the study did not find differences in test results between the two groups of athletes at the beginning of the season, suggesting that the cumulative head impacts that contact athletes had incurred over many previous seasons did not result in reduced thinking and memory skills in the overall group.
"These results are somewhat reassuring, given the recent heightened concern about the potential negative effects of these sports," he said. "Nevertheless, the findings do suggest that repetitive head impacts may have a negative effect on some athletes."
McAllister said it’s possible that some people may be genetically more sensitive to head impacts.
Source: Science Daily
ScienceDaily (May 16, 2012) — Poor Phineas Gage. In 1848, the supervisor for the Rutland and Burlington Railroad in Vermont was using a 13-pound, 3-foot-7-inch rod to pack blasting powder into a rock when he triggered an explosion that drove the rod through his left cheek and out of the top of his head. As reported at the time, the rod was later found, “smeared with blood and brains.”

Recreation of Gage accident. (Credit: Copyright John Darrell Van Horn and the UCLA Laboratory of Neuro Imaging, 2012)
Miraculously, Gage lived, becoming the most famous case in the history of neuroscience — not only because he survived a horrific accident that led to the destruction of much of his left frontal lobe but also because of the injury’s reported effects on his personality and behavior, which were said to be profound. Gage went from being an affable 25-year-old to one who was fitful, irreverent and profane. His friends and acquaintances said he was “no longer Gage.”
Over the years, various scientists have studied and argued about the exact location and degree of damage to Gage’s cerebral cortex and the impact it had on his personality. Now, for the first time, researchers at UCLA, using brain-imaging data that was lost to science for a decade, have broadened the examination of Gage to look at the damage to the white matter “pathways” that connect various regions of the brain.
Reporting in the May 16 issue of the journal PLoS ONE, Jack Van Horn, a UCLA assistant professor of neurology, and colleagues note that while approximately 4 percent of the cerebral cortex was intersected by the rod’s passage, more than 10 percent of Gage’s total white matter was damaged. The passage of the tamping iron caused widespread damage to the white matter connections throughout Gage’s brain, which likely was a major contributor to the behavioral changes he experienced.
Because white matter and its myelin sheath — the fatty coating around the nerve fibers that form the basic wiring of the brain — connect the billions of neurons that allow us to reason and remember, the research not only adds to the lore of Phineas Gage but may eventually lead to a better understanding of multiple brain disorders that are caused in part by similar damage to these connections.
"What we found was a significant loss of white matter connecting the left frontal regions and the rest of the brain," said Van Horn, who is a member of UCLA’s Laboratory of Neuro Imaging (LONI). "We suggest that the disruption of the brain’s ‘network’ considerably compromised it. This may have had an even greater impact on Mr. Gage than the damage to the cortex alone in terms of his purported personality change."
LONI is part of an ambitious joint effort with Massachusetts General Hospital and the National Institutes of Health to document the trillions of microscopic links between every one of the brain’s 100 billion neurons — the so-called “connectome.” And because mapping the brain’s physical wiring eventually will lead to answers about what causes mental conditions that may be linked to the breakdown of these connections, it was appropriate, as well as historically interesting, to take a new look at the damage to Gage’s brain.
Since Gage’s 189-year-old skull, which is on display in the Warren Anatomical Museum at Harvard Medical School, is now fragile and unlikely to again be subjected to medical imaging, the researchers had to track down the last known imaging data, from 2001, which had been lost due to various circumstances at Brigham and Women’s Hospital, a teaching affiliate of Harvard, for some 10 years.
The authors were able to recover the computed tomographic data files and managed to reconstruct the scans, which revealed the highest-quality resolution available for modeling Gage’s skull. Next, they utilized advanced computational methods to model and determine the exact trajectory of the tamping iron that shot through his skull. Finally, because the original brain tissue was, of course, long gone, the researchers used modern-day brain images of males that matched Gage’s age and (right) handedness, then used software to position a composite of these 110 images into Gage’s virtual skull, the assumption being that Gage’s anatomy would have been similar.
Van Horn found that nearly 11 percent of Gage’s white matter was damaged, along with 4 percent of the cortex.
"Our work illustrates that while cortical damage was restricted to the left frontal lobe, the passage of the tamping iron resulted in the widespread interruption of white matter connectivity throughout his brain, so it likely was a major contributor to the behavioral changes he experienced," Van Horn said. "Connections were lost between the left frontal, left temporal and right frontal cortices and the left limbic structures of the brain, which likely had considerable impact on his executive as well as his emotional functions."
And while Gage’s personality changed, he eventually was able to travel and find employment as a stagecoach driver for several years in South America. Ultimately, he died in San Francisco, 12 years after the accident.
Van Horn noted a modern parallel.
"The extensive loss of white matter connectivity, affecting both hemispheres, plus the direct damage by the rod, which was limited to the left cerebral hemisphere, is not unlike modern patients who have suffered a traumatic brain injury," he said. "And it is analogous to certain forms of degenerative diseases, such as Alzheimer’s disease or frontotemporal dementia, in which neural pathways in the frontal lobes are degraded, which is known to result in profound behavioral changes."
Van Horn noted that the quantification of the changes to Gage’s brain’s pathways might well provide important insights for clinical assessment and outcome-monitoring in modern-day brain trauma patients.
Source: Science Daily
May 16, 2012
(Medical Xpress) — When an animal is born, its early experiences help map out the still-forming connections in its brain. As neurons in sensory areas of the brain fire in response to sights, smells, and sounds, synapses begin to form, laying the neuronal groundwork for activity later in life. Not all parts of the brain receive input directly from the external world, however, and researchers have wondered how these regions build their wiring early in development.

The output of this indirect-pathway neuron in the striatum of a mouse brain has been genetically silenced. The neuron has been filled through the attached electrode with a red fluorophore to measure its spine density and the number of active synapses. In the background, other indirect pathway neurons are seen in green and red. Credit: Bernardo Sabatini
New research from Howard Hughes Medical Institute investigator Bernardo Sabatini and colleagues on the basal ganglia, a region of the brain that controls motor planning, indicates that development here follows a different strategy. The new findings suggest that wiring of the basal ganglia during early development is driven not only by experience, but also by a self-reinforcing loop of neuronal signaling. As the loop strengthens, more synapses form.
The basal ganglia help an animal select its actions based on sensory and social context, as well as past experience. The new clues about how the basal ganglia get wired shortly after birth, described in the May 13, 2012, issue of the journal Nature, may help scientists understand what happens when the area goes awry. In Parkinson’s disease, degradation of neurons in the basal ganglia interferes with patients’ ability to initiate appropriate movements; in drug addiction, overstimulation of the basal ganglia spurs inappropriate actions. Sabatini says his team’s findings also suggest that the process can be easily perturbed during development, and may contribute to human disorders such as cerebral palsy and attention deficit hyperactivity disorder.
Although the basal ganglia do not receive direct messages from the external world, this region of the brain is by no means anatomically isolated: it receives signals from all over the cortex, and its output eventually returns to the cortex. Sabatini, who is at Harvard Medical School, explains that to select a motor action, the brain likely signals through that whole loop. “The question is, how do you lay down the circuits for those patterns?”
The basal ganglia are complex, containing many clusters of cells, some of which send excitatory signals and others inhibitory. Sabatini’s group focused on the basal ganglia’s main input station, the striatum. The striatum uses the information it receives to help direct movement in two ways: a ‘direct’ pathway stimulates motor actions and an ‘indirect’ pathway inhibits them. To learn how striatal activity affects circuit development, Sabatini’s team studied mutant mice whose indirect or direct pathways were turned off (because they were unable to release the inhibitory chemical messenger, GABA).
The group expected that silencing these neurons would prevent them from forming connections with the neurons that should have been receiving their signals. To their surprise, the silenced neurons survived and wired themselves to their targets normally. Unexpectedly, however, silencing the striatum’s direct pathway seemed to prevent formation of the connections sending input to the striatum. Silencing the indirect pathway upped the number of inputs. “We went into this study thinking completely differently,” says Sabatini. “What we found is that silencing these neurons doesn’t really change their output patterns — of course they are silenced, but they still find their targets and survive — but instead drastically influences their inputs.”
To see whether individual cells help set up the basal ganglia circuit, Sabatini’s group turned off a select few striatal neurons, rather than whole pathways, in the mice. They found that silencing these neurons did not affect excitatory connections to the area, suggesting that circuit-level activity patterns set up the basal ganglia’s wiring, rather than individual genes or molecules within cells. “It’s hard to believe that there are molecular cues that specify these structures, because it would be way too complicated,” Sabatini says.
When the group dampened activity in neurons that project from the brain’s cortex to the striatum during development, then examined the brain when the mouse had reached early adulthood (25 days after birth), they saw fewer neuronal connections in the striatum compared to mice that had developed normally, suggesting that early perturbations in development can have lasting effects. “That experiment is what told us that it’s the ongoing activity of cortical neurons that is driving this process in the striatum,” Sabatini says. The axons — the slender processes of the neuron that carry electrical impulses — stimulate striatal cells by releasing the excitatory neurotransmitter glutamate, telling them to make more synapses and stabilize them, he adds.
Sabatini believes that the basal ganglia tests random connection patterns after an animal is born and reinforces the correct ones. This type of plasticity of the basal ganglia probably lasts into adulthood, because animals are constantly learning to take new actions. Using genetically engineered mice that allow researchers to control exactly which neurons to inactivate and when, Sabatini’s group is now studying how perturbations affect the wiring later in life.
Sabatini expects that these results will get us a step closer to understanding human disease. “Maybe we will show that there’s hope for therapy,” he adds. “If it is plastic, maybe we can recover.”
Provided by Howard Hughes Medical Institute
Source: medicalxpress.com
May 16, 2012
(Medical Xpress) — Scientists at the University of Bristol have shed new light on one of the great unanswered questions of neuroscience: how the brain initiates rhythmic movements like walking, running and swimming.

The Xenopus frog tadpole is a small, simple vertebrate.
While experiments in the 1970s using electrical brain stimulation identified areas of the brain responsible for starting locomotion, the precise neuron-by-neuron pathway has not been described in any vertebrate – until now.
To find this pathway, Dr. Edgar Buhl and colleagues in Bristol’s School of Biological Sciences studied a small, simple vertebrate: the Xenopus frog tadpole.
They found that the pathway to initiate swimming consists of just four types of neurons. By touching skin on the head of the tadpole and applying cellular neurophysiology and anatomy techniques, the scientists identified nerve cells that detect the touch on the skin, two types of brain nerve cells which pass on the signal, and the motor nerve cells that control the swimming muscles.
Dr. Buhl said: “These findings address the longstanding question of how locomotion is initiated following sensory stimulation and, for the first time in any vertebrate, define in detail a direct pathway responsible. They could thus be of great evolutionary interest and could also open the path to understanding initiation of locomotion in other vertebrates.”
When mechanisms in the brain that initiate locomotion break down – for example, in people with Parkinson’s disease – starting to walk becomes a real problem. Therefore, understanding the initiation of swimming in tadpoles could be a first step towards understanding the initiation of locomotion in more complex vertebrates, including people, and may eventually have implications for treating movement disorders such as Parkinson’s.
The research is published today in the Journal of Physiology.
Provided by University of Bristol
Source: medicalxpress.com
ScienceDaily (May 15, 2012) — Attention, college students cramming between midterms and finals: Binging on soda and sweets for as little as six weeks may make you stupid.

New research suggests that binging on soda and sweets for as little as six weeks may make you stupid. (Credit: © RTimages / Fotolia)
A new UCLA rat study is the first to show how a diet steadily high in fructose slows the brain, hampering memory and learning — and how omega-3 fatty acids can counteract the disruption. The peer-reviewed Journal of Physiology publishes the findings in its May 15 edition.
"Our findings illustrate that what you eat affects how you think," said Fernando Gomez-Pinilla, a professor of neurosurgery at the David Geffen School of Medicine at UCLA and a professor of integrative biology and physiology in the UCLA College of Letters and Science. "Eating a high-fructose diet over the long term alters your brain’s ability to learn and remember information. But adding omega-3 fatty acids to your meals can help minimize the damage."
While earlier research has revealed how fructose harms the body through its role in diabetes, obesity and fatty liver, this study is the first to uncover how the sweetener influences the brain.
The UCLA team zeroed in on high-fructose corn syrup, an inexpensive liquid sweetener that is commonly added to processed foods, including soft drinks, condiments, applesauce and baby food. The average American consumes more than 40 pounds of high-fructose corn syrup per year, according to the U.S. Department of Agriculture. “We’re not talking about naturally occurring fructose in fruits, which also contain important antioxidants,” explained Gomez-Pinilla, who is also a member of UCLA’s Brain Research Institute and Brain Injury Research Center. “We’re concerned about high-fructose corn syrup that is added to manufactured food products as a sweetener and preservative.”
Gomez-Pinilla and study co-author Rahul Agrawal, a UCLA visiting postdoctoral fellow from India, studied two groups of rats that each consumed a fructose solution as drinking water for six weeks. The second group also received omega-3 fatty acids in the form of flaxseed oil and docosahexaenoic acid (DHA), which protects against damage to the synapses — the chemical connections between brain cells that enable memory and learning.
"DHA is essential for synaptic function — brain cells’ ability to transmit signals to one another," Gomez-Pinilla said. "This is the mechanism that makes learning and memory possible. Our bodies can’t produce enough DHA, so it must be supplemented through our diet."
The animals were fed standard rat chow and trained on a maze twice daily for five days before starting the experimental diet. The UCLA team tested how well the rats were able to navigate the maze, which contained numerous holes but only one exit. The scientists placed visual landmarks in the maze to help the rats learn and remember the way.
Six weeks later, the researchers tested the rats’ ability to recall the route and escape the maze. What they saw surprised them.
"The second group of rats navigated the maze much faster than the rats that did not receive omega-3 fatty acids," Gomez-Pinilla said. "The DHA-deprived animals were slower, and their brains showed a decline in synaptic activity. Their brain cells had trouble signaling each other, disrupting the rats’ ability to think clearly and recall the route they’d learned six weeks earlier."
The DHA-deprived rats also developed signs of resistance to insulin, a hormone that controls blood sugar and regulates synaptic function in the brain. A closer look at the rats’ brain tissue suggested that insulin had lost much of its power to influence the brain cells.
"Because insulin can penetrate the blood-brain barrier, the hormone may signal neurons to trigger reactions that disrupt learning and cause memory loss," Gomez-Pinilla said.
He suspects that fructose is the culprit behind the DHA-deficient rats’ brain dysfunction. Eating too much fructose could block insulin’s ability to regulate how cells use and store sugar for the energy required for processing thoughts and emotions.
"Insulin is important in the body for controlling blood sugar, but it may play a different role in the brain, where insulin appears to disturb memory and learning," he said. "Our study shows that a high-fructose diet harms the brain as well as the body. This is something new."
Gomez-Pinilla, a native of Chile and an exercise enthusiast who practices what he preaches, advises people to keep fructose intake to a minimum and swap sugary desserts for fresh berries and Greek yogurt, which he keeps within arm’s reach in a small refrigerator in his office. An occasional bar of dark chocolate that hasn’t been processed with a lot of extra sweetener is fine too, he said.
Still planning to throw caution to the wind and indulge in a hot-fudge sundae? Then also eat foods rich in omega-3 fatty acids, like salmon, walnuts and flaxseeds, or take a daily DHA capsule. Gomez-Pinilla recommends one gram of DHA per day.
"Our findings suggest that consuming DHA regularly protects the brain against fructose’s harmful effects," said Gomez-Pinilla. "It’s like saving money in the bank. You want to build a reserve for your brain to tap when it requires extra fuel to fight off future diseases."
Source: Science Daily
ScienceDaily (May 15, 2012) — A novel mechanism for anxiety behaviors, including a previously unrecognized inhibitory brain signal, may inspire new strategies for treating psychiatric disorders, University of Chicago researchers report.
By testing the controversial role of a gene called Glo1 in anxiety, scientists uncovered a new inhibitory factor in the brain: the metabolic by-product methylglyoxal. The system offers a tantalizing new target for drugs designed to treat conditions such as anxiety disorder, epilepsy, and sleep disorders.
The study, published in the Journal of Clinical Investigation, found that animals with multiple copies of the Glo1 gene were more likely to exhibit anxiety-like behavior in laboratory tests. Further experiments showed that Glo1 increased anxiety-like behavior by lowering levels of methylglyoxal (MG). Conversely, inhibiting Glo1 or raising MG levels reduced anxiety behaviors.
"Animals transgenic for Glo1 had different levels of anxiety-like behavior, and more copies made them more anxious," said Abraham Palmer, PhD, assistant professor of human genetics at the University of Chicago Medicine and senior author of the study. "We showed that Glo1 was causally related to anxiety-like behavior, rather than merely correlated."
In 2005, a comparison of different mouse strains found a link between anxiety-like behaviors and Glo1, the gene encoding the metabolic enzyme glyoxalase 1. However, subsequent studies questioned the link, and the lack of an obvious connection between glyoxalase 1 and brain function or behavior made some scientists skeptical.
May 15, 2012
(Medical Xpress) — New research from Uppsala University, Sweden, suggests that an active lifestyle in late life protects grey matter and cognitive functions in humans. The findings are now published in the scientific journal Neurobiology of Aging.
In a new study, a multidisciplinary research team from Uppsala University systematically studied 331 men and women at the age of 75 years. The researchers examined whether an active lifestyle is tied to brain health in seniors living in Uppsala, Sweden. The brain structure of each participant was measured using magnetic resonance imaging (MRI), and various memory tests were administered in order to monitor the seniors’ cognitive status.
“We found that the elderly who reported being more active in their daily routine had larger grey and white matter volumes and performed better on various memory tests than those who had a sedentary lifestyle. Interestingly, the active elderly also had more grey matter in the precuneus, a brain region that typically shrinks at the beginning of Alzheimer’s disease. Our findings suggest that an active lifestyle is a promising strategy for counteracting cognitive aging late in life,” says Christian Benedict.
The data for the study were taken from the major epidemiological study Prospective Investigation of the Vasculature in Uppsala Seniors (PIVUS). http://www.medsci.uu.se/pivus/
More information: Benedict C et al., Association between physical activity and brain health in older adults, Neurobiology of Aging, in press. http://www.sciencedirect.com/science/article/pii/S0197458012002618
Provided by Uppsala University
Source: medicalxpress.com
ScienceDaily (May 14, 2012) — A new study in which cells were induced to express telomerase, the enzyme that (metaphorically speaking) slows down the biological clock, was successful. The research provides a “proof-of-principle” that this “feasible and safe” approach can effectively “improve health span.”

Pictured are Maria A. Blasco and Bruno M. Bernardes de Jesús (co-author) in the CNIO building in Madrid. (Credit: CNIO)
A number of studies have shown that it is possible to lengthen the average life of individuals of many species, including mammals, by acting on specific genes. To date, however, this has meant altering the animals’ genes permanently from the embryonic stage — an approach impracticable in humans. Researchers at the Spanish National Cancer Research Centre (CNIO), led by its director María Blasco, have demonstrated that the mouse lifespan can be extended by the application in adult life of a single treatment acting directly on the animal’s genes. And they have done so using gene therapy, a strategy never before employed to combat aging. The therapy has been found to be safe and effective in mice.
The results were recently published in the journal EMBO Molecular Medicine. The CNIO team, in collaboration with Eduard Ayuso and Fátima Bosch of the Centre of Animal Biotechnology and Gene Therapy at the Universitat Autònoma de Barcelona (UAB), treated adult (one-year-old) and aged (two-year-old) mice, with the gene therapy delivering a “rejuvenating” effect in both cases, according to the authors.
Mice treated at the age of one lived longer by 24% on average, and those treated at the age of two, by 13%. The therapy, furthermore, produced an appreciable improvement in the animals’ health, delaying the onset of age-related diseases — like osteoporosis and insulin resistance — and achieving improved readings on aging indicators like neuromuscular coordination.
The gene therapy consisted of treating the animals with a DNA-modified virus, the viral genes having been replaced by that of the telomerase enzyme, which plays a key role in aging. Telomerase repairs the extreme ends or tips of chromosomes, known as telomeres, and in doing so slows the cell’s and therefore the body’s biological clock. When the animal is infected, the virus acts as a vehicle depositing the telomerase gene in the cells.
This study “shows that it is possible to develop a telomerase-based anti-aging gene therapy without increasing the incidence of cancer,” the authors affirm. “Aged organisms accumulate damage in their DNA due to telomere shortening, [this study] finds that a gene therapy based on telomerase production can repair or delay this kind of damage,” they add.
'Resetting' the biological clock
Telomeres are the caps that protect the end of chromosomes, but they cannot do so indefinitely: each time the cell divides the telomeres get shorter, until they are so short that they lose all functionality. The cell, as a result, stops dividing and ages or dies. Telomerase gets around this by preventing telomeres from shortening or even rebuilding them. What it does, in essence, is stop or reset the cell’s biological clock.
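The shortening-and-reset dynamic described above can be illustrated with a toy model: each division trims the telomere by a fixed amount, telomerase offsets part of that loss, and the cell becomes senescent once the telomere is exhausted. All numbers here are illustrative assumptions, not biological measurements.

```python
# Toy model of the telomere "biological clock": each cell division
# trims the telomere; once it is used up the cell stops dividing.
# Telomerase offsets the per-division loss. Illustrative numbers only.

def divisions_until_senescence(telomere_length, loss_per_division,
                               telomerase_gain=0):
    """Count divisions before the telomere is too short to function."""
    net_loss = loss_per_division - telomerase_gain
    if net_loss <= 0:
        # Telomerase matches or outpaces shortening: the clock is reset
        # indefinitely (the "immortal" case of stem and tumour cells).
        return float("inf")
    divisions = 0
    while telomere_length > 0:
        telomere_length -= net_loss
        divisions += 1
    return divisions

print(divisions_until_senescence(100, 5))      # no telomerase: 20
print(divisions_until_senescence(100, 5, 3))   # partial telomerase: 50
print(divisions_until_senescence(100, 5, 5))   # full reset: inf
```

In this sketch, adding telomerase does exactly what the article describes: it slows or fully stops the countdown, at the cost (in the full-reset case) of unlimited division, which is why tumour risk dominates the safety discussion below.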
But in most cells the telomerase gene is only active before birth; the cells of an adult organism, with few exceptions, have no telomerase. The exceptions in question are adult stem cells and cancer cells, which divide limitlessly and are therefore immortal — in fact several studies have shown that telomerase expression is the key to the immortality of tumour cells.
It is precisely this risk of promoting tumour development that has set back the investigation of telomerase-based anti-aging therapies.
In 2007, Blasco’s group demonstrated that it was feasible to prolong the lives of transgenic mice, whose genome had been permanently altered at the embryonic stage, by causing their cells to express telomerase and, also, extra copies of cancer-resistant genes. These animals live 40% longer than is normal and do not develop cancer.
The mice subjected to the gene therapy now under test are likewise free of cancer. Researchers believe this is because the therapy begins when the animals are already adults, so their cells do not have time to accumulate a sufficient number of aberrant divisions for tumours to appear.
Also important is the kind of virus employed to carry the telomerase gene to the cells. The authors selected demonstrably safe viruses that have been successfully used in gene therapy treatment of hemophilia and eye disease. Specifically, they are non-replicating viruses derived from others that are non-pathogenic in humans.
This study is viewed primarily as “a proof-of-principle that telomerase gene therapy is a feasible and generally safe approach to improve healthspan and treat disorders associated with short telomeres,” state Virginia Boccardi (Second University of Naples) and Utz Herbig (New Jersey Medical School-University Hospital Cancer Centre) in a commentary published in the same journal.
Although this therapy may not find application as an anti-aging treatment in humans, in the short term at least, it could open up a new treatment option for ailments linked with the presence in tissue of abnormally short telomeres, as in some cases of human pulmonary fibrosis.
More healthy years
As Blasco says, “aging is not currently regarded as a disease, but researchers tend increasingly to view it as the common origin of conditions like insulin resistance or cardiovascular disease, whose incidence rises with age. In treating cell aging, we could prevent these diseases.”
With regard to the therapy under testing, Bosch explains: “Because the vector we use expresses the target gene (telomerase) over a long period, we were able to apply a single treatment. This might be the only practical solution for an anti-aging therapy, since other strategies would require the drug to be administered over the patient’s lifetime, multiplying the risk of adverse effects.”
Source: Science Daily
May 14th, 2012
Controlled trial shows improved spasticity, reduced pain after smoking medical marijuana.
A clinical study of 30 adult patients with multiple sclerosis (MS) at the University of California, San Diego School of Medicine has shown that smoked cannabis may be an effective treatment for spasticity – a common and disabling symptom of this neurological disease.
The placebo-controlled trial also resulted in reduced perception of pain, although participants also reported short-term, adverse cognitive effects and increased fatigue. The study will be published in the Canadian Medical Association Journal on May 14.
Principal investigator Jody Corey-Bloom, MD, PhD, professor of neurosciences and director of the Multiple Sclerosis Center at UC San Diego, and colleagues randomly assigned participants to either the intervention group (which smoked cannabis once daily for three days) or the control group (which smoked identical placebo cigarettes, also once a day for three days). After an 11-day interval, the participants crossed over to the other group.
“We found that smoked cannabis was superior to placebo in reducing symptoms and pain in patients with treatment-resistant spasticity, or excessive muscle contractions,” said Corey-Bloom.
Earlier reports suggested that the active compounds of medical marijuana were potentially effective in treating neurologic conditions, but most studies focused on orally administered cannabinoids. There were also anecdotal reports from MS patients who endorsed smoking marijuana to relieve symptoms of spasticity.
However, this trial used a more objective measurement, the modified Ashworth scale, which grades the intensity of muscle tone by measuring such things as resistance in range of motion and rigidity. The secondary outcome, pain, was measured using a visual analogue scale. The researchers also looked at physical performance (using a timed walk) and cognitive function and – at the end of each visit – asked patients to assess their feeling of “highness.”
Although generally well tolerated, smoking cannabis did have mild effects on attention and concentration. The researchers noted that larger, long-term studies are needed to confirm their findings and determine whether lower doses can result in beneficial effects with less cognitive impact.
The current study is the fifth clinical test of the possible efficacy of cannabis for clinical use reported by the University of California Center for Medicinal Cannabis Research (CMCR). Four other human studies on control of neuropathic pain also reported positive results.
Source: Neuroscience News