Posts tagged science

May 17, 2012
Around 1 in 50 people in the general population, and 1 in 6 of those aged over 40 years, experience neuropathy (damage to the nerves of the peripheral nervous system), which can cause numbness, tingling, pain, or weakness. The most common cause of neuropathy is diabetes, and up to half of diabetes patients can be affected. Currently, the only treatments available for neuropathy are glucose control (which often only delays it) and pain management. Yet fewer than half of patients are treated for pain, despite the availability of many effective therapies. Growing evidence suggests that various metabolic risk factors, including prediabetes, could be linked with neuropathy and thus be targets for new disease-modifying drugs. The issues are discussed in a Review in the June issue of The Lancet Neurology by Dr Brian C Callaghan and colleagues, all of the University of Michigan, Ann Arbor, MI, USA.
Diabetes can cause various patterns of so-called diabetic neuropathy, but the most common presentation is a distal symmetrical polyneuropathy (DSP), in which symptoms begin in the feet and spread up the limbs. Patients experience decreased quality of life, both physically and mentally. DSP can cause balance problems, which may lead to falls. Neuropathy is one of three main risk factors for falls in patients with diabetes, along with retinopathy and vestibular dysfunction. Patients with diabetic DSP are two to three times more likely to fall than those with diabetes and no neuropathy. Additionally, patients with severe DSP are at risk of ulcerations and lower-extremity amputations, with 15% developing an ulcer during the course of their disease. Diabetes is the leading cause of lower-extremity amputations, roughly 80 000 of which are undertaken in the USA every year in patients with the disorder. Indeed, patients with diabetes are 15 times more likely than people without diabetes to have this life-changing complication.
Overall, costs associated with diabetic neuropathy in the USA are estimated to be between $4.6 billion and $13.7 billion, with most of the expense attributed to patients with type 2 diabetes. Indeed, neuropathy accounts for about a quarter of the total costs of diabetes care in the USA.
Since the data linking prediabetes (a condition with higher than normal blood sugar levels, but not yet high enough for a diabetes diagnosis) with neuropathy are conflicting, a comprehensive study is needed to establish whether or not it is one of the metabolic drivers that underlie the onset and progression of neuropathy. The answer has direct implications for potential therapies for many patients with neuropathy. Currently one third of adult Americans meet criteria for prediabetes, but less than 5% of these people have received a formal diagnosis of prediabetes from their health-care providers and only a small percentage are being treated. Establishing a causal relation between prediabetes and neuropathy would change the clinical management of a substantial number of patients.
Research suggests that various metabolic factors (components of ‘metabolic syndrome’) other than blood glucose control—such as levels of LDL (bad) cholesterol and high blood pressure—might have a role in the development of neuropathy. The authors say that there are promising lines of investigation that could lead to improved prevention and treatment of the disorder. The magnitude of the effect of glucose control on neuropathy is much smaller in patients with type 2 diabetes than in those with type 1 diabetes. In view of this small effect size and the fact that many patients with type 2 diabetes continue to develop neuropathy despite adequate glucose control, discovery of modifiable risk factors for neuropathy is essential. Callaghan and colleagues are currently conducting such a study.
The authors conclude: “Components of the metabolic syndrome, including prediabetes, are potential risk factors for neuropathy, and studies are needed to establish whether they are causally related to neuropathy. These lines of enquiry will have direct implications for the development of new treatments for diabetic neuropathy.”
Provided by Lancet
Source: medicalxpress.com
ScienceDaily (May 17, 2012) — Training the brain to reduce pain could be a promising approach for treating phantom limb pain and complex regional pain syndrome, according to an internationally known neuroscience researcher speaking May 17 at the American Pain Society’s Annual Scientific Meeting.
G. Lorimer Moseley, PhD, professor of clinical neurosciences at University of South Australia and Neuroscience Research Australia, and head of the Body in Mind research team, told the plenary session audience that the brain stores maps of the body that are integrated with neurological systems that survey, regulate, and protect the integrity of the body physically and psychologically. These cortical maps govern movement, sensation and perception, and there is growing evidence, according to Moseley, showing that disruptions of brain maps occur in people with chronic pain. The best evidence is from those with phantom limb pain and complex regional pain syndrome, but there are also data from studies of chronic back pain.
Moseley’s research is focused on the role of the brain and mind in chronic and complex pain disorders. Through collaborations with clinicians, scientists and patients, the Body in Mind team is exploring how the brain and its representation of the body change when pain persists, how the mind influences physiological regulation of the body, and how the changes in the brain and mind can be normalized with treatment.
"We’re learning that chronic pain is associated with disruption of brain maps of the body and of the space around the body. When the brain determines the location of a sensory event, it integrates the location of the event in the body with a map of space. Disruption of these processes might be contributing to the problem," said Moseley. He added that it is possible for the body to be unharmed but the brain will respond by causing pain because it misinterpreted a benign stimulus as an attack. "We want to gradually train the brain to stop trying to protect body tissue that doesn’t need protecting."
Moseley said the brain can “rewire” itself, a process called neuroplasticity. Often painful stimuli triggered by a broken bone or other trauma cause the brain to rewire and, as a result, the damage signal is never switched off after the initial body trauma is resolved. The result: Chronic pain. So if the brain is capable of changing to cause persistent pain, can it be changed back to normal to alleviate pain?
"The brain is the focal point of the pain experience, but the plasticity phenomena can be harnessed to help alleviate pain," Moseley said.
He further stated that disrupted cortical body maps may contribute to the development or maintenance of chronic pain and, therefore, could be viable targets for treatment. One treatment approach involves targeting motor systems through a process Moseley calls graded motor imagery. It relies on using visual images to help the brain change its perceptions of the body after prolonged pain stimuli. “For someone with phantom limb pain, the brain’s body map still includes the severed arm or leg, and without any real stimuli from the region, it continues to produce pain,” Moseley explained.
He reported that studies with graded motor imagery have shown encouraging results in complex regional pain syndrome and in phantom limb pain.
"Our work shows that the complex neural connections in the brain not only are associated with chronic pain, they can be reconnected or manipulated through therapy that alters brain perceptions and produces pain relief," said Moseley.
Source: Science Daily
ScienceDaily (May 17, 2012) — Mental distractions make pain easier to take, and those pain-relieving effects aren’t just in your head, according to a report published online on May 17 in Current Biology, a Cell Press publication.
The findings based on high-resolution spinal fMRI (functional magnetic resonance imaging) as people experienced painful levels of heat show that mental distractions actually inhibit the response to incoming pain signals at the earliest stage of central pain processing.
"The results demonstrate that this phenomenon is not just a psychological phenomenon, but an active neuronal mechanism reducing the amount of pain signals ascending from the spinal cord to higher-order brain regions," said Christian Sprenger of the University Medical Center Hamburg-Eppendorf.
Those effects involve endogenous opioids, which are naturally produced by the brain and play a key role in the relief of pain, the new evidence shows.
The research group asked participants to complete either a hard or an easy memory task, both requiring them to remember letters, while the researchers simultaneously applied a painful level of heat to the participants’ arms.
When study participants were more distracted by the harder of the two memory tasks, they did indeed perceive less pain. What’s more, their less painful experience was reflected by lower activity in the spinal cord as observed by fMRI scans. (fMRI is often used to measure changes in brain activity, Sprenger explained, and recent advances have made it possible to extend this tool for use in the spinal cord.)
Sprenger and colleagues then repeated the study, this time giving participants either naloxone, a drug that blocks the effects of opioids, or a simple saline infusion. The pain-relieving effect of distraction dropped by 40 percent during application of the opioid antagonist compared with saline, evidence that endogenous opioids play an essential role.
The findings show just how deeply mental processes can go in altering the experience of pain, and that may have clinical importance.
"Our findings strengthen the role of cognitive-behavioral therapeutic approaches in the treatment of pain diseases, as it could be extrapolated that these approaches might also have the potential to alter the underlying neurobiological mechanisms as early as in the spinal cord," the researchers say.
Source: Science Daily
May 17, 2012
Fool me once, shame on you. Fool me twice, shame on my parahippocampal gyrus.

Read Montague, Ph.D., and colleagues at the Virginia Tech Carilion Research Institute discovered two distinct sites for suspicion in the brain: the amygdala, which correlates strongly with a baseline distrustfulness, and the parahippocampal gyrus, which acts like a cerebral lie detector. Credit: Virginia Tech
Scientists at the Virginia Tech Carilion Research Institute have found that suspicion resides in two distinct regions of the brain: the amygdala, which plays a central role in processing fear and emotional memories, and the parahippocampal gyrus, which is associated with declarative memory and the recognition of scenes.
"We wondered how individuals assess the credibility of other people in simple social interactions," said Read Montague, director of the Human Neuroimaging Laboratory and the Computational Psychiatry Unit at the Virginia Tech Carilion Research Institute, who led the study. "We found a strong correlation between the amygdala and a baseline level of distrust, which may be based on a person’s beliefs about the trustworthiness of other people in general, his or her emotional state, and the situation at hand. What surprised us, though, is that when other people’s behavior aroused suspicion, the parahippocampal gyrus lit up, acting like an inborn lie detector.”
The scientists used functional magnetic resonance imaging, or fMRI, to study the neural basis of suspicion. Seventy-six pairs of players, each with a buyer and a seller, competed in 60 rounds of a simple bargaining game while having their brains scanned. At the beginning of each round, the buyer would learn the value of a hypothetical widget and suggest a price to the seller. The seller would then set the price. If the seller’s price fell below the widget’s given value, the trade would go through, with the seller receiving the selling price and the buyer receiving any difference between the selling price and the actual value. If the seller’s price exceeded the value, though, the trade would not execute, and neither party would receive cash.
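The round-by-round payoff logic of the bargaining game can be sketched in a few lines of Python (an illustrative reconstruction from the description above; the function name is mine, and the strict inequality follows the article's wording that the price must fall below the value):

```python
def settle_round(widget_value, seller_price):
    """Settle one round of the buyer-seller bargaining game.

    Per the description above: if the seller's price falls below the
    widget's value, the trade executes -- the seller earns the price and
    the buyer keeps the difference between value and price. Otherwise
    the trade does not execute and neither party receives anything.
    (Illustrative sketch; the handling of an exact tie is an assumption.)
    """
    if seller_price < widget_value:
        return widget_value - seller_price, seller_price  # (buyer, seller)
    return 0, 0
```

For example, a widget worth 10 sold at a price of 6 pays the seller 6 and leaves the buyer a surplus of 4, while any price at or above 10 yields nothing for either side.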
The authors found, as detailed in a previous paper, that buyers fell into three strategic categories: 42 percent were incrementalists, who were relatively honest about the widget’s value; 37 percent were conservatives, who adopted the strategy of withholding information; and 21 percent were strategists, who were actively deceptive, mimicking incrementalist behavior by sending high suggestions during low-value trials and then reaping greater benefits by sending low suggestions during high-value trials.
The sellers had a monetary incentive to read the buyers’ strategic profiles correctly, yet they received no feedback about the accuracy of the information they were receiving, so they could not confirm any suspicions about patterns of behavior. Without feedback, the sellers were forced to decide whether they should trust the buyers based on the pricing suggestions alone. “The more uncertain a seller was about a buyer’s credibility,” Montague said, “the more active his or her parahippocampal gyrus became.”
The authors believe a person’s baseline suspicion may have important consequences for his or her financial success. “People with a high baseline suspicion were often interacting with fairly trustworthy buyers, so in ignoring the information those buyers provided, they were giving up potential profits,” said Meghana Bhatt, the first author on the research paper. “The ability to recognize credible information in a competitive environment can be just as important as detecting untrustworthy behavior.”
The findings may also have implications for such psychiatric conditions as paranoia and anxiety disorders, said Montague. “The fact that increased amygdala activation corresponds to an inability to detect trustworthy behavior may provide insight into the social interactions of people with anxiety disorders, who often have increased activity in this area of the brain,” he said.
Provided by Virginia Tech
Source: medicalxpress.com
ScienceDaily (May 16, 2012) — A well-known genetic risk factor for Alzheimer’s disease triggers a cascade of signaling that ultimately results in leaky blood vessels in the brain, allowing toxic substances to pour into brain tissue in large amounts, scientists report May 16 in the journal Nature.

The left photo shows destructive proteins (green) lining blood vessels in living brain tissue of mice with the human ApoE4 gene; after the drug cyclosporine A is added, the harmful proteins are nearly gone (right). (Credit: Image courtesy of University of Rochester Medical Center)
The results come from a team of scientists investigating why a gene called ApoE4 makes people more prone to developing Alzheimer’s. People who carry two copies of the gene have roughly eight to ten times the risk of developing Alzheimer’s disease compared with people who do not.
A team of scientists from the University of Rochester, the University of Southern California, and other institutions found that ApoE4 works through cyclophilin A, a well-known bad actor in the cardiovascular system that causes inflammation in atherosclerosis and other conditions. The team found that cyclophilin A opens the gates to the brain assault seen in Alzheimer’s.
"We are beginning to understand much more about how ApoE4 may be contributing to Alzheimer’s disease," said Robert Bell, Ph.D., the post-doctoral associate at Rochester who is first author of the paper. "In the presence of ApoE4, increased cyclophilin A causes a breakdown of the cells lining the blood vessels in Alzheimer’s disease in the same way it does in cardiovascular disease or abdominal aneurysm. This establishes a new vascular target to fight Alzheimer’s disease."
The team found that ApoE4 makes it more likely that cyclophilin A will accumulate in large amounts in cells that help maintain the blood-brain barrier, a network of tightly bound cells that lines the insides of blood vessels in the brain and carefully regulates which substances are allowed to enter and exit brain tissue.
ApoE4 creates a cascade of molecular signaling that weakens the barrier, causing blood vessels to become leaky. This makes it more likely that toxic substances will leak from the vessels into the brain, damaging cells like neurons and reducing blood flow dramatically by choking off blood vessels.
Doctors have long known that the changes in the brain seen in Alzheimer’s patients — the death of crucial brain cells called neurons — begin years or even decades before symptoms appear. The steps described in the Nature paper occur much earlier in the disease process.
The idea that vascular problems are at the heart of Alzheimer’s disease has been championed for more than two decades by Berislav Zlokovic, M.D., Ph.D., the leader of the team and a neuroscientist formerly with the University of Rochester Medical Center and now at USC. For 20 years, Zlokovic has investigated how blood flow in the brain is affected in people with the disease, and how the blood-brain barrier allows nutrients to pass into the brain and harmful substances to exit it.
At Rochester, Zlokovic struck up a collaboration with Bradford Berk, M.D., Ph.D., a cardiologist and CEO of the Medical Center. For more than two decades Berk has studied cyclophilin A, showing how it promotes destructive forces in blood vessels and how it’s central to the forces that contribute to cardiovascular diseases like atherosclerosis and heart attack.
"As a cardiologist, I’ve been interested in understanding the role of cyclophilin A in patients who suffer from cardiovascular illness," said Berk, a professor at the Aab Cardiovascular Research Institute. "Now our collaboration in Rochester has resulted in the discovery that it also has an important role in Alzheimer’s disease. The finding reinforces the basic research enterprise — you never know when knowledge gained in one area will turn out to be crucial in another."
In studies of mice, the team found that mice carrying the ApoE4 gene had five times as much cyclophilin A as other mice in cells known as pericytes, which are crucial to maintaining the integrity of the blood-brain barrier. Blood vessels died, blood did not flow as completely through the brain as it did in other mice, and harmful substances like thrombin, fibrin, and hemosiderin entered the brain tissue.
When the team blocked the action of cyclophilin A, either by knocking out its gene or by using the drug cyclosporine A to inhibit it, the damage in the mice was reversed. Blood flow resumed to normal, and unhealthy leakage of toxic substances from the blood vessels into the brain was slashed by 80 percent.
The team also outlined the chain of signaling events involved.
Altogether, the activity results in a dramatic boost in the amount of toxic substances in brain tissue. And when the cascade is interrupted at any of several points — when ApoE4 is not present, when cyclophilin A is blocked or shut off, or when NF Kappa B or the matrix metalloproteinases (MMPs) are inhibited — the blood-brain barrier is restored, blood flow returns to normal, and toxic substances do not leak into brain tissue.
For many years, researchers studying Alzheimer’s disease have been focused largely on amyloid beta, a protein structure that accumulates in the brains of patients with Alzheimer’s disease. The latest work points up the importance of other approaches, said Zlokovic, an adjunct professor at Rochester. At USC, Zlokovic is also deputy director of the Zilkha Neurogenetic Institute, director of the Center for Neurodegeneration and Regeneration, and professor and chair of the Department of Physiology and Biophysics.
"Our study has shown major neuronal injury resulting from vascular defects that are not related to amyloid beta," said Zlokovic. "This damage results from a breakdown of the blood-brain barrier and a reduction in blood flow.
"Amyloid beta definitely has an important role in Alzheimer’s disease," added Zlokovic. "But it’s very important to investigate other leads, perhaps where amyloid beta isn’t as centrally involved."
Source: Science Daily
ScienceDaily (May 16, 2012) — What can a fish tell us about human brain development? Researchers at Duke University Medical Center transplanted a set of human genes into a zebrafish and then used it to identify genes responsible for head size at birth.

Here are images of live zebrafish that were studied for genetics and head size to give insight into human head size. The top fish does not have the gene KCTD13 and its head size and brain size are larger; the middle fish is normal; the fish on the bottom expresses too much of the gene and has the smallest head and brain size. (Credit: Christelle Golzio, Duke Center for Human Disease Modeling and Duke Department of Cell Biology)
Head size in human babies is a feature related to autism, a condition that recent figures show to be more common than previously reported: 1 in 88 children, according to a March 2012 study. Head size is also a feature of other major neurological disorders, such as schizophrenia.
"In medical research, we need to dissect events in biology so we can understand the precise mechanisms that give rise to neurodevelopmental traits," said senior author Nicholas Katsanis, Ph.D., Jean and George Brumley Jr., MD, Professor of Developmental Biology, and Professor of Pediatrics and Cell Biology. "We need expert scientists to work side by side with clinicians who see such anatomic and other problems in patients, if we are to effectively solve many of our medical problems."
The study was published online in the journal Nature on May 16.
Katsanis knew that a region on chromosome 16 was one of the largest genetic contributors to autism and schizophrenia, but a conversation at a European medical meeting pointed him to information that changes within that same region of the genome also were related to changes in a newborn’s head size.
The problem was difficult to address because the region had large deletions and duplications in DNA, which are the most common mutational mechanisms in humans. “Interpretation is harrowingly hard,” said Katsanis, who is also director of the Duke Center for Human Disease Modeling.
The reason is that a duplication of DNA or missing DNA usually involves several genes. “It is very difficult to go from ‘here is a region with many genes, sometimes over 50’ to ‘these are the genes that are driving this pathology,’” Katsanis said.
"There was a light bulb moment," Katsanis said. "The area of the genome we were exploring gave rise to reciprocal (opposite) defects in terms of brain cell growth, so we realized that overexpressing a gene in question might give one phenotype — a smaller head, while shutting down the same gene might yield the other, a larger head."
The researchers transplanted a common duplication area of human chromosome 16 known to contain 29 genes into zebrafish embryos and then systematically turned up the activity of each transplanted human gene to find which might cause a small head (microcephaly) in the fish. They then suppressed the same gene set and asked whether any of them caused the reciprocal defect: larger heads (macrocephaly).
The researchers knew that deletion of the region that contained these 29 genes occurred in 1.7% of children with autism.
It took the team a few months to dissect such a “copy number variant” — an alteration of the genome that results in an abnormal number of one or more sections of chromosomal DNA.
"Now we can go from a genetic finding that is dosage-sensitive and start asking reasonable questions about this gene as it pertains to neurocognitive traits, which is a big leap," Katsanis said. Neurocognitive refers to the ability to think, concentrate, reason, remember, process information, learn, understand and speak.
Many human conditions have anatomical features that are also related to genetics, he said. “There are major limitations in studying autistic or schizophrenic behavior in zebrafish, but we can measure head size, jaw size, or facial abnormalities.”
The single gene in question, KCTD13, is responsible for driving head size in zebrafish by regulating the creation and destruction of new neurons (brain cells). This discovery let the team focus on the analogous gene in humans. “This gene contributes to autism cases, and probably is associated with schizophrenia and also childhood obesity,” Katsanis said.
Now that the gene has been uncovered, researchers can examine the protein it produces. “Once you have the protein, you can start asking valuable functional questions and learning what the gene does in the animal or human,” Katsanis said.
Copy number variants, such as the ones this team found on chromosome 16, are now thought to be one of the most common sources of genetic mutations. Hundreds, if not thousands, of such chromosomal deletions and duplications have been found in patients with a broad range of clinical problems, particularly neurodevelopmental disorders.
"Now we may have an efficient tool for dissecting them, which gives us the ability to improve both diagnosis and understanding of disease mechanisms," Katsanis said.
The current study suggests that KCTD13 is a major contributor to some cases of autism, but also points to the synergistic action of this gene with two other genes in the region, named MVP and MAPK3, Katsanis said.
Source: Science Daily
ScienceDaily (May 16, 2012) — In a new study analyzing Internet usage among college students, researchers at Missouri University of Science and Technology have found that students who show signs of depression tend to use the Internet differently than those who show no symptoms of depression.
Using actual Internet usage data collected from the university’s network, the researchers identified nine fine-grained patterns of Internet usage that may indicate depression. For example, students showing signs of depression tend to use file-sharing services more than their counterparts, and also use the Internet in a more random manner, frequently switching among several applications.
The researchers’ findings provide new insights on the association between Internet use and depression compared to existing studies, says Dr. Sriram Chellappan, an assistant professor of computer science at Missouri S&T and the lead researcher in the study.
"The study is believed to be the first that uses actual Internet data, collected unobtrusively and anonymously, to associate Internet usage with signs of depression," Chellappan says. Previous research on Internet usage has relied on surveys, which are "a far less accurate way" of assessing how people use the Internet, he says.
"This is because when students report their own volume and type of Internet activity, the amount of Internet usage data is limited, and people’s memories fade with time," Chellappan says. "There may be errors and social desirability bias when students report their own Internet usage." Social desirability bias refers to the tendency of survey respondents to answer questions in a manner that will be viewed favorably by others.
Chellappan and his fellow researchers collected a month’s worth of Internet data for 216 Missouri S&T undergraduate students. The data was collected anonymously and unobtrusively, and students involved in the study were assigned pseudonyms to keep their identities hidden from the researchers.
Before the researchers collected the usage data from the campus network, the students were tested to determine whether they showed signs of depression. The researchers then analyzed the usage data of the study participants. They found that students who showed signs of depression used the Internet much differently than the other study participants.
Chellappan and his colleagues found that depressed students tended to use file-sharing services, send email and chat online more than the other students. Depressed students also made greater use of high “packets per flow” applications (high-bandwidth applications often associated with online videos and games) than their counterparts.
Students who showed signs of depression also tended to use the Internet in a more “random” manner — frequently switching among applications, perhaps from chat rooms to games to email. Chellappan thinks that randomness may indicate trouble concentrating, a characteristic associated with depression.
The randomness stood out to Chellappan after his graduate student, Raghavendra Kotikalapudi, examined the “flow duration entropy” of students’ online usage. Flow duration entropy refers to the consistency of Internet use during certain periods of time. The lower the flow duration entropy, the more consistent the Internet use.
"Students showing signs of depression had high flow duration entropy, which means that the duration of Internet flows of these students is highly inconsistent," Chellappan says.
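As an illustration, a quantity like flow duration entropy can be computed as the Shannon entropy of a histogram of flow durations. The sketch below is my own illustrative reconstruction, not the paper's exact definition; the function name and the 60-second bin width are assumptions:

```python
import math
from collections import Counter

def flow_duration_entropy(durations_sec, bin_width=60):
    """Shannon entropy (in bits) of binned Internet-flow durations.

    Consistent usage concentrates durations in a few bins (low entropy);
    erratic usage spreads them across many bins (high entropy).
    Illustrative sketch: the binning scheme is an assumption.
    """
    bins = Counter(int(d // bin_width) for d in durations_sec)
    total = sum(bins.values())
    return -sum((n / total) * math.log2(n / total) for n in bins.values())
```

Eight flows of identical duration fall into a single bin and give zero entropy (perfectly consistent use), while four flows spread evenly across four bins give the maximum entropy of 2 bits.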
At the beginning of the study, the 216 participating students were tested to determine whether they exhibited symptoms of depression. Based on the Center for Epidemiologic Studies-Depression (CES-D) scale, about 30 percent of the students in the study met the minimum criteria for depression. Nationally, previous studies show that between 10 percent and 40 percent of all American students suffer from depression.
To ensure that participants were not identified during the study, each participant was assigned a pseudonym. The campus information technology department then provided the on-campus Internet usage data for each participant from the month of February 2011.
The researchers’ analysis of the month’s worth of data led Chellappan and his colleagues to conclude that students who were identified as exhibiting symptoms of depression used the Internet differently than the other students in the study.
Chellappan’s research has been accepted for publication in a forthcoming issue of IEEE Technology and Society Magazine.
The chief author of the paper is Kotikalapudi, who received his master of science degree in computer science from Missouri S&T in December 2011. His co-authors are Chellappan; Dr. Frances Montgomery, Curators’ Teaching Professor of psychological science; Dr. Donald C. Wunsch, the M.K. Finley Missouri Distinguished Professor of Computer Engineering; and Karl F. Lutzen, information security officer for Missouri S&T’s IT department.
Chellappan is now interested in using these findings to develop software that could be installed on home computers to help individuals determine whether their Internet usage patterns may indicate depression. The software would unobtrusively monitor Internet usage and alert individuals if their usage patterns indicate symptoms of depression.
"The software would be a cost-effective and an in-home tool that could proactively prompt users to seek medical help if their Internet usage patterns indicate possible depression," Chellappan says. "The software could also be installed on campus networks to notify counselors of students whose Internet usage patterns are indicative of depressive behavior."
Chellappan also believes the method used to connect Internet use and depression could also help diagnose other mental disorders like anorexia, bulimia, attention deficit hyperactivity disorder or schizophrenia.
"We could also investigate associations between other Internet features like visits to social networking sites, late night Internet use and randomness in time of Internet use with depressive symptoms," he says. "Applications of this study to diagnose and treat mental disorders for other vulnerable groups like the elderly and military veterans are also significant."
Source: Science Daily
ScienceDaily (May 16, 2012) — A new study suggests that head impacts experienced during contact sports such as football and hockey may worsen some college athletes’ ability to acquire new information. The research is published in the May 16, 2012, online issue of Neurology®, the medical journal of the American Academy of Neurology.

(Image credit: © modestil / Fotolia)
The study involved college athletes at three Division I schools and compared 214 athletes in contact sports with 45 athletes in non-contact sports such as track, crew and Nordic skiing at the beginning and end of their seasons. The contact sport athletes wore special helmets that recorded acceleration and other data at the time of any head impact.
The contact sport athletes experienced an average of 469 head impacts during the season. Athletes were not included in the study if they were diagnosed with a concussion during the season.
All of the athletes took tests of thinking and memory skills before and after the season. A total of 45 contact sport athletes and 55 non-contact sport athletes from one of the schools also took an additional set of tests of concentration, working memory and other skills.
"The good news is that overall there were few differences in the test results between the athletes in contact sports and the athletes in non-contact sports," said study author Thomas W. McAllister, MD, of The Geisel School of Medicine at Dartmouth in Lebanon, N.H. "But we did find that a higher percentage of the contact sport athletes had lower scores than would have been predicted after the season on a measure of new learning than the non-contact sport athletes."
A total of 22 percent of the contact sport athletes performed worse than expected on the test of new learning, compared with 4 percent of the non-contact sport athletes.
McAllister noted that the study did not find differences in test results between the two groups of athletes at the beginning of the season, suggesting that the cumulative head impacts that contact athletes had incurred over many previous seasons did not result in reduced thinking and memory skills in the overall group.
"These results are somewhat reassuring, given the recent heightened concern about the potential negative effects of these sports," he said. "Nevertheless, the findings do suggest that repetitive head impacts may have a negative effect on some athletes."
McAllister said it’s possible that some people may be genetically more sensitive to head impacts.
Source: Science Daily
ScienceDaily (May 16, 2012) — Genes play a greater role in forming character traits — such as self-control, decision making or sociability — than was previously thought, new research suggests.

Identical twin boys. (Image credit: © vgm6 / Fotolia)
A study of more than 800 sets of twins found that genetics were more influential in shaping key traits than a person’s home environment and surroundings.
Psychologists at the University of Edinburgh, who carried out the study, say that genetically influenced characteristics could well be the key to how successful a person is in life.
The study of twins in the US — most aged 50 and over — used a series of questions to test how they perceived themselves and others. Questions included “Are you influenced by people with strong opinions?” and “Are you disappointed about your achievements in life?”
The results were then measured against the Ryff Psychological Well-Being Scale, which assesses and standardizes these characteristics.
By tracking their answers, the research team found that identical twins — whose DNA is essentially identical — were twice as likely to share traits as non-identical twins.
Psychologists say the findings are significant because the stronger the genetic link, the more likely it is that these character traits are carried through a family.
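The twin-comparison logic behind such studies can be sketched numerically. A standard approach is Falconer’s formula, which estimates heritability as twice the difference between the trait correlations of identical (MZ) and non-identical (DZ) twins. The correlation values below are illustrative only, not figures reported by the Edinburgh study.

```python
# Falconer's formula: broad heritability estimated from twin correlations.
# h^2 = 2 * (r_MZ - r_DZ)
# The rationale: MZ twins share ~100% of their genes, DZ twins ~50%,
# so the excess similarity of MZ pairs reflects genetic influence.

def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """Estimate heritability from MZ and DZ twin trait correlations."""
    return 2.0 * (r_mz - r_dz)

# Illustrative numbers: if MZ twins correlate at 0.6 on a trait and
# DZ twins at 0.3 (i.e., MZ pairs are twice as similar), then:
h2 = falconer_heritability(0.6, 0.3)
print(f"Estimated heritability: {h2:.2f}")  # 0.60
```

With these illustrative correlations, roughly 60 percent of the variation in the trait would be attributed to genetics, which is the kind of calculation underlying claims that genes outweigh the home environment for a given characteristic.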
Professor Timothy Bates, of the University of Edinburgh’s School of Philosophy, Psychology and Language Sciences, said that the genetic influence was strongest on a person’s sense of self-control.
Researchers found that genes affected a person’s sense of purpose, how well they get on with people and their ability to continue learning and developing.
Professor Bates added: “Ever since the ancient Greeks, people have debated the nature of a good life and the nature of a virtuous life. Why do some people seem to manage their lives, have good relationships and cooperate to achieve their goals while others do not? Previously, the role of family and the environment around the home often dominated people’s ideas about what affected psychological well-being. However, this work highlights a much more powerful influence from genetics.”
The study, which builds on previous research that found that happiness is underpinned by genes, is published online in the Journal of Personality.
Source: Science Daily
ScienceDaily (May 16, 2012) — Poor Phineas Gage. In 1848, the supervisor for the Rutland and Burlington Railroad in Vermont was using a 13-pound, 3-foot-7-inch rod to pack blasting powder into a rock when he triggered an explosion that drove the rod through his left cheek and out of the top of his head. As reported at the time, the rod was later found, “smeared with blood and brains.”

Recreation of Gage accident. (Credit: Copyright John Darrell Van Horn and the UCLA Laboratory of Neuro Imaging, 2012)
Miraculously, Gage lived, becoming the most famous case in the history of neuroscience — not only because he survived a horrific accident that led to the destruction of much of his left frontal lobe but also because of the injury’s reported effects on his personality and behavior, which were said to be profound. Gage went from being an affable 25-year-old to one who was fitful, irreverent and profane. His friends and acquaintances said he was “no longer Gage.”
Over the years, various scientists have studied and argued about the exact location and degree of damage to Gage’s cerebral cortex and the impact it had on his personality. Now, for the first time, researchers at UCLA, using brain-imaging data that was lost to science for a decade, have broadened the examination of Gage to look at the damage to the white matter “pathways” that connect various regions of the brain.
Reporting in the May 16 issue of the journal PLoS ONE, Jack Van Horn, a UCLA assistant professor of neurology, and colleagues note that while approximately 4 percent of the cerebral cortex was intersected by the rod’s passage, more than 10 percent of Gage’s total white matter was damaged. The passage of the tamping iron caused widespread damage to the white matter connections throughout Gage’s brain, which likely was a major contributor to the behavioral changes he experienced.
Because white matter and its myelin sheath — the fatty coating around the nerve fibers that form the basic wiring of the brain — connect the billions of neurons that allow us to reason and remember, the research not only adds to the lore of Phineas Gage but may eventually lead to a better understanding of multiple brain disorders that are caused in part by similar damage to these connections.
"What we found was a significant loss of white matter connecting the left frontal regions and the rest of the brain," said Van Horn, who is a member of UCLA’s Laboratory of Neuro Imaging (LONI). "We suggest that the disruption of the brain’s ‘network’ considerably compromised it. This may have had an even greater impact on Mr. Gage than the damage to the cortex alone in terms of his purported personality change."
LONI is part of an ambitious joint effort with Massachusetts General Hospital and the National Institutes of Health to document the trillions of microscopic links between every one of the brain’s 100 billion neurons — the so-called “connectome.” And because mapping the brain’s physical wiring eventually will lead to answers about what causes mental conditions that may be linked to the breakdown of these connections, it was appropriate, as well as historically interesting, to take a new look at the damage to Gage’s brain.
Since Gage’s 189-year-old skull, which is on display in the Warren Anatomical Museum at Harvard Medical School, is now fragile and unlikely to be subjected to medical imaging again, the researchers had to track down the last known imaging data, from 2001, which had been lost for some 10 years due to various circumstances at Brigham and Women’s Hospital, a teaching affiliate of Harvard.
The authors were able to recover the computed tomographic data files and managed to reconstruct the scans, which revealed the highest-quality resolution available for modeling Gage’s skull. Next, they utilized advanced computational methods to model and determine the exact trajectory of the tamping iron that shot through his skull. Finally, because the original brain tissue was, of course, long gone, the researchers used modern-day brain images of males that matched Gage’s age and (right) handedness, then used software to position a composite of these 110 images into Gage’s virtual skull, the assumption being that Gage’s anatomy would have been similar.
Van Horn found that nearly 11 percent of Gage’s white matter was damaged, along with 4 percent of the cortex.
"Our work illustrates that while cortical damage was restricted to the left frontal lobe, the passage of the tamping iron resulted in the widespread interruption of white matter connectivity throughout his brain, so it likely was a major contributor to the behavioral changes he experienced," Van Horn said. "Connections were lost between the left frontal, left temporal and right frontal cortices and the left limbic structures of the brain, which likely had considerable impact on his executive as well as his emotional functions."
And while Gage’s personality changed, he eventually was able to travel and find employment as a stagecoach driver for several years in South America. Ultimately, he died in San Francisco, 12 years after the accident.
Van Horn noted a modern parallel.
"The extensive loss of white matter connectivity, affecting both hemispheres, plus the direct damage by the rod, which was limited to the left cerebral hemisphere, is not unlike modern patients who have suffered a traumatic brain injury," he said. "And it is analogous to certain forms of degenerative diseases, such as Alzheimer’s disease or frontal temporal dementia, in which neural pathways in the frontal lobes are degraded, which is known to result in profound behavioral changes."
Van Horn noted that the quantification of the changes to Gage’s brain’s pathways might well provide important insights for clinical assessment and outcome-monitoring in modern-day brain trauma patients.
Source: Science Daily