People in middle age who have a high blood pressure measure called pulse pressure are more likely to have biomarkers of Alzheimer’s disease in their spinal fluid than those with lower pulse pressure, according to research published in the November 13, 2013, online issue of Neurology®, the medical journal of the American Academy of Neurology.
Pulse pressure is the systolic pressure, or the top number in a blood pressure reading, minus the diastolic, or the bottom number. Pulse pressure increases with age and is an index of the aging of the vascular system.
The study involved 177 people ages 55 to 100 with no symptoms of Alzheimer’s disease. Participants had their pulse pressure taken and lumbar punctures to obtain spinal fluid.
The study found that people with higher pulse pressure are more likely to have the Alzheimer’s biomarkers amyloid beta, or plaques, and p-tau protein, or tangles, in their cerebrospinal fluid than those with lower pulse pressure. For every 10-point rise in pulse pressure, the average level of p-tau protein in the spinal fluid rose by 1.5 picograms per milliliter. A picogram is one trillionth of a gram.
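As a back-of-the-envelope illustration (not code from the study), the two quantities above can be computed directly; note that the 1.5 pg/mL figure is the reported group-level average association, not a per-person prediction:

```python
def pulse_pressure(systolic, diastolic):
    """Pulse pressure is the systolic reading minus the diastolic reading (mmHg)."""
    return systolic - diastolic

def average_ptau_increase(pp_difference):
    """Reported average association: ~1.5 pg/mL more p-tau in spinal fluid
    per 10-point rise in pulse pressure (a group average, not a prediction
    for any individual)."""
    return 1.5 * (pp_difference / 10)

print(pulse_pressure(140, 80))    # 60 mmHg
print(average_ptau_increase(20))  # 3.0 pg/mL for a 20-point difference
```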
“These results suggest that the forces involved in blood circulation may be related to the development of the hallmark Alzheimer’s disease signs that cause loss of brain cells,” said study author Daniel A. Nation, PhD, of the VA San Diego Healthcare System.
The relationship was found in people age 55 to 70, but not in people age 70 to 100.
“This is consistent with findings indicating that high blood pressure in middle age is a better predictor of later problems with memory and thinking skills and loss of brain cells than high blood pressure in old age,” Nation said.
People who are in love are less able to focus and to perform tasks that require attention. Researcher Henk van Steenbergen concludes this, together with colleagues from Leiden University and the University of Maryland. The article has appeared in the journal Motivation and Emotion.

The more in love, the less focused you are
Forty-three participants who had been in a relationship for less than half a year performed a number of tasks in which they had to distinguish relevant from irrelevant information as quickly as possible. It turned out that the more in love they were, the less able they were to ignore the irrelevant information. Love intensity was thus related to how well someone can focus. There was no difference between men and women.
Cognitive control
The participants listened to music that elicited romantic feelings and thought of a romantic event to intensify their feelings of love. They also completed a questionnaire assessing the intensity of those feelings. The results of Van Steenbergen’s study differed from those of previous studies, which showed that the ability to ignore distracting information is required to maintain a long-term romantic relationship. Being able to control oneself (also called “cognitive control”) and to resist temptations that could threaten the relationship is essential in long-term love.
Thinking of your beloved
In the study by Van Steenbergen, in contrast, the participants had become involved in a romantic relationship only a few months ago. “When you have just become involved in a romantic relationship you’ll probably find it harder to focus on other things because you spend a large part of your cognitive resources on thinking of your beloved”, Van Steenbergen says. “For long-lasting love in a long-term relationship, on the other hand, it seems crucial to have proper cognitive control.” Over time, a balance between less and more cognitive control may be critical for a successful relationship.
Why is romantic love associated with cognitive control?
Van Steenbergen emphasizes that the link between romantic love and cognitive control is a new area of research. “The reason why romantic love is associated with cognitive control is still unknown. It could be that lovers use all their cognitive resources to think about their beloved, which leaves them no resources to perform a boring task. It could also be that the association goes in the opposite direction: people who have reduced cognitive control may experience more intense love feelings than people who have higher levels of cognitive control.” Future research will have to clarify this.
Patients with traumatic brain injury (TBI) had increased deposits of β-Amyloid (Αβ) plaques, a hallmark of Alzheimer Disease (AD), in some areas of their brains in a study by Young T. Hong, Ph.D., of the University of Cambridge, England, and colleagues.
There may be epidemiological or pathophysiological (injury-driven) links between TBI and AD, and Αβ plaques are found in as many as 30 percent of patients who die in the acute phase after a TBI. The plaques appear within hours of the injury and can occur in patients of all ages, according to the study background.
Researchers used imaging and brain tissue acquired during autopsies to examine Αβ deposition in patients with TBI. Researchers performed positron emission tomography (PET) imaging using carbon 11-labeled Pittsburgh Compound B ([11C]PIB), a marker of brain amyloid deposition, in 15 participants with a TBI and 11 healthy controls. Autopsy-acquired brain tissue was obtained from 16 people who had a TBI, as well as seven patients with a nonneurological cause of death.
The study’s findings indicate that patients with TBI showed increases in [11C]PIB binding, which may be a marker of Αβ plaque in some areas of the brain.
“The use of [11C]PIB PET for amyloid imaging following TBI provides us with the potential for understanding the pathophysiology of TBI, for characterizing the mechanistic drivers of disease progression or suboptimal recovery in the subacute phase of TBI, for identifying patients at high risk of accelerated AD, and for evaluating the potential of antiamyloid therapies,” the authors conclude.
A polymer originally designed to help mend broken bones could be successful in delivering chemotherapy drugs directly to the brains of patients suffering from brain tumours, researchers at The University of Nottingham have discovered.

Their study, published in the journal PLOS ONE, shows that the biomaterial can be easily applied to the cavity created following brain cancer surgery and used to release chemotherapy drugs over several weeks.
The targeted nature of the therapy could also reduce the toxic effects of chemotherapy drugs on healthy parts of the body, potentially reducing the debilitating side-effects that many patients experience after cancer treatment.
Patient survival
Dr Ruman Rahman, of the University’s Children’s Brain Tumour Research Centre (CBTRC), who led the study, said: “Our system is an innovative method of drug delivery for the treatment of brain tumours and is intended to be administered immediately after surgery by the operating neurosurgeon.
“Ultimately, this method of drug delivery, in combination with existing therapies, may result in more effective treatment of brain tumours, prolonged patient survival and reduced morbidity.”
Brain tumours are the major cause of cancer-related death in children and adults up to the age of 40. Most relapses occur when surgeons are unable to remove all of the cancerous cells during surgery – something that can be particularly challenging in very young children and babies, and, by the very nature of the disease, in a type of adult brain cancer called glioblastoma.
Although alternative systems for delivery of drugs directly to the brain have been developed, they are used infrequently because their success has been limited. This new drug delivery system is the first that can be moulded to the shape of the brain tumour cavity and the first to deliver several different drugs over a clinically meaningful period of time.
The Nottingham polymer formulation is made from two types of micro-particles called PLGA and PEG and has been developed and patented by leading tissue engineer Professor Kevin Shakesheff, based in the University’s School of Pharmacy. A powder at room temperature, it can be mixed to a toothpaste-like consistency with the addition of water.
Unique properties
The unique properties of the polymer lie in its ability to set into a rigid structure only when it reaches body temperature (37 degrees Celsius), a feature perfectly suited to medical therapies. It was originally developed as a scaffold onto which new bone cells could be grown to speed up the knitting together of broken bones.
Dr Ruman Rahman at the CBTRC and Dr Cheryl Rahman from the School of Pharmacy spotted the potential for the polymer to deliver chemotherapy drugs directly to patients’ brain tumours. The work was performed at the CBTRC with neurosurgeon Mr Stuart Smith and neuro-oncologist Professor Richard Grundy. The cavity left by the removal of a tumour would be lined with the polymer while in paste form, which would start to solidify and gradually release the chemotherapy drugs after the incision has been closed. This would directly target any residual cells not initially removed during surgery.
In the lab, the Nottingham scientists were able to successfully demonstrate the slow-release properties of the material by placing paste loaded with three commonly used chemotherapy drugs into a solution of saline and measuring the quantities of the drugs given out by the material over time.
To establish whether the material itself is safe to use on patients in this form of therapy, they used it to create a 3D model onto which they were able to grow brain tumour cells and healthy brain blood vessel cells without any toxicity. They then simulated surgery on a sheep’s brain from an abattoir by moulding the paste around a brain cavity and warming the brain to human body temperature to harden the polymer.
The brain was then scanned using CT and MRI technology to demonstrate that it is still possible to distinguish the polymer from normal brain tissue on a routine brain scan, an aspect crucial for doctors when dealing with follow-up care for brain tumour patients who have undergone surgery.
Robust material
The team also addressed concerns that the material could disintegrate and release its chemotherapy contents too quickly during the radiotherapy that many cancer patients undergo after surgery. By placing the drug-loaded biomaterial into a head cavity of a medical training dummy and subjecting it to the same duration and intensity of radiotherapy used for brain tumour patients, they demonstrated that the structure remains intact.
Finally they showed that a chemotherapy drug called etoposide could be effective at killing brain cancer cells in a mouse when released from the polymer formulation. The next stage of the research will be to extend the study in mice with brain tumours to test whether animals with the drug-loaded polymers survive longer. The team are also investigating the release of other chemotherapeutic drugs that hold promise, supported by a recent grant award from Sparks.
As the research used a biomaterial and chemotherapy drugs already approved for medical use, many of the usual ethical approval hurdles to allow further investigation have already been cleared.
The first clinical test, anticipated in three years’ time, will be a multi-centre phase 0 clinical trial testing the therapy on a small number of patients for whom other clinical treatments have not been successful and who would otherwise be offered only palliative care.
“This is a very exciting development and holds considerable promise for the treatment of malignant brain tumours in the near future” commented Professor Grundy, Co-Director of the CBTRC.
Research released today reveals new mechanisms and areas of the brain associated with anxiety and depression, presenting possible targets to understand and treat these debilitating mental illnesses. The findings were presented at Neuroscience 2013, the annual meeting of the Society for Neuroscience and the world’s largest source of emerging news about brain science and health.
More than 350 million people worldwide suffer from clinical depression and between 5 and 25 percent of adults suffer from generalized anxiety, according to the World Health Organization. The resulting emotional and financial costs to people, families, and society are significant. Further, antidepressants are not always effective and often cause severe side effects.
“Today’s findings represent our rapidly growing understanding of the individual molecules and brain circuits that may contribute to depression and anxiety,” said press conference moderator Lisa Monteggia, PhD, of the University of Texas Southwestern Medical Center, an expert on mechanisms of antidepressant action. “These exciting discoveries represent the potential for significant changes in how we diagnose and treat these illnesses that touch millions.”
Research released today reveals a new model for a genetic eye disease, and shows how animal models — from fruit flies to armadillos and monkeys — can yield valuable information about the human brain. The findings were presented at Neuroscience 2013, the annual meeting of the Society for Neuroscience and the world’s largest source of emerging news about brain science and health.
Animal models have long been central in how we understand the human brain, behavior, and nervous system due to similarities in many brain areas and functions across species. Almost every major medical advance in the last century was made possible by carefully regulated, humane animal research. Today’s findings build on this rich history and demonstrate what animals can teach us about ourselves.
“Neuroscience has always relied on responsible animal research to better understand how our brains and bodies develop, function, and break down,” said press conference moderator Leslie Tolbert, of the University of Arizona, whose work in insects provides insights into brain development. “Today’s studies reveal new ways that research on unlikely-seeming animals, such as armadillos, fruit flies, and worms, could have real impact on our understanding of the human brain and what can go wrong in disease.”
Scientists are gaining a new level of understanding of multiple sclerosis (MS) that may lead to new treatments and approaches to controlling the chronic disease, according to new research released today at Neuroscience 2013, the annual meeting of the Society for Neuroscience and the world’s largest source of emerging news about brain science and health.
MS is a severe, often crippling, autoimmune disease caused by the body’s immune system attacking the nervous system. Today, more than two million people worldwide suffer from MS and other neuroinflammatory diseases. MS usually strikes in early adulthood and manifests with symptoms including vision loss, paralysis, numbness, and fatigue. The disease can be intermittent or progressive and currently has no cure.
“The findings shown today represent real promise for the millions suffering from MS,” said press conference moderator Jeffrey Rothstein of Johns Hopkins University and an expert in neurodegenerative diseases. “These studies are breakthroughs in understanding and treating a disease that remains uncured, difficult to diagnose, and for which it is very difficult to prevent progression.”
As little as 20 minutes of moderate exercise three times per week during pregnancy enhances the newborn child’s brain development, according to researchers at the University of Montreal and its affiliated CHU Sainte-Justine children’s hospital. This head-start could have an impact on the child’s entire life.

“Our research indicates that exercise during pregnancy enhances the newborn child’s brain development,” explained Professor Dave Ellemberg, who led the study. “While animal studies have shown similar results, this is the first randomized controlled trial in humans to objectively measure the impact of exercise during pregnancy directly on the newborn’s brain. We hope these results will guide public health interventions and research on brain plasticity. Most of all, we are optimistic that this will encourage women to change their health habits, given that the simple act of exercising during pregnancy could make a difference for their child’s future.” Ellemberg and his colleagues Professor Daniel Curnier and PhD candidate Élise Labonté-LeMoyne presented their findings today at the Neuroscience 2013 congress in San Diego.

Not so long ago, obstetricians would tell women to take it easy and rest during their pregnancy. Recently, the tides have turned and it is now commonly accepted that inactivity is actually a health concern. “While being sedentary increases the risks of suffering complications during pregnancy, being active can ease post-partum recovery, make pregnancy more comfortable and reduce the risk of obesity in the children,” Curnier explained. “Given that exercise has been demonstrated to be beneficial for the adult’s brain, we hypothesized that it could also be beneficial for the unborn child through the mother’s actions.”
To verify this, starting at the beginning of their second trimester, women were randomly assigned to an exercise group or a sedentary group. Women in the exercise group had to perform at least 20 minutes of cardiovascular exercise three times per week at a moderate intensity, which should lead to at least a slight shortness of breath. Women in the sedentary group did not exercise. The brain activity of the newborns was assessed between 8 and 12 days of age by means of electroencephalography, which records the electrical activity of the brain. “We used 124 soft electrodes placed on the infant’s head and waited for the child to fall asleep on his or her mother’s lap. We then measured auditory memory by means of the brain’s unconscious response to repeated and novel sounds,” Labonté-LeMoyne said. “Our results show that the babies born from the mothers who were physically active have a more mature cerebral activation, suggesting that their brains developed more rapidly.”
The researchers are now in the process of evaluating the children’s cognitive, motor and language development at age 1 to verify if these differences are maintained.
New human and animal research released today demonstrates how experiences impact genes that influence behavior and health. Today’s studies, presented at Neuroscience 2013, the annual meeting of the Society for Neuroscience and the world’s largest source of emerging news about brain science and health, provide new insights into how experience might produce long-term brain changes in behaviors like drug addiction and memory formation.
The studies focus on an area of research called epigenetics, in which the environment and experiences can turn genes “on” or “off,” while keeping underlying DNA intact. These changes affect normal brain processes, such as development or memory, and abnormal brain processes, such as depression, drug dependence, and other psychiatric disease — and can pass down to subsequent generations.
"DNA may shape who we are, but we also shape our own DNA," said press conference moderator Schahram Akbarian, of the Icahn School of Medicine at Mount Sinai, an expert in epigenetics. "These findings show how experiences like learning or drug exposure change the way genes are expressed, and could be incredibly important in developing treatments for addiction and for understanding processes like memory."
Johns Hopkins engineers and cardiology experts have teamed up to develop a fingernail-sized biosensor that could alert doctors when serious brain injury occurs during heart surgery. By doing so, the device could help doctors devise new ways to minimize brain damage or begin treatment more quickly.

In the Nov. 11 issue of the journal Chemical Science, the team reported on lab tests demonstrating that the prototype sensor had successfully detected a protein associated with brain injuries.
“Ideally, the testing would happen while the surgery is going on, by placing just a drop of the patient’s blood on the sensor, which could activate a sound, light or numeric display if the protein is present,” said the study’s senior author, Howard E. Katz, a Whiting School of Engineering expert in organic thin film transistors, which form the basis of the biosensor.
The project originated about two years ago when Katz, who chairs the Department of Materials Science and Engineering, was contacted by Allen D. Everett, a Johns Hopkins Children’s Center pediatric cardiologist who studies biomarkers linked to pulmonary hypertension and brain injury. As brain injury can occur with heart surgery in both adults and children, the biosensor Everett proposed should work on patients of all ages. He is particularly concerned, however, about operating room injuries to children, whose brains are still developing.
“Many of our young patients need one or more heart surgeries to correct congenital heart defects, and the first of these procedures often occurs at birth,” Everett said. “We take care of these children through adulthood, and we have all seen the neurodevelopment problems that occur as a consequence of their surgery and post-operative care. These are very sick children, and we have done a brilliant job of improving overall survival from congenital heart surgery, but we have far to go to improve the long-term outcomes of our patients. This is our biggest challenge for the 21st century.”
He said that recent studies found that after heart surgery, about 40 percent of infant patients will have brain abnormalities that show up in MRI scans. The damage is most often caused by strokes, which can be triggered and made worse by multiple events during surgery and recovery, when the brain is most susceptible to injury. These brain injuries can lead to deficiencies in the child’s mental development and motor skills, as well as hyperactivity and speech delay.
To address these problems, Everett sought an engineer to design a biosensor that responds to glial fibrillary acidic protein (GFAP), which is a biomarker linked to brain injuries. “If we can be alerted when the injury is occurring,” he said, “then we should be able to develop better therapies. We could improve our control of blood pressure or redesign our cardiopulmonary bypass machines. We could learn how to optimize cooling and rewarming procedures and have a benchmark for developing and testing new protective medications.”
At present, Everett said, doctors have to wait years for some brain injury-related symptoms to appear. That slows down the process of finding out whether new procedures or treatments to reduce brain injuries are effective. The new device may change that. “The sensor platform is very rapid,” Everett said. “It’s practically instantaneous.”
To create this sensor, materials scientist Katz turned to an organic thin film transistor design. In recent years, sensors built on such platforms have shown that they can detect gases and chemicals associated with explosives. These transistors were an attractive choice for Everett’s request because of their potential low cost, low power consumption, biocompatibility and their ability to detect a variety of biomolecules in real time. Furthermore, the architecture of these transistors could accommodate a wide variety of other useful electronic materials.
The sensing area is a small square, three-eighths of an inch on each side. On the surface of the sensor is a layer of antibodies that attract GFAP, the target protein. When the protein binds, it changes the physics of other material layers within the sensor, altering the amount of electrical current passing through the device. These electrical changes can be monitored, enabling the user to know when GFAP is present.
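The detection logic described above (monitor the transistor’s current and flag a shift from baseline) can be sketched as follows; the function name, threshold, and current values are hypothetical illustrations, not details of the actual device:

```python
# Illustrative sketch only: the real device measures transistor current in
# hardware. The 5% threshold and the example currents are made up.
def gfap_detected(baseline_current, measured_current, threshold=0.05):
    """Flag GFAP binding when the current shifts by more than a
    relative threshold from its baseline value."""
    if baseline_current == 0:
        raise ValueError("baseline current must be nonzero")
    relative_change = abs(measured_current - baseline_current) / abs(baseline_current)
    return relative_change > threshold

# A 10% drop in current versus baseline would trigger the alert.
print(gfap_detected(1.0e-6, 0.9e-6))  # True
```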
“This sensor proved to be extremely sensitive,” Katz said. “It recognized GFAP even when there were many other protein molecules nearby. As far as we’ve been able to determine, this is the most sensitive protein detector based on organic thin film transistors.”
Through the Johns Hopkins Technology Transfer Office, the team members have filed for full patent protection for the new biosensor. Katz said the team is looking for industry collaborators to conduct further research and development of the device, which has not yet been tested on human patients. But with the right level of effort and support, Katz believes the device could be put into clinical use within five years. “I’m getting tremendous personal satisfaction from working on a major medical project that could help patients,” he said.
Everett, the pediatric cardiologist, said the biosensor could eventually be used outside of the operating room to quickly detect brain injuries among athletes and accident victims. “It could evolve into a point-of-care or point-of-injury device,” he said. “It might also be very useful in hospital emergency departments to screen patients for brain injuries.”
Scientists at Rutgers and Emory universities have discovered that a compound often emitted by mold may be linked to symptoms of Parkinson’s disease.

Arati Inamdar and Joan Bennett, researchers in the School of Environmental and Biological Sciences at Rutgers, used fruit flies to establish the connection between the compound – popularly known as mushroom alcohol – and the malfunction of two genes involved in the packaging and transport of dopamine, the chemical released by nerve cells to send messages to other nerve cells in the brain.
The findings were published online today in the Proceedings of the National Academy of Sciences.
“Parkinson’s has been linked to exposure to environmental toxins, but the toxins were man-made chemicals,” Inamdar said. “In this paper, we show that biologic compounds have the potential to damage dopamine and cause Parkinson’s symptoms.”
For co-author Bennett, the research was more than academic. Bennett was working at Tulane University in New Orleans when Hurricane Katrina struck the Gulf Coast in 2005. Her flooded house became infested with molds, which she collected in samples, wearing a mask, gloves and protective gear.
“I felt horrible – headaches, dizziness, nausea,” said Bennett, now a professor of plant pathology and biology at Rutgers. “I knew something about ‘sick building syndrome’ but until then I didn’t believe in it. I didn’t think it would be possible to breathe in enough mold spores to get sick.” That is when she formed her hypothesis that volatiles might be involved.
Inamdar, who uses fruit flies in her research, and Bennett began their study shortly after Bennett arrived at Rutgers. Bennett wanted to understand the connection between molds and symptoms like those she had experienced following Katrina.
The scientists discovered that the volatile organic compound 1-octen-3-ol, otherwise known as mushroom alcohol, can cause movement disorders in flies, similar to those observed in the presence of pesticides such as paraquat and rotenone. Further, they discovered that it attacked two genes involved in handling dopamine, causing the neurons to degenerate and producing the Parkinson’s-like symptoms.
Studies indicate that Parkinson’s disease – a progressive disease of the nervous system marked by tremor, muscular rigidity and slow, imprecise movement — is increasing in rural areas, where it’s usually attributed to pesticide exposure. But rural environments also have a lot of mold and mushroom exposure.
“Our work suggests that 1-octen-3-ol might also be connected to the disease, particularly for people with a genetic susceptibility to it,” Inamdar said. “We’ve given the epidemiologists some new avenues to explore.”
A team of scientists led by researchers from the University of California, San Diego School of Medicine and Ludwig Institute for Cancer Research has identified a novel therapeutic approach for the most frequent genetic cause of ALS, a disorder of the regions of the brain and spinal cord that control voluntary muscle movement, and frontotemporal degeneration, the second most frequent dementia.
Published ahead of print in last week’s online edition of the journal PNAS, the study establishes the use of segments of genetic material called antisense oligonucleotides – ASOs – to block the buildup of, and selectively degrade, the toxic RNA that contributes to the most common form of ALS, without affecting the normal RNA produced from the same gene.
The new approach may also have the potential to treat frontotemporal degeneration or frontotemporal dementia (FTD), a brain disorder characterized by changes in behavior and personality, language and motor skills that also causes degeneration of regions of the brain.
In 2011, scientists found that a specific gene known as C9orf72 is the most common genetic cause of ALS. It is a very specific type of mutation which, instead of changing the protein, involves a large expansion, or repeated sequence of a set of nucleotides – the basic component of RNA.
A normal C9orf72 gene contains fewer than 30 copies of the nucleotide repeat unit GGGGCC. The mutant gene may contain hundreds of repeats of this unit, generating repeat-containing RNAs that the researchers show aggregate into foci.
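To make the repeat arithmetic concrete, here is a small sketch (not from the study) that counts the longest uninterrupted run of the GGGGCC unit in a DNA sequence, the quantity that distinguishes a normal allele (fewer than 30 repeats) from an expanded one (hundreds):

```python
def longest_ggggcc_run(seq):
    """Return the longest uninterrupted run of GGGGCC repeats in a DNA string."""
    unit = "GGGGCC"
    best = run = 0
    i = 0
    while i + len(unit) <= len(seq):
        if seq[i:i + len(unit)] == unit:
            run += 1
            best = max(best, run)
            i += len(unit)  # jump past the matched repeat unit
        else:
            run = 0
            i += 1
    return best

# A normal allele carries fewer than ~30 repeats; an expanded allele hundreds.
normal = "ATG" + "GGGGCC" * 8 + "TGA"
expanded = "ATG" + "GGGGCC" * 300 + "TGA"
print(longest_ggggcc_run(normal), longest_ggggcc_run(expanded))  # 8 300
```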
“Remarkably, we found two distinct sets of RNA foci, one containing RNAs transcribed in the sense direction and the other containing anti-sense RNAs,” said first author Clotilde Lagier-Tourenne, MD, PhD, UC San Diego Department of Neurosciences and Ludwig Institute for Cancer Research.
The researchers also discovered a signature of changes in expression of other genes that accompanies expression of the repeat-containing RNAs. Since they found that reducing the level of expression of the C9orf72 gene in a normal adult nervous system did not produce this signature of changes, the evidence demonstrated a toxicity of the repeat-containing RNAs that could be relieved by reducing the levels of those toxic RNAs.
“This led to our use of the ASOs to target the sense strand. We reduced the accumulation of expanded RNA foci and corrected the signature of gene expression changes. Importantly, we showed that we could remove the toxic RNA without affecting the normal RNA that encodes the C9orf72 protein. This selective silencing of a toxic RNA is the holy grail of gene silencing approaches, and we showed we had accomplished it,” Lagier-Tourenne added.
Targeting the sense strand RNAs with a specific ASO did not, however, affect the antisense strand foci, nor did it correct the signature of gene expression changes. “Doing that will require separate targeting of the antisense strand – or both strands – and has now become a critical question,” Lagier-Tourenne said.
“This approach is exciting as it links two neurodegenerative diseases, ALS and FTD, to the field of repeat expansion, which has gained broadened interest from investigators,” said co-principal investigator John Ravits, MD, UC San Diego Department of Neurosciences. “At the same time, our study also demonstrates the previously unrecognized role of antisense RNA and its potential as a therapeutic target.”
All animals have to make decisions every day. Where will they live and what will they eat? How will they protect themselves? They often have to make these decisions as a group, too, turning what may seem like a simple choice into a far more nuanced process. So, how do animals know what’s best for their survival?

For the first time, Arizona State University researchers have discovered that at least in ants, animals can change their decision-making strategies based on experience. They can also use that experience to weigh different options.
The findings are featured today in the early online edition of the scientific journal Biology Letters, as well as in its Dec. 23 edition.
Co-authors Taka Sasaki and Stephen Pratt, both with ASU’s School of Life Sciences, have studied insect collectives, such as ants, for years. Sasaki, a postdoctoral research associate, specializes in adapting psychological theories and experiments originally designed for humans for use with ants, hoping to understand how the collective decision-making process arises out of individually ignorant ants.
“The interesting thing is we can make decisions and ants can make decisions – but ants do it collectively,” said Sasaki. “So how different are we from ant colonies?”
To answer this question, Sasaki and Pratt gave a number of Temnothorax rugatulus ant colonies a series of choices between two nests with differing qualities. In one treatment, the entrances of the nests had varied sizes, and in the other, the exposure to light was manipulated. Since these ants prefer both a smaller entrance size and a lower level of light exposure, they had to prioritize.
“It’s kind of like a human buying a house,” said Pratt, an associate professor with the school. “There’s so many options to consider – the size, the number of rooms, the neighborhood, the price, if there’s a pool. The list goes on and on. And for the ants it’s similar, since they live in cavities that can be dark or light, big or small. With all of these things, just like with a human house, it’s very unlikely to find a home that has everything you want.”
Pratt went on to explain that because it is impossible to find the perfect habitat, ants make tradeoffs between certain qualities, ranking them in order of importance. But when faced with a decision between two different homes, the ants displayed a previously unseen level of intelligence.
According to their data, the series of choices the ants faced caused them to reprioritize their preferences based on the type of decision they faced. Ants that had to choose a nest based on light level prioritized light level over entrance size in the final choice. On the other hand, ants that had to choose a nest based on entrance size ranked light level lower in the later experiment.
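The reprioritization described here can be illustrated with a toy weighted-choice model. Everything below (the attribute values, the weights, and the encoding of experience as weights) is invented for illustration; it is not the study's actual analysis.

```python
# Toy model of experience-dependent preference reweighting (illustrative only;
# the weights and nest attributes are invented, not taken from the study).

def choose(nest_a, nest_b, weights):
    """Score each nest as a weighted sum of attribute qualities; pick the higher."""
    score = lambda nest: sum(weights[k] * nest[k] for k in weights)
    return "A" if score(nest_a) > score(nest_b) else "B"

# Attribute values: higher = better (smaller entrance, darker cavity).
nest_a = {"entrance": 0.9, "light": 0.2}   # small entrance, but bright
nest_b = {"entrance": 0.3, "light": 0.8}   # large entrance, but dark

# A colony whose past choices hinged on light level upweights light...
light_trained = {"entrance": 0.3, "light": 0.7}
# ...while one whose past choices hinged on entrance size upweights entrance.
entrance_trained = {"entrance": 0.7, "light": 0.3}

print(choose(nest_a, nest_b, light_trained))     # prints: B (dark nest wins)
print(choose(nest_a, nest_b, entrance_trained))  # prints: A (small entrance wins)
```

Under this sketch, the same pair of nests yields opposite choices depending on which attribute past decisions have upweighted, mirroring the reordered preferences the ants showed.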
This means that, like people, ants take the past into account when weighing options while making a choice. The difference is that ants somehow manage to do this as a colony without any dissent. While this research builds on groundwork previously laid down by Sasaki and Pratt, the newest experiments have already raised more questions.
“You have hundreds of these ants, and somehow they have to reach a consensus,” Pratt said. “How do they do it without anyone in charge to tell them what to do?”
Pratt likened individual ants to individual neurons in the human brain. Both play a key role in the decision-making process, but no one understands how every neuron influences a decision.
Sasaki and Pratt hope to delve deeper into the realm of ant behavior so that one day, they can understand how individual ants influence the colony. Their greater goal is to apply what they discover to help society better understand how humanity can make collective decisions with the same ease ants display.
“This helps us learn how collective decision-making works and how it’s different from individual decision-making,” said Pratt. “And ants aren’t the only animals that make collective decisions – humans do, too. So maybe we can gain some general insight.”
A pilot study by a multi-disciplinary team of investigators at Georgetown University suggests that a simple dot test could help doctors gauge the extent of dopamine loss in individuals with Parkinson’s disease (PD). Their study is being presented at Neuroscience 2013, the annual meeting of the Society for Neuroscience.
“It is very difficult now to assess the extent of dopamine loss — a hallmark of Parkinson’s disease — in people with the disease,” says lead author Katherine R. Gamble, a psychology PhD student working with two Georgetown psychologists, a psychiatrist and a neurologist. “Use of this test, called the Triplets Learning Task (TLT), may provide some help for physicians who treat people with Parkinson’s disease, but we still have much work to do to better understand its utility,” she adds.
Gamble works in the Cognitive Aging Laboratory, led by the study’s senior investigator, Darlene Howard, PhD, Davis Family Distinguished Professor in the department of psychology and member of the Georgetown Center for Brain Plasticity and Recovery.
The TLT tests implicit learning, a type of learning that occurs without awareness or intent, which relies on the caudate nucleus, an area of the brain affected by loss of dopamine.
The test is a sequential learning task that does not require complex motor skills, which tend to decline in people with PD. In the TLT, participants see four open circles, watch two red dots appear, and are asked to respond when a green dot appears. Unbeknownst to them, the location of the first red dot predicts the location of the green target. Participants learn implicitly where the green target will appear, and they become faster and more accurate in their responses.
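The task's hidden predictive structure can be sketched as a tiny simulation. The 80% contingency, the particular cue-to-target mapping, and the block sizes below are assumptions made for illustration, not the study's actual design.

```python
# Illustrative simulation of the predictive structure in a TLT-style task.
# The 80/20 contingency, the cue->target mapping, and the block sizes are
# all assumptions made for this sketch, not the study's actual parameters.
import random

random.seed(0)
locations = [0, 1, 2, 3]             # the four open circles
mapping = {0: 2, 1: 3, 2: 0, 3: 1}   # first red-dot location -> likely green target

# Implicit learner: tallies how often each green target follows each red cue.
counts = {c: {t: 0 for t in locations} for c in locations}

def run_block(n_trials):
    """Run trials, returning the learner's prediction accuracy for the block."""
    hits = 0
    for _ in range(n_trials):
        cue = random.choice(locations)
        # The green target follows the hidden mapping 80% of the time.
        target = mapping[cue] if random.random() < 0.8 else random.choice(locations)
        predicted = max(counts[cue], key=counts[cue].get)
        hits += (predicted == target)
        counts[cue][target] += 1
    return hits / n_trials

early = run_block(50)    # accuracy while the contingency is still being learned
run_block(400)           # further training
late = run_block(50)     # accuracy after extensive training
print(f"early: {early:.2f}, late: {late:.2f}")
```

In this toy learner, prediction accuracy climbs toward the ceiling set by the contingency as the cue-target statistics are absorbed, a rough analogue of participants becoming faster and more accurate without ever being told the rule.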
Previous studies have shown that the caudate region of the brain underlies implicit learning. In this study, participants with PD implicitly learned the dot pattern with training, but loss of dopamine appeared to impair that learning compared with healthy older adults.
“Their performance began to decline toward the end of training, suggesting that people with Parkinson’s disease lack the neural resources in the caudate, such as dopamine, to complete the learning task,” says Gamble.
In this study of 27 people with PD, the research team is now testing how implicit learning may differ across PD stages and drug doses.
“This work is important in that it may be a non-invasive way to evaluate the level of dopamine deficiency in PD patients, and it may lead to future ways to improve clinical treatment of PD patients,” explains Steven E. Lo, MD, associate professor of neurology at Georgetown University Medical Center and a co-author of the study.
They hope the TLT may one day be a tool to help determine levels of dopamine loss in PD.
Research from Oregon Health & Science University’s Vollum Institute, published in the current issue of Nature, is giving scientists a never-before-seen view of how nerve cells communicate with each other. That new view can give scientists a better understanding of how antidepressants work in the human brain — and could lead to the development of better antidepressants with few or no side effects.
The article in today’s edition of Nature came from the lab of Eric Gouaux, Ph.D., a senior scientist at OHSU’s Vollum Institute and a Howard Hughes Medical Institute Investigator. The article describes research that gives a better view of the structural biology of a protein that controls communication between nerve cells. The view is obtained through special structural and biochemical methods Gouaux uses to investigate these neural proteins.
The Nature article focuses on the structure of the dopamine transporter, which helps regulate dopamine levels in the brain. Dopamine is an essential neurotransmitter for the human body’s central nervous system; abnormal levels of dopamine are present in a range of neurological disorders, including Parkinson’s disease, drug addiction, depression and schizophrenia. Along with dopamine, the neurotransmitters noradrenaline and serotonin are transported by related transporters, which can be studied with greater accuracy based on the dopamine transporter structure.
The Gouaux lab’s more detailed view of the dopamine transporter structure better reveals how antidepressants act on the transporters and thus do their work.
The more detailed view could help scientists and pharmaceutical companies develop drugs that do a much better job of targeting what they’re trying to target — and not create side effects caused by a broader blast at the brain proteins.
"By learning as much as possible about the structure of the transporter and its complexes with antidepressants, we have laid the foundation for the design of new molecules with better therapeutic profiles and, hopefully, with fewer deleterious side effects," said Gouaux.
Gouaux’s latest dopamine transporter research is also important because it used a fruit fly dopamine transporter, which is much more similar to the human transporter than the bacterial models that previous studies had used.
The dopamine transporter article was one of two articles Gouaux had published in today’s edition of Nature. The other article dealt with a modified amino acid transporter that mimics the mammalian neurotransmitter transporter proteins targeted by antidepressants. It gives new insights into the pharmacology of four different classes of widely used antidepressants that act on certain transporter proteins, including transporters for dopamine, serotonin and noradrenaline. The second paper was validated in part by findings of the first paper — in how an antidepressant bound itself to a specific transporter.
"What we ended up finding with this research was complementary and mutually reinforcing with the other work — so that was really important," Gouaux said. "And it told us a great deal about how these transporters work and how they interact with the antidepressant molecules."
New research on pond snails has revealed that high levels of stress can block memory processes. Researchers from the University of Exeter and the University of Calgary trained snails and found that when they were exposed to multiple stressful events they were unable to remember what they had learned.

Previous research has shown that stress also affects human ability to remember. This study, published in the journal PLOS ONE, found that experiencing multiple stressful events simultaneously has a cumulative detrimental effect on memory.
Dr Sarah Dalesman, a Leverhulme Trust Early Career Fellow, from Biosciences at the University of Exeter, formerly at the University of Calgary, said: “It’s really important to study how different forms of stress interact as this is what animals, including people, frequently experience in real life. By training snails, and then observing their behaviour and brain activity following exposure to stressful situations, we found that a single stressful event resulted in some impairment of memory but multiple stressful events prevented any memories from being formed.”
The pond snail, Lymnaea stagnalis, has easily observable behaviours linked to memory and large neurons in the brain, both useful benefits when studying memory processes. They also respond to stressful events in a similar way to mammals, making them a useful model species to study learning and memory.
In the study, the pond snails were trained to reduce how often they breathed outside water. Usually, pond snails breathe underwater and absorb oxygen through their skin. In water with low oxygen levels, the snails emerge and inhale air using a basic lung that opens to the air via a breathing hole.
To train the snails not to breathe air they were placed in poorly oxygenated water and their breathing holes were gently poked every time they emerged to breathe. Snail memory was tested by observing how many times the snails attempted to breathe air after they had received their training. Memory was considered to be present if there was a reduction in the number of times they opened their breathing holes. The researchers also assessed memory by monitoring neural activity in the brain.
Immediately before training, the snails were exposed to two different stressful experiences: low calcium – which is stressful because calcium is necessary for healthy shells – and overcrowding by other pond snails.
When faced with the stressors individually, the pond snails had reduced ability to form long term memory, but were still able to learn and form short and intermediate term memory lasting from a few minutes to hours. However, when both stressors were experienced at the same time, results showed that they had additive effects on the snails’ ability to form memory and all learning and memory processes were blocked.
Future work will focus on the effects of stress on different populations of pond snail.
When faced with a choice, the brain retrieves specific traces of memories, rather than a generalized overview of past experiences, from its mental Rolodex, according to new brain-imaging research from The University of Texas at Austin.

Led by Michael Mack, a postdoctoral researcher in the departments of psychology and neuroscience, the study is the first to combine computer simulations with brain-imaging data to compare two different types of decision-making models.
In one model — exemplar — a decision is framed around concrete traces of memories, while in the other model — prototype — the decision is based on a generalized overview of all memories lumped into a specific category.
Whether one model drives decisions more than the other has remained a matter of debate among scientists for more than three decades. But according to the findings, the exemplar model is more consistent with decision-making behavior.
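The two models can be contrasted in a minimal sketch: an exemplar score sums similarity to every stored instance, while a prototype score measures similarity to the category average. The stimulus coordinates and the similarity parameter below are invented for illustration, not values from the study.

```python
# Minimal sketch of exemplar vs. prototype categorization.
# Stimulus coordinates and the similarity steepness c are invented values.
import math

def similarity(x, y, c=4.0):
    """Similarity decays exponentially with Euclidean distance."""
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))
    return math.exp(-c * dist)

def exemplar_score(stimulus, members):
    # Exemplar model: sum similarity to every stored memory trace.
    return sum(similarity(stimulus, m) for m in members)

def prototype_score(stimulus, members):
    # Prototype model: similarity to the category's average member only.
    proto = tuple(sum(dim) / len(members) for dim in zip(*members))
    return similarity(stimulus, proto)

cat_a = [(0.0, 0.0), (0.2, 0.1), (1.0, 1.0)]  # includes one atypical member
cat_b = [(0.8, 0.9), (0.9, 0.8)]
probe = (1.0, 1.0)  # identical to category A's atypical member

# The exemplar model favors A (an exact stored trace matches the probe)...
print(exemplar_score(probe, cat_a) > exemplar_score(probe, cat_b))  # True
# ...while the prototype model favors B (the probe is far from A's average).
print(prototype_score(probe, cat_a) > prototype_score(probe, cat_b))  # False
```

The two models only come apart on probes like this one, which is why, as the article notes, behavior alone often cannot distinguish them.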
The study was published this month in Current Biology. The authors include Alison Preston, associate professor in the Department of Psychology and the Center for Learning and Memory; and Bradley Love, a professor at University College London.
In the study, 20 respondents were asked to sort various shapes into two categories. During the task their brain activity was observed using functional magnetic resonance imaging (fMRI), allowing researchers to see how the respondents associate shapes with past memories.
According to the findings, behavioral research alone cannot determine whether a subject uses the exemplar or prototype model to make decisions. With brain-imaging analysis, researchers found that the exemplar model accounted for the majority of participants’ decisions. The results show that three different regions associated with the exemplar model were activated during the learning task: the occipital cortex (visual perception), parietal cortex (sensory) and frontal cortex (attention).
While processing new information, the brain stores concrete traces of experiences, allowing it to make different kinds of decisions, such as categorization (is that a dog?), identification (is that John’s dog?) and recall (when did I last see John’s dog?).
To illustrate, Mack says: Imagine having a conversation with a friend about buying a new car. When you think of the category “car,” you’re likely to think of an abstract concept of a car, but not specific details. However, abstract categories are composed of memories from individual experiences. So when you imagine “car,” the abstract mental picture is actually derived from experiences, such as your friend’s white sedan or the red sports car you saw on the morning commute.
“We flexibly memorize our experiences, and this allows us to use these memories for different kinds of decisions,” Mack says. “By storing concrete traces of our experiences, we can make decisions about different types of cars and even specific past experiences in our life with the same memories.”
Mack says this new approach to model-based cognitive neuroscience could lead to discoveries in cognitive research.
“The field has struggled with linking theories of how we behave and act to the activation measures we see in the brain,” Mack says. “Our work offers a method to move beyond simply looking at blobs of brain activation. Instead, we use patterns of brain activation to decode the algorithms underlying cognitive behaviors like decision making.”
Can quitting drugs without treatment trigger a decline in mental health? That appears to be the case in an animal model of morphine addiction. Georgetown University Medical Center researchers say their observations suggest that managing morphine withdrawal could promote a healthier mental state in people.
“Over time, drug-abusing individuals often develop mental disorders,” says Italo Mocchetti, PhD, a professor of neuroscience. “It’s been thought that drug abuse itself contributes to mental decline, but our findings suggest that ‘quitting cold turkey’ can also lead to damage.”
In the study published in the November issue of Brain, Behavior and Immunity and presented at Neuroscience 2013, Mocchetti and his research colleagues treated the animals with morphine, or allowed them to undergo withdrawal by stopping the treatment. Then, they measured pro-inflammatory cytokines, which can promote damage and cell death, and the protein CCL5, which has various protective effects in the brain.
“Interestingly, we found that treating the addicted animals with morphine increased the protective CCL5 protein while decreasing pro-inflammatory cytokines, suggesting a beneficial effect,” Mocchetti explains. The animals that weren’t treated during withdrawal had the opposite results — decreased CCL5 and increased levels of the damaging cytokines.
“From these findings, it appears that morphine withdrawal may be a causative factor that leads to mental decline, presenting an important avenue for research in how we can better help people who are trying to quit using drugs,” concludes Mocchetti.
A Columbia University Medical Center-led research team has clinically validated a new method for predicting time to full-time care, nursing home residence, or death for patients with Alzheimer’s disease. The method, which uses data gathered from a single patient visit, is based on a complex model of Alzheimer’s disease progression that the researchers developed by consecutively following two sets of Alzheimer’s patients for 10 years each. The results were published online ahead of print in the Journal of Alzheimer’s Disease.

“Predicting Alzheimer’s progression has been a challenge because the disease varies significantly from one person to another—two Alzheimer’s patients may both appear to have mild forms of the disease, yet one may progress rapidly, while the other progresses much more slowly,” said senior author Yaakov Stern, PhD, professor of neuropsychology (in neurology, psychiatry, and psychology and in the Taub Institute for Research on Alzheimer’s Disease and the Aging Brain and the Gertrude H. Sergievsky Center) at CUMC. “Our method enables clinicians to predict the disease path with great specificity.”
The brains of children with autism show more connections than the brains of typically developing children do. What’s more, the brains of individuals with the most severe social symptoms are also the most hyper-connected. The findings, reported in two independent studies published November 7th in the Cell Press journal Cell Reports, challenge the prevailing notion in the field that autistic brains are lacking in neural connections.

The findings could lead to new treatment strategies and new ways to detect autism early, the researchers say. Autism spectrum disorder is a neurodevelopmental condition affecting nearly 1 in 88 children.
"Our study addresses one of the hottest open questions in autism research," said Kaustubh Supekar of Stanford University School of Medicine of his and his colleague Vinod Menon’s study aimed at characterizing whole-brain connectivity in children. "Using one of the largest and most heterogeneous pediatric functional neuroimaging datasets to date, we demonstrate that the brains of children with autism are hyper-connected in ways that are related to the severity of social impairment exhibited by these children."
In the second Cell Reports study, Ralph-Axel Müller and colleagues at San Diego State University focused specifically on neighboring brain regions to find an atypical increase in connections in adolescents with a diagnosis of autism spectrum disorder. That over-connection, which his team observed particularly in the regions of the brain that control vision, was also linked to symptom severity.
"Our findings support the special status of the visual system in children with heavier symptom load," Müller said, noting that all of the participants in his study were considered "high-functioning" with IQs above 70. He says measures of local connectivity in the cortex might be used as an aid to diagnosis, which today is based purely on behavioral criteria.
For Supekar and Menon, these new views of the autistic brain raise the intriguing possibility that epilepsy drugs might be used to treat autism.
"Our findings suggest that the imbalance of excitation and inhibition in the local brain circuits could engender cognitive and behavioral deficits observed in autism," Menon said. That imbalance is a hallmark of epilepsy as well, which might explain why children with autism so often suffer with epilepsy too.
"Drawing from these observations, it might not be too far-fetched to speculate that the existing drugs used to treat epilepsy may be potentially useful in treating autism," Supekar said.
In a study led by Duke researchers, monkeys have learned to control the movement of both arms on an avatar using just their brain activity.
The findings, published Nov. 6, 2013, in the journal Science Translational Medicine, advance efforts to develop bilateral movement in brain-controlled prosthetic devices for severely paralyzed patients.
To enable the monkeys to control two virtual arms, researchers recorded nearly 500 neurons from multiple areas in both cerebral hemispheres of the animals’ brains, the largest number of neurons recorded and reported to date.
Millions of people worldwide suffer from sensory and motor deficits caused by spinal cord injuries. Researchers are working to develop tools to help restore their mobility and sense of touch by connecting their brains with assistive devices. The brain-machine interface approach, pioneered at the Duke University Center for Neuroengineering in the early 2000s, holds promise for reaching this goal. However, until now brain-machine interfaces could only control a single prosthetic limb.
“Bimanual movements in our daily activities — from typing on a keyboard to opening a can — are critically important,” said senior author Miguel Nicolelis, M.D., Ph.D., professor of neurobiology at Duke University School of Medicine. “Future brain-machine interfaces aimed at restoring mobility in humans will have to incorporate multiple limbs to greatly benefit severely paralyzed patients.”
Nicolelis and his colleagues studied large-scale cortical recordings to see if they could provide sufficient signals to brain-machine interfaces to accurately control bimanual movements.
The monkeys were trained in a virtual environment within which they viewed realistic avatar arms on a screen and were encouraged to place their virtual hands on specific targets in a bimanual motor task. The monkeys first learned to control the avatar arms using a pair of joysticks, but were able to learn to use just their brain activity to move both avatar arms without moving their own arms.
As the animals’ performance in controlling both virtual arms improved over time, the researchers observed widespread plasticity in cortical areas of their brains. These results suggest that the monkeys’ brains may incorporate the avatar arms into their internal image of their bodies, a finding recently reported by the same researchers in the journal Proceedings of the National Academy of Sciences.
The researchers also found that cortical regions showed specific patterns of neuronal electrical activity during bimanual movements that differed from the neuronal activity produced for moving each arm separately.
The study suggests that very large neuronal ensembles — not single neurons — define the underlying physiological unit of normal motor functions. Small neuronal samples of the cortex may be insufficient to control complex motor behaviors using a brain-machine interface.
“When we looked at the properties of individual neurons, or of whole populations of cortical cells, we noticed that simply summing up the neuronal activity correlated to movements of the right and left arms did not allow us to predict what the same individual neurons or neuronal populations would do when both arms were engaged together in a bimanual task,” Nicolelis said. “This finding points to an emergent brain property — a non-linear summation — for when both hands are engaged at once.”
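The non-linear summation Nicolelis describes can be illustrated with a toy model neuron; the sigmoid response function and all drive values below are invented for this sketch and are not the study's recorded data or model.

```python
# Toy illustration of non-linear summation in a single model "neuron".
# The sigmoid response function and all drive values are invented for this
# sketch; they are not the study's recorded data or model.
import math

def firing_rate(left_drive, right_drive):
    """Firing rate (Hz) as a saturating sigmoid of the summed input drive."""
    return 100 / (1 + math.exp(-(left_drive + right_drive - 2)))

left_only = firing_rate(2.0, 0.0)    # response when only the left arm moves
right_only = firing_rate(0.0, 2.0)   # response when only the right arm moves
both = firing_rate(2.0, 2.0)         # response during a bimanual movement

linear_prediction = left_only + right_only
print(linear_prediction)  # 100.0
print(both)               # ~88.1: less than the sum of the unimanual responses
```

Because the bimanual response is not the sum of the unimanual ones, tuning measured arm-by-arm mispredicts activity when both arms move together, which is the failure of linear summation the quote describes.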
Nicolelis is incorporating the study’s findings into the Walk Again Project, an international collaboration working to build a brain-controlled neuroprosthetic device. The Walk Again Project plans to demonstrate its first brain-controlled exoskeleton, which is currently being developed, during the opening ceremony of the 2014 FIFA World Cup.
TAU researchers study the long-term effects of torture on the human pain system

Israeli soldiers captured during the 1973 Yom Kippur War were subjected to brutal torture in Egypt and Syria. Held alone in tiny, filthy spaces for weeks or months, sometimes handcuffed and blindfolded, they suffered severe beatings, burns, electric shocks, starvation, and worse. And rather than receiving treatment, the prisoners had additional torture inflicted on their existing wounds.
Forty years later, research by Prof. Ruth Defrin of the Department of Physical Therapy in the Sackler Faculty of Medicine at Tel Aviv University shows that the ex-prisoners of war (POWs), continue to suffer from dysfunctional pain perception and regulation, likely as a result of their torture. The study — conducted in collaboration with Prof. Zahava Solomon and Prof. Karni Ginzburg of TAU’s Bob Shapell School of Social Work and Prof. Mario Mikulincer of the School of Psychology at the Interdisciplinary Center, Herzliya — was published in the European Journal of Pain.
"The human body’s pain system can either inhibit or excite pain. It’s two sides of the same coin," says Prof. Defrin. "Usually, when it does more of one, it does less of the other. But in Israeli ex-POWs, torture appears to have caused dysfunction in both directions. Our findings emphasize that tissue damage can have long-term systemic effects and needs to be treated immediately."
A painful legacy
The study focused on 104 combat veterans of the Yom Kippur War. Sixty of the men had been taken prisoner during the war; the other 44 had not. In the study, all were put through a battery of psychophysical pain tests — applying a heating device to one arm, submerging the other arm in a hot water bath, and pressing a nylon fiber into a middle finger. They also filled out psychological questionnaires.
The ex-POWs exhibited diminished pain inhibition (the degree to which the body eases one pain in response to another) and heightened pain excitation (the degree to which repeated exposure to the same sensation heightens the resulting pain). Based on these novel findings, the researchers conclude that the torture survivors’ bodies now regulate pain in a dysfunctional way.
It is not entirely clear whether the dysfunction is the result of years of chronic pain or of the original torture itself. But the ex-POWs exhibited worse pain regulation than the non-POW chronic pain sufferers in the study. And a statistical analysis of the test data also suggested that being tortured had a direct effect on their ability to regulate pain.
Head games
The researchers say non-physical torture may have also contributed to the ex-POWs’ chronic pain. Among other forms of oppression and humiliation, the ex-POWs were not allowed to use the toilet, cursed at and threatened, told demoralizing misinformation about their loved ones, and exposed to mock executions. In the later stages of captivity, most of the POWs were transferred to a group cell, where social isolation was replaced by intense friction, crowding, and loss of privacy.
"We think psychological torture also affects the physiological pain system," says Prof. Defrin. "We still have to fully analyze the data, but preliminary analysis suggests there is a connection."
Brain development and maturation have long been thought of as a one-way process in which plasticity diminishes with age; the possibility that the adult brain can revert to a younger state and regain plasticity has seldom been considered. In a paper appearing on November 4 in the online open-access journal Molecular Brain, Dr. Tsuyoshi Miyakawa and his colleagues from Fujita Health University show that chronic administration of one of the most widely used antidepressants, fluoxetine (FLX, a selective serotonin reuptake inhibitor also known by trade names such as Prozac, Sarafem, and Fontex), can induce a juvenile-like state in specific types of neurons in the prefrontal cortex of adult mice.
In their study, FLX-treated adult mice showed reduced expression of parvalbumin and perineuronal nets, which are molecular markers of maturation expressed in a certain group of mature neurons in adults, and increased expression in the prefrontal cortex of an immature marker that typically appears in developing juvenile brains. These findings suggest that certain types of adult neurons in the prefrontal cortex can partially regain a youth-like state; the authors termed this induced youth, or iYouth. These researchers, as well as other groups, had previously reported similar effects of FLX in the hippocampal dentate gyrus, basolateral amygdala, and visual cortex, which were associated with increased neural plasticity in certain types of neurons. This study is the first to report “iYouth” in the prefrontal cortex, the brain region critically involved in functions such as working memory, decision-making, personality expression, and social behavior, as well as in psychiatric disorders related to deficits in these functions.
Network dysfunction in the prefrontal cortex and limbic system, including the hippocampus and amygdala, is known to be involved in the pathophysiology of depressive disorders. Reversion to a youth-like state may mediate some of the therapeutic effects of FLX by restoring neural plasticity in these regions. On the other hand, some undesirable aspects of FLX-induced pseudo-youth may play a role in certain behavioral effects associated with FLX treatment, such as aggression, violence, and psychosis, which have recently received attention as adverse effects of FLX. Interestingly, expression of the same molecular markers of maturation discussed in this study has been reported to be decreased in the prefrontal cortex of postmortem brains of patients with schizophrenia. This raises the possibility that some of FLX’s adverse effects may be attributable to iYouth in the same type of neurons in this region. Basic knowledge here is still lacking, and several questions remain unanswered: What are the molecular and cellular mechanisms underlying iYouth? What are the differences between actual youth and iYouth? Is iYouth good or bad? Future studies to answer these questions could potentially revolutionize the prevention and/or treatment of various neuropsychiatric disorders and aid in improving the quality of life for an aging population.
More than two decades ago, Ryan Vincent had open brain surgery to remove a malignant brain tumor, resulting in a lengthy hospital stay and weeks of recovery at home. Recently, neurosurgeons at Houston Methodist Hospital removed a different lesion from Vincent’s brain through a tube inserted into a hole smaller than a dime and he went home the next day.

Gavin Britz, MBBCh, MPH, FAANS, chairman of neurosurgery at Houston Methodist Neurological Institute, used a minimally-invasive technique to remove a vascular lesion from deep within the 44-year-old patient’s brain, the first use of this technique in the region. Traditionally, removing vascular lesions or brain tumors located deep within the brain can itself cause damage.
“With this new approach, we can navigate through millions of important brain fibers and tracts to access deep areas of the brain where these benign tumors or hemorrhages are located with minimal injury to normal brain,” said Britz. “Ryan’s surgery took less than an hour.”
Houston Methodist neurosurgeons Britz and David Baskin, M.D., director of the Kenneth R. Peak Brain & Pituitary Tumor Center, are using this “six-pillar approach” that encompasses the latest technology in minimally invasive surgery — mapping of the brain; navigating the brain like a GPS system; safely accessing the brain and tumor/lesion; using high-end optics for visualization; successfully removing the tumor without disrupting tissues around it; and directed therapy using tissue collected for evaluation that can then be used for personalized treatments.
The new surgical technique is used to remove cancerous and non-cancerous tumors, lesions and cysts deep inside the brain. This approach reduces the risk of damage to speech, memory, muscle strength, balance, vision, coordination and other functional areas of the brain.
A stem cell therapy previously shown to reduce inflammation in the critical time window after traumatic brain injury also promotes lasting cognitive improvement, according to preclinical research led by Charles Cox, M.D., at The University of Texas Health Science Center at Houston (UTHealth) Medical School.
The research was published in today’s issue of STEM CELLS Translational Medicine.
Cellular damage in the brain after traumatic injury can cause severe, ongoing neurological impairment and inflammation. Few pharmaceutical options exist to treat the problem. About half of patients with severe head injuries need surgery to remove or repair ruptured blood vessels or bruised brain tissue.
A stem cell treatment known as multipotent adult progenitor cell (MAPC) therapy has been found to reduce inflammation in mice immediately after traumatic brain injury, but no one had been able to gauge its usefulness over time.
The research team led by Cox, the Children’s Fund, Inc. Distinguished Professor of Pediatric Surgery at the UTHealth Medical School, injected two groups of brain-injured mice with MAPCs two hours after the mice were injured and again 24 hours later. One group received a dose of 2 million cells per kilogram and the other a dose five times stronger.
After four months, the mice receiving the stronger dose not only continued to have less inflammation—they also made significant gains in cognitive function. A laboratory examination of the rodents’ brains confirmed that those receiving the higher dose of MAPCs had better brain function than those receiving the lower dose.
“Based on our data, we saw improved spatial learning, improved motor deficits and fewer active antibodies in the mice that were given the stronger concentration of MAPCs,” Cox said.
The study indicates that intravenous injection of MAPCs may in the future become a viable treatment for people with traumatic brain injury, he said.
A paper published in a special edition of the journal Science proposes a novel understanding of brain architecture using a network representation of connections within the primate cortex. Zoltán Toroczkai, professor of physics at the University of Notre Dame and co-director of the Interdisciplinary Center for Network Science and Applications, is a co-author of the paper “Cortical High-Density Counterstream Architectures.”

Using brain-wide, consistent tracer data, the researchers describe the cortex as a network of connections with a “bow tie” structure characterized by a high-efficiency, dense core connecting with “wings” of feed-forward and feedback pathways to the rest of the cortex (the periphery). Local circuits, which reach within 2.5 millimeters and account for more than 70 percent of all connections in the macaque cortex, are integrated across areas of different functional modalities (somatosensory, motor, cognitive) by medium- to long-range projections.
The authors also report on a simple network model that incorporates the physical principle of an entropic cost of long wiring and the spatial positioning of the functional areas in the cortex. They show that this model reproduces the properties of the experimental connectivity data, including the bow tie structure. The wings of the bow tie emerge from the counterstream organization of the feed-forward and feedback pathways. They also demonstrate that, contrary to previous beliefs, such high-density cortical graphs can simultaneously achieve strong connectivity (almost direct between any two areas), communication efficiency, and economy of connections (shown by optimizing total wire cost) via weight-distance correlations that are also consequences of this simple network model.
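The wiring-cost principle behind such a model can be illustrated with a minimal sketch: connection weight falls off exponentially with wiring distance, so nearby areas are strongly linked and distant areas only weakly so. This is an illustrative toy model, not the authors’ implementation; the number of areas, the sheet size, and the decay constant are all assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not from the paper): n cortical areas placed at
# random 2-D positions on a 40 mm x 40 mm sheet.
n = 30
pos = rng.uniform(0.0, 40.0, size=(n, 2))

# Pairwise Euclidean distances between areas (mm).
d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)

# Exponential distance rule: connection weight decays with wiring length,
# reflecting an entropic cost of long-range wiring. The decay constant
# lam (per mm) is an assumed illustrative value, not a fitted one.
lam = 0.19
w = np.exp(-lam * d)
np.fill_diagonal(w, 0.0)  # no self-connections

# The rule yields the weight-distance correlation discussed in the text:
# e.g. exp(-0.19 * 1) ~ 0.83 at 1 mm versus exp(-0.19 * 10) ~ 0.15 at 10 mm.
```

Because the weight is a deterministic, decreasing function of distance, the resulting graph is dense (every pair of areas is connected) yet dominated by short-range weight, the combination the paper highlights.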
This bow tie arrangement is a typical feature of self-organizing information-processing systems. The paper notes that the cortex has some analogies with information-processing networks such as the World Wide Web, as well as metabolism, the immune system and cell signaling. The core-periphery bow tie structure, they say, is “an evolutionarily favored structure for a wide variety of complex networks” because “these systems are not in thermodynamic equilibrium and are required to maintain energy and matter flow through the system.” The brain, however, also shows important differences from such systems. For example, information packets sent over the Internet encode destination addresses, which the brain apparently does not, and the location and timing of activity are critical to information processing in the brain but not on the Internet.
“Biological data is extremely complex and diverse,” Toroczkai said. “However, as a physicist, I am interested in what is common or invariant in the data, because it may reveal a fundamental organizational principle behind a complex system. A minimal theory that incorporates such a principle should reproduce the observations, if not in great detail, then in extent. I believe that with additional consistent data, such as those obtained by the Kennedy team, the fundamental principles of massive information processing in brain neuronal networks are within reach.”