Posts tagged neuroscience

NSF-funded Superhero Supercomputer Helps Battle Autism
'Gordon,' a supercomputer with unique flash memory, helps identify gene-related paths to treating mental disorders
When it officially came online at the San Diego Supercomputer Center (SDSC) in early January 2012, Gordon was instantly impressive. In one demonstration, it sustained more than 35 million input/output operations per second—then, a world record.
Input/output operations per second are an important measure for data-intensive computing, indicating how quickly a storage system can move information between an information-processing system, such as a computer, and the outside world. The figure reflects how fast a system can retrieve the randomly organized data common in large datasets and feed it into data-mining applications.
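As a rough illustration of what an IOPS figure measures (this is a generic sketch, not SDSC's actual benchmark), random-read throughput can be estimated by timing many small seeks into a file. Numbers from this toy version will be inflated by the operating system's page cache:

```python
import os
import random
import tempfile
import time

def estimate_read_iops(path, block_size=4096, n_ops=2000):
    """Estimate random-read IOPS by timing small seeks into a file."""
    size = os.path.getsize(path)
    with open(path, "rb") as f:
        start = time.perf_counter()
        for _ in range(n_ops):
            # Jump to a random offset and read one block, as a data-mining
            # workload scanning scattered records would.
            f.seek(random.randrange(0, max(1, size - block_size)))
            f.read(block_size)
        elapsed = time.perf_counter() - start
    return n_ops / elapsed

# Write a 4 MiB scratch file, measure, then clean up.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(4 * 1024 * 1024))
    path = tmp.name
iops = estimate_read_iops(path)
os.unlink(path)
print(f"~{iops:,.0f} random 4 KiB reads per second (cached)")
```

Real storage benchmarks bypass the cache and issue requests in parallel, which is how systems like Gordon reach tens of millions of operations per second.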
The supercomputer’s record-breaking feat wasn’t a surprise; after all, Gordon is named after a comic strip superhero, Flash Gordon.
Gordon’s new and unique architecture employs massive amounts of the type of flash memory common in cell phones and laptops—hence its name. The system is used by scientists whose research requires the mining, searching and/or creating of large databases for immediate or later use, including mapping genomes for applications in personalized medicine and examining computer automation of stock trading by investment firms on Wall Street.
Commissioned by the National Science Foundation (NSF) in 2009 for $20 million, Gordon is part of NSF’s Extreme Science and Engineering Discovery Environment, or XSEDE program, a nationwide partnership comprising 16 high-performance computers and high-end visualization and data analysis resources.
"Gordon is a unique machine in NSF’s Advanced Cyberinfrastructure/XSEDE portfolio," said Barry Schneider, NSF program director for advanced cyberinfrastructure. “It was designed to handle scientific problems involving the manipulation of very large data. It is differentiated from most other resources we support in having a large solid-state memory, 4 GB per core, and the capability of simulating a very large shared memory system with software.”
Last month, a team of researchers from SDSC in the United States and the Institut Pasteur in France reported in the journal Genes, Brain and Behavior that they used Gordon to devise a novel way to describe a time-dependent gene-expression process in the brain, one that can be used to guide the development of treatments for mental disorders such as autism-spectrum disorders and schizophrenia.
The researchers identified the hierarchical tree of coherent gene groups and transcription-factor networks that determine the patterns of genes expressed during brain development. They found that some “master transcription factors” at the top level of the hierarchy regulated the expression of a significant number of gene groups.
The scientists’ findings can be used for selection of transcription factors that could be targeted in the treatment of specific mental disorders.
"We live in the unique time when huge amounts of data related to genes, DNA, RNA, proteins, and other biological objects have been extracted and stored," said lead author Igor Tsigelny, a research scientist with SDSC as well as with UC San Diego’s Moores Cancer Center and its Department of Neurosciences.
"I can compare this time to a situation when the iron ore would be extracted from the soil and stored as piles on the ground. All we need is to transform the data to knowledge, as ore to steel. Only the supercomputers and people who know what to do with them will make such a transformation possible," he said.
Human Emotion: We Report Our Feelings in 3-D
Like it or not, and despite the ongoing debate over its merits, 3-D is the technology du jour for movie-making in Hollywood. It now turns out that even our brains use three dimensions to communicate emotions.
According to a new study published in Biological Psychiatry, the human report of emotion relies on three distinct systems: one that directs attention to affective states (“I feel”); a second that categorizes these states into words (“good,” “bad,” etc.); and a third that gauges the intensity of affective responses (“bad” or “awful”?).
Emotions are central to the human experience. Whether we are feeling happy, sad, afraid, or angry, we are often asked to identify and report on these feelings. This happens when friends ask us how we are doing, when we talk about professional or personal relationships, when we meditate, and so on. In fact, the very commonness and ease of reporting what we are feeling can lead us to overlook just how important such reports are, and how devastating the impairment of this ability may be for individuals with clinical disorders ranging from major depression to schizophrenia to autism spectrum disorders.
Progress in brain science has steadily been shedding light on the circuits and processes that underlie mood states. One of the leaders in this effort, Dr. Kevin Ochsner, Director of the Social Cognitive Neuroscience Lab at Columbia University, studies the neural bases of social, cognitive and affective processes. In this new study, he and his team set out to study the processes involved in constructing self-reports of emotion, rather than the effects of the self-reports or the emotional states themselves for which there is already much research.
To accomplish this, they recruited healthy participants who underwent brain scans while completing an experimental task that generated a self-report of emotion. This effort allowed the researchers to examine the neural architecture underlying the emotional reports.
“We find that the seemingly simple ability is supported by three different kinds of brain systems: largely subcortical regions that trigger an initial affective response, parts of medial prefrontal cortex that focus our awareness on the response and help generate possible ways of describing what we are feeling, and a part of the lateral prefrontal cortex that helps pick the best words for the feelings at hand,” said Ochsner.
“These findings suggest that self-reports of emotion - while seemingly simple - are supported by a network of brain regions that together take us from an affecting event to the words that make our feelings known to ourselves and others,” he added. “As such, these results have important implications for understanding both the nature of everyday emotional life - and how the ability to understand and talk about our emotions can break down in clinical populations.”
Dr. John Krystal, Editor of Biological Psychiatry, said, “It is critical that we understand the mechanisms underlying the absorption in emotion, the valence of emotion, and the intensity of emotion. In the short run, appreciation of the distinct circuits mediating these dimensions of emotional experience helps us to understand how brain injury, stroke, and tumors produce different types of mood changes. In the long run, it may help us to better treat mood disorders.”
Migraines related to brain abnormalities, study suggests
A new study suggests that migraines are related to brain abnormalities present at birth and others that develop over time. The research is published online in the journal Radiology.

Migraines are intense, throbbing headaches, sometimes accompanied by nausea, vomiting and sensitivity to light. Some patients experience auras, a change in visual or sensory function that precedes or occurs during the migraine. More than 300 million people suffer from migraines worldwide, according to the World Health Organization.
Previous research on migraine patients has shown atrophy of cortical regions in the brain related to pain processing, possibly due to chronic stimulation of those areas. Cortical refers to the cortex, or outer layer of the brain.
Much of that research has relied on voxel-based morphometry, which provides estimates of the brain’s cortical volume. In the new study, Italian researchers used a different approach: a surface-based MRI method to measure cortical thickness.
"For the first time, we assessed cortical thickness and surface area abnormalities in patients with migraine, which are two components of cortical volume that provide different and complementary pieces of information," said Massimo Filippi, M.D., director of the Neuroimaging Research Unit at the San Raffaele Scientific Institute and professor of neurology at Vita-Salute San Raffaele University in Milan. "Indeed, cortical surface area increases dramatically during late fetal development as a consequence of cortical folding, while cortical thickness changes dynamically throughout the entire life span as a consequence of development and disease."
Dr. Filippi and colleagues used magnetic resonance imaging (MRI) to acquire T2-weighted and 3-D T1-weighted brain images from 63 migraine patients and 18 healthy controls. Using special software and statistical analysis, they estimated cortical thickness and surface area and correlated it with the patients’ clinical and radiologic characteristics.
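The correlation step can be illustrated with a plain Pearson coefficient. This is a generic sketch with invented toy numbers, not the study's actual software or its surface-based pipeline:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Purely illustrative numbers: regional thickness (mm) vs. a clinical score.
thickness = [2.41, 2.38, 2.52, 2.30, 2.47]
clinical_score = [6, 8, 3, 10, 4]
r = pearson_r(thickness, clinical_score)
print(f"r = {r:.2f}")
```

A value of r near +1 or -1 indicates a strong linear relationship between the anatomical measure and the clinical characteristic; values near 0 indicate none, which is how the authors could report that some abnormalities were unrelated to disease duration and attack frequency.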
Compared to controls, migraine patients showed reduced cortical thickness and surface area in regions related to pain processing. There was only minimal anatomical overlap of cortical thickness and cortical surface area abnormalities, with cortical surface area abnormalities being more pronounced and more widely distributed than cortical thickness abnormalities. The presence of aura and white matter hyperintensities—areas of high intensity on MRI that appear to be more common in people with migraine—was related to the regional distribution of cortical thickness and surface area abnormalities, but not to disease duration and attack frequency.
"The most important finding of our study was that cortical abnormalities that occur in patients with migraine are a result of the balance between an intrinsic predisposition, as suggested by cortical surface area modification, and disease-related processes, as indicated by cortical thickness abnormalities," Dr. Filippi said. "Accurate measurements of cortical abnormalities could help characterize migraine patients better and improve understanding of the pathophysiological processes underlying the condition."
Additional research is needed to fully understand the meaning of cortical abnormalities in the pain processing areas of migraine patients, according to Dr. Filippi.
"Whether the abnormalities are a consequence of the repetition of migraine attacks or represent an anatomical signature that predisposes to the development of the disease is still debated," he said. "In my opinion, they might contribute to making migraine patients more susceptible to pain and to an abnormal processing of painful conditions and stimuli."
The researchers are conducting a longitudinal study of the patient group to see if their cortical abnormalities are stable or tend to worsen over the course of the disease. They are also studying the effects of treatments on the observed modifications of cortical folding and looking at pediatric patients with migraine to assess whether the abnormalities represent a biomarker of the disease.
(Source: eurekalert.org)
Researchers scoring a win-win with novel set of concussion diagnostic tools
From Junior Seau, former San Diego Chargers linebacker, to Dave Duerson, former Chicago Bears safety — who both committed suicide as a result of chronic traumatic encephalopathy (CTE) — traumatic brain injuries (TBIs) have been making disturbing headlines at an alarming rate. In the United States alone, an estimated 1.6 million to 3.8 million sports-related TBIs occur every year, with approximately 300,000 of those diagnosed among young, nonprofessional athletes. But TBIs are not confined to sports; they are also considered a signature wound among soldiers of the Iraq and Afghanistan wars.
The potential impacts on the health and well-being of individuals with brain injuries are numerous. These individuals might display a range of symptoms — such as headaches, depression, loss of memory and loss of brain function — that may persist for weeks or months. The effects of brain injuries are most devastating when they remain unrecognized for long periods of time. This is where Christian Poellabauer, associate professor of computer science and engineering; Patrick Flynn, professor of computer science and engineering; Nikhil Yadav, graduate student of computer science and engineering; and a team of students and faculty are making their own impact.
Although baseline tests of athletes prior to an injury are trending up, these tests must still be compared to examinations after an injury has occurred. They require heavy medical equipment, such as a CT scanner, MRI equipment or X-ray machine, and are not always conclusive. The Notre Dame team has developed a tablet-based testing system that captures the voice of an individual and analyzes the speech for signs of a potential concussion anytime, anywhere, in real time.
“This project is a great example of how mobile computing and sensing technologies can transform health care,” Poellabauer said. “More important, because almost 90 percent of concussions go unrecognized, this technology offers tremendous potential to reduce the impact of concussive and subconcussive hits to the head.”
The system sounds simple enough: An individual speaks into a tablet equipped with the Notre Dame program before and after an event. The two samples are then compared for TBI indicators, which include distorted vowels, hypernasality and imprecise consonants.
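The compare-two-samples idea can be sketched as follows; the feature names, values and threshold here are hypothetical stand-ins, not Notre Dame's actual acoustic analysis:

```python
# Each recording is reduced to a dict of acoustic features (names invented
# for illustration); a baseline is captured before the event.
BASELINE = {"vowel_space": 1.00, "nasality": 0.20, "consonant_precision": 0.90}
POST_EVENT = {"vowel_space": 0.78, "nasality": 0.31, "consonant_precision": 0.88}

def flag_tbi_indicators(baseline, post, threshold=0.15):
    """Flag features whose relative change from baseline exceeds the threshold."""
    flags = {}
    for name, base in baseline.items():
        rel_change = abs(post[name] - base) / abs(base)
        flags[name] = rel_change > threshold
    return flags

flags = flag_tbi_indicators(BASELINE, POST_EVENT)
print(flags)
```

In this toy run the shrunken vowel space and elevated nasality are flagged while consonant precision is not, mirroring the paper-and-tablet idea of comparing a post-event recording against an athlete's own pre-season baseline.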
Notre Dame’s system offers a variety of advantages over traditional testing, such as portability, high accuracy, low cost and a low probability of manipulation (the results cannot be faked); it has also proven very successful. In testing that occurred during the Notre Dame Bengal Bouts and Baraka Bouts, annual student boxing tournaments, the researchers established baselines for boxers using tests such as the Axon Sports Computerized Cognitive Assessment Tool (CCAT), the Sport Concussion Assessment Tool 2 (SCAT2) and the Notre Dame iPad-based reading and voice recording test.
During the 2012 Bengal Bouts, nine concussions (out of 125 participants) were confirmed by this new speech-based test and the University’s medical team. Separate tests of 80 female boxers were also conducted during the 2012 Baraka Bouts. Outcomes of the 2013 Bengal Bouts are currently being compared to the findings of the University medical team on approximately 130 male boxers.
The testing was done in cooperation with James Moriarity, the University’s chief sports medicine physician, who has developed a series of innovative concussion testing studies.

Researchers discover the brain origins of variation in pathological anxiety
New findings from nonhuman primates suggest that an overactive core circuit in the brain, and its interaction with other specialized circuits, accounts for the variability in symptoms shown by patients with severe anxiety. In a brain-imaging study published in the Proceedings of the National Academy of Sciences (PNAS), researchers from the University of Wisconsin School of Medicine and Public Health describe work that for the first time provides an understanding of the root causes of clinical variability in anxiety disorders.
Using a well-established nonhuman primate model of childhood anxiety, the scientists identified a core circuit that is chronically over-active in all anxious individuals, regardless of their particular pattern of symptoms. They also identified a set of more specialized circuits that are over- or under-active in individuals prone to particular symptoms, such as chronically high levels of the stress-hormone cortisol.
“These findings provide important new insights into altered brain functioning that explains why people with anxiety have such different symptoms and clinical presentations, and it also gives us new ideas, based on an understanding of altered brain function, for helping people with different types of anxiety,’’ says Ned Kalin, senior author, chair of Psychiatry and director of the HealthEmotions Research Institute.
“There is a large need for new treatment strategies, because our current treatments don’t work well for many anxious adults and children who come to us for help.”
In the study, key anxiety-related symptoms were measured in 238 young rhesus monkeys using behavioral and hormonal measurement procedures similar to those routinely used to assess extreme shyness in children. Young monkeys are ideally suited for these studies because of their similarities to humans in brain development and social behavior, Kalin notes. Variation in brain activity was quantified in the monkeys using positron emission tomography (PET) imaging, a method that is also used in humans.
Combining behavioral measures of shyness, physiological measures of the stress-hormone cortisol, and brain metabolic imaging, co-lead authors Alexander Shackman, Andrew Fox and their collaborators showed that a core neural system marked by elevated activity in the central nucleus of the amygdala was a consistent brain signature shared by young monkeys with chronically high levels of anxiety. This was true despite striking differences across monkeys in the predominance of particular anxiety-related symptoms.
The Wisconsin researchers also showed that young monkeys with particular anxiety profiles, such as high levels of shyness, showed changes in symptom-specific brain circuits. Finally, Shackman, Fox and colleagues uncovered evidence that the two kinds of brain circuits, one shared by all anxious individuals, the other specific to those with particular symptoms, work together to produce different presentations of pathological anxiety.
The new study builds upon earlier work by the Kalin laboratory demonstrating that activity in the amygdala is strongly shaped by early-life experiences, such as parenting and social interactions. They hypothesize that extreme anxiety stems from problems with the normal maturation of brain systems involved in emotional learning, which suggests that anxious children have difficulty learning to effectively regulate brain anxiety circuits. Taken together, this line of research sets the stage for improved strategies for preventing extreme childhood anxiety from blossoming into full-blown anxiety disorders.
“This means the amygdala is an extremely attractive target for new, broad-spectrum anxiety treatments,’’ says Shackman. “The central nucleus of the amygdala is a uniquely malleable substrate for anxiety, one that can help to trigger a wide range of symptoms.”
The work also suggests more specific brain targets for different symptom profiles. Such therapies could range from new, more selectively targeted medications to intensive therapies that seek to re-train the amygdala, ranging from conventional cognitive-behavioral therapies to training in mindfulness and other techniques, Shackman noted. To further understand the clinical significance of these observations, the laboratory is conducting a parallel study in young children suffering from anxiety disorders.
Re-programming cells into nerve cells directly in the brain
The field of cell therapy, which aims to form new cells in the body in order to cure disease, has taken another important step towards new treatments. A new report from researchers at Lund University in Sweden shows that it is possible to re-programme other cells to become nerve cells, directly in the brain.

Two years ago, researchers in Lund were the first in the world to re-programme human skin cells, known as fibroblasts, to dopamine-producing nerve cells – without taking a detour via the stem cell stage. The research group has now gone a step further and shown that it is possible to re-programme both skin cells and support cells directly to nerve cells, in place in the brain.
“The findings are the first important evidence that it is possible to re-programme other cells to become nerve cells inside the brain”, said Malin Parmar, research group leader and Reader in Neurobiology.
The researchers used genes designed to be activated or de-activated using a drug. The genes were inserted into two types of human cells: fibroblasts and glia cells – support cells that are naturally present in the brain. Once the researchers had transplanted the cells into the brains of rats, the genes were activated using a drug in the animals’ drinking water. The cells then began their transformation into nerve cells.
In a separate experiment on mice, where similar genes were injected into the mice’s brains, the research group also succeeded in re-programming the mice’s own glia cells to become nerve cells.
“The research findings have the potential to open the way for alternatives to cell transplants in the future, which would remove previous obstacles to research, such as the difficulty of getting the brain to accept foreign cells, and the risk of tumour development”, said Malin Parmar.
All in all, the new technique of direct re-programming in the brain could open up new possibilities to more effectively replace dying brain cells in conditions such as Parkinson’s disease.
“We are now developing the technique so that it can be used to create new nerve cells that replace the function of damaged cells. Being able to carry out the re-programming in vivo makes it possible to imagine a future in which we form new cells directly in the human brain, without taking a detour via cell cultures and transplants”, concluded Malin Parmar.
The research article is entitled ‘Generation of induced neurons via direct conversion in vivo’ and has been published in the Proceedings of the National Academy of Sciences (PNAS).
(Source: lunduniversity.lu.se)

Innate ability to vocalize: Deaf or not, courting male mice make same sounds
Scientists have long thought mice might be a model for how humans learn to vocalize. But new research led by scientists at Washington State University Vancouver has found that, unlike humans and songbirds, mice do not learn to vocalize.
The results, published in the Journal of Neuroscience, point the way to a more finely focused, genetic tool for teasing out the mysteries of speech and its disorders.
To see if mice learn to vocalize, WSU neurophysiologist Christine Portfors destroyed the ear hair cells in more than a dozen newborn male mice. The cells convert sound waves into electrical signals processed by the brain, making hearing possible.
The deaf mice were then raised with hearing mice in a normal social environment.
Portfors and her fellow researchers, including WSU graduate student Elena Mahrt, used males because they are particularly exuberant vocalizers in the presence of females.
"We can elicit vocalization behavior in males really easily by just putting them with a female,” Portfors said. "They vocalize like crazy.”
And it turned out that it didn’t matter if the mouse was deaf or not. The researchers catalogued essentially the same suite of ultrasonic sounds from both the deaf and hearing mice. “It means that they don’t need to hear to be able to produce their sounds, their vocalizations,” Portfors said. “Basically, they don’t need to hear themselves. They don’t need auditory feedback. They don’t need to learn.”
The finding means mice are out as a model to study vocal learning. However, scientists can now focus on the mouse to learn the genetic mechanism behind communication disorders.
"If you don’t have learning as a variable, you can look at the genetic control of these things,” Portfors said. "You can look at the genetic control of the output of the signal. It’s not messed up by an animal that’s been in a particular learning situation.”
(Image: Fotolia)
Brain Size Didn’t Drive Evolution, Research Suggests
Brain organization, not overall size, may be the key evolutionary difference between primate brains, and the key to what gives humans their smarts, new research suggests.
In the study, researchers looked at 17 species that span 40 million years of evolutionary time, finding changes in the relative size of specific brain regions, rather than changes in brain size, accounted for three-quarters of brain evolution over that time. The study, published today (March 26) in the Proceedings of the Royal Society B, also revealed that massive increases in the brain’s prefrontal cortex played a critical role in great ape evolution.
"For the first time, we can really identify what is so special about great ape brain organization," said study co-author Jeroen Smaers, an evolutionary biologist at University College London.
Is bigger better?
Traditionally, scientists have thought humans’ superior intelligence derived mostly from the fact that our brains are three times bigger than those of our nearest living relatives, chimpanzees.
But bigger isn’t always better. Bigger brains take much more energy to power, so scientists have hypothesized that brain reorganization could be a smarter strategy to evolve mental abilities.
To see how brain organization evolved throughout primates, Smaers and his colleague Christophe Soligo analyzed post-mortem slices of brains from 17 different primates, then mapped changes in brain size onto an evolutionary tree.
Over evolutionary time, several key brain regions increased in size relative to other regions. Great apes (especially humans) saw a rise in white matter in the prefrontal cortex, which contributes to social cognition, moral judgments, introspection and goal-directed planning.
"The prefrontal cortex is a little bit like the CEO of the brain," Smaers told LiveScience. "It takes information from other brain areas and it synthesizes them."
When great apes diverged from Old World monkeys about 20 million years ago, brain regions tied to motor planning also increased in relative size. That could have helped them orchestrate the complex movements needed to manipulate tools — possibly to get at different food sources, Smaers said.
Gibbons and howler monkeys showed a different pattern. Even though their bodies and their brains got smaller over time, the hippocampus, which plays a role in spatial tasks, tended to increase in size in relation to the rest of the brain. That may have allowed these primates to be spatially adept and inhabit a more diverse range of environments.
Prefrontal cortex
The study shows that specific parts of the brain can selectively scale up to meet the demands of new environments, said Chet Sherwood, an anthropologist at George Washington University, who was not involved in the study.
The finding also drives home the importance of the prefrontal cortex, he said.
"It’s very suggestive that connectivity of prefrontal cortex has been a particularly strong driving force in ape and human brains," Sherwood told LiveScience.
Cold sores and other infections may be linked to cognitive problems
The virus that causes cold sores, along with other viral or bacterial infections, may be associated with cognitive problems, according to a new study published in the March 26, 2013, print issue of Neurology®, the medical journal of the American Academy of Neurology.
The study found that people who have had higher levels of infection in their blood (measured by antibody levels), meaning they had been exposed over the years to various pathogens such as the herpes simplex type 1 virus that causes cold sores, were more likely to have cognitive problems than people with lower levels of infection in the blood. “We found the link was greater among women, those with lower levels of education and Medicaid or no health insurance, and most prominently, in people who do not exercise,” said author Mira Katan, MD, with the Northern Manhattan Study at Columbia University Medical Center in New York and a member of the American Academy of Neurology. The study was performed in collaboration with the Miller School of Medicine at the University of Miami in Miami, FL.
For the study, researchers tested thinking and memory in 1,625 people with an average age of 69 from northern Manhattan in New York. Participants gave blood samples that were tested for five common low-grade infections: three viruses (herpes simplex type 1, which causes oral infection; herpes simplex type 2, which causes genital infection; and cytomegalovirus), as well as Chlamydia pneumoniae, a common respiratory pathogen, and Helicobacter pylori, a bacterium found in the stomach.
The results showed that the people who had higher levels of infection had a 25 percent increase in the risk of a low score on a common test of cognition called the Mini-Mental State Examination.
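For readers unfamiliar with how such a figure is read, relative risk divides the rate of an outcome in an exposed group by the rate in an unexposed group. A minimal sketch, with invented counts rather than the study's actual data:

```python
def relative_risk(exposed_cases, exposed_total,
                  unexposed_cases, unexposed_total):
    """Risk of the outcome in the exposed group divided by the risk in the
    unexposed group; RR = 1.25 corresponds to a 25 percent higher risk."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# Invented counts for illustration: low MMSE scores among participants
# with high vs. low infection burden.
rr = relative_risk(100, 800, 80, 800)
print(f"RR = {rr:.2f}")  # RR = 1.25, i.e. a 25 percent increased risk
```

The study's actual estimate would additionally adjust for confounders such as age and education, which a raw two-by-two calculation like this does not.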
Memory and thinking skills were tested every year for an average of eight years, but infection was not associated with changes in these abilities over time.
“While this association needs to be further studied, the results could lead to ways to identify people at risk of cognitive impairment and eventually lower that risk,” said Katan. “For example, exercise and childhood vaccinations against viruses could decrease the risk for memory problems later in life.” The study was supported by the National Institute of Neurological Disorders and Stroke (NINDS), the Swiss National Science Foundation and the Leducq Foundation.

New urgency in battle against ‘bound legs’ disease
The harm done by konzo – a disease overshadowed by the war and drought it tends to accompany – goes beyond its devastating physical effects to impair children’s memory, problem solving and other cognitive functions.
Even children without physical symptoms of konzo appear to lose cognitive ability when exposed to the toxin that causes the disease, researchers report in the journal Pediatrics.
“That’s what’s especially alarming,” said lead author Michael Boivin, a Michigan State University associate professor of psychiatry and of neurology and ophthalmology. “We found subtle effects that haven’t been picked up before. These kids aren’t out of the woods, even if they don’t have the disease.”
Konzo means “bound legs” in the African Yaka language, a reference to how its victims walk with feet bent inward after the disease strips away motor control in their lower limbs. Its onset is rapid, and the damage is permanent.
People contract konzo by consuming poorly processed bitter cassava, a drought-resistant staple food in much of sub-Saharan Africa. Typically, the plant’s tuber is soaked for a few days, then dried in the sun and ground into flour – a process that degrades naturally occurring cyanide.
“As long as they do that, the food’s pretty safe,” said Boivin, who began studying konzo in 1990 as a Fulbright researcher in the Democratic Republic of Congo. “But in times of war, famine, displacement and hardship, people take shortcuts. If they’re subsisting on poorly processed cassava and they don’t have other sources of protein, it can cause permanent damage to the nervous system.
“Konzo doesn’t make many headlines because it usually follows other geopolitical aspects of human suffering,” he added. “Still, there are potentially tens of millions of kids at risk throughout central and western Africa. The public health scope is huge.”
To find out if the disease affects cognitive function, Boivin and colleagues from Oregon Health and Science University turned to the war-torn Congo. They randomly selected 123 children with konzo and 87 neighboring children who showed no signs of the disease but whose blood and urine samples indicated elevated levels of the toxin.
Using cognitive tests, the researchers found that children with konzo had a much harder time using working memory to solve problems and organize visual and spatial information.
They also found that konzo and non-konzo children from the outbreak area showed poor working memory and impaired fine-motor skills when compared to a reference group of children from a part of the region unaffected by the disease.
Konzo’s subtler impacts might seem minor compared to its striking physical symptoms, but Boivin noted that the cognitive damage is similar to that caused by chronic low-grade exposures to other toxic substances such as lead.
Scientists eventually may be able to prevent such damage by creating nontoxic cassava varieties and introducing other resilient crops to affected regions, Boivin said. Meanwhile, public health education programs are under way to help stop outbreaks.
“For now,” he said, “if we could just avoid the worst of it – the full-blown konzo disease that has such devastating effects for children and families – that’s a good start.”