Posts tagged brain

When food is scarce, a smaller brain will do
A new study explains how young brains are protected when nutrition is poor. The findings, published on March 7th in Cell Reports, a Cell Press publication, reveal a coping strategy for producing a fully functional, if smaller, brain. The discovery, which was made in larval flies, shows the brain as an incredibly adaptable organ and may have implications for understanding the developing human brain as well, the researchers say.
The key is a carefully timed developmental system that ultimately ensures neural diversity at the expense of neural numbers.
"In essence, this study reveals an adaptive strategy allowing the reduction of the number of neurons produced in the face of sub-optimal nutritional conditions, while preserving their diversity," said Cedric Maurange of Aix-Marseille Université in France. "This is a survival strategy permitting the developing brain to produce the minimal set of neurons necessary to be functional, at the minimum energetic cost."
Most of the neurons in the human brain are produced well before birth, as the developing fetus grows and changes in the womb. But how the young brain copes with adversity is an unresolved question. If a mother doesn’t have enough food to eat, what happens to the brain of her baby?
To find out, Maurange and his colleagues looked to the fruit fly, a workhorse of biology. The much shorter lifespan of fruit flies means that they reach the equivalent of toddlerhood in just four days’ time.
Their developmental studies in the fly visual system reveal an early sensitivity to the availability of amino acids, ingredients that are the building blocks of proteins. They found that a fly with all the amino acids it needs ends up with a larger pool of neural stem cells than one lacking those nutrients. Later, when those neural stem cells start to produce the many different types of neurons, that nutrient sensitivity goes away. The end result is a brain that is functional but smaller. In some flies, the optic lobe contained 40 percent fewer neurons and still worked.
"We were surprised to realize that the optic lobe can have such a drastically reduced number of neurons under dietary restriction and yet remains functional," Maurange said.
The findings may help to explain well-documented patterns of brain growth in humans. The human brain is protected over other organs when nutrients are lacking late in fetal development, producing a brain that is large relative to organs such as the pancreas or intestine. But when nutrients are limited early in fetal development, the brain remains small along with the rest of the body. Those growth patterns are known as asymmetric and symmetric intrauterine growth restriction (IUGR), respectively.
"Our work suggests new avenues to investigate how early nutrient restriction affects mammalian brain development and may help in understanding the mechanisms underlying symmetric and asymmetric IUGR in humans," Maurange said.
Star-Shaped Glial Cells Act as the Brain’s “Motherboard”
The transistors and wires that power our electronic devices need to be mounted on a base material known as a “motherboard.” Our human brain is not so different — neurons, the cells that transmit electrical and chemical signals, are connected to one another through synapses, similar to transistors and wires, and they need a base material too.
But the cells serving that function in the brain may have other functions as well. PhD student Maurizio De Pittà of Tel Aviv University’s Schools of Physics and Astronomy and Electrical Engineering says that astrocytes, the star-shaped glial cells that are predominant in the brain, not only control the flow of information between neurons but also connect different neuronal circuits in various regions of the brain.
Using models designed to mimic brain signalling, De Pittà’s research, led by his TAU supervisor Prof. Eshel Ben-Jacob, determined that astrocytes are actually “smart” in addition to practical: they integrate all the different messages being transferred through the neurons and multiplex them to the brain’s circuitry. Published in the journal Frontiers in Computational Neuroscience and sponsored by the Italy-Israel Joint Neuroscience Lab, this research introduces a new framework for making sense of brain communications — aiding our understanding of the diseases and disorders that impact the brain.
Transcending boundaries
"Many pathologies are related to malfunctions in brain connectivity," explains Prof. Ben-Jacob, citing epilepsy as one example. "Diagnosis and the development of therapies rely on understanding the network of the brain and the source of undesirable activity."
Connectivity in the brain has traditionally been defined as point-to-point connections between neurons, facilitated by synapses. Astrocytes serve a protective function by encasing neurons and forming borders between different areas of the brain. These cells also transfer information more slowly, says Prof. Ben-Jacob — one-tenth of a second compared to one-thousandth of a second in neurons — producing signals that carry larger amounts of information over longer distances. Astrocytes can transfer information regionally or spread it to different areas throughout the brain — connecting neurons in a different manner than conventional synapses.
De Pittà and his fellow researchers developed computational models to look at the different aspects of brain signalling, such as neural network electrical activity and signal transfer by synapses. In the course of their research, they discovered that astrocytes actually take an active role in the way these signals are distributed, confirming theories put forth by leading experimental scientists.
Astrocytes form networks in addition to those of the neurons and synapses, operating simultaneously to co-ordinate information from different regions of the brain — much like a motherboard functions in a computer, or a conductor ensuring that the entire orchestra is working in harmony, explains De Pittà.
These findings should encourage neuroscientists to think beyond neuron-based networks and adopt a more holistic view of the brain, he suggests, noting that the two communication systems are actually interconnected, and the breakdown of one can certainly impact the other. And what may seem like damage in one small area could actually be carried to larger regions.
A break in communication
According to Prof. Ben-Jacob, a full understanding of the way the brain sends messages is significant beyond satisfying pure scientific curiosity. Many diseases and disorders are caused by an irregularity in the brain’s communication system or by damage to the glial cells, so more precise information on how the network functions can help scientists identify the cause or location of a breakdown and develop treatments to overcome the damage.
In the case of epilepsy, for example, the networks frequently become overexcited. Alzheimer’s disease and other memory disorders are characterized by a loss of cell-to-cell connection. Further understanding brain connectivity can greatly aid research into these and other brain-based pathologies.

"Use it or lose it." The saying could apply especially to the brain when it comes to protecting against Alzheimer’s disease. Previous studies have shown that keeping the mind active, exercising and social interactions may help delay the onset of dementia in Alzheimer’s disease.
Now, a new study led by Dennis Selkoe, MD, co-director of the Center for Neurologic Diseases in the Brigham and Women’s Hospital (BWH) Department of Neurology, provides specific pre-clinical scientific evidence supporting the concept that prolonged and intensive stimulation by an enriched environment, especially regular exposure to new activities, may have beneficial effects in delaying one of the key negative factors in Alzheimer’s disease.
The study will be published online on March 6, 2013 in Neuron.
Alzheimer’s disease occurs when a protein called amyloid beta accumulates and forms “senile plaques” in the brain. This protein accumulation can block nerve cells in the brain from properly communicating with one another. This may gradually lead to an erosion of a person’s mental processes, such as memory, attention, and the ability to learn, understand and process information.
The BWH researchers used a wild-type mouse model when evaluating how the environment might affect Alzheimer’s disease. Unlike other pre-clinical models used in Alzheimer’s disease research, wild-type mice tend to more closely mimic the scenario of average humans developing the disease under normal environmental conditions, rather than being strongly genetically pre-disposed to the disease.
Selkoe and his team found that prolonged exposure to an enriched environment activated certain adrenalin-related brain receptors which triggered a signaling pathway that prevented amyloid beta protein from weakening the communication between nerve cells in the brain’s “memory center,” the hippocampus. The hippocampus plays an important role in both short- and long-term memory.
The ability of an enriched, novel environment to prevent amyloid beta protein from affecting the signaling strength and communication between nerve cells was seen in both young and middle-aged wild-type mice.
"This part of our work suggests that prolonged exposure to a richer, more novel environment beginning even in middle age might help protect the hippocampus from the bad effects of amyloid beta, which builds up to toxic levels in one hundred percent of Alzheimer patients," said Selkoe.
Moreover, the scientists found that exposing the brain to novel activities in particular provided greater protection against Alzheimer’s disease than did just aerobic exercise. According to the researchers, this observation may be due to stimulation that occurred not only physically, but also mentally, when the mice moved quickly from one novel object to another.
"This work helps provide a molecular mechanism for why a richer environment can help lessen the memory-eroding effects of the build-up of amyloid beta protein with age," said Selkoe. "They point to basic scientific reasons for the apparent lessening of AD risk in people with cognitively richer and more complex experiences during life."
A region of the brain known to play a key role in visual and spatial processing has a parallel function: sorting visual information into categories, according to a new study by researchers at the University of Chicago.
Primates are known to have a remarkable ability to place visual stimuli into familiar and meaningful categories, such as fruit or vegetables. They can also direct their spatial attention to different locations in a scene and make spatially-targeted movements, such as reaching.
The study, published in the March issue of Neuron, shows that these very different types of information can be simultaneously encoded within the posterior parietal cortex. The research brings scientists a step closer to understanding how the brain interprets visual stimuli and solves complex tasks.
“We found that multiple functions can be mapped onto a particular region of the brain and even onto individual brain cells in that region,” said study author David Freedman, PhD, assistant professor of neurobiology at the University of Chicago. “These functions overlap. This particular brain area, even its individual neurons, can independently encode both spatial and cognitive signals.”
Freedman studies the effects of learning on the brain and how information is stored in short-term memory, with a focus on the areas that process visual stimuli. To examine this phenomenon, he has taught monkeys to play a simple video game in which they learn to assign moving visual patterns into categories.
“The task is a bit like a baseball umpire calling balls and strikes,” he said, “since the monkeys have to sort the various motion patterns into two groups, or categories.”
The monkeys master the tasks over a few weeks of training. Once they do, the researchers record electrical signals from parietal lobe neurons while the subjects perform the categorization task. By measuring electrical activity patterns of these neurons, the researchers can decode the information conveyed by the neurons’ activity.
“The activity patterns in these parietal neurons carry strong information about the category that each motion pattern gets assigned to during the task,” Freedman said.
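The study's decoding methods are not spelled out here, but the core idea — predicting the stimulus category a trial belongs to from a population of firing rates — can be sketched with a simple nearest-centroid decoder on synthetic data. Everything below (neuron counts, tuning strengths, noise levels) is invented for illustration and is not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 40 trials x 20 neurons, two motion categories.
# Category-selective neurons fire at different mean rates per category.
n_neurons, n_trials = 20, 40
labels = rng.integers(0, 2, n_trials)              # category 0 or 1 per trial
tuning = rng.normal(0, 3, (2, n_neurons))          # per-category mean rates
rates = tuning[labels] + rng.normal(0, 1, (n_trials, n_neurons))

# Nearest-centroid decoder: fit on half the trials, test on the rest.
train, test = np.arange(0, n_trials, 2), np.arange(1, n_trials, 2)
centroids = np.stack([rates[train][labels[train] == c].mean(axis=0)
                      for c in (0, 1)])
dists = np.linalg.norm(rates[test][:, None, :] - centroids[None], axis=2)
predicted = dists.argmin(axis=1)

accuracy = (predicted == labels[test]).mean()
print(f"decoding accuracy: {accuracy:.2f}")        # well above 0.5 chance
```

If the neurons carry category information, held-out trials decode well above the 50 percent chance level — which is the sense in which the parietal activity patterns "carry strong information" about category.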
Solving the ‘Cocktail Party Problem’
Many smartphones claim to filter out background noise, but they’ve got nothing on the human brain. We can tune in to just one speaker at a noisy cocktail party with little difficulty—an ability that has been a scientific mystery since the early 1950s. Now, researchers argue that the competing noise of other partygoers is filtered out in the brain before it reaches regions involved in higher cognitive functions, such as language and attention control. Their experiments were the first to demonstrate this process.
The scientists didn’t do anything as social as attend a noisy party. Instead, Charles Schroeder, a psychiatrist at the Columbia University College of Physicians and Surgeons in New York City, and colleagues recorded the brain activity of six people with intractable epilepsy who required brain surgery. In order to identify the part of their brains responsible for seizures, the patients underwent 1 to 4 weeks of observation through electrocorticography (ECoG), a technique that provides precise neural recordings via electrodes placed directly on the surface of the brain. Schroeder and his team, using the ECoG data, conducted their experiments during this time.
The researchers showed the patients two videos simultaneously, each of a person telling a 9- to 12-second story; they were asked to concentrate on just one speaker. To determine which neural recordings corresponded to the “ignored” and “attended” speech, the team reconstructed speech patterns from the brain’s electrical activity using a mathematical model. The scientists then matched the reconstructed patterns with the original patterns coming from the ignored and attended speakers.
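The team's exact mathematical model is not given in this summary, but linear stimulus reconstruction — fitting a map from the recorded channels back to the speech envelope, then seeing which speaker it matches — is a standard approach to this problem. Here is a minimal ridge-regression sketch on synthetic data; all signals, channel counts, and noise levels are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: a smoothed speech "envelope" drives 8 recording channels
# linearly, plus noise; we then fit the inverse (decoding) map.
T, n_ch = 500, 8
envelope = np.convolve(rng.normal(size=T), np.ones(10) / 10, mode="same")
weights = rng.normal(size=n_ch)
neural = envelope[:, None] * weights + rng.normal(0, 0.5, (T, n_ch))

# Ridge regression: g = (R'R + lam*I)^-1 R' s, then s_hat = R @ g
lam = 1.0
g = np.linalg.solve(neural.T @ neural + lam * np.eye(n_ch),
                    neural.T @ envelope)
reconstructed = neural @ g

# Correlate the reconstruction with the true envelope, as one would
# compare it against the attended vs. ignored speaker's speech.
r = np.corrcoef(reconstructed, envelope)[0, 1]
print(f"reconstruction correlation: {r:.2f}")
```

Matching the reconstructed pattern against each candidate speaker's original envelope, and keeping the better correlation, is the basic logic of deciding which speech a brain region represents.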
The patients’ brains had registered both attended and ignored speech, though they showed some preference for the attended speech, the researchers report online in Neuron. Because the researchers were able to record several regions of the patients’ brains, they saw that regions associated with “higher-order” abilities—like the inferior frontal cortex, which is involved with language—had only representations of attended speech. Moreover, this representation of attended speech improved as the speaker’s story unfolded. These findings support a continuous model of attention—called the “selective entrainment hypothesis”—in which the brain tracks and becomes increasingly selective to a particular voice.
The research supports the selective entrainment hypothesis, agrees Jason Bohland, director of Boston University’s Quantitative Neuroscience Laboratory, but it “doesn’t necessarily tell us how that happens. That’s a really hard question, and is still left very much up in the air.”
Though a technology less-invasive than ECoG would be needed, Bohland and Schroeder agree that this research could help provide good clinical markers for people with certain social disorders. People with attention deficit disorder, for example, may struggle in tracking specific voices or filtering out unwanted neural representations of sounds. And those problems should be represented in their brain activity.
Schroeder explained that this study was a part of a new wave of research that aims to “approximate a map of the total brain circuit that’s involved in [complex] things like speech and music perception, which people consider—rightly or wrongly—to be uniquely human.”
Flip of a single molecular switch makes an old brain young
The flip of a single molecular switch helps create the mature neuronal connections that allow the brain to bridge the gap between adolescent impressionability and adult stability. Now Yale School of Medicine researchers have reversed the process, recreating a youthful brain that facilitated both learning and healing in the adult mouse.
Scientists have long known that the young and old brains are very different. Adolescent brains are more malleable or plastic, which allows them to learn languages more quickly than adults and speeds recovery from brain injuries. The comparative rigidity of the adult brain results in part from the function of a single gene that slows the rapid change in synaptic connections between neurons.
By monitoring the synapses in living mice over weeks and months, Yale researchers have identified the key genetic switch for brain maturation in a study released March 6 in the journal Neuron. The Nogo Receptor 1 gene is required to suppress high levels of plasticity in the adolescent brain and create the relatively quiescent levels of plasticity in adulthood. In mice without this gene, juvenile levels of brain plasticity persist throughout adulthood. When researchers blocked the function of this gene in old mice, they reset the old brain to adolescent levels of plasticity.
“These are the molecules the brain needs for the transition from adolescence to adulthood,” said Dr. Stephen Strittmatter, Vincent Coates Professor of Neurology, Professor of Neurobiology and senior author of the paper. “It suggests we can turn back the clock in the adult brain and recover from trauma the way kids recover.”
Rehabilitation after brain injuries like strokes requires that patients re-learn tasks such as moving a hand. Researchers found that adult mice lacking Nogo Receptor recovered from injury as quickly as adolescent mice and mastered new, complex motor tasks more quickly than adults with the receptor.
“This raises the potential that manipulating Nogo Receptor in humans might accelerate and magnify rehabilitation after brain injuries like strokes,” said Feras Akbik, Yale doctoral student who is first author of the study.
Researchers also showed that Nogo Receptor slows loss of memories. Mice without Nogo receptor lost stressful memories more quickly, suggesting that manipulating the receptor could help treat post-traumatic stress disorder.
“We know a lot about the early development of the brain,” Strittmatter said, “But we know amazingly little about what happens in the brain during late adolescence.”
Last month, the National Institutes of Health announced a new collaborative initiative that aims to accelerate the search for biomarkers — changes in the body that can be used to predict, diagnose or monitor a disease — in Parkinson’s disease, in part by improving collaboration among researchers and helping patients get involved in clinical studies. As part of this program, launched by the National Institute of Neurological Disorders and Stroke (NINDS), part of the NIH, Clemens Scherzer, MD, a neurologist and researcher at Brigham and Women’s Hospital (BWH), was awarded $2.6 million over five years to work on the development of biomarkers and facilitate NINDS-wide access to one of the largest data and biospecimen banks in the world for Parkinson’s, available at BWH. This NINDS initiative is highlighted in an editorial in the March issue of Lancet Neurology.
"There is a critical gap in the research that leads to lack of treatment for diseases like Parkinson’s," said Scherzer. "Biomarkers are desperately needed to make clinical trials more efficient, less expensive and to monitor disease and treatment response. We are hopeful that this initiative will fast track new discoveries in this area."
According to Scherzer, most of our knowledge of the human brain is based on the analysis of just 1.5 percent of the human genome that encodes proteins. The first part of Scherzer’s project will examine the function of the remaining 98.5 percent of the genome that, so far, has been unexplored in the human brain. While this remainder had been previously dismissed as “junk”, it is now becoming clearer that parts of it actively regulate cell biology. Scherzer and colleagues believe that “dark matter” RNA transcribed from stretches of so called “junk” DNA is active in brain cells and contributes to the complexity of normal dopamine neurons and, when corrupted, Parkinson’s disease.
"This offers a potentially ground-breaking opportunity for biomarker development. Initially, the team will search for these RNAs in brain tissue of individuals at the earliest stages of the disease. Then, this team will look for related biomarkers in the bloodstream and cerebrospinal fluid in both healthy brains and those with Parkinson’s," Scherzer said.
Scherzer’s lab has been spearheading biomarker research in this field since 2004 and the team already has 2,000 patients enrolled and being followed in a longitudinal study with rich clinical data and one of the largest biobanks in the world for Parkinson’s tissue with support from the Harvard NeuroDiscovery Center. The biobank was designed as an incubator for Parkinson’s research and until now was chiefly available for research collaborations within the Harvard-affiliated community. As part of this new project, this vast resource will be open to all NIH-funded investigators.
"Our ultimate goal is to personalize treatment for our patients with Parkinson’s," said Scherzer. "By opening up this vast collection of specimens, we are exploding the resources that are available to NIH-funded investigators looking at this disease. We hope to harness the power of collaboration to speed up biomarker discovery."
(Source: brighamandwomens.org)
Computer Model May Help Athletes and Soldiers Avoid Brain Damage and Concussions
Concussions can occur in sports and in combat, but health experts do not know precisely which jolts, collisions and awkward head movements during these activities pose the greatest risks to the brain. To find out, Johns Hopkins engineers have developed a powerful new computer-based process that helps identify the dangerous conditions that lead to concussion-related brain injuries. This approach could lead to new medical treatment options and some sports rule changes to reduce brain trauma among players.
The research comes at a time when greater attention is being paid to assessing and preventing the head injuries sustained by both soldiers and athletes. Some kinds of head injuries are difficult to see with standard diagnostic imaging but can have serious long-term consequences. Concussions, once dismissed as a short-term nuisance, have more recently been linked to serious brain disorders.
“Concussion-related injuries can develop even when nothing has physically touched the head, and no damage is apparent on the skin,” said K. T. Ramesh, the Alonzo G. Decker Jr. Professor of Science and Engineering who led the research at Johns Hopkins. “Think about a soldier who is knocked down by the blast wave of an explosion, or a football player reeling after a major collision. The person may show some loss of cognitive function, but you may not immediately see anything in a CT-scan or MRI that tells you exactly where and how much damage has been done to the brain. You don’t know what happened to the brain, so how do you figure out how to treat the patient?”
To help doctors answer this question, Ramesh led a team that used a powerful technique called diffusion tensor imaging, together with a computer model of the head, to identify injured axons, which are tiny but important fibers that carry information from one brain cell to another. These axons are concentrated in a kind of brain tissue known as “white matter,” and they appear to be injured during the so-called mild traumatic brain injury associated with concussions. Ramesh’s team has shown that the axons are injured most easily by strong rotations of the head, and the researchers’ process can calculate which parts of the brain are most likely to be injured during a specific event.
The team described its new technique in the Jan. 8 edition of the Journal of Neurotrauma. The lead author, Rika M. Wright, played a major role in the research while completing her doctoral studies in Johns Hopkins’ Whiting School of Engineering, supervised by Ramesh. Wright is now a postdoctoral research fellow at Carnegie Mellon University. Ramesh is continuing to conduct research using the technique at Johns Hopkins with support from the National Institutes of Health.
Beyond its use in evaluating combat and sports-related injuries, the work could have wider applications, such as detecting axonal damage among patients who have received head injuries in vehicle accidents or serious falls. “This is the kind of injury that may take weeks to manifest,” Ramesh said. “By the time you assess the symptoms, it may be too late for some kinds of treatment to be helpful. But if you can tell right away what happened to the brain and where the injury is likely to have occurred, you may be able to get a crucial head-start on the treatment.”
Is it a Stroke or Benign Dizziness? A Simple Bedside Test Can Tell
A bedside electronic device that measures eye movements can successfully determine whether the cause of severe, continuous, disabling dizziness is a stroke or something benign, according to results of a small study led by Johns Hopkins Medicine researchers.
"Using this device can directly predict who has had a stroke and who has not," says David Newman-Toker, M.D., Ph.D., an associate professor of neurology and otolaryngology at the Johns Hopkins University School of Medicine and leader of the study described in the journal Stroke. “We’re spending hundreds of millions of dollars a year on expensive stroke work-ups that are unnecessary, and probably missing the chance to save tens of thousands of lives because we aren’t properly diagnosing their dizziness or vertigo as stroke symptoms.”
Newman-Toker says if additional larger studies confirm these results, the device could one day be the equivalent of an electrocardiogram (EKG), a simple noninvasive test routinely used to rule out heart attack in patients with chest pain. And, he adds, universal use of the device could “virtually eliminate deaths from misdiagnosis and save a lot of time and money.”
To distinguish stroke from a more benign condition, such as vertigo linked to an inner ear disturbance, specialists typically use three eye movement tests that are essentially a stress test for the balance system. In the hands of specialists, these bedside clinical tests (without the device) have been shown in several large research studies to be extremely accurate — “nearly perfect, and even better than immediate MRI,” says Newman-Toker. One of those tests, known as the horizontal head impulse test, is the best predictor of stroke. To perform it, doctors or technicians ask patients to look at a target on the wall and keep their eyes on the target as doctors move the patients’ heads from side to side. But, says Newman-Toker, it requires expertise to determine whether a patient is making the fast corrective eye adjustments that would indicate a benign form of dizziness as opposed to a stroke.
For the new study, researchers instead performed the same test using a small, portable device — a video-oculography machine that detects minute eye movements that are difficult for most physicians to notice. The machine includes a set of goggles, akin to swimming goggles, with a USB-connected webcam and an accelerometer in the frame. The webcam is hooked up to a laptop where a continuous picture of the eye is taken. Software interprets eye position based on movements and views of the pupil, while the accelerometer measures the speed of the movement of the head.
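The device's software is not described in detail here, but the quantity it extracts from the head impulse test can be illustrated: compare eye velocity against head velocity to estimate the gain of the vestibulo-ocular reflex. The toy sketch below (every number is invented, and this is not the actual device's algorithm) shows the kind of computation involved:

```python
import numpy as np

# Toy head-impulse-test analysis. A healthy vestibulo-ocular reflex (VOR)
# rotates the eyes opposite the head with a gain near 1.0; a low gain
# plus corrective "catch-up" saccades suggests a benign inner-ear
# disturbance rather than a stroke.
t = np.linspace(0.0, 0.15, 150)                 # a 150 ms head impulse
head_vel = 200.0 * np.sin(np.pi * t / 0.15)     # head velocity, deg/s
vor_gain = 0.4                                  # simulate an impaired reflex
eye_vel = -vor_gain * head_vel                  # eyes counter-rotate weakly
eye_vel[120:135] -= 150.0                       # late corrective saccade

# Gain estimate: ratio of total eye rotation to total head rotation.
gain = -eye_vel.sum() / head_vel.sum()
print(f"estimated VOR gain: {gain:.2f}")        # well below the healthy ~1.0
```

The subtle catch-up saccades in the eye-velocity trace are exactly the events that are hard for an unaided examiner to see, which is why the accelerometer-plus-webcam recording matters.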
Newman-Toker says the test could be easily employed to prevent misdiagnosis of as many as 100,000 strokes a year, leading to earlier stroke diagnosis and more efficient triage and treatment decisions for patients with disabling dizziness. Overlooked strokes mean delayed or missed treatments that lead to roughly 20,000 to 30,000 preventable deaths or disabilities a year, he says. The technology, he adds, could someday be used in a smartphone application to enable wider access to a quick and accurate diagnosis of strokes whose main symptom is dizziness, as opposed to one-sided weakness or garbled speech.
The diagnosis of stroke in patients with severe dizziness, vomiting, difficulty walking and intolerance to head motion is difficult, Newman-Toker says. He estimates there are 4 million emergency department visits annually in the United States for dizziness or vertigo, at least half a million of which involve patients at high risk for stroke. The most common causes are benign inner ear conditions, but many emergency room doctors, Newman-Toker says, find it nearly impossible to tell the difference between the benign conditions and something more serious, such as a stroke. So they often rely on brain imaging, usually a CT scan, an expensive and inaccurate technology for this particular diagnosis.
The Hopkins-led study enrolled 12 patients at The Johns Hopkins Hospital and the University of Illinois College of Medicine at Peoria. Six were diagnosed with stroke and six with a benign condition using video-oculography; confirmatory MRI later verified all 12 diagnoses.

Age-Related Dementia May Begin with Neurons’ Inability to Rid Themselves of Unwanted Proteins
A team of European scientists from the University Medical Center Hamburg-Eppendorf (UKE) and the Cologne Excellence Cluster on Cellular Stress Responses in Aging-Associated Diseases (CECAD) at the University of Cologne in Germany has taken an important step closer to understanding the root cause of age-related dementia. In research involving both worms and mice, they have found that age-related dementia is likely the result of a declining ability of neurons to dispose of unwanted aggregated proteins. As protein disposal becomes significantly less efficient with increasing age, the buildup of these unwanted proteins ultimately leads to the development and progression of dementia. This research appears in the March 2013 issue of the journal Genetics.
“By studying disease progression in dementia, specifically by focusing on mechanisms neurons use to dispose of unwanted proteins, we show how these are interconnected and how these mechanisms deteriorate over time,” said Markus Glatzel, M.D., a researcher involved in the work from the Institute of Neuropathology at UKE in Hamburg, Germany. “This gives us a better understanding as to why dementias affect older persons; the ultimate aim is to use these insights to devise novel therapies to restore the full capacity of protein disposal in aged neurons.”
To make this discovery, scientists carried out their experiments in both worm and mouse models that had a genetically-determined dementia in which the disease was caused by protein accumulation in neurons. In the worm model, researchers in the lab of Thorsten Hoppe, Ph.D., from the CECAD Cluster of Excellence could inactivate distinct routes used for the disposal of the unwanted proteins. Results provided valuable insight into the mechanisms that neurons use to cope with protein accumulation. These pathways were then assessed in young and aged mice. This study provides an explanation of why dementias exponentially increase with age. Additionally, neuron protein disposal methods may offer a therapeutic target for the development of drugs to treat and/or prevent dementias.
“This is an exciting study that helps us understand what’s going wrong at a cellular level in age-related dementias,” said Mark Johnston, Ph.D., Editor-in-Chief of the journal Genetics. “This research holds possibilities for future identification of substances that can prevent, stop, or reverse this cellular malfunction in humans.”