Posts tagged neuroscience

Scientists have identified a set of 10 proteins in the blood that can predict the onset of Alzheimer’s, marking a significant step towards developing a blood test for the disease. The study, led by King’s College London and UK proteomics company Proteome Sciences plc, analysed over 1,000 individuals and is the largest of its kind to date.

There are currently no effective long-lasting drug treatments for Alzheimer’s, and it is believed that many new clinical trials fail because drugs are given too late in the disease process. A blood test could be used to identify patients in the early stages of memory loss for clinical trials to find drugs to halt the progression of the disease.
The study, published in Alzheimer’s & Dementia: The Journal of the Alzheimer’s Association, is the result of an international collaboration led by King’s College London and Proteome Sciences plc, funded by Alzheimer’s Research UK, the UK Medical Research Council, the National Institute for Health Research (NIHR) Maudsley Biomedical Research Centre and Proteome Sciences.
The researchers used data from three international studies. Blood samples from a total of 1,148 individuals (476 with Alzheimer’s disease; 220 with ‘Mild Cognitive Impairment’ (MCI) and 452 elderly controls without dementia) were analysed for 26 proteins previously shown to be associated with Alzheimer’s disease. A sub-group of 476 individuals across all three groups also had an MRI brain scan.
The researchers found that 16 of these 26 proteins were strongly associated with brain shrinkage in either MCI or Alzheimer’s. They then ran a second series of tests to establish which of these proteins could predict the progression from MCI to Alzheimer’s. They identified a combination of 10 proteins capable of predicting whether individuals with MCI would develop Alzheimer’s disease within a year, with an accuracy of 87 percent.
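Accuracy here means the fraction of MCI individuals whose one-year outcome (progression to Alzheimer’s or not) the protein panel classified correctly. A minimal sketch of that calculation, using hypothetical confusion-matrix counts that are not from the study:

```python
def accuracy(tp: int, tn: int, fp: int, fn: int) -> float:
    """Fraction of all predictions that were correct:
    (true positives + true negatives) / all cases."""
    return (tp + tn) / (tp + tn + fp + fn)

# Hypothetical counts for 100 MCI individuals, chosen only to
# illustrate how an 87% accuracy figure is computed.
acc = accuracy(tp=30, tn=57, fp=8, fn=5)  # (30 + 57) / 100 = 0.87
```

Note that accuracy alone hides the balance between missed progressors (false negatives) and false alarms (false positives), which is why the authors also plan further validation to reduce the risk of misdiagnosis.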
Dr Abdul Hye, lead author of the study from the Institute of Psychiatry at King’s College London, said: “Memory problems are very common, but the challenge is identifying who is likely to develop dementia. There are thousands of proteins in the blood, and this study is the culmination of many years’ work identifying which ones are clinically relevant. We now have a set of 10 proteins that can predict whether someone with early symptoms of memory loss, or mild cognitive impairment, will develop Alzheimer’s disease within a year, with a high level of accuracy.”
Professor Simon Lovestone, senior author of the study from the University of Oxford, who led the work whilst at King’s, said: “Alzheimer’s begins to affect the brain many years before patients are diagnosed with the disease. Many of our drug trials fail because by the time patients are given the drugs, the brain has already been too severely affected. A simple blood test could help us identify patients at a much earlier stage to take part in new trials and hopefully develop treatments which could prevent the progression of the disease. The next step will be to validate our findings in further sample sets, to see if we can improve accuracy and reduce the risk of misdiagnosis, and to develop a reliable test suitable to be used by doctors.”
Dr Eric Karran, Director of Research at Alzheimer’s Research UK, the UK’s leading dementia research charity, said: “As the onset of Alzheimer’s is often slow and subtle, a blood test to identify those at high risk of the disease at an early stage would be of real value. Detecting the first signs of Alzheimer’s could improve clinical trials for new treatments and help those already concerned about their memory, but we’re not currently in a position to use such a test to screen the general population.
“With an ageing population, and age the biggest risk factor for Alzheimer’s, we are expecting rising numbers of people to be affected over the coming years. It’s important to develop new ways to intervene early in the disease to help people maintain their quality of life for as long as possible.”
Dr Ian Pike, co-author of the paper from Proteome Sciences, said: “By linking the best British academic and commercial research, this landmark study in Alzheimer’s disease is a major advance in the development of a simple blood test to identify the disease before clinical symptoms appear. This is the window that will offer the best chance of successful treatment. Equally important, a blood test will be considerably easier and less expensive than using brain imaging or cerebrospinal fluid.
“We are in the process of selecting commercial partners to combine the protein biomarkers in a blood test for the global market, a key step forward to deliver effective and early treatment for this crippling disease.”
Alzheimer’s disease is the most common form of dementia. Globally, it is estimated that 135 million people will have dementia by 2050. In 2010, the annual global cost of dementia was estimated at $604 billion. MCI includes problems with day-to-day memory, language and attention, and can be an early sign of dementia, or a symptom of stress or anxiety. Approximately 10% of people diagnosed with MCI develop dementia within a year, but apart from regular assessments to measure memory decline, there is currently no accurate way of predicting who will, or won’t, develop dementia.
Previous studies have also shown that PET brain scans and cerebrospinal fluid obtained by lumbar puncture can be used to predict the onset of dementia from MCI. However, PET imaging is highly expensive and lumbar punctures are invasive.
(Source: kcl.ac.uk)
The protein that is mutated in Huntington’s disease is critical for wiring the brain in early life, according to a new Duke University study.

(Image caption: The protein associated with Huntington’s disease, Htt, is critical in early brain development. Brains of 5-week-old mice whose Htt was deleted show signs of cellular stress — reactive astrocytes (green) and microglia (white and red) and faulty connections — in brain circuits that have already been linked to the disease. Credit: Spencer McKinstry)
Huntington’s disease is a progressive neurodegenerative disorder that causes a wide variety of symptoms, such as uncontrolled movements, inability to focus or remember, depression and aggression. By the time these symptoms appear, usually in middle age, the disease has already ravaged the brain.
The new findings, published July 9 in the Journal of Neuroscience, add to growing evidence that Huntington’s and other neurodegenerative disorders, such as Alzheimer’s disease, may take root during development, said lead author Cagla Eroglu, an assistant professor of cell biology in the Duke University Medical School, and member of the Duke Institute for Brain Sciences.
“The study is exciting because it means that, if we understand what these developmental errors are, we may be able to interfere with the first stage of the disease, before it shows itself,” Eroglu said.
Several years ago, Eroglu and her team were looking for molecular players involved in the formation of new connections, or synapses, in early brain development in mice when their studies unexpectedly hit on the huntingtin (Htt) protein, which is present throughout the body and which forms clumps in the brain cells of people with Huntington’s disease.
“(Htt) had been implicated in certain cellular functions and synaptic dysfunction in Huntington’s, but the possibility that Htt is playing a direct role in synapse formation was not explored,” Eroglu said.
To understand the protein’s role as synapses form, the scientists created mice in which Htt is deleted only in the cortex, a part of the brain that is implicated in the disease and that controls perception, memory and thought.
At three weeks of age (roughly similar to the first two years of human life), a time when a mouse begins to take in its surroundings through its eyes and ears, the synapses of the mutant mice formed more rapidly compared with those of healthy mice, the scientists found.
But by five weeks, when some synapses typically strengthen while others weaken in a normal process called pruning, the synapses had completely deteriorated in the mutant mice. In collaboration with another Duke researcher, Henry Yin, an assistant professor in psychology & neuroscience, the team also investigated the changes in synaptic function in these mutant mice and found severe alterations of the synaptic physiology.
Not only did the researchers see faulty circuits in the mice missing cortical Htt, they also saw signs of cellular stress in the brain, in the exact spot within the cortex that projects to the striatum, another brain area targeted by Huntington’s disease in people. “There’s something about that particular circuit that is vulnerable to changes in Htt,” Eroglu said.
The researchers also examined what happens in early brain development in a mouse model of Huntington’s disease. Similar to people with the disease, these animals have one normal copy of the Htt gene, and one mutated copy, which produces a protein that is present in cells but in expanded form.
The researchers found the same pattern: the Huntington’s disease model animals have synapses that initially mature much faster than normal in the cortex and then die off.
The new results also suggest that missing Htt for a prolonged period may not only affect the development but also the maintenance of healthy synapses, Eroglu said.
That’s especially relevant to a current strategy for treating Huntington’s disease: dialing down Htt levels in the brain using gene therapy or small-molecule inhibitors. But it has been a challenge to target the mutated copy of the gene, not the normal copy. Interested in the implications of lowering overall Htt levels, the group plans to delete Htt in the mouse brain later in life and measure the number of its synapses.
Other mouse models of the disease are also likely to have these faulty circuits. “We think this is probably a common thing, but that’s something we’re working on: whether we can detect early signs of faulty connections, correct it before the disease starts, and make these mice better,” Eroglu said.
(Source: today.duke.edu)

German doctors highlight the potential dangers surrounding headbanging in a Case Report published in The Lancet. Ariyan Pirayesh Islamian and colleagues from the Hannover Medical School, detail the case of a man who developed a chronic subdural haematoma (bleeding in the brain) after headbanging at a Motörhead concert.
In January 2013, a 50-year-old man came to the neurosurgical department of Hannover Medical School with a 2 week history of a constant worsening headache affecting the whole head. Although his medical history was unremarkable and he reported no previous head trauma, 4 weeks before he had been headbanging at a Motörhead concert.
A cranial CT confirmed the man had a chronic subdural haematoma on the right side of his brain. Surgeons removed the haematoma (blood clot) through a burr hole and used closed system subdural drainage for 6 days after surgery. His headache subsided and he was well on his last examination 2 months later.
Headbanging refers to the violent and rhythmic movement of the head in time with rock music, most commonly heavy metal. Motörhead, undoubtedly one of the greatest rock’n’roll bands on earth, helped to pioneer speed metal, a genre characterised by fast-tempo songs with underlying rhythms approaching 200 bpm.
Although generally considered harmless, headbanging-related injuries include carotid artery dissection, whiplash, mediastinal emphysema, and odontoid neck fracture. This is the first reported case showing evidence that headbanging can cause “chronic” subdural haematoma.
"Even though there are only a few documented cases of subdural haematomas, the incidence may be higher because the symptoms of this type of brain injury are often clinically silent or cause only mild headache that resolves spontaneously", explains lead author Dr Ariyan Pirayesh Islamian.
"This case serves as evidence in support of Motörhead’s reputation as one of the most hardcore rock’n’roll acts on earth, if nothing else because of their music’s contagious speed drive and the hazardous potential for headbanging fans to suffer brain injury."

Sleep deprivation leads to symptoms of schizophrenia
Psychologists at the University of Bonn are amazed by the severe deficits caused by a sleepless night
Twenty-four hours of sleep deprivation can lead to conditions in healthy persons similar to the symptoms of schizophrenia. This discovery was made by an international team of researchers under the guidance of the University of Bonn and King’s College London. The scientists point out that this effect should be investigated more closely in persons who have to work at night. In addition, sleep deprivation may serve as a model system for the development of drugs to treat psychosis. The results have now been published in “The Journal of Neuroscience”.
In psychosis, there is a loss of contact with reality, which is associated with hallucinations and delusions. The chronic form is referred to as schizophrenia, which likewise involves thought disorders and misperceptions. Affected persons report that they hear voices, for example. Psychoses rank among the most severe mental illnesses. An international team of researchers under the guidance of the University of Bonn has now found that after 24 hours of sleep deprivation, healthy subjects showed numerous symptoms otherwise typically attributed to psychosis or schizophrenia. “It was clear to us that a sleepless night leads to impairment in the ability to concentrate,” says Prof. Dr. Ulrich Ettinger of the Cognitive Psychology Unit in the Department of Psychology at the University of Bonn. “But we were surprised at how pronounced and how wide the spectrum of schizophrenia-like symptoms was.”
The scientists from the University of Bonn, King’s College London (England) as well as the Department of Psychiatry and Psychotherapy of the University of Bonn Hospital examined a total of 24 healthy subjects of both genders aged 18 to 40 in the sleep laboratory of the Department of Psychology. In an initial run, the test subjects were to sleep normally in the laboratory. About one week later, they were kept awake all night with movies, conversation, games and brief walks. On the following morning, subjects were each asked about their thoughts and feelings. In addition, subjects underwent a measurement known as prepulse inhibition.
Unselected information leads to chaos in the brain
"Prepulse inhibition is a standard test to measure the filtering function of the brain,” explains lead author Dr. Nadine Petrovsky from Prof. Ettinger’s team. In the experiment, a loud noise is heard via headphones. As a result, the test subjects experience a startle response, which is recorded with electrodes through the contraction of facial muscles. If a weaker stimulus is emitted beforehand as a “prepulse”, the startle response is lower. “The prepulse inhibition demonstrates an important function of the brain: Filters separate what is important from what is not important and prevent sensory overload,” says Dr. Petrovsky.
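The standard way to quantify prepulse inhibition is the percentage reduction in startle amplitude when the weak prepulse precedes the loud pulse. A minimal sketch of that formula, with hypothetical amplitude values rather than data from this study:

```python
def percent_ppi(pulse_alone: float, prepulse_pulse: float) -> float:
    """Percent prepulse inhibition: how much (in %) a weak prepulse
    reduces the startle response to a subsequent loud pulse."""
    return 100.0 * (pulse_alone - prepulse_pulse) / pulse_alone

# Hypothetical EMG startle amplitudes (arbitrary units).
rested = percent_ppi(pulse_alone=100.0, prepulse_pulse=40.0)          # 60.0% inhibition
sleep_deprived = percent_ppi(pulse_alone=100.0, prepulse_pulse=75.0)  # 25.0% inhibition
```

A lower percentage means the prepulse did less to dampen the startle, i.e. the brain’s sensory filter worked less well, which is the pattern the Bonn team reports after sleep deprivation.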
In our subjects, this filtering function of the brain was significantly reduced following a sleepless night. “There were pronounced attention deficits, such as those that typically occur in schizophrenia,” reports Prof. Ettinger. “The unselected flood of information led to chaos in the brain.” Following sleep deprivation, the subjects also indicated in questionnaires that they were somewhat more sensitive to light, color or brightness. Their sense of time and sense of smell were altered, and they reported making mental leaps. Many of those who had stayed awake even had the impression of being able to read thoughts or noticed an altered perception of their own bodies. “We did not expect that the symptoms could be so pronounced after one night spent awake,” says the psychologist from the University of Bonn.
Sleep deprivation as a model system for mental illnesses
The scientists see an important potential application for their results in research for drugs to treat psychoses. “In drug development, mental disorders like these have been simulated to date in experiments using certain active substances. However, these convey the symptoms of psychoses in only a very limited manner,” says Prof. Ettinger. Sleep deprivation may be a much better model system because the subjective symptoms and the objectively measured filter disorder are far more akin to mental illnesses. Of course, the sleep deprivation model is not harmful: After a good night’s recovery sleep, the symptoms disappear. There is also a need for research with regard to persons who regularly have to work at night. “Whether the symptoms of sleep deprivation gradually become weaker due to acclimatization has yet to be investigated,” says the psychologist from the University of Bonn.
(Image: Getty)
Dodging dots helps explain brain circuitry
A neuroscience study provides new insight into the primal brain circuits involved in collision avoidance, and perhaps a more general model of how neurons can participate in networks to process information and act on it.
In the study, Brown University neuroscientists tracked the cell-by-cell progress of neural signals from the eyes through the brains of tadpoles as they saw and reacted to stimuli including an apparently approaching black circle. In so doing, the researchers were able to gain a novel understanding of how individual cells contribute in a broader network that distinguishes impending collisions.
The basic circuitry involved is present in a wide variety of animals, including people, which is no surprise given how fundamental collision avoidance is across animal behavior.
“Imagine yourself walking in a forest while keeping a conversation with your friend,” said Arseny Khakhalin, neuroscience postdoctoral scholar at Brown and lead author of the study in the European Journal of Neuroscience. “You can totally keep the conversation going, and at the same time avoid tree trunks and shrubs without even thinking about them consciously. That’s because you have a whole region in your brain that is dedicated, among other things, to this task.”
Turning tail
To learn how collision avoidance works, Khakhalin studied the task using tadpoles as a model organism, because as senior author and neuroscience professor Carlos Aizenman put it, they are “sufficiently complex to produce interesting behavior, but have nervous systems sufficiently simple to address in an integrated experimental approach.”
They started with the avoidance behavior. With tadpoles in a dish atop a screen, they projected digital black dots, representing virtual objects, of varying widths, at varying speeds and angles of approach. They also just flashed dots in place. The tadpoles would flee approaching dots once they reached a certain threshold angular size, but rarely reacted to dots that merely blinked onto the scene without moving toward them. The response confirmed that tadpoles can distinguish approaching visual stimuli from merely proximate ones.
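The threshold-angular-size rule can be made concrete with a little trigonometry: a looming object keeps its physical size while the distance shrinks, so the angle it subtends on the retina grows. A minimal sketch, where the 30° threshold and object dimensions are hypothetical values for illustration, not the tadpoles’ actual threshold:

```python
import math

def visual_angle_deg(diameter: float, distance: float) -> float:
    """Angle (in degrees) an object subtends at the eye:
    theta = 2 * atan(diameter / (2 * distance))."""
    return math.degrees(2.0 * math.atan(diameter / (2.0 * distance)))

# Hypothetical escape threshold, for illustration only.
THRESHOLD_DEG = 30.0

def should_flee(diameter: float, distance: float) -> bool:
    """Trigger an escape once the looming object's visual angle
    crosses the threshold."""
    return visual_angle_deg(diameter, distance) >= THRESHOLD_DEG

far = should_flee(1.0, 10.0)  # small visual angle: no escape
near = should_flee(1.0, 1.0)  # large visual angle: escape
```

This is why a dot that merely blinks into place can fail to trigger the response: without motion toward the animal, its visual angle never grows across the threshold.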
The researchers then sought to determine how the tadpoles process different stimuli. To do that they held the tadpoles in place while presenting a variety of simple animations via a fiber optic cable held next to an eye. The animations included a flashed circle, an apparently approaching circle (it became larger and larger), and a couple of “in between” animations, such as a circle that was faded in, rather than simply flashed into being.
While the tadpoles watched the animations, the researchers tracked their tail movements with a high-speed camera (to determine if the tadpoles were executing a fleeing maneuver) and recorded electrical signals along the visual processing circuitry: at the optic nerve leading from the retina to the brain’s optic tectum region, at “excitatory” and “inhibitory” synaptic inputs of neurons in the optic tectum, and at the outputs of the tectal neurons.
What the scientists found was that the tectum, rather than the retina, appears to be where the tadpoles determine that something is approaching rather than merely present. How did they know? The strongest difference between responses to the apparently approaching circle, versus responses to other stimuli, such as flashed or faded circles, was detected at the stage of output from tectal neurons.
Moreover, the difference in activity related to approaching vs. flashed circles increased as the signal propagated from the optic nerve, through tectum input, and to tectum output.
“The tectum is the first place that responded to approaching stimuli not just differently, but stronger,” Khakhalin said.
Inhibition moderates the conversation
An implication of the experiments was that when individual neurons in the tectum are uniquely activated by an apparently approaching stimulus, they collectively generate a signal to send to downstream parts of the brain that can get the tail moving to avoid the collision.
That’s indeed what excitatory neurons do, but the researchers wanted to know what role the inhibitory neurons were playing, especially because the balance of inhibitory and excitatory activity in the tectum varied with different stimuli.
To find out, they chemically blocked inhibitory neurons in the tectum in some tadpoles, chemically enhanced their activity in others and left still other tadpoles unaltered as controls. They found that when they altered the degree of inhibition in either direction, the output selectivity for an oncoming stimulus was lost. When inhibition was blocked, the individual excitatory cells lost their selectivity, too. When inhibition was enhanced, the individual excitatory cells retained their selectivity but could not project a signal collectively.
Khakhalin said the evidence seems to support the idea of inhibitory cells as facilitators of network function. They were not necessarily responsible for making the tectum selective. Instead, their ability to moderate excitation allowed the network of cells to function so that an organized signal from the individual excitatory neurons could emerge from the tectum.
The team was able to use these findings to create a conceptual model of the collision stimulus circuitry.
Khakhalin’s hypothesis is that the inhibitory/excitatory balance allows the tectum to build up a necessary degree of excitement about the stimulus of interest (e.g. something has been getting bigger) while still allowing enough “calm” to register the next wave of input (it just got bigger again).
Aizenman said the paper illustrates the broader approach that his lab is applying to fundamental neuroscience questions.
“It is part of a greater project to be able to take an entire behavior and break it down into all of its neuronal components, to build a model in which we can understand how activity in single neurons and in the connections between them can all synergize to produce a behavior,” he said.
Scientists Criticize Europe’s $1.6B Brain Project
Dozens of neuroscientists are protesting Europe’s $1.6 billion attempt to recreate the functioning of the human brain on supercomputers, fearing it will waste vast amounts of money and harm neuroscience in general.
The 10-year Human Brain Project is largely funded by the European Union. In an open letter issued Monday, more than 190 neuroscience researchers called on the EU to put less money into the effort to “build” a brain, and to invest instead in existing projects.
If the EU doesn’t adopt their recommendations, the scientists said, they will boycott the Human Brain Project and urge colleagues to do the same.
GABA actions and ionic plasticity in epilepsy
Concepts of epilepsy based on a simple change in the neuronal excitation/inhibition balance have subsided in the face of recent insights into the large diversity and context-dependence of signaling mechanisms at the molecular, cellular and neuronal network level. GABAergic transmission exerts both seizure-suppressing and seizure-promoting actions. These two roles are prone to short-term and long-term alterations, evident both during epileptogenesis and during individual epileptiform events. The driving force of GABAergic currents is controlled by ion-regulatory molecules such as the neuronal K-Cl cotransporter KCC2 and cytosolic carbonic anhydrases. Accumulating evidence suggests that neuronal ion regulation is highly plastic, thereby contributing to the multiple roles ascribed to GABAergic signaling during epileptogenesis and epilepsy.
The anatomy of fear: Understanding the biological underpinnings of anxiety, phobias and PTSD
Fear in a mouse brain looks much the same as fear in a human brain.
When a frightening stimulus is encountered, the thalamus shoots a message to the amygdala — the primitive part of the brain — even before it informs the parts responsible for higher cognition. The amygdala then goes into its hard-wired fight-or-flight response, triggering a host of predictable symptoms, including racing heart, heavy breathing, startle response, and sweating.
The similarities of fear response in the brains of mice and men have allowed scientists to understand the neural circuitry and molecular processes of fear and fear behaviors perhaps better than any other response. That understanding has spurred breakthroughs in treatments for psychiatric disorders that are underpinned by fear.
Anxiety disorders are among the most common mental illnesses in the United States, with nearly one-third of Americans experiencing symptoms at least once during their lives. There are generalized anxiety disorders and fear-related disorders, which include panic disorders, phobias, and post-traumatic stress disorder (PTSD).
Emory psychiatrist and researcher Kerry Ressler is on the front lines of fear-disorder research. In his lab at Yerkes National Primate Research Center, he studies the molecular and cellular mechanisms of fear learning and extinction in mouse models. At Grady Memorial Hospital, he investigates the psychology, genetics, and biology of PTSD. And through the Grady Trauma Project, he works to draw attention to the problem of inner city intergenerational violence.
"If you look at Kerry’s work, it can seem like it’s all over the place — he’s got so many studies going on, and he collaborates with so many other scientists," says Barbara Rothbaum, associate vice chair of clinical research in psychiatry and director of the Trauma and Anxiety Recovery Program at Emory. "But they are all pieces to the same puzzle. All his work, from molecular to clinical to policy, fits together and starts telling a story."

A Howard Hughes Medical Institute investigator, Ressler was recently elected to the Institute of Medicine — one of the highest honors in the fields of health and medicine. He was named a member of a new national PTSD consortium led by Draper Laboratory. And he recently appeared on the Charlie Rose show’s brain series.
Panic attacks seem to tie the fear-related disorders together, he explained on Charlie Rose. Everyone experiences fear, which evolved as a survival mechanism, but it only rises to a clinical level when people are unable to function normally in the face of it. For instance, PTSD includes not only intrusive thoughts, memories, nightmares, and startle responses, but also the concept of avoidance, which may extend to other areas of the individual’s life.
"There’s a patient I’ve seen who was attacked in a dark alley," Ressler shared on the show. "Initially it just felt dangerous to go out at night, but after a while she grew afraid of men and couldn’t go to that part of town. Then she couldn’t leave her house, and finally, her bedroom. The world got more and more dangerous."
What’s Lost as Handwriting Fades
Does handwriting matter?
Not very much, according to many educators. The Common Core standards, which have been adopted in most states, call for teaching legible writing, but only in kindergarten and first grade. After that, the emphasis quickly shifts to proficiency on the keyboard.
But psychologists and neuroscientists say it is far too soon to declare handwriting a relic of the past. New evidence suggests that the links between handwriting and broader educational development run deep.
Children not only learn to read more quickly when they first learn to write by hand, but they also remain better able to generate ideas and retain information. In other words, it’s not just what we write that matters — but how.
Enlarging the scope: grasping brain complexity
To further advance our understanding of the brain, new concepts and theories are needed. In particular, the ability of the brain to create information flows must be reconciled with its propensity for synchronization and mass action. The theoretical and empirical framework of Coordination Dynamics, a key aspect of which is metastability, is presented as a starting point for studying the interplay of integrative and segregative tendencies that are expressed in space and time during the normal course of brain and behavioral function. Some recent shifts in perspective that may ultimately lead to a better understanding of brain complexity are emphasized.