Posts tagged science

Restoring Active Memory Program Poised to Launch
Teams will develop and test implantable therapeutic devices for memory restoration in patients with memory deficits caused by disease or trauma
DARPA has selected two universities to initially lead the agency’s Restoring Active Memory (RAM) program, which aims to develop and test wireless, implantable “neuroprosthetics” that can help servicemembers, veterans, and others overcome memory deficits incurred as a result of traumatic brain injury (TBI) or disease.
The University of California, Los Angeles (UCLA), and the University of Pennsylvania (Penn) will each head a multidisciplinary team to develop and test electronic interfaces that can sense memory deficits caused by injury and attempt to restore normal function. Under the terms of separate cooperative agreements with DARPA, UCLA will receive up to $15 million and Penn will receive up to $22.5 million over four years, with full funding contingent on the performer teams successfully meeting a series of technical milestones. DARPA also has a cooperative agreement worth up to $2.5 million in place with Lawrence Livermore National Laboratory to develop an implantable neural device for the UCLA-led effort.
“The start of the Restoring Active Memory program marks an exciting opportunity to reveal many new aspects of human memory and learn about the brain in ways that were never before possible,” said DARPA Program Manager Justin Sanchez. “Anyone who has witnessed the effects of memory loss in another person knows its toll and how few options are available to treat it. We’re going to apply the knowledge and understanding gained in RAM to develop new options for treatment through technology.”
TBI is a serious cause of disability in the United States. Diagnosed in more than 270,000 military servicemembers since 2000 and affecting an estimated 1.7 million U.S. civilians each year, TBI frequently results in an impaired ability to retrieve memories formed prior to injury and a reduced capacity to form or retain new memories following injury. Despite the scale of the problem, no effective therapies currently exist to mitigate the long-term consequences of TBI on memory. Through the RAM program, DARPA seeks to accelerate the development of technology needed to address this public health challenge and help servicemembers and others overcome memory deficits by developing new neuroprosthetics to bridge gaps in the injured brain.
“We owe it to our service members to accelerate research that can minimize the long-term impacts of their injuries,” Sanchez said. “Despite increasingly aggressive prevention efforts, traumatic brain injury remains a serious problem in military and civilian sectors. Through the Restoring Active Memory program, DARPA aims to better understand the underlying neurological basis of memory loss and speed the development of innovative therapies.”
Specifically, RAM performers aim to develop and test wireless, fully implantable neural-interface medical devices that can serve as “neuroprosthetics”—technology that can effectively bridge the gaps that interfere with an individual’s ability to encode new memories or retrieve old ones.
To start, DARPA will support the development of multi-scale computational models with high spatial and temporal resolution that describe how neurons code declarative memories—those well-defined parcels of knowledge that can be consciously recalled and described in words, such as events, times, and places. Researchers will also explore new methods for analysis and decoding of neural signals to understand how targeted stimulation might be applied to help the brain reestablish an ability to encode new memories following brain injury. “Encoding” refers to the process by which newly learned information is attended to and processed by the brain when first encountered.
Building on this foundational work, researchers will attempt to integrate the computational models developed under RAM into new, implantable, closed-loop systems able to deliver targeted neural stimulation that may ultimately help restore memory function. These studies will involve volunteers living with deficits in the encoding and/or retrieval of declarative memories and/or volunteers undergoing neurosurgery for other neurological conditions.
Unique to the UCLA team’s approach is a focus on the portion of the brain known as the entorhinal area. UCLA researchers previously demonstrated that human memory could be facilitated by stimulating that region, which is known to be involved in learning and memory. Considered the entrance to the hippocampus—which helps form and store memories—the entorhinal area plays a crucial role in transforming daily experience into lasting memories. Data collected during the first year of the project from patients already implanted with brain electrodes as part of their treatment for epilepsy will be used to develop a computational model of the hippocampal-entorhinal system that can then be used to test memory restoration in patients.
After developing an advanced new wireless neuromodulation device—one-tenth the size of existing devices, with much higher spatial resolution—the UCLA team will implant it into the entorhinal area and hippocampus of patients with traumatic brain injury.
The Penn team’s approach is based on an understanding that memory is the result of complex interactions among widespread brain regions. Researchers will study neurosurgical patients who have electrodes implanted in multiple areas of their brains for the treatment of various neurological conditions. By recording neural activity from these electrodes as patients play computer-based memory games, the researchers will measure “biomarkers” of successful memory function—patterns of activity that accompany the successful formation of new memories and the successful retrieval of old ones. Researchers could then use those models and a novel neural stimulation and monitoring system—being developed in partnership with Medtronic—to restore brain memory function. The investigational system will simultaneously monitor and stimulate a number of brain sites, which may lead to better understandings of the brain and how brain stimulation therapy can potentially restore normal brain function following injury or the onset of neuropsychological illness.
In addition to human clinical efforts, RAM will support animal studies to advance the state-of-the-art of quantitative models that account for the encoding and retrieval of complex memories and memory attributes, including their hierarchical associations with one another. This work will also seek to identify any characteristic neural and behavioral correlates of memories facilitated by therapeutic devices.

Burst spinal artery aneurysm linked to Ecstasy use
Taking the street drug Ecstasy could lead to a potentially fatal weakening and rupture of the spinal cord artery, doctors have warned in the Journal of NeuroInterventional Surgery.
Posterior spinal artery aneurysms - a blood-filled swelling of the spinal cord artery, caused by a weakening and distension of the vessel wall - are rare, with only 12 cases reported to date. But all of them caused spinal bleeding which affected the function of the spinal cord.
Doctors discovered one of these aneurysms in a previously healthy teenager who had taken Ecstasy or MDMA.
The morning after the night before, he woke up with headache, neck pain and muscle spasms. After a week these symptoms suddenly took a turn for the worse, accompanied by nausea, prompting him to seek help at his local emergency department.
A week later the teen was transferred to a specialist neurosurgical unit for further investigations, which revealed an aneurysm, measuring 2 x 1 mm, on the left side of the spinal cord artery at the back of his neck.
The aneurysm was successfully removed, along with the weakened portion of the artery. The teen made a full recovery, with no lasting nerve damage.
But the authors reiterate that Ecstasy use has already been linked to severe systemic and neurological complications, including stroke, inflammation of the arteries in the brain (vasculitis) and internal brain bleeds.
And now, posterior spinal artery aneurysm can be added to the list, they say.
The drug acts on the sympathetic nervous system, sparking a sudden hike in blood pressure, as a result of the surge in serotonin it releases. And this could make any pre-existing aneurysms or other arterial abnormalities prone to rupture, they warn.
Contradictory findings on how the full moon affects our sleep
A Swiss research study conducted last year showed that the full moon affects sleep. The findings demonstrated that around the full moon, people average 20 minutes less sleep, take five minutes longer to fall asleep, and experience 30 minutes more REM sleep, the stage during which most dreaming is believed to occur.
Different outcome
Numerous studies through the years have attempted to prove or disprove the hypothesis that lunar phases affect human sleep. But results have been hard to repeat. A group of researchers at the famed Max Planck Institute and elsewhere analyzed data from more than 1,000 people and 26,000 nights of sleep, only to find no correlation.
International researchers are being urged to publish their results in hopes of getting to the bottom of the question. Michael Smith and his co-researchers at Sahlgrenska Academy have analyzed data generated by a previous sleep study and compared them with the lunar cycle.
20 minutes less sleep
Based on a study of 47 healthy 18-30 year-olds and published in Current Biology, the results support the theory that a correlation exists.
“Our study generated findings similar to the Swiss project,” Michael Smith says. “Subjects slept an average of 20 minutes less and had more trouble falling asleep during the full moon phase. However, the greatest impact on REM sleep appeared to be during the new moon.”
More susceptible brain
The retrospective study by the Gothenburg researchers suggests that the brain is more susceptible to external disturbances when the moon is full.
“The purpose of our original study was to examine the way that noise disturbs sleep,” Mr. Smith continues. “Re-analysis of our data showed that sensitivity, measured as reactivity of the cerebral cortex, is greatest during the full moon.”
Greater cortical reactivity was found in both women and men, whereas only men had more trouble falling asleep and slept less when the moon was full. Skeptics warn that both age and gender differences may be a source of error, not to mention more subtle factors such as physical condition and exposure to light during the day.
Need for more studies
Though fully aware of the issues, Mr. Smith is not prepared to dismiss the results of the Gothenburg study.
“The rooms in our sleep laboratories do not have any windows,” he says. “So the effect we found cannot be attributed to increased nocturnal light during the full moon. There may instead be a built-in biological clock that is affected by the moon, similar to the one that regulates the circadian rhythm. But all this is speculation; more highly controlled studies that target these mechanisms are needed before definitive conclusions can be drawn.”
The article Human sleep and cortical reactivity are influenced by lunar phase is published in Current Biology.
Scientists have identified a set of 10 proteins in the blood which can predict the onset of Alzheimer’s, marking a significant step towards developing a blood test for the disease. The study, led by King’s College London and the UK proteomics company Proteome Sciences plc, analysed over 1,000 individuals and is the largest of its kind to date.

There are currently no effective long-lasting drug treatments for Alzheimer’s, and it is believed that many new clinical trials fail because drugs are given too late in the disease process. A blood test could be used to identify patients in the early stages of memory loss for clinical trials to find drugs to halt the progression of the disease.
The study, published in Alzheimer’s & Dementia: The Journal of the Alzheimer’s Association, is the result of an international collaboration led by King’s College London and Proteome Sciences plc, funded by Alzheimer’s Research UK, the UK Medical Research Council, the National Institute for Health Research (NIHR) Maudsley Biomedical Research Centre and Proteome Sciences.
The researchers used data from three international studies. Blood samples from a total of 1,148 individuals (476 with Alzheimer’s disease; 220 with ‘Mild Cognitive Impairment’ (MCI) and 452 elderly controls without dementia) were analysed for 26 proteins previously shown to be associated with Alzheimer’s disease. A sub-group of 476 individuals across all three groups also had an MRI brain scan.
Researchers identified 16 of these 26 proteins to be strongly associated with brain shrinkage in either MCI or Alzheimer’s. They then ran a second series of tests to establish which of these proteins could predict the progression from MCI to Alzheimer’s. They identified a combination of 10 proteins capable of predicting whether individuals with MCI would develop Alzheimer’s disease within a year, with an accuracy of 87 percent.
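The idea of combining a protein panel into a single prediction can be sketched as follows. This is only an illustration: the protein names, data, and classifier here are invented, and the study's actual statistical pipeline is not reproduced. A simple classifier trained on measured protein concentrations, evaluated on held-out patients, shows the shape of the approach:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 200 MCI patients, 10 protein concentrations each.
# "Converters" (label 1) get a shifted level in a few proteins (invented effect).
n, n_proteins = 200, 10
y = np.repeat([0, 1], n // 2)
X = rng.normal(size=(n, n_proteins))
X[y == 1, :4] += 1.2  # hypothetical disease-associated shift

# Split into training and held-out test halves.
idx = rng.permutation(n)
train, test = idx[:n // 2], idx[n // 2:]

# Nearest-centroid rule: all 10 markers are combined into one decision.
centroids = np.array([X[train][y[train] == c].mean(axis=0) for c in (0, 1)])
dists = np.linalg.norm(X[test][:, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)

accuracy = (pred == y[test]).mean()
print(f"held-out accuracy: {accuracy:.2f}")
```

The key point the sketch makes is that no single protein needs to be diagnostic on its own; a modest shift spread across several markers can still yield a usable combined predictor.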
Dr Abdul Hye, lead author of the study from the Institute of Psychiatry at King’s College London, said: “Memory problems are very common, but the challenge is identifying who is likely to develop dementia. There are thousands of proteins in the blood, and this study is the culmination of many years’ work identifying which ones are clinically relevant. We now have a set of 10 proteins that can predict whether someone with early symptoms of memory loss, or mild cognitive impairment, will develop Alzheimer’s disease within a year, with a high level of accuracy.”
Professor Simon Lovestone, senior author of the study from the University of Oxford, who led the work whilst at King’s, said: “Alzheimer’s begins to affect the brain many years before patients are diagnosed with the disease. Many of our drug trials fail because by the time patients are given the drugs, the brain has already been too severely affected. A simple blood test could help us identify patients at a much earlier stage to take part in new trials and hopefully develop treatments which could prevent the progression of the disease. The next step will be to validate our findings in further sample sets, to see if we can improve accuracy and reduce the risk of misdiagnosis, and to develop a reliable test suitable to be used by doctors.”
Dr Eric Karran, Director of Research at Alzheimer’s Research UK, the UK’s leading dementia research charity, said: “As the onset of Alzheimer’s is often slow and subtle, a blood test to identify those at high risk of the disease at an early stage would be of real value. Detecting the first signs of Alzheimer’s could improve clinical trials for new treatments and help those already concerned about their memory, but we’re not currently in a position to use such a test to screen the general population.
“With an ageing population, and age the biggest risk factor for Alzheimer’s, we are expecting rising numbers of people to be affected over the coming years. It’s important to develop new ways to intervene early in the disease to help people maintain their quality of life for as long as possible.”
Dr Ian Pike, co-author of the paper from Proteome Sciences, said: “By linking the best British academic and commercial research, this landmark study in Alzheimer’s disease is a major advance in the development of a simple blood test to identify the disease before clinical symptoms appear. This is the window that will offer the best chance of successful treatment. Equally important, a blood test will be considerably easier and less expensive than using brain imaging or cerebrospinal fluid.
“We are in the process of selecting commercial partners to combine the protein biomarkers in a blood test for the global market, a key step forward to deliver effective and early treatment for this crippling disease.”
Alzheimer’s disease is the most common form of dementia. Globally, it is estimated that 135 million people will have dementia by 2050. In 2010, the annual global cost of dementia was estimated at $604 billion. MCI includes problems with day-to-day memory, language and attention, and can be an early sign of dementia, or a symptom of stress or anxiety. Approximately 10% of people diagnosed with MCI develop dementia within a year but apart from regular assessments to measure memory decline, there is currently no accurate way of predicting who will, or won’t, develop dementia.
Previous studies have also shown that PET brain scans and analysis of cerebrospinal fluid obtained by lumbar puncture can be used to predict the onset of dementia from MCI. However, PET imaging is highly expensive and lumbar punctures are invasive.
(Source: kcl.ac.uk)
The protein that is mutated in Huntington’s disease is critical for wiring the brain in early life, according to a new Duke University study.

(Image caption: The protein associated with Huntington’s disease, Htt, is critical in early brain development. Brains of 5-week-old mice whose Htt was deleted show signs of cellular stress — reactive astrocytes (green) and microglia (white and red) and faulty connections — in brain circuits that have already been linked to the disease. Credit: Spencer McKinstry)
Huntington’s disease is a progressive neurodegenerative disorder that causes a wide variety of symptoms, such as uncontrolled movements, inability to focus or remember, depression and aggression. By the time these symptoms appear, usually in middle age, the disease has already ravaged the brain.
The new findings, published July 9 in the Journal of Neuroscience, add to growing evidence that Huntington’s and other neurodegenerative disorders, such as Alzheimer’s disease, may take root during development, said lead author Cagla Eroglu, an assistant professor of cell biology in the Duke University Medical School, and member of the Duke Institute for Brain Sciences.
“The study is exciting because it means that, if we understand what these developmental errors are, we may be able to interfere with the first stage of the disease, before it shows itself,” Eroglu said.
Several years ago, Eroglu and her team were looking for molecular players involved in the formation of new connections, or synapses, in early brain development in mice when their studies unexpectedly hit on the huntingtin (Htt) protein, which is present throughout the body and which forms clumps in the brain cells of people with Huntington’s disease.
“(Htt) had been implicated in certain cellular functions and synaptic dysfunction in Huntington’s, but the possibility that Htt is playing a direct role in synapse formation was not explored,” Eroglu said.
To understand the protein’s role as synapses form, the scientists created mice in which Htt is deleted only in the cortex, a part of the brain that is implicated in the disease and that controls perception, memory and thought.
At three weeks of age (roughly similar to the first two years of human life), a time when a mouse begins to take in its surroundings through its eyes and ears, the synapses of the mutant mice formed more rapidly compared with those of healthy mice, the scientists found.
But by five weeks, when some synapses typically strengthen while others weaken in a normal process called pruning, the synapses had completely deteriorated in the mutant mice. In collaboration with another Duke researcher, Henry Yin, an assistant professor in psychology & neuroscience, the team also investigated the changes in synaptic function in these mutant mice and found severe alterations of the synaptic physiology.
Not only did the researchers see faulty circuits in the mice missing cortical Htt, they also saw signs of cellular stress in the brain, in the exact spot within the cortex that projects to the striatum, another brain area targeted by Huntington’s disease in people. “There’s something about that particular circuit that is vulnerable to changes in Htt,” Eroglu said.
The researchers also examined what happens in early brain development in a mouse model of Huntington’s disease. Similar to people with the disease, these animals have one normal copy of the Htt gene, and one mutated copy, which produces a protein that is present in cells but in expanded form.
The researchers found the same pattern: the Huntington’s disease model animals have synapses that initially mature much faster than normal in the cortex and then die off.
The new results also suggest that missing Htt for a prolonged period may not only affect the development but also the maintenance of healthy synapses, Eroglu said.
That’s especially relevant to a current strategy for treating Huntington’s disease: dialing down Htt levels in the brain using gene therapy or small-molecule inhibitors. But it has been a challenge to target the mutated copy of the gene, not the normal copy. Interested in the implications of lowering overall Htt levels, the group plans to delete Htt in the mouse brain later in life and measure the number of its synapses.
Other mouse models of the disease are also likely to have these faulty circuits. “We think this is probably a common thing, but that’s something we’re working on: whether we can detect early signs of faulty connections, correct it before the disease starts, and make these mice better,” Eroglu said.
(Source: today.duke.edu)

German doctors highlight the potential dangers surrounding headbanging in a Case Report published in The Lancet. Ariyan Pirayesh Islamian and colleagues from the Hannover Medical School, detail the case of a man who developed a chronic subdural haematoma (bleeding in the brain) after headbanging at a Motörhead concert.
In January 2013, a 50-year-old man came to the neurosurgical department of Hannover Medical School with a 2 week history of a constant worsening headache affecting the whole head. Although his medical history was unremarkable and he reported no previous head trauma, 4 weeks before he had been headbanging at a Motörhead concert.
A cranial CT confirmed the man had a chronic subdural haematoma on the right side of his brain. Surgeons removed the haematoma (blood clot) through a burr hole and used closed system subdural drainage for 6 days after surgery. His headache subsided and he was well on his last examination 2 months later.
Headbanging refers to the violent and rhythmic movement of the head synchronous with rock music, most commonly heavy metal. Motörhead, undoubtedly one of the greatest rock’n’roll bands on earth, helped to pioneer speed metal, a subgenre that aspires to fast-tempo songs with underlying rhythms of around 200 beats per minute.
Although generally considered harmless, headbanging-related injuries include carotid artery dissection, whiplash, mediastinal emphysema, and odontoid neck fracture. This is the first reported case showing evidence that headbanging can cause “chronic” subdural haematoma.
"Even though there are only a few documented cases of subdural haematomas, the incidence may be higher because the symptoms of this type of brain injury are often clinically silent or cause only mild headache that resolves spontaneously", explains lead author Dr Ariyan Pirayesh Islamian.
"This case serves as evidence in support of Motörhead’s reputation as one of the most hardcore rock’n’roll acts on earth, if nothing else because of their music’s contagious speed drive and the hazardous potential for headbanging fans to suffer brain injury."

Sleep deprivation leads to symptoms of schizophrenia
Psychologists at the University of Bonn are amazed by the severe deficits caused by a sleepless night
Twenty-four hours of sleep deprivation can lead to conditions in healthy persons similar to the symptoms of schizophrenia. This discovery was made by an international team of researchers under the guidance of the University of Bonn and King’s College London. The scientists point out that this effect should be investigated more closely in persons who have to work at night. In addition, sleep deprivation may serve as a model system for the development of drugs to treat psychosis. The results have now been published in “The Journal of Neuroscience”.
In psychosis, there is a loss of contact with reality and this is associated with hallucinations and delusions. The chronic form is referred to as schizophrenia, which likewise involves thought disorders and misperceptions. Affected persons report that they hear voices, for example. Psychoses rank among the most severe mental illnesses. An international team of researchers under the guidance of the University of Bonn has now found that after 24 hours of sleep deprivation in healthy subjects, numerous symptoms were noted which are otherwise typically attributed to psychosis or schizophrenia. “It was clear to us that a sleepless night leads to impairment in the ability to concentrate,” says Prof. Dr. Ulrich Ettinger of the Cognitive Psychology Unit in the Department of Psychology at the University of Bonn. “But we were surprised at how pronounced and how wide the spectrum of schizophrenia-like symptoms was.”
The scientists from the University of Bonn, King’s College London (England) as well as the Department of Psychiatry and Psychotherapy of the University of Bonn Hospital examined a total of 24 healthy subjects of both genders aged 18 to 40 in the sleep laboratory of the Department of Psychology. In an initial run, the test subjects were to sleep normally in the laboratory. About one week later, they were kept awake all night with movies, conversation, games and brief walks. On the following morning, subjects were each asked about their thoughts and feelings. In addition, subjects underwent a measurement known as prepulse inhibition.
Unselected information leads to chaos in the brain
"Prepulse inhibition is a standard test to measure the filtering function of the brain,” explains lead author Dr. Nadine Petrovsky from Prof. Ettinger’s team. In the experiment, a loud noise is heard via headphones. As a result, the test subjects experience a startle response, which is recorded with electrodes through the contraction of facial muscles. If a weaker stimulus is emitted beforehand as a “prepulse”, the startle response is lower. “The prepulse inhibition demonstrates an important function of the brain: Filters separate what is important from what is not important and prevent sensory overload,” says Dr. Petrovsky.
In our subjects, this filtering function of the brain was significantly reduced following a sleepless night. “There were pronounced attention deficits, such as those that typically occur in schizophrenia,” reports Prof. Ettinger. “The unselected flood of information led to chaos in the brain.” Following sleep deprivation, the subjects also indicated in questionnaires that they were somewhat more sensitive to light, color, or brightness. Their sense of time and sense of smell were altered, and they reported mental leaps. Many of those who spent the night awake even had the impression of being able to read thoughts or noticed altered body perception. “We did not expect that the symptoms could be so pronounced after one night spent awake,” says the psychologist from the University of Bonn.
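The filtering measure described above is conventionally reported as percent prepulse inhibition: the fractional reduction of the startle response when a weak prepulse precedes the loud noise. A minimal sketch of the calculation, using made-up startle amplitudes rather than the study's data:

```python
def prepulse_inhibition(pulse_alone: float, prepulse_pulse: float) -> float:
    """Percent prepulse inhibition: how much a weak pre-stimulus
    attenuates the startle response to the subsequent loud pulse."""
    return 100.0 * (1.0 - prepulse_pulse / pulse_alone)

# Hypothetical startle amplitudes (arbitrary EMG units):
# a rested brain suppresses the startle strongly; a sleep-deprived one barely does.
rested = prepulse_inhibition(pulse_alone=500.0, prepulse_pulse=200.0)
sleep_deprived = prepulse_inhibition(pulse_alone=500.0, prepulse_pulse=380.0)
print(f"rested: {rested:.0f}% PPI, sleep-deprived: {sleep_deprived:.0f}% PPI")
```

A lower PPI percentage corresponds to the weakened sensory gating the Bonn team measured after the sleepless night.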
Sleep deprivation as a model system for mental illnesses
The scientists see an important potential application for their results in research for drugs to treat psychoses. “In drug development, mental disorders like these have been simulated to date in experiments using certain active substances. However, these convey the symptoms of psychoses in only a very limited manner,” says Prof. Ettinger. Sleep deprivation may be a much better model system because the subjective symptoms and the objectively measured filter disorder are far more akin to mental illnesses. Of course, the sleep deprivation model is not harmful: After a good night’s recovery sleep, the symptoms disappear. There is also a need for research with regard to persons who regularly have to work at night. “Whether the symptoms of sleep deprivation gradually become weaker due to acclimatization has yet to be investigated,” says the psychologist from the University of Bonn.
Dodging dots helps explain brain circuitry
A neuroscience study provides new insight into the primal brain circuits involved in collision avoidance, and perhaps a more general model of how neurons can participate in networks to process information and act on it.
In the study, Brown University neuroscientists tracked the cell-by-cell progress of neural signals from the eyes through the brains of tadpoles as they saw and reacted to stimuli including an apparently approaching black circle. In so doing, the researchers were able to gain a novel understanding of how individual cells contribute in a broader network that distinguishes impending collisions.
The basic circuitry involved is present in a wide variety of animals, including people, which is no surprise given how fundamental collision avoidance is across animal behavior.
“Imagine yourself walking in a forest while keeping a conversation with your friend,” said Arseny Khakhalin, neuroscience postdoctoral scholar at Brown and lead author of the study in the European Journal of Neuroscience. “You can totally keep the conversation going, and at the same time avoid tree trunks and shrubs without even thinking about them consciously. That’s because you have a whole region in your brain that is dedicated, among other things, to this task.”
Turning tail
To learn how collision avoidance works, Khakhalin studied the task using tadpoles as a model organism, because as senior author and neuroscience professor Carlos Aizenman put it, they are “sufficiently complex to produce interesting behavior, but have nervous systems sufficiently simple to address in an integrated experimental approach.”
They started with the avoidance behavior. With tadpoles in a dish atop a screen, they projected black dots of varying widths, representing virtual objects approaching at varying speeds and angles. They also flashed stationary dots in place. The tadpoles would flee approaching dots once the dots reached a certain threshold angular size, but rarely reacted to dots that merely blinked onto the scene without moving toward them. The response confirmed that tadpoles can distinguish approaching visual stimuli from merely proximate ones.
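The threshold-angular-size criterion is simple geometry: a dot of radius r at distance d subtends a visual angle of 2·atan(r/d), which grows explosively as an object closes in. A sketch with invented numbers (the actual thresholds measured in the study are not reproduced here):

```python
import math

def angular_size_deg(radius: float, distance: float) -> float:
    """Visual angle (in degrees) subtended by a disc of given radius at a distance."""
    return math.degrees(2.0 * math.atan(radius / distance))

THRESHOLD_DEG = 30.0  # hypothetical escape threshold

# The same 5 mm dot (radius 2.5 mm) at shrinking distances:
# the angular size balloons as it approaches, eventually crossing threshold.
for d_mm in (50.0, 20.0, 10.0, 5.0):
    theta = angular_size_deg(2.5, d_mm)
    status = "FLEE" if theta >= THRESHOLD_DEG else "ignore"
    print(f"distance {d_mm:4.0f} mm -> {theta:5.1f} deg  {status}")
```

This nonlinearity is what makes a looming object distinguishable from a large but stationary one: only the approaching dot produces a rapidly expanding angular size.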
The researchers then sought to determine how the tadpoles process different stimuli. To do that they held the tadpoles in place while presenting a variety of simple animations via a fiber optic cable held next to an eye. The animations included a flashed circle, an apparently approaching circle (it became larger and larger), and a couple of “in between” animations, such as a circle that was faded in, rather than simply flashed into being.
While the tadpoles watched the animations, the researchers tracked their tail movements with a high-speed camera (to determine if the tadpoles were executing a fleeing maneuver) and recorded electrical signals along the visual processing circuitry: at the optic nerve leading from the retina to the brain’s optic tectum region, at “excitatory” and “inhibitory” synaptic inputs of neurons in the optic tectum, and at the outputs of the tectal neurons.
What the scientists found was that the tectum, rather than the retina, appears to be where the tadpoles determine that something is approaching rather than merely present. How did they know? The strongest difference between responses to the apparently approaching circle, versus responses to other stimuli, such as flashed or faded circles, was detected at the stage of output from tectal neurons.
Moreover, the difference in activity related to approaching vs. flashed circles increased as the signal propagated from the optic nerve, through tectum input, and to tectum output.
“The tectum is the first place that responded to approaching stimuli not just differently, but stronger,” Khakhalin said.
Inhibition moderates the conversation
An implication of the experiments was that when individual neurons in the tectum are uniquely activated by an apparently approaching stimulus, they collectively generate a signal to send to downstream parts of the brain that can get the tail moving to avoid the collision.
That’s indeed what excitatory neurons do, but the researchers wanted to know what role the inhibitory neurons were playing, especially because the balance of inhibitory and excitatory activity in the tectum varied with different stimuli.
To find out, they chemically blocked inhibitory neurons in the tectum in some tadpoles, chemically enhanced their activity in others and left still other tadpoles unaltered as controls. They found that when they altered the degree of inhibition in either direction, the output selectivity for an oncoming stimulus was lost. When inhibition was blocked, the individual excitatory cells lost their selectivity, too. When inhibition was enhanced, the individual excitatory cells retained their selectivity but could not project a signal collectively.
Khakhalin said the evidence seems to support the idea of inhibitory cells as facilitators of network function. They were not necessarily responsible for making the tectum selective. Instead, their ability to moderate excitation allowed the network of cells to function so that an organized signal from the individual excitatory neurons could emerge from the tectum.
The team was able to use these findings to create a conceptual model of the collision stimulus circuitry.
Khakhalin’s hypothesis of how it works is that inhibitory/excitatory balance allows the tectum to build up a necessary degree of excitement about the stimulus of interest (e.g., something has been getting bigger) while still allowing enough “calm” to consider the next wave of input (it just got bigger again).
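The flavor of this hypothesis can be captured in a toy accumulator, offered here only as an illustration of the logic, not as the team's conceptual model: inhibition cancels input that was already there, so only fresh growth drives output, while a response ceiling keeps any single burst from dominating. The inhibition gain and ceiling values are invented for the sketch.

```python
def tectal_output(stimulus, inhibition_gain=1.0, ceiling=2.0):
    """Toy tectal response: excitation minus delayed inhibition that tracks
    the previous input, rectified, capped, and summed over time."""
    total = 0.0
    prev = 0.0
    for s in stimulus:
        drive = s - inhibition_gain * prev   # inhibition cancels what was already seen
        total += min(max(0.0, drive), ceiling)  # rectified, saturating output
        prev = s
    return total

looming = [1, 2, 3, 4, 5]   # dot grows at every step
flashed = [5, 5, 5, 5, 5]   # same final size, appears all at once

print(tectal_output(looming))                       # 5.0: each increment gets through
print(tectal_output(flashed))                       # 2.0: only the onset registers
print(tectal_output(looming, inhibition_gain=0.0))  # 9.0 vs...
print(tectal_output(flashed, inhibition_gain=0.0))  # 10.0: selectivity lost
```

With balanced inhibition the looming stimulus out-drives the flash; removing inhibition (gain 0) lets the flash win, echoing the finding that perturbing inhibition in either direction abolishes the output selectivity.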
Aizenman said the paper illustrates a broader approach that his lab is applying to fundamental neuroscience questions.
“It is part of a greater project to be able to take an entire behavior and break it down into all of its neuronal components, to build a model in which we can understand how activity in single neurons and in the connections between them can all synergize to produce a behavior,” he said.
Scientists Criticize Europe’s $1.6B Brain Project
Dozens of neuroscientists are protesting Europe’s $1.6 billion attempt to recreate the functioning of the human brain on supercomputers, fearing it will waste vast amounts of money and harm neuroscience in general.
The 10-year Human Brain Project is largely funded by the European Union. In an open letter issued Monday, more than 190 neuroscience researchers called on the EU to put less money into the effort to “build” a brain, and to invest instead in existing projects.
If the EU doesn’t adopt their recommendations, the scientists said, they will boycott the Human Brain Project and urge colleagues to do the same.
GABA actions and ionic plasticity in epilepsy
Concepts of epilepsy based on a simple change in the neuronal excitation/inhibition balance have subsided in the face of recent insights into the large diversity and context-dependence of signaling mechanisms at the molecular, cellular, and neuronal network levels. GABAergic transmission exerts both seizure-suppressing and seizure-promoting actions. These two roles are prone to short-term and long-term alterations, evident both during epileptogenesis and during individual epileptiform events. The driving force of GABAergic currents is controlled by ion-regulatory molecules such as the neuronal K-Cl cotransporter KCC2 and cytosolic carbonic anhydrases. Accumulating evidence suggests that neuronal ion regulation is highly plastic, thereby contributing to the multiple roles ascribed to GABAergic signaling during epileptogenesis and epilepsy.