April 4, 2012
Brain pacemakers have a long-term effect in patients with the most severe depression. This has now been proven by scientists from the Bonn University Medical Center. Eleven patients took part in the study over a period of two to five years. A lasting reduction in symptoms of more than 50 percent was seen in nearly half of the subjects. The results are now being presented in the current edition of the journal Neuropsychopharmacology.
People with severe depression are constantly despondent, lacking in drive, withdrawn and no longer feel joy. Most suffer from anxiety and the desire to take their own life. Approximately one out of every five people in Germany suffers from depression in the course of his/her life – sometimes resulting in suicide. People with depression are frequently treated with psychotherapy and medication. “However, many patients are not helped by any therapy,” says Prof. Dr. Thomas E. Schläpfer from the Bonn University Medical Center for Psychiatry and Psychotherapy. “Many spend more than ten years in bed – not because they are tired, but because they have no drive at all and they are unable to get up.”
One possible alternative is “deep brain stimulation,” in which electrodes are implanted in the patient’s brain. The target point is the nucleus accumbens - an area of the brain known as the gratification center. There, a weak electrical current stimulates the nerve cells. Brain pacemakers of this type are often used today by neurosurgeons and neurologists to treat ongoing muscle tremors in Parkinson’s disease.
A 2009 study proved an antidepressive effect
In 2009, the Bonn scientists were able to establish that brain pacemakers also demonstrate an effect in the most severely depressed patients. Ten subjects who underwent implantation of electrodes in the nucleus accumbens all experienced relief of symptoms. Half of the subjects had a particularly noticeable response to the stimulation by the electrodes.
"In the current study, we investigated whether these effects last over the long term or whether the effects of the deep brain stimulation gradually weaken in patients," says Prof. Schläpfer. There are always relapses in the case of psychotherapy or drug treatment. Many patients had already undergone up to 60 treatments with psychotherapy, medications and electroconvulsive therapy, to no avail. "By contrast, in the case of deep brain stimulation, the clinical improvement continues steadily for many years." The scientists observed a total of eleven patients over a period of two to five years. "Those who initially responded to the deep brain stimulation are still responding to it even today," says the Bonn psychiatrist, summarizing the results. During the study, one patient committed suicide. "That is very unfortunate," says Prof. Schläpfer. "However, this cannot always be prevented in the case of patients with very severe depression."
The current study shows that the positive effects last for years
Even after a short amount of time, the study participants demonstrated an improvement in symptoms. “The intensity of the anxiety symptoms decreased and the subjects’ drive improved,” reports the psychiatrist. “After many years of illness, some were even able to work again.” With the current publication, the scientists have now demonstrated that the positive effects do not decrease over a longer period of time. “An improvement in symptoms was recorded for all subjects; for nearly half of the subjects, the extent of the symptoms was more than 50 percent below that of the baseline, even years after the start of treatment,” says Prof. Schläpfer. “There were no serious adverse effects of the therapy recorded.”
The long-term effect is now confirmed with the current study. How precisely the electrical stimulation is able to alter the function of the nucleus accumbens is not yet known. “Research is still needed in this area,” says Prof. Schläpfer. “Using imaging techniques, it was proven that the electrodes actually activate the nucleus accumbens.” The deep brain stimulation method may signify hope for people who suffer from the most severe forms of depressive diseases. “However, it will still take quite a bit of time before this therapeutic method becomes a part of standard clinical practice,” says the Bonn scientist.
Provided by University of Bonn
Source: medicalxpress.com
April 4, 2012
Awakening from anesthesia is often associated with an initial phase of delirious struggle before the full restoration of awareness and orientation to one’s surroundings. Scientists now know why this may occur: primitive consciousness emerges first. Using brain imaging techniques in healthy volunteers, a team of scientists led by Adjunct Professor Harry Scheinin, M.D., from the University of Turku, Finland, in collaboration with investigators from the University of California, Irvine, has now imaged the process of returning consciousness after general anesthesia. The emergence of consciousness was found to be associated with activations of deep, primitive brain structures rather than the evolutionarily younger neocortex.

Returning from oblivion — imaging the neural core of consciousness. Positron emission tomography (PET) findings show that the emergence of consciousness after anesthesia is associated with activation of deep, phylogenetically old brain structures rather than the neocortex. Left: Sagittal (top) and axial (bottom) sections show activation in the anterior cingulate cortex (i), thalamus (ii) and the brainstem (iii) locus coeruleus/parabrachial area overlaid on magnetic resonance image (MRI) slices. Right: Cortical renderings show no evident activations. Credit: Turku PET Center
These results may represent an important step forward in the scientific explanation of human consciousness.
"We expected to see that the outer bits of the brain, the cerebral cortex (often thought to be the seat of higher human consciousness), would turn back on when consciousness was restored following anesthesia. Surprisingly, that is not what the images showed us. In fact, the central core structures of the more primitive brain, including the thalamus and parts of the limbic system, appeared to become functional first, suggesting that a foundational primitive conscious state must be restored before higher-order conscious activity can occur," Scheinin said.
Twenty young healthy volunteers were put under anesthesia in a brain scanner using either dexmedetomidine or propofol as the anesthetic drug. The subjects were then woken up while brain activity pictures were being taken. Dexmedetomidine is used as a sedative in the intensive care unit setting and propofol is widely used for induction and maintenance of general anesthesia. Dexmedetomidine-induced unconsciousness has a close resemblance to normal physiological sleep, as it can be reversed with mild physical stimulation or loud voices without requiring any change in the dosing of the drug. This unique property was critical to the study design, as it enabled the investigators to separate the brain activity changes associated with the changing level of consciousness from the drug-related effects on the brain. The state-related changes in brain activity were imaged with positron emission tomography (PET).
The emergence of consciousness, as assessed with a motor response to a spoken command, was associated with the activation of a core network involving subcortical and limbic regions that became functionally coupled with parts of the frontal and inferior parietal cortices upon awakening from dexmedetomidine-induced unconsciousness. This network thus enabled the subjective awareness of the external world and the capacity to behaviorally express the contents of consciousness through voluntary responses.
Interestingly, the same deep brain structures, i.e. the brain stem, thalamus, hypothalamus and the anterior cingulate cortex, were also activated upon emergence from propofol anesthesia, suggesting a common, drug-independent mechanism of arousal. For both drugs, activations seen upon regaining consciousness were thus mostly localized in deep, phylogenetically old brain structures rather than in the neocortex.
The researchers speculate that because current depth-of-anesthesia monitoring technology is based on cortical electroencephalography (EEG) measurement (i.e., measuring electrical signals on the surface of the scalp that arise from the brain’s cortical surface), their results help to explain why these devices fail in differentiating the conscious and unconscious states and why patient awareness during general anesthesia may not always be detected. The results presented here also add to the current understanding of anesthesia mechanisms and form the foundation for developing more reliable depth-of-anesthesia technology.
The anesthetized brain provides new views into the emergence of consciousness. Anesthetic agents are clinically useful for their remarkable property of being able to manipulate the state of consciousness. When given a sufficient dose of an anesthetic, a person will lose the precious but mysterious capacity of being aware of one’s own self and the surrounding world, and will sink into a state of oblivion. Conversely, when the dose is lightened or wears off, the brain almost magically recreates a subjective sense of being as experience and awareness returns. The ultimate nature of consciousness remains a mystery, but anesthesia offers a unique window for imaging internal brain activity when the subjective phenomenon of consciousness first vanishes and then re-emerges. This study was designed to give the clearest picture so far of the internal brain processes involved in this phenomenon.
The results may also have broader implications. The demonstration of which brain mechanisms are involved in the emergence of the conscious state is an important step forward in the scientific explanation of consciousness. Yet, much harder questions remain. How and why do these neural mechanisms create the subjective feeling of being, the awareness of self and environment, the state of being conscious?
Provided by Academy of Finland
Source: medicalxpress.com
ScienceDaily (Apr. 4, 2012) — University of Oregon scientists collaborating with an Oregon company that synthesizes antisense Morpholinos for genetic research have developed a UV light-activated on-off switch for the vital gene-blocking molecule. Based on initial testing in zebrafish embryos, the enhanced molecule promises to deliver new insights for developmental biologists and brain researchers.
The seven-member team describes the advancement in an open-access paper published in the May issue of the journal Development. UO neuroscientist Philip Washbourne, a professor of biology, says the paper is a “proof-of-concept” on an idea he began discussing with scientists at Gene Tools LLC in Philomath, Ore., about four years ago. Gene Tools was founded in the 1980s by James Summerton, who first invented Morpholino oligos. The company holds the exclusive license to distribute these molecules to researchers around the world.
Morpholinos are short-chain, artificially produced oligomers that bind to RNA in cells and block protein synthesis. For a decade, biologists have used them in zebrafish, mice and African clawed frogs to study development, but they remained in the active, or on, position. Gene Tools created and introduced a light-sensitive linker, allowing researchers to control the molecule — even leaving it on in one cell and off in an adjacent cell — with a pinpoint UV laser beam.
Researchers in Washbourne’s lab — led by neuroscience research associate Alexandra Tallafuss — were challenged to give the new molecules a test run. They applied them to their work in zebrafish. “Now we can turn them on and off,” Washbourne said. “You can insert them and then manipulate them to learn just when a gene is important, and we learned two things right away.”
Researchers have known that if a gene known as “no tail” is blocked in development, zebrafish fail to grow tails. They now know that the no-tail gene does not need to produce protein for tail formation until about 10 hours, or very late, into an embryo’s development.
Secondly, the researchers looked at the gene sox10, which is vital in the formation of neural crest cells, which give rise to dorsal root ganglion cells — neurons that migrate out of the spinal cord — and pigment cells. “Again, we found that sox10 is not needed as early in development as theorized,” Washbourne said.
"These light-sensitive molecules significantly expand the power and precision of molecular genetic studies in zebrafish," said Robert Riddle, a program director at the National Institute of Neurological Disorders and Stroke (NINDS). "Researchers from many fields will be able to use these tools to explore the function of different genes in embryonic regions, specific cell types and at precise times in an animal’s lifespan."
The NINDS and National Institute of Child Health and Human Development, both at the National Institutes of Health, supported the research through grants to Washbourne and Eisen.
"This successful collaboration between our scientists and this Oregon-based company shows that commercial innovation can come quickly by jointly addressing common needs," said Kimberly Andrews Espy, vice president for research and innovation at the UO. "This is a remarkable example of turning a concept into a working tool that likely will benefit many researchers around the world."
Source: Science Daily
April 3rd, 2012
Working in mice, scientists at Washington University School of Medicine in St. Louis have devised a treatment that prevents the optic nerve injury that occurs in glaucoma, a neurodegenerative disease that is a leading cause of blindness.
Researchers increased the resistance of optic nerve cells to damage by repeatedly exposing the mice to low levels of oxygen similar to those found at high altitudes. The stress of the intermittent low-oxygen environment induces a protective response called tolerance that makes nerve cells — including those in the eye — less vulnerable to harm.
The study, published online in Molecular Medicine, is the first to show that tolerance induced by preconditioning can protect against a neurodegenerative disease.
Stress is typically thought of as a negative phenomenon, but senior author Jeffrey M. Gidday, PhD, associate professor of neurological surgery and ophthalmology, and others have previously shown that the right kinds of stress, such as exercise and low-oxygen environments, can precondition cells and induce changes that make them more resistant to injury and disease.
Scientists previously thought tolerance in the central nervous system only lasted for a few days. But last year Gidday developed a preconditioning protocol that extended the effects of tolerance from days to months. By exposing mice to hypoxia, or low oxygen concentrations, several times over a two-week period, Gidday and colleagues triggered an extended period of tolerance. After preconditioning ended, the brain was protected from stroke damage for at least eight weeks.
“Once we discovered tolerance could be extended, we wondered whether this protracted period of injury resistance could also protect against the slow, progressive loss of neurons that characterizes neurodegenerative diseases,” Gidday says.
To find out, Gidday turned to an animal model of glaucoma, a condition linked to increases in the pressure of the fluid that fills the eye. The only treatments for glaucoma are drugs that reduce this pressure; there are no therapies designed to protect the retina and optic nerves from harm.
Scientists classify glaucoma as a neurodegenerative disease based on how slowly and progressively it kills retinal ganglion cells. The bodies of these cells are located in the retina of the eye; their branches or axons come together in bundles and form the optic nerves. Scientists don’t know if damage begins in the bodies or axons of the cells, but as more and more retinal ganglion cells die, patients experience peripheral vision loss and eventually become blind.
For the new study, Yanli Zhu, MD, research instructor in neurosurgery, induced glaucoma in mice by tying off vessels that normally allow fluid to drain from the eye. This causes pressure in the eye to increase. Zhu then assessed how many cell bodies and axons of retinal ganglion cells were intact after three or 10 weeks.
The investigators found that normal mice lost an average of 30 percent of their retinal ganglion cell bodies after 10 weeks of glaucoma. But mice that received the preconditioning before glaucoma-inducing surgery lost only 3 percent of retinal ganglion cell bodies.
“We also showed that preconditioned mice lost significantly fewer retinal ganglion cell axons,” Zhu says.
Gidday is currently investigating which genes are activated or repressed by preconditioning. He hopes to identify the changes in gene activity that make cells resistant to damage.
“Previous research has shown that there are literally hundreds of survival genes built into our DNA that are normally inactive,” Gidday says. “When these genes are activated, the proteins they encode can make cells much less vulnerable to a variety of injuries.”
Identifying specific survival genes should help scientists develop drugs that can activate them, according to Gidday.
Neurologists are currently conducting clinical trials to see if stress-induced tolerance can reduce brain damage after acute injuries like stroke, subarachnoid hemorrhage or trauma.
Gidday hopes his new finding will promote studies of tolerance’s potential usefulness in animal models of Parkinson’s disease, Alzheimer’s disease and other neurodegenerative conditions.
“Neurons in the central nervous system appear to be hard-wired for survival,” Gidday says. “This is one of the first steps in establishing a framework for how we can take advantage of that metaphorical wiring and use positive stress to help treat a variety of neurological diseases.”
Source: Neuroscience News
April 3, 2012
Researchers at the University of Montreal’s Sainte-Justine Hospital have identified how neural cells in our bodies are able to build up resistance to opioid pain drugs within hours. Humans have known about the usefulness of opioids, which are often harvested from poppy plants, for centuries, but we have had very little insight into how they lose their effectiveness in the hours, days and weeks following the first dose.
"Our study revealed cellular and molecular mechanisms within our bodies that enable us to develop resistance to this medication, or what scientists call drug tolerance," lead author Dr. Graciela Pineyro explained. "A better understanding of these mechanisms will enable us to design drugs that avoid tolerance and produce longer therapeutic responses."
The research team looked at how drug molecules would interact with molecules called “receptors” that exist in every cell in our body. Receptors, as the name would suggest, receive “signals” from the chemicals that they come into contact with, and the signals then cause the various cells to react in different ways. They sit on the cell membrane and wait for corresponding chemicals known as receptor ligands to interact with them. “Until now, scientists have believed that ligands acted as ‘on-off’ switches for these receptors, all of them producing the same kind of effect with variations in the magnitude of the response they elicit,” Pineyro explained. “We now know that drugs that activate the same receptor do not always produce the same kind of effects in the body, as receptors do not always recognize drugs in the same way. Receptors will configure different drugs into specific signals that will have different effects on the body.”
Pineyro is attempting to tease apart the “painkilling” function of opioids from the part that triggers mechanisms that enable tolerance to build up. “My laboratory and my work are mostly structured around rational drug design, and trying to define how drugs produce their desired and non-desired effects, so as to avoid the latter,” Pineyro said. “If we can understand the chemical mechanisms by which drugs produce therapeutic and undesired side effects, we will be able to design better drugs.”
Once activated by a drug, receptors move from the surface of the cell to its interior, and once they have completed this ‘journey’, they can either be destroyed or return to the surface to be used again through a process known as “receptor recycling.” By comparing two types of opioids – DPDPE and SNC-80 – the researchers found that the ligands that encouraged recycling produced less analgesic tolerance than those that didn’t. “We propose that the development of opioid ligands that favour recycling could be a way of producing longer-acting opioid analgesics,” Pineyro said.
Provided by University of Montreal
Source: medicalxpress.com
April 3, 2012
Epilepsy affects 50 million people worldwide, but in a third of these cases, medication cannot keep seizures from occurring. One solution is to shoot a short pulse of electricity to the brain to stamp out the seizure just as it begins to erupt. But brain implants designed to do this have run into a stubborn problem: too many false alarms, triggering unneeded treatment. To solve this, Johns Hopkins biomedical engineers have devised new seizure detection software that, in early testing, significantly cuts the number of unneeded pulses of current that an epilepsy patient would receive.

Sridevi Sarma’s research focuses on a system with three components: electrodes implanted in the brain, which are connected by wires to a neurostimulator or battery pack, and a sensing device, also located in the brain implant, which detects when a seizure is starting and activates the current to stop it. Credit: Greg Stanley/JHU
Sridevi V. Sarma, an assistant professor of biomedical engineering, is leading this effort to improve anti-seizure technology that sends small amounts of current into the brain to control seizures.
"These devices use algorithms — a series of mathematical steps — to figure out when to administer the treatment," Sarma said. "They’re very good at detecting when a seizure is about to happen, but they also produce lots of false positives, sometimes hundreds in one day. If you introduce electric current to the brain too often, we don’t know what the health impacts might be. Also, too many false alarms can shorten the life of the battery that powers the device, which must be replaced surgically."
Her new software was tested on real-time brain activity recordings collected from four patients with drug-resistant epilepsy who experienced seizures while being monitored. In a study published recently in the journal Epilepsy & Behavior, Sarma’s team reported that its system yielded superior results, including flawless detection of actual seizures and up to 80 percent fewer alarms when a seizure was not occurring. Although the testing was not conducted on patients in a clinical setting, the results were promising.
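The performance figures reported here correspond to two standard detector metrics: sensitivity (the fraction of true seizures caught) and the relative reduction in false alarms. As a minimal illustration only — the onset times and alarm counts below are hypothetical, not data from Sarma's study — the two numbers can be computed like this:

```python
# Illustration only: hypothetical detection data, not from the study.
# "Sensitivity" is the fraction of true seizure onsets the detector
# flags; "false-alarm reduction" is the relative drop in spurious
# detections versus a baseline algorithm.

def sensitivity(true_onsets, detections):
    """Fraction of actual seizure onsets that appear among detections."""
    hits = sum(1 for t in true_onsets if t in detections)
    return hits / len(true_onsets)

def false_alarm_reduction(baseline_false_alarms, new_false_alarms):
    """Relative drop in false alarms versus the baseline detector."""
    return (baseline_false_alarms - new_false_alarms) / baseline_false_alarms

# Hypothetical example: all three seizures caught, and false alarms
# cut from 100 per day to 20.
true_onsets = [101, 250, 407]           # seizure onset times (seconds)
detections  = [33, 101, 250, 407, 512]  # detector firings; 2 are spurious

print(sensitivity(true_onsets, detections))  # → 1.0 (flawless detection)
print(false_alarm_reduction(100, 20))        # → 0.8 (80% fewer alarms)
```

The trade-off the article describes is exactly the tension between these two numbers: a detector can trivially reach perfect sensitivity by firing constantly, so the achievement is keeping sensitivity at 100 percent while cutting the false-alarm count.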
ScienceDaily (Apr. 3, 2012) — Depressed individuals with a tendency to ruminate on negative thoughts, i.e. to repeatedly think about particular negative thoughts or memories, show different patterns of brain network activation compared to healthy individuals, scientists report in a new study in Biological Psychiatry.
The risk for depression is increased in individuals with a tendency towards negative ruminations, but patterns of autobiographic memory also may be predictive of depression.
When asked to recall specific events, some individuals have a tendency to recall broader categories of events instead of specific events. This is termed overgeneral memory and, like those who tend to ruminate, these individuals also have a higher risk of developing depression.
These self-referential activities engage a network of brain regions called the default mode network, or DMN. Prior studies using imaging techniques have already shown that the DMN activates abnormally in individuals with depression, but the relationship between DMN activity and depressive ruminations was not clear.
In this new report, Dr. Shuqiao Yao of Central South University in Hunan, China and colleagues evaluated DMN functional connectivity in untreated young adults experiencing their first episode of major depression and healthy volunteers. Each participant underwent a brain scan and completed tests to measure their levels of rumination and overgeneral memory.
As expected, the depressed patients exhibited higher levels of rumination and overgeneral memory than did the control subjects. The researchers also observed increased functional connectivity in the anterior medial cortex regions and decreased functional connectivity in the posterior medial cortex regions in depressed patients compared with control subjects.
Among the depressed subjects, an interesting pattern of dissociation emerged. The increased connectivity in anterior regions was positively associated with rumination, while the decreased connectivity in posterior regions was negatively associated with overgeneral memory.
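"Functional connectivity" in studies like this is typically quantified as the correlation between the activity time courses of two brain regions: signals that rise and fall together yield a high positive value. The sketch below uses simulated signals, not the study's fMRI data, and the coupling strength is an arbitrary assumption made purely for illustration:

```python
import math
import random

def pearson(x, y):
    """Pearson correlation between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

random.seed(0)
# Simulated "regional" time courses: region_b partly follows region_a,
# so the pair should show positive functional connectivity.
region_a = [random.gauss(0, 1) for _ in range(200)]
region_b = [0.7 * a + random.gauss(0, 1) for a in region_a]

r = pearson(region_a, region_b)
print(round(r, 2))  # a clearly positive r: the regions are "coupled"
```

In resting-state studies the same computation is run over many region pairs, and it is the pattern of these r values (stronger anteriorly, weaker posteriorly in the patients here) that the researchers compare between groups.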
Dr. Yao commented on the importance of these findings: “In the future, resting-state network activity in the brain will provide useful models for investigating network features of cognitive dysfunction in psychopathology.”
"As we dig deeper in brain imaging studies, we are becoming increasingly interested in the activity of brain circuits rather than single brain regions," said Dr. John Krystal, Editor of Biological Psychiatry. “Although it is a more complicated process, studying brain circuits may provide greater insight into symptoms, such as depressive ruminations. The current study nicely illustrates how altered activity at different sites within a brain network may be related to different features of depression.”
Source: Science Daily
April 3, 2012
Approximately 6 million Americans have brain aneurysms, a condition that occurs when a weak or thin spot develops on a blood vessel in the brain, causing it to balloon. Often, these do not cause symptoms and go undetected, but every year an estimated 30,000 Americans experience a ruptured aneurysm that bleeds into the brain, causing a life-threatening injury. Immediate medical treatment is necessary to prevent stroke, nerve damage or death, and includes surgery or coiling. Coiling is an approach that blocks blood flow to the aneurysm by filling it with platinum coils. While less invasive than surgery, the likelihood of future aneurysm recurrence and subsequent treatment is higher with coiling. In an effort to lower the risk for repeat aneurysm treatment after coiling, Northwestern Medicine researchers are examining a new type of gel-coated coil to determine if it is more effective than the standard bare coils in preventing aneurysm recurrence.
Aneurysms can be a very serious health threat, according to Bernard R. Bendok, MD, a neurosurgeon at Northwestern Memorial Hospital, who is the principal investigator for the new generation Hydrogel Endovascular Aneurysm Treatment Trial (HEAT). “When an aneurysm needs treatment, it is important to perform the safest, most effective and most durable treatment. This clinical research trial, called HEAT, will help us determine whether bare platinum coils, which have been used for years, or the newer gel-coated coils are more effective long-term,” said Bendok, who is also an associate professor of neurological surgery and radiology at Northwestern University Feinberg School of Medicine.
Coiling involves inserting a catheter into an artery and threading it through the body using live x-rays as a guide to the site of the aneurysm. Coils are passed through the catheter and released into the aneurysm filling it to block blood from entering. Blood clots then form around the coil preventing the vessels from rupturing or leaking and destroying the aneurysm.
"Coils are not always able to fill the aneurysm completely, which leaves dead space in the aneurysm. This space has been associated with a higher rate of aneurysm recurrence," explained Bendok. "The new coils are made with platinum and a hydrogel that expands over time to eliminate the space between the coils, potentially limiting the need for future treatment."
HEAT is an international randomized study that seeks to determine how the gel-coated coils measure up to the standard option in preventing future aneurysm recurrence. Northwestern is the lead site for the trial. Patients may be eligible for the trial if they are between the ages of 18 and 75 years with aneurysms 3 to 14 mm in size that are amenable to coiling. An estimated 30 sites around the world are expected to join the trial, which has an enrollment goal of 600 participants.
On average, aneurysms affect about one percent of the adult population. Understanding symptoms and risk factors can be potentially lifesaving. Small aneurysms may not be associated with symptoms, but a larger, growing aneurysm may cause pressure on tissues and nerves, leading to symptoms including headache, pain above and behind the eye, a dilated pupil, double vision, and weakness, numbness or paralysis on one side of the face or body.
"In many cases, brain aneurysms remain silent until there’s a major problem," said Bendok. "Most are not found until they rupture or are found incidentally on brain images taken to assess another condition. The number one sign to look for is a sudden and extremely severe headache. If this occurs, one should seek immediate medical attention."
Other indicators that a person may have a ruptured aneurysm include double vision, nausea, vomiting, stroke-like symptoms, stiff neck, loss of consciousness and in some cases, seizure and changes in memory. Risk factors include hypertension, alcohol and drug abuse, and smoking. Aneurysms can be influenced by genetic factors and family history may be an indication for screening. People with certain hereditary diseases including connective tissue disorders or polycystic kidney disease can have a higher occurrence. Other associations include arteriovenous malformation (AVM) and blockage of certain blood vessels in the brain. Women are more likely than men to have brain aneurysms. It’s estimated about 10 in every 100,000 people will experience a ruptured aneurysm each year.
"Brain aneurysm rupture can be very devastating," said H. Hunt Batjer, MD, chairman of the department of neurological surgery at Northwestern Memorial and Michael J. Marchese Professor of neurological surgery at the Feinberg School. "It’s important to know what to look for and who might be at increased risk for aneurysm disease. While current treatments are effective, trials like HEAT have the potential to advance the art and science of brain aneurysm treatment and lead to even better treatment options in the future."
Provided by Northwestern Memorial Hospital
Source: medicalxpress.com
April 3, 2012 By Miles O’Brien and Jon Baime
(Medical Xpress) — It’s a chilling thought—losing the sense of sight because of severe injury or damage to the brain’s visual cortex. But, is it possible to train a damaged or injured brain to “see” again after such a catastrophic injury? Yes, according to Tony Ro, a neuroscientist at the City College of New York, who is artificially recreating a condition called blindsight in his lab.
"Blindsight is a condition that some patients experience after having damage to the primary visual cortex in the back of their brains. What happens in these patients is they go cortically blind, yet they can still discriminate visual information, albeit without any awareness," explains Ro.
While no one is ever going to say blindsight is 20/20, Ro says it holds tantalizing clues to the architecture of the brain. “There are a lot of areas in the brain that are involved with processing visual information, but without any visual awareness,” he points out. “These other parts of the brain receive input from the eyes, but they’re not allowing us to access it consciously.”
With support from the National Science Foundation’s (NSF) Directorate for Social, Behavioral and Economic Sciences, Ro is developing a clearer picture of how other parts of the brain, besides the visual cortex, respond to visual stimuli.
In order to recreate blindsight, Ro must find a volunteer who is willing to temporarily be blinded by having a powerful magnetic pulse shot right into their visual cortex. The magnetic blast disables the visual cortex and blinds the person for a split second. “That blindness occurs very shortly and very rapidly—on the order of one twentieth of a second or so,” says Ro.
On the day of Science Nation’s visit to Ro’s lab in the Hamilton Heights section of Manhattan, volunteer Lei Ai is seated in a small booth in front of a computer with instructions to keep his eyes on the screen. A round device is placed on the back of Ai’s head. Then, the booth is filled with the sound of consistent clicks, about two seconds apart. Each click is a magnetic pulse disrupting the activity in his visual cortex, blinding him. Just as the pulse blinds him, a shape, such as a diamond or a square, flashes onto a computer screen in front of him.
Ro says that 60 to nearly 100 percent of the time, test subjects report back the shape correctly. “They’ll be significantly above chance levels at discriminating those shapes, even though they’re unaware of them. Sometimes they’re nearly perfect at it,” he adds.
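The "above chance" claim can be checked with a simple one-sided binomial test: if a subject were guessing between two shapes at random, how likely is a hit rate this high? A minimal sketch; the function name, the trial count, and the two-alternative 50% chance level are illustrative assumptions, not details reported from Ro's experiments:

```python
from math import comb

def above_chance_p(correct, trials, chance=0.5):
    """One-sided binomial test: probability of getting at least
    `correct` hits out of `trials` guesses by pure chance."""
    return sum(comb(trials, k) * chance**k * (1 - chance)**(trials - k)
               for k in range(correct, trials + 1))

# E.g. a subject naming the masked shape (diamond vs. square, so
# chance = 50%) correctly on 70 of 100 trials:
p = above_chance_p(70, 100)
print(f"p = {p:.5f}")  # well below 0.05, i.e. above chance
```

A hit rate near the reported 60 percent already reaches significance with enough trials, which is how performance can be "significantly above chance" even without conscious awareness.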
Ro observes what happens to other areas of Ai’s brain during the instant he is blinded and a shape is flashed on the screen. While the blindness wears off immediately with no lasting effects, according to Ro, the findings are telling. “There are likely to be a lot of alternative visual pathways that go into the brain from our eyes that process information at unconscious levels,” he says.
Ro believes understanding and mapping those alternative pathways might be the key to new rehabilitative therapies. “We have a lot of soldiers returning home who have a lot of brain damage to visual areas of the brain. We might be able to rehabilitate these patients,” he says. And that’s something worth looking into.
Provided by National Science Foundation
Source: medicalxpress.com
April 3, 2012
The brains of people with anorexia and obesity are wired differently, according to new research. Neuroscientists for the first time have found that how our brains respond to food differs across a spectrum of eating behaviors – from extreme overeating to food deprivation. This study is one of several new approaches to help better understand and ultimately treat eating disorders and obesity.
Eating disorders have the highest mortality rate of any mental illness. And more than two-thirds of the U.S. population are overweight or obese – a health factor associated with cardiovascular issues, diabetes, and cancer. “This body of work not only increases our understanding of the relationship between food and brain function but can also inform weight loss programs,” says Laura Martin of Hoglund Brain Imaging Center at the University of Kansas Medical Center, one of several researchers whose work is being presented today at a meeting of cognitive neuroscientists in Chicago.
"One of the most intriguing aspects of these studies of the brain on food," Martin says, is that they show "consistent activations of reward areas of the brain that are also implicated in studies of addiction." However, how those reward areas respond to food differs between people depending on their eating behaviors, according to the new brain imaging study by Laura Holsen of Harvard Medical School and Brigham and Women’s Hospital and colleagues.
Holsen’s team conducted fMRI brain scans of individuals with one of three eating conditions – anorexia nervosa, simple obesity, and Prader-Willi syndrome (extreme obesity) – as well as healthy control subjects. When hungry, those with anorexia, who severely restrict their food intake, showed substantially decreased responses to various pictures of food in regions of their brains associated with reward and pleasure. For those who chronically overeat, there were significantly increased responses in those same brain regions.
"Our findings provide evidence of an overall continuum relating food intake behavior and weight outcomes to food reward circuitry activity," Holsen says. Her work also has implications, she says, for everyday eating decisions in healthy individuals. "Even in individuals who do not have eating disorders, there are areas of the brain that assist in evaluating the reward value of different foods, which in turn plays a role in the decisions we make about which foods to eat."
Kyle Simmons of the Laureate Institute studies the neural mechanisms that govern such everyday eating decisions. His work with fMRI scans has found that as soon as people see food, their brains automatically gather information about how they think it will taste and how that will make them feel. The brain scans showed an apparent overlap in the region on the insula that responds to seeing food pictures and the region of the insula that processes taste, the “primary gustatory cortex.”
Simmons is currently expanding this work to better understand the differences in taste preferences between lean, healthy individuals and obese ones. “We simply don’t know yet if differences exist between lean and obese participants,” he says. “And knowing which brain regions underlie inferences about food taste and reward is critical if we are going to develop efficacious interventions for obesity and certain eating disorders, both of which are associated with enormous personal and public health costs.”
Provided by Cognitive Neuroscience Society
Source: medicalxpress.com
April 3, 2012
For children with autism, being born several weeks early or several weeks late tends to increase the severity of their symptoms, according to new research out of Michigan State University.
Additionally, autistic children who were born either preterm or post-term are more likely to injure themselves than autistic children born on time, revealed the study by Tammy Movsas of MSU’s Department of Epidemiology.
Though the study did not uncover why there is an increase in autistic symptoms, the reasons may be tied to some of the underlying causes of why a child is born preterm (prior to 37 weeks) or post-term (after 42 weeks) in the first place.
The research appears online in the Journal of Autism and Developmental Disorders.
Movsas, a postdoctoral epidemiology fellow in MSU’s College of Human Medicine, said the study reveals there are many different manifestations of autism spectrum disorder, a collection of developmental disorders including both autism and Asperger syndrome. It also shows the length of the mother’s pregnancy is one factor affecting the severity of the disorder.
While previous research has linked premature birth to higher rates of autism, this is one of the first studies to look at the severity of the disease among autistic children who had been born early, on time and late.
“We think about autism being caused by a combination of genetic and environmental factors,” she said. “With preterm and post-term babies, there is something underlying that is altering the genetic expression of autism.
“The outside environment in which a preterm baby continues to mature is very different than the environment that the baby would have experienced in utero. This change in environment may be part of the reason why there is a difference in autistic severity in this set of infants.”
Movsas added that for post-term babies, the longer exposure to hormones while a baby is in utero, the higher chance of placental malfunction and the increased rate of C-section and instrument-assisted births may play a role.
The study also found that babies born outside of normal gestational age (40 weeks) – specifically very preterm babies – showed an increase in stereotypical autistic mannerisms.
“Normal gestation age of birth seems to mitigate the severity of autism spectrum disorder symptoms, and the types of autistic traits tend to be different depending on age at birth,” she said.
The study analyzed an online database of nearly 4,200 mothers of autistic children ages 4-21, compiled between 2006 and 2010 by the Kennedy Krieger Institute at Johns Hopkins University. It divided the births into four categories: very preterm (born prior to 34 weeks); preterm (34 to 37 weeks); standard (37 to 42 weeks); and post-term (born after 42 weeks).
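The study's four gestational-age bins can be expressed as a small lookup function. This is only an illustrative sketch: the exact handling of the shared boundary weeks (34, 37, 42) is an assumption, since the article lists the ranges with overlapping endpoints.

```python
def gestational_category(weeks):
    """Bin gestational age at birth (in weeks) into the study's four
    categories. Boundary handling at 34, 37 and 42 weeks is assumed,
    not taken from the paper."""
    if weeks < 34:
        return "very preterm"
    elif weeks < 37:
        return "preterm"
    elif weeks <= 42:
        return "standard"
    else:
        return "post-term"

print(gestational_category(32))  # very preterm
print(gestational_category(40))  # standard
print(gestational_category(43))  # post-term
```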
The mothers filled out a pair of questionnaires regarding the symptoms of their autistic children, and the results revealed very preterm, preterm and post-term autistic children had significantly higher screening scores for autism spectrum disorder than autistic children born full term.
“The findings point to the fact that although autism has a strong genetic component, something about pregnancy or the perinatal period may affect how autism manifests,” said Nigel Paneth, an MSU epidemiologist who worked with Movsas on the paper. “This adds to our earlier finding that prematurity is a major risk factor for autism spectrum disorder and may help us understand if anything can be done during early life to prevent or alleviate autism spectrum disorder.”
Source: Neuroscience News
April 2, 2012
New research confirms that childhood onset temporal lobe epilepsy has a significant impact on brain aging. Study findings published in Epilepsia, a peer-reviewed journal of the International League Against Epilepsy (ILAE), show age-accelerated ventricular expansion outside the normal range in this patient population.
According to the Centers for Disease Control and Prevention (CDC), epilepsy affects nearly 2 million Americans. Temporal lobe epilepsy is the most common form of partial epilepsy, with 60% of all patients having this form of the disease. Previous evidence suggests that patients with childhood onset epilepsy have significant cognitive and developmental deficiencies, which continue into adulthood, particularly in those resistant to antiepileptic drugs.
Prior imaging studies of patients with temporal lobe epilepsy have shown abnormalities in brain structure in hippocampus, in thalamus and other subcortical structures, and also in cortical and white matter volume. However, there is limited knowledge of the effects of aging on these structural changes.
To characterize differences in brain structure and patterns of age-related change, Dr. Bruce Hermann and colleagues from the University of Wisconsin-Madison recruited 55 patients with chronic temporal lobe epilepsy and 53 healthy controls for their study. Participants were between the ages of 14 and 60, with patients having a mean age of epilepsy onset in childhood/adolescence. Magnetic resonance imaging (MRI) was used to measure cortical thickness, area and volume in the brains of all subjects.
In participants with epilepsy, there were extensive abnormalities in brain structure, involving subcortical regions, cerebellum and cortical gray matter thickness and volume in the temporal and extratemporal lobes. Furthermore, researchers found that increasing chronological age was associated with progressive changes in cortical, subcortical and cerebellar regions for both epilepsy subjects and healthy controls. The pattern of change was similar for both groups, but epilepsy patients always showed more extensive abnormalities. In particular, epilepsy patients displayed age-accelerated expansion of the lateral and third ventricles. “The anatomic abnormalities in patients with epilepsy indicate a significant neurodevelopmental impact,” said Dr. Hermann.
"Patients with epilepsy are burdened with significant neurodevelopmental challenges due to these cumulative brain abnormalities," concludes Dr. Hermann. "The consequences of these anatomical changes for epilepsy patients as they progress into elder years remain unknown and further study of the adverse effects in those of older chronological age is needed."
Provided by Wiley
Source: medicalxpress.com
April 2, 2012
One of the most frustrating challenges for some stroke patients can be the inability to find and speak words even if they know what they want to say. Speech therapy is laborious and can take months. New research is seeking to cut that time significantly, with the help of non-invasive brain stimulation.
"Non-invasive brain stimulation can allow painless, inexpensive, and apparently safe method for cognitive improvement with with potential long term efficacy," says Roi Cohen Kadosh of the University of Oxford. Recent results, presented this week at a meeting of cognitive neuroscientists in Chicago, offer exciting possibilities for improving variety of abilities – from speech to memory to numerical proficiency.
A focus of many of these studies is tDCS – transcranial direct current stimulation. In tDCS, researchers apply weak electrical currents to the head via electrodes for a short period of time, for example 20 minutes. The currents pass through the skull and alter spontaneous neural activity. Some types of stimulation excite the neurons, while others suppress them. Subjects usually feel only a slight tingling for less than 30 seconds. The effects of tDCS can last for up to 12 months, Cohen Kadosh says, “most likely due to molecular and cellular changes that are important mechanisms implementing learning and memory.”
Stimulating speech recovery
For Jenny Crinion of University College London, who is both a neuroscientist and clinical speech and language therapist, the interest in tDCS sprang from a desire to help stroke patients through their long recovery. While speech therapy works well at improving speech following aphasic stroke, it can be frustratingly slow. She hopes to pair brain-stimulation interventions with proven language-rehabilitation methods, Crinion says, “such that the same maximum recovery is ultimately achieved as with therapy alone but with fewer hours of rehab.”
Crinion’s current work focuses on understanding how tDCS affects the areas of the brain involved in speech production. She paired an fMRI picture-naming study with a 6-week-long tDCS and word-finding treatment study to see if brain stimulation could improve stroke patients’ speech both immediately after treatment and three months later. In the picture-naming task, people were presented with pictures of simple, everyday objects, such as a car, and asked to name them as quickly and accurately as possible.
April 2, 2012
Sleep plays a powerful role in preserving our memories. But while recent research shows that wakefulness may cloud memories of negative or traumatic events, a new study has found that wakefulness also degrades positive memories. Sleep, it seems, protects positive memories just as it does negative ones, and that has important implications for the treatment of post-traumatic stress disorder.
"The study of how sleep helps us remember and process emotional information is still young," says Alexis Chambers of the University of Notre Dame. Past work has focused on the role of negative memories for sleep, in particular how insomnia is a healthy biological response for people to reduce negative memories and emotions associated with a traumatic event.
Two new studies presented this week at a meeting of cognitive neuroscientists in Chicago are exploring the flip side: how sleep treats the positive. “Only if we investigate all the possibilities within this field will we ever fully understand the processes underlying our sleep, memory, and emotions,” Chambers says.
Protecting the positive
To test how sleep affects positive memories, Rebecca Spencer of the University of Massachusetts, Amherst, and her colleagues split 70 young adults into two groups, one that got to sleep overnight and one that had to stay awake. Both groups viewed images of positive items, such as puppies and flowers, and neutral items, such as furniture or dinner plates. The researchers then tested the participants’ memories of and emotional reactions to the images 12 hours later, after either the period of sleep or wake.
They found that “sleep enhances our emotionally positive memories while these memories decay over wake,” Spencer says. “Positive memories may even be prioritized for processing during sleep.” But while people remembered the positive images more than the neutral ones, their emotional response to the positive images did not change over sleep versus wake. “It doesn’t matter if you went to sleep or stayed awake – what you thought was a ‘9’ – really great – you still think is a ‘9’,” she says.
April 2, 2012
(Medical Xpress) — A research team including University of Wyoming neurobiologist Jeff Woodbury has discovered a new technique to determine how the touch sensory system is organized in hairy skin, providing a new understanding of the sense of touch.

Their findings were selected to appear as the feature and cover article in Cell, one of the pre-eminent international journals in the biological sciences.
The research provides the first picture of how the nerve cells that carry signals from hairs on the skin are organized. Of all the senses, touch has been the least amenable to study and remains the most poorly understood.
"We have described the system that is in place to help explain how sensory information is processed to perceive the sense of touch," says Woodbury, an associate professor in the UW Department of Zoology and Physiology. He was part of a multidisciplinary research team led by David Ginty from Johns Hopkins University. Colleen Cassidy, a doctoral student in Woodbury’s lab, was a co-author of the study, which also included colleagues from the Howard Hughes Medical Institute at Rockefeller University, University of Pennsylvania and University of Pittsburgh.
"We have also been able to identify how combinations of nerve cells respond to fine-tactile stimuli, so we can now really begin to tease apart the circuitry of touch sensation," Woodbury adds. "One of the real breakthroughs is that, for the first time in more than 200 years of study, we now know the specific functions of some of the many different kinds of nerve endings in the skin. This is truly exciting and a major advance."
Mice have several different types of hair follicles in their coat, each of which is linked to the central nervous system by low-threshold wire-like nerve cells that stretch all the way to the spinal cord. There, the myriad signals carried from the skin are integrated, processed and sent to the brain.
This network of nerve endings in the skin of most hairy mammals, including humans, allows them to perceive fine tactile sensations, such as a drop of rain or an insect landing on their skin. The researchers now have a better understanding of how this complex system is organized. Before this discovery, Woodbury says there was no way to see how all of these different nerve cells were arranged — both in the skin and at the top of the spinal cord, where they end up.
The study, Woodbury says, opens doors to understanding not only touch, but skin senses such as temperature detection and pain.
"Touch is ultimately felt in the brain; it alerts us that something is going on," he says. "We have identified the logic of how this system is organized. We now know that each individual hair is a distinct sensory organ, and each one will detect different forces. A broad spectrum of frequencies within a given stimulus are ultimately recombined and analyzed until we become aware that something has happened, like a drop of rain or a light breeze."
Once the different sensory neurons are identified, researchers could test hypotheses about the role of these cells in the process of sensation.
"For example, researchers could study the animal, in the presence or absence of each of the different types of sensory cells, to determine differences in the animal’s behavior," Woodbury says. "It will be possible to shut them off, take them out of the picture, to see how the animal responds to different types of stimulation. The key to understanding any system is first to gain a marker to identify all the different components, and we have made a major step in that direction."
Provided by University of Wyoming
Source: medicalxpress.com
April 2, 2012
New research published in the April issue of The Journal of Nuclear Medicine reveals that systemic inflammation causes an increase in depressive symptoms and metabolic changes in the parts of the brain responsible for mood and motivation. With this finding, researchers can begin to test potential treatments for depression in patients whose symptoms are related to inflammation in the body or within the brain.
Multiple studies in rodents have shown that inflammation in the body has effects on the brain. This has also been shown in a few human studies—both through measurements of behavioral changes and brain imaging—when subjects were engaged in various computer tasks. The study “Glucose Metabolism in the Insula and Cingulate Is Affected by Systemic Inflammation in Humans,” however, for the first time measured brain activity when subjects were at rest.
"In the study we used F-18 fluorodeoxyglucose (FDG) positron emission tomography (PET), which can accurately measure glucose metabolism in the brain, to determine which brain regions responded to systemic inflammation. Since the subjects were at rest, the changes we observed in the brain can only attributed to systemic inflammation," noted Jonas Hannestad, MD, PhD, lead author of the article.
In the study, nine healthy individuals received endotoxin (which elicits systemic inflammation and mild depressive symptoms such as fatigue and reduced social interest) and placebo on different days, in a double-blind design. After each administration, F-18 FDG PET was used to measure the differences in the cerebral metabolic rate of glucose in the insula, cingulate and amygdala regions of the brain. Behavioral changes were assessed primarily with the Montgomery-Asberg Depression Rating Scale (MADRS).
A statistical analysis of the results showed that endotoxin administration was associated with a higher normalized glucose metabolism (NMG) in the insula and lower NMG in the cingulate compared to the placebo; there was no significant difference in the NMG in the amygdala. Seven of nine subjects had an increase in NMG in the insula and a decrease in NMG in the cingulate, and all nine subjects had a decrease in NMG in the right anterior cingulate, suggesting that systemic inflammation induces fundamental physiologic changes in regional brain glucose metabolism. In addition, the MADRS increased for each subject after endotoxin administration, whereas no significant change was noted with the placebo.
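A within-subject design like this, where each person is scanned under both endotoxin and placebo, is typically analyzed with a paired test on the per-subject differences. A minimal sketch of that idea; the NMG values below are entirely hypothetical illustrations, not data from the study:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(condition_a, condition_b):
    """Paired t statistic on per-subject differences (a minus b)
    for one brain region, measured under two conditions."""
    diffs = [a - b for a, b in zip(condition_a, condition_b)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Hypothetical insula NMG values for nine subjects on each day:
insula_endo    = [1.05, 1.08, 1.02, 1.10, 1.04, 1.07, 1.03, 1.06, 0.99]
insula_placebo = [1.00, 1.01, 1.03, 1.02, 1.00, 1.01, 0.99, 1.00, 1.00]
t = paired_t(insula_endo, insula_placebo)
print(f"t({len(insula_endo) - 1}) = {t:.2f}")
```

A large positive t here would correspond to the reported pattern of higher insula NMG under endotoxin; the same comparison run on the cingulate values would be expected to come out negative.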
Most researchers agree that depression is not a homogeneous disease, but rather that there are multiple mechanisms that can lead to similar symptoms. “If we can show that a subtype of depression is caused in part by inflammation,” said Hannestad, “we can test the ability of treatments that reduce inflammation only in patients in whom we believe inflammation plays a role. In the future, I expect that researchers in this field will be able to develop more precise PET measures that can be used to distinguish between, for instance, a person with ‘inflammatory depression’ and a person with another kind of depression. PET could then be used as a diagnostic biomarker to separate subtypes of depression and as a therapeutic biomarker to detect the response to treatment.”
Nearly 17 percent of adults experience depression at some point over their lifetime, with 30.4 percent of cases classified as severe, according to the U.S. National Institute of Mental Health. Fifty-seven percent of adults with depression report receiving treatment in the past 12 months, although 37.8 percent receive minimally adequate treatment.
Provided by Society of Nuclear Medicine
Source: medicalxpress.com
April 2, 2012
A team of University of Pittsburgh mathematicians is using computational models to better understand how the structure of neural variability relates to such functions as short-term memory and decision making. In a paper published online April 2 in Proceedings of the National Academy of Sciences (PNAS), the Pitt team examines how fluctuations in brain activity can impact the dynamics of cognitive tasks.
Previous recordings of neural activity during simple cognitive tasks show a tremendous amount of trial-to-trial variability. For example, when a person was instructed to hold the same stimulus in working, or short-term, memory during two separate trials, the brain cells involved in the task showed very different activity during the two trials.
"A big challenge in neuroscience is translating variability expressed at the cellular and brain-circuit level with that in cognitive behaviors," said Brent Doiron, assistant professor of mathematics in Pitt’s Kenneth P. Dietrich School of Arts and Sciences and the project’s principal investigator. "It’s a fact that short-term memory degrades over time. If you try to recall a stored memory, there likely will be errors, and these cognitive imperfections increase the longer that short-term memory is engaged."
Doiron explains that brain cells increase activity during short-term memory functions. But this activity randomly drifts over time as a result of stochastic (or chance) forces in the brain. This drifting is what Doiron’s team is trying to better understand.
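The stochastic drift Doiron describes can be illustrated with a toy random-walk model, in which a stored value accumulates small chance perturbations for as long as it is held. This is only an illustrative sketch, not the Pitt team's actual network model; all names and parameters are invented:

```python
import random

def memory_drift(target, minutes, sigma=2.0, seed=0):
    """Toy random-walk model of short-term memory drift: a stored
    value (e.g. a remembered stimulus angle, in degrees) picks up
    a small Gaussian perturbation each minute it is held."""
    random.seed(seed)
    value = target
    trace = [value]
    for _ in range(minutes):
        value += random.gauss(0.0, sigma)  # chance forces nudge the memory
        trace.append(value)
    return trace

trace = memory_drift(target=45.0, minutes=10)
errors = [abs(v - 45.0) for v in trace]
# Recall error tends to grow the longer the memory is engaged:
print(f"error after 1 min: {errors[1]:.2f}, after 10 min: {errors[10]:.2f}")
```

Because the perturbations accumulate, the expected squared error of such a walk grows linearly with holding time, mirroring the observation that short-term memory degrades the longer it is engaged.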
"As mathematicians, what we’re really trying to do is relate the structure and dynamics of this stochastic variability of brain activity to the variability in cognitive performance," said Doiron. "Linking the variability at these two levels will give important clues about the neural mechanisms that support cognition."
Using a combination of statistical mechanics and nonlinear system theory, the Pitt team examined the responses of a model of a simplified memory network proposed to be operative in the prefrontal cortex. When sources of neural variability were distributed over the entire network, as opposed to only over subsections, the performance of the memory network was enhanced. This helped the Pitt team make the prediction published in PNAS, that brain wiring affects how neural networks contend with—and ultimately express—variability in memory and decision making.
Recently, experimental neuroscientists have been gaining a better understanding of how the brain is wired, and theories like those published in PNAS by Doiron’s group give a context for their findings within a cognitive framework. The Doiron group plans to apply the general principle of linking brain circuitry to neural variability in a variety of sensory, motor, and memory/decision-making frameworks.
Provided by University of Pittsburgh
Source: medicalxpress.com
April 2, 2012
Therapy to mend parts of the brain damaged by strokes has moved a step closer, thanks to research at Monash University’s Australian Regenerative Medicine Institute (ARMI) and the Florey Neuroscience Institutes (FNI).
Scientists, James Bourne and Jihane Homman-Ludiye, of ARMI, and Tobias Merson, of FNI, have discovered precursor cells in the visual processing region of the brains of young marmoset monkeys which can form new brain cells in a culture dish.
The work, published recently in the journal PLoS ONE, raises the possibility of new therapies for victims of brain injuries such as stroke.
Commenting on the work, Stem Cells Australia’s Professor Martin Pera said “These results, which point strongly to the existence of stem cells in the primate cortex, have important implications for understanding normal brain function and add to a growing body of evidence that stem or progenitor cells may participate in the repair of injuries to this critical region of the brain.”
The team isolated a type of cell from the brain tissue of two-week-old marmoset monkeys, which have similar brains to humans.
They exposed the cells to various combinations of growth factors – proteins that promote cell proliferation – to see if the cells would multiply and form neurons in the culture dish.
Some of the cells started to multiply to form clusters of cells called neurospheres – the forerunners of mature brain cells – when treated with two specific growth factors. This puts them in a class of cells called neural progenitors. Like stem cells, these cells can convert into specialist cells to form various tissues.
It was once thought that our full complement of brain cells was fixed at birth. That view has been toppled in recent decades with the discovery of stem cells in the human brain that can form new neurons in adulthood, said Dr Merson, a neuroscientist.
But until now, those cells have been thought to be limited to two regions of the brain, including the hippocampus, which is involved in memory and learning.
The team’s breakthrough suggests that cells with the ability to form new neurons after birth are much more widespread in the brain. The cells under investigation in this latest research were isolated from the primary visual cortex, the brain structure at the back of the head involved in the processing of stimuli from the eyes. “This structure is very big in humans and other primates and is often affected by brain injury,” Dr Bourne said.
“Our results support the view that this region of the brain has the potential to generate new neurons at later stages than once thought,” Dr Merson said. “We were surprised at how easily we were able to generate the proliferating neurospheres. We were able to propagate them, and keep them in culture for up to a year.”
He said other regions of the brain involved in sensory processing could harbour similar cells.
The scientists plan further research to see if the production of new neurons after birth occurs naturally in the primary visual cortex, and whether the mechanism could be activated after injury.
“It could be plausible to manipulate the progenitor cells to produce more neurons,” Dr Bourne said.
Source: Neuroscience News
ScienceDaily (Apr. 2, 2012) — Scientists from the Florida campus of The Scripps Research Institute have shown in animal models that the loss of memory that comes with aging is not necessarily a permanent thing.
In a new study published this week in an advance online edition of the journal Proceedings of the National Academy of Sciences, Ron Davis, chair of the Department of Neuroscience at Scripps Florida, and Ayako Tonoki-Yamaguchi, a research associate in Davis’s lab, took a close look at memory and memory traces in the brains of both young and old fruit flies.
What they found is that like other organisms — from mice to humans — there is a defect that occurs in memory with aging. In the case of the fruit fly, the ability to form memories lasting a few hours (intermediate-term memory) is lost due to age-related impairment of the function of certain neurons. Intriguingly, the scientists found that stimulating those same neurons can reverse these age-related memory defects.
"This study shows that once the appropriate neurons are identified in people, in principle at least, one could potentially develop drugs to hit those neurons and rescue those memories affected by the aging process," Davis said. "In addition, the biochemistry underlying memory formation in fruit flies is remarkably conserved with that in humans so that everything we learn about memory formation in flies is likely applicable to human memory and the disorders of human memory."
While no one really understands what is altered in the brain during the aging process, in the current study the scientists were able to use functional cellular imaging to monitor the changes in the fly’s neuron activity before and after learning.
"We are able to peer down into the fly brain and see changes in the brain," Davis said. "We found changes that appear to reflect how intermediate-term memory is encoded in these neurons."
Olfactory memory, which was used by the scientists, is the most widely studied form of memory in fruit flies — the training basically pairs an odor with a mild electric shock. This conditioning produces short-term memories that persist for around half an hour, intermediate-term memories that last a few hours, and long-term memories that persist for days.
The team found that in aged animals, the signs of encoded memory were absent after a few hours. In that way, the scientists also learned exactly which neurons in the fly are altered by aging to produce intermediate-term memory impairment. This advance, Davis notes, should greatly help scientists understand how aging alters neuronal function.
Intriguingly, the scientists took the work a step further and stimulated these neurons to see if the memory could be rescued. To do this, the scientists placed either cold-activated or heat-activated ion channels in the neurons known to become defective with aging and then used cold or heat to stimulate them. In both cases, the intermediate-term memory was successfully rescued.
Source: Science Daily
ScienceDaily (Apr. 2, 2012) — An international team of researchers involving the University of Adelaide has made a major discovery that could lead to more effective treatment of severe pain using morphine.
Morphine is an extremely important drug for pain relief, but it can lead to a range of side-effects — such as patients developing tolerance to morphine and increased sensitivity to pain. Until now, how this occurs has remained a mystery.
The team from the University of Colorado and University of Adelaide has shown for the first time how opioid drugs, such as morphine, create an inflammatory response in the brain — by activating an immune receptor in the brain.
They have also demonstrated how this brain immune receptor can be blocked, laying the groundwork for the development of new therapeutic drugs that improve the effectiveness of morphine while reducing many of its problematic side effects.
The results of this research are published April 2 in the Proceedings of the National Academy of Sciences (PNAS).
"Because morphine is considered to be such an important drug in the management of moderate to severe pain in patients right around the world, we believe these results will have far-reaching benefits," says study co-author Dr Mark Hutchinson, ARC Research Fellow in the University of Adelaide’s School of Medical Sciences.
Dr Hutchinson’s team, including University of Adelaide colleague Professor Andrew Somogyi, conducted studies in mice to validate the work done at the University of Colorado by the teams of Assistant Professor Hubert Yin and Professor Linda Watkins.
"For some time it’s been assumed that the inflammatory response from morphine was being caused via the classical opioid receptors," says Dr Hutchinson.
"However, we found instead that morphine binds to an immune receptor complex called toll-like receptor 4 (TLR4), and importantly this occurs in a very similar way to how this receptor detects bacteria.
"Our experiments in mice have shown that if this relationship with the immune receptor is disrupted, it will prevent the inflammatory response.
"This is an exciting result because it opens up possibilities for future drugs that promote the beneficial actions of morphine while negating some of the harmful side effects. This could lead to major advances in patient and palliative care," he says.
Source: Science Daily
ScienceDaily (Apr. 2, 2012) — As computer scientists this year celebrate the 100th anniversary of the birth of the mathematical genius Alan Turing, who in the 1930s set out the basis for digital computing and anticipated the electronic age, they are still questing after a machine as adaptable and intelligent as the human brain.
Now, computer scientist Hava Siegelmann of the University of Massachusetts Amherst, an expert in neural networks, has taken Turing’s work to its next logical step. She is translating her 1993 discovery of what she has dubbed “Super-Turing” computation into an adaptable computational system that learns and evolves, using input from the environment in a way much more like our brains do than classic Turing-type computers. She and her post-doctoral research colleague Jeremie Cabessa report on the advance in the current issue of Neural Computation.
"This model is inspired by the brain," she says. "It is a mathematical formulation of the brain’s neural networks with their adaptive abilities." The authors show that when the model is installed in an environment offering constant sensory stimuli like the real world, and when all stimulus-response pairs are considered over the machine’s lifetime, the Super-Turing model yields an exponentially greater repertoire of behaviors than the classical computer or Turing model. They demonstrate that the Super-Turing model is superior for human-like tasks and learning.
"Each time a Super-Turing machine gets input it literally becomes a different machine," Siegelmann says. "You don’t want this for your PC. They are fine and fast calculators and we need them to do that. But if you want a robot to accompany a blind person to the grocery store, you’d like one that can navigate in a dynamic environment. If you want a machine to interact successfully with a human partner, you’d like one that can adapt to idiosyncratic speech, recognize facial patterns and allow interactions between partners to evolve just like we do. That’s what this model can offer."
Classical computers work sequentially and can only operate in the very orchestrated, specific environments for which they were programmed. They can look intelligent if they’ve been told what to expect and how to respond, Siegelmann says. But they can’t take in new information or use it to improve problem-solving, provide richer alternatives or perform other higher-intelligence tasks.
In 1948, Turing himself predicted another kind of computation that would mimic life itself, but he died without developing his concept of a machine that could use what he called “adaptive inference.” In 1993, Siegelmann, then at Rutgers, showed independently in her doctoral thesis that a very different kind of computation, vastly different from the “calculating computer” model and more like Turing’s prediction of life-like intelligence, was possible. She published her findings in Science and in a book shortly after.
"I was young enough to be curious, wanting to understand why the Turing model looked really strong," she recalls. "I tried to prove the conjecture that neural networks are very weak and instead found that some of the early work was faulty. I was surprised to find out via mathematical analysis that the neural models had some capabilities that surpass the Turing model. So I re-read Turing and found that he believed there would be an adaptive model that was stronger based on continuous calculations."
Each step in Siegelmann’s model starts with a new Turing machine that computes once and then adapts. The size of the set of natural numbers is denoted by aleph-zero (ℵ0), which also represents the number of different infinite calculations possible by classical Turing machines in a real-world environment on continuously arriving inputs. By contrast, Siegelmann’s most recent analysis demonstrates that Super-Turing computation has 2^ℵ0 possible behaviors. “If the Turing machine had 300 behaviors, the Super-Turing would have 2^300, more than the number of atoms in the observable universe,” she explains.
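A quick back-of-the-envelope check of that last claim. The figure of roughly 10^80 atoms in the observable universe is the commonly cited estimate, not a number from the article itself:

```python
# Sanity-check the quoted comparison: 2**300 versus the commonly
# cited estimate of ~10**80 atoms in the observable universe.
turing_behaviors = 300
super_turing_behaviors = 2 ** turing_behaviors   # 2**300
atoms_estimate = 10 ** 80                        # rough, commonly cited figure

print(super_turing_behaviors > atoms_estimate)   # True
print(len(str(super_turing_behaviors)))          # 91 decimal digits
```

So 2^300 is a 91-digit number, comfortably exceeding the 81-digit atom estimate, as the quote asserts.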
The new Super-Turing machine will not only be flexible and adaptable but economical. This means that when presented with a visual problem, for example, it will act more like our human brains and choose salient features in the environment on which to focus, rather than using its power to visually sample the entire scene as a camera does. This economy of effort, using only as much attention as needed, is another hallmark of high artificial intelligence, Siegelmann says.
"If a Turing machine is like a train on a fixed track, a Super-Turing machine is like an airplane. It can haul a heavy load, but also move in endless directions and vary its destination as needed. The Super-Turing framework allows a stimulus to actually change the computer at each computational step, behaving in a way much closer to that of the constantly adapting and evolving brain," she adds.
Siegelmann and two colleagues recently were notified that they will receive a grant to make the first ever Super-Turing computer, based on Analog Recurrent Neural Networks. The device is expected to introduce a level of intelligence not seen before in artificial computation.
Source: Science Daily
ScienceDaily (Apr. 2, 2012) — Testosterone, the primary male sex hormone, appears to have antidepressant properties, but the exact mechanisms underlying its effects have remained unclear. Nicole Carrier and Mohamed Kabbaj, scientists at Florida State University, are actively working to elucidate these mechanisms.
They’ve discovered that a specific pathway in the hippocampus, a brain region involved in memory formation and regulation of stress responses, plays a major role in mediating testosterone’s effects, according to their new report in Biological Psychiatry.
Compared to men, women are twice as likely to suffer from an affective disorder like depression. Men with hypogonadism, a condition where the body produces no or low testosterone, also suffer increased levels of depression and anxiety. Testosterone replacement therapy has been shown to effectively improve mood.
Although it may seem that much is already known, it is of vital importance to fully characterize how and where these effects are occurring so that scientists can better target the development of future antidepressant therapies.
To advance this goal, the scientists performed multiple experiments in neutered adult male rats. The rats developed depressive-like behaviors that were reversed with testosterone replacement.
They also “identified a molecular pathway called MAPK/ERK2 (mitogen activated protein kinase/ extracellular regulated kinase 2) in the hippocampus that plays a major role in mediating the protective effects of testosterone,” said Kabbaj.
This suggests that the proper functioning of ERK2 is necessary before the antidepressant effects of testosterone can occur. It also suggests that this pathway may be a promising target for antidepressant therapies.
Kabbaj added, “Interestingly, the beneficial effects of testosterone were not associated with changes in neurogenesis (generation of new neurons) in the hippocampus as it is the case with other classical antidepressants like imipramine (Tofranil) and fluoxetine (Prozac).”
In results published elsewhere by the same group, testosterone has shown beneficial effects only in male rats, not in female rats.
Source: Science Daily
ScienceDaily (Apr. 2, 2012) — Why do some persons succumb to post-traumatic stress disorder (PTSD) while others who suffered the same ordeal do not? A new UCLA study sheds light on the answer.
UCLA scientists have linked two genes involved in serotonin production to a higher risk of developing PTSD. Published in the April 3 online edition of the Journal of Affective Disorders, the findings suggest that susceptibility to PTSD is inherited, pointing to new ways of screening for and treating the disorder.
"People can develop post-traumatic stress disorder after surviving a life-threatening ordeal like war, rape or a natural disaster," explained lead author Dr. Armen Goenjian, a research professor of psychiatry at the Semel Institute for Neuroscience and Human Behavior at UCLA. "If confirmed, our findings could eventually lead to new ways to screen people at risk for PTSD and target specific medicines for preventing and treating the disorder."
PTSD can arise following child abuse, terrorist attacks, sexual or physical assault, major accidents, natural disasters or exposure to war or combat. Symptoms include flashbacks, feeling emotionally numb or hyper-alert to danger, and avoiding situations that remind one of the original trauma.
Goenjian and his colleagues extracted the DNA of 200 adults from several generations of 12 extended families who suffered PTSD symptoms after surviving the devastating 1988 earthquake in Armenia.
In studying the families’ genes, the researchers found that persons who possessed specific variants of two genes were more likely to develop PTSD symptoms. Called TPH1 and TPH2, these genes control the production of serotonin, a brain chemical that regulates mood, sleep and alertness — all of which are disrupted in PTSD.
"We suspect that the gene variants produce less serotonin, predisposing these family members to PTSD after exposure to violence or disaster," said Goenjian. "Our next step will be to try and replicate the findings in a larger, more heterogeneous population."
Affecting about 7 percent of Americans, PTSD has become a pressing health issue for a large percentage of war veterans returning from Iraq and Afghanistan. The UCLA team’s discovery could be used to help screen persons who may be at risk for developing PTSD.
"A diagnostic tool based upon TPH1 and TPH2 could enable military leaders to identify soldiers who are at higher risk of developing PTSD, and reassign their combat duties accordingly," observed Goenjian. "Our findings may also help scientists uncover alternative treatments for the disorder, such as gene therapy or new drugs that regulate the chemicals responsible for PTSD symptoms."
According to Goenjian, pinpointing genes connected with PTSD symptoms will help neuroscientists classify the disorder based on brain biology instead of clinical observation. Psychiatrists currently rely on a trial and error approach to identify the best medication for controlling an individual patient’s symptoms.
Serotonin is the target of the popular antidepressants known as SSRIs, or selective serotonin re-uptake inhibitors, which prolong the effect of serotonin in the brain by slowing its absorption by brain cells. More physicians are prescribing SSRIs to treat psychiatric disease beyond depression, including PTSD and obsessive compulsive disorder.
Source: Science Daily
April 1, 2012
Rutgers scientists think they have found a way to prevent and possibly reverse the most debilitating symptoms of a rare, progressive childhood degenerative disease that leaves children with slurred speech, unable to walk, and in a wheelchair before they reach adolescence.
In today’s online edition of Nature Medicine, Karl Herrup, chair of the Department of Cell Biology and Neuroscience in the School of Arts and Sciences provides new information on why this genetic disease attacks the cerebellum – a part of the brain that controls movement coordination, equilibrium, and muscle tone – and other regions of the brain.
Using mouse and human brain tissue studies, Herrup and his colleagues at Rutgers found that in the brain tissue of young adults who died from ataxia-telangiectasia, or A-T disease, a protein known as HDAC4 was in the wrong place. HDAC4 is known to regulate bone and muscle development, but it is also found in the nerve cells of the brain. The protein that is defective in A-T, they discovered, plays a critical role in keeping HDAC4 from ending up in the nucleus of the nerve cell instead of in the cytoplasm where it belongs. In a properly working nerve cell, the HDAC4 in the cytoplasm helps to prevent nerve cell degeneration; however, in the brain tissue of young adults who had died from A-T disease, the protein was in the nucleus where it attacked the histones – the small proteins that coat and protect the DNA.
"What we have found is a double-edged sword," said Herrup. "While the HDAC4 protein protected a neuron’s function when it was in the cytoplasm, it was lethal in the nucleus."
To prove this point, Rutgers scientists analyzed mice, genetically engineered with the defective protein found in children with A-T, as well as wild mice. The animals were tested on a rotating rod to measure their motor coordination. While the normal mice were able to stay on the rod without any problems for five to six minutes, the mutant mice fell off within 15 to 20 seconds.
After being treated with trichostatin A (TSA), a chemical compound that inhibits the ability of HDAC4 to modify proteins, they found that the mutant mice were able to stay on the rotating rod without falling off – almost as long as the normal mice.
Although the behavioral symptoms and brain cell loss in the engineered mice are not as severe as in humans, all of the biochemical signs of cell stress were reversed and the motor skills improved dramatically in the mice treated with TSA. This outcome proves that brain cell function could be restored, Herrup said.
"The caveat here is that we have fixed a mouse brain with less devastation and fewer problems than seen in a child with A-T disease," said Herrup. "But what this mouse data says is that we can take existing cells that are on their way to death and restore their function."
Neurological degeneration is not the only life-threatening effect associated with this genetic disease. A-T disease – which occurs in an estimated 1 in 40,000 births – causes the immune system to break down and leaves children extremely susceptible to cancers such as leukemia or lymphoma. There is no known cure, and most patients die in their teens or early 20s. According to the A-T Children’s Project, many of those who die at a young age may never have been properly diagnosed, meaning the disease may in fact be more common than these estimates suggest.
Herrup says although this discovery does not address all of the related medical conditions associated with the disease, saving existing brain cells – even those that are close to death – and restoring life-altering neurological functions would make a tremendous improvement in the lives of these children.
"We can never replace cells that are lost," said Herrup. "But what these mouse studies indicate is that we can take the cells that remain in the brains of these children and make them work better. This could improve the quality of life for these kids by unimaginable amounts."
Additionally, Herrup says, the research might provide insight into other neurodegenerative diseases. “If this is found to be true, then the work we’ve done on this rare disease of childhood may have a much wider application in helping to treat other diseases of the nervous system, even those that affect the elderly, like Alzheimer’s,” he said.
Provided by Rutgers University
Source: medicalxpress.com
March 30th, 2012
A new animal model of nerve injury has brought to light a critical role of an enzyme called Nmnat in nerve fiber maintenance and neuroprotection. Understanding biological pathways involved in maintaining healthy nerves and clearing away damaged ones may offer scientists targets for drugs to mitigate neurodegenerative diseases such as Huntington’s and Parkinson’s, as well as aid in situations of acute nerve damage, such as spinal cord injury.
University of Pennsylvania biologists developed the model in the adult fruit fly, Drosophila melanogaster.
“We are using the basic power of the fly to learn about how neurons are damaged in acute injury situations,” said Nancy Bonini, senior author of the research and a professor in the Department of Biology at Penn. “Our work indicates that Nmnat may be key.”
The research was published in Current Biology. First author on the study is postdoctoral researcher Yanshan Fang, with additional contributions from postdoctoral researcher Lorena Soares and research technicians Xiuyin Teng and Melissa Geary, all of Penn’s Department of Biology.
When a nerve suffers an acute injury — as might be caused by a penetrating wound, for example, or a broken bone that damages nearby tissues — the long projection of the nerve cell, called the axon, can become injured and degenerate. The process by which it disintegrates is known as Wallerian or Wallerian-like degeneration and is an active, orderly process.
Though this function of eliminating damaged nerve cells is crucial, biologists do not have a clear understanding of all of the molecular signaling pathways that govern the process.
Bonini’s lab has previously focused on chronic neurodegenerative diseases but made this foray into acute nerve injury to determine if mechanistic overlaps exist between acute axon injury and chronic neurodegeneration. They first searched for an appropriate nerve tract to target and identified the wing of adult flies as a prime option.
The fly wing is not only translucent and a site of lengthy nerve fibers that can be easily observed, but it can also be cut to cause injury without killing the fly. That way, the researchers can follow the animal’s response to nerve injury for weeks.
Using various reagents to manipulate the fly’s genetic traits, the team confirmed that the cut wing nerve underwent Wallerian degeneration. They then tested versions of Nmnat and another protein called WldS, all of which had previously been shown to protect nerves from degeneration, to see if any of these might stop the process. All significantly delayed neurodegeneration. Even a form of Nmnat that hadn’t worked in other animal models suppressed degeneration, although to a lesser extent.
“That indicates that our assay is really sensitive,” Bonini said. “This sensitivity could help us identify genes that have moderate although important functionality at protecting against nerve degeneration.”
Their investigations into the wing nerve also showed that the degenerating axon “died back,” fragmenting first from the axon terminals, the side farthest from the nerve cell body—a pattern similar to what has been seen in other disorders.
Doing more genetic tinkering, the researchers showed that when the animal’s own Nmnat was depleted, the nerves fragmented in the same way as if the axon was physically cut. And when Nmnat and the other “rescue” proteins were added back to these genetically modified flies, they were able to block degeneration, highlighting that Nmnat is critical to maintaining healthy axons.
In a final set of experiments, the biologists sought to narrow where in the nerve cells Nmnat might be working. They focused on mitochondria, the powerhouses of cells. When they created a genetic line of flies that blocked mitochondria from entering the axon fibers, the nerve tract degenerated, again, in a dying-back fashion. Yet now WldS and Nmnat failed to prevent axon degeneration, suggesting that those proteins may act on and require the presence of axonal mitochondria to maintain healthy nerves in normal flies.
Flipping that scenario around, they looked at what happened to the mitochondria of flies upon nerve injury. When they cut the wing nerve axons, the mitochondria rapidly disappeared. Yet the mitochondria could be largely preserved by increasing expression of Nmnat.
Their results, taken together with the findings of other studies, suggest that Nmnat may stabilize mitochondria in some way in order to keep axons in a healthy state.
“We have some hope that these proteins or their activity may someday serve as drug targets or could provide the foundation for a therapeutic advance,” Bonini said. “But right now, my hope is that the power of the fly model will open up a lot of new directions of research and new pathways that could be targets for development in the future.”
Source: Neuroscience News
16:55 30 March 2012
Sumit Paul-Choudhury, editor

My Soul, 2005, Katharine Dowson (Image: Image courtesy of the artist and GV Art)
LOOKING at your own brain is a humbling and slightly unnerving experience. Mine, depicted in a freshly acquired MRI scan, is startlingly intricate, compact - and baffling. This is as much of a portrait of my own mind as I am ever likely to see. But to my ignorant eyes (which, by way of an eerie bonus, are now looking at their own cross-sections) it looks pretty much like any other brain.
Apparently a more expert eye wouldn’t help. “Whilst all my participants get very excited about seeing their brain for the first time after being scanned, and I frequently get asked ‘What can you tell me about my brain?’, the reality is that the brain will for a long time yet remain a mysterious mass,” says the neuroscientist who scanned my brain, for research purposes. “We must be content with knowing that the ‘I’ is constructed in its intricacies, but we cannot explain how.”
The hope of closing the gap between the physical and mental is presumably what gets neuroscientists up in the morning, but it’s frustrating for a layperson like me. Avowed materialist though I am, I nonetheless rebel against the knowledge that the impassive blob on screen is “me”.
This cognitive dissonance was what I took with me to the opening of Brains, a new show at London’s Wellcome Collection, whose subtitle, “The Mind as Matter”, suggests that its curators sympathise with my materialist perspective. “The neurosciences hold out the prospect of an objective account of consciousness - the soul or mind as nothing more than intricately connected flesh,” reads the introduction. But the bulk of the exhibition is dedicated to whole brains, brain collectors and anatomical paraphernalia, with little explicit reference to the brain’s fine structure, or how it might give rise to thought.
This remit is less restrictive than it might sound. Evolution has seen to it that the most vital of our bodily organs is well guarded against intrusion, and as a by-product, well hidden from inspection. Even today, only a small minority of the population have seen their own brains - and many (unlike me, thankfully) have done so only when they had reason to be fearful of what they found.
So the history of attempts to access, visualise and understand the brain is a rich one. But it’s also well worn, and some of the historical material - elaborate anatomical models, kooky phrenological busts and grim-looking surgical implements - is over-familiar. The scientific objects are more compelling, though they too tend toward the grotesque - from the arachnid contraptions used to measure skull size to the spools of finely sliced mouse brains on tape.
Much of the fascination lies not in the objects themselves, but in the human stories behind them, told in captions whose straight-facedness sometimes comes across as drollery or clinical detachment. Trepanning tools fashioned from flint and animal teeth “would have taken longer to cut through the skull than more modern instruments”, one informs us drily; the American Anthropometric Society was “basically a club to enable the leading men of US science to dissect each other,” says another.
Secluded in the skull, individual brains develop relatively few distinguishing features save those given them by trauma, disease or rare accidents of birth. So brains of note tend to be associated with tales of misfortune. That’s often the case with anatomical specimens, but what’s remarkable about the brains on display here is how much they overlap with criminality. Some of the most striking have been acquired from people whose wickedness in life was deemed sufficient reason to deny them dignity in death. In other cases, the moral equation is reversed, with collectors stepping outside the bounds of decency in their desire to possess the brain; the abhorrent nadir being reached with the probable murder of “feeble” children by Nazi doctors.

The collection’s examples of brains being voluntarily donated are equally remarkable. Persuading someone to donate this most personal of organs is a tough sell, and the historical portion of the exhibition makes much of how appeals to ego have persuaded the great and good to offer up their brains for post-mortem examination. More recently, medical study has provided motivation for would-be donors. Particularly poignant is the tale of Anita Newcomb McGee, a female US army surgeon who gave up her nine-month-old son’s brain, along with a photograph and a sketch of his head, with the words “I want him to benefit the world in some way if possible.”
But this altruism is tempered by the fact that brains, unlike many of our other “charismatic organs”, cannot conceivably be transplanted. The knowledge that someone else might gain life from my gifted heart is a powerful incentive to donate, but I have less incentive to be generous with my brain - though it would seem that distinction is not made by those who do donate. Perhaps reluctant donors might be won over by the moving photos compiled by artist Ania Dabrowska and social scientist Bronwyn Parry of cheerful would-be donors ranging from an ex-soldier to a headmistress.
And one of the exhibition’s stand-out exhibits might provide further reassurance: a documentary video of anatomists at London’s Hammersmith Hospital painstakingly and precisely slicing donated brains into half-centimetre wedges in near-silence. Or perhaps not. This is well sanctioned, respectful science being conducted on freely donated organs for the betterment of human health and knowledge; and yet it nonetheless provokes one of my fellow spectators into muttering, very much to herself: “Wrong, wrong, wrong”. For myself, I find the video one of the most compelling exhibits — perhaps the only one that prompts me to contemplate offering myself up for more than a non-invasive scan in the service of medical science.
Less clear-cut, for me, are the nearby sections of Einstein’s brain, preserved under deeply dubious circumstances after his death. Clearly, there’s a certain fascination associated with perhaps the most famous brain there ever was; but Einstein’s brain is no more legible than any other, and the slim prospect that scientific insights can be gleaned from its study seems poor recompense for the undignified proxying of a great mind by illicitly-obtained fragments of tissue.

The final insult: an accompanying 3D-printed replica of the entire organ, reconstructed for a TV documentary from archive photographs. This absurd resin relic epitomises, for me, the extent to which the objects on display at the Wellcome are imbued with significance by the knowledge that they were once the seats of consciousness. It’s the minds of the audience, rather than the brains on display, that are doing the work.
The Wellcome show confronts its visitors with the gulf between our hard-won knowledge about the form of the brain and our as-yet-meagre understanding of its function, much as my scan did to me. It didn’t narrow the gap between the grey image of my brain and my sense of self; if anything, it widened it. But it did make me appreciate what an amazing gap it is, and marvel anew at the work of those who are seeking to close it.
Brains: The mind as matter is showing at the Wellcome Collection in London until 17 June.
Source: New Scientist
March 30, 2012
A team of neuroscientists from the Institute of Psychiatry (IoP) at King’s College London have developed a digital atlas of the human brain for iPad. The ‘Brain’ App is the first of its kind, and is based on cutting edge neuro-imaging research from the NatBrainLab at the IoP.

Image taken from the ‘Brain’ Study Room
Dr. Marco Catani, Head of the NatBrainLab who led the development of the App with Dr. Flavio Dell’Acqua and Dr. Michel Thiebaut de Schotten, said: “For 10 years our lab has pioneered the use of highly advanced neuro-imaging techniques. This is the first time that imaging methods usually only applied to research have been used in an educational App. It’s very exciting to see our work transformed into such an accessible, fun and beautiful tool.”

Image taken from the ‘Brain’ Dissection Room
Two types of scans were used to develop the content of ‘Brain’ – results from an MRI scan reveal the structural properties of the brain, and images from a Diffusion Tractography scan allow the user to identify connections in the brain.
The App is split into two virtual rooms. The Dissection Room allows the user to play with a 3D human brain, select individual structures and ‘pull’ them apart to visualize their anatomical features. The Study Room then offers a more thorough explanation of functional aspects and their relationship to neurological and psychiatric disorders.
Dr. Catani adds: “The interactive nature of our App really allows you to explore the depths of the neural network and appreciate the complexity of the human brain. Because the content is based directly on research, the finished product is an accurate reflection of the real thing.”
Dr. Catani and his team are now working towards developing the next version of the App. By integrating scans from several different brains into the programme, they hope to offer the user the chance to see how the brain develops from childhood to old age, and the effects of different age-related disorders on the brain.
The App is currently being used by Dr. Catani and his colleagues to teach MSc students neuroscience.
Provided by King’s College London
Source: medicalxpress.com
March 30, 2012
(HealthDay) — Electrocorticography (ECoG) signals from patients with chronic motor dysfunction represent motor information that may be useful for controlling prosthetic arms, according to a study published in the March issue of the Annals of Neurology.

To investigate whether ECoG signals can be recorded from chronically paralyzed patients and whether those signals can be applied to control a prosthetic, Takufumi Yanagisawa, M.D., Ph.D., of the Osaka University Medical School in Japan, and colleagues recorded ECoG signals from the sensorimotor cortices of 12 patients while they attempted to carry out three to five simple hand and elbow movements. Sensorimotor function was normal in five patients, moderately impaired due to central nervous system lesions sparing the cortex in four, and severely impaired due to peripheral nervous system lesion or amputation in three.
The researchers found that the high gamma power (80 to 150 Hz) of the ECoG signals during movements was responsive to different types of movement and provided the best information for movement classification. In all patients, the classification performance was significantly better than chance, although for patients with severely impaired motor function the differences between ECoG power modulations during different types of movement were significantly smaller. Cortical representations tended to overlap each other in impaired patients. One moderately impaired patient and three non-paralyzed patients successfully controlled a prosthetic arm using the classification method in real time.
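As a rough illustration of the kind of pipeline described here, the sketch below extracts 80–150 Hz band power from synthetic single-channel signals and classifies two "movement types" by nearest centroid. Every parameter (sampling rate, burst frequency, the classifier itself) is an illustrative assumption, not the study's actual method.

```python
import numpy as np
from scipy.signal import welch

FS = 1000  # sampling rate in Hz (assumed; the paper's exact rate is not given here)

def high_gamma_power(ecog, fs=FS, band=(80, 150)):
    """Mean spectral power in the 80-150 Hz high-gamma band."""
    freqs, psd = welch(ecog, fs=fs, nperseg=256)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Toy data: two "movement types" differing in high-gamma burst amplitude.
rng = np.random.default_rng(0)
def make_trial(gamma_gain):
    t = np.arange(FS) / FS
    burst = gamma_gain * np.sin(2 * np.pi * 110 * t)  # 110 Hz high-gamma burst
    return burst + rng.normal(0, 1, FS)               # plus broadband noise

# Fit a nearest-centroid classifier on band power.
centroid_a = np.mean([high_gamma_power(make_trial(3.0)) for _ in range(20)])
centroid_b = np.mean([high_gamma_power(make_trial(0.5)) for _ in range(20)])

def classify(trial):
    p = high_gamma_power(trial)
    return "a" if abs(p - centroid_a) < abs(p - centroid_b) else "b"

correct = sum(classify(make_trial(3.0)) == "a" for _ in range(20)) \
        + sum(classify(make_trial(0.5)) == "b" for _ in range(20))
print(correct / 40)  # well above the 0.5 chance level on this toy data
```

The real study decoded three to five movement classes from multi-channel cortical grids; this single-channel, two-class version only shows why band power is a usable feature.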
"ECoG signals appear useful for prosthetic arm control and may provide clinically feasible motor restoration for patients with paralysis but no injury of the sensorimotor cortex," the authors write.
Source: medicalxpress.com
March 30, 2012
An estimated 2.2 million people in the United States live with epilepsy, a complex brain disorder characterized by sudden and often unpredictable seizures. The highest rate of onset occurs in children and older adults, and it affects people of all ethnicities and socio-economic backgrounds, yet this common disorder is widely misunderstood. Epilepsy refers to a spectrum of disorders with seizures that vary in type, cause, severity, and frequency. Many people do not know the causes of epilepsy or what measures to take if they witness a seizure. A new report from the Institute of Medicine highlights numerous gaps in the knowledge and management of epilepsy and recommends actions for improving the lives of those with epilepsy and their families and promoting better understanding of the disorder.
Effective treatments for epilepsy are available but access to treatment and timely referrals to specialized care are often lacking, the report’s expert committee found. Reaching rural and underserved populations, as well as providing state-of-the art care for people with persistent seizures, is particularly crucial. The report’s recommendations for expanding access to patient-centered health care include early identification and treatment of epilepsy and associated health conditions, implementing measures that assess quality of care, and establishing accreditation criteria and processes for specialized epilepsy centers. In addition, the wide variety of health professionals who care for those with epilepsy need improved knowledge and skills to provide the highest quality health care.
Some causes of epilepsy, such as traumatic brain injury, infection, and stroke, are preventable. Prevention efforts should continue for these established risk factors, as well as for recurring seizures in people with epilepsy and depression, and for epilepsy-related causes of death, the report says.
People with epilepsy need additional education and skills to optimally manage their disorder. Consistent delivery of accurate, clearly communicated health information from sources that include health care professionals and epilepsy organizations can better prepare those with epilepsy and their families to cope with the disorder and its consequences, the report says. Accurate, current data on the extent and consequences of epilepsy and its associated health conditions are especially needed to inform policymakers and identify opportunities for reducing the burden of epilepsy.
Living with epilepsy can affect employment, driving ability, and many other aspects of quality of life. The report stresses the importance of improved access to a range of community services, including vocational, educational, transportation, transitional care, and independent living assistance as well as support groups. The committee urged collaboration among federal agencies, state health departments, and relevant epilepsy organizations to improve and integrate these services and programs, particularly at state and local levels.
Misperceptions about epilepsy persist and a focus on raising public awareness and knowledge is needed, the report adds. Educating community members such as teachers, employers, and others on how to manage seizures could help improve public understanding of epilepsy. The report suggests several strategies for stakeholders to improve public knowledge of the disorder, including forming partnerships with the media, establishing advisory councils, and engaging people with epilepsy and their families to serve as advocates and educators within their communities.
Provided by National Academy of Sciences
Source: medicalxpress.com
March 30, 2012
Researchers want to help the Army better camouflage its soldiers and break the enemy’s efforts to hide.

Researchers Jay Hegde and Xing Chen are using functional MRI to look at the brains of study participants learning how to break camouflage in order to help identify soldiers who will be good at it and identify better ways to teach it. Credit: Phil Jones, GHSU Photographer
"We want to make our camouflage unbreakable and we want to break the camouflage of the enemy," said Dr. Jay Hegde, neuroscientist in the Medical College of Georgia at Georgia Health Sciences University.
Hegde and GHSU Postdoctoral Fellow Dr. Xing Chen are using a relatively simple technique they developed to teach civilian volunteers to break camouflage. They flash a series of camouflage pictures on a computer screen, providing about a half second after each to spot, for instance, a face in a sea of mushrooms. A green light signals a correct answer and a red light signals an incorrect answer. The computer-generated images include distractions to make the difficult task even more challenging.
They are finding that an hour of daily training for as little as two weeks results in proficiency for 60 percent of the mostly college and graduate school students who have signed up for their training. The Army’s current approach is taking soldiers into battlefield situations to hone these skills.
As part of a three-year grant from the Office of Army Research, the researchers want to determine which parts of the brain light up when trained snipers break camouflage.
"We need to figure out how the expert camouflage-breakers do it," Hegde said. "We want to figure out what parts of the brain are most responsive when people break camouflage and, in a related experiment, what part of the brain changes its response when people learn to break camouflage." Their techniques include functional magnetic resonance imaging to measure blood flow activity as an indicator of brain cell activity.
Figuring out which parts of the brain are involved could give the Army and others a better way to identify future first-rate snipers and objectively assess instructional efforts.
"If you are the Dean of the Army Sniper Corps and want to develop top-notch snipers, you don’t want to spend a year training them before giving up on half of them," Hegde said. A brain scan could help signal who has well-developed relevant areas and, consequently, natural skill.
Early evidence points toward two regions of the temporal lobe, found on either side of the brain and known to have a role in speech and vision. A region called the fusiform gyrus – which plays a role in facial recognition and lights up when people become experts at recognizing various objects, such as a particular bird species – may be important in breaking camouflage as well.
Hegde suspects that expertise at breaking camouflage stems from the fusiform gyrus in combination with some other area(s) of the brain. And, because good recognition skills don’t typically translate from one area to another, he also suspects that the parts of the brain involved vary with the object of their attention. For example, the ability to easily recognize the make and model of a car doesn’t guarantee skill at breaking camouflage and Hegde notes that some military snipers aren’t good at game-hunting.
Vision happens when light enters the retina where photoreceptor cells turn it into signals that are interpreted by the brain. “If there is a whole lot of light falling on them, they send a lot of signals, beep, beep, beep,” he said in rapid succession. “If there is a little bit of light they fire slowly.” The brain connects the dots to form a familiar face or landscape. Camouflage complicates the task – the difference between recognizing a mountain goat against a clear blue sky and finding a moth among a pile of fall leaves.
"Here is the beautiful thing that we are finding out: if you know what you are looking for, the next time you can break the camouflage of the moth. Without knowing what you are looking for, the picture also is ambiguous," Hegde said.
Provided by Georgia Health Sciences University
Source: medicalxpress.com
ScienceDaily (Mar. 30, 2012) — Human attention to a particular portion of an image alters the way the brain processes visual cortex responses to that image.

A schematic diagram of the contrast discrimination task, showing the focal cue trial (top row) and the distributed cue trial (bottom row). The contrast within the top right circle increases from the first interval (second column) to the second interval (fourth column). The third column is the interstimulus interval. (Credit: Copyright 2011 Elsevier Inc.)
Our ability to ignore some, but not other, stimuli allows us to focus our attention and improve our performance on a specific task. The ability to respond to visual stimuli during a visual task hinges on altered processing of responses within the visual cortex at the back of the brain, where visual information is first received from the eyes. How this occurs was recently demonstrated by an international team of researchers led by Justin Gardner at the RIKEN Brain Science Institute in Wako.
In a contrast discrimination task, the researchers showed three observers a stimulus of a group of four circles, each containing grey and white bars that created stripes of different contrasts. After a short pause, the researchers showed the circles again, but the contrast within one of the circles was different. The observers were instructed to choose which group of circles contained the higher contrast.
In ‘focal cue’ trials, an arrow directed the observers’ attention to a particular circle. In ‘distributed cue’ trials, four arrows directed their attention diffusely, across all four circles. Gardner and colleagues found that the observers’ performance was better in the focal cue trials.
Using a magnetic resonance imaging (MRI) scanner, the research team was able to map the precise location within the visual cortex that was activated by the visual information within each of the four circles. During the contrast discrimination task, Gardner and colleagues could therefore measure the observers’ visual cortex activity elicited by the stimuli. In this way, they could correlate brain activity in the visual cortex with the observers’ attention and their choice of contrasting circles.
Visual cortex responses tended to be largest when the observers were paying attention to a particular target circle, and smallest when they were ignoring a circle. The researchers determined that the largest visual cortex responses to the stimuli guided the eventual choice of each observer, leading to enhanced performance on the visual task.
"We used computational modeling to test various hypotheses about how attention affects brain processing of visual information to improve behavioral performance," explains Gardner. "We concluded that the observers’ attention causes their brains to select the largest cortical response to guide contrast choice, since we found that an ‘efficient selection’ model best explained the behavioral and fMRI data," he says.
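One way to picture the ‘efficient selection’ rule is as max-pooling over cortical responses: the interval containing the single largest response wins. The toy simulation below shows why a focal attentional gain boost improves accuracy under such a rule; all gains, noise levels and the attention model are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_trial(target_idx, delta=0.4, attn_gain=1.5, focal=True, n_loc=4):
    """Noisy cortical responses to four circles in two intervals;
    the target circle's contrast is higher in interval 2."""
    gain = np.ones(n_loc)
    if focal:
        gain[target_idx] = attn_gain  # attention boosts the cued location
    r1 = gain + rng.normal(0, 0.3, n_loc)
    r2 = gain + rng.normal(0, 0.3, n_loc)
    r2[target_idx] += gain[target_idx] * delta  # contrast increment, scaled by gain
    return r1, r2

def choose_max(r1, r2):
    """'Efficient selection': the interval with the single largest response wins."""
    return 2 if r2.max() > r1.max() else 1

def accuracy(rule, focal, n=2000):
    hits = 0
    for _ in range(n):
        target = rng.integers(4)               # target circle varies trial to trial
        hits += rule(*simulate_trial(target, focal=focal)) == 2
    return hits / n

print(accuracy(choose_max, focal=True))   # focal cue: higher accuracy
print(accuracy(choose_max, focal=False))  # distributed cue: lower accuracy
```

Under the max rule, a focal gain boost helps twice over: it amplifies the target's contrast increment and keeps distractor locations from dominating the pooled response, mirroring the focal-versus-distributed performance difference the study reports.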
If the findings extend to other senses, such as hearing, researchers may begin to understand how humans make sense of a perceptually cluttered world.
Source: Science Daily
ScienceDaily (Mar. 29, 2012) — A type of cell plentiful in the brain, long considered mainly the stuff that holds the brain together and oft-overlooked by scientists more interested in flashier cells known as neurons, wields more power in the brain than has been realized, according to new research published March 29 in Science Signaling.

Human astrocytes. (Credit: Image courtesy of University of Rochester Medical Center)
Neuroscientists at the University of Rochester Medical Center report that astrocytes are crucial for creating the proper environment for our brains to work. The team found that the cells play a key role in reducing or stopping the electrical signals that are considered brain activity, playing an active role in determining when cells called neurons fire and when they don’t.
That is a big step forward from what scientists have long considered the role of astrocytes — to nurture neurons and keep them healthy.
"Astrocytes have long been called housekeeping cells — tending to neurons, nurturing them, and cleaning up after them," said Maiken Nedergaard, M.D., D.M.Sc., professor of Neurosurgery and leader of the study. "It turns out that they can influence the actions of neurons in ways that have not been realized."
Proper brain function relies on billions of electrical signals — tiny molecular explosions, really — happening remarkably in sync. Recalling the face of a loved one, swinging a baseball bat, walking down the street — all those actions rely on electrical signals passed instantly along our nerves like a molecular hot potato from one brain cell to another.
For that to happen, the molecular brew of chemicals like sodium, calcium and potassium that brain cells reside in must be just right — and astrocytes help to maintain that balanced environment. For instance, when a neuron sends an impulse, or fires, levels of potassium surrounding the cell jump dramatically, and those levels must come down immediately for the brain to work properly. Scientists have long known that that’s a job for astrocytes — sopping up excess potassium, ending the nerve pulse, and restoring the cells so they can fire again immediately.
In the paper in Science Signaling, Nedergaard’s team discovered an expanded role for astrocytes. The team learned that in addition to simply absorbing excess potassium, astrocytes themselves can cause potassium levels around the neuron to drop, putting neuronal signaling to a stop.
"Far from only playing a passive role, astrocytes can initiate the uptake of potassium in a way that affects neuronal activity," said Nedergaard. "It’s a simple, yet powerful mechanism for astrocytes to rapidly modulate neuronal activity."
Nedergaard has investigated the secret lives of astrocytes for more than two decades. She has shown how the cells communicate using calcium to signal. Nearly 20 years ago in a paper in Science, she pioneered the idea that glial cells like astrocytes communicate with neurons and affect them. Since then, there has been a lot of speculation by other scientists that chemicals called gliotransmitters, such as glutamate and ATP, are key to this process.
In contrast, in the latest research Nedergaard’s team found that another signaling system involving potassium is at work. By sucking up potassium, astrocytes quell the firing of neurons, increasing what scientists call “synaptic fidelity.” Important brain signals are crisper and clearer because there is less unwanted activity or “chatter” among neurons that should not be firing. Such errant neuronal activity is linked to a plethora of disorders, including epilepsy, schizophrenia, and attention-deficit disorder.
"This gives us a new target for a disease like epilepsy, where signaling among brain cells is not as controlled as it should be," said Nedergaard, whose team is based in the Division of Glia Disease and Therapeutics of the Center for Translational Neuromedicine in the Department of Neurosurgery.
The first authors of the paper are Fushun Wang, Ph.D., research assistant professor of Neurosurgery; and graduate student Nathan Anthony Smith. They did much of the work by using a sophisticated laser-based system to monitor the activity of astrocytes in the living brain of rats and mice. The work by Smith, a graduate student in the University’s neuroscience program, was supported by a Kirschstein National Research Service Award from the National Institute of Neurological Disorders and Stroke (NINDS).
Other authors from Rochester include Takumi Fujita, Ph.D., post-doctoral associate; Takahiro Takano, Ph.D., assistant professor; Qiwu Xu, technical associate; and Lane Bekar, Ph.D., formerly research assistant professor, now at the University of Saskatchewan. Also contributing were Akemichi Baba of Hyogo University of Health Sciences in Japan, and Toshio Matsuda of Osaka University in Japan.
Nedergaard notes that the complexity and size of our astrocytes are among the few characteristics that differentiate our brains from those of rodents. Our astrocytes are bigger, faster, and much more complex in both structure and function. She has found that astrocytes contribute to conditions like stroke, Alzheimer’s, epilepsy, and spinal cord injury.
"Astrocytes are integral to the most sophisticated brain processes," she added.
Source: Science Daily
ScienceDaily (Mar. 29, 2012) — Certain genes and proteins that promote growth and development of embryos also play a surprising role in sending chemical signals that help adults learn, remember, forget and perhaps become addicted, University of Utah biologists have discovered.

This is a microscope image of the roundworm or nematode C. elegans with its nervous system glowing green due to labeling with a green jellyfish protein. (Credit: Penelope Brockie, University of Utah.)
"We found that these molecules and signaling pathways [named Wnt] do not retire after development of the organism, but have a new and surprising role in the adult. They are called back to action to change the properties of the nervous system in response to experience," says biology Professor Andres Villu Maricq, senior author of the new study in the March 30 issue of the journal Cell.
The study was performed in C. elegans — the millimeter-long roundworm or nematode — which has a nervous system that serves as a model for those of vertebrate animals, including humans.
Because other Wnt pathways in worms are known to work in humans too, the researchers believe that Wnt genes, the Wnt proteins they produce and so-called “Wnt signaling” also are involved in human learning, memory and forgetting.
"Almost certainly what we have discovered is going on in our brain as well," Maricq says. And because a worm nerve-signal "receptor" in the study is analogous to a human nicotine receptor involved in addiction, schizophrenia and some other mental disorders, some of the genes identified in the worm study "represent possible new targets for treatment of schizophrenia and perhaps addiction," he adds.
Wnt genes and their proteins already were known to “pattern the development and distribution of organs in the body” during embryo development, and to be responsible for various cancers and developmental defects when mutated, he says.
Maricq conducted the study with these Utah biologists: doctoral students Michael Jensen and Dane Maxfield; postdoctoral researchers Michael M. Francis, Frederic Hoerndli and Rui Wang; undergraduate Erica Johnson; Penelope Brockie, a research associate professor; and David M. Madsen, a senior research specialist.
March 29, 2012
(Medical Xpress) — Brain stimulation can markedly improve people’s ability to solve highly complex problems, a recent University of Sydney study suggests.

(L-R) Professor Allan Snyder and Richard Chi found brain stimulation helped people solve a puzzle.
The findings by Professor Allan Snyder and Richard Chi, from the University of Sydney, are published in Neuroscience Letters.
"The results suggest non-invasive brain stimulation could assist people in solving tasks that appear straightforward but are inherently difficult," said Professor Snyder.
Our minds have evolved to solve certain problems effortlessly, yet we struggle to solve others that appear simple but require us to apply an unfamiliar paradigm, to ‘think outside the box’.

The famous ‘nine dots puzzle’. Can you join them using only four straight lines without taking your pen off the page?
"As an example we have taken the famous nine dots problem, where you are asked to join all the dots with four straight lines without taking the pen off the page," Professor Snyder said.
"Surprisingly, investigations over the last century show that almost no one can do this."
Now the researchers have shown that more than 40 percent of the people they tested were able to solve the nine dots problem after receiving 10 minutes of safe, non-invasive brain stimulation.
Specifically, the left anterior temporal lobe of the brain is inhibited while the right anterior temporal lobe is simultaneously excited, using a technique known as transcranial direct current stimulation.
Using the same procedure the researchers have previously reported success in amplifying insight and memory.
Chi and Snyder suggest that their unique brain stimulation protocol could ultimately enable people to “escape the tricks our minds impose on us,” as Professor Snyder describes it, and solve tasks that appear deceptively simple.
Provided by University of Sydney
Source: medicalxpress.com
March 29, 2012
The first atlas of the surface of the human brain based upon genetic information has been produced by a national team of scientists, led by researchers at the University of California, San Diego School of Medicine and the VA San Diego Healthcare System. The work is published in the March 30 issue of the journal Science.

This is a genetic clustering map of the brain, left lateral view. Credit: UC San Diego School of Medicine
The atlas reveals that the cerebral cortex – the sheet of neural tissue enveloping the brain – is roughly divided into genetic divisions that differ from other brain maps based on physiology or function. The genetic atlas provides scientists with a new tool for studying and explaining how the brain works, particularly the involvement of genes.
"Genetics are important to understanding all kinds of biological phenomena," said William S. Kremen, PhD, professor of psychiatry at the UC San Diego School of Medicine and co-senior author with Anders M. Dale, PhD, professor of radiology, neurosciences, and psychiatry, also at the UC San Diego School of Medicine.
According to Chi-Hua Chen, PhD, first author and a postdoctoral fellow in the UC San Diego Department of Psychiatry, “If we can understand the genetic underpinnings of the brain, we can get a better idea of how it develops and works, information we can then use to ultimately improve treatments for diseases and disorders.”
The human cerebral cortex, characterized by distinctive twisting folds and fissures called sulci, is just 0.08 to 0.16 inches thick, but contains multiple layers of interconnected neurons with key roles in memory, attention, language, cognition and consciousness.
Other atlases have mapped the brain by cytoarchitecture – differences in tissues or function. The new map is based entirely upon genetic information derived from magnetic resonance imaging (MRI) of 406 adult twins participating in the Vietnam Era Twin Registry (VETSA), an ongoing longitudinal study of cognitive aging supported in part by grants from the National Institutes of Health (NIH). It follows a related study published last year by Kremen, Dale and colleagues that affirmed that human cortical regionalization is similar to and consistent with patterns found in other mammals, evidence of a mechanism conserved across evolution.
"We are excited by the development of this new atlas, which we hope will help us understand aging-related changes in brain structure and cognitive function now occurring in the VETSA participants," said Jonathan W. King, PhD, of the National Institute on Aging, part of the NIH.
The atlas plots genetic correlations between different points on the cortical surface of the twins’ brains. The correlations represent shared genetic influences and reveal that genetic brain divisions do not map one-to-one with traditional brain divisions that are based on structure and function. “Yet, the pattern of this genetic map still suggests that it is neuroanatomically meaningful,” said Kremen.
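A minimal sketch of the clustering idea behind such an atlas: correlate a measurement across subjects between pairs of cortical surface points, then group points that share variance. In this synthetic example, plain phenotypic correlations stand in for the twin-based genetic correlations the study actually estimated.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)

# Toy surface: 20 cortical points measured in 100 subjects. The first 10
# points are driven by one latent "genetic" factor, the last 10 by another.
n_subj = 100
factor_a = rng.normal(size=n_subj)
factor_b = rng.normal(size=n_subj)
thickness = np.empty((20, n_subj))
thickness[:10] = factor_a + 0.5 * rng.normal(size=(10, n_subj))
thickness[10:] = factor_b + 0.5 * rng.normal(size=(10, n_subj))

corr = np.corrcoef(thickness)   # point-by-point correlation matrix
dist = 1 - corr                 # turn similarity into a distance

# Hierarchical clustering on the condensed (upper-triangle) distances.
iu = np.triu_indices(20, k=1)
labels = fcluster(linkage(dist[iu], method="average"), t=2, criterion="maxclust")
print(labels)  # the two genetically coherent patches fall into separate clusters
```

The real atlas works on tens of thousands of surface vertices and uses twin-pair covariances to isolate the genetic component, but the output has the same shape: a partition of the cortical surface into patches with shared underlying influences.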
Kremen said the genetic brain atlas may be especially useful for scientists who employ genome-wide association studies, a relatively new tool that looks for common genetic variants in people that may be associated with a particular trait, condition or disease.
Provided by University of California - San Diego
Source: medicalxpress.com
March 29, 2012
The brain appears to be wired more like the checkerboard streets of New York City than the curvy lanes of Columbia, Md., suggests a new brain imaging study. The most detailed images, to date, reveal a pervasive 3D grid structure with no diagonals, say scientists funded by the National Institutes of Health.

Curvature in this DSI image of a whole human brain turns out to be folding of 2-D sheets of parallel neuronal fibers that cross paths at right angles. This picture came from the new Connectom scanner. Credit: Van Wedeen, M.D., Martinos Center and Dept. of Radiology, Massachusetts General Hospital and Harvard University Medical School
"Far from being just a tangle of wires, the brain’s connections turn out to be more like ribbon cables — folding 2D sheets of parallel neuronal fibers that cross paths at right angles, like the warp and weft of a fabric," explained Van Wedeen, M.D., of Massachusetts General Hospital (MGH), A.A. Martinos Center for Biomedical Imaging and the Harvard Medical School. "This grid structure is continuous and consistent at all scales and across humans and other primate species."
Wedeen and colleagues report new evidence of the brain’s elegant simplicity March 30, 2012 in the journal Science. The study was funded, in part, by the NIH’s National Institute of Mental Health (NIMH), the Human Connectome Project of the NIH Blueprint for Neuroscience Research, and other NIH components.
"Getting a high resolution wiring diagram of our brains is a landmark in human neuroanatomy," said NIMH Director Thomas R. Insel, M.D. "This new technology may reveal individual differences in brain connections that could aid diagnosis and treatment of brain disorders."
March 28, 2012 By Kimm Fesenmaier
(Medical Xpress) — When jurors sentencing convicted criminals are instructed to weigh not only facts but also tricky emotional factors, they rely on parts of the brain associated with sympathy and making moral judgments, according to a new paper by a team of neuroscientists. Using brain-imaging techniques, the researchers, including Caltech’s Colin Camerer, found that the most lenient jurors show heightened levels of activity in the insula, a brain region associated with discomfort and pain and with imagining the pain that others feel.

The findings provide insight into the role that emotion plays in jurors’ decision-making processes, indicating a close relationship between sympathy and mitigation.
In the study, the researchers, led by Makiko Yamada of National Institute of Radiological Sciences in Japan, considered cases where juries were given the option to lessen the sentences for convicted murderers. In such cases with “mitigating circumstances,” jurors are instructed to consider factors, sometimes including emotional elements, that might cause them to have sympathy for the criminal and, therefore, shorten the sentence. An example would be a case in which a man killed his wife to spare her from a more painful death, say, from a terminal illness.
"Finding out if jurors are weighing sympathy reasonably is difficult to do, objectively," says Colin Camerer, the Robert Kirby Professor of Behavioral Finance and Economics at Caltech. "Instead of asking the jurors, we asked their brains."
The researchers scanned the brains of citizens (potential jurors) while the participants read scenarios adapted from actual murder cases with mitigating circumstances. In some cases, the circumstances were sympathy-inducing; in others, where, for example, a man became enraged when an ex-girlfriend refused him, they were not. The scientists used functional magnetic resonance imaging (fMRI), a type of brain scanning that tracks increases in oxygenated blood flow, indicating heightened brain activity. The participants also had their brains scanned when they determined whether to lessen the sentences, and by how much.
The team found that sympathy activated the dorsomedial prefrontal cortex, precuneus, and temporo-parietal junction—brain regions associated with moral conflict and thinking about the feelings of others. Similarly, the jurors had increased activity in these regions during sentencing when the mitigating circumstances earned their sympathy. In those cases, they also delivered shorter hypothetical sentences.
In addition to Camerer and Yamada, coauthors on the new paper, “Neural circuits in the brain that are activated when mitigating criminal sentences,” are Saori Fujie, Harumasa Takano, Hiroshi Ito, Tetsuya Suhara, and Hidehiko Takahashi of the National Institute of Radiological Sciences; Motoichiro Kato of the Keio University of Medicine; and Tetsuya Matsuda of Tamagawa University Brain Science Institute. Yamada is also affiliated with Tamagawa University Brain Science Institute and Kyoto University School of Medicine; she and Takahashi are additionally affiliated with the Japan Science and Technology Agency.
More information: Neural circuits in the brain that are activated when mitigating criminal sentences
Provided by California Institute of Technology
Source: medicalxpress.com
March 28, 2012 by Stuart Mason Dambrot
(Medical Xpress) — College and cramming – where there’s one, the other is not far behind. That said, it has been recognized since the late 1800s that repeated, periodic exposure to the same material leads to better retention than a single en masse session. Nevertheless, the phenomenon’s neurobiological processes have remained poorly understood, although activity-dependent synaptic plasticity – notably long-term potentiation (LTP) of glutamatergic transmission – is believed to enable rapid storage of new information. Recently, researchers at the University of California, Irvine and the Scripps Research Institute in Jupiter, Florida determined that hippocampal activity can enhance LTP through theta burst stimulation (TBS) – but only when the affected synapses receive a second TBS after a long delay. The researchers describe mechanisms that maximize the synaptic changes encoding new memory by requiring long delays between episodes of learning-related TBS activity.

A second theta burst train expands the pool of F-actin-enriched spines. (A) Fluorescent phalloidin labeling in CA1 stratum radiatum. (Scale bar = 10 μm). (B) Counts of densely phalloidin-positive spines in slices collected 15 or 75 min after TBS1 (gray bars) or 15 min after TBS2 delayed by 60 min (black bar). (C) Traces show responses to two successive bursts separated by 200 ms (red for second response). (D) Counts of TBS1-induced phalloidin labeling for vehicle (gray) and CX614-treated (blue) slices. (E) Pretreatment with CX614 (blue line) caused a 70% increase in the magnitude of LTP induced by TBS1; this was accompanied by a loss of TBS2-induced potentiation. Image Copyright © 2012 PNAS, doi: 10.1073/pnas.1120700109
Gavin Rumbaugh (Scripps Research Institute) discussed the challenges he, Gary Lynch (University of California, Irvine) and their team encountered in the study. “The field is trying to understand the neurobiology of new learning, and in particular, how learning induces an even more complex biology to keep new information in our neural circuits,” Rumbaugh tells Medical Xpress. “Over the recent decade, it has become clear that plasticity at individual synapses is a way that neural circuits store information. However, it remains unclear how properties of synapses influence key aspects of learning and memory.”
March 28, 2012
Recent clinical studies have shown that general anesthesia can be harmful to infants, presenting a dilemma for both doctors and parents. But new research at Wake Forest Baptist Medical Center may point the way to treatment options that protect very young children against the adverse effects of anesthesia.
As detailed in a study published in the March 23 online edition of the journal Neuroscience, Wake Forest Baptist scientists explored a number of strategies designed to prevent anesthesia-induced damage to the brain in infants.
Using an animal model, the researchers tested the effectiveness of a fragment of a neuroprotective protein called ADNP, as well as vitamin D3, a low-level dose of anesthetic and aspirin. They found that three of the four strategies tested protected the brain from injury induced by 20 mg ketamine, a commonly used general anesthetic.
"What didn’t work was aspirin, which was a surprise because aspirin is known to protect the brain from injury," said Christopher P. Turner, Ph.D., assistant professor of neurobiology and anatomy at Wake Forest Baptist and lead author of the study. "In fact, in our study aspirin actually increased the severity of injury from the anesthesia, possibly because it prevents the generation of substances that may be neuroprotective."
Turner and his team studied rats at ages equivalent to children from birth to age 4.
In separate tests, the rodents were injected with: NAP, a peptide fragment of activity-dependent neuroprotective protein (ADNP), 15 minutes before ketamine was administered; two 20-mg doses of vitamin D3, at 24 hours and at 15 minutes before 20 mg ketamine; a non-toxic (5 mg) dose of ketamine 24 hours before a toxic dose of 20 mg ketamine was administered; and a 30-mg dose of aspirin 15 minutes before exposure to ketamine.
The Turner lab found that NAP, vitamin D3 and prior exposure to low (non-toxic) ketamine could all prevent injury from exposure to a toxic (20 mg) level of ketamine. However, aspirin appeared to enhance ketamine-induced injury.
"We designed our studies to give doctors several possible treatment options because not all of these strategies may work in clinical applications," Turner said. "However, because vitamin D3 is already in clinical use, our findings show that it is quite promising with few risks. Further, NAP is currently in clinical trials to diminish the severity of other types of brain injury, so we feel this discovery represents a breakthrough for anesthesia-induced neurotoxicity. However, there may be a critical window of efficacy for NAP, which we need to explore further.
"Of all the approaches that our team studied, using a low dose of ketamine may be both the simplest and most cost-effective, as it suggests children can be pre-treated with the same anesthesia that will be used when they undergo general surgery," Turner added. "In essence, a low-level dose of ketamine primes the child’s brain so that the second, higher dose is not as lethal, much like an inoculation."
Provided by Wake Forest Baptist Medical Center
Source: medicalxpress.com
March 28, 2012
(Medical Xpress) — While stimulants may improve unengaged workers’ performance, a new University of British Columbia study suggests that for others, caffeine and amphetamines can have the opposite effect, causing workers with higher motivation levels to slack off.

The study – published online today in the journal Neuropsychopharmacology – explored the impacts of stimulants on “slacker” rats and “worker” rats, and sheds important light on why stimulants might affect people differently, a question that has long puzzled researchers. It also suggests that patients being treated with stimulants for a range of illnesses may benefit from more personalized treatment programs.
“Every day, millions of people use stimulants to wake up, stay alert and increase their productivity – from truckers driving all night to students cramming for exams,” says Jay Hosking, a PhD candidate in UBC’s Dept. of Psychology, who led the study. “These findings suggest that some stimulants may actually have an opposite effect for people who naturally favour the difficult tasks of life that come with greater rewards.”
Hosking says some individuals are more willing to concentrate and exert effort to achieve their goals than others. However, little is known about the brain mechanisms determining how much cognitive effort one will expend in decision-making for accomplishing tasks.
Hosking and study co-author Catharine Winstanley, a professor in UBC’s Dept. of Psychology, found that rats – like humans – show varying levels of willingness to expend high or low degrees of mental effort to obtain food rewards. When presented with stimulants, the “slacker” rats that typically avoided challenges worked significantly harder when given amphetamines, while “worker” rats that typically embraced challenges were less motivated by caffeine or amphetamine.
While more research is needed to understand the brain mechanisms at work, the study suggests that the amount of mental attention people devote to achieving their goals may play a role in determining how stimulant drugs affect them, Hosking says.
Winstanley, a Michael Smith Foundation for Health Research scholar, says people with psychiatric illnesses, brain injuries and Attention Deficit Hyperactivity Disorder (ADHD) may benefit from treatment programs with greater personalization, noting that patients often use stimulants to counter drowsiness and fatigue from their conditions and treatments, with mixed results.
Provided by University of British Columbia
Source: medicalxpress.com
ScienceDaily (Mar. 27, 2012) — Just as the familiar sugar in food can be bad for the teeth and waistline, another sugar has been implicated as a health menace, and blocking its action may have benefits that include improving long-term memory in older people and treating cancer.

Blocking the action of a sugar could boost memory and even fight cancer. The neuron on the left has CREB with O-GlcNAc and is short. The neuron on the right does not have that form of CREB and is long. (Credit: Linda Hsieh-Wilson, Ph.D.)
Progress toward finding such a blocker for the sugar — with the appropriately malicious-sounding name “oh-glick-nack” — was the topic of a report presented at the 243rd National Meeting & Exposition of the American Chemical Society (ACS) in San Diego on March 27.
Linda Hsieh-Wilson, Ph.D., explained that the sugar is not table sugar (sucrose), but one of many other substances produced in the body’s cells that qualify as sugars from a chemical standpoint. Named O-linked beta-N-acetylglucosamine — or “O-GlcNAc” — it helps in orchestrating health and disease at their origins, inside the billions of cells that make up the body. O-GlcNAc does so by attaching to proteins that allow substances to pass in and out of the nucleus of cells, for instance, and helping decide whether certain genes are turned on or off. In doing so, O-GlcNAc sends signals that may be at the basis of cancer, diabetes, Alzheimer’s disease and other disorders. Research suggests, for instance, that proteins loaded up with too much O-GlcNAc can’t function normally.
At the ACS meeting, Hsieh-Wilson described how research in her lab at the California Institute of Technology and Howard Hughes Medical Institute implicate O-GlcNAc in memory loss and cancer. The research emerged from Hsieh-Wilson’s use of advanced lab tools for probing a body process that involves attachment of sugars like O-GlcNAc to proteins. Called protein glycosylation, it helps nerves and other cells communicate with each other in ways that keep the body coordinated and healthy. When O-GlcNAc is attached to a protein, that binding process is known as O-GlcNAc glycosylation.
Hsieh-Wilson’s team screened the entire mammalian brain for all O-GlcNAc-glycosylated proteins, using a new process that her laboratory developed. They identified more than 200 proteins bearing O-GlcNAc attachments or tags, many for the first time. The research was done in mice, stand-ins for humans in research that cannot be done on people. Some of the proteins carrying O-GlcNAc were involved in regulating processes like drug addiction and securing long-term storage of memories.
O-GlcNAc’s effects on one particular protein, CREB, got the scientists’ attention. CREB is a key substance that turns on and regulates the activity of genes. Many of the genes in cells are inactive at any given moment. Substances like CREB, termed transcription factors, turn genes on. Hsieh-Wilson found that when O-GlcNAc attached to CREB, CREB’s ability to turn on genes was impaired. When the researchers blocked O-GlcNAc from binding CREB, the mice developed long-term memories faster than normal mice.
Could blocking O-GlcNAc boost long-term memory in humans?
"We’re far from understanding what happens in humans," Hsieh-Wilson emphasized. "Completely blocking O-GlcNAc might not be desirable. Do you really want to sustain all memories long-term, even of events that are best forgotten? How would blocking the sugar from binding to other proteins affect other body processes? There are a lot of unanswered questions. Nevertheless, this research could eventually lead to ways to improve memory."
In a related study, Hsieh-Wilson found that O-GlcNAc interacted with another protein in ways that encourage the growth of cancer cells, suggesting that blocking its attachment might protect against cancer or slow the growth of cancer. And indeed, in mouse experiments, blocking O-GlcNAc resulted in much smaller tumors.
Again, a treatment for humans based on this discovery is far in the future, but the study singles out O-GlcNAc as a potential new target for developing anti-cancer drugs.
Source: Science Daily
March 27, 2012
A hallmark of human intelligence is the ability to efficiently adapt to uncertain, changing and open-ended environments. In such environments, efficient adaptive behavior often requires considering multiple alternative behavioral strategies, adjusting them, and possibly inventing new ones. These reasoning, learning and creative abilities involve the frontal lobes, which are especially well developed in humans compared to other primates. However, how the frontal function decides to create new strategies and how multiple strategies can be monitored concurrently remain largely unknown.

In a new study, published March 27 in the online, open-access journal PLoS Biology, Anne Collins and Etienne Koechlin of Ecole Normale Supérieure and Institut National de la Santé et de la Recherche Médicale, France, examine frontal lobe function using behavioral experiments and computational models of human decision-making. They find that human frontal function concurrently monitors no more than three or four strategies but favors creativity, i.e., the exploration and creation of new strategies whenever none of the monitored strategies appears reliable enough.
The researchers asked one hundred participants to find “3-digit pin codes” by a method of trial and error, under a variety of conditions. They then developed a computational model that predicted the responses produced by participants, which also revealed that participants made their choices by mentally constructing and concurrently monitoring up to three distinct behavioral strategies, flexibly associating digits, motor responses and expected auditory feedback.
"This is a remarkable result, because the actual number of correct codes varied across sessions. This suggests that this capacity limit is a hard constraint of human higher cognition," said Dr. Koechlin. Consistent with this, performance was significantly better in sessions including no more than three repeated codes.
Furthermore, the researchers found that the pattern of participants’ responses derived from a decision system that strongly favors the exploration of new behavioral strategies: “The results provide evidence that the human executive system favors creativity for compensating its limited monitoring capacity” explained Dr. Koechlin. “It favors the exploration of new strategies but restrains the monitoring and storage of uncompetitive ones. Interestingly, this ability to regulate creativity varied across participants and critically explains individual variations in performances. We believe our study may also help to understand the biological foundations of individual differences in decision-making and adaptive behavior”.
Provided by Public Library of Science
Source: medicalxpress.com
ScienceDaily (Mar. 27, 2012) — Cognitive training involving puzzles, handicrafts and life skills is known to reduce the risk, and help slow the progress, of dementia among the elderly. A new study published in BioMed Central’s open access journal BMC Medicine showed that cognitive training was able to improve the reasoning, memory, language and hand-eye coordination of healthy, older adults.
It is estimated that by 2050 the number of people over 65 years old will have increased to 1.1 billion worldwide, and that 37 million of these will suffer from dementia. Research has already shown that mental activity can reduce a person’s risk of dementia, but the effect of mental training on healthy people is less well understood. To address this, researchers from China have investigated the use of cognitive training as a defence against mental decline for healthy older adults who live independently.
To be recruited onto the trial, participants had to be between 65 and 75 years old and have good enough eyesight, hearing and communication skills to complete all parts of the training. The hour-long training sessions occurred twice a week for 12 weeks, and the subjects were given homework. Training either followed a multi-pronged approach tackling memory, reasoning, problem solving, map reading, handicrafts, health education and exercise, or focused on reasoning only. The effect of booster training, provided six months later, was also tested.
The results of the study were positive. Profs Chunbo Li and Wenyuan Wu who led the research explained, “Compared to the control group, who received no training, both levels of cognitive training improved mental ability, although the multifaceted training had more of a long term effect. The more detailed training also improved memory, even when measured a year later and booster training had an additional improvement on mental ability scores.”
This study shows that cognitive training may help prevent mental decline among healthy older people and allow them to continue living independently for longer in their advancing years.
Source: Science Daily
ScienceDaily (Mar. 26, 2012) — Working with genetically engineered mice and the genomes of thousands of people with schizophrenia, researchers at Johns Hopkins say they now better understand how both nature and nurture can affect one’s risks for schizophrenia and abnormal brain development in general.

The green neurons have reduced DISC1 protein. Red neurons have less effective GABA. (Credit: Johns Hopkins Medicine)
The researchers reported in the March 2 issue of Cell that defects in a schizophrenia-risk gene and environmental stress right after birth can together lead to abnormal brain development and raise the likelihood of developing schizophrenia by nearly one and a half times.
"Our study suggests that if people have a single genetic risk factor alone or a traumatic environment in very early childhood alone, they may not develop mental disorders like schizophrenia," says Guo-li Ming, M.D., Ph.D., professor of neurology and member of the Institute for Cell Engineering at the Johns Hopkins University School of Medicine. "But the findings also suggest that someone who carries the genetic risk factor and experiences certain kinds of stress early in life may be more likely to develop the disease."
Pinpointing the cause or causes of schizophrenia has been notoriously difficult, owing to the likely interplay of multiple genes and environmental triggers, Ming says. Searching for clues at the molecular level, the Johns Hopkins team focused on the interaction of two factors long implicated in the disease: Disrupted-in-Schizophrenia 1 (DISC1) protein, which is important for brain development, and GABA, a brain chemical needed for normal brain function.
To find how these factors impact brain development and disease susceptibility, the researchers first engineered mice to have reduced levels of DISC1 protein in one type of neuron in the hippocampus, a region of the brain involved in learning, memory and mood regulation. Through a microscope, they saw that newborn mouse brain cells with reduced levels of DISC1 protein had similar sized and shaped neurons as those from mice with normal levels of DISC1 protein. To change the function of the chemical messenger GABA, the researchers engineered the same neurons in mice to have more effective GABA. Those brain cells looked much different than normal neurons, with longer appendages or projections. Newborn mice engineered with both the more effective GABA and reduced levels of DISC1 showed the longest projections, suggesting, Ming said, that defects in both DISC1 and GABA together could change the physiology of developing neurons for the worse.
Meanwhile, other researchers at University of Calgary and at the National Institute of Physiological Sciences in Japan had shown in newborn mice that changes in environment and routine stress can impede GABA from working properly during development. In the next set of experiments, the investigators paired reducing DISC1 levels and stress in mice to see if it could also lead to developmental defects. To stress the mice, the team separated newborns from their mothers for three hours a day for ten days, then examined neurons from the stressed newborns and saw no differences in their size, shape and organization compared with unstressed mice. But when they similarly stressed newborn mice with reduced DISC1 levels, the neurons they saw were larger, more disorganized and had more projections than the unstressed mouse neurons. The projections, in fact, went to the wrong places in the brain.
Next, to see if their results in mice correlated to suspected human schizophrenia risk factors, the researchers compared the genetic sequences of 2,961 schizophrenia patients and healthy people from Scotland, Germany and the United States. Specifically, they determined if specific variations of DNA letters found in two genes, DISC1 and a gene for another protein, NKCC1, which controls the effect of GABA, were more likely to be found in schizophrenia patients than in healthy individuals. They paired 36 DNA “letter” changes in DISC1 and two DNA letter variations in NKCC1 — one DNA letter change per gene — in all possible combinations. Results showed that if a person’s genome contained one specific combination of single DNA letter changes, then that person is 1.4 times more likely than people without these DNA changes to develop schizophrenia. Having these single DNA letter changes in either one of these genes alone did not increase risk.
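The "1.4 times more likely" figure is a standard case-control effect size. As a minimal, purely illustrative sketch, such a number can be derived from a 2×2 table of variant-combination carriers versus non-carriers among patients and controls; the counts below, and the choice of an odds ratio as the measure, are assumptions for illustration, not values from the Johns Hopkins data:

```python
# Illustrative sketch only: hypothetical counts, NOT the study's data.
# Shows how a ~1.4-fold case-control effect size (here, an odds ratio)
# is computed from carrier counts for a DISC1 + NKCC1 variant pairing.

def odds_ratio(cases_with, cases_without, controls_with, controls_without):
    """Odds of carrying the variant combination in cases vs. controls."""
    return (cases_with / cases_without) / (controls_with / controls_without)

# Hypothetical 2x2 table: carriers vs. non-carriers of one specific
# combination of single DNA letter changes, in patients and controls.
print(round(odds_ratio(140, 1000, 100, 1000), 2))  # 1.4
```

In practice such an estimate would be reported with a confidence interval and significance test; the key point matching the article is that the elevated value appears only for the combined DISC1/NKCC1 genotype, while either variant alone shows no effect.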
"Now that we have identified the precise genetic risks, we can rationally search for drugs that correct these defects," says Hongjun Song, Ph.D., co-author, professor of neurology and director of the Stem Cell Program at the Institute for Cell Engineering.
Source: Science Daily
ScienceDaily (Mar. 26, 2012) — Repeated stress triggers the production and accumulation of insoluble tau protein aggregates inside the brain cells of mice, say researchers at the University of California, San Diego School of Medicine in a new study published in the March 26 Online Early Edition of the Proceedings of the National Academy of Sciences.

Exposing mice to 14 days of repeated stress resulted in an accumulation of insoluble phosphorylated tau protein aggregates in brain cells, visualized in this electron micrograph. (Credit: Image courtesy of Robert Rissman, UC San Diego)
The aggregates are similar to neurofibrillary tangles or NFTs, modified protein structures that are one of the physiological hallmarks of Alzheimer’s disease. Lead author Robert A. Rissman, PhD, assistant professor of neurosciences, said the findings may at least partly explain why clinical studies have found a strong link between people prone to stress and development of sporadic Alzheimer’s disease (AD), which accounts for up to 95 percent of all AD cases in humans.
"In the mouse models, we found that repeated episodes of emotional stress, which has been demonstrated to be comparable to what humans might experience in ordinary life, resulted in the phosphorylation and altered solubility of tau proteins in neurons," Rissman said. "These events are critical in the development of NFT pathology in Alzheimer’s disease."
The effect was most notable in the hippocampus, said Rissman, a region of the brain linked to the formation, organization and storage of memories. In AD patients, the hippocampus is typically the first region of the brain affected by tau pathology and the hardest-hit, with substantial cell death and shrinkage.
Not all forms of stress are equally threatening. In earlier research, Rissman and colleagues reported that acute stress — a single, passing episode — does not result in lasting, debilitating changes in the accumulation of phosphorylated tau. Acute stress-induced modifications in the cell are transient, he said, and on the whole, probably beneficial.
"Acute stress may be useful for brain plasticity and helping to facilitate learning. Chronic stress and continuous activation of stress pathways may lead to pathological changes in stress circuitry. It may be too much of a good thing." As people age, perhaps their neuronal circuits do too, he said, becoming less robust and perhaps less capable of completely rebounding from the effects of stress.
"Age is the primary, known risk factor for Alzheimer’s disease. It may be that as we age, our neurons just aren’t as plastic as they once were and some succumb."
The researchers observed that stress cues impacted two key corticotropin-releasing factor receptors, suggesting a target for potential therapies. Rissman noted drugs already exist and are in human trials (for other conditions) that modulate the activity of these receptors.
"You can’t eliminate stress. We all need to be able to respond at some level to stressful stimuli. The idea is to use an antagonist molecule to reduce the effects of stress upon neurons. The stress system can still respond, but the response in the brain and hippocampus would be toned down so that it doesn’t result in harmful, permanent damage."
The authors dedicate this work to longtime mentor and colleague Dr. Wylie Vale, whose years of pioneering work deciphering and describing the stress system were fundamental to this paper. Vale passed away earlier this year at the age of 70.
Source: Science Daily
ScienceDaily (Mar. 26, 2012) — Individuals with major depressive disorder (MDD) often undergo multiple courses of antidepressant treatment during their lives. This is because the disorder can recur despite treatment and because finding the right medication for a specific individual can take time.
While the relationship between prior treatment and the brain’s response to subsequent treatment is unknown, a new study by UCLA researchers suggests that how the brain responds to antidepressant medication may be influenced by its “memory” of past antidepressant exposure.
Interestingly, the researchers used a harmless placebo as the key to tracking the footprints of prior antidepressant use.
Aimee Hunter, the study’s lead author and an assistant professor of psychiatry at UCLA’s Semel Institute for Neuroscience and Human Behavior, and colleagues showed that a simple placebo pill, made to look like actual medication for depression, can “trick” the brain into responding in the same manner as the actual medication.
The report was published online March 23 in the journal European Neuropsychopharmacology.
The investigators examined changes in brain function in 89 depressed persons during eight weeks of treatment, using either an antidepressant medication or a similar-looking placebo pill. They set out to compare the two treatments — medication versus placebo — but they also added a twist: They separately examined the data for subjects who had never previously taken an antidepressant and those who had.
The researchers focused on the prefrontal cortex, an area of the brain thought to be involved in planning complex cognitive behavior, personality expression, decision-making and moderating social behavior, all things depressed people wrestle with.
Brain changes were assessed using electroencephalograph (EEG) measures developed at UCLA by study co-authors Dr. Ian Cook, UCLA’s Miller Family Professor of Psychiatry, and Dr. Andrew Leuchter, a professor of psychiatry and director of the Laboratory of Brain, Behavior and Pharmacology at UCLA’s Semel Institute. The EEG measure, recorded from scalp electrodes, is linked to blood flow in the cerebral cortex, which suggests the level of brain activity.
The antidepressant medication given during the study appeared to produce slight decreases in prefrontal brain activity, regardless of whether subjects had received prior antidepressant treatment during their lifetime or not. (A decrease in brain activity is not necessarily a bad thing, the researchers note; with depression, too much activity in the brain can be as bad as too little.)
However, the researchers observed striking differences in the power of placebo, depending on subjects’ prior antidepressant use. Subjects who had never been treated with an antidepressant exhibited large increases in prefrontal brain activity during placebo treatment. But those who had used antidepressant medication in the past showed slight decreases in prefrontal activity — brain changes that were indistinguishable from those produced by the actual drug.
"The brain’s response to the placebo pill seems to depend on what happened previously — on whether or not the brain has ever ‘seen’ antidepressant medication before," said Hunter, who is a member of the placebo research team at the Laboratory of Brain, Behavior and Pharmacology. "If it has seen it before, then the brain’s signature ‘antidepressant-exposure’ response shows up."
According to Hunter, the effect looks conspicuously like a classical conditioning phenomenon, wherein prior exposure to the actual drug may have produced the specific prefrontal brain response and subsequent exposure to the cues surrounding drug administration — the relationship with the doctor or nurse, the medical treatment setting, the act of taking a prescribed pill and so forth — came to elicit a similar brain response through ‘conditioning’ or ‘associative learning.’
While medication can have a powerful effect on our physiology, said Hunter, “the behaviors and cues in the environment that are associated with taking medication can come to elicit their own effects. One’s personal treatment history is one of the many factors that influence the overall effects of treatment.”
Still, she noted, there are other possible explanations, and further research is needed to tease out changes in brain function that are related to antidepressant exposure, compared with brain changes that are related to clinical improvement during treatment.
Source: Science Daily
ScienceDaily (Mar. 26, 2012) — Smoking alters the impact of a schizophrenia risk gene. Scientists from the universities of Zurich and Cologne demonstrate that healthy people who carry this risk gene and smoke process acoustic stimuli in a similarly deficient way as patients with schizophrenia. Furthermore, the impact is all the stronger the more the person smokes.
Schizophrenia has long been known to be hereditary. However, because a melting pot of disorders with different genetic causes lies behind the manifestations of schizophrenia, research has yet to identify a single main gene responsible.
To study the genetic background of schizophrenia, researchers have until now mostly compared the frequency of particular risk genes between healthy and ill people. Pharmacopsychologist Professor Boris Quednow from the University Hospital of Psychiatry, Zurich, and Professor Georg Winterer’s workgroup at the University of Cologne have now adopted a novel approach. Using electroencephalography (EEG), the scientists studied the processing of simple acoustic stimuli (a sequence of similar clicks). When processing a particular stimulus, healthy people suppress the processing of other stimuli that are irrelevant to the task at hand. Patients with schizophrenia exhibit deficits in this kind of stimulus filtering, and thus their brains are probably inundated with too much information. As psychiatrically healthy people also filter stimuli with varying degrees of efficiency, individual stimulus processing can be associated with particular genes.
Smokers process stimuli less effectively
In a large-scale study involving over 1,800 healthy participants from the general population, Boris Quednow and Georg Winterer examined how far acoustic stimulus filtering is connected with a known risk gene for schizophrenia: the so-called “transcription factor 4” gene (TCF4). TCF4 encodes a protein that plays a key role in early brain development. As patients with schizophrenia often smoke, the scientists also studied the smoking habits of the test subjects.
The collected data show that psychiatrically healthy carriers of the TCF4 risk variant also filter stimuli less effectively — like people who suffer from schizophrenia. It turned out that it was primarily smokers carrying the risk gene who displayed less effective filtering of acoustic impressions, and this effect was all the more pronounced the more the person smoked. Non-smoking carriers of the risk gene, however, processed stimuli only slightly worse. “Smoking alters the impact of the TCF4 gene on acoustic stimulus filtering,” says Boris Quednow, explaining this kind of gene-environment interaction. “Therefore, smoking might also increase the impact of particular genes on the risk of schizophrenia.”
The results could also be significant for predicting schizophrenic disorders and for new treatment approaches, says Quednow and concludes: “Smoking should also be considered as an important cofactor for the risk of schizophrenia in future studies.” A combination of genetic (e.g. TCF4), electrophysiological (stimulus filtering) and demographic (smoking) factors could help diagnose the disorder more rapidly or also define new, genetically more uniform patient subgroups.
Source: Science Daily
ScienceDaily (Mar. 26, 2012) — Though autism is often not diagnosed until the age of three, some children begin to show signs of developmental delay before they turn a year old. While not all infants and toddlers with delays will develop autism spectrum disorders (ASD), experts point to early detection of these signs as key to early diagnosis and intervention, which are believed to improve developmental outcomes.
According to Dr. Rebecca Landa, director of the Center for Autism and Related Disorders at the Kennedy Krieger Institute in Baltimore, Md., parents need to be empowered to identify the warning signs of ASD and other communication delays.
"We want to encourage parents to become good observers of their children’s development so that they can see the earliest indicators of delays in a baby’s communication, social and motor skills," says Dr. Landa, who also cautions that some children who develop ASD don’t show signs until after the second birthday or regress after appearing to develop typically.
For the past decade, Dr. Landa has followed infant siblings of children with autism to identify red flags of the disorder in their earliest form. Her research has shown that diagnosis is possible in some children as young as 14 months, and it has sparked the development of early intervention models that have been shown to improve outcomes for toddlers showing signs of ASD as young as one and two years old.
Dr. Landa recommends that as parents play with their infant (6 to 12 months), they look for the following signs that have been linked to later diagnosis of ASD or other communication disorders:
1. Rarely smiles when approached by caregivers
2. Rarely tries to imitate sounds and movements others make, such as smiling and laughing, during simple social exchanges
3. Delayed or infrequent babbling
4. Does not respond to his or her name with increasing consistency from 6 to 12 months
5. Does not gesture to communicate by 10 months
6. Poor eye contact
7. Seeks your attention infrequently
8. Repeatedly stiffens arms, hands or legs, or displays unusual body movements such as rotating the hands on the wrists, uncommon postures or other repetitive behaviors
9. Does not reach up toward you when you reach to pick him or her up
10. Delays in motor development, including delayed rolling over, pushing up and crawling
"If parents suspect something is wrong with their child’s development, or that their child is losing skills, they should talk to their pediatrician or another developmental expert," says Dr. Landa. "Don’t adopt a ‘wait and see’ perspective. We want to identify delays early in development so that intervention can begin when children’s brains are more malleable and still developing their circuitry."
Source: Science Daily