Posts tagged science

Autism Linked to Increased Genetic Change in Regions of Genome Instability
Children with autism have increased levels of genetic change in regions of the genome prone to DNA rearrangements, so-called “hotspots,” according to a research discovery to be published in the print edition of the journal Human Molecular Genetics. The research indicates that these genetic changes come in the form of an excess of duplicated DNA segments in hotspot regions and may affect the chances that a child will develop autism — a behavioral disorder that affects about 1 of every 88 children in the United States, according to the Centers for Disease Control and Prevention.
Earlier work had identified, in children with autism, a greater frequency of rare DNA deletions or duplications, known as DNA copy number changes. These rare and harmful events are found in approximately 5 to 10 percent of cases, raising the question as to what other genetic changes might contribute to the disorders known as autism spectrum disorders.
The new research shows that children with autism have — in addition to these rare events — an excess of duplicated DNA, including more common variants that are not exclusive to children with autism but are found at elevated levels compared to typically developing children. The research collaboration includes groups led at Penn State by Scott Selleck; at the University of California Davis/MIND Institute by Isaac Pessah, Irva Hertz-Picciotto, Flora Tassone, and Robin Hansen; and at the University of Washington by Evan Eichler.
The investigators also found that the balance of DNA duplications and deletions in children with autism was different from that found in more severe developmental disorders, such as intellectual disability or multiple congenital anomalies, where the levels of both deletions and duplications are increased compared to controls, and are even higher than in children with autism.
They also found that children who had more difficulty with daily living skills also had the greatest level of copy number change throughout their genome. “These measures of adaptive behavior provide an indication of the severity of the impairment in the children with autism. These behaviors were significantly correlated with the amount of DNA copy number change,” Selleck said, emphasizing that the research revealed “clear and graded effects of the genetic change.”
“These results beg the question as to the origin of this genetic change,” Selleck said. “The increased levels of both rare and common variants suggest the possibility that these individuals are predisposed to genetic alteration.”
A vigorous debate is ongoing in the research community about the degree of genetic versus environmental contributions to autism. Selleck said the finding of an overall increase in genetic change in children with autism heightens the need to search for the basis of this variation. “We know that environmental factors can affect the stability of the genome, but we don’t know if the DNA copy number change we detect in these children is a result of environmental exposures, nutrition, medical factors, lifestyle, genetic susceptibility, or combinations of many elements together,” Selleck said. “The elevated levels of common variants is telling us something. It suggests that pure selection of randomly generated variants may not be the whole story.”
The Penn State team includes Department of Biochemistry and Molecular Biology Associate Professor Marylyn Ritchie and Assistant Professor Santhosh Girirajan. “The relationship between the level of copy number change and the degree of neurodevelopmental disability is something we have noted previously for large, rare variants,” says Girirajan, “but this work extends those observations to common copy number variants, suggesting the level of copy number change in children with autism is larger than we had appreciated.” Girirajan, the first author of the study, coordinated the effort between the Penn State and University of Washington researchers.
Human diseases caused by misfolded proteins known as prions are some of the rarest yet most terrifying on the planet — incurable, with disturbing symptoms that include dementia, personality shifts, hallucinations and coordination problems. The most well-known of these is Creutzfeldt-Jakob disease, which can be described as the naturally occurring human equivalent of mad cow disease.
Now, scientists from the Florida campus of The Scripps Research Institute (TSRI) have for the first time identified a pair of drugs already approved for human use that show anti-prion activity and, for one of them, great promise in treating these universally fatal disorders.
The study, led by TSRI Professor Corinne Lasmézas and performed in collaboration with TSRI Professor Emeritus Charles Weissmann and Director of Lead Identification Peter Hodder, was published this week online ahead of print by the journal Proceedings of the National Academy of Sciences.
The new study used an innovative high-throughput screening technique to uncover compounds that decrease the amount of the normal form of the prion protein (PrP, which becomes distorted by the disease) at the cell surface. The scientists found two compounds that reduced PrP on cell surfaces by approximately 70 percent in the screening and follow-up tests.
The two compounds are already marketed as the drugs tacrolimus and astemizole.
Tacrolimus is an immune suppressant widely used in organ transplantation. Tacrolimus could prove problematic as an anti-prion drug, however, because of issues including possible neurotoxicity.
However, astemizole is an antihistamine that has potential for use as an anti-prion drug. While withdrawn voluntarily from the U.S. over-the-counter market in 1999 because of rare cardiac arrhythmias when used in high doses, it has been available in generic form in more than 30 countries and has a well-established safety profile. Astemizole not only crosses the blood-brain barrier, but works effectively at a relatively low concentration.
Lasmézas noted that astemizole appears to stimulate autophagy, the process by which cells eliminate unwanted components. “Autophagy is involved in several protein misfolding neurodegenerative diseases such as Alzheimer’s, Parkinson’s and Huntington’s diseases,” she said. “So future studies on the mode of action of astemizole may uncover potentially new therapeutic targets for prion diseases and similar disorders.”
The study noted that eliminating cell surface PrP expression could also be a potentially new approach to treat Alzheimer’s disease, which is characterized by the build-up of amyloid β plaque in the brain. PrP is a cell surface receptor for Aβ peptides and helps mediate a number of critical deleterious processes in animal models of the disease.
(Source: scripps.edu)

Brain-imaging tool and stroke risk test help identify cognitive decline early
The connection between stroke risk and cognitive decline has been well established by previous research. Individuals with higher stroke risk, as measured by factors like high blood pressure, have traditionally performed worse on tests of memory, attention and abstract reasoning.
The current small study demonstrated that not only stroke risk, but also the burden of plaques and tangles, as measured by a UCLA brain scan, may influence cognitive decline.
The imaging tool used in the study was developed at UCLA and reveals early evidence of amyloid beta “plaques” and neurofibrillary tau “tangles” in the brain — the hallmarks of Alzheimer’s disease.
The study, published in the April issue of the Journal of Alzheimer’s Disease, demonstrates that taking both stroke risk and the burden of plaques and tangles into account may offer a more powerful assessment of factors determining how people are doing now and will do in the future.
“The findings reinforce the importance of managing stroke risk factors to prevent cognitive decline even before clinical symptoms of dementia appear,” said first author Dr. David Merrill, an assistant clinical professor of psychiatry and biobehavioral sciences at the Semel Institute for Neuroscience and Human Behavior at UCLA.
This is one of the first studies to examine both stroke risk and plaque and tangle levels in the brain in relation to cognitive decline before dementia has even set in, Merrill said.
According to the researchers, the UCLA brain-imaging tool could prove useful in tracking cognitive decline over time and offer additional insight when used with other assessment tools.
For the study, the team assessed 75 people who were healthy or had mild cognitive impairment, a risk factor for the future development of Alzheimer’s. The average age of the participants was 63.
The individuals underwent neuropsychological testing and physical assessments to calculate their stroke risk using the Framingham Stroke Risk Profile, which examines age, gender, smoking status, systolic blood pressure, diabetes, atrial fibrillation (irregular heart rhythm), use of blood pressure medications, and other factors.
In addition, each participant was injected with a chemical marker called FDDNP, which binds to deposits of amyloid beta plaques and neurofibrillary tau tangles in the brain. The researchers then used positron emission tomography (PET) to image the brains of the subjects — a method that enabled them to pinpoint where these abnormal proteins accumulate.
The study found that greater stroke risk was significantly related to lower performance in several cognitive areas, including language, attention, information-processing speed, memory, visual-spatial functioning (e.g., ability to read a map), problem-solving and verbal reasoning.
The researchers also observed that FDDNP binding levels in the brain correlated with participants’ cognitive performance. For example, volunteers who had greater difficulties with problem-solving and language displayed higher levels of the FDDNP marker in areas of their brain that control those cognitive activities.
“Our findings demonstrate that the effects of elevated vascular risk, along with evidence of plaques and tangles, are apparent early on, even before vascular damage has occurred or a diagnosis of dementia has been confirmed,” said the study’s senior author, Dr. Gary Small, director of the UCLA Longevity Center and a professor of psychiatry and biobehavioral sciences who holds the Parlow–Solomon Chair on Aging at UCLA’s Semel Institute.
Researchers found that several individual factors in the stroke assessment stood out as predictors of decline in cognitive function, including age, systolic blood pressure and use of blood pressure–related medications.
Small noted that the next step in the research would be studies with a larger sample size to confirm and expand the findings.
The initial clinical trial of a novel approach to treating amyotrophic lateral sclerosis (ALS) – blocking production of a mutant protein that causes an inherited form of the progressive neurodegenerative disease – may be a first step towards a new era in the treatment of such disorders. Investigators from Massachusetts General Hospital (MGH) and Washington University School of Medicine report that infusion of an antisense oligonucleotide against SOD1, the first gene to be associated with familial ALS, had no serious adverse effects and the drug was successfully distributed throughout the central nervous system.
"This therapy directly targets the cause of this form of ALS – a mutation in SOD1, which was originally discovered here at the MGH by my mentor Robert Brown," says Merit Cudkowicz, MD, chief of Neurology at MGH and senior author of the report in Lancet Neurology, which has been released online. “It’s very exciting that we have reached a stage when we can start clinical trials against this type of ALS.”
ALS causes the death of motor neurons in the brain and spinal cord, stopping transmission of neural signals to nerve fibers and leading to weakness, paralysis and usually death from respiratory failure. Only 10 percent of ALS cases are inherited, and mutations in SOD1 – which produce an aberrant, toxic form of the protein – account for about 20 percent of familial cases. Although that first SOD1 mutation was identified 20 years ago by the team led by Brown – who is now professor and chief of Neurology at the University of Massachusetts Medical School – a technology that directly addresses such mutations became available only recently.
The current study, the first author of which is Timothy Miller, MD, PhD, of Washington University, used what are called antisense oligonucleotides – small, single-stranded DNA or RNA molecules that prevent production of a protein by binding to its messenger RNA. While antisense medications have been tested against several types of disease, this was the first trial in a neurological disorder, making the assurance of safety – a primary goal of a phase 1 study – particularly important. Studies in animal models led by Miller and others found that the experimental antisense drug used in this trial reduced expression of mutated and nonmutated SOD1 and slowed the progression of ALS.
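The base-pairing logic behind an antisense oligonucleotide – a strand complementary to its target message – can be sketched in a few lines of code. The sequence below is an arbitrary illustration, not the actual SOD1-targeting drug:

```python
# An antisense oligo is (conceptually) the reverse complement of a
# stretch of the target mRNA, so it can hybridize with the message
# and block translation of the protein.
COMPLEMENT = {"A": "T", "U": "A", "G": "C", "C": "G"}

def antisense_dna(mrna: str) -> str:
    """Return the DNA oligo complementary to an mRNA fragment, read 5'->3'."""
    return "".join(COMPLEMENT[base] for base in reversed(mrna))

# Hypothetical mRNA fragment, for illustration only:
print(antisense_dna("AUGGCCUUC"))  # -> GAAGGCCAT
```

Real therapeutic oligonucleotides are, of course, chemically modified and chosen from screens for potency and safety; the pairing rule above is only the starting principle.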
Conducted at the MGH, Washington University, Johns Hopkins University and the Methodist Neurological Institute in Houston, the trial enrolled a total of 21 patients with SOD1 familial ALS. Four sequential groups of participants received spinal infusions over an 11-hour period of the antisense drug or a placebo, with the active drug being administered at one of four dosage levels. Since participants in one group were free to join a subsequent group more than 60 days later, seven received two infusions and two received a total of three.
Some of the participants reported the type of adverse effects typically associated with spinal infusions – headache and back pain – with no difference between the active drug and placebo groups. Participants who received subsequent infusions reported fewer adverse effects. Cerebrospinal fluid samples taken immediately after infusion revealed the presence of the antisense oligonucleotide in all participants receiving the drug at levels close to what was predicted based on animal studies. Analysis of spinal cord samples from one participant who had later died from ALS found drug levels highest at the site of the infusion and lowest at the furthest point and suggested that prior estimates of how long the drug would persist in the spinal cord were accurate.
Cudkowicz notes that the next step will be a larger study to address long-term safety and take a first look at the effectiveness of antisense treatment against ALS. “This is a very important step forward for neurodegenerative disorders in general,” she explains. “There are other ALS gene mutations that antisense technology may be useful against. There also is an ongoing study of a different oligonucleotide against spinal muscular atrophy, and preclinical studies in Huntington’s disease, myotonic dystrophy and other neurological disorders are in development.”
“The first person with ALS that I cared for had SOD1 ALS,” she adds, “and I promised her a commitment to finding a treatment for this form of the disease. It’s so gratifying to finally be at the stage of knowledge where we can start testing this treatment in patients with SOD1 ALS. We also hope that this treatment may apply to the broader population of patients with sporadic ALS.” Cudkowicz is the Julieanne Dorn Professor of Neurology at Harvard Medical School.
(Source: massgeneral.org)

Researchers Develop New System to Study Trigger of Cell Death in Nervous System
Researchers at the University of Arkansas have developed a new model system to study a receptor protein that controls cell death in both humans and fruit flies, a discovery that could lead to a better understanding of neurodegenerative diseases such as Alzheimer’s and Parkinson’s.
Michael Lehmann, an associate professor of biological sciences, uses fruit fly genetics to study the receptor — N-methyl-D-aspartate receptor, known as the NMDA receptor — that triggers programmed cell death in the human nervous system.
With an aging population, neurodegenerative diseases have become a major public health concern, Lehmann said.
“Whenever brain cells die as a result of neurodegenerative disease, or as a consequence of injuries caused by stroke, exposure to alcohol or neurotoxins, this receptor is involved,” he said. “So it’s very important to understand how it functions and how it may be possible to influence it.”
When larvae of Drosophila melanogaster, a common fruit fly, grow from the larval stage into adults, they shed most of their former organs and grow new ones. About a year and a half ago, researchers in Lehmann’s laboratory discovered that the NMDA receptor is required for cell death in the system that they had used for several years to study basic mechanisms of programmed cell death in fruit flies.
“Our model system for studying programmed cell death is the salivary glands in the fly larvae, which are comparatively large organs that completely disappear during metamorphosis,” he said. “Disposal of this tissue by programmed cell death provides us with a very nice system to study the genes that are required for the process. We can use it to identify genes that are required for programmed cell death in humans, as well.”
The National Institutes of Health has awarded Lehmann a three-year, $260,530 grant to support the study.
Brandy Ree, a doctoral student in the interdisciplinary graduate program in cell and molecular biology, worked with Lehmann to use a combination of biochemistry and fruit fly genetics in an attempt to define the pathway that leads from activation of the receptor to the cell’s eventual death.
“We developed a new system to study the receptor outside the nervous system in a normal developmental context,” Lehmann said. “Many of the different components involved in cell death are known in this system. There are more than 30,000 publications about this receptor, but there is still very little known about how the receptor causes cell death. We just have to connect the dots and fit the receptor into the pathway to find out how exactly it contributes to the cell’s death.”
A mid-career investigator in the Center for Protein Structure and Function at the University of Arkansas, Lehmann has studied programmed cell death in Drosophila melanogaster for more than a decade.
In 2007, Lehmann’s research group discovered an important mechanism that regulates the destruction of larval fruit fly salivary glands that could point the way to understanding programmed cell death in the human immune system. They published their findings in the Journal of Cell Biology.
(Image: BD Biosciences)

Research identifies co-factors critical to PTSD development
Research led by Ya-Ping Tang, MD, PhD, Associate Professor of Cell Biology and Anatomy at LSU Health Sciences Center New Orleans, has found that the action of a specific gene occurring during exposure to adolescent trauma is critical for the development of adult-onset Post-Traumatic Stress Disorder (PTSD). The findings are published in PNAS Online Early Edition the week of April 1-5, 2013.
"This is the first study to show that a timely manipulation of a certain neurotransmitter system in the brain during the stage of trauma exposure is potentially an effective strategy to prevent the pathogenesis of PTSD," notes Dr. Tang.
The research team conducted a series of experiments using a specific strain of transgenic mice, in which the function of the gene can be suppressed and then restored. The model combined exposure to adolescent trauma as well as an acute stressor. Clinically, PTSD may occur immediately following a trauma, but in many cases, a time interval may exist between the trauma and the onset of disease. Exposure to a second stress or re-victimization can be an important causative factor. However, the researchers discovered that exposure to both adolescent trauma and to acute stress was not enough to produce consistent PTSD-like behavior. When exposure to trauma and stress was combined with the function of a specific transgene called CCKR-2, consistent PTSD-like behavior was observed in all of the behavioral tests, indicating that the development of PTSD does not depend only on the trauma itself.
One of the predominant anxiety disorders, PTSD affects 7.8% of people between the ages of 15 and 54 in the United States. PTSD can cause feelings of hopelessness, despair and shame, employment and relationship problems, anger, and sleep difficulties. Additionally, PTSD can increase the risk of other mental health conditions including depression, substance abuse, eating disorders, and suicidal thoughts, as well as certain medical conditions including cardiovascular disease, chronic pain, autoimmune disorders, and musculoskeletal conditions.
A favored current theory of the development of anxiety disorders, including PTSD, is a gene/environment interaction. This study demonstrated that the function of the CCKR-2 gene in the brain is a cofactor, along with trauma insult, and identified a critical time window for the interaction in the development of PTSD.
"Once validated in human subjects, our findings may help target potential therapies to prevent or cure this devastating mental disorder," Dr. Tang concludes.
(Image: canstockphoto)
Accused of complicity in Alzheimer’s, amyloid proteins may be getting a bad rap
Amyloids — clumps of misfolded proteins found in the brains of people with Alzheimer’s disease and other neurodegenerative disorders — are the quintessential bad boys of neurobiology. They’re thought to muck up the seamless workings of the neurons responsible for memory and movement, and researchers around the world have devoted themselves to devising ways of blocking their production or accumulation in humans.
But now a pair of recent research studies from the Stanford University School of Medicine sets a solid course toward rehabilitating the reputation of the proteins that form these amyloid tangles, or plaques. In the process, they appear poised to turn the field of neurobiology on its head.
The first study, published in August, showed that an amyloid-forming protein called beta amyloid, which is strongly implicated in Alzheimer’s disease, could reverse the symptoms of a multiple-sclerosis-like neurodegenerative disease in laboratory mice.
The second study, published April 3 in Science Translational Medicine, extends the finding to show that small portions of several notorious amyloid-forming proteins (including well-known culprits like tau and prion proteins) can also quickly alleviate symptoms in mice with the condition — despite the fact that the fragments can and do form the long tendrils, or fibrils, previously thought harmful to nerve health.
“What we’re finding is that, at least under certain circumstances, these amyloid peptides actually help the brain,” said Lawrence Steinman, MD, professor of neurology and neurological sciences and of pediatrics. “This really turns the ‘amyloid-is-bad’ dogma upside down. It will require a shift in people’s fundamental beliefs about neurodegeneration and diseases like multiple sclerosis, Alzheimer’s and Parkinson’s.”
Steinman is a noted expert in multiple sclerosis whose research led to the development of natalizumab (marketed as Tysabri), a potent treatment for the disease.
Taken together, the studies begin to suggest the radical new idea that full-length, amyloid-forming proteins may in fact be produced by the body as a protective, rather than destructive, force. In particular, Steinman’s study shows that these proteins may function as molecular chaperones, escorting and removing from sites of injury specific molecules involved in inflammation and inappropriate immune responses.
Steinman, who is also the medical school’s George A. Zimmermann Professor, is the corresponding author of the research. Jonathan Rothbard, PhD, a senior research scientist in the Steinman laboratory, is the senior author; postdoctoral scholar Michael Kurnellas, PhD, is the lead author.
Although the specific findings of Steinman’s two studies are surprising, there have been inklings from previous research that amyloid-forming proteins may not be all bad. In particular, inhibiting, or knocking out, the expression of several of the proteins in the mouse models of multiple sclerosis — a technique that should block the course of the disease if these proteins are the cause — instead worsened the animals’ symptoms.
And there’s the fact that these so-called dangerous amyloid-forming molecules are surprisingly prevalent. “We know the body makes a lot of amyloid-forming proteins in response to injury,” said Steinman. “I’m doubtful that that’s done to produce more harm. For example, the prion protein is found in every cell in our bodies. What is it doing? It’s possible that any therapeutic maneuver to remove all of these proteins could interfere with their natural function.”
Understanding how amyloids form requires an understanding of the biology of proteins, which are essentially strings of smaller components called amino acids attached end to end. Once they’re made, these protein strings twist and fold into specific three-dimensional shapes that fit together like keys and locks to do the work of the cell.
A misfolded protein is likely to be unable to carry out its duties and must be disposed of by the body’s cellular waste-management system. Amyloid-forming proteins (of which there are around 20), however, don’t go quietly, if at all. Instead, they initiate a chain reaction with other misfolded proteins — forming long, insoluble strands called fibrils that mat together to form amyloid clumps. These clumps appear consistently in the brains of people with neurodegenerative diseases like Alzheimer’s and multiple sclerosis, but not in the brains of healthy people.
Although these clumps are thought to be detrimental to nerve cells, it’s not entirely clear how they cause harm. One possibility is the ability of the fibrils to form cylindrical pores that could disrupt the cellular membrane and interfere with the orderly flow of ions and molecules used by the cells to communicate and transmit nerve signals. Regardless, their very presence suggests a diagnosis of neurodegeneration to many clinicians, including — until recently — Steinman.
“We began this research because these molecules are present in the brains of people with multiple sclerosis,” said Steinman. “We expected to show that the presence of beta amyloid made the disease worse in laboratory animals. Instead, we saw a great deal of benefit.”
Intrigued by the results of their first study, the researchers next tested the effect of small, six-amino-acid portions of several amyloid-forming proteins, including beta amyloid, which appeared likely to share a three-dimensional structure. They found that nearly all of the tiny protein molecules, or hexamers, were also able to temporarily reverse the symptoms of multiple sclerosis in the mice (when the treatment was stopped, the mice developed signs of the condition within a few days).
The researchers noted, however, that the curative effect of the hexamers was linked to their ability to form fibrils similar, but not identical, to their longer parent molecules. For example, these simplified hexamer fibrils are more easily formed and broken apart than those composed of whole proteins. They are also thought not to be able to form the cylindrical pores that might damage cell membranes. Finally, the hexamer fibrils appear to inhibit the formation of fibrils from full-length proteins — perhaps by blocking, or failing to promote, the chain reaction that initiates fibril formation.
When Steinman and his colleagues mixed the fibril-forming hexamers with blood plasma from three people with multiple sclerosis, they found that the fibrils bound to and removed from solution many potentially damaging molecules involved in inflammation and the immune response.
“These hexamer fibrils appear to be working to remove dangerous chemicals from the vicinity of the injury,” said Steinman.
The researchers are eager to pursue the use of these small hexamers as therapies for neurodegenerative diseases like multiple sclerosis. Much research is still needed, but Steinman is hopeful.
“The lessons we learn from our study of amyloid-forming proteins in multiple sclerosis could be helpful for stroke and brain trauma, as well as for Alzheimer’s,” said Steinman. “We’re gaining insight into how current therapeutic approaches may be affecting the body, and beginning to understand the nuances necessary to design a successful treatment. Although it will take time, we’re determined to move promising results out of the laboratory and into the clinic as quickly as possible.”
(Image: Wikimedia Commons)
Laser Light Zaps Away Cocaine Addiction
By stimulating one part of the brain with laser light, researchers at the National Institutes of Health (NIH) and the Ernest Gallo Clinic and Research Center at UC San Francisco (UCSF) have shown that they can wipe away addictive behavior in rats – or conversely turn non-addicted rats into compulsive cocaine seekers.
“When we turn on a laser light in the prelimbic region of the prefrontal cortex, the compulsive cocaine seeking is gone,” said Antonello Bonci, MD, scientific director of the intramural research program at the NIH’s National Institute on Drug Abuse (NIDA), where the work was done. Bonci is also an adjunct professor of neurology at UCSF and an adjunct professor at Johns Hopkins University.
Described this week in the journal Nature, the new study demonstrates the central role the prefrontal cortex plays in compulsive cocaine addiction. It also suggests a new therapy that could be tested immediately in humans, said Billy Chen of NIDA, the lead author of the study.
Any new human therapy would not be based on using lasers, but would most likely rely on electromagnetic stimulation outside the scalp, in particular a technique called transcranial magnetic stimulation (TMS). Clinical trials are now being designed to test whether this approach works, Chen added.
The High Cost of Cocaine Abuse
Cocaine abuse is a major public health problem in the United States today, and it places a heavy toll on society in terms of lost job productivity, lost earnings, cocaine-related crime, incarcerations, investigations, and treatment and prevention programs.
The human toll is even greater, with an estimated 1.4 million Americans addicted to the drug. It is frequently the cause of emergency room visits – 482,188 in 2008 alone – and it is a top cause of heart attacks and strokes for people under 35.
One of the hallmarks of cocaine addiction is compulsive drug taking – the loss of ability to refrain from taking the drug even if it’s destroying one’s life.
What makes the new work so promising, said Bonci, is that Chen and his colleagues were working with an animal model that mimics this sort of compulsive cocaine addiction. The animals, like human addicts, are more likely to make bad decisions and take cocaine even when they are conditioned to expect self-harm associated with it.
Electrophysiological studies involving these rats have shown that they have extremely low activity in the prefrontal cortex – a brain region fundamental for impulse control, decision making and behavioral flexibility. Similar studies that imaged the brains of humans have shown the same pattern of low activity in this region in people who are compulsively addicted to cocaine.
Altering Brain Activity with a Laser
To test whether altering the activity in this brain region could impact addiction, Chen and his colleagues employed a technique called optogenetics to shut the activity on and off using a laser.
First they took light-sensitive proteins called rhodopsins and used genetic engineering to insert them into neurons in the rat’s prefrontal cortex. Activating this region with a laser tuned to the rhodopsins turned the nerve cells on and off.
Turning these cells on wiped out the compulsive behavior, while switching them off in non-addicted rats made those animals compulsive, the researchers found.
What’s exciting, said Bonci, is that there is a way to induce a similar activation of the prelimbic cortex in people through a technique called transcranial magnetic stimulation (TMS), which applies an external electromagnetic field to the brain and has been used as a treatment for symptoms of depression.
Bonci and his colleagues plan to begin clinical trials at NIH in which they will apply this technique in a few sessions per week to stimulate the prefrontal cortex of people who are addicted to cocaine, to see whether it can restore activity in that part of the brain and help them avoid taking the drug.
Anything you can do I can do better: Neuromolecular foundations of the superiority illusion
The existential psychologist Rollo May wrote that “depression is the inability to construct a future,” while Lionel Tiger stated that “optimism has been central to the process of human evolution.” These deceptively simple phrases are remarkable in their depth and in the connections they form between philosophy, psychology, and neuroscience. Both capture something essential about human nature: our ability to imagine and plan for the future is one of the most striking aspects of our species, and the inability to exercise this faculty is profoundly damaging to our happiness and sense of self.

Two concepts relate to these observations: depressive realism – the assertion that people with depression actually perceive reality more accurately – and its counterpoint, the superiority illusion, to which people with depression appear less susceptible. The superiority illusion is a cognitive bias by which individuals overestimate their positive qualities and abilities relative to others (such as intelligence, cognitive ability, and desirable traits) and underestimate their negative qualities. (Other cognitive biases include the optimism bias and the illusion of control.) While mathematically untenable – in a symmetric population distribution, at most half can be above average – the superiority illusion is a positive belief that promotes mental health.

Recently, scientists at the National Institute of Radiological Sciences (Chiba, Japan), the Japan Science and Technology Agency (Saitama), and the Stanford University School of Medicine used resting-state functional magnetic resonance imaging (fMRI) and positron emission tomography (PET) to study the default states of the neural and molecular systems that generate the superiority illusion. They showed that resting-state functional connectivity between the frontal cortex and the striatum, regulated by inhibitory dopaminergic neurotransmission, determines individual levels of the superiority illusion.
The scientists state that their findings help clarify how the superiority illusion is biologically determined and identify potential molecular and neural targets for treating depressive realism.
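The point that the superiority illusion is mathematically untenable can be illustrated with a quick simulation. This is a minimal sketch, not part of the study: the distribution parameters (an IQ-like mean of 100 and standard deviation of 15) and the sample size are illustrative assumptions.

```python
# In a symmetric (here, normal) distribution of any trait, only about half
# the population can sit above the mean, so a majority believing themselves
# above average must be mistaken. Parameters are illustrative.
import random

random.seed(0)
scores = [random.gauss(100, 15) for _ in range(100_000)]
mean = sum(scores) / len(scores)
fraction_above = sum(s > mean for s in scores) / len(scores)
print(round(fraction_above, 2))  # close to 0.5 – a majority cannot be above average
```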

New Genetic Evidence Suggests a Continuum Among Neurodevelopmental and Psychiatric Disorders
A paper published this month in the prestigious medical journal The Lancet Neurology suggests that a broad spectrum of developmental and psychiatric disorders, ranging from autism and intellectual disability to schizophrenia, should be conceptualized as different manifestations of a single underlying factor, “developmental brain dysfunction,” rather than as completely independent conditions with distinct causes.
In “Developmental Brain Dysfunction: Revival and Expansion of Old Concepts Based on New Genetic Evidence,” the authors make two key points:
According to Andres Moreno De Luca, M.D., research scientist at the Autism and Developmental Medicine Institute at Geisinger Health System and article co-author, “Recent genetic studies conducted in thousands of individuals have shown that identical genetic mutations are shared among neurodevelopmental disorders that are thought to be clinically distinct. What we have seen over the past few years is that genetic mutations that were initially found in individuals with one disorder, such as intellectual disability or autism, are then identified in people with an apparently different condition like schizophrenia, epilepsy, or bipolar disorder.”
“It turns out that the genes don’t respect our diagnostic classification boundaries, but that really isn’t surprising given the overlapping symptoms and frequent co-existence of neurodevelopmental disorders,” said Scott M. Myers, M.D., autism specialist at Geisinger Health System and article co-author.
“We believe this study supports use of the term ‘developmental brain dysfunction’ or DBD, which would encompass the broad spectrum of neurodevelopmental and neuropsychiatric disorders,” said David H. Ledbetter, Ph.D., executive vice president and chief scientific officer at Geisinger Health System, and article co-author. “Additionally, it is clear that diagnostic tools such as whole genome analysis for both children and their families are essential when diagnosing and treating these disorders in order to ensure the most personalized treatment.”
An example used in the study was analysis of intelligence quotient (IQ) scores. The average IQ score in the general population is 100. Historically, the medical community has defined intellectual disability as an IQ of less than 70 (with concurrent deficits in adaptive functioning). But according to Dr. Ledbetter, there is little difference in the function of a child with an IQ of 69 versus 71, yet one may be diagnosed with a disability and the other may not.
“We know a variety of factors contribute to IQ score, including genetics, as a child’s IQ is highly correlated with that of his or her parents and siblings. Therefore, an important factor to take into consideration when interpreting IQ is family background,” said Dr. Ledbetter. “Imagine if we have a child with a genetic abnormality, but the child’s IQ is 85. Technically, we would not diagnose this child with a disability. However, if the family of this child has IQs around 130, we could consider that this child’s genetic anomaly has ‘cost’ him or her 45 IQ points – a very substantial difference.”
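Dr. Ledbetter’s example reduces to simple arithmetic; here is a minimal sketch using the figures from the article (family IQs around 130, a child’s IQ of 85, and the historical cutoff of 70). The variable names are illustrative.

```python
# Sketch of the "IQ cost" reasoning from Dr. Ledbetter's example.
# All numbers come from the article; nothing here is part of the study itself.

FAMILY_IQ = 130         # typical IQ in the child's family
CHILD_IQ = 85           # the child's measured IQ
DISABILITY_CUTOFF = 70  # historical threshold for intellectual disability

# Judged against the population cutoff, the child is not diagnosed.
meets_disability_cutoff = CHILD_IQ < DISABILITY_CUTOFF

# Judged against family background, the genetic anomaly has "cost" IQ points.
iq_cost = FAMILY_IQ - CHILD_IQ

print(meets_disability_cutoff)  # False – 85 is well above the 70 cutoff
print(iq_cost)                  # 45 – a very substantial difference
```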
According to Dr. Myers, “One implication of this concept is that studies designed to investigate the causes and mechanisms of developmental brain dysfunction should focus on measurement of quantifiable neuropsychological and neurobehavioral traits across groups of individuals with different clinical diagnoses. Another is that whenever possible, individuals with a particular genetic variant or other risk factor should be compared to their unaffected family members, not just to population norms.”