Posts tagged science

Impulsive murderers much more mentally impaired than those who kill strategically
The minds of murderers who kill impulsively, often out of rage, and those who carefully carry out premeditated crimes differ markedly both psychologically and intellectually, according to a new study by Northwestern Medicine® researcher Robert Hanlon.
“Impulsive murderers were much more mentally impaired, particularly cognitively impaired, in terms of both their intelligence and other cognitive functions,” said Hanlon, senior author of the study and associate professor of clinical psychiatry and clinical neurology at Northwestern University Feinberg School of Medicine.
“The predatory and premeditated murderers did not typically show any major intellectual or cognitive impairments, but many more of them have psychiatric disorders,” he said.
Published online in the journal Criminal Justice and Behavior, the study is the first to examine the neuropsychological and intelligence differences of murderers who kill impulsively versus those who kill as the result of a premeditated strategic plan.
Based on established criteria, 77 murderers from typical prison populations in Illinois and Missouri were classified into two groups: affective/impulsive and premeditated/predatory. Hanlon compared their performance on standardized measures of intelligence and on neuropsychological tests of memory, attention and executive functions. He spent hours with each individual, administering a series of tests to complete an evaluation; in all, he has spent thousands of hours studying the minds of murderers through his research.
“It’s important to try to learn as much as we can about the thought patterns and the psychopathology, neuropathology and mental disorders that tend to characterize the types of people committing these crimes,” he said. “Ultimately, we may be able to increase our rates of prevention and also assist the courts, particularly helping judges and juries be more informed about the minds and the mental abnormalities of the people who commit these violent crimes.”
(Image: ALAMY)
A look inside children’s minds
University of Iowa study shows how 3- and 4-year-olds retain what they see around them
When young children gaze intently at something or furrow their brows in concentration, you know their minds are busily at work. But you’re never entirely sure what they’re thinking.
Now you can get an inside look. Psychologists led by the University of Iowa for the first time have peered inside the brain with optical neuroimaging to quantify how much 3- and 4-year-old children are grasping when they survey what’s around them and to learn what areas of the brain are in play. The study looks at “visual working memory,” a core cognitive function in which we stitch together what we see at any given point in time to help focus attention. In a series of object-matching tests, the researchers found that 3-year-olds can hold a maximum of 1.3 objects in visual working memory, while 4-year-olds reach capacity at 1.8 objects. By comparison, adults max out at 3 to 4 objects, according to prior studies.
“This is literally the first look into a 3- and 4-year-old’s brain in action in this particular working memory task,” says John Spencer, psychology professor at the UI and corresponding author of the paper, which appears in the journal NeuroImage.
The research is important because visual working memory performance has been linked to a variety of childhood disorders, including attention-deficit/hyperactivity disorder (ADHD), autism and developmental coordination disorder, and is also affected in children born prematurely. The goal is to use the new brain imaging technique to detect these disorders before they manifest in children’s behavior later on.
“At a young age, children may behave the same,” notes Spencer, who’s also affiliated with the Delta Center and whose department is part of the College of Liberal Arts and Sciences, “but if you can distinguish these problems in the brain, then it’s possible to intervene early and get children on a more standard trajectory.”
Plenty of research has gone into better understanding visual working memory in children and adults. Those prior studies identified neural networks in action using functional magnetic resonance imaging (fMRI). That worked well for adults, but not so much with children, especially young ones, whose jerky movements threw the machine’s readings off kilter. So Spencer and his team turned to functional near-infrared spectroscopy (fNIRS), which has been around since the 1960s but had never been used to look at working memory in children as young as three years of age.
“It’s not a scary environment,” says Spencer of the fNIRS. “No tube, no loud noises. You just have to wear a cap.”
Like fMRI, fNIRS records neural activity by measuring the difference in oxygenated blood concentrations anywhere in the brain. You’ve likely seen similar technology when a nurse puts your finger in a clip to check your circulation. In the brain, when a region is activated, neurons fire like mad, gobbling up oxygen provided in the blood. Those neurons need another shipment of oxygen-rich blood to arrive to keep going. The fNIRS measures the contrast between oxygen-rich and oxygen-deprived blood to gauge which area of the brain is going full tilt at a point in time.
The researchers outfitted the youngsters with colorful, comfortable ski hats in which fiber optic wires had been woven. The children played a computer game in which they were shown a card with one to three objects of different shapes for two seconds. After a pause of a second, the children were shown a card with either the same or different shapes. They responded whether they had seen a match.
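Capacity figures like the 1.3- and 1.8-object estimates reported above are conventionally derived from hit and false-alarm rates in change-detection tasks of this kind. Here is a minimal sketch using Cowan’s K, a standard formula for such tasks; the exact estimator in the NeuroImage paper may differ, and the performance numbers below are hypothetical:

```python
def cowans_k(set_size, hit_rate, false_alarm_rate):
    """Estimate working memory capacity from change-detection performance.

    Cowan's K assumes the observer stores K of the N items and detects a
    change only if the changed item is among those stored, guessing
    otherwise, which gives K = N * (hit_rate - false_alarm_rate).
    """
    return set_size * (hit_rate - false_alarm_rate)

# Hypothetical performance for a child tested with 3-object arrays:
# detects 60% of changes, false-alarms on 15% of no-change trials.
capacity = cowans_k(set_size=3, hit_rate=0.60, false_alarm_rate=0.15)
print(round(capacity, 2))
```

With these made-up rates the estimate comes out near the roughly 1.3-object capacity reported for 3-year-olds.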
The tests revealed novel insights. First, neural activity in the right frontal cortex was an important barometer of higher visual working memory capacity in both age groups. This could help clinicians evaluate children’s visual working memory at a younger age than before, and work with those whose capacity falls below the norm, the researchers say.
Secondly, 4-year-olds showed greater use than 3-year-olds of the parietal cortex, located in both hemispheres below the crown of the head and believed to guide spatial attention.
"This suggests that improvements in performance are accompanied by increases in the neural response," adds Aaron Buss, a UI graduate student in psychology and the first author on the paper. "Further work will be needed to explain exactly how the neural response increases—either through changes in local tuning, or through changes in long range connectivity, or some combination."

Breaking habits before they start
Our daily routines can become so ingrained that we perform them automatically, such as taking the same route to work every day. Some behaviors, such as smoking or biting your fingernails, become so habitual that we can’t stop even if we want to.
Although breaking habits can be hard, MIT neuroscientists have now shown that they can prevent them from taking root in the first place, in rats learning to run a maze to earn a reward. The researchers first demonstrated that activity in two distinct brain regions is necessary in order for habits to crystallize. Then, they were able to block habits from forming by interfering with activity in one of the brain regions — the infralimbic (IL) cortex, which is located in the prefrontal cortex.
The MIT researchers, led by Institute Professor Ann Graybiel, used a technique called optogenetics to block activity in the IL cortex, allowing them to switch cells of the IL cortex off with light. When the cells were turned off during every maze training run, the rats still learned to run the maze correctly, but when the reward was made to taste bad, they stopped, showing that a habit had not formed. If a habit had formed, they would have kept running the maze anyway.
“It’s usually so difficult to break a habit,” Graybiel says. “It’s also difficult to have a habit not form when you get a reward for what you’re doing. But with this manipulation, it’s absolutely easy. You just turn the light on, and bingo.”
Graybiel, a member of MIT’s McGovern Institute for Brain Research, is the senior author of a paper describing the findings in the June 27 issue of the journal Neuron. Kyle Smith, a former MIT postdoc who is now an assistant professor at Dartmouth College, is the paper’s lead author.
Patterns of habitual behavior
Previous studies of how habits are formed and controlled have implicated the IL cortex as well as the striatum, a part of the brain linked to addiction and repetitive behavioral problems, and to normal functions such as decision-making, planning and response to reward. It is believed that the motor patterns needed to execute a habitual behavior are stored in the striatum and its circuits.
Recent studies from Graybiel’s lab have shown that disrupting activity in the IL cortex can block the expression of habits that have already been learned and stored in the striatum. Last year, Smith and Graybiel found that the IL cortex appears to decide which of two previously learned habits will be expressed.
“We have evidence that these two areas are important for habits, but they’re not connected at all, and no one has much of an idea of what the cells are doing as a habit is formed, as the habit is lost, and as a new habit takes over,” Smith says.
To investigate that, Smith recorded activity in cells of the IL cortex as rats learned to run a maze. He found activity patterns very similar to those that appear in the striatum during habit formation. Several years ago, Graybiel found that a distinctive “task-bracketing” pattern develops when habits are formed. This means that the cells are very active when the animal begins its run through the maze, are quiet during the run, and then fire up again when the task is finished.
This kind of pattern “chunks” habits into a large unit that the brain can simply turn on when the habitual behavior is triggered, without having to think about each individual action that goes into the habitual behavior.
The researchers found that this pattern took longer to appear in the IL cortex than in the striatum, and it was also less permanent. Unlike the pattern in the striatum, which remains stored even when a habit is broken, the IL cortex pattern appears and disappears as habits are formed and broken. This was the clue that the IL cortex, not the striatum, was tracking the development of the habit.
Multiple layers of control
The researchers’ ability to optogenetically block the formation of new habits suggests that the IL cortex not only exerts real-time control over habits and compulsions, but is also needed for habits to form in the first place.
“The previous idea was that the habits were stored in the sensorimotor system and this cortical area was just selecting the habit to be expressed. Now we think it’s a more fundamental contribution to habits, that the IL cortex is more actively making this happen,” Smith says.
This arrangement offers multiple layers of control over habitual behavior, which could be advantageous in reining in automatic behavior, Graybiel says. It is also possible that the IL cortex is contributing specific pieces of the habitual behavior, in addition to exerting control over whether it occurs, according to the researchers. They are now trying to determine whether the IL cortex and the striatum are communicating with and influencing each other, or simply acting in parallel.
“A role for the IL cortex in the regulation of habit is not a new idea, but the details of the interaction between it and the striatum that emerge from this analysis are novel and interesting,” says Christopher Pittenger, an assistant professor of psychiatry and psychology at Yale University School of Medicine, who was not part of the research team. “Thinking in the long term, it raises the question of whether targeted manipulations of the IL cortex might be useful for breaking habits, an exciting possibility with potential clinical ramifications.”
The study suggests a new way to look for abnormal activity that might cause disorders of repetitive behavior, Smith says. Now that the researchers have identified the neural signature of a normal habit, they can look for signs of habitual behavior that is learned too quickly or becomes too rigid. Finding such a signature could allow scientists to develop new ways to treat disorders of repetitive behavior by using deep brain stimulation, which uses electronic impulses delivered by a pacemaker to suppress abnormal brain activity.
The day of the big barbecue arrives and it’s time to fire up the grill. But rather than toss the hamburgers and hotdogs haphazardly onto the grate, you wait for the heat to reach an optimal temperature, and then neatly lay them out in their apportioned areas according to size and cooking times. Meanwhile, your friend is preparing the beverages. Cups are grabbed face down from the stack, turned over, and – using the other hand – filled with ice.
While these tasks – like countless, everyday actions – may seem trivial at first glance, they are actually fairly complex, according to Robrecht van der Wel, an assistant professor of psychology at Rutgers–Camden. “For instance, the observation that you grab a glass differently when you are filling a beverage than when you are stacking glasses suggests that you are thinking about the goal that you want to achieve,” he says. “How do you manipulate the glass? How do you coordinate your actions so that the liquid goes into the cup? These kinds of actions are not just our only way to accomplish our intentions, but they reveal our intentions and mental states as well.”
In a study titled “Higher-order planning for individual and joint object manipulations,” published recently in Experimental Brain Research, van der Wel and his research partners, Marlene Meyer and Sabine Hunnius, turned their attention to how action planning generalizes to collaborative actions performed with others.
According to van der Wel, the researchers were especially interested in determining whether people’s actions exhibit certain social capabilities when performing multiple-action sequences in concert with a partner. “It is a pretty astonishing ability that we, as people, are able to plan and coordinate our actions with others,” says van der Wel. “If people plan ahead for themselves, what happens if they are now in a task where their action might influence another person’s comfort? Do they actually take that into account or not, even though, for their personal action, it makes no difference?”
In the research study, participants first completed a series of individual tasks requiring them to pick up a cylindrical object with one hand, pass it to their other hand, and then place it on a shelf. In the collaborative tasks, individuals picked up the object and handed it to their partner, who placed it on the shelf. The researchers varied the height of the shelf, to test whether people altered their grasps to avoid uncomfortable end postures. The object could only be grasped at one of two positions, implying that the first grasp would determine the postures – and comfort – of the remaining actions.
According to the researchers, the results from both the individual and joint performances show that participants altered their grasp location relative to the height of the shelf. The participants in both scenarios were thus more likely to use a low-grasp location when the shelf was low, and vice versa. Doing so implied that the participants ended the sequences in comfortable postures. The researchers conclude that, in both individual and collaborative scenarios, participants engaged in extended planning to finish the object-transport sequences in a relatively comfortable posture. Given that participants did plan ahead for the sake of their action partner, it indicates an implicit social awareness that supports collaboration across individuals.
van der Wel notes that, while such basic actions may seem insignificant, it is important to understand how people perform basic tasks such as manipulating objects when considering those populations that aren’t able to complete them so efficiently. “How to pick up an object seems like a really trivial problem when you look at healthy adults, but as soon as you look at children, or people suffering from a stroke, it takes some time to develop that skill properly,” says van der Wel. “When someone has a stroke, it is not that they have damage to the musculature involved in doing the task; rather, damage to action planning areas in the brain results in an inability to perform simple actions. A better understanding of the mechanisms involved in action planning may guide rehabilitation strategies in such cases.”
According to van der Wel, the researchers are currently working on modifying the task to determine the age at which children begin planning their actions with respect to other peoples’ comfort. In particular, they want to understand how the development of social action planning links with the development of other cognitive and social abilities.
(Source: news.rutgers.edu)
New work at the University of California, Davis, shows for the first time how visual attention affects activity in specific brain cells. The paper, published June 26 in the journal Nature, shows that attention increases the efficiency of signaling into the brain’s cerebral cortex and boosts the ratio of signal over noise.

It’s the first time neuroscientists have been able to look at the behavior of synaptic circuits at such a fine-grained level of resolution while measuring the effects of attention, said Professor Ron Mangun, dean of social sciences at UC Davis and a researcher at the UC Davis Center for Mind and Brain.
Our brains recreate an internal map of the world we see through our eyes, mapping our visual field onto specific brain cells. Humans and our primate relatives have the ability to pay attention to objects in the visual scene without looking at them directly, Mangun said.
"Essentially, we ‘see out of the corner of our eyes,’ as the old saying goes. This ability helps us detect threats, and react quickly to avoid them, as when a car running a red light at high speed is approaching from our side," he said.
Postdoctoral scholar Farran Briggs worked with Mangun and Professor Martin Usrey at the UC Davis Center for Neuroscience to measure signaling through single nerve connections, or synapses, in monkeys while they performed a standard cognitive test for attention: pressing a joystick in response to seeing a stimulus appear in their field of view.
By taking measurements on each side of a synapse leading into the cerebral cortex, the team could measure when neurons were firing, the strength of the signal and the signal-to-noise ratio.
The researchers found that when the animals were paying attention to an area within their field of view, the signal strength through corresponding synapses leading into the cortex became more effective, and the signal was boosted relative to background noise.
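As a rough illustration of the kind of measure involved, signal-to-noise can be thought of as the size of the evoked response relative to baseline variability. The sketch below uses one common definition and entirely made-up amplitudes; the estimator used in the actual study may differ:

```python
import statistics

def snr(evoked, baseline):
    """Signal-to-noise ratio: mean evoked response over baseline variability.

    This is one common definition; the study's actual estimator may differ.
    """
    return statistics.mean(evoked) / statistics.stdev(baseline)

# Hypothetical synaptic response amplitudes (arbitrary units):
attended = [1.8, 2.1, 1.9, 2.2, 2.0]       # stimulus in the attended region
unattended = [1.1, 1.4, 1.2, 1.5, 1.3]     # same stimulus, attention elsewhere
baseline = [0.40, 0.60, 0.50, 0.45, 0.55]  # spontaneous background activity

print(snr(attended, baseline) > snr(unattended, baseline))  # True
```

With identical background variability, a stronger evoked response under attention directly translates into a higher signal-to-noise ratio, which is the pattern the study describes.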
Combining established cognitive psychology with advanced neuroscience, the technique opens up new possibilities for research.
"There are a lot of questions about attention that we can now investigate, such as which brain mechanisms are disordered in diseases that affect attention," Usrey said.
The method could be used, for example, to probe the cholinergic nervous system, which is impacted by Alzheimer’s disease. It could also help to better understand developmental disorders that involve defects in attention, such as attention deficit hyperactivity disorder and autism.
"It’s going to turn out to be important for understanding and treating all kinds of diseases," Mangun predicted.
(Source: news.ucdavis.edu)
Vision and Hearing Work Together in the Brain to Help Us Catch a Moving Target
A new study has found that chasing down a moving object is not only a matter of sight or of sound, but of mind.
The study found that people who are blindfolded employ the same strategy to intercept a running ball carrier as people who can see, which suggests that multiple areas of the brain cooperate to accomplish the task.
Regardless of whether they could see or not, the study participants seemed to aim ahead of the ball carrier’s trajectory and then run to the spot where they expected him or her to be in the near future. Researchers call this a “constant target-heading angle” strategy, similar to strategies used by dogs catching Frisbees and baseball players catching fly balls.
It’s also the best way to catch an object that is trying to evade capture, explained Dennis Shaffer, assistant professor of psychology at The Ohio State University at Mansfield.
“The constant-angle strategy geometrically guarantees that you’ll reach your target, if your speed and the target’s speed stay constant, and you’re both moving in a straight line. It also gives you leeway to adjust if the target abruptly changes direction to evade you,” Shaffer said.
“The fact that people run after targets at a constant angle regardless of whether they can see or not suggests that there are brain mechanisms in place that we would call ‘polymodal’—areas of the brain that serve more than one form of sensory modality. Sight and hearing may be different senses, but within the brain the results of the sensory input for this task may be the same.”
The study appears in the journal Psychonomic Bulletin and Review.
Nine people participated in the study—mainly students at Ohio State and Arizona State University, where the study took place. Some had experience playing football, either at a high school or collegiate intramural level, while others had limited or no experience with football.
The nine of them donned motion-capture equipment and took turns in pairs, one running a football across a 20-meter field (nearly 22 yards), and one chasing. The researchers randomly assigned participants to sighted and blindfolded conditions. In the blindfolded condition, participants wore a sleep mask and the runner carried a foam football with a beeping device inside, so that the chaser had a chance to locate them by sound. The runners ran in the general direction of the chasers at different angles, and sometimes the runner would cut right or left halfway through the run.
The study was designed so that the pursuer wouldn’t have time to consciously think about how to catch the runner.
“We were just focused on trying to touch the runner as soon as possible and before they exited the field,” Shaffer said. “The idea was to have the strategy emerge by instinct.”
About 97 percent of the time, the person doing the chasing used the constant-angle strategy—even when they were blindfolded and only able to hear the beeping football.
The results were surprising, even to Shaffer.
“I knew that this seemed to be a universal strategy across species, but I expected that people’s strategies would vary more when they were blindfolded, just because we aren’t used to running around blindfolded. I didn’t expect that the blindfolded strategies would so closely match the sighted ones.”
The findings suggest that there’s some common area in the brain that processes sight and sound together when we’re chasing something.
There is another strategy for catching moving targets. Researchers call it the pursuit or aiming strategy, because it involves speeding directly at the target’s current location. It’s how apex predators such as sharks catch prey.
“As long as you are much faster than your prey, the pursuit strategy is great. You just overtake them,” Shaffer said.
In a situation where the competition is more equal, the constant-angle strategy works better—the pursuer doesn’t have to be faster than the target, and if the target switches direction, the pursuer has time to adjust.
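The geometric difference between the two strategies can be seen in a small simulation. This is an illustrative toy model rather than the study’s analysis; the speeds, starting positions and capture radius are arbitrary assumptions:

```python
import math

def simulate(strategy, target_speed=1.0, chaser_speed=1.2, dt=0.01, max_t=100.0):
    """Chase a target that starts at (0, 10) and runs straight along +x.

    strategy: "pursuit" aims at the target's current position;
    "constant_bearing" leads the target so the angle to it stays fixed.
    Returns the capture time in seconds, or None if never caught.
    """
    tx, ty = 0.0, 10.0   # target position
    cx, cy = 0.0, 0.0    # chaser position
    t = 0.0
    while t < max_t:
        dx, dy = tx - cx, ty - cy
        dist = math.hypot(dx, dy)
        if dist < 0.1:                    # capture radius (arbitrary)
            return t
        ux, uy = dx / dist, dy / dist     # unit line-of-sight vector
        px, py = -uy, ux                  # perpendicular to line of sight
        if strategy == "pursuit":
            hx, hy = ux, uy               # head straight at the target
        else:
            # Match the target's velocity component perpendicular to the
            # line of sight, so the bearing to the target stays constant.
            v_perp = target_speed * px    # target velocity is (target_speed, 0)
            s = max(-1.0, min(1.0, v_perp / chaser_speed))
            along = math.sqrt(1.0 - s * s)  # remaining speed closes the gap
            hx, hy = along * ux + s * px, along * uy + s * py
        cx += chaser_speed * hx * dt
        cy += chaser_speed * hy * dt
        tx += target_speed * dt           # target keeps running along +x
        t += dt
    return None
```

With only a modest speed advantage, the constant-bearing chaser intercepts sooner, while the pure pursuer ends up in a curving tail chase behind the target.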
The study builds on Shaffer’s previous work with how collegiate-level football players chase ball carriers. He’s also studied how people catch baseballs and dogs catch Frisbees. All appear to use strategies similar to the constant target-heading angle strategy, which suggests that a common neural mechanism could be at work.
(Source: researchnews.osu.edu)
UC Berkeley researchers have found that a lack of sleep, which is common in anxiety disorders, may play a key role in ramping up the brain regions that contribute to excessive worrying.

Neuroscientists have found that sleep deprivation amplifies anticipatory anxiety by firing up the brain’s amygdala and insular cortex, regions associated with emotional processing. The resulting pattern mimics the abnormal neural activity seen in anxiety disorders. Furthermore, their research suggests that innate worriers – those who are naturally more anxious and therefore more likely to develop a full-blown anxiety disorder – are acutely vulnerable to the impact of insufficient sleep.
“These findings help us realize that those people who are anxious by nature are the same people who will suffer the greatest harm from sleep deprivation,” said Matthew Walker, a professor of psychology and neuroscience at UC Berkeley and senior author of the paper, which was published in the Journal of Neuroscience.
The results suggest that people suffering from such maladies as generalized anxiety disorder, panic attacks and post-traumatic stress disorder, may benefit substantially from sleep therapy. At UC Berkeley, psychologists such as Allison Harvey, a co-author on the Journal of Neuroscience paper, have been garnering encouraging results in studies that use sleep therapy on patients with depression, bipolar disorder and other mental illnesses.
“If sleep disruption is a key factor in anxiety disorders, as this study suggests, then it’s a potentially treatable target,” Walker said. “By restoring good quality sleep in people suffering from anxiety, we may be able to help ameliorate their excessive worry and disabling fearful expectations.”
While previous research has indicated that sleep disruption and psychiatric disorders often occur together, this latest study is the first to causally demonstrate that sleep loss triggers excessive anticipatory brain activity associated with anxiety, researchers said.
“It’s been hard to tease out whether sleep loss is simply a byproduct of anxiety, or whether sleep disruption causes anxiety,” said Andrea Goldstein, a UC Berkeley doctoral student in neuroscience and lead author of the study. “This study helps us understand that causal relationship more clearly.”
In their experiments, performed at UC Berkeley’s Sleep and Neuroimaging Laboratory, Walker and his research team scanned the brains of 18 healthy young adults as they viewed dozens of images, first after a good night’s rest, and again after a sleepless night. The images were neutral, disturbing, or a mix of the two.
Participants in the experiments reported a wide range of baseline anxiety levels, but none fit the criteria for a clinical anxiety disorder. After getting a full night’s rest at the lab, which researchers monitored by measuring neural electrical activity, their brains were scanned via functional MRI as they waited for, and then viewed, 90 images during a 45-minute session.
To trigger anticipatory anxiety, researchers primed the participants using one of three visual cues prior to each series of images. A large red minus sign signaled to participants that they were about to see a highly unpleasant image, such as a death scene. A yellow circle portended a neutral image, such as a basket on a table. Perhaps most stressful was a white question mark, which indicated that either a grisly image or a bland, innocuous one was coming, and kept participants in a heightened state of suspense.
When sleep-deprived and waiting in suspenseful anticipation for a neutral or disturbing image to appear, activity in the emotional brain centers of all the participants soared, especially in the amygdala and the insular cortex. Notably, the amplifying impact of sleep deprivation was most dramatic for those people who were innately anxious to begin with.
“This discovery illustrates how important sleep is to our mental health,” said Walker. “It also emphasizes the intimate relationship between sleep and psychiatric disorders, both from a cause and a treatment perspective.”
(Source: newscenter.berkeley.edu)
Researchers Find Zinc’s Crucial Pathway to the Brain
A new study helps explain how parts of the brain maintain their delicate balance of zinc, an element required in minute but crucial doses, particularly during embryonic development.
The study, led at the Marine Biological Laboratory (MBL) by Mark Messerli in collaboration with scientists from the University of California, Davis, shows that neural cells require zinc uptake through a membrane transporter referred to as ZIP12. If that route is closed, neuronal sprouting and growth are significantly impaired, which is fatal for a developing embryo. The discovery was published in the Proceedings of the National Academy of Sciences.
“This particular transporter is an essential doorway for many neurons in the central nervous system,” explains Messerli. “You knock out this one gene, this one particular pathway for the uptake of zinc into these cells, and you essentially prevent neuronal outgrowth. That’s lethal to the embryo.”
Previously, scientists thought that zinc could use more than one pathway to enter the cell during early brain development. Some other elements, like calcium, do enjoy the luxury of multiple entry routes.
Knocking out ZIP12 affected several critical processes in the brain, the scientists found. For example, frog embryos were unable to develop their neural systems properly. Additionally, neurons had trouble reaching out to connect to other neurons; their extensions were both shorter and fewer in number than normal.
“We were surprised that ZIP12 was required at such an early and critical stage of development,” said Winyoo Chowanadisai, a researcher in nutrition at the University of California at Davis and visiting scientist in the Cellular Dynamics Program at the MBL. Dr. Chowanadisai was the first on the team to realize that ZIP12 is expressed in such abundance in the brain. “This study also reinforces the importance of periconceptional and prenatal nutrition and counseling to promote health during the earliest stages of life.”
ZIP12 is part of a larger family of transporters involved in the movement of metal ions from outside the cell. Other reports showed that simultaneously blocking three other transporters in the family (ZIP1, ZIP2 and ZIP3) had no major effects on embryonic development.
Zinc is needed for healthy neural development, helping the brain to learn and remember new information. However, too much zinc can also be problematic.
The research team is investigating the implications of their results on processes like embryonic brain development and wound healing.
“[The result] was not expected,” said Messerli, a physiologist in the MBL’s Bell Center for Regenerative Biology and Tissue Engineering and Cellular Dynamics Program. “We found that zinc uptake through ZIP12 is a regulatory point for neuronal growth, required for development and possibly required for learning and memory throughout life. We want to elucidate the downstream targets that zinc is affecting. That’s the next exploration.”
Alzheimer’s Disease Mouse Models Point To A Potential Therapeutic Approach
Building on research published eight years ago in the journal Chemistry and Biology, Kenneth S. Kosik, Harriman Professor in Neuroscience and co-director of the Neuroscience Research Institute (NRI) at UC Santa Barbara, and his team have now applied their findings to two distinct, well-known mouse models, demonstrating a new potential target in the fight against Alzheimer’s and other neurodegenerative diseases.
The results were published online June 4 as the Paper of the Week in the Journal of Biological Chemistry. As a Paper of the Week, Kosik’s work is among the top 2 percent of manuscripts the journal reviews in a year. Based on significance and overall importance, between 50 and 100 papers are selected for this honor from the more than 6,600 published each year.
Kosik and his research team focused on tau, a protein normally present in the brain, which can develop into neurofibrillary tangles (NFTs) that, along with plaques containing amyloid-β protein, characterize Alzheimer’s disease. When tau becomes pathological, many phosphate groups attach to it, causing it to become dysfunctional; tau in this excessively phosphorylated state is described as hyperphosphorylated. Aggregations of hyperphosphorylated tau are also referred to as paired helical filaments.
"What struck me most while working on this project was how so many people I’d never met came to me to share their stories and personal anxieties about Alzheimer’s disease," said Xuemei Zhang, lead co-author and an assistant specialist in the Kosik Lab. "There is no doubt that finding therapeutic treatment is the only way to help this fast-growing population." Israel Hernandez, a postdoctoral scholar of the NRI and UCSB’s Department of Molecular, Cellular and Developmental Biology, is the paper’s other lead co-author.
No treatments exist for hyperphosphorylated tau, one of the main causes of Alzheimer’s disease. Current therapy is restricted to drugs that increase the concentration of neurotransmitters to promote signaling between neurons.
However, this latest research explores the possibility that a small class of molecules called diaminothiazoles can act as inhibitors of kinase enzymes that phosphorylate tau. Kosik’s team studied the toxicity and immunoreactivity of several diaminothiazoles that targeted two key kinases, CDK5/p25 and GSK3β, in two Alzheimer’s disease mouse models. The investigators found that the compounds can efficiently inhibit the enzymes with hardly any toxic effects in the therapeutic dose range.
Treatment with the lead compound in this study, LDN-193594, dramatically reduced the prominent neuronal cell loss that accompanies increased CDK5 activity. Diaminothiazole kinase inhibitors not only reduced tau phosphorylation but also exerted a neuroprotective effect in vivo. In addition to reducing the amount of paired helical filaments in the mice’s brains, the inhibitors also restored the animals’ learning and memory abilities in a fear-conditioning assay.
According to the authors, the fact that treatment with diaminothiazole kinase inhibitors reduced the phosphorylation of tau provides strong evidence that small molecular kinase inhibitor treatment could slow the progression of tau pathology. “Given the contribution of both CDK5 and GSK3β to tau phosphorylation,” said Kosik, “effective treatment of tauopathies may require dual kinase targeting.”
Madison Cornwell, a Beckman Scholar with UCSB’s Center for Science and Engineering Partnerships who worked in Kosik’s lab, added: “As a beginning step, we demonstrated that two of these compounds were successful in clearing the brain of tau tangles in a mouse model, but someday inhibitors of these kinases may serve to ameliorate the symptoms of Alzheimer’s disease in patients.”
Promising Alzheimer’s ‘drug’ halts memory loss
A new class of experimental drug-like small molecules is showing great promise in targeting a brain enzyme to prevent early memory loss in Alzheimer’s disease, according to Northwestern Medicine® research.
Developed in the laboratory of D. Martin Watterson, the molecules halted memory loss and fixed damaged communication among brain cells in a mouse model of Alzheimer’s.
"This is the starting point for the development of a new class of drugs," said Watterson, lead author of a paper on the study and the John G. Searle Professor of Molecular Biology and Biochemistry at Northwestern University Feinberg School of Medicine. "It’s possible someday this class of drugs could be given early on to people to arrest certain aspects of Alzheimer’s."
Changes in the brain start to occur 10 to 15 years before serious memory problems become apparent in Alzheimer’s.
"This class of drugs could be beneficial when the nerve cells are just beginning to become impaired," said Linda Van Eldik, a senior author of the paper and director of the University of Kentucky Sanders-Brown Center on Aging.
The study is a collaboration between Northwestern’s Feinberg School, Columbia University Medical Center and the University of Kentucky. It will be published June 26 in the journal PLOS ONE.
The novel drug-like molecule, called MW108, reduces the activity of an enzyme that is over-activated during Alzheimer’s and is considered a contributor to brain inflammation and impaired neuron function. Strong communication between neurons in the brain is an essential process for memory formation.
"I’m not aware of any other drug that has this effect on the central nervous system," Watterson said.
"These exciting results provide new hope for developing drugs against an important molecular target in the brain," said Roderick Corriveau, program director at the National Institute of Neurological Disorders and Stroke, which helped support the research. "They also provide a promising strategy for identifying small molecule drugs designed to treat Alzheimer’s disease and other neurological disorders."
Watterson and his collaborators have a new National Institutes of Health (NIH) award to further refine the compound so it is metabolically stable and safe for use in humans, and to develop it to the point of starting a phase 1 clinical trial.
(Image: Jay Vollmar)