Neuroscience

Articles and news from the latest research reports.

95 notes

Testosterone could combat dementia in women

In a new study, post-menopausal women on testosterone therapy showed a significant improvement in verbal learning and memory, offering a promising avenue for research into memory and ageing.

Led by Director of the Women’s Health Research Program at Monash University, Professor Susan Davis, and presented at ENDO 2013, the research is the first large, randomised, placebo-controlled investigation into the effects of testosterone on cognitive function in postmenopausal women.

Testosterone has been implicated in brain function in men, and these results indicate that it also plays a role in optimising learning and memory in women.

Dementia, which was estimated to affect more than 35 million people worldwide in 2010, is more common in women than men. There are no effective treatments to prevent memory decline.

In the study, 96 postmenopausal women recruited from the community were randomly allocated to receive a testosterone gel or a visually identical placebo gel to be applied to the skin. Participants underwent a comprehensive series of cognitive tests at the beginning of the study and 26 weeks later.

All women performed in the normal range for their age at the beginning of the trial. There was a statistically significant and clinically meaningful improvement in verbal learning and memory amongst the women using the testosterone gel after 26 weeks.
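
The comparison at the heart of a trial like this is the change in scores from baseline to week 26 in each arm. A minimal sketch with invented numbers (not the study’s data, which came from a full cognitive battery in 96 women):

```python
# Hypothetical sketch of a placebo-controlled pre/post comparison.
# All scores below are invented for illustration only.

def mean(xs):
    return sum(xs) / len(xs)

def mean_change(baseline, week26):
    """Average within-participant change from baseline to week 26."""
    return mean([b - a for a, b in zip(baseline, week26)])

# Invented verbal-memory scores for a handful of participants per group.
testosterone_baseline = [48, 52, 45, 50, 47]
testosterone_week26   = [53, 56, 49, 55, 51]
placebo_baseline      = [49, 51, 46, 50, 48]
placebo_week26        = [50, 51, 47, 49, 48]

t_change = mean_change(testosterone_baseline, testosterone_week26)
p_change = mean_change(placebo_baseline, placebo_week26)
print(f"testosterone: +{t_change:.1f}; placebo: +{p_change:.1f}")
```

A real analysis would of course test the group difference statistically (e.g. with a t-test on the change scores) rather than just comparing means.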

Professor Davis said the results indicated that testosterone played an important role in women’s health. 

"Much of the research on testosterone in women to date has focused on sexual function. But testosterone has widespread effects in women, including, it appears, significant favourable effects on verbal learning and memory," Professor Davis said. 

"Our findings provide compelling evidence for the conduct of larger clinical studies to further investigate the role of testosterone in cognitive function in women.”

Androgen levels did increase in the cohort on testosterone therapy, but on average, remained in the normal female range. No negative side-effects of the therapy were observed.

Filed under testosterone memory dementia aging cognitive function women neuroscience science

110 notes

Online games offer trove of brain data

Study of 35 million users of brain-training software finds alcohol and sleep linked to cognitive performance.

By trawling through data from 35 million users of online ‘brain-training’ tools, researchers have conducted a survey of what they say is the world’s largest data set of human cognitive performance. Their preliminary results show that drinking moderately correlates with better cognitive performance and that sleeping too little or too much has a negative association.

The study, published this week in Frontiers in Human Neuroscience, analysed user data from Lumosity, a collection of web-based games made by Lumos Labs, based in San Francisco, California. Researchers at Lumos conducted the study in collaboration with scientists at two US universities as part of the Human Cognition Project, which the authors describe as “a collaborative research effort to describe the human mind”.

The authors examined results from more than 600 million completed tasks — which measured players’ speed, memory capacity and cognitive flexibility — to get a snapshot of how lifestyle factors can affect cognition and how learning ability changes with age.

Users who enjoyed one or two alcoholic drinks a day tended to perform better on cognitive tasks than teetotallers and heavier drinkers, whose scores dropped as the number of daily drinks increased. The optimal sleep time was seven hours, with performance worsening for every hour of sleep lost or added.
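
At its core, this kind of analysis buckets users by a lifestyle variable and compares mean scores across buckets. A minimal sketch with invented sleep data (the inverted-U shape here is contrived for illustration):

```python
# Bucket users by a lifestyle variable (here: hours slept) and compare
# mean cognitive-task scores per bucket. All data are invented.

from collections import defaultdict

def mean_score_by(records):
    """Map each bucket value to the mean score of users in that bucket."""
    buckets = defaultdict(list)
    for key, score in records:
        buckets[key].append(score)
    return {k: sum(v) / len(v) for k, v in buckets.items()}

# Hypothetical (sleep_hours, task_score) pairs.
users = [(5, 61), (5, 63), (6, 70), (6, 72), (7, 80), (7, 78),
         (8, 73), (8, 71), (9, 64), (9, 66)]

by_sleep = mean_score_by(users)
best = max(by_sleep, key=by_sleep.get)
print(best)  # the peak of the inverted U in this toy data
```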

The study authors also looked at performance over time for users who returned to the same brain-training tasks at least 25 times. Performance decreased with age, but the ability to learn new tasks that relied on ‘crystallized knowledge’ (such as vocabulary) did not decline as quickly as it did for those that measured ‘fluid intelligence’ (such as the ability to memorize new sets of information).

Daniel Sternberg, a data scientist at Lumos who led the study, and his colleagues say that their study sample is much broader than those of most psychological studies, which tend to draw from pools of university students.

Buzzwords and biased samples?

But Frederick Unverzagt, a neuropsychologist at Indiana University in Indianapolis who has studied other cognitive-training tools, such as courses in verbal reasoning or speed of processing for patients with dementia, says that the sample in this study is also biased: users of brain-training tools are younger than typical dementia patients, most live in the United States or Europe and, most importantly, they are likely to already be interested in cognitive-training tasks. And although Lumosity has a pool of 35 million users, when the researchers looked at changes in performance over time, they focused on groups of about 22,000 people.

“From a trials perspective, this is very selective,” says Fred Wolinsky, a public-health researcher at the University of Iowa in Iowa City, who has also studied the efficacy of brain-training techniques. “The lower performance scores they saw in older individuals,” he says, “could be attributable to the fact that the older adults were the ones who stuck with it for a long time because they were the ones who needed the training the most.”

And the findings are not controversial or particularly surprising. “But what is interesting and important is this idea that we can have a new paradigm for doing this kind of research: looking at large data sets in order to look at many different kinds of people, to tease out the demographic and lifestyle factors that influence cognition,” says Sternberg. “There are many other interesting questions that other researchers could answer by using this data set — this is just the tip of the iceberg.”

Filed under cognitive performance Lumosity Human Cognition Project cognition psychology neuroscience

316 notes

Time perception altered by mindfulness meditation

New published research from psychologists at the universities of Kent and Witten/Herdecke has shown that mindfulness meditation has the ability to temporarily alter practitioners’ perceptions of time – a finding that has wider implications for the use of mindfulness both as an everyday practice, and in clinical treatments and interventions.

Led by Dr Robin Kramer from Kent’s School of Psychology, the research team hypothesised that, given mindfulness’ emphasis on moment-to-moment awareness, mindfulness meditation would slow down time and produce the feeling that short periods of time lasted longer.

To test this hypothesis, they used a temporal bisection task, which allows researchers to measure where each individual subjectively splits a period of time in half. Participants’ responses to this task were collected twice, once before and once after a listening task. Participants were split into two groups and listened for ten minutes to either an audiobook or a meditation exercise designed to focus their attention on the movement of breath in the body. The control (audiobook) group’s responses did not change after the listening task compared with before. Meditation, however, led to a relative overestimation of durations: time periods felt longer than they had before.
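
A bisection point can be estimated as the duration at which a participant calls a stimulus “long” half the time. A minimal sketch, with invented durations and response proportions (not the study’s data):

```python
# Estimate a temporal bisection point by linear interpolation: the
# duration where the proportion of "long" responses crosses 0.5.
# Durations and proportions below are invented for illustration.

def bisection_point(durations, p_long):
    """Interpolate the duration at which p('long') crosses 0.5."""
    points = list(zip(durations, p_long))
    for (d0, p0), (d1, p1) in zip(points, points[1:]):
        if p0 <= 0.5 <= p1:
            return d0 + (0.5 - p0) * (d1 - d0) / (p1 - p0)
    raise ValueError("no 0.5 crossing in the tested range")

durations = [400, 600, 800, 1000, 1200]   # stimulus durations, ms
pre  = [0.05, 0.20, 0.50, 0.80, 0.95]     # before the listening task
post = [0.10, 0.35, 0.65, 0.90, 0.98]     # after meditation (invented)

# A lower post-task bisection point means durations feel longer,
# i.e. a relative overestimation of time.
print(bisection_point(durations, pre), bisection_point(durations, post))
```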

Dr Kramer and his team interpret this as the result of attentional changes: either improved attentional resources allowing increased attention to the processing of time, or a shift to internally oriented attention, which would have the same effect.

Dr Kramer said: ‘Our findings represent some of the first to demonstrate how mindfulness meditation can alter the perception of time. Given the increasing popularity of mindfulness in everyday practice, its relationship with time perception may provide an important step in our understanding of this pervasive, ancient practice in our modern world.’

Dr Kramer also explained that the benefits of mindfulness and mindfulness-based therapies in a variety of domains are now being identified. These include decreases in rumination, improvements in cognitive flexibility, working memory capacity and sustained attention, and reductions in reactivity, anxiety and depressive symptoms. Mindfulness-based treatments also appear to provide broad antidepressant and antianxiety effects, as well as decreases in general psychological distress. As such, these interventions have been applied with a variety of patients, including those suffering from fibromyalgia, psoriasis, cancer, binge eating and chronic pain.

Dr Dinkar Sharma, Senior Lecturer in Psychology at Kent, commented: ‘Demonstrating that mindfulness has an effect on time perception is important because it opens up the opportunity that mindfulness could be used to alter psychological disorders that are associated with a range of distortions in the perception of time - such as disorders of memory, emotion and addiction.’

Dr Ulrich Weger, of Witten/Herdecke’s Department of Psychology and Psychotherapy, concluded by stating that ‘the impact of a brief mindfulness exercise on elementary processes such as time perception is remarkable’.

Filed under time perception meditation mindful meditation emotion memory psychology neuroscience science

189 notes

Study Shows a Solitary Mutation Can Destroy Critical ‘Window’ of Early Brain Development

Scientists from the Florida campus of The Scripps Research Institute (TSRI) have shown in animal models that brain damage caused by the loss of a single copy of a gene during very early childhood development can cause a lifetime of behavioral and intellectual problems.

The study, published this week in the Journal of Neuroscience, sheds new light on the early development of neural circuits in the cortex, the part of the brain responsible for functions such as sensory perception, planning and decision-making.

The research also pinpoints the mechanism responsible for the disruption of what are known as “windows of plasticity” that contribute to the refinement of the neural connections that broadly shape brain development and the maturing of perception, language, and cognitive abilities.

The key to normal development of these abilities is that the neural connections in the brain cortex—the synapses—mature at the right time.

In an earlier study, the team, led by TSRI Associate Professor Gavin Rumbaugh, found that in mice missing a single copy of the vital gene, certain synapses develop prematurely within the first few weeks after birth. This accelerated maturation dramatically increases “excitability”—how often brain cells fire—in the hippocampus, a part of the brain critical for memory. The delicate balance between excitation and inhibition is especially critical during early developmental periods. However, it remained a mystery how early maturation of brain circuits could lead to lifelong cognitive and behavioral problems.
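
The excitation-inhibition balance can be caricatured with a toy firing-rate model (an illustration only, not the study’s model): output firing is the rectified difference between excitatory and inhibitory drive, and prematurely matured synapses are represented as a larger excitatory gain.

```python
# Toy illustration of excitation-inhibition balance (not the study's
# model): firing rate as rectified excitation minus inhibition.

def firing_rate(g_e, g_i, e_input=1.0, i_input=1.0):
    """Rectified-linear rate: max(0, excitatory - inhibitory drive)."""
    return max(0.0, g_e * e_input - g_i * i_input)

typical   = firing_rate(g_e=1.0, g_i=0.8)  # balanced circuit
premature = firing_rate(g_e=1.6, g_i=0.8)  # stronger excitatory synapses
print(typical, premature)  # excitability rises when E outpaces I
```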

The current study shows in mice that the interruption of the synapse-regulating gene known as SYNGAP1—which can cause a devastating form of intellectual disability and increase the risk for developing autism in humans—induces early functional maturation of neural connections in two areas of the cortex. The influence of this disruption is widespread throughout the developing brain and appears to degrade the duration of these critical windows of plasticity.

“In this study, we were able to directly connect early maturation of synapses to the loss of an important plasticity window in the cortex,” Rumbaugh said. “Early maturation of synapses appears to make the brain less plastic at critical times in development. Children with these mutations appear to have brains that were built incorrectly from the ground up.”

The accelerated maturation also appeared to occur surprisingly early in the developing cortex. That, Rumbaugh added, would correspond to the first two years of a child’s life, when the brain is expanding rapidly. “Our goal now is to figure out a way to prevent the damage caused by SYNGAP1 mutations. We would be more likely to help that child if we could intervene very early on—before the mutation has done its damage,” he said.

Filed under brain development neuroplasticity sensory perception hippocampus genetics neuroscience science

181 notes

Scientists discover previously unknown requirement for brain development

Scientists at the Salk Institute for Biological Studies have demonstrated that sensory regions in the brain develop in a fundamentally different way than previously thought, a finding that may yield new insights into visual and neural disorders.

In a paper published June 7, 2013, in Science, Salk researcher Dennis O’Leary and his colleagues have shown that genes alone do not determine how the cerebral cortex grows into separate functional areas. Instead, they show that input from the thalamus, the main switching station in the brain for sensory information, is crucially required.

O’Leary has done pioneering studies in “arealization,” the way in which the neo-cortex, the major region of the cerebral cortex, develops specific areas dedicated to particular functions. In a landmark paper published in Science in 2000, he showed that two regulatory genes were critically responsible for the general pattern of the neo-cortex, and has since shown distinct roles for other genes in this process. In this new set of mouse experiments, his laboratory focused on the visual system, and discovered a new, unexpected twist to the story.

"In order to function properly, it is essential that cortical areas are mapped out correctly, and it is this architecture that was thought to be genetically pre-programmed," says O’Leary, holder of the Vincent J. Coates Chair in Molecular Neurobiology at Salk. "To our surprise, we discovered thalamic input plays an essential role far earlier in brain development."

Vision is relayed from the outside world into processing areas within the brain. The relay starts when light hits the retina, a thin strip of cells at the back of the eye that detects color and light levels and encodes the information as electrical and chemical signals. Through retinal ganglion cells, those signals are then sent into the lateral geniculate nucleus (LGN), a structure in the thalamus.

In the next important step in the relay, the LGN routes the signals into the primary visual area (V1) in the neo-cortex, a multi-layered structure that is divided into functionally and anatomically distinct areas. V1 begins the process of extracting visual information, which is further carried out by “higher order” visual areas in the neo-cortex that are vitally important to visual perception. Like parts in a machine, the functions of these areas are both individual and integrated. Damage in one tiny area can lead to strange visual disorders in which a person may be able to see a moving ball, and yet not perceive it is in motion.

Current dogma holds that this basic architecture is entirely genetically determined, with environmental input only playing a role later in development. One of the most famous examples of this idea is the Nobel Prize-winning work of visual neuroscientists David Hubel and Torsten Wiesel, which showed that there is a “critical period” of sensitivity in vision. Their finding was commonly interpreted as a warning that without exposure to basic visual stimuli early in life, even an individual with a healthy brain will be unable to see correctly.

Later discoveries in neural plasticity more optimistically suggested that early deprivation can be overcome, and the brain can even sprout new neurons in specific areas. Nevertheless, this still reinforced the idea that environmental influences might modify neural architecture, but only genetics could establish how cortical areas would be laid out.

In their new study, however, O’Leary and the paper’s co-first authors, Shen-Ju Chou and Zoila Babot, post-doctoral researchers in O’Leary’s laboratory, show that genetics only provides a broad field in the neo-cortex for visual areas.

When they created mouse mutants that disconnected the link between thalamus and cortex but only after early cortical development was complete, they found that the primary and higher order visual areas failed to differentiate from one another as they should.

"Our new understanding is that genes only create a rough lay-out of cortical areas," explains O’Leary. "There must be thalamic input to develop the fine differentiation necessary for proper sensory processing."

Essentially, if the brain were a house, genes would determine which areas were bedrooms. Thalamic input provides the details, distinguishing what will be the master bedroom, a child’s bedroom, a guest bedroom and so on. “The size and location of areas within the overall cortex does not change, but without thalamic input from the LGN, the critical differentiation process that creates primary and higher order visual areas does not happen,” says O’Leary.

Given that most sensory modalities—sight, hearing, touch—route through thalamus to cortex, this experiment may suggest why, when someone lacks a sensory modality from birth, that individual has a harder time processing restored sensory input than someone who lost the sense later in life. But in addition, as O’Leary says, “More subtle changes in thalamic input in humans would also likely result in changes to the neo-cortex that could well have a substantial impact on the ability to process vision, or other senses, and lead to abnormal behavior.”

O’Leary says his lab plans to continue to explore the links between how cortical areas in the brain are established and various developmental disorders, such as autism.

(Image: Nucleus Medical Art, Inc.)

Filed under brain development brain mapping neuroplasticity neurons neocortex LGN neuroscience science

75 notes

Compound enhances SSRI antidepressant’s effects in mice

A synthetic compound is able to turn off “secondary” vacuum cleaners in the brain that take up serotonin, resulting in the “happy” chemical being more plentiful, scientists from the School of Medicine at The University of Texas Health Science Center San Antonio have discovered. Their study, released June 18 by The Journal of Neuroscience, points to novel targets to treat depression.

Serotonin, a neurotransmitter that carries chemical signals, is associated with feelings of wellness. Selective serotonin reuptake inhibitors (SSRIs) are commonly prescribed antidepressants that block a specific “vacuum cleaner” for serotonin (the serotonin transporter, or SERT) from taking up serotonin, resulting in more supply of the neurotransmitter in circulation in the extracellular fluid of the brain.

Delicate balance

"Serotonin is released by neurons in the brain," said Lyn Daws, Ph.D., professor of physiology and pharmacology in the School of Medicine. "Too much or too little may be a bad thing. It is thought that having too little serotonin is linked to depression. That’s why we think Prozac-type drugs (SSRIs) work, by stopping the serotonin transporter from taking up serotonin from extracellular fluid in the brain."

A problem with SSRIs is that many depressed patients experience modest or no therapeutic benefit. It turns out that, while SSRIs block the activity of the serotonin transporter, they don’t block other “vacuum cleaners.” “Until now we did not appreciate the presence of backup cleaners for serotonin,” Dr. Daws said. “We were not the first to show their presence in the brain, but we were among the first to show that they were limiting the ability of the SSRIs to increase serotonin signaling in the brain. The study described in this new paper is the first demonstration of enhancing the antidepressant-like effect of an SSRI by concurrently blocking these backup vacuum cleaners.”

Serotonin ceiling

Even if SERT activity is blocked, the backup vacuum cleaners (called organic cation transporters) keep a ceiling on how high the serotonin levels can rise, which likely limits the optimal therapeutic benefit to the patient, Dr. Daws said.
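To see why blocking only SERT leaves a ceiling in place, here is a toy steady-state sketch in Python. It is not a model from the study: the release rate and the clearance constants for SERT and the organic cation transporters (OCTs) are made-up numbers chosen purely to illustrate the logic of blocking one versus both routes of uptake.

```python
# Toy model: extracellular serotonin settles where release balances clearance,
# so steady-state level = release rate / total remaining clearance.
# All constants below are hypothetical, for illustration only.

RELEASE = 1.0   # arbitrary serotonin release rate
K_SERT = 0.8    # clearance via the serotonin transporter (SERT)
K_OCT = 0.2     # clearance via "backup" organic cation transporters (OCTs)

def steady_state_5ht(sert_blocked: float, oct_blocked: float) -> float:
    """Steady-state extracellular serotonin.

    sert_blocked and oct_blocked are blockade fractions
    (0 = transporter fully active, 1 = fully blocked).
    """
    clearance = K_SERT * (1 - sert_blocked) + K_OCT * (1 - oct_blocked)
    return RELEASE / clearance

baseline = steady_state_5ht(0.0, 0.0)   # both transporters active
ssri_only = steady_state_5ht(1.0, 0.0)  # SERT blocked; OCTs impose the ceiling
ssri_plus = steady_state_5ht(1.0, 0.9)  # SERT blocked AND OCTs mostly blocked

print(baseline, ssri_only, ssri_plus)
```

In this sketch, an SSRI alone raises serotonin only as far as the backup transporters allow; adding a decynium-22-like blocker of the OCTs lifts that ceiling much further, which is the qualitative point Dr. Daws describes.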

"Right now, the compound we have, decynium-22, is not an agent that we want to give to people in clinical trials," she said. "We are not there yet. Where we are is being able to use this compound to identify new targets in the brain for antidepressant activity and to turn to medicinal chemists to design molecules to block these secondary vacuum cleaners."

(Source: eurekalert.org)

Filed under antidepressants depression serotonin SSRIs decynium-22 medicine neuroscience science

51 notes

Alzheimer’s disease protein controls movement in mice

Researchers in Berlin and Munich, Germany and Oxford, United Kingdom, have revealed that a protein well known for its role in Alzheimer’s disease controls spindle development in muscle and leads to impaired movement in mice when the protein is absent or treated with inhibitors. The results, which are published in The EMBO Journal, suggest that drugs under development to target the beta-secretase-1 protein, which may be potential treatments for Alzheimer’s disease, might produce unwanted side effects related to defective movement.

Alzheimer’s disease is the most common form of dementia found in older adults. The World Health Organization estimates that approximately 18 million people worldwide have Alzheimer’s disease. The number of people affected by the disease may increase to 34 million by 2025. Scientists know that the protein beta-secretase-1 or Bace1, a protease enzyme that breaks down proteins into smaller molecules, is involved in Alzheimer’s disease. Bace1 cleaves the amyloid precursor protein and generates the damaging Abeta peptides that accumulate as plaques in the brain leading to disease. Now scientists have revealed in more detail how Bace1 works.

"Our results show that mice that lack Bace1 proteins or are treated with inhibitors of the enzyme have difficulties in coordination and walking and also show reduced muscle strength," remarked Carmen Birchmeier, one of the authors of the paper, Professor at the Max-Delbrück-Center for Molecular Medicine in Berlin, Germany, and an EMBO Member. "In addition, we were able to show that the combined activities of Bace1 and another protein, neuregulin-1 or Nrg1, are needed to sustain the muscle spindles in mice and to maintain motor coordination."

Muscle spindles are sensory organs that are found throughout the muscles of vertebrates. They are able to detect how muscles stretch and convey the perception of body position to the brain. The researchers used genetic analyses, biochemical studies and interference with pharmacological inhibitors to investigate how Bace1 works in mice. “If the signal strength of a specific form of neuregulin-1 known as IgNrg1 is gradually reduced, increasingly severe defects in the formation and maturation of muscle spindles are observed in mice. Furthermore, it appears that Bace1 is required for full IgNrg1 activity. The graded loss of IgNrg1 activity results in the animals having increasing difficulties with movement and coordination,” says Cyril Cheret, the first author of the work.

Drug developers are interested in stopping the Bace1 protein in its tracks because it represents a promising route to treat Alzheimer’s disease. If the protein were inhibited, it would interfere with the generation of the smaller damaging proteins that accumulate in the brain as amyloid plaques and would therefore provide some level of protection from the effects of the disease. “Our data indicate that one unwanted side effect of the long-term inhibition of Bace1 might be the disruption of muscle spindle formation and impairment of movement. This finding is relevant to scientists looking for ways to develop drugs that target the Bace1 protein and should be considered,” says Birchmeier. Several Bace1 inhibitors are currently being tested in phase II and phase III clinical trials for the treatment of Alzheimer’s disease.

Filed under alzheimer's disease dementia neurodegenerative diseases movement impairment BACE1 muscle spindles neuroscience science

82 notes

Scientists Design a Potential Drug Compound that Attacks Parkinson’s Disease on Two Fronts

Scientists from the Florida campus of The Scripps Research Institute (TSRI) have found a compound that could counter Parkinson’s disease in two ways at once.

In a new study published recently online ahead of print by the journal ACS Chemical Biology, the scientists describe a “dual inhibitor”—two compounds in a single molecule—that attacks a pair of proteins closely associated with development of Parkinson’s disease.

“In general, these two enzymes amplify the effect of each other,” said team leader Phil LoGrasso, a TSRI professor who has been a pioneer in the development of JNK inhibitors for the treatment of neurodegenerative diseases. “What we were looking for is a high-affinity, high-selectivity treatment that is additive or synergistic in its effect—a one-two punch.”

That could be what they found.

This new dual inhibitor attacks two enzymes: leucine-rich repeat kinase 2 (LRRK2) and c-Jun N-terminal kinase (JNK, pronounced “junk”). Genetic testing of several thousand Parkinson’s patients has shown that mutations in the LRRK2 gene increase the risk of Parkinson’s disease, while JNK has been shown to play an important role in neuron (nerve cell) survival in a range of neurodegenerative diseases. As such, the two enzymes have become highly viable targets for drugs to treat disorders such as Parkinson’s disease.

A dual inhibitor ultimately would be preferred over separate individual JNK and LRRK2 inhibitors because a combination molecule would eliminate complications of drug-drug interactions and the need to optimize individual inhibitor doses for efficacy, the study noted.

Now the team’s new dual inhibitor will need to be optimized for potency, high selectivity (which reduces off-target side effects) and bioavailability so it can be tested in animal models of Parkinson’s disease.

(Source: scripps.edu)

Filed under neurodegenerative diseases neurodegeneration parkinson's disease neurons JNK inhibitors neuroscience science

73 notes

Hong Kong Skyscrapers Appear to Fall in Real-World Illusion

No matter how we jump, roll, sit, or lie down, our brain manages to maintain a visual representation of the world that stays upright relative to the pull of gravity. But a new study of rider experiences on the Hong Kong Peak Tram, a popular tourist attraction, shows that specific features of the environment can dominate our perception of verticality, making skyscrapers appear to fall.


The study is published in Psychological Science, a journal of the Association for Psychological Science.

The Hong Kong Peak Tram to Victoria Peak is a popular way to survey the Hong Kong skyline, and millions of people ride the tram every year.

“On one trip, I noticed that the city’s skyscrapers next to the tram started to appear very tilted, as if they were falling, which anyone with common sense knows is impossible,” says lead researcher Chia-huei Tseng of the University of Hong Kong. “The gasps of the other passengers told me I wasn’t the only one seeing it.”

The illusion was perplexing because, in contrast with most illusions studied in the laboratory, observers have complete access to visual cues from the outside world through the tram’s open windows.

Exploring the illusion under various conditions, Tseng and colleagues found that the perceived, or illusory, tilt was greatest on night-time rides, perhaps a result of the relative absence of visual-orientation cues or a heightened sense of enclosure at night. Enhancing the tilted frame of reference within the tram car — indicated by features like oblique window frames, beams, floor, and lighting fixtures — makes the true vertical of the high rises seem to tilt in the opposite direction.

The illusion was significantly reduced by obscuring the window frame and other reference cues inside the tram car, by using wedges to adjust observers’ position, and by having them stand during the tram ride.

But no single modification was sufficient to eliminate the illusion.

“Our findings demonstrate that signals from all the senses must be consonant with each other to abolish the tilt illusion,” the researchers write. “On the tram, it seems that vision dominates verticality perception over other sensory modalities that also mediate earth gravity, such as the vestibular and tactile systems.”
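The idea that vision outweighs the vestibular and tactile senses can be sketched as a simple weighted cue combination. The Python below is a toy illustration, not the researchers’ analysis: the tilt angle of the tram’s frame and the cue weights are invented numbers, chosen only to show how a heavier visual weight pulls the perceived vertical toward the tilted car interior and makes truly vertical buildings appear to lean.

```python
def perceived_vertical(frame_tilt_deg: float,
                       w_visual: float,
                       w_vestibular: float) -> float:
    """Weighted average of two 'up' estimates: the tilted visual frame
    of the tram car, and gravity (0 degrees) from vestibular/tactile cues."""
    return (w_visual * frame_tilt_deg + w_vestibular * 0.0) / (w_visual + w_vestibular)

def apparent_building_tilt(frame_tilt_deg: float,
                           w_visual: float,
                           w_vestibular: float) -> float:
    # Buildings are truly vertical (0 degrees); they appear tilted away
    # from the observer's biased subjective vertical.
    return 0.0 - perceived_vertical(frame_tilt_deg, w_visual, w_vestibular)

# Hypothetical tram car pitched 20 degrees; equal cue weights:
print(apparent_building_tilt(20.0, 1.0, 1.0))  # -10.0
# Night ride: fewer outside visual-orientation cues, so the tilted
# car frame is weighted more heavily -> stronger illusion:
print(apparent_building_tilt(20.0, 3.0, 1.0))  # -15.0
```

In this sketch the illusion only vanishes when the visual weight drops to zero or the frame cues are removed, echoing the finding that no single modification eliminated it.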

The robustness of the tram illusion took the researchers by surprise:

“We took the same tram up and down for hundreds of trips, and the illusion did not reduce a bit,” says Tseng. “This suggests that our experiences and our learned knowledge about the world — that buildings should be vertical — are not enough to cancel our brain’s wrong conclusion.”

Filed under tram illusion perception skyscrapers visual representation psychology neuroscience science

93 notes

Brain Can Plan Actions Toward Things the Eye Doesn’t See

People can plan strategic movements to several different targets at the same time, even when they see far fewer targets than are actually present, according to a new study published in Psychological Science, a journal of the Association for Psychological Science.


A team of researchers at the Brain and Mind Institute at the University of Western Ontario took advantage of a pictorial illusion — known as the “connectedness illusion” — that causes people to underestimate the number of targets they see.

When people act on these targets, however, they can rapidly plan accurate and strategic reaches that reflect the actual number of targets.

Using sophisticated statistical techniques to analyze participants’ responses to multiple potential targets, the researchers found that participants’ reaches to the targets were unaffected by the presence of the connecting lines.

Thus, the “connectedness illusion” seemed to influence the number of targets they perceived but did not impact their ability to plan actions related to the targets.

These findings indicate that the processes in the brain that plan visually guided actions are distinct from those that allow us to perceive the world.

“The design of the experiments allowed us to separate these two processes, even though they normally unfold at the same time,” explained lead researcher Jennifer Milne, a PhD student at the University of Western Ontario.

“It’s as though we have a semi-autonomous robot in our brain that plans and executes actions on our behalf with only the broadest of instructions from us!”

According to Mel Goodale, professor at the University of Western Ontario and senior author on the paper, these findings “not only reveal just how sophisticated the visuomotor systems in the brain are, but could also have important implications for the design and implementation of robotic systems and efficient human-machine interfaces.”

Filed under brain connectedness illusion visuomotor systems visual perception psychology neuroscience science
