Neuroscience

Articles and news from the latest research reports.

Posts tagged science

69 notes

Getting to grips with migraine
Researchers identify some of the biological roots of migraine from large-scale genome study
In the largest study of migraine to date, researchers have found five genetic regions that have been linked to the onset of migraine for the first time. The study opens new doors to understanding the causes and biological triggers that underlie migraine attacks.
The team identified 12 genetic regions associated with migraine susceptibility. Eight of these regions were found in or near genes known to play a role in controlling brain circuitries, and two were associated with genes responsible for maintaining healthy brain tissue. The regulation of these pathways may be important to genetic susceptibility to migraine.
Migraine is a debilitating disorder that affects approximately 14% of adults. It was recently ranked the seventh-highest cause of disability in the Global Burden of Disease Survey 2010 and has been estimated to be the most costly neurological disorder. Migraine is extremely difficult to study because no biomarkers have so far been identified either between or during attacks.
"This study has greatly advanced our biological insight about the cause of migraine," says Dr Aarno Palotie, from the Wellcome Trust Sanger Institute. "Migraine and epilepsy are particularly difficult neural conditions to study; between episodes the patient is basically healthy so it’s extremely difficult to uncover biochemical clues.
"We have proven that this is the most effective approach to study this type of neurological disorder and understand the biology that lies at the heart of it."
The team uncovered these susceptibilities by comparing the results of 29 different genomic studies, comprising over 100,000 samples from migraine patients and controls.
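The release does not spell out the statistics, but genome-wide meta-analyses of this kind typically pool each study's per-variant effect estimates using fixed-effects inverse-variance weighting. A minimal sketch of that combination step, with entirely illustrative numbers:

```python
import math

def inverse_variance_meta(effects, std_errors):
    """Fixed-effects meta-analysis: pool per-study effect sizes
    (e.g. log odds ratios for one variant) weighted by 1/SE^2."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * b for w, b in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se, pooled / pooled_se  # estimate, SE, z-score

# Illustrative estimates for one variant across three of the 29 studies
pooled, se, z = inverse_variance_meta([0.12, 0.08, 0.15],
                                      [0.04, 0.05, 0.06])
```

A variant would typically count as genome-wide significant only if its pooled p-value clears the conventional 5 × 10⁻⁸ threshold; regions with weaker evidence fall into the "suggestive" category.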
They found that some of the susceptibility regions lie close to a network of genes sensitive to oxidative stress, a biochemical process that can result in cellular dysfunction.
The team expects that many of the genes in these migraine-associated regions are interconnected and could be disrupting the internal regulation of tissue and cells in the brain, producing some of the symptoms of migraine.
"We would not have made discoveries by studying smaller groups of individuals," says Dr Gisela Terwindt, co-author from Leiden University Medical Centre. "This large scale method of studying over 100,000 samples of healthy and affected people means we can tease out the genes that are important suspects and follow them up in the lab."
The team identified an additional 134 genetic regions possibly associated with migraine susceptibility, though with weaker statistical evidence. Whether these regions truly underlie migraine susceptibility remains to be elucidated, but similar studies show that such statistically weaker culprits can play an equal part in the underlying biology of a disease or disorder.
"The molecular mechanisms of migraine are poorly understood. The sequence variants uncovered through this meta-analysis could become a foothold for further studies to better understand the pathophysiology of migraine," says Dr Kári Stefánsson, President of deCODE genetics.
"This approach is the most efficient way of revealing the underlying biology of these neural disorders," says Dr Mark Daly, from the Massachusetts General Hospital and the Broad Institute of MIT and Harvard. "Effective studies that give us biological or biochemical results and insights are essential if we are to fully get to grips with this debilitating condition.
"Pursuing these studies in even larger samples and with denser maps of biological markers will increase our power to determine the roots and triggers of this disabling disorder."


Filed under migraines brain circuitry brain tissue genetics genomics neuroscience science

154 notes

Addiction Relapse Might Be Thwarted By Turning Off Brain Trigger
Researchers at the Ernest Gallo Clinic and Research Center at UC San Francisco have been able to identify and deactivate a brain pathway linked to memories that cause alcohol cravings in rats, a finding that may one day lead to a treatment option for people who suffer from alcohol abuse disorders and other addictions.
In the study, researchers were able to prevent the addicted animals from seeking alcohol and drinking it, the equivalent of relapse.
“One of the main causes of relapse is craving, triggered by memories evoked by certain cues – like going into a bar, or the smell or taste of alcohol,” said lead author Segev Barak, PhD, at the time a postdoctoral fellow in the lab of co-senior author Dorit Ron, PhD, a Gallo Center investigator and UCSF professor of neurology.
“We learned that when rats were exposed to the smell or taste of alcohol, there was a small window of opportunity to target the area of the brain that reconsolidates the memory of the craving for alcohol and to weaken or even erase the memory, and thus the craving,” he said.
The study, also supervised by co-senior author Patricia H. Janak, PhD, a Gallo Center investigator and UCSF professor of neurology, was published online on June 23 in Nature Neuroscience.
Neural Mechanism That Triggers Alcohol Memory
In the first phase of the study, rats had the choice to freely drink water or alcohol over the course of seven weeks, and during this time developed a high preference for alcohol.
In the next phase, they had the opportunity to access alcohol for one hour a day, which they learned to do by pressing a lever. They were then put through a 10-day period of abstinence from alcohol.
Following this period, the animals were exposed for five minutes to just the smell and taste of alcohol, which cued them to remember how much they liked drinking it. The researchers then scanned the animals’ brains, and identified the neural mechanism responsible for the reactivation of the memory of the alcohol – a molecular pathway mediated by an enzyme known as mammalian target of rapamycin complex 1 (mTORC1).
They found that just a small drop of alcohol presented to the rats turned on the mTORC1 pathway specifically in a select region of the amygdala, a structure linked to emotional reactions and withdrawal from alcohol, and cortical regions involved in memory processing.
They further showed that once mTORC1 was activated, the alcohol memory stabilized (reconsolidated) and the rats relapsed in the following days – meaning, in this case, that they started pressing the lever again to dispense more alcohol.
“The smell and taste of alcohol were such strong cues that we could target the memory specifically without impacting other memories, such as a craving for sugar,” said Barak, who added that the Ron research group has been doing brain studies for many years and has never seen such a robust and specific activation in the brain.
Drug that Erases the Memory of Alcohol
In the next part of the study, the researchers set out to see if they could prevent the reconsolidation of the memory of alcohol by inhibiting mTORC1, thus preventing relapse. When mTORC1 was inactivated using a drug called rapamycin, administered immediately after the exposure to the cue (smell, taste), there was no relapse to alcohol-seeking the next day.
Strikingly, drinking remained suppressed for up to 14 days, the end point of the study. These results suggest that rapamycin erased the memory of alcohol for a long period, said Ron.
The authors said the study is an important first step, but that more research is needed to determine how mTORC1 contributes to alcohol memory reconsolidation and whether turning off mTORC1 with rapamycin would prevent relapse for more than two weeks.
The authors also said it would be interesting to test if rapamycin, an FDA-approved drug currently used to prevent organ rejection after transplantation, or other mTORC1 inhibitors that are currently being developed in pharmaceutical companies, would prevent relapse in human alcoholics.
“One of the main problems in alcohol abuse disorders is relapse, and current treatment options are very limited,” Barak said. “Even after detoxification and a period of rehabilitation, 70 to 80 percent of patients will relapse in the first several years. It is really thrilling that we were able to completely erase the memory of alcohol and prevent relapse in these animals. This could be a revolution in treatment approaches for addiction, in terms of erasing unwanted memories and thereby manipulating the brain triggers that are so problematic for people with addictions.”


Filed under alcohol abuse addiction amygdala rapamycin mTORC1 memory neuroscience science

790 notes

Trying to Learn a Foreign Language? Avoid Reminders of Home
Something odd happened when Shu Zhang was giving a presentation to her classmates at the Columbia Business School in New York City. Zhang, a Chinese native, spoke fluent English, yet in the middle of her talk, she glanced over at her Chinese professor and suddenly blurted out a word in Mandarin. “I meant to say a transition word like ‘however,’ but used the Chinese version instead,” she says. “It really shocked me.”
Shortly afterward, Zhang teamed up with Columbia social psychologist Michael Morris and colleagues to figure out what had happened. In a new study, they show that reminders of one’s homeland can hinder the ability to speak a new language. The findings could help explain why cultural immersion is the most effective way to learn a foreign tongue and why immigrants who settle within an ethnic enclave acculturate more slowly than those who surround themselves with friends from their new country.
Previous studies have shown that cultural icons such as landmarks and celebrities act like “magnets of meaning,” instantly activating a web of cultural associations in the mind and influencing our judgments and behavior, Morris says. In an earlier study, for example, he asked Chinese Americans to explain what was happening in a photograph of several fish, in which one fish swam slightly ahead of the others. Subjects first shown Chinese symbols, such as the Great Wall or a dragon, interpreted the fish as being chased. But individuals primed with American images of Marilyn Monroe or Superman, in contrast, tended to interpret the outlying fish as leading the others. This internally driven motivation is more typical of individualistic American values, some social psychologists say, whereas the more externally driven explanation of being pursued is more typical of Chinese culture.
To determine whether these cultural icons can also interfere with speaking a second language, Zhang, Morris, and their colleagues recruited male and female Chinese students who had lived in the United States for less than a year and had them sit opposite a computer monitor that displayed the face of either a Chinese or Caucasian male called “Michael Lee.” As microphones recorded their speech, the volunteers conversed with Lee, who spoke to them in English with an American accent about campus life.
Next, the team compared the fluency of the volunteers’ speech when they were talking to a Chinese versus a Caucasian face. Although participants reported a more positive experience chatting with the Chinese version of “Michael Lee,” they were significantly less fluent, producing 11% fewer words per minute on average, the authors report online today in the Proceedings of the National Academy of Sciences. “It’s ironic” that the more comfortable volunteers were with their conversational partner, the less fluent they became, Zhang says. “That’s something we did not expect.”
To rule out the possibility that the volunteers were speaking more fluently to the Caucasian face on purpose, thus explaining the performance gap, Zhang and colleagues asked the participants to invent a story, such as a boy swimming in the ocean, while simultaneously being exposed to Chinese and American icons rather than faces. Seeing Chinese icons such as the Great Wall also interfered with the volunteers’ English fluency, causing a 16% drop in words produced per minute. The icons also made the volunteers 85% more likely to use a literal translation of the Chinese word for an object rather than the English term, Zhang says. Rather than saying “pistachio,” for example, volunteers used the Chinese version, “happy nuts.”
Understanding how these subtle cultural cues affect language fluency could help employers design better job interviews, Morris says. For example, taking a Japanese job candidate out for sushi, although a well-meaning gesture, might not be the best way to help them shine.
"It’s quite striking that these effects were so robust," says Mary Helen Immordino-Yang, a developmental psychologist at the University of Southern California in Los Angeles. They show that "we’re exquisitely attuned to cultural context," she says, and that "even subtle cues like the ethnicity of the person we’re talking to" can affect language processing. The take-home message? "If one wants to acculturate rapidly, don’t move to an ethnic enclave neighborhood where you’ll be surrounded by people like yourself," Morris says. Sometimes, a familiar face is the last thing you need to see.


Filed under cross-language interference language processing cultural cues psychology neuroscience science

124 notes

Repairing Bad Memories
It was a Saturday night at the New York Psychoanalytic Institute, and the second-floor auditorium held an odd mix of gray-haired, cerebral Upper East Side types and young, scruffy downtown grad students in black denim. Up on the stage, neuroscientist Daniela Schiller, a riveting figure with her long, straight hair and impossibly erect posture, paused briefly from what she was doing to deliver a mini-lecture about memory.
She explained how recent research, including her own, has shown that memories are not unchanging physical traces in the brain. Instead, they are malleable constructs that may be rebuilt every time they are recalled. The research suggests, she said, that doctors (and psychotherapists) might be able to use this knowledge to help patients block the fearful emotions they experience when recalling a traumatic event, converting chronic sources of debilitating anxiety into benign trips down memory lane.
And then Schiller went back to what she had been doing, which was providing a slamming, rhythmic beat on drums and backup vocals for the Amygdaloids, a rock band composed of New York City neuroscientists. During their performance at the institute’s second annual “Heavy Mental Variety Show,” the band blasted out a selection of its greatest hits, including songs about cognition (“Theory of My Mind”), memory (“A Trace”), and psychopathology (“Brainstorm”).
“Just give me a pill,” Schiller crooned at one point, during the chorus of a song called “Memory Pill.” “Wash away my memories …”
The irony is that if research by Schiller and others holds up, you may not even need a pill to strip a memory of its power to frighten or oppress you.
Read more


Filed under memory emotional memory reconsolidation dementia neuroscience science

95 notes

Testosterone could combat dementia in women

In a new study, post-menopausal women on testosterone therapy showed a significant improvement in verbal learning and memory, offering a promising avenue for research into memory and ageing.


Led by Director of the Women’s Health Research Program at Monash University, Professor Susan Davis, and presented at ENDO 2013, the research is the first large, randomised, placebo-controlled investigation into the effects of testosterone on cognitive function in postmenopausal women.

Testosterone has been implicated as being important for brain function in men and these results indicate that it has a role in optimising learning and memory in women.

Dementia, which was estimated to affect more than 35 million people worldwide in 2010, is more common in women than men. There are no effective treatments to prevent memory decline.

In the study, 96 postmenopausal women recruited from the community were randomly allocated to receive a testosterone gel or a visually identical placebo gel to be applied to the skin. Participants underwent a comprehensive series of cognitive tests at the beginning of the study and 26 weeks later.

All women performed in the normal range for their age at the beginning of the trial. There was a statistically significant and clinically meaningful improvement in verbal learning and memory amongst the women using the testosterone gel after 26 weeks.
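The release does not say which statistical test was used; a standard choice for comparing 26-week change scores between the testosterone and placebo groups would be a two-sample comparison such as Welch's t-test. A sketch with made-up scores (the trial's actual data are not reproduced here):

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and degrees of freedom,
    robust to unequal group variances."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / math.sqrt(va + vb)
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

# Hypothetical 26-week changes in a verbal learning and memory score
testosterone = [3.1, 2.4, 4.0, 1.8, 2.9, 3.5]
placebo = [0.4, 1.1, -0.2, 0.8, 0.3, 0.9]
t, df = welch_t(testosterone, placebo)  # large t => group difference
```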

Professor Davis said the results indicated that testosterone played an important role in women’s health. 

"Much of the research on testosterone in women to date has focused on sexual function. But testosterone has widespread effects in women, including, it appears, significant favourable effects on verbal learning and memory," Professor Davis said. 

"Our findings provide compelling evidence for the conduct of larger clinical studies to further investigate the role of testosterone in cognitive function in women."

Androgen levels did increase in the cohort on testosterone therapy but, on average, remained in the normal female range. No negative side effects of the therapy were observed.

Filed under testosterone memory dementia aging cognitive function women neuroscience science

316 notes

Time perception altered by mindfulness meditation
New published research from psychologists at the universities of Kent and Witten/Herdecke has shown that mindfulness meditation has the ability to temporarily alter practitioners’ perceptions of time – a finding that has wider implications for the use of mindfulness both as an everyday practice, and in clinical treatments and interventions.
Led by Dr Robin Kramer from Kent’s School of Psychology, the research team hypothesised that, given mindfulness’ emphasis on moment-to-moment awareness, mindfulness meditation would slow down time and produce the feeling that short periods of time lasted longer.
To test this hypothesis, they used a temporal bisection task, which allows researchers to measure where each individual subjectively splits a period of time in half. Participants’ responses to this task were collected twice, once before and again after a listening task. Participants were split into two groups and listened for ten minutes either to an audiobook or to a meditation exercise designed to focus their attention on the movement of breath in the body. The results showed that the control group (audiobook) didn’t change in their responses after the listening task compared with before. Meditation, however, led to a relative overestimation of durations; that is, time periods felt longer than they had before.
The reasons for this have been interpreted by Dr Kramer and team as the result of attentional changes, producing either improved attentional resources that allow increased attention to the processing of time, or a shift to internally-oriented attention that would have the same effect.
Dr Kramer said: ‘Our findings represent some of the first to demonstrate how mindfulness meditation can alter the perception of time. Given the increasing popularity of mindfulness in everyday practice, its relationship with time perception may provide an important step in our understanding of this pervasive, ancient practice in our modern world.’
Dr Kramer also explained that the benefits of mindfulness and mindfulness-based therapies in a variety of domains are now being identified. These include decreases in rumination, improvements in cognitive flexibility, working memory capacity and sustained attention, and reductions in reactivity, anxiety and depressive symptoms. Mindfulness-based treatments also appear to provide broad antidepressant and antianxiety effects, as well as decreases in general psychological distress. As such, these interventions have been applied with a variety of patients, including those suffering from fibromyalgia, psoriasis, cancer, binge eating and chronic pain.
Dr Dinkar Sharma, Senior Lecturer in Psychology at Kent, commented: ‘Demonstrating that mindfulness has an effect on time perception is important because it opens up the opportunity that mindfulness could be used to alter psychological disorders that are associated with a range of distortions in the perception of time - such as disorders of memory, emotion and addiction.’
Dr Ulrich Weger, of Witten/Herdecke’s Department of Psychology and Psychotherapy, concluded by stating that ‘the impact of a brief mindfulness exercise on elementary processes such as time perception is remarkable’.
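A temporal bisection analysis boils down to estimating each participant’s bisection point: the probe duration judged “closer to long” on exactly half of trials. Here is a minimal sketch with made-up response proportions (the anchors, probes and proportions are illustrative, not data from this study):

```python
# Hypothetical temporal-bisection data: probe durations (ms) between a
# short (400 ms) and a long (1600 ms) anchor, and the proportion of
# trials on which a participant judged each probe "closer to long".
durations   = [400, 600, 800, 1000, 1200, 1400, 1600]
p_long_pre  = [0.02, 0.10, 0.30, 0.55, 0.80, 0.95, 0.99]  # before the listening task
p_long_post = [0.05, 0.20, 0.45, 0.70, 0.90, 0.97, 1.00]  # after meditation

def bisection_point(durs, p_long):
    """Duration judged 'long' on exactly half of trials, by linear interpolation."""
    for (d0, p0), (d1, p1) in zip(zip(durs, p_long), zip(durs[1:], p_long[1:])):
        if p0 <= 0.5 <= p1:
            return d0 + (0.5 - p0) / (p1 - p0) * (d1 - d0)
    raise ValueError("responses never cross 50%")

pre = bisection_point(durations, p_long_pre)
post = bisection_point(durations, p_long_post)
print(f"bisection point: {pre:.0f} ms before, {post:.0f} ms after")
```

A drop in the bisection point after meditation means that shorter probes already feel “long”, i.e. durations are relatively overestimated, which is the direction of shift the study reports.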

Filed under time perception meditation mindful meditation emotion memory psychology neuroscience science

189 notes

Study Shows a Solitary Mutation Can Destroy Critical ‘Window’ of Early Brain Development 
Scientists from the Florida campus of The Scripps Research Institute (TSRI) have shown in animal models that brain damage caused by the loss of a single copy of a gene during very early childhood development can cause a lifetime of behavioral and intellectual problems.
The study, published this week in the Journal of Neuroscience, sheds new light on the early development of neural circuits in the cortex, the part of the brain responsible for functions such as sensory perception, planning and decision-making.
The research also pinpoints the mechanism responsible for the disruption of what are known as “windows of plasticity” that contribute to the refinement of the neural connections that broadly shape brain development and the maturing of perception, language, and cognitive abilities.
The key to normal development of these abilities is that the neural connections in the brain cortex—the synapses—mature at the right time.
In an earlier study, the team, led by TSRI Associate Professor Gavin Rumbaugh, found that in mice missing a single copy of the vital gene, certain synapses develop prematurely within the first few weeks after birth. This accelerated maturation dramatically increases “excitability”—how often brain cells fire—in the hippocampus, a part of the brain critical for memory. The delicate balance between excitability and inhibition is especially critical during early developmental periods. However, it remained a mystery how early maturation of brain circuits could lead to lifelong cognitive and behavioral problems.
The current study shows in mice that the interruption of the synapse-regulating gene known as SYNGAP1—which can cause a devastating form of intellectual disability and increase the risk for developing autism in humans—induces early functional maturation of neural connections in two areas of the cortex. The influence of this disruption is widespread throughout the developing brain and appears to degrade the duration of these critical windows of plasticity.
“In this study, we were able to directly connect early maturation of synapses to the loss of an important plasticity window in the cortex,” Rumbaugh said. “Early maturation of synapses appears to make the brain less plastic at critical times in development. Children with these mutations appear to have brains that were built incorrectly from the ground up.”
The accelerated maturation also appeared to occur surprisingly early in the developing cortex. That, Rumbaugh added, would correspond to the first two years of a child’s life, when the brain is expanding rapidly. “Our goal now is to figure out a way to prevent the damage caused by SYNGAP1 mutations. We would be more likely to help that child if we could intervene very early on—before the mutation has done its damage,” he said.

Filed under brain development neuroplasticity sensory perception hippocampus genetics neuroscience science

181 notes

Scientists discover previously unknown requirement for brain development
Scientists at the Salk Institute for Biological Studies have demonstrated that sensory regions in the brain develop in a fundamentally different way than previously thought, a finding that may yield new insights into visual and neural disorders.
In a paper published June 7, 2013, in Science, Salk researcher Dennis O’Leary and his colleagues have shown that genes alone do not determine how the cerebral cortex grows into separate functional areas. Instead, they show that input from the thalamus, the main switching station in the brain for sensory information, is crucially required.
O’Leary has done pioneering studies in “arealization,” the way in which the neo-cortex, the major region of cerebral cortex, develops specific areas dedicated to particular functions. In a landmark paper published in Science in 2000, he showed that two regulatory genes were critically responsible for the general pattern of the neo-cortex, and has since shown distinct roles for other genes in this process. In this new set of mouse experiments, his laboratory focused on the visual system, and discovered a new, unexpected twist to the story.
"In order to function properly, it is essential that cortical areas are mapped out correctly, and it is this architecture that was thought to be genetically pre-programmed," says O’Leary, holder of the Vincent J. Coates Chair in Molecular Neurobiology at Salk. "To our surprise, we discovered thalamic input plays an essential role far earlier in brain development."
Vision is relayed from the outside world into processing areas within the brain. The relay starts when light hits the retina, a thin strip of cells at the back of the eye that detects color and light levels and encodes the information as electrical and chemical signals. Through retinal ganglion cells, those signals are then sent into the Lateral Geniculate Nucleus (LGN), a structure in the thalamus.
In the next important step in the relay, the LGN routes the signals into the primary visual area (V1) in the neo-cortex, a multi-layered structure that is divided into functionally and anatomically distinct areas. V1 begins the process of extracting visual information, which is further carried out by “higher order” visual areas in the neo-cortex that are vitally important to visual perception. Like parts in a machine, the functions of these areas are both individual and integrated. Damage in one tiny area can lead to strange visual disorders in which a person may be able to see a moving ball, and yet not perceive it is in motion.
Current dogma holds that this basic architecture is entirely genetically determined, with environmental input only playing a role later in development. One of the most famous examples of this idea is the Nobel Prize-winning work of visual neuroscientists David Hubel and Torsten Wiesel, which showed that there is a “critical period” of sensitivity in vision. Their finding was commonly interpreted as a warning that without exposure to basic visual stimuli early in life, even an individual with a healthy brain will be unable to see correctly.
Later discoveries in neural plasticity more optimistically suggested that early deprivation can be overcome, and the brain can even sprout new neurons in specific areas. Nevertheless, this still reinforced the idea that environmental influences might modify neural architecture, but only genetics could establish how cortical areas would be laid out.
In their new study, however, O’Leary and the paper’s co-first authors, Shen-Ju Chou and Zoila Babot, post-doctoral researchers in O’Leary’s laboratory, show that genetics only provides a broad field in the neo-cortex for visual areas.
When they created mouse mutants that disconnected the link between thalamus and cortex but only after early cortical development was complete, they found that the primary and higher order visual areas failed to differentiate from one another as they should.
"Our new understanding is that genes only create a rough lay-out of cortical areas," explains O’Leary. "There must be thalamic input to develop the fine differentiation necessary for proper sensory processing."
Essentially, if the brain were a house, genes would determine which areas were bedrooms. Thalamic input provides the details, distinguishing what will be the master bedroom, a child’s bedroom, a guest bedroom and so on. “The size and location of areas within the overall cortex does not change, but without thalamic input from the LGN, the critical differentiation process that creates primary and higher order visual areas does not happen,” says O’Leary.
Given that most sensory modalities—sight, hearing, touch—route through thalamus to cortex, this experiment may suggest why, when someone lacks a sensory modality from birth, that individual has a harder time processing restored sensory input than someone who lost the sense later in life. But in addition, as O’Leary says, “More subtle changes in thalamic input in humans would also likely result in changes to the neo-cortex that could well have a substantial impact on the ability to process vision, or other senses, and lead to abnormal behavior.”
O’Leary says his lab plans to continue to explore the links between how cortical areas in the brain are established and various developmental disorders, such as autism.
(Image: Nucleus Medical Art, Inc.)

Filed under brain development brain mapping neuroplasticity neurons neocortex LGN neuroscience science

75 notes

Compound enhances SSRI antidepressant’s effects in mice

A synthetic compound is able to turn off “secondary” vacuum cleaners in the brain that take up serotonin, resulting in the “happy” chemical being more plentiful, scientists from the School of Medicine at The University of Texas Health Science Center San Antonio have discovered. Their study, released June 18 by The Journal of Neuroscience, points to novel targets to treat depression.

Serotonin, a neurotransmitter that carries chemical signals, is associated with feelings of wellness. Selective serotonin reuptake inhibitors (SSRIs) are commonly prescribed antidepressants that block a specific “vacuum cleaner” for serotonin (the serotonin transporter, or SERT) from taking up serotonin, resulting in more supply of the neurotransmitter in circulation in the extracellular fluid of the brain.

Delicate balance

"Serotonin is released by neurons in the brain," said Lyn Daws, Ph.D., professor of physiology and pharmacology in the School of Medicine. "Too much or too little may be a bad thing. It is thought that having too little serotonin is linked to depression. That’s why we think Prozac-type drugs (SSRIs) work, by stopping the serotonin transporter from taking up serotonin from extracellular fluid in the brain."

A problem with SSRIs is that many depressed patients experience modest or no therapeutic benefit. It turns out that, while SSRIs block the activity of the serotonin transporter, they don’t block other “vacuum cleaners.” “Until now we did not appreciate the presence of backup cleaners for serotonin,” Dr. Daws said. “We were not the first to show their presence in the brain, but we were among the first to show that they were limiting the ability of the SSRIs to increase serotonin signaling in the brain. The study described in this new paper is the first demonstration of enhancing the antidepressant-like effect of an SSRI by concurrently blocking these backup vacuum cleaners.”

Serotonin ceiling

Even if SERT activity is blocked, the backup vacuum cleaners (called organic cation transporters) keep a ceiling on how high the serotonin levels can rise, which likely limits the optimal therapeutic benefit to the patient, Dr. Daws said.
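This “ceiling” can be illustrated with a toy first-order clearance model. All rate constants below are invented for illustration; the point is only the qualitative behaviour: with release held constant and clearance shared among parallel pathways, blocking SERT alone still leaves OCT clearance to cap the rise in extracellular serotonin.

```python
# Toy first-order clearance model (all rate constants invented).
# Extracellular serotonin is released at a constant rate and removed in
# parallel by SERT, by the "backup" organic cation transporters (OCTs),
# and by a small residual (diffusion) pathway.
RELEASE = 1.0                              # arbitrary release rate
K_SERT, K_OCT, K_RESID = 0.8, 0.15, 0.05   # clearance rate constants

def steady_state(sert_blocked=False, oct_blocked=False):
    """At steady state, release = (total clearance) * concentration."""
    k = K_RESID
    if not sert_blocked:
        k += K_SERT
    if not oct_blocked:
        k += K_OCT
    return RELEASE / k

baseline  = steady_state()                                     # all pathways active
ssri_only = steady_state(sert_blocked=True)                    # OCTs cap the rise
ssri_d22  = steady_state(sert_blocked=True, oct_blocked=True)  # ceiling lifted
print(f"{baseline:.1f} -> {ssri_only:.1f} -> {ssri_d22:.1f}")
```

With these illustrative constants, the steady-state serotonin level rises five-fold when SERT alone is blocked, but twenty-fold when the backup transporters are blocked as well.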

"Right now, the compound we have, decynium-22, is not an agent that we want to give to people in clinical trials," she said. "We are not there yet. Where we are is being able to use this compound to identify new targets in the brain for antidepressant activity and to turn to medicinal chemists to design molecules to block these secondary vacuum cleaners."

(Source: eurekalert.org)

Filed under antidepressants depression serotonin SSRIs decynium-22 medicine neuroscience science

51 notes

Alzheimer’s disease protein controls movement in mice
Researchers in Berlin and Munich, Germany, and Oxford, United Kingdom, have revealed that a protein well known for its role in Alzheimer’s disease controls spindle development in muscle, and that movement in mice is impaired when the protein is absent or inhibited. The results, which are published in The EMBO Journal, suggest that drugs under development to target the beta-secretase-1 protein as potential treatments for Alzheimer’s disease might produce unwanted side effects related to defective movement.
Alzheimer’s disease is the most common form of dementia found in older adults. The World Health Organization estimates that approximately 18 million people worldwide have Alzheimer’s disease. The number of people affected by the disease may increase to 34 million by 2025. Scientists know that the protein beta-secretase-1 or Bace1, a protease enzyme that breaks down proteins into smaller molecules, is involved in Alzheimer’s disease. Bace1 cleaves the amyloid precursor protein and generates the damaging Abeta peptides that accumulate as plaques in the brain leading to disease. Now scientists have revealed in more detail how Bace1 works.
"Our results show that mice that lack Bace1 proteins or are treated with inhibitors of the enzyme have difficulties in coordination and walking and also show reduced muscle strength," remarked Carmen Birchmeier, one of the authors of the paper, Professor at the Max-Delbrück-Center for Molecular Medicine in Berlin, Germany, and an EMBO Member. "In addition, we were able to show that the combined activities of Bace1 and another protein, neuregulin-1 or Nrg1, are needed to sustain the muscle spindles in mice and to maintain motor coordination."
Muscle spindles are sensory organs that are found throughout the muscles of vertebrates. They are able to detect how muscles stretch and convey the perception of body position to the brain. The researchers used genetic analyses, biochemical studies and interference with pharmacological inhibitors to investigate how Bace1 works in mice. “If the signal strength of a specific form of neuregulin-1 known as IgNrg1 is gradually reduced, increasingly severe defects in the formation and maturation of muscle spindles are observed in mice. Furthermore, it appears that Bace1 is required for full IgNrg1 activity. The graded loss of IgNrg1 activity results in the animals having increasing difficulties with movement and coordination,” says Cyril Cheret, the first author of the work.
Drug developers are interested in stopping the Bace1 protein in its tracks because it represents a promising route to treat Alzheimer’s disease. If the protein were inhibited, it would interfere with the generation of the smaller damaging proteins that accumulate in the brain as amyloid plaques and would therefore provide some level of protection from the effects of the disease. “Our data indicate that one unwanted side effect of the long-term inhibition of Bace1 might be the disruption of muscle spindle formation and impairment of movement. This finding is relevant to scientists looking for ways to develop drugs that target the Bace1 protein and should be considered,” says Birchmeier. Several Bace1 inhibitors are currently being tested in phase II and phase III clinical trials for the treatment of Alzheimer’s disease.

Filed under alzheimer's disease dementia neurodegenerative diseases movement impairment BACE1 muscle spindles neuroscience science
