Neuroscience

Articles and news from the latest research reports.

Posts tagged neuroscience

39 notes

Swedish study suggests reduced risk of dementia

A new Swedish study published in the journal Neurology suggests that the risk of developing dementia may have declined over the past 20 years, in direct contrast to what many had previously assumed. The result is based on data from SNAC-K, an ongoing study on aging and health that started in 1987.

"We know that cardiovascular disease is an important risk factor for dementia. The suggested decrease in dementia risk coincides with the general reduction in cardiovascular disease over recent decades", says Associate Professor Chengxuan Qiu of the Aging Research Center, established by Karolinska Institutet and Stockholm University. "Health check-ups and cardiovascular disease prevention have improved significantly in Sweden, and we now see results of this improvement reflected in the risk of developing dementia."

Dementia is a constellation of symptoms characterized by impaired memory and other mental functions. After age 75, dementia is commonly due to multiple causes, mainly Alzheimer's disease and vascular dementia. The current study included more than 3,000 participants aged 75 and older living in the central Stockholm neighborhood of Kungsholmen; of these, 523 were diagnosed with some form of dementia. The key members of the research group, including the neurologist responsible for the clinical diagnoses of dementia, have remained essentially the same since 1987. All study participants were assessed by a nurse, a physician, and a psychologist.

The results show that the prevalence of dementia was stable in both men and women across all age groups over 75 throughout the study period (1987-1989 and 2001-2004), even though the survival of people with dementia has increased since the end of the 1980s. This implies that the overall risk of developing dementia must have declined during the period, possibly thanks to prevention and better treatment of cardiovascular disease.
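
The inference here rests on a back-of-the-envelope relation from epidemiology: in a steady state, prevalence is roughly incidence times mean disease duration (survival). A toy sketch, using invented numbers rather than the study's data, shows why flat prevalence plus longer survival implies falling incidence:

```python
# Toy illustration, not SNAC-K data: in a steady state,
# prevalence ~= incidence * mean disease duration (survival).
def implied_incidence(prevalence, mean_survival_years):
    """Back out the annual incidence implied by a steady-state prevalence."""
    return prevalence / mean_survival_years

# Hypothetical numbers chosen only to show the direction of the effect:
# same prevalence in both periods, but survival with dementia has lengthened.
late_1980s = implied_incidence(prevalence=0.175, mean_survival_years=3.5)
early_2000s = implied_incidence(prevalence=0.175, mean_survival_years=5.0)

print(f"implied incidence, late 1980s:  {late_1980s:.3f} per person-year")
print(f"implied incidence, early 2000s: {early_2000s:.3f} per person-year")
```

With survival up and prevalence unchanged, the implied incidence drops, which is exactly the study's inference.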

"The reduction of dementia risk is a positive phenomenon, but it is important to remember that the number of people with dementia will continue to rise along with the increase in life expectancy and absolute numbers of people over age 75", says Professor Laura Fratiglioni, Director of the Aging Research Center. "This means that the societal burden of dementia and the need for medical and social services will continue to increase. Today there’s no way to cure patients who have dementia. Instead we must continue to improve health care and prevention in this area."

(Source: ki.se)

Filed under dementia dementia risk aging SNAC-K cardiovascular disease neuroscience science

184 notes

Video games: bad or good for your memory?

After the horrific shooting sprees at Columbine High School in 1999 and Virginia Tech in 2007, players of violent video games, such as First Person Shooter (FPS) games, have often been accused in the media of being impulsive, antisocial, or aggressive.

Positive effects

However, the question is: do First Person Shooter games also have positive effects on our mental processes? At the University of Leiden, we investigated whether gaming could be a fast and easy way to improve your memory.

Develop an adaptive mindset

Indeed, the new generation of FPS games (as opposed to strategy games) is not just about pressing a button at the right moment: players must develop an adaptive mindset to rapidly react to and monitor fast-moving visual and auditory stimuli.

Gamers compared to non-gamers

In a study published in the journal Psychological Research, Dr. Lorenza Colzato and her fellow researchers compared people who played FPS games at least five hours weekly with people who never played video games on a task related to working memory.

More flexible brain

The researchers found that gamers outperformed non-gamers. They suggest that video game experience trains the brain to become more flexible in updating and monitoring new information, enhancing the gamers' memory capacity.
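
The kind of "updating and monitoring" demand involved can be illustrated with an n-back-style sequence, a standard working-memory updating measure (used here purely as an illustration, not necessarily the exact task in the study): the participant must continuously hold the last n items in mind and judge whether each new item matches the one n steps back.

```python
def n_back_targets(stream, n=2):
    """Indices whose item matches the item n steps earlier -- the 'targets'
    a participant must detect while continuously updating the last n items
    held in working memory."""
    return [i for i in range(n, len(stream)) if stream[i] == stream[i - n]]

def accuracy(stream, responses, n=2):
    """Proportion of correct yes/no 'match' judgments, given one boolean
    response per position from index n onward."""
    truth = [stream[i] == stream[i - n] for i in range(n, len(stream))]
    return sum(t == r for t, r in zip(truth, responses)) / len(truth)

stream = list("ABABCBCA")
print(n_back_targets(stream))   # positions that match two steps back
print(accuracy(stream, [True, True, False, True, True, False]))
```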

Filed under memory working memory first person shooter games gaming video games psychology neuroscience science

133 notes

Autistic Children’s Love For Video Games Could Lead To New Treatment Options

Kids and teenagers with autism spectrum disorder (ASD) are more likely to use television and video games and less likely to spend time on social media than their typically developing counterparts, according to new research set for publication in a future issue of the Journal of Autism and Developmental Disorders.

Micah Mazurek, an assistant professor of health psychology and a clinical child psychologist at the University of Missouri, recruited 202 children and adolescents with ASD and 179 of their typically developing siblings for the study.

Those with ASD spent more time playing video games and watching TV than on physical or pro-social activities (including time on websites like Facebook or Twitter). The opposite was true of typically developing children, who spent more time on non-screen activities than watching shows or playing on the PS3 or Xbox 360, according to the soon-to-be-published study.

“Many parents and clinicians have noticed that children with ASD are fascinated with technology, and the results of our recent studies certainly support this idea,” Mazurek said in a statement. “We found that children with ASD spent much more time playing video games than typically developing children, and they are much more likely to develop problematic or addictive patterns of video game play.”

In a separate study of 169 boys with ASD, excessive video game use was linked to oppositional behaviors, such as refusing to follow directions or getting into arguments with others. Mazurek said these issues will need to be examined further in future, closely controlled research.

“Because these studies were cross-sectional, it is not clear if there is a causal relationship between video game use and problem behaviors,” she said. “Children with ASD may be attracted to video games because they can be rewarding, visually engaging and do not require face-to-face communication or social interaction. Parents need to be aware that, although video games are especially reinforcing for children with ASD, children with ASD may have problems disengaging from these games.”

Despite those issues, Mazurek also believes that autistic children’s love for video games and television could be used for beneficial purposes. The professor believes that discovering what makes these screen-related pastimes so attractive to kids with ASD could help researchers and medical experts develop new treatment options.

“Using screen-based technologies, communication and social skills could be taught and reinforced right away,” Mazurek explained. “However, more research is needed to determine whether the skills children with ASD might learn in virtual reality environments would translate into actual social interactions.”

Filed under autism ASD video games gaming social interaction psychology neuroscience science

149 notes

High Levels of Glutamate in Brain May Kick-Start Schizophrenia

An excess of the brain neurotransmitter glutamate may cause a transition to psychosis in people who are at risk for schizophrenia, reports a study from investigators at Columbia University Medical Center (CUMC) published in the current issue of Neuron.

The findings suggest 1) a potential diagnostic tool for identifying those at risk for schizophrenia and 2) a possible glutamate-limiting treatment strategy to prevent or slow progression of schizophrenia and related psychotic disorders.

“Previous studies of schizophrenia have shown that hypermetabolism and atrophy of the hippocampus are among the most prominent changes in the patient’s brain,” said senior author Scott Small, MD, Boris and Rose Katz Professor of Neurology at CUMC. “The most recent findings had suggested that these changes occur very early in the disease, which may point to a brain process that could be detected even before the disease begins.”

To locate that process, the Columbia researchers used neuroimaging tools in both patients and a mouse model. First they followed a group of 25 young people at risk for schizophrenia to determine what happens to the brain as patients develop the disorder. In patients who progressed to schizophrenia, they found the following pattern: First, glutamate activity increased in the hippocampus, then hippocampus metabolism increased, and then the hippocampus began to atrophy.

To see if the increase in glutamate led to the other hippocampus changes, the researchers turned to a mouse model of schizophrenia. When the researchers increased glutamate activity in the mouse, they saw the same pattern as in the patients: The hippocampus became hypermetabolic and, if glutamate was raised repeatedly, the hippocampus began to atrophy.

Theoretically, this dysregulation of glutamate and hypermetabolism could be identified through imaging individuals who are either at risk for or in the early stage of disease. For these patients, treatment to control glutamate release might protect the hippocampus and prevent or slow the progression of psychosis.

Strategies to treat schizophrenia by reducing glutamate have been tried before, but with patients in whom the disease is more advanced. “Targeting glutamate may be more useful in high-risk people or in those with early signs of the disorder,” said Jeffrey A. Lieberman, MD, a renowned expert in the field of schizophrenia, Chair of the Department of Psychiatry at CUMC, and president-elect of the American Psychiatric Association. “Early intervention may prevent the debilitating effects of schizophrenia, increasing recovery in one of humankind’s most costly mental disorders.”

In an accompanying commentary, Bita Moghaddam, PhD, professor of neuroscience and of psychiatry, University of Pittsburgh, suggests that if excess glutamate is driving schizophrenia in high-risk individuals, it may also explain why a patient’s first psychotic episodes are often caused by periods of stress, since stress increases glutamate levels in the brain.

Filed under schizophrenia psychotic disorders brain neurons glutamate hippocampus hypermetabolism neuroscience science

106 notes

Increased brain activity predicts future onset of substance use

Do people get caught in the cycle of overeating and drug addiction because their brain reward centers are overactive, causing them to experience greater cravings for food or drugs? In a unique prospective study, Oregon Research Institute (ORI) senior scientist Eric Stice, Ph.D., and colleagues tested this theory, called the reward surfeit model. The results indicated that elevated responsivity of reward regions in the brain increased the risk for future substance use, something that had never before been tested prospectively in humans. Paradoxically, the results also provide evidence that even a limited history of substance use was related to less responsivity in the reward circuitry, as experiments with animals had suggested. The research appears in the May 1, 2013 issue of Biological Psychiatry.

In a novel study using functional magnetic resonance imaging (fMRI), Stice's team tested whether individual differences in reward region responsivity predicted overweight/obesity onset among initially healthy-weight adolescents and substance use onset among initially abstinent adolescents. The neural response to food and monetary reward was measured in 162 adolescents. Body fat and substance use were assessed at the time of the fMRI scan and again one year later.

"The findings are important because this is the first test of whether atypical responsivity of reward circuitry increases risk for substance use," says Dr. Stice. "Although numerous researchers have suggested that reduced responsivity is a vulnerability factor for substance use, this theory was based entirely on cross-sectional studies comparing substance abusing individuals to healthy controls; no studies have tested this thesis with prospective data."

Investigators examined the extent to which reward circuitry (e.g., the striatum) was activated in response to receipt and anticipated receipt of money. Monetary reward is a general reinforcer and has been used frequently to assess reward sensitivity. The team also used another paradigm to assess brain activation in response to the individual’s consumption and anticipated consumption of chocolate milkshake. Results showed that greater activation in the striatum during monetary reward receipt at baseline predicted future substance use onset over a 1-year follow-up.
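
The prospective design can be sketched simply: measure each adolescent's baseline striatal response, wait a year, then compare substance-use onset rates between higher and lower responders. The records below are invented for illustration, not the ORI dataset:

```python
# Hypothetical data sketch (not ORI's dataset): each record is an initially
# abstinent adolescent's baseline striatal response to monetary reward and
# whether they began substance use by the one-year follow-up.
cohort = [
    {"striatal_response": 0.9, "onset": True},
    {"striatal_response": 0.8, "onset": True},
    {"striatal_response": 0.7, "onset": False},
    {"striatal_response": 0.4, "onset": False},
    {"striatal_response": 0.3, "onset": True},
    {"striatal_response": 0.2, "onset": False},
]

def onset_rate(group):
    """Fraction of a group that began substance use by follow-up."""
    return sum(p["onset"] for p in group) / len(group)

median = sorted(p["striatal_response"] for p in cohort)[len(cohort) // 2]
high = [p for p in cohort if p["striatal_response"] >= median]
low = [p for p in cohort if p["striatal_response"] < median]

# The study's direction of effect: higher baseline reward-region
# responsivity goes with a higher rate of later substance-use onset.
print(f"onset rate, high responders: {onset_rate(high):.2f}")
print(f"onset rate, low responders:  {onset_rate(low):.2f}")
```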

Noteworthy was that adolescents who had already begun using substances showed less striatal response to monetary reward. This finding provides the first evidence that even a relatively short period of moderate substance use might reduce reward region responsivity to a general reinforcer.

"The implications are that the more individuals use psychoactive substances, the less responsive they will be to rewarding experiences, meaning that they may derive less reinforcement from other pursuits, such as interpersonal relationships, hobbies, and school work. This may contribute to the escalating spiral of drug use that characterizes substance use disorders," commented Stice.

Although the investigators had expected parallel neural predictors of future onset of overweight during exposure to receipt and anticipated receipt of a palatable food, no significant effects emerged. It is possible that these effects are weaker and that a longer follow-up period will be necessary to better differentiate who will gain weight and who will remain at a healthy weight.

(Image courtesy: West Virginia University)

Filed under brain activity drug addiction reward surfeit model reward center fMRI substance use neuroscience science

59 notes

Big boost in drug discovery: New use for stem cells identifies a promising way to target ALS

Using a new, stem cell-based, drug-screening technology that could reinvent and greatly reduce the cost of developing pharmaceuticals, researchers at the Harvard Stem Cell Institute (HSCI) have found a compound that is more effective in protecting the neurons killed in amyotrophic lateral sclerosis (ALS) than are two drugs that failed in human clinical trials after large sums were invested in them.

The new screening technique developed by Lee Rubin, a member of HSCI’s executive committee and a professor in Harvard’s Department of Stem Cell and Regenerative Biology (SCRB), had predicted that the two drugs that eventually failed in the third and final stage of human testing would do just that.

“It’s a deep, dark secret of drug discovery that very few drugs have been tested on human-diseased cells before being tested in a live person,” said Rubin, who heads HSCI’s program in translational medicine. “We were interested in the notion that we can use stem cells to correct that situation.”

Rubin’s model is built on an earlier proof of concept developed by HSCI principal faculty member Kevin Eggan, who demonstrated that it was possible to move a neuron-based disease into a laboratory dish using stem cells carrying the genes of patients with the disease.

In a paper published today in the journal Cell Stem Cell, Rubin laid out how he and his colleagues applied their new method of stem cell-based drug discovery to ALS, also known as Lou Gehrig’s disease. The illness is associated with the progressive death of motor neurons, which pass information between the brain and the muscles. As cells die, people with ALS experience weakness in their limbs, followed by rapid paralysis and respiratory failure. The disease typically strikes later in life. Ten percent of cases are genetically predisposed, but for most patients there is no known trigger.

Rubin’s lab began by studying the disease in mice, growing billions of motor neurons from mouse embryonic stem cells, half normal and half with a genetic mutation known to cause ALS. Investigators starved the cells of nutrients and then screened 5,000 druglike molecules to find any that would keep the motor neurons alive.

Several hits were identified, but the molecule that best prolonged the life of both normal and ALS motor neurons was kenpaullone, previously known for blocking the action of an enzyme (GSK-3) that switches on and off several cellular processes, including cell growth and death. “Shockingly, this molecule keeps cells alive better than the standard culture medium that everybody keeps motor neurons in,” Rubin said.
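
The screening logic itself is straightforward to sketch: starve the neurons, apply each compound, and keep whichever ones beat the untreated survival baseline. The numbers below are invented for illustration, not HSCI's measurements:

```python
# Toy sketch of the screen's selection step: keep compounds whose treated
# motor-neuron survival exceeds the untreated (starved) baseline.
untreated_survival = 0.20  # fraction of starved neurons alive with no compound

# Hypothetical screen results: compound name -> surviving fraction.
screen = {
    "kenpaullone": 0.85,
    "compound_A": 0.22,
    "compound_B": 0.55,
    "compound_C": 0.18,
}

hits = {name: s for name, s in screen.items() if s > untreated_survival}
best = max(hits, key=hits.get)  # the hit that best prolonged survival

print(f"hits: {sorted(hits)}")
print(f"best hit: {best}")
```

In the real screen the candidate pool was 5,000 drug-like molecules rather than four, but the keep-what-beats-baseline step is the same.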

Kenpaullone proved effective in several follow-up experiments that put mouse motor neurons in situations of certain death. Neuron survival increased in the presence of the molecule whether the cells were programmed to die or were placed in a toxic environment.

After further investigation, Rubin’s lab discovered that kenpaullone’s potency came from its ability also to inhibit HGK, an enzyme that sets off a chain of reactions that leads to motor neuron death. This enzyme was not previously known to be important in motor neurons or associated with ALS, marking the discovery of a new drug target for the disease.

“I think that stem cell screens will discover new compounds that have never been discovered before by other methods,” Rubin said. “I’m excited to think that someday one of them might actually be good enough to go into the clinic.”

To find out whether kenpaullone worked in diseased human cells, Rubin’s lab exposed patient motor neurons and motor neurons grown from human embryonic stem cells to the molecule, as well as to two drugs that had done well in mice but failed in phase III human clinical trials for ALS. Once again, kenpaullone increased the rate of neuron survival, while one drug produced little response and the other failed to keep any cells alive.

According to Rubin, before kenpaullone could be used as a drug, it would need a substantial molecular makeover to make it better able to target cells and find its way into the spinal cord so it can access motor neurons.

“This is kind of a proof of principle on the do-ability of the whole thing,” he said. “I think it’s possible to use this method to discover new drug targets and to prevalidate compounds on real human disease cells before putting them in the clinic.”

Rubin’s next steps will be to continue searching for better druglike compounds that can inhibit HGK and thus enhance motor neuron survival. He believes that the new information that comes out of this research will be useful to academia and the pharmaceutical industry.

“These kinds of exploratory screens are hard to fund, so being part of the HSCI” — which provided some of the funding — “has been absolutely essential,” Rubin said.

(Source: news.harvard.edu)

Filed under ALS Lou Gehrig’s disease neurons motor neurons stem cells medicine neuroscience science

52 notes

Science surprise: Toxic protein made in unusual way may explain brain disorder

A bizarre twist on the usual way proteins are made may explain mysterious symptoms in the grandparents of some children with mental disabilities.

The discovery, made by a team of scientists at the University of Michigan Medical School, may lead to better treatments for older adults with a recently discovered genetic condition.

The condition, called Fragile X-associated Tremor Ataxia Syndrome (FXTAS), causes shakiness and balance problems and is often misdiagnosed as Parkinson’s disease. The grandchildren of people with the disease have a separate disorder called Fragile X syndrome, caused by problems in the same gene. The new discovery may also help shine light on that disease, though indirectly.

In a new paper published in the journal Neuron, the U-M-led team presents evidence that a toxic protein they’ve named FMRpolyG contributes to the death of nerve cells in FXTAS – and that this protein is made in a very unusual way.

Normally, DNA is transcribed into RNA, and then a part of the RNA is translated into a protein that performs its function in cells. Where this translation process starts on the RNA is usually determined by a specific sequence called a start codon.

The gene mutation that causes FXTAS is a repeated DNA sequence that is made into RNA but normally is not made into protein because it lacks a start codon. However, the investigators discovered that when this repeat expands, it can trigger protein production through a mechanism known as repeat-associated non-AUG (RAN) translation.
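
The "lacks a start codon" point can be shown directly: canonical translation begins at the first AUG in the message, and a pure CGG repeat contains no AUG at all. A minimal sketch, simplifying away Kozak context and ribosome-scanning details:

```python
def find_start_codon(rna):
    """Index of the first AUG in an RNA string, or -1 if there is none."""
    return rna.find("AUG")

normal_mrna = "GGCAUGGCUUCA"  # contains an AUG: canonically translated
cgg_repeat = "CGG" * 8        # expanded repeat: no AUG anywhere

print(find_start_codon(normal_mrna))  # an AUG is found
print(find_start_codon(cgg_repeat))   # -1: yet RAN translation makes protein anyway
```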

Corresponding author Peter Todd, M.D., Ph.D., notes that this unusual translation process appears to stem from a long chain of repeated DNA “letters” found in the genes of both grandparents and kids with Fragile X mutations. Todd is the Bucky and Patti Harris Professor in the U-M Department of Neurology.

"Essentially, we’ve found that a sequence of DNA which shouldn’t be made into protein is being made into protein – and that this causes a toxicity in nerve cells," he explains. "We believe that the protein forms aggregates, and that this is a major contributor to toxicity and symptoms in FXTAS."

The U-M group went on to show how this RAN translation occurs in FXTAS and demonstrated that blocking it prevents the repeat mutation from being toxic, suggesting a new target for future treatments.

Fragile X-associated tremor/ataxia syndrome, or FXTAS, was discovered only a decade ago. It may affect as many as one in every 3,000 men and one in every 20,000 women who carry a repeat mutation in the gene known as FMR1. However, these patients usually don’t develop symptoms until late middle age, by which time they may have passed the mutation on to their daughters, whose children can inherit a DNA repeat that has grown much longer. In those children, especially boys, it can cause severe intellectual disability and autism-like symptoms as the FMR1 gene shuts down and none of the normal protein is produced.

In fact, says Todd, it’s often only after a child is diagnosed with Fragile X syndrome through genetic testing that their grandfather or grandmother finds out that their own symptoms stem from FXTAS. Doctors in U-M’s Neurogenetics clinic for adults, and the Pediatric Genetics Clinic at U-M’s C.S. Mott Children’s Hospital, routinely work together to address the needs of Fragile X families.

"We have some treatments for the symptoms that FXTAS patients have, but we do not yet have a cure," says Todd, who regularly sees patients with FXTAS and related disorders. "Better treatments are needed – and this new discovery might help lead to novel strategies for clearing away or preventing the buildup of this toxic protein."

In addition, he says, the discovery that Fragile X ataxia results in part from RAN translation could have significance both for other diseases like amyotrophic lateral sclerosis (ALS, also called Lou Gehrig’s disease) and certain forms of dementia that are caused by DNA repeats. It can also aid our understanding of basic biology. “This may represent a new way in which translational initiation events occur, and may have importance beyond this one disease,” he notes. Further research on how RAN translation occurs, and why, is needed.

The idea that proteins can be created without a “start site” flies in the face of what most students of biology have learned in the last century. “In biology, we’re finding that the rules we once thought were hard and fast have some wiggle room,” Todd says.

(Source: eurekalert.org)

Filed under fragile x syndrome toxic protein nerve cells gene mutation DNA sequence neuroscience science

62 notes

Bursts of Brain Activity May Protect Against Alzheimer’s Disease

Tel Aviv University research reveals the missing link between brain patterns and Alzheimer’s


Evidence indicates that the accumulation of amyloid-beta proteins, which form the plaques found in the brains of Alzheimer’s patients, is critical for the development of Alzheimer’s disease, which affects 5.4 million Americans. Not just the quantity but also the quality of amyloid-beta peptides is crucial for the initiation of Alzheimer’s. The disease is triggered by an imbalance between two amyloid species: in Alzheimer’s patients, the level of the healthier amyloid-beta 40 is reduced relative to amyloid-beta 42.

Now Dr. Inna Slutsky of Tel Aviv University’s Sackler Faculty of Medicine and the Sagol School of Neuroscience, together with postdoctoral fellow Dr. Iftach Dolev and PhD student Hilla Fogel, has uncovered two main features of brain circuits that impact this crucial balance. The researchers found that patterns of electrical pulses (called “spikes”) in the form of high-frequency bursts, together with the filtering properties of synapses, are crucial to the regulation of the amyloid-beta 40/42 ratio. Synapses that transfer information in spike bursts improve the amyloid-beta 40/42 ratio.

This represents a major advance: it shows that brain circuits regulate the composition of amyloid-beta proteins, and that the disease is driven not only by genetic mutations but by physiological mechanisms as well. The findings were recently reported in the journal Nature Neuroscience.

Tipping the balance

High-frequency bursts in the brain are critical for brain plasticity, information processing, and memory encoding. To check the connection between spike patterns and the regulation of amyloid-beta 40/42 ratio, Dr. Dolev applied electrical pulses to the hippocampus, a brain region involved in learning and memory.

When the rate of single pulses at low frequencies was increased in rat hippocampal slices, levels of both amyloid-beta 40 and 42 grew, but the 40/42 ratio remained the same. However, when the same number of pulses was delivered in high-frequency bursts, the researchers observed increased amyloid-beta 40 production. In addition, they found that only synapses optimized to transfer information encoded by bursts contributed to tipping the balance in favor of amyloid-beta 40. Further investigations conducted by Fogel revealed that the connection between spiking patterns and the type of amyloid-beta produced could revolve around a protein called presenilin. “We hypothesize that changes in the temporal patterns of spikes in the hippocampus may trigger structural changes in presenilin, leading to early memory impairments in people with sporadic Alzheimer’s,” explains Dr. Slutsky.
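The arithmetic behind this finding can be sketched with hypothetical numbers (the values below are invented for illustration and are not measurements from the study): when both species rise proportionally, the 40/42 ratio is unchanged; when amyloid-beta 40 rises selectively, the ratio shifts.

```python
# Hypothetical numbers, for illustration only: the 40/42 ratio stays
# flat when both species rise by the same factor (single low-frequency
# pulses), but shifts toward amyloid-beta 40 when production of the 40
# species rises selectively (high-frequency bursts).

def ab_ratio(ab40: float, ab42: float) -> float:
    return ab40 / ab42

baseline      = ab_ratio(8.0, 2.0)   # 4.0
single_pulses = ab_ratio(12.0, 3.0)  # both up 1.5x -> ratio unchanged: 4.0
bursts        = ab_ratio(12.0, 2.0)  # only Ab40 up  -> ratio rises: 6.0

print(baseline, single_pulses, bursts)
```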

Behind the bursts

According to Dr. Slutsky, different kinds of environmental changes and experiences — including sensory and emotional experience — can modify the properties of synapses and change the spiking patterns in the brain. Previous research has suggested that a stimulant-rich environment could be a contributing factor in preventing the development of Alzheimer’s disease, much as crossword and similar puzzles appear to stimulate the brain and delay the onset of Alzheimer’s. In the recent study, the researchers discovered that changes in sensory experiences also regulate synaptic properties — leading to an increase in amyloid-beta 40.

In the next stage, Dr. Slutsky and her team aim to manipulate activity patterns in specific hippocampal pathways of Alzheimer’s models to test whether this can prevent the onset of cognitive impairment. The ability to monitor the dynamics of synaptic activity in humans would be a step toward early diagnosis of sporadic Alzheimer’s.

(Source: aftau.org)

Filed under brain brain circuits amyloid beta proteins alzheimer's disease plasticity neurons neuroscience science

158 notes

Hologram-like 3-D brain helps researchers decode migraine pain

Wielding a joystick and wearing special glasses, pain researcher Alexandre DaSilva rotates and slices apart a large, colorful, 3-D brain floating in space before him.

Despite the white lab coat, it appears DaSilva’s playing the world’s most advanced virtual video game. The University of Michigan dentistry professor is actually hoping to better understand how our brains make their own pain-killing chemicals during a migraine attack.

The 3-D brain is a novel way to examine data from images taken during a patient’s actual migraine attack, says DaSilva, who heads the Headache and Orofacial Pain Effort at the U-M School of Dentistry and the Molecular and Behavioral Neuroscience Institute.

Different colors in the 3-D brain give clues about chemical processes happening during a patient’s migraine attack, as captured by a PET (positron emission tomography) scan, a type of medical imaging.

"This high level of immersion (in 3-D) effectively places our investigators inside the actual patient’s brain image," DaSilva said.

The 3-D research occurs in the U-M 3-D Lab, part of the U-M Library.

Filed under virtual reality migraine 3-D brain brain positron emission tomography pain neuroscience science

49 notes

First steps of synapse building captured in live zebra fish embryos

Using spinning disk microscopy on barely day-old zebra fish embryos, University of Oregon scientists have gained a new window on how synapse-building components move to worksites in the central nervous system.

What researchers captured in these see-through embryos — in what may be one of the first views of early glutamate-driven synapse formation in a living vertebrate — were orderly movements of protein-carrying packets along axons to a specific site where a synapse would be formed.

The discovery, in research funded by the National Institutes of Health, is described in a paper published online ahead of print in the April 25 issue of the open-access journal Cell Reports. It is noteworthy because most synapses formed in vertebrates use glutamate as a neurotransmitter, and breakdowns in the process have been tied to conditions such as autism, schizophrenia and mental retardation.

The zebra fish has become one of the leading research models for studying early development in general and human disease states.

In this case, researchers used immunofluorescence labeling to highlight the area they put under the microscope. The embryos they studied were barely 24 hours old and a millimeter in length, but neurons in their spinal cords were already forming connections called synapses. Images were taken every 30 seconds over two hours.

"If we zoom out a bit and look at development in the human, the majority of synapse formation occurs in the cortex after birth and continues for the first two years in a baby’s life," said Philip Washbourne, a professor of biology and member of the UO’s Institute of Neuroscience.

Previous studies, done in vitro, contradicted each other: one, in 2000, identified a single packet of building blocks arriving at a pre-synaptic terminal; the other, in 2004, identified two protein packets. After watching the process unfold live, with imaging over long time spans, Washbourne said: “We now see at least three, and maybe more, such deliveries.”

"Axons are long processes — think of them as highways — of neurons. In humans, these can be a meter long, from spinal cord to your big toe," he said. "It’s in the cell body where all the proteins are made, and they have to be transported out. Is it done by a single bus or by several cars? These results point to additional layers of complexity in the established mechanisms of synaptogenesis."

The new research also showed that sequence is crucial. Two different pre-synaptic packages of molecules repeatedly arrived in the same order. A key building block — the protein synapsin — always arrived third. As these delivery vehicles traveled the axonal highway, another protein, a cyclin-dependent kinase known as Cdk5, acted as a stoplight at the synapse-construction site, where phosphorylation occurs. More research is needed on Cdk5, Washbourne said.

"Understanding how all this happens will tell us what’s going wrong in neurodevelopment that leads to diseases," Washbourne said. "We have indications that the glue that gets all this going includes a gene that has been linked to autism, so knowing how these molecules start the process of synapse formation — and what goes wrong in people with mutations in these genes — might allow for therapeutic targeting to correct the mutations and manipulate the stop signs."

Filed under zebrafish CNS glutamate synapses neurotransmitters autism schizophrenia mental retardation neuroscience science
