Neuroscience

Articles and news from the latest research reports.

Posts tagged science

36 notes

Parkinson’s Disease Protein Gums up Garbage Disposal System in Cells

Clumps of α-synuclein protein in nerve cells are hallmarks of many degenerative brain diseases, most notably Parkinson’s disease.


“No one has been able to determine if Lewy bodies and Lewy neurites, hallmark pathologies in Parkinson’s disease, can be degraded,” says Virginia Lee, PhD, director of the Center for Neurodegenerative Disease Research at the Perelman School of Medicine, University of Pennsylvania.

“With the new neuron model system of Parkinson’s disease pathologies our lab has developed recently, we demonstrated that these aberrant clumps in cells resist degradation as well as impair the function of the macroautophagy system, one of the major garbage disposal systems within the cell.”

Macroautophagy, literally “self-eating,” is the degradation of unnecessary or dysfunctional cellular bits and pieces by a compartment in the cell called the lysosome.

Lee, also a professor of Pathology and Laboratory Medicine, and colleagues published their results in the early online edition of the Journal of Biological Chemistry this week.

Alpha-synuclein (α-syn) diseases all feature clumps of the protein; they include Parkinson’s disease (PD) and an array of related disorders: PD with dementia, dementia with Lewy bodies, and multiple system atrophy. In most of these, α-syn forms insoluble aggregates of stringy fibrils that accumulate in the cell body and extensions of neurons.

These unwanted α-syn clumps are modified by abnormal attachments of many phosphate chemical groups as well as by the protein ubiquitin, a molecular tag for degradation. They are widely distributed in the central nervous system, where they are associated with neuron loss.

Using cell models in which intracellular α-syn clumps accumulate after taking up synthetic α-syn fibrils, the team showed that α-syn inclusions cannot be degraded, even though they are located near the lysosome and the proteasome, another type of garbage disposal in the cell.

The α-syn aggregates persist even after soluble α-syn levels within the cell are substantially reduced, suggesting that once formed, the α-syn inclusions are resistant to being cleared. What’s more, they found that α-syn aggregates impair the overall autophagy degradative process by delaying the maturation of autophagy machines known as autophagosomes, which may contribute to the increased cell death seen in clump-filled nerve cells. Understanding the impact of α-syn aggregates on autophagy may help elucidate therapies for α-syn-related neurodegeneration.

(Source: uphs.upenn.edu)

Filed under neurodegenerative diseases parkinson's disease nerve cells lysosome CNS autophagy neuroscience science

472 notes


Smoking genes predict risk

Your DNA may play a significant role in determining whether or not you end up a smoker – and how easy you find it to kick the habit.

Many large studies have identified particular gene variants that are more common in smokers than other people, suggesting that they play a role in nicotine dependence.

Now an international team of researchers has used these genetic clues to develop a ‘genetic risk profile’ and, to see how accurate it is, road-tested it on a well-known sample of Kiwis: the Dunedin Birth Cohort.

Researchers analysed data from the long-term study of 1,000 New Zealanders to identify whether individuals at high genetic risk got hooked on cigarettes more quickly as teens and whether, as adults, they had a harder time quitting.

The results, published in JAMA Psychiatry, showed that a person’s genetic risk profile did not predict whether he or she would try cigarettes. But for those who did try cigarettes, having a high-risk genetic profile predicted increased likelihood of heavy smoking and nicotine dependence.

This link was most apparent for teenagers: among teens who tried cigarettes, those with a high-risk genetic profile were 24 percent more likely to become daily smokers by age 15 and 43 percent more likely to become pack-a-day smokers by age 18.

As adults, those with high-risk genetic profiles were 22 percent more likely to fail in their attempts at quitting.
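
The post doesn’t spell out how the risk profile was built, but polygenic profiles of this kind are typically a weighted sum of risk-allele counts across the implicated variants. A minimal sketch, with entirely made-up variant names and effect sizes (the study’s actual variants and weights are not given here):

```python
# Toy polygenic risk score: a weighted sum of risk-allele counts.
# Variant IDs and weights below are hypothetical, for illustration only.

def risk_score(genotype, weights):
    """genotype: dict of variant -> risk-allele count (0, 1, or 2).
    weights: dict of variant -> per-allele effect size."""
    return sum(weights[v] * genotype.get(v, 0) for v in weights)

weights = {"rs1": 0.30, "rs2": 0.15, "rs3": 0.22}   # hypothetical effect sizes
person = {"rs1": 2, "rs2": 1, "rs3": 0}             # one person's allele counts

print(round(risk_score(person, weights), 2))  # 0.75
```

Individuals are then ranked by score and compared, e.g. top decile ("high-risk profile") against the rest.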

“The effects of genetic risk seem to be limited to people who start smoking as teens,” said author Daniel Belsky, a post-doctoral research fellow at Duke University.

“This suggests there may be something special about nicotine exposure in the adolescent brain, with respect to these genetic variants.”

The authors noted that their genetic risk profile isn’t yet accurate enough to be used for targeted interventions to prevent at-risk teens from smoking, but it does highlight the critical adolescent period in addiction development.

“Public health policies that make it harder for teens to become regular smokers should continue to be a focus in antismoking efforts,” Belsky said.

Filed under smoking nicotine dependence adolescent brain genes genetics neuroscience science

35 notes

Surgical menopause may prime brain for stroke, Alzheimer’s

Women who abruptly and prematurely lose estrogen from surgical menopause have a two-fold increase in cognitive decline and dementia.


"This is what the clinical studies indicate and our animal studies looking at the underlying mechanisms back this up," said Brann, corresponding author of the study in the journal Brain. “We wanted to find out why that is occurring. We suspect it’s due to the premature loss of estrogen.”

In an effort to mimic what occurs in women, Brann and his colleagues examined rats 10 weeks after removal of their estrogen-producing ovaries. The animals had either been started immediately on low-dose estrogen therapy, started on therapy 10 weeks later, or never given estrogen.

When the researchers caused a stroke-like event in the brain’s hippocampus, a center of learning and memory, they found the rodents treated late or not at all experienced more brain damage, specifically to a region of the hippocampus called CA3 that is normally stroke-resistant.

To make matters worse, untreated or late-treated rats also began an abnormal, robust production of Alzheimer’s disease-related proteins in the CA3 region, even becoming hypersensitive to one of the most toxic of the beta amyloid proteins that are a hallmark of Alzheimer’s.

Both problems appear associated with the increased production of free radicals in the brain. In fact, when the researchers blocked the excessive production, heightened stroke sensitivity and brain cell death in the CA3 region were reduced.

Interestingly, the brain’s increased sensitivity to stressors such as inadequate oxygen was gender-specific, Brann said. Removing the testes of male rats didn’t affect stroke size or damage.

Although exactly how it works is unknown, estrogen appears to help protect younger females from problems such as stroke and heart attack; their risk of these maladies increases after menopause to roughly that of males. Follow-up studies are needed to see if estrogen therapy also reduces sensitivity to the beta amyloid protein in the CA3 region, as the researchers expect, Brann noted.

Brann earlier showed that prolonged estrogen deprivation in aging rats dramatically reduces the number of brain receptors for the hormone as well as its ability to prevent strokes. Damage was forestalled if estrogen replacement was started shortly after hormone levels drop, according to the 2011 study in the journal Proceedings of the National Academy of Sciences.

The much-publicized Women’s Health Initiative – a 12-year study of 161,808 women ages 50-79 – surprisingly found that hormone therapy generally increased rather than decreased stroke risk, along with other health problems. Critics said one problem with the study was that many of the women, like Brann’s aged rats, had gone years without hormone replacement, bolstering the case that timing is everything.

(Source: eurekalert.org)

Filed under beta amyloid brain damage cognitive decline dementia alzheimer's disease neuroscience science

93 notes


Should I trust my intuition?

Do we always make better decisions when we take more time to think? Or are there decisions where more time doesn’t really help?

A study led by Zachary Mainen, Director of the Champalimaud Neuroscience Programme, and published in the scientific journal, Neuron, reports that when rats were challenged with a series of perceptual decision problems, their performance was just as good when they decided rapidly as when they took a much longer time to respond. Despite being encouraged to slow down and try harder, the subjects of this study achieved their maximum performance in less than 300 milliseconds.
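
One way such a flat speed-accuracy relationship can arise is if the sensory evidence is reliable enough that accuracy saturates after only a brief sample. The toy signal-detection sketch below (an illustration, not the study’s analysis) has an agent average noisy samples of a stimulus and report its sign; accuracy climbs steeply with the first few samples and then plateaus, so extra deliberation buys almost nothing:

```python
# Toy signal-detection sketch: accuracy vs. number of evidence samples.
# Parameters (signal, noise, trial counts) are arbitrary, for illustration.
import random

def accuracy(n_samples, signal=1.0, noise=1.0, trials=10_000, seed=0):
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        # Sum n noisy samples of a positive stimulus; answer is its sign.
        evidence = sum(rng.gauss(signal, noise) for _ in range(n_samples))
        correct += evidence > 0
    return correct / trials

for n in (1, 3, 10, 30):
    print(n, round(accuracy(n), 3))  # accuracy saturates near 1.0 quickly
```

With these parameters, one sample already yields roughly 84% correct and ten samples are near ceiling, mirroring the idea that slowing down past a few hundred milliseconds adds no benefit.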

'There are many kinds of decisions, and for some, having more time appears to be of no help. In these cases, you'd better go with your intuition, and that's what our subjects did', explains Mainen, who began this work while an Associate Professor at CSHL in the USA.

This study suggests that rats can be used as an animal model to investigate what is happening in the human brain when ‘intuitive’ decisions are being made. ‘Decision-making is not a well-understood process, but it appears to be surprisingly similar among species. This study provides a basis to begin to take apart one type of decision and see how it really works’, the author adds.

(Image: Kristen Dold | Thinkstock)

Filed under decision-making animal model intuitive decisions neuroscience psychology science

228 notes


How herpesvirus invades nervous system

Northwestern Medicine scientists have identified a component of the herpesvirus that “hijacks” machinery inside human cells, allowing the virus to rapidly and successfully invade the nervous system upon initial exposure.

Led by Gregory Smith, associate professor in immunology and microbiology at Northwestern University Feinberg School of Medicine, researchers found that viral protein 1-2, or VP1/2, allows the herpesvirus to interact with cellular motors, known as dynein. Once the protein has overtaken this motor, the virus can speed along intracellular highways, or microtubules, to move unobstructed from the tips of nerves in the skin to the nuclei of neurons within the nervous system.

This is the first time researchers have shown a viral protein directly engaging and subverting the cellular motor; most other viruses passively hitch a ride into the nervous system.

"This protein not only grabs the wheel, it steps on the gas," says Smith. "Overtaking the cellular motor to invade the nervous system is a complicated accomplishment that most viruses are incapable of achieving. Yet the herpesvirus uses one protein, no others required, to transport its genetic information over long distances without stopping."

Herpesvirus is widespread in humans and affects more than 90 percent of adults in the United States. It is associated with several types of recurring diseases, including cold sores, genital herpes, chicken pox, and shingles. The virus can live dormant in humans for a lifetime, and most infected people do not know they are disease carriers. The virus can occasionally turn deadly, resulting in encephalitis in some.

Until now, scientists knew that herpesviruses travel quickly to reach neurons located deep inside the body, but the mechanism by which they advance remained a mystery.

Smith’s team conducted a variety of experiments with VP1/2 to demonstrate its important role in transporting the virus, including artificial activation and genetic mutation of the protein. The team studied the herpesvirus in animals, and also in human and animal cells in culture under high-resolution microscopy. In one experiment, scientists engineered a mutant virus carrying a slower form of the protein, labeled red, and raced it against a healthy virus labeled green. They observed that the healthy virus outran the mutated version down nerves to the neuron body to insert its DNA and establish infection.

"Remarkably, this viral protein can be artificially activated, and in these conditions it zips around within cells in the absence of any virus. It is striking to watch," Smith says.

He says that understanding how the viruses move within people, especially from the skin to the nervous system, can help better prevent the virus from spreading.

Additionally, Smith says, “By learning how the virus infects our nervous system, we can mimic this process to treat unrelated neurologic diseases. Even now, laboratories are working on how to use herpesviruses to deliver genes into the nervous system and kill cancer cells.”

Smith’s team will next work to better understand how the protein functions. He notes that many researchers use viruses to learn how neurons are connected to the brain.

"Some of our mutants will advance brain mapping studies by resolving these connections more clearly than was previously possible," he says.

Filed under herpesvirus dynein viral protein nervous system neurons infection neuroscience science

5,096 notes


Which Came First, the Head or the Brain?

The sea anemone, a cnidarian, has no brain. It does have a nervous system, and its body has a clear axis, with a mouth on one side and a basal disk on the other. However, there is no organized collection of neurons comparable to the kind of brain found in bilaterians, animals that have both bilateral symmetry and a top and bottom. (Most animals except sponges, cnidarians, and a few other phyla are bilaterians.) So an interesting evolutionary question is: which came first, the head or the brain? Do animals such as sea anemones, which lack a brain, have something akin to a head?

In this issue of PLOS Biology, Chiara Sinigaglia and colleagues report that at least some developmental pathways seen in cnidarians share a common lineage with head and brain development in bilaterians. It might seem intuitive to expect to find genes involved in brain development around the mouth of the anemone, and previous work has suggested that the oral region in cnidarians corresponds to the head region of bilaterians. However, there has been debate over whether the oral or aboral pole of cnidarians is analogous to the anterior pole of bilaterians. At the start of its life cycle a sea anemone exists as a free-swimming planula, which then attaches to a surface and develops into the adult anemone. The free-swimming phase bears an apical tuft, a sensory structure at the front of the swimming animal’s body; this is the part that attaches and becomes the aboral pole (the part distal from the mouth) of the adult.

To test whether genetic expression in the aboral pole of cnidarians does in fact resemble the head patterning seen in bilaterians, the researchers analyzed gene expression in Nematostella vectensis, a sea anemone found in estuaries and bays. They focused on the six3 and FoxQ2 transcription factors, as these genes are known to regulate development of the anterior-posterior axis in bilaterian species. (six3 knockout mice, for example, fail to develop a forebrain, and in humans, six3 is known to regulate the development of forebrain and eyes.)

The N. vectensis genome contains one gene from the six3/6 group and four foxQ2 genes. Sinigaglia and colleagues found that NvSix3/6 and one of the foxQ2 genes, NvFoxQ2a, were expressed predominantly at the aboral pole of the developing cnidarian but, after gastrulation, were excluded from a small spot in that region (NvSix3/6 was also expressed in a small number of other cells of the planula that resembled neurons). Because of this, the authors call NvSix3/6 and NvFoxQ2a “ring genes,” and the genes that are then expressed in that spot “spot genes.” The spot then develops into the apical tuft.

Through knockdown and rescue experiments, the researchers demonstrate that NvSix3/6 is required for development of the aboral region; without it, the expression of spot genes is reduced or eliminated and the apical tuft of the planula doesn’t form. Development of the region distal from the cnidarian mouth thus appears to parallel development of the bilaterian head.

This research demonstrates that at least a subset of the genes that drive head and brain formation in bilaterians are also differentially expressed in the aboral region of the sea anemone. The expression patterns are not identical to those in all bilaterians; however, the similarities suggest that these patterns of gene expression arose in an ancestor common to bilaterians and cnidarians, and that the process was then modified in bilaterians to produce a brain. So, to answer the evolutionary question posed above, it seems that the developmental module that produces a head came first.

Filed under sea anemone cnidarians brain brain formation gene expression genes neuroscience science

101 notes

Scientists identify brain’s ‘molecular memory switch’

Scientists have identified a key molecule responsible for triggering the chemical processes in our brain linked to our formation of memories.  The findings, published in the journal Frontiers in Neural Circuits, reveal a new target for therapeutic interventions to reverse the devastating effects of memory loss.


The BBSRC-funded research, led by scientists at the University of Bristol, aimed to better understand the mechanisms that enable us to form memories by studying the molecular changes in the hippocampus — the part of the brain involved in learning.

Previous studies have shown that our ability to learn and form memories is due to an increase in synaptic communication called Long Term Potentiation [LTP].  This communication is initiated through a chemical process triggered by calcium entering brain cells and activating a key enzyme called ‘Ca2+ responsive kinase’ [CaMKII].  Once this protein is activated by calcium it triggers a switch in its own activity enabling it to remain active even after the calcium has gone. This special ability of CaMKII to maintain its own activity has been termed ‘the molecular memory switch’.
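
The switch-like persistence described here can be illustrated with a toy bistable model (a generic positive-feedback sketch, not the paper’s model): an activated kinase reinforces its own activation, as CaMKII does via autophosphorylation, while a phosphatase-like term deactivates it. A brief calcium pulse flips the system into a self-sustaining “on” state that outlives the stimulus; all rate constants below are arbitrary.

```python
# Toy "molecular memory switch": Euler integration of
#   dA/dt = A^2/(K^2 + A^2)  -  d*A  +  calcium(t)
# The Hill-type first term stands in for CaMKII autophosphorylation;
# the linear decay for phosphatase activity. Parameters are illustrative.

def simulate(pulse_height, t_end=30.0, dt=0.001):
    a = 0.0   # fraction of activated kinase, starts "off"
    t = 0.0
    while t < t_end:
        calcium = pulse_height if 1.0 <= t < 2.0 else 0.0  # brief pulse
        feedback = a * a / (0.25 + a * a)                  # self-activation
        a += dt * (feedback - 0.9 * a + calcium)
        t += dt
    return a

print(round(simulate(0.5), 2))  # pulsed: settles in the "on" state (~0.8)
print(round(simulate(0.0), 2))  # never stimulated: stays "off" (0.0)
```

The key property is that after the pulse ends, the feedback term alone holds the system in the high-activity state, which is one way a transient calcium signal can leave a lasting trace.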

Until now, the question remained as to what triggers this chemical process in our brain that allows us to learn and form long-term memories.  The research team, comprising scientists from the University’s School of Physiology and Pharmacology, conducted experiments using the common fruit fly [Drosophila] to analyse and identify the molecular mechanisms behind this switch. Using advanced molecular genetic techniques that allowed them to temporarily inhibit the flies’ memory, the team was able to identify a gene called CASK as the synaptic molecule regulating this ‘memory switch’.

Dr James Hodge, the study’s lead author, said: “Fruit flies are remarkably compatible for this type of study as they possess similar neuronal function and neural responses to humans.  Although small they are very smart, for instance, they can land on the ceiling and detect that the fruit in your fruit bowl has gone off before you can.”

“In experiments whereby we tested the flies’ learning and memory ability, involving two odours presented to the flies with one associated with a mild shock, we found that around 90 per cent were able to learn the correct choice, remembering to avoid the odour associated with the shock. Five lessons of the odour with punishment made the fly remember to avoid that odour for between 24 hours and a week, which is a long time for an insect that only lives a couple of months.”

By localising the function of the key molecules CASK and CaMKII to the flies’ equivalent brain area to the human hippocampus, the team found that the flies lacking these genes showed disrupted memory formation.  In repeat memory tests those lacking these key genes were shown to have no ability to remember at three hours (mid-term memory) and 24 hours (long-term memory) although their initial learning or short-term memory wasn’t affected.

Finally, the team introduced a copy of the human CASK gene — it is 80 per cent identical to the fly CASK gene — into the genome of a fly that completely lacked its own CASK gene and was therefore not usually able to remember.  The researchers found that flies which had a copy of the human CASK gene could remember like a normal wildtype fly.

Dr Hodge, from the University’s School of Physiology and Pharmacology, said: “Research into memory is particularly important as it gives us our sense of identity, and deficits in learning and memory occur in many diseases, injuries and during aging”.

“CASK’s control of the CaMKII ‘molecular memory switch’ is clearly a critical step in how memories are written into neurons in the brain.  These findings not only pave the way for developing new therapies which reverse the effects of memory loss but also prove the compatibility of Drosophila to model these diseases in the lab and screen for new drugs to treat them. Furthermore, this work provides an important insight into how brains have evolved their huge capacity to acquire and store information.”

These findings clearly demonstrate that the neuronal function of CASK is conserved between flies and humans, validating the use of Drosophila to understand CASK function in both the healthy and diseased brain. Mutations in the human CASK gene have been associated with neurological and cognitive defects, including severe learning difficulties.

(Source: bristol.ac.uk)

Filed under memory memory loss hippocampus LTP brain cells fruit flies molecular mechanisms neuroscience science

252 notes

Is Obama’s Plan to Map the Human Brain this Generation’s Equivalent to Landing a Man on the Moon?

President John F. Kennedy’s mission in the 1960s was to land a man on the moon. President Bill Clinton made cracking the human genome one of his top priorities. Now, President Barack Obama says a detailed map of the human brain is necessary to understand how it works and what needs to be done when it’s not working properly. The president is expected to unveil his plans for an estimated $3 billion, decade-long commitment to the Brain Activity Map project next month in his 2014 budget proposal.

Rutgers Today talked with Rutgers University behavioral neuroscientist Timothy Otto, professor and director of the Behavioral and Systems Neuroscience program in the Department of Psychology, about what we know about the brain, how much we still need to discover and whether spending billions of dollars on research will enable scientists to develop new treatments for debilitating neurological disorders like Alzheimer’s, Parkinson’s and autism.

Read more

Filed under brain Brain Activity Map BAM project neurodegenerative diseases neurological disorders neuroscience science

44 notes

Researchers discover primary role of the olivocochlear efferent system

Researchers from Massachusetts Eye and Ear, Harvard Medical School, and the Harvard Program in Speech and Hearing Bioscience and Technology may have found a key piece in the puzzle of how hearing works by identifying the role of the olivocochlear efferent system in protecting ears from hearing loss. The findings could eventually lead to screening tests to determine who is most susceptible to hearing loss. Their paper is published today in the Journal of Neuroscience.

It has long been common knowledge that exposure to a noisy environment (a concert, an iPod, mechanical tools, firearms, etc.) can lead to permanent or temporary hearing loss. Most audiologists would assess the damage caused by this type of exposure by measuring hearing thresholds, the lowest level at which one starts to detect a sound at a particular frequency (pitch). In 2009, Drs. Sharon Kujawa and Charles Liberman, both researchers at Mass. Eye and Ear, showed that noise exposures causing only a temporary hearing loss in mice (hearing thresholds return to what they were before exposure) can in fact be associated with cochlear neuropathy, a condition in which, despite a normal threshold, a portion of auditory nerve fibers is missing.

The inner ear, the organ that converts sounds into messages that are conveyed to and decoded by the brain, in turn receives fibers from the central nervous system. Those fibers are known as the olivocochlear efferent system. Until now, the involvement of this efferent system in protection from acoustic injury, although clearly demonstrated, has been a matter of debate, because all the previous experiments probed its protective effects following noise exposures very unlikely to be found in nature.

Stephane Maison, Ph.D., investigator at the Eaton-Peabody Laboratory at Mass. Eye and Ear and lead author, explains: “Humans are currently exposed to the type of noise used in those experiments, but it’s hard to conceive that some vertebrates, thousands of years ago, were subjected to stimuli similar to those delivered by speakers. So many researchers believed that the protective effects of the efferent system were an epiphenomenon – not its true function.”

“Instead of using loud noise exposures evoking a change in hearing threshold, we used a moderate noise exposure at a level similar to those found in restaurants, conferences and malls, and also in nature (some frogs emit vocalizations at similar or higher levels). And instead of looking at thresholds, we looked for signs of cochlear neuropathy,” Dr. Maison continued.

The researchers demonstrated that such moderate exposure leads to cochlear neuropathy (loss of auditory nerve fibers), which causes difficulty hearing in noisy environments.

"This is tremendously important because all of us are submitted to such acoustic environments and it takes a lot of auditory nerve fiber loss before it gets to be detected by simply measuring thresholds as it’s done when preforming an audiogram," Dr. Maison said. "The second important discovery is that, in mice where the efferent system has been surgically removed, cochlear neuropathy is tremendously exacerbated. That second piece proves that the efferent system does play a very important role in protecting the ear from cochlear neuropathy and we may have found its main function."

The researchers say they are excited about this discovery because the strength of the efferent system can be recorded non-invasively in humans: such an assay has already been developed and can predict vulnerability to acoustic injury (Maison and Liberman, “Predicting vulnerability to acoustic injury with a noninvasive assay of olivocochlear reflex strength,” Journal of Neuroscience, 20:4701-4707, 2000).

"One could envision applying this assay or a modified version of it to human populations to screen for individuals most at risk in noise environments," Dr. Maison concluded.

(Source: eurekalert.org)

Filed under olivocochlear efferent system hearing hearing loss nerve fibers inner ear cochlear neuropathy neuroscience science

87 notes

Virtual Games Help the Blind Navigate Unknown Territory

On March 27th JoVE (Journal of Visualized Experiments) published a new video article by Dr. Lotfi Merabet showing how researchers in the Department of Ophthalmology at Massachusetts Eye and Ear Infirmary and Harvard Medical School have developed a virtual gaming environment to help blind individuals improve navigation skills and develop a cognitive spatial map of unfamiliar buildings and public locations.

"For the blind, finding your way or navigating in a place that is unfamiliar presents a real challenge," Dr. Merabet explains. "As people with sight, we can capture sensory information through our eyes about our surroundings. For the blind that is a real challenge… the blind will typically use auditory and tactile cues."

The technique uses computer-generated layouts of public buildings and spatial sensory feedback to synthesize a virtual world that mimics a real-world navigation task. In the game, participants must find jewels and carry them out of the building without being intercepted by roaming monsters that steal the jewels and hide them elsewhere. Participants interact with the virtual building using a keyboard and headphones that play auditory cues to orient them spatially in the world around them. This interaction helps users build an accurate mental layout of the modeled building. Dr. Merabet and his colleagues are also exploring applications of this technology with other user interfaces, such as a Wii Remote or joystick.

"We have developed software called ABES, the Audio-Based Environment Simulator, that represents the actual physical environment of the Carroll Center for the Blind in Newton, Massachusetts. The participants use the game metaphor to get a sense of the whole building through open discovery, allowing people to learn room layouts more naturally than if they were just following directions."

The technology could prove useful for the 285 million blind people worldwide, 6 million of whom live in the United States. It may also have applications beyond the blind community for individuals with other visual impairments, cognitive deficits, or those recovering from brain injuries.

Dr. Merabet considers publication in JoVE’s video format especially helpful. “It is conceptually difficult for a sighted person to understand ‘a video game for blind people.’ What JoVE allows us to do is break down the layout of the game and its strategy, show how the auditory cues can be used, and how we quantify performance going from the virtual game to the physical world.”

Filed under blind virtual gaming environment navigation skills sensory information cognitive map neuroscience science
