Neuroscience

Articles and news from the latest research reports.

Posts tagged neuroscience

119 notes

Optogenetics as good as electrical stimulation

Neuroscientists are eagerly, but not always successfully, looking for proof that optogenetics – a celebrated technique that uses pulses of visible light to genetically alter brain cells to be excited or silenced – can be as successful in complex and large brains as it has been in rodent models.

A new study in the journal Current Biology may be the most definitive demonstration yet that the technique can work in nonhuman primates as well as, or even a little better than, the tried-and-true method of perturbing brain circuits with small bursts of electrical current. Brown University researchers directly compared the two techniques to test how well they could influence the visual decision-making behavior of two primates.

“For most of my colleagues in neuroscience to say ‘I’ll be able to incorporate [optogenetics] into my daily work with nonhuman primates,’ you have to get beyond ‘It does seem to sort of work’,” said study senior author David Sheinberg, a professor of neuroscience affiliated with the Brown Institute for Brain Science. “In our comparison, one of the nice things is that in some ways we found quite analogous effects between electrical and optical [stimulation] but in the optical case it seemed more focused.”

Ultimately, if it consistently proves safe and effective in the large, complex brains of primates, optogenetics could be used in humans, where it could provide a variety of diagnostic and therapeutic benefits.

Evidence in sight

With that in mind, Sheinberg, lead author Ji Dai and second author Daniel Brooks designed their experiments to determine whether and how much optical or electrical stimulation in a particular area of the brain called the lateral intraparietal area (LIP) would affect each subject’s decision making when presented with a choice between a target and a similar-looking, distracting character.

“This is an area of the brain involved in registering the location of salient objects in the visual world,” said Sheinberg, who added that the experimental task was more cognitively sophisticated than those used in previous optogenetics experiments in nonhuman primates.

The main task for the subjects was to fixate on a central point in the middle of the screen and then to look toward the letter “T” when it appeared around the edge of the screen. In some trials, they had to decide quickly between the T and a similar-looking “+” or “†” character presented on opposite ends of the screen. They were rewarded if they glanced toward the T.

Before beginning those trials, the researchers had carefully placed a very thin combination sensor of an optical fiber and an electrode amid a small population of cells in the LIP of each subject. Then they mapped where on the screen an object should be in order for them to detect a response in those cells. They called that area the receptive field. With this information, they could then look to see what difference either optical or electrical stimulation of those cells would have on the subject’s inclination to look when the T or the distracting character appeared at various locations in visual space.

They found that stimulating with either method increased both subjects’ accuracy in choosing the target when it appeared in their receptive field. They also found the primates became less accurate when the distracting character appeared in their receptive field. Generally accuracy was unchanged when neither character was in the receptive field.

In other words, the stimulation of a particular group of LIP cells significantly biased the subjects to look at objects that appeared in the receptive field associated with those cells. Either stimulation method could therefore make the subjects more accurate or effectively distract them from making the right choice.

The magnitude of the difference made by either stimulation method, compared to no stimulation, was small but statistically significant. When the T was in the receptive field, one research subject became 10 percentage points more accurate (80 percent vs. 70 percent) when optically stimulated and eight points more accurate when electrically stimulated. The subject was five points less accurate (73 percent vs. 78 percent) with optical stimulation and six points less accurate with electrical stimulation when the distracting character was in the receptive field.

The other subject showed similar differences. In all, the two primates made thousands of choices over scores of sessions between the T and the distracting character with either kind of stimulation or none. Compared head-to-head in a statistical analysis, electrical and optical stimulation showed essentially similar effects in biasing the decisions.

Optical advantages

Although the two methods performed at parity on the main measure of accuracy, the optogenetic method had a couple of advantages, Sheinberg said.

Electrical stimulation appeared to be less precise in the cells it reached, a possibility suggested by a reduction in electrically stimulated subjects’ reaction time when the T appeared outside the receptive field. Optogenetic stimulation, Sheinberg said, did not produce such unintended effects.

Electrical stimulation also makes simultaneous electrical recording very difficult, Sheinberg said. That makes it hard to understand what neurons do when they are stimulated. Optogenetics, he said, allows for easier simultaneous electrical recording of neural activity.

Sheinberg said he is encouraged about using optogenetics to investigate even more sophisticated questions of cognition.

“Our goal is to be able to now expand this and use it again as a daily tool to probe circuits in more complicated paradigms,” Sheinberg said.

He plans a new study in which his group will look at memory of visual cues in the LIP.

Filed under optogenetics neural circuit electrical stimulation lateral intraparietal area neuroscience science

108 notes

Sniffing Out Danger: Rutgers Scientists Say Fearful Memories Can Trigger Heightened Sense of Smell

Most people – including scientists – assumed we can’t just sniff out danger.

It was thought that we become afraid of an odor – such as leaking gas – only after information about a scary scent is processed by our brain.

But neuroscientists at Rutgers University studying the olfactory system – the sense of smell – in mice have discovered that this fear reaction can occur at the sensory level, even before the brain has the opportunity to interpret that the odor could mean trouble.

In a new study published today in Science, John McGann, associate professor of behavioral and systems neuroscience in the Department of Psychology, and his colleagues, report that neurons in the noses of laboratory animals reacted more strongly to threatening odors before the odor message was sent to the brain.

“What is surprising is that we tend to think of learning as something that only happens deep in the brain after conscious awareness,” says McGann, whose laboratory studies the sense of smell. “But now we see how the nervous system can become especially sensitive to threatening stimuli and that fear-learning can affect the signals passing from sensory organs to the brain.”

McGann and students Marley Kass and Michelle Rosenthal made this discovery by using light to observe activity in the brains of genetically engineered mice through a window in the mouse’s skull. They found that those mice that received an electric shock simultaneously with a specific odor showed an enhanced response to the smell in the cells in the nose, before the message was delivered to the neurons in the brain.

This new research – which indicates that fearful memories can influence the senses – could help to better understand conditions like Post Traumatic Stress Disorder, in which feelings of anxiety and fear exist even though an individual is no longer in danger.

“We know that anxiety disorders like PTSD can sometimes be triggered by smell, like the smell of diesel exhaust for a soldier,” says McGann, who received funding from the National Institute of Mental Health and the National Institute on Deafness and Other Communication Disorders for this research. “What this study does is give us a new way of thinking about how this might happen.”

In their study, the scientists also discovered a heightened sensitivity to odors in the mice traumatized by shock. When these mice smelled the odor associated with the electrical shocks, the amount of neurotransmitter – chemicals that carry communications between nerve cells – released from the olfactory nerve into the brain was as big as if the odor were four times stronger than it actually was.

This created mice whose brains were hypersensitive to the fear-associated odors. Before now, scientists did not think that reward or punishment could influence how the sensory organs process information.

The next step in the continuing research, McGann says, is to determine whether the hypersensitivity to threatening odors can be reversed by using exposure therapy to teach the mice that the electrical shock is no longer associated with a specific odor. This could help develop a better understanding of fear learning that might someday lead to new therapeutic treatments for anxiety disorders in humans, he says.

(Source: news.rutgers.edu)

Filed under olfactory system memory fear learning anxiety disorders neuroscience science

321 notes

Even when test scores go up, some cognitive abilities don’t

To evaluate school quality, states require students to take standardized tests; in many cases, passing those tests is necessary to receive a high-school diploma. These high-stakes tests have also been shown to predict students’ future educational attainment and adult employment and income.

Such tests are designed to measure the knowledge and skills that students have acquired in school — what psychologists call “crystallized intelligence.” However, schools whose students have the highest gains on test scores do not produce similar gains in “fluid intelligence” — the ability to analyze abstract problems and think logically — according to a new study from MIT neuroscientists working with education researchers at Harvard University and Brown University.

In a study of nearly 1,400 eighth-graders in the Boston public school system, the researchers found that some schools have successfully raised their students’ scores on the Massachusetts Comprehensive Assessment System (MCAS). However, those schools had almost no effect on students’ performance on tests of fluid intelligence skills, such as working memory capacity, speed of information processing, and ability to solve abstract problems.

“Our original question was this: If you have a school that’s effectively helping kids from lower socioeconomic environments by moving up their scores and improving their chances to go to college, then are those changes accompanied by gains in additional cognitive skills?” says John Gabrieli, the Grover M. Hermann Professor of Health Sciences and Technology, professor of brain and cognitive sciences, and senior author of a forthcoming Psychological Science paper describing the findings.

Instead, the researchers found that educational practices designed to raise knowledge and boost test scores do not improve fluid intelligence. “It doesn’t seem like you get these skills for free in the way that you might hope, just by doing a lot of studying and being a good student,” says Gabrieli, who is also a member of MIT’s McGovern Institute for Brain Research.

Measuring cognition

This study grew out of a larger effort to find measures beyond standardized tests that can predict long-term success for students. “As we started that study, it struck us that there’s been surprisingly little evaluation of different kinds of cognitive abilities and how they relate to educational outcomes,” Gabrieli says.

The data for the Psychological Science study came from students attending traditional, charter, and exam schools in Boston. Some of those schools have had great success improving their students’ MCAS scores — a boost that studies have found also translates to better performance on the SAT and Advanced Placement tests.

The researchers calculated how much of the variation in MCAS scores was due to the school that students attended. For MCAS scores in English, schools accounted for 24 percent of the variation, and they accounted for 34 percent of the math MCAS variation. However, the schools accounted for very little of the variation in fluid cognitive skills — less than 3 percent for all three skills combined.
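The “variation due to school” figures above are shares of score variance attributable to between-school differences. Here is a minimal sketch of that kind of calculation (an eta-squared ratio on made-up data; the study’s actual statistical models may be more involved, and all names and numbers below are illustrative):

```python
import random
from statistics import mean, pvariance

def variance_explained_by_group(scores_by_group):
    """Between-group share of total variance (eta-squared):
    variance of group means, weighted by group size, divided by
    the variance of all scores pooled together."""
    all_scores = [s for group in scores_by_group for s in group]
    grand_mean = mean(all_scores)
    n = len(all_scores)
    between = sum(len(g) * (mean(g) - grand_mean) ** 2
                  for g in scores_by_group) / n
    total = pvariance(all_scores)
    return between / total

random.seed(0)
# Hypothetical data: three schools whose mean scores differ (60, 70, 80),
# plus student-level noise with standard deviation 10.
schools = [[random.gauss(mu, 10) for _ in range(100)] for mu in (60, 70, 80)]
# Roughly 0.4 here: "school" accounts for about 40% of score variance
# in this synthetic example.
print(round(variance_explained_by_group(schools), 2))
```

If every student in a school scored identically, the ratio would be 1.0; if schools’ means were identical, it would be 0.0 — the MCAS figures (24 and 34 percent) and the fluid-skills figure (under 3 percent) fall between these extremes.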

In one example of a test of fluid reasoning, students were asked to choose which of six pictures completed the missing pieces of a puzzle — a task requiring integration of information such as shape, pattern, and orientation.

“It’s not always clear what dimensions you have to pay attention to get the problem correct. That’s why we call it fluid, because it’s the application of reasoning skills in novel contexts,” says Amy Finn, an MIT postdoc and lead author of the paper.

Even stronger evidence came from a comparison of about 200 students who had entered a lottery for admittance to a handful of Boston’s oversubscribed charter schools, many of which achieve strong improvement in MCAS scores. The researchers found that students who were randomly selected to attend high-performing charter schools did significantly better on the math MCAS than those who were not chosen, but there was no corresponding increase in fluid intelligence scores.

However, the researchers say their study is not about comparing charter schools and district schools. Rather, the study showed that while schools of both types varied in their impact on test scores, they did not vary in their impact on fluid cognitive skills. 

The researchers plan to continue tracking these students, who are now in 10th grade, to see how their academic performance and other life outcomes evolve. They have also begun to participate in a new study of high school seniors to track how their standardized test scores and cognitive abilities influence their rates of college attendance and graduation.

Implications for education

Gabrieli notes that the study should not be interpreted as critical of schools that are improving their students’ MCAS scores. “It’s valuable to push up the crystallized abilities, because if you can do more math, if you can read a paragraph and answer comprehension questions, all those things are positive,” he says.

He hopes that the findings will encourage educational policymakers to consider adding practices that enhance cognitive skills. Although many studies have shown that students’ fluid cognitive skills predict their academic performance, such skills are seldom explicitly taught.

“Schools can improve crystallized abilities, and now it might be a priority to see if there are some methods for enhancing the fluid ones as well,” Gabrieli says.

Some studies have found that educational programs that focus on improving memory, attention, executive function, and inductive reasoning can boost fluid intelligence, but there is still much disagreement over what programs are consistently effective.

(Source: web.mit.edu)

Filed under crystallized intelligence fluid intelligence cognition learning psychology neuroscience science

81 notes

Dietary Amino Acids Relieve Sleep Problems after Traumatic Brain Injury in Animals

Scientists who fed a cocktail of key amino acids to mice improved sleep disturbances caused by brain injuries in the animals. These new findings suggest a potential dietary treatment for millions of people affected by traumatic brain injury (TBI)—a condition that is currently untreatable.

“If this type of dietary treatment is proved to help patients recover function after traumatic brain injury, it could become an important public health benefit,” said study co-leader Akiva S. Cohen, Ph.D., a neuroscientist at The Children’s Hospital of Philadelphia (CHOP).

Cohen is the co-senior author of the animal TBI study appearing today in Science Translational Medicine. He collaborated with two experts in sleep medicine: co-senior author Allan I. Pack, M.D., Ph.D., director of the Center for Sleep and Circadian Neurobiology in the Perelman School of Medicine at the University of Pennsylvania; and first author Miranda M. Lim, M.D., Ph.D., formerly at the Penn Sleep Center, and now on faculty at the Portland VA Medical Center and Oregon Health and Science University.

Every year in the U.S., an estimated 2 million people suffer a TBI, making it a major cause of disability across all age groups. Although 75 percent of reported TBI cases are milder forms such as concussion, even concussion may cause chronic neurological impairments, including cognitive, motor and sleep problems.

“Sleep disturbances, such as excessive daytime sleepiness and nighttime insomnia, disrupt quality of life and can delay cognitive recovery in patients with TBI,” said Lim, a neurologist and sleep medicine specialist. Although physicians can relieve the dangerous swelling that occurs after a severe TBI, there are no existing treatments to address the underlying brain damage associated with neurobehavioral problems such as impaired memory, learning and sleep patterns.

Cohen and his team investigated the use of selected branched-chain amino acids (BCAAs) – precursors of the neurotransmitters glutamate and GABA, which are involved in communication among neurons and help to maintain a normal balance in brain activity. His research team previously showed that a BCAA diet restored cognitive ability in brain-injured mice. The current study was the first to analyze sleep-wake patterns in an animal model.

Comparing mice with experimentally induced mild TBI to uninjured mice, the scientists found the injured mice were unable to stay awake for long periods of time. The injured mice had lower activity among orexin neurons, which help to maintain the animals’ wakefulness. This is similar to results in human studies showing decreased orexin levels in the spinal fluid after TBI.

In the current study, the dietary therapy restored the orexin neurons to a normal activity level and improved wakefulness in the brain-injured mice. EEG recordings also showed improved brain wave patterns among the mice that consumed the BCAA diet.

“These results in an animal model provide a proof-of-principle for investigating this dietary intervention as a treatment for TBI patients,” said Cohen. “If a dietary supplement can improve sleeping and waking patterns as well as cognitive problems, it could help brain-injured patients regain crucial functions.” Cohen cautioned that current evidence does not support TBI patients medicating themselves with commercially available amino acids.

(Source: chop.edu)

Filed under TBI brain injury amino acids sleep glutamate neurons neuroscience science

131 notes

Sleep-Deprived Mice Show Connections Among Lack of Shut-eye, Diabetes, Age

Sleep, or the lack of it, seems to affect just about every aspect of human physiology. Yet, the molecular pathways through which sleep deprivation wreaks its detrimental effects on the body remain poorly understood. Although numerous studies have looked at the consequences of sleep deprivation on the brain, comparatively few have directly tested its effects on peripheral organs.

During sleep deprivation, cells upregulate the UPR – the unfolded protein response – a process in which misfolded proteins are refolded or degraded.

Five years ago, researchers at the Perelman School of Medicine, University of Pennsylvania, showed that the UPR is an adaptive response to stress induced by sleep deprivation and is impaired in the brains of old mice. Those findings suggested that inadequate sleep in the elderly, who normally experience sleep disturbances, could exacerbate an already-impaired protective response to the protein misfolding that happens in aging cells. Protein misfolding and clumping is associated with many diseases, such as Alzheimer’s and Parkinson’s, noted Nirinjini Naidoo, Ph.D., research associate professor in the Division of Sleep Medicine, in that study.

Naidoo is also senior author of a follow-up study in Aging Cell this month that shows, for the first time, an effect of sleep deprivation on the UPR in peripheral tissue, in this case, the pancreas. They showed that stress in pancreatic cells due to sleep deprivation may contribute to the loss or dysfunction of these cells important to maintaining proper blood sugar levels, and that these functions may be exacerbated by normal aging.

“The combined effect of aging and sleep deprivation resulted in a loss of control of blood sugar reminiscent of pre-diabetes in mice,” says Naidoo. “We hypothesize that older humans might be especially susceptible to the effects of sleep deprivation on the disruption of glucose homeostasis via cell stress.”

Working with Penn colleague Joe Baur, Ph.D., assistant professor of Physiology, Naidoo started a collaboration to look at the relationship of sleep deprivation, the UPR, and metabolic response with age. Other researchers had suggested that the death of beta cells associated with type 2 diabetes may be due to stress in a cell compartment called the endoplasmic reticulum (ER). The UPR is one part of the quality control system in the ER, where some proteins are made.

Knowing this, Naidoo and Baur asked if sleep deprivation (SD) causes ER stress in the pancreas, via an increase in protein misfolding, and in turn, how this relates to aging.

The team examined tissues in mice for cellular stress following acute SD, and they also looked for cellular stress in aging mice. Their results show that both age and SD combine to induce cellular stress in the pancreas.

Older mice fared markedly worse when subjected to sleep deprivation. Pancreas tissue from older mice or from young animals subjected to sleep deprivation exhibited signs of protein misfolding, yet both were able to maintain insulin secretion and control blood sugar levels. Pancreas tissue from acutely sleep-deprived aged animals exhibited a marked increase in CHOP, a protein associated with cell death, suggesting a maladaptive response to cellular stress with age that was amplified by sleep deprivation.

Acute sleep deprivation caused increased plasma glucose levels in both young and old animals. However, this change was not overtly related to stress in beta cells, since plasma insulin levels were not lower following acute lack of sleep.

Accordingly, young animals subjected to acute sleep deprivation remained tolerant to a glucose challenge. In a chronic sleep deprivation experiment, young mice were sensitized to insulin and had improved control of their blood sugar, whereas aged animals became hyperglycemic and failed to maintain appropriate plasma insulin concentrations.

While changes in insulin secretion are unlikely to play a major role in the acute effects of SD, cellular stress in pancreatic tissue suggests that chronic SD may contribute to the loss or dysfunction of endocrine cells, and that these effects may be exacerbated by normal aging, say the researchers.

Filed under alzheimer's disease aging sleep sleep deprivation diabetes neuroscience science

81 notes

Staying ahead of Huntington’s disease

Huntington’s disease is a devastating, incurable disorder that results from the death of certain neurons in the brain. Its symptoms show as progressive changes in behavior and movements.

The neurodegenerative disease is caused by a defect in the huntingtin gene (Htt): an abnormal expansion of a stretch of DNA called a CAG codon, or triplet, which codes for the amino acid glutamine. A healthy version of the Htt gene has between 20 and 23 CAG triplets. The mutational expansion in Htt can lead to long repeats of the CAG triplet, giving the mutant protein a long run of consecutive glutamine residues called a polyglutamine tract. CAG triplet expansions in unrelated genes are the root of at least nine neurodegenerative disorders, including Huntington’s disease.
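The repeat counting described above can be sketched in code. This is a hypothetical illustration, not part of the study; the function name is invented, and the example repeat lengths follow the 20–23 triplet range for a healthy gene given in the text.

```python
def longest_cag_run(dna: str) -> int:
    """Return the longest run of consecutive CAG triplets in a DNA string.

    Each CAG triplet codes for one glutamine, so this run length equals
    the length of the resulting polyglutamine tract.
    """
    best = run = 0
    i = 0
    while i + 3 <= len(dna):
        if dna[i:i + 3] == "CAG":
            run += 1
            best = max(best, run)
            i += 3  # stay in reading frame while inside a repeat
        else:
            run = 0
            i += 1  # slide by one base to find the next frame
    return best

# A healthy Htt allele (per the article, 20-23 triplets) vs. an expanded one:
healthy = "GCT" + "CAG" * 21 + "TTC"
expanded = "GCT" + "CAG" * 45 + "TTC"
```

The expanded allele here yields a polyglutamine tract of 45 residues, the kind of long tract the study links to the aggregation-prone structures presumed toxic to neurons.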

Rohit Pappu, PhD, professor of biomedical engineering at Washington University in St. Louis, and his colleagues in the School of Engineering & Applied Science and in the School of Medicine, are working to understand how expanded polyglutamine tracts form the types of supramolecular structures that are presumed to be toxic to neurons – a feature that polyglutamine expansions share with proteins associated with Alzheimer’s disease and Parkinson’s disease.

In recent work, Pappu and his research team showed that the amino acid sequences on either side of the polyglutamine tract within Htt can act as natural gatekeepers because they control the fundamental ability of polyglutamine tracts to form structures that are implicated in cellular toxicity. The results were published in PNAS Early Edition on Nov. 25.

“These are progressive onset disorders,” Pappu says. “The longer the polyglutamine tract gets, the more severe the disease, and the symptoms worsen with age. Our results are exciting because it means that any success we have in mimicking the effects of naturally occurring gatekeepers would be a significant step forward. And mechanistic studies are important in this regard because they enable us to learn from nature’s own strategies.

“Previous studies from other labs showed that the toxic effects of polyglutamine expansions are tempered by the sequence contexts of polyglutamine tracts in Htt, not just the lengths of the polyglutamine tracts”, Pappu says.

He and his research team focused on understanding the effects of sequence stretches that lie on either side of the polyglutamine tract in Htt.  The results show that the N-terminal stretch accelerates the formation of ordered structures that are presumed to be benign to cells, whereas the C-terminal stretch slows the overall transition into structures that are expected to create trouble for cells, suggesting that these naturally occurring sequences behave as gatekeepers. 

“It appears that where polyglutamine stretches are of functional importance, nature has ensured that they are flanked by gatekeeping sequences,” Pappu says.

Pappu and his team are now working to find ways to mimic the effects of the N- and C-terminal flanking sequences from Htt. His team is working closely with Marc Diamond, MD, the David Clayson Professor of Neurology at the School of Medicine, to understand how naturally occurring proteins interact with flanking sequences and to see whether they can co-opt them to ameliorate the toxic effects of the polyglutamine expansions.

(Source: engineering.wustl.edu)

Filed under huntington's disease neurodegenerative diseases neurodegeneration neurons neuroscience science

115 notes

Study Raises Questions about Longstanding Forensic Identification Technique
Forensic experts have long used the shape of a person’s skull to make positive identifications of human remains. But those findings may now be called into question, since a new study from North Carolina State University shows that there is not enough variation in skull shapes to make a positive ID.
“In a lot of cases, murder victims or the victims of disasters are from lower socioeconomic backgrounds and don’t have extensive dental records we can use to make a match,” says Dr. Ann Ross, a forensic expert and professor of anthropology at NC State who is senior author of a paper on the new study. “But those people may have been in car accidents or other incidents that led them to have their skulls X-rayed in emergency rooms or elsewhere. And those skull X-rays have often been used to make IDs. I’ve done it myself.
“But now we’ve tried to validate this technique, and our research shows that the shape of the skull isn’t enough to make a positive ID,” Ross says.
At issue is the “cranial vault outline,” not the “face” of the skull. The cranial vault outline is the profile of the skull when viewed from the side, running from just above the bridge of the nose to the point where the skull and neck meet.
For the study, the researchers surveyed 106 members of the American Academy of Forensic Sciences. Survey participants were asked to evaluate 14 antemortem X-rays and five postmortem X-rays. Participants were then asked to match the five postmortem X-rays with the appropriate antemortem X-rays, effectively establishing a positive ID.
But the researchers found that only 47 percent of the participants made accurate identifications on all five skulls. Participants who have Ph.D.s did slightly better, with 56 percent of them getting all five correct. (The test has been made available here so that anyone can take it.)
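For context, the reported accuracy is far above chance. Assuming each of the five postmortem films had to be paired with a distinct film from the 14 antemortem X-rays — an assumption about the task design, not stated explicitly in the article — random guessing would almost never get all five right:

```python
from math import perm

# Number of ways to assign 5 postmortem films to 5 distinct
# antemortem films chosen from 14:
assignments = perm(14, 5)  # 14 * 13 * 12 * 11 * 10 = 240240

# Probability of matching all five correctly by pure guessing:
p_all_correct_by_chance = 1 / assignments
```

So the 47 percent of participants who matched all five correctly were performing vastly better than the roughly one-in-240,000 chance rate — the study's concern is not that the method is no better than guessing, but that an error rate of more than half is too high for positive identification.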
“This doesn’t mean that cranial vault outlines aren’t useful,” says Ashley Maxwell, lead author of the paper and a former graduate student at NC State. “For example, outlines can be valuable if teeth or other features are missing or have been destroyed. But it does mean that cranial vault outlines shouldn’t be given too much weight.
“The more characteristics we can take into account, such as facial features and cranial vault outlines, the more accurate we can be,” Maxwell says.

Filed under cranial vault outline x-rays neuroimaging forensics neuroscience science

138 notes

Neural prosthesis restores behavior after brain injury

Scientists from Case Western Reserve University and University of Kansas Medical Center have restored behavior—in this case, the ability to reach through a narrow opening and grasp food—using a neural prosthesis in a rat model of brain injury.

Ultimately, the team hopes to develop a device that rapidly and substantially improves function after brain injury in humans. There is no such commercial treatment for the 1.5 million Americans, including soldiers in Afghanistan and Iraq, who suffer traumatic brain injuries (TBI) each year, or for the nearly 800,000 Americans who suffer weakness or paralysis from stroke annually.

The prosthesis, called a brain-machine-brain interface, is a closed-loop microelectronic system. It records signals from one part of the brain, processes them in real time, and then bridges the injury by stimulating a second part of the brain that had lost connectivity.

Their work is published online this week in the journal Proceedings of the National Academy of Sciences.

“If you use the device to couple activity from one part of the brain to another, is it possible to induce recovery from TBI? That’s the core of this investigation,” said Pedram Mohseni, professor of electrical engineering and computer science at Case Western Reserve, who built the brain prosthesis.

“We found that, yes, it is possible to use a closed-loop neural prosthesis to facilitate repair of a brain injury,” he said.

The researchers tested the prosthesis in a rat model of brain injury in the laboratory of Randolph J. Nudo, professor of molecular and integrative physiology at the University of Kansas. Nudo mapped the rat’s brain and developed the model in which anterior and posterior parts of the brain that control the rat’s forelimbs are disconnected.

Atop each animal’s head, the brain-machine-brain interface is a microchip on a circuit board smaller than a quarter connected to microelectrodes implanted in the two brain regions.

The device amplifies signals, called neural action potentials, produced by neurons in the anterior part of the brain. An algorithm separates these signals, recorded as brain spike activity, from noise and other artifacts. Each time a spike is detected, the microchip sends a pulse of electric current to stimulate neurons in the posterior part of the brain, artificially connecting the two brain regions.
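The loop described above — record, detect, stimulate — can be sketched as follows. This is a simplified illustration, not the authors' actual firmware: the amplitude-threshold detector stands in for their spike-detection algorithm, and all names and values are hypothetical.

```python
SPIKE_THRESHOLD = 4.5  # detection threshold, arbitrary units (hypothetical)

def detect_spike(sample: float, threshold: float = SPIKE_THRESHOLD) -> bool:
    """Crude amplitude-threshold detector separating spikes from noise."""
    return abs(sample) > threshold

def closed_loop_step(recorded_sample: float, stimulate) -> bool:
    """One cycle of the brain-machine-brain interface: if a spike is
    detected in the anterior recording, deliver one current pulse to
    the posterior site, artificially bridging the injury.
    Returns True when a pulse was delivered."""
    if detect_spike(recorded_sample):
        stimulate()
        return True
    return False

# Example: three recorded samples; only the second crosses threshold.
pulses = []
for sample in [0.8, 6.2, -1.1]:
    closed_loop_step(sample, lambda: pulses.append("pulse"))
```

The closed-loop property is the key design choice the study tests: stimulation is contingent on recorded activity, spike for spike, rather than delivered on a fixed or random schedule — which is exactly what the random-stimulation control group received.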

Two weeks after the prosthesis had been implanted and run continuously, the rat models using the full closed-loop system had recovered nearly all function lost due to injury, successfully retrieving a food pellet close to 70 percent of the time, or as well as normal, uninjured rats. Rat models that received random stimuli from the device retrieved less than half the pellets and those that received no stimuli retrieved about a quarter of them.

“A question still to be answered is must the implant be left in place for life?” Mohseni said. “Or can it be removed after two months or six months, if and when new connections have been formed in the brain?”

Brain studies have shown that, during periods of growth, neurons that regularly communicate with each other develop and solidify connections.

Mohseni and Nudo said they need more systematic studies to determine what happens in the brain that leads to restoration of function. They also want to determine if there is an optimal time window after injury in which they must implant the device in order to restore function.

(Source: blog.case.edu)

Filed under TBI brain injury prosthetics BMI brain damage neuroscience science

477 notes

Music brings memories back to the brain injured
In the first study of its kind, two researchers have used popular music to help severely brain-injured patients recall personal memories. Amee Baird and Séverine Samson outline the results and conclusions of their pioneering research in the recent issue of the journal Neuropsychological Rehabilitation.
Although their study covered a small number of cases, it’s the very first to examine ‘music-evoked autobiographical memories’ (MEAMs) in patients with acquired brain injuries (ABIs), rather than those who are healthy or suffer from Alzheimer’s Disease.
In their study, Baird and Samson played extracts from ‘Billboard Hot 100’ number-one songs in random order to five patients. The songs, taken from the whole of the patient’s lifespan from age five, were also played to five control subjects with no brain injury. All were asked to record how familiar they were with a given song, whether they liked it, and what memories it invoked.
Doctors Baird and Samson found that the frequency of recorded MEAMs was similar for patients (38%–71%) and controls (48%–71%). Only one of the four ABI patients recorded no MEAMs. In fact, the highest number of MEAMs in the whole group was recorded by one of the ABI patients. In all those studied, the majority of MEAMs were of a person, people or a life period and were typically positive. Songs that evoked a memory were noted as more familiar and more liked than those that did not.
As a potential tool for helping patients regain their memories, Baird and Samson conclude: “Music was more efficient at evoking autobiographical memories than verbal prompts of the Autobiographical Memory Interview (AMI) across each life period, with a higher percentage of MEAMs for each life period compared with AMI scores.”
“The findings suggest that music is an effective stimulus for eliciting autobiographical memories and may be beneficial in the rehabilitation of autobiographical amnesia, but only in patients without a fundamental deficit in autobiographical recall memory and intact pitch perception.”
The authors hope that their ground-breaking work will encourage others to carry out further studies on MEAMs in larger ABI populations. They also call for further studies of both healthy people and those with other neurological conditions to learn more about the clear relationship between memory, music and emotion; they hope that one day we might truly “understand the mechanisms underlying the unique memory enhancing effect of music”.

Filed under music brain injury autobiographical memory alzheimer's disease TBI neuroscience science

63 notes

Baylor Research Institute Studies Traumatic Brain Injury Rehab Outcomes

For patients recovering from a traumatic brain injury (TBI), the rehabilitation process – compensating for changes in functioning, adaptation and even community reintegration – can be challenging. Unfortunately, not all rehab programs are created equal, and with the differences comes a difference in outcomes, according to a first-of-its-kind study published in The Journal of Head Trauma Rehabilitation.

Collectively authored by Baylor researchers, the outcomes study (titled “Comparative Effectiveness of Traumatic Brain Injury Rehabilitation: Differential Outcomes Across TBI Model Systems Centers”) set out to determine whether outcomes at the post-discharge and one-year points varied across 21 Traumatic Brain Injury Model System (TBIMS) centers. The Baylor Institute for Rehabilitation (BIR) was one of the centers studied.

At the study’s onset, researchers had an idea of what they might find, but their findings revealed the opposite.

“We expected that, after accounting for differences in patient characteristics and severity of injury, patient outcomes would be similar across centers,” said Marie Dahdah, PhD, investigator at the Baylor Institute for Rehabilitation. “They were not. There were significant variations, with a 25 percent to 45 percent difference between the best performing site and the site with the lowest outcomes at discharge.”

While differences in outcomes have long been reported in designated trauma centers (and for other specialties, including general and cardiac surgery, transplant and oncology), the study was the first piece of research to demonstrate that those differences exist in the rehabilitation context.

The team acknowledged that those variances could be attributed to institutional structures, resources and clinical practices, but that more research is needed to determine which of these factors is associated with optimum outcomes.

“In order to identify factors that contribute to variation in patient outcomes across centers, we are undertaking research that identifies different patient, injury and process-level factors associated with functional outcomes of patients,” Dr. Dahdah said. “Those factors can then be targeted to improve patient outcomes.”

In other phases of this study, these Baylor investigators (along with teams from three other TBIMS sites) are reviewing the quantity and frequency of various types of rehabilitation therapies used in inpatient TBI settings. The team will also study evidence-based best practices for speech, occupational, physical and recreational therapy interventions, as well as neurocognitive and psychosocial interventions.

The results from those subsequent studies could help identify gaps between current practices and evidence-based best practices, with the aim of helping inform rehabilitation programs across the country and ensuring that all centers have the same opportunities for quality outcomes.

“I think I speak for my entire research team when I say that our involvement in this type of research comes out of our collective desire to improve quality of rehabilitation care, thereby enhancing outcomes following TBI,” Dr. Dahdah said. “My hope is that by synthesizing and disseminating what is known about effective evidence-based rehabilitation interventions, BIR as part of the North Texas TBIMS will be able to encourage changes necessary to help institutions, clinicians and therapists to provide the best quality TBI rehabilitation care to their patients.”

Of course, with the Baylor Institute for Rehabilitation among the 21-center pool, one very obvious question remains: How did BIR’s outcomes compare with those of the other 20 centers?

“I cannot count for you the number of times I have been asked that question,” Dr. Dahdah said. “To ensure the integrity of our study, even our research team is blind to the identity of the centers.”

But despite how well even the strongest inpatient rehab centers perform in a comparative context, there is always room for improvement, especially with best-practice regimens.

“Our research has already started discussions within the TBI Model Systems research community,” Dr. Dahdah said. “We believe more research needs to be done to identify the key determinants of patient outcomes so that benchmarks for quality rehabilitation care can be derived for patients and their families.”

(Source: media.baylorhealth.com)

Filed under TBI brain damage rehabilitation neuroscience science
