Neuroscience

Articles and news from the latest research reports.

29 notes

Scientists prevent development of deafness in animals engineered to have Usher syndrome

Hearing impairment is the most common sensory disorder, with congenital hearing impairment present in approximately 1 in 1,000 newborns, and yet there is no physiological cure for children who are born deaf. Most cases of congenital deafness are due to a mutation in a gene that is required for normal development of the sensory hair cells in the inner ear that are responsible for detecting sound. To cure deafness caused by such mutations, the expression of the gene must be corrected, a feat that has been elusive until recently.

Rosalind Franklin University of Medicine and Science (RFUMS) Assistant Professor Michelle Hastings and her team, along with investigators at Louisiana State University Health Sciences Center in New Orleans, Louisiana and Isis Pharmaceuticals in Carlsbad, CA, have now found a way to target gene expression in the ear and rescue hearing and balance in mice that have a mutation that causes deafness in humans. The results of the study are reported in the paper, “Rescue of hearing and vestibular function in a mouse model of human deafness,” which was published February 4, 2013 in the journal Nature Medicine.

Dr. Hastings collaborated with research leaders across the country, including RFUMS colleagues Francine Jodelka and Anthony Hinrich, who were co-first authors on the study, as well as Dr. Dominik Duelli and Kate McCaffrey; co-first author Dr. Jennifer Lentz at Louisiana State University Health Sciences Center New Orleans, and Dr. Lentz’s research team, including Drs. Hamilton Farris and Nicolas Bazan and Matthew Spalitta; and Dr. Frank Rigo at Isis Pharmaceuticals. The collaboration led to the development of a novel therapeutic approach to treat deafness and balance impairment by injecting mice with a single dose of a small, synthetic RNA-like molecule, called an antisense oligonucleotide (ASO). The ASO was designed to specifically recognize and fix a mutation in a gene called USH1C, which causes Usher syndrome in humans. The ASO blocks the effect of the mutation, allowing the gene product to function properly, thereby preventing deafness.
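The core of an ASO is base-pair complementarity: the oligonucleotide is synthesized as the reverse complement of the RNA region it must recognize, so it base-pairs with, and masks, that stretch of the transcript. A minimal sketch of that relationship; the target sequence below is invented, not the actual USH1C site, and real ASO design also involves chemistry (backbone modifications, length, off-target screening) that no toy snippet captures:

```python
# Hypothetical sketch: an ASO is the reverse complement of its RNA target.
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def design_aso(target_rna: str) -> str:
    """Return the reverse complement (5'->3') of an RNA target."""
    return "".join(COMPLEMENT[base] for base in reversed(target_rna.upper()))

# Invented example target (not the real USH1C mutation site):
print(design_aso("AUGGCUUAC"))  # -> GUAAGCCAU
```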

Usher syndrome is the leading genetic cause of combined deafness and blindness in humans. Treatment of these Usher mice with the ASO early in life rescues hearing and cures all balance problems. “The effectiveness of the ASO is striking,” states Hastings. “A single dose of the drug to newborn mice corrects balance problems and allows these otherwise deaf mice to hear at levels similar to non-Usher mice for a large portion of their life,” she says.

Validating ASO efficacy in the Usher mice is an important step in the process of developing the strategy for human therapy. Dr. Lentz, who has been studying Usher syndrome for almost 10 years and engineered the mice to model the human disease, states, “Successfully treating a human genetic disease in this animal model brings the possibility of treating patients much closer.”

The results of the study demonstrate the therapeutic potential of this type of ASO in the treatment of deafness and provide evidence that congenital deafness can be effectively overcome by treatment early in development to correct gene expression.

"The discovery of an ASO-type drug that can effectively rescue hearing opens the door to developing similar approaches to target and cure other causes of hearing loss," says Dr. Hastings, who has been awarded a grant from the National Institutes of Health to further develop the ASOs for the treatment of deafness with Drs. Lentz, Rigo and Duelli.

(Source: eurekalert.org)

Filed under Usher syndrome congenital deafness hearing impairment sensory hair cells medicine science

45 notes

What Causes Lou Gehrig’s Sticky Masses?

Globs of protein clustered in the neurons that control muscles have long been the hallmark of amyotrophic lateral sclerosis (ALS), the fatal neurodegenerative disease also commonly known as Lou Gehrig’s disease. Now, a study of the most commonly found mutant gene in people with ALS reveals an unexpected origin of some of those sticky masses, a finding that may offer drug developers a new target for treatments.

Located on the ninth chromosome, which explains part of its unwieldy name, the C9orf72 gene has a bit of a stutter. A typical version in healthy people contains a stretch of DNA where a string of six genetic letters—GGGGCC—repeats up to 25 times. Scientists have recently found that in a sizable share of people with ALS and frontotemporal dementia (FTD), a less common neurological disease characterized by language, memory, and emotional problems, this repeat occurs many more times; some people have thousands of copies.
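The difference between the typical and disease-associated gene is simply the length of this GGGGCC run, which is straightforward to measure in a sequence. A toy sketch (function name and example sequences are illustrative only):

```python
import re

def count_ggggcc_repeats(dna: str) -> int:
    """Length, in repeat units, of the longest uninterrupted GGGGCC run."""
    runs = re.findall(r"(?:GGGGCC)+", dna.upper())
    return max((len(r) // 6 for r in runs), default=0)

healthy = "TT" + "GGGGCC" * 8 + "AA"     # within the typical range (up to ~25)
expanded = "TT" + "GGGGCC" * 700 + "AA"  # a pathogenic-scale expansion

print(count_ggggcc_repeats(healthy))   # 8
print(count_ggggcc_repeats(expanded))  # 700
```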

Since these C9orf72 mutations were discovered in 2011, some researchers have speculated that the repeats interrupt production of the gene’s normal protein, which serves some as-yet unknown, but vital function in motor neurons or other brain cells. Others have hypothesized that the mutation spawns a large, misshapen strand of RNA that grabs on to proteins such as TDP-43, which normally help process RNA, creating protein tangles that starve the cell of the machinery it needs to function.

Molecular biologists at the Ludwig Maximilians University Munich in Germany and the University of Antwerp in Belgium, however, wondered whether the genetic stutters themselves coded for proteins that became tangled in the cell. Few scientists had considered this because the stutters don’t contain the “start signal” that allows proteins to be made. Still, in a few other diseases caused by genetic repeats, the cell manages to produce proteins from the abnormal gene despite lacking this signal. Sometimes these proteins are toxic and ultimately kill the cell.

Based on the DNA sequence of the GGGGCC-laden C9orf72 seen in ALS and FTD patients, the European team determined that if translated, the gene would produce various proteins containing strings of repeat amino acids. Dubbed dipeptide repeat (DPR) proteins, these molecules don’t normally appear in humans and should be prone to clumping, the scientists concluded. Indeed, when they began to search for DPR protein clusters in actual human brain tissues, they found them in tissue from FTD and ALS patients with the C9orf72 mutation. No such lumps showed up in the brain tissue of healthy controls or ALS and FTD patients without the C9orf72 mutation, increasing the likelihood that the mutation produced them, Dieter Edbauer, a molecular biologist at Ludwig Maximilians, and his co-authors report online today in Science.
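The dipeptide-repeat prediction follows from the arithmetic of reading frames: translated without regard to a start codon, (GGGGCC)n yields glycine-alanine, glycine-proline, or glycine-arginine repeats depending on which frame the ribosome reads. A sketch, assuming only the minimal codon table these frames require:

```python
# Minimal codon table covering only the codons that occur in GGGGCC repeats.
CODONS = {"GGG": "G", "GCC": "A", "CCG": "P", "CGG": "R", "GGC": "G"}

def translate_repeat(repeat: str, n: int, frame: int) -> str:
    """Translate n copies of a repeat unit starting in a given reading frame."""
    seq = (repeat * n)[frame:]
    codons = [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]
    return "".join(CODONS[c] for c in codons)

# The three frames give GA, GP and GR dipeptide repeats, respectively:
for frame in range(3):
    print(frame, translate_repeat("GGGGCC", 4, frame))
```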

Filed under neurodegenerative diseases Lou Gehrig's disease ALS gene mutation genetics proteins science

148 notes

Mysterious Disease Discovered Locally, Strikes Mainly Young Women

It’s a mysterious, newly discovered disease that strikes mainly young women, and it’s often misdiagnosed. Doctors who discovered it, here in Philadelphia, say it’s like your brain is on fire. 3 On Your Side Health Reporter Stephanie Stahl says it starts with personality changes.

Young women dazed, restrained in hospital beds, acting possessed and then becoming catatonic. They’d been so normal, when suddenly their lives went haywire.

“One minute I’d be sobbing, crying hysterically, and the next minute I’d be laughing,” said Susannah Cahalan, of New Jersey.

“I was very paranoid and manic. There was something wrong. I thought trucks were following me,” said Emily Gavigan, of Pennsylvania.

And it got worse for Emily Gavigan, who was a sophomore at the University of Scranton. Hospitalized, and out of it, she couldn’t control her arm movements. Then there were seizures, and she needed a ventilator. Her parents were watching their only child slip away.

"It was life and death for weeks," said Grace Gavigan, Emily’s mom.

"We were losing her. This is something that I couldn’t control," said Bill Gavigan, Emily’s dad.

Doctors also couldn’t figure out what was wrong with Susannah.

"I had bizarre abnormal movements, would leave my arms out extended, you know, in front of me. I was a relatively normal person, then the next minute I’m hallucinating and insisting that my father had kidnapped me," said Susannah.

Turns out, Susannah and Emily weren’t mentally ill. They both had an autoimmune disease called Anti-NMDA Receptor Encephalitis, in which antibodies attack the brain, causing swelling.

Susannah says this is how doctors explained it to her parents, “He told them her brain is on fire. He used those words: ‘Her brain is on fire.’”

Filed under brain Anti-NMDA Receptor Encephalitis encephalitis autoimmune disease neuroscience science

70 notes

Cells forged from human skin show promise in treating MS, myelin disorders

A study out today in the journal Cell Stem Cell shows that human brain cells created by reprogramming skin cells are highly effective in treating myelin disorders, a family of diseases that includes multiple sclerosis and rare childhood disorders called pediatric leukodystrophies.

The study is the first successful attempt to employ human induced pluripotent stem cells (hiPSC) to produce a population of cells that are critical to neural signaling in the brain. In this instance, the researchers utilized cells crafted from human skin and transplanted them into animal models of myelin disease.

"This study strongly supports the utility of hiPSCs as a feasible and effective source of cells to treat myelin disorders," said University of Rochester Medical Center (URMC) neurologist Steven Goldman, M.D., Ph.D., lead author of the study. "In fact, it appears that cells derived from this source are at least as effective as those created using embryonic or tissue-specific stem cells."

The discovery opens the door to potential new treatments using hiPSC-derived cells for a range of neurological diseases characterized by the loss of myelin in the central nervous system. Like the insulation found on electrical wires, myelin is a fatty tissue that ensheathes the connections between nerve cells and ensures the crisp transmission of signals from one cell to another. When myelin tissue is damaged, communication between cells can be disrupted or even lost.

The most common myelin disorder is multiple sclerosis, a condition in which the body’s own immune system attacks and destroys myelin. The loss of myelin is also the hallmark of a family of serious and often fatal diseases known as pediatric leukodystrophies. While individually very rare, collectively several thousand children are born in the U.S. with some form of leukodystrophy every year.

The source of the myelin cells in the brain and spinal cord is a cell type called the oligodendrocyte. Oligodendrocytes are, in turn, the offspring of another cell called the oligodendrocyte progenitor cell, or OPC. Myelin disorders have long been considered a potential target for cell-based therapies. Scientists have theorized that if healthy OPCs could be successfully transplanted into the diseased or injured brain, then these cells might be able to produce new oligodendrocytes capable of restoring lost myelin, thereby reversing the damage caused by these diseases.

However, several obstacles have thwarted scientists. One of the key challenges is that OPCs are a relatively mature cell type in the central nervous system and appear late in development.

"Compared to neurons, which are among the first cells formed in human development, there are more stages and many more steps required to create glial cells such as OPCs," said Goldman. "This process requires that we understand the basic biology and the normal development of these cells and then reproduce this precise sequence in the lab."

Another challenge has been identifying the ideal source of these cells. Much of the research in the field has focused on cells derived from tissue-specific and embryonic stem cells. While research using these cells has yielded critical insight into the biology of stem cells, these sources are not considered ideal to meet demand once stem cell-based therapies become more common.

The discovery in 2007 that human skin cells could be “reprogrammed” to the point where they returned to a biological state equivalent to that of an embryonic stem cell, producing what are called induced pluripotent stem cells, represented a new path forward for scientists. Because these cells – created by using the recipient’s own skin – would be a genetic match, the likelihood of rejection upon transplantation is significantly diminished. These cells also promised an abundant source of material from which to fashion the cells necessary for therapies.

Goldman’s team was the first to successfully master the complex process of using hiPSCs to create OPCs. This process proved time-consuming. It took Goldman’s lab four years to establish the exact chemical signaling required to reprogram, produce, and ultimately purify OPCs in sufficient quantities for transplantation, and each preparation required almost six months to go from skin cell to a transplantable population of myelin-producing cells.

Once they succeeded in identifying and purifying OPCs from hiPSCs, they then assessed the ability of the cells to make new myelin when transplanted into mice with a hereditary leukodystrophy that rendered them genetically incapable of producing myelin.

They found that the OPCs spread throughout the brain and began to produce myelin. They observed that hiPSC-derived cells did this even more quickly, efficiently, and effectively than tissue-derived OPCs. The animals were also free of any tumors, a dangerous potential side effect of some stem cell therapies, and survived significantly longer than untreated mice.

"The new population of OPCs and oligodendrocytes was dense, abundant, and complete," said Goldman. "In fact, the re-myelination process appeared more rapid and efficient than with other cell sources."

The next stage in evaluating these cells – clinical studies – may not be long in the offing. Goldman, along with a team of researchers and clinicians from Rochester, Syracuse, and Buffalo, is preparing to launch a clinical trial using OPCs to treat multiple sclerosis. This group, named the Upstate MS Consortium, has been approved for funding by New York State Stem Cell Science (NYSTEM). While the consortium’s initial study – the early stages of which are scheduled to begin in 2015 – will focus on cells derived from tissue sources, Goldman anticipates that hiPSC-derived OPCs will eventually be included in this project.

(Source: eurekalert.org)

Filed under MS myelin disorders skin cells myelin hiPSC stem cells oligodendrocytes medicine science

97 notes

New brain-test app

Two years ago, researcher Josef Bless was listening to music on his phone when he suddenly had an idea.

"I noticed that the sounds of the different instruments were distributed differently between the ears, and it struck me that this was very similar to the tests we routinely use in our laboratory to measure brain function. In dichotic listening, each ear is presented with a different syllable at the same time (one to the left and one to the right ear) and the listener has to say which syllable seems clearest. The test indicates which side of the brain is most active during language processing," Bless explains.
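The “which side of the brain” readout from such a test is conventionally summarized as a laterality index comparing right-ear and left-ear reports. The article does not state the exact formula iDichotic uses; this sketch shows the common percent form:

```python
def laterality_index(right_correct: int, left_correct: int) -> float:
    """Dichotic-listening laterality index in percent: positive values
    indicate a right-ear advantage (left-hemisphere language dominance)."""
    total = right_correct + left_correct
    if total == 0:
        raise ValueError("no responses recorded")
    return 100.0 * (right_correct - left_correct) / total

# A listener who reported the right-ear syllable on 18 of 30 trials:
print(laterality_index(right_correct=18, left_correct=12))  # 20.0
```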

Josef Bless is working on a PhD in psychology at the University of Bergen. He is a member of the Bergen fMRI Group, an interdisciplinary research group headed by Professor Kenneth Hugdahl, who has received a European Research Council (ERC) Advanced Grant for his brain research.

The iPhone app for dichotic listening is called iDichotic and was launched on the App Store in 2011, where it can be downloaded for free. About a year later, more than 1,000 people had downloaded the app, and roughly half had sent their test results to the researchers’ database.

The researchers analysed the first 167 results they received and compared them with the results of 76 individuals tested in laboratories in Norway and Australia. The results have been published in the journal Frontiers in Psychology.

"We found that the results from the app were as reliable as those of the controlled laboratory tests. This means that smartphones can be used as a tool for psychological testing, opening up a wealth of exciting new possibilities," says Bless.

"The app makes it possible to gather large volumes of data easily and inexpensively. I think we will see more and more psychological tests coming to smartphones," he adds.

The researchers have also developed a special version of iDichotic for patients with schizophrenia who suffer from auditory hallucinations (i.e. hear “voices”). The app helps in training patients to improve their focus, so that when they hear voices, they are better able to shut them out.

"Using a mobile app, patients can be tested and receive training at home, instead of having to come to our laboratory," says Bless.

The app iDichotic has been developed in collaboration with Professor Kenneth Hugdahl, Doctor René Westerhausen, and Magne Gudmundsen.

Filed under brain dichotic listening iDichotic smartphone app psychology neuroscience science

118 notes

Pitt/UPMC Team Describes Technology that Lets Spinal Cord-Injured Man Control Robot Arm with Thoughts

Researchers at the University of Pittsburgh School of Medicine and UPMC describe in PLoS ONE how an electrode array sitting on top of the brain enabled a 30-year-old paralyzed man to control the movement of a character on a computer screen in three dimensions with just his thoughts. It also enabled him to move a robot arm to touch a friend’s hand for the first time in the seven years since he was injured in a motorcycle accident.

With brain-computer interface (BCI) technology, the thoughts of Tim Hemmes, who sustained a spinal cord injury that left him unable to move his body below the shoulders, were interpreted by computer algorithms and translated into intended movement of a computer cursor and, later, a robot arm, explained lead investigator Wei Wang, Ph.D., assistant professor, Department of Physical Medicine and Rehabilitation, Pitt School of Medicine.

“When Tim reached out to high-five me with the robotic arm, we knew this technology had the potential to help people who cannot move their own arms achieve greater independence,” said Dr. Wang, reflecting on a memorable scene from September 2011 that was re-told in stories around the world. “It’s very important that we continue this effort to fulfill the promise we saw that day.”

Six weeks before the implantation surgery, the team conducted functional magnetic resonance imaging (fMRI) of Mr. Hemmes’ brain while he watched videos of arm movement. They used that information to place a postage stamp-size electrocorticography (ECoG) grid of 28 recording electrodes on the surface of the brain region that fMRI showed controlled right arm and hand movement. Wires from the device were tunneled under the skin of his neck to emerge from his chest where they could be connected to computer cables as necessary.

For 12 days at his home and nine days in the research lab, Mr. Hemmes began the testing protocol by watching a virtual arm move, which triggered neural signals that were sensed by the electrodes. Distinct signal patterns for particular observed movements were used to guide the up and down motion of a ball on a computer screen. Soon after mastering movement of the ball in two dimensions, namely up/down and right/left, he was able to also move it in/out with accuracy on a 3-dimensional display.
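The article does not describe the decoding algorithm itself. As a purely illustrative sketch of the general idea, a linear map can be fit from multichannel neural features to intended cursor velocity; synthetic data stands in for real ECoG recordings here, and the actual Pitt/UPMC decoder is certainly more sophisticated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: band-power features from 28 ECoG channels,
# paired with intended 3-D cursor velocities observed during calibration.
n_trials, n_channels = 200, 28
features = rng.standard_normal((n_trials, n_channels))
true_map = rng.standard_normal((n_channels, 3))   # stand-in "ground truth"
velocities = features @ true_map

# Fit a linear decoder by least squares: features -> (x, y, z) velocity.
weights, *_ = np.linalg.lstsq(features, velocities, rcond=None)

# Decode a new neural sample into a 3-D cursor command.
sample = rng.standard_normal(n_channels)
print(sample @ weights)
```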

“During the learning process, the computer helped Tim hit his target smoothly by restricting how far off course the ball could wander,” Dr. Wang said. “We gradually took off the ‘training wheels,’ as we called it, and he was soon doing the tasks by himself with 100 percent brain control.”

The robot arm was developed by Johns Hopkins University’s Applied Physics Laboratory. Currently, Jan Scheuermann, of Whitehall, Pa., is testing another BCI technology at Pitt/UPMC.

Filed under BCI spinal cord injury robotic arm motor movements neural activity robotics neuroscience science

50 notes

Fear factor: Study shows brain’s response to scary stimuli

Driving through his hometown, a war veteran with post-traumatic stress disorder may see roadside debris and feel afraid, believing it to be a bomb. He’s ignoring his safe, familiar surroundings and only focusing on the debris; yet, when it comes to the visual cortex, a recent study at the University of Florida suggests this is completely normal.

The findings, published last month in the Journal of Neuroscience, show that even people who don’t have anxiety disorders respond visually at the sight of something scary while ignoring signs that indicate safety. This contradicts a common belief that only people with anxiety disorders have difficulty processing comforting visual stimuli, or safety cues, said Andreas Keil, a professor of psychology in UF’s College of Liberal Arts and Sciences.

“We’ve established that, in terms of visual responding, it’s not a disorder to not respond to a safety cue,” Keil said. “We all do that. So now we can study at what stage in the processing stream, with given patients, is the problem occurring.”

Co-authors Keil and Vladimir Miskovic, both members of the UF Center for the Study of Emotion and Attention, examined the effect of competing danger and safety cues within the visual cortex. The study results could help distinguish between normal and abnormal processes within the visual cortex and identify what parts of the brain are targets for the treatment of anxiety disorders.

“You’d think the visual cortex would just faithfully code for visual information,” said Shmuel Lissek, an assistant professor of psychology at the University of Minnesota not involved in the study. “This kind of work is testing the idea that activations in the visual cortex are actually different if the stimulus has an emotional value than if it doesn’t.”

(Source: news.ufl.edu)

Filed under visual cortex visual stimuli PTSD brainwaves anxiety anxiety disorders neuroscience psychology science

73 notes

Fluctuations in the size of brain waves contribute to information processing

Cyclical variations in the size of brain wave rhythms may participate in the encoding of information by the brain, according to a new study led by Colin Molter of the Neuroinformatics Japan Center, RIKEN Brain Science Institute, Wako.

Brain waves are produced by the synchronized activity of large populations of neurons. Low frequency brain waves called theta oscillations are known to support memory formation. Researchers typically examine the frequency of oscillations in a given part of the brain and the timing of oscillations in different brain regions, but know very little about how variations in the size of these oscillations contribute to information processing.

Molter and his colleagues used electrode arrays to record brain waves from the rat hippocampus, a structure known to be critical for memory formation and spatial navigation, while the animals performed various behaviors, such as exploring open spaces, running through a maze and in a wheel, and sleeping. They observed fluctuations in the size of theta oscillations during all the behaviors—the brain waves did not remain the same size, but rather waxed and waned second by second.

During spatial navigation, for example, individual hippocampal neurons called place cells become more active when the animal is in one or a few specific locations compared to the rest of the explored environment. The researchers found that the time of firing of many of the place cells correlated with the fluctuations in the size of the theta waves. During sleep, the activity of most of the cells was timed with the largest theta oscillations.

Even though the size of theta waves is correlated with motor behavior, their cyclic fluctuations at this time scale, observed while the rats ran and explored, were not correlated with the animals’ speed or acceleration. The fluctuations are instead likely to be generated by the brain itself, as their presence during sleep also suggests they are intrinsic.

The researchers speculate that this phenomenon could be helpful for the neuronal representation of space, resolving the ambiguity of space coding by place cells that become active in multiple preferred locations. “We are currently working on several new experiments to understand how the spatial location may affect the slow modulation and how the timing of the slow modulation affects behavior,” says Molter. “We are also trying to provide a model that incorporates the theta slow modulation to help propagation of activity between cell assemblies.”
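The moment-to-moment “size” of a rhythm like the theta oscillation is usually quantified as its amplitude envelope. The sketch below is a minimal illustration of that idea, not the study’s actual analysis pipeline: it assumes a 6–10 Hz theta band, a synthetic local field potential, and standard NumPy/SciPy tools. A simulated signal whose theta amplitude waxes and wanes at 0.5 Hz is band-pass filtered and its envelope is recovered with a Hilbert transform.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def theta_envelope(lfp, fs, band=(6.0, 10.0)):
    """Band-pass the signal to the theta range, then take the Hilbert
    amplitude envelope (the instantaneous 'size' of the rhythm)."""
    b, a = butter(3, band, btype="band", fs=fs)
    theta = filtfilt(b, a, lfp)          # zero-phase filtering
    return np.abs(hilbert(theta))

# Synthetic LFP: an 8 Hz theta rhythm whose amplitude waxes and wanes
# at 0.5 Hz (a second-by-second fluctuation), plus broadband noise.
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
slow_mod = 1.0 + 0.5 * np.sin(2 * np.pi * 0.5 * t)
lfp = slow_mod * np.sin(2 * np.pi * 8.0 * t) + 0.3 * rng.standard_normal(t.size)

env = theta_envelope(lfp, fs)
# The recovered envelope should rise and fall with slow_mod
# rather than stay constant.
print(f"correlation with slow modulation: {np.corrcoef(env, slow_mod)[0, 1]:.2f}")
```

Once such an envelope is in hand, relating it to behavior is straightforward: spike times of place cells, or running speed samples, can be correlated against the envelope in the same way, which is conceptually what allows the authors to ask whether the fluctuations track locomotion or arise intrinsically.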

Filed under brainwaves memory formation spatial navigation motor behavior neuroscience science

111 notes

Eat to Dream: Penn Study Shows Dietary Nutrients Associated with Certain Sleep Patterns

“You are what you eat,” the saying goes, but is what you eat playing a role in how much you sleep? Sleep, like nutrition and physical activity, is a critical determinant of health and well-being. With the increasing prevalence of obesity and its consequences, sleep researchers have begun to explore the factors that predispose individuals to weight gain and ultimately obesity. Now, a new study from the Perelman School of Medicine at the University of Pennsylvania shows for the first time that certain nutrients may play an underlying role in short and long sleep duration and that people who report eating a large variety of foods – an indicator of an overall healthy diet – had the healthiest sleep patterns. The new research is published online, ahead-of-print in the journal Appetite.

Filed under sleep sleep patterns sleep duration nutrition dietary nutrients health science

117 notes

Turning repulsive feelings into desires

Hunger, thirst, stress and drugs can create a change in the brain that transforms a repulsive feeling into a strong positive “wanting,” a new University of Michigan study indicates.

The research used salt appetite to show how powerful natural mechanisms of brain desires can instantly transform a cue that always predicted a repulsive Dead Sea salt solution into an eagerly wanted beacon or motivational magnet.

Mike Robinson, a research fellow in the U-M Department of Psychology and the study’s lead author, said the findings help explain how related brain activations in people could cause them to avidly want something that has always been disliked.

This instant transformation of motivation, he said, lies in the ability of events to activate particular brain circuitry—a structure called the nucleus accumbens, which sits near the base of the front of the brain and is also activated by addictive drugs.

Cues for rewards often trigger intense motivation. The smell of food can make a person suddenly feel hungry when this wasn’t the case earlier. Drug cues may prompt relapse in addicts trying to quit. In some cases, desires may be triggered even for a relatively unpleasant event.

Researchers studied how rats responded to metal objects that represented either pleasant sugar or disgustingly intense Dead Sea saltiness. The rats quickly learned to jump on and nibble the sweetness cue, but turned away from and avoided the saltiness cue.

But one day the rats suddenly woke up in a new state of sodium appetite induced by drugs given the night before. On their first re-encounter with the saltiness cue in the new appetite state, their brain systems became activated and the rats instantly jumped on and nibbled the saltiness cue as though it were the sugar cue.

“The cue becomes avidly ‘wanted’ despite knowledge the salt always tasted disgusting,” Robinson said.

The sudden brain changes help explain how an event, such as taking an addictive drug, could become “wanted” despite a person’s knowledge of the negative and unpleasant consequences of the drug.

“Our findings highlight what it means to say that drugs hijack our natural reward system,” said Robinson, who authored the new study with Kent Berridge, James Olds Collegiate Professor of Psychology and Neuroscience.

Filed under nucleus accumbens brain activity desires reward system psychology neuroscience science
