Neuroscience

Articles and news from the latest research reports.

Posts tagged sensory perception


Mind-controlled prosthetic arms that work in daily life are now a reality
In January 2013 a Swedish arm amputee became the first person in the world to receive a prosthesis with a direct connection to bone, nerves and muscles. An article about this achievement and its long-term stability has now been published in the journal Science Translational Medicine.
“Going beyond the lab to allow the patient to face real-world challenges is the main contribution of this work,” says Max Ortiz Catalan, research scientist at Chalmers University of Technology and lead author of the publication.
“We have used osseointegration to create a long-term stable fusion between man and machine, where we have integrated them at different levels. The artificial arm is directly attached to the skeleton, thus providing mechanical stability. Then the human’s biological control system, that is nerves and muscles, is also interfaced to the machine’s control system via neuromuscular electrodes. This creates an intimate union between the body and the machine; between biology and mechatronics.”
The direct skeletal attachment is created by what is known as osseointegration, a technology in limb prostheses pioneered by associate professor Rickard Brånemark and his colleagues at Sahlgrenska University Hospital. Rickard Brånemark led the surgical implantation and collaborated closely with Max Ortiz Catalan and Professor Bo Håkansson at Chalmers University of Technology on this project.
The patient’s arm was amputated over ten years ago. Before the surgery, his prosthesis was controlled via electrodes placed over the skin. Robotic prostheses can be very advanced, but such a control system makes them unreliable and limits their functionality, and patients commonly reject them as a result.
Now the patient has been given a control system that is directly connected to his own. He has a physically demanding job as a truck driver in northern Sweden, and since the surgery he has found that he can cope with every situation he faces: everything from clamping his trailer load and operating machinery to unpacking eggs and tying his children’s skates, regardless of the environmental conditions.
The patient is also one of the first in the world to take part in an effort to achieve long-term sensation via the prosthesis. Because the implant is a bidirectional interface, it can also be used to send signals in the opposite direction – from the prosthetic arm to the brain. The researchers’ next step is to implement their findings on sensory feedback in clinical use.
“Reliable communication between the prosthesis and the body has been the missing link for the clinical implementation of neural control and sensory feedback, and this is now in place,” says Max Ortiz Catalan. “So far we have shown that the patient has a long-term stable ability to perceive touch in different locations in the missing hand. Intuitive sensory feedback and control are crucial for interacting with the environment, for example to reliably hold an object despite disturbances or uncertainty. Today, no patient walks around with a prosthesis that provides such information, but we are working towards changing that in the very short term.”
The researchers plan to treat more patients with the novel technology later this year.
“We see this technology as an important step towards more natural control of artificial limbs,” says Max Ortiz Catalan. “It is the missing link for allowing sophisticated neural interfaces to control sophisticated prostheses. So far, this has only been possible in short experiments within controlled environments.”

Filed under prosthetics artificial limbs sensory perception osseointegration neuroscience science


Pleasant Smells Increase Facial Attractiveness

New research from the Monell Chemical Senses Center reveals that women’s faces are rated as more attractive in the presence of pleasant odors. In contrast, odor pleasantness had less effect on the evaluation of age. The findings suggest that the use of scented products such as perfumes may, to some extent, alter how people perceive one another.


“Odor pleasantness and facial attractiveness integrate into one joint emotional evaluation,” said lead author Janina Seubert, PhD, a cognitive neuroscientist who was a postdoctoral fellow at Monell at the time the research was conducted. “This may indicate a common site of neural processing in the brain.”

Perfumes and scented products have been used for centuries to enhance personal appearance. Previous studies had shown that the perception of facial attractiveness can be influenced by unpleasant versus pleasant odors. However, it was not known whether odors influence the actual visual perception of facial features or, alternatively, how faces are emotionally evaluated by the brain.

The current study design centered on the principle that judging attractiveness and judging age involve two distinct perceptual processes: attractiveness is regarded as an emotional judgment, while judgments of age are believed to be cognitive, or rationally based.

In the study, published in open access journal PLOS ONE, 18 young adults, two thirds of whom were female, were asked to rate the attractiveness and age of eight female faces, presented as photographs. The images varied in terms of natural aging features.

While the participants evaluated the images, one of five odors was simultaneously released. The odors were blends of fish oil (unpleasant) and rose oil (pleasant) that ranged from predominantly fish oil to predominantly rose oil. The subjects were asked to rate the age of the face in the photograph, the attractiveness of the face and the pleasantness of the odor.

Across the range of odors, odor pleasantness directly influenced ratings of facial attractiveness. This suggests that olfactory and visual cues independently influence judgments of facial attractiveness.

With regard to the cognitive task of age evaluation, visual age cues (more wrinkles and blemishes) were linked to older age perception. However, odor pleasantness had a mixed effect. Visual age cues strongly influenced age perception during pleasant odor stimulation, making older faces look older and younger faces look younger. This effect was weakened in the presence of unpleasant odors, so that younger and older faces were perceived to be more similar in age.

Jean-Marc Dessirier, Lead Scientist at Unilever and a co-author on the study said, “These findings have fascinating implications in terms of how pleasant smells may help enhance natural appearance within social settings. The next step will be to see if the findings extend to evaluation of male facial attractiveness.”

(Source: monell.org)

Filed under facial attractiveness smell odor pleasantness sensory perception face perception psychology neuroscience science


Taste Test: Could sense of taste affect length of life?

Perhaps one of the keys to good health isn’t just what you eat but how you taste it.


Taste buds – yes, the same ones you may blame for that sweet tooth or French fry craving – may in fact have a powerful role in a long and healthy life – at least for fruit flies, say two new studies that appear in the Proceedings of the National Academy of Sciences of the United States of America.

Researchers from the University of Michigan, Wayne State University and the Friedrich Miescher Institute for Biomedical Research in Switzerland found that suppressing the animal’s ability to taste its food – regardless of how much it actually eats – can significantly increase or decrease its length of life and potentially promote healthy aging.
 
Bitter tastes had negative effects on lifespan, sweet tastes had positive effects, and the ability to taste water had the most significant impact: flies that could not taste water lived up to 43% longer than other flies. The findings suggest that in fruit flies, the loss of taste may cause physiological changes that help the body adapt to the perception that it is not getting adequate nutrients.

In the case of flies whose loss of water taste led to a longer life, authors say the animals may attempt to compensate for a perceived water shortage by storing greater amounts of fat and subsequently using these fat stores to produce water internally. Further studies are planned to better explore how and why bitter and sweet tastes affect aging.

“This brings us further understanding about how sensory perception affects health. It turns out that taste buds are doing more than we think,” says senior author of the University of Michigan-led study Scott Pletcher, Ph.D., associate professor in the Department of Molecular and Integrative Physiology and research associate professor at the Institute of Gerontology.

“We know they’re able to help us avoid or be attracted to certain foods but in fruit flies, it appears that taste may also have a very profound effect on the physiological state and healthy aging.”
 
Pletcher conducted the study with lead author Michael Waterson, a Ph.D. student in U-M’s Cellular and Molecular Biology Program.

“Our world is shaped by the sensory abilities that help us navigate our surroundings, and by dissecting how this affects aging, we can lay the groundwork for new ideas to improve our health,” says senior author of the other study, Joy Alcedo, Ph.D., assistant professor in the Department of Biological Sciences at Wayne State University, formerly of the Friedrich Miescher Institute for Biomedical Research in Switzerland. Alcedo conducted the research with lead author Ivan Ostojic, Ph.D., of the Friedrich Miescher Institute for Biomedical Research in Switzerland.

Recent studies suggest that sensory perception may influence health-related characteristics such as athletic performance, type II diabetes, and aging. The two new studies, however, provide the first detailed look into the role of taste perception.

“These findings help us better understand the influence of sensory signals, which we now know not only tune an organism into its environment but also cause substantial changes in physiology that affect overall health and longevity,” Waterson says. “We need further studies to help us apply this knowledge to health in humans potentially through tailored diets favoring certain tastes or even pharmaceutical compounds that target taste inputs without diet alterations.”

(Source: uofmhealth.org)

Filed under taste taste buds sensory perception fruit flies lifespan aging neuroscience science


How the Brain Decides When to Work and When to Rest: Dissociation of Implicit-Reactive from Explicit-Predictive Computational Processes
A pervasive case of cost-benefit problem is how to allocate effort over time, i.e. deciding when to work and when to rest. An economic decision perspective would suggest that duration of effort is determined beforehand, depending on expected costs and benefits. However, the literature on exercise performance emphasizes that decisions are made on the fly, depending on physiological variables. Here, we propose and validate a general model of effort allocation that integrates these two views. In this model, a single variable, termed cost evidence, accumulates during effort and dissipates during rest, triggering effort cessation and resumption when reaching bounds. We assumed that such a basic mechanism could explain implicit adaptation, whereas the latent parameters (slopes and bounds) could be amenable to explicit anticipation. A series of behavioral experiments manipulating effort duration and difficulty was conducted in a total of 121 healthy humans to dissociate implicit-reactive from explicit-predictive computations. Results show 1) that effort and rest durations are adapted on the fly to variations in cost-evidence level, 2) that the cost-evidence fluctuations driving the behavior do not match explicit ratings of exhaustion, and 3) that actual difficulty impacts effort duration whereas expected difficulty impacts rest duration. Taken together, our findings suggest that cost evidence is implicitly monitored online, with an accumulation rate proportional to actual task difficulty. In contrast, cost-evidence bounds and dissipation rate might be adjusted in anticipation, depending on explicit task difficulty.
Full Article
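The accumulation-to-bound mechanism described in the abstract lends itself to a short simulation. The sketch below is a hypothetical Python illustration of the idea – the function name and all parameter values are ours, not taken from the paper:

```python
# Minimal sketch of the accumulation-to-bound model of effort allocation:
# a "cost evidence" variable rises during effort and dissipates during rest;
# hitting the upper bound triggers rest, hitting the lower bound resumes effort.
# Parameter values below are illustrative only.

def simulate(accum_rate, dissip_rate, lower, upper, t_max, dt=0.01):
    c = lower          # current cost-evidence level
    working = True     # start in the effort state
    t, t_switch = 0.0, 0.0
    effort_durations, rest_durations = [], []
    while t < t_max:
        # accumulate during effort, dissipate during rest
        c += (accum_rate if working else -dissip_rate) * dt
        if working and c >= upper:        # upper bound reached: stop working
            effort_durations.append(t - t_switch)
            working, t_switch = False, t
        elif not working and c <= lower:  # lower bound reached: resume work
            rest_durations.append(t - t_switch)
            working, t_switch = True, t
        t += dt
    return effort_durations, rest_durations

effort, rest = simulate(accum_rate=2.0, dissip_rate=1.0,
                        lower=0.0, upper=1.0, t_max=10.0)
```

In this toy version, a harder task maps onto a faster accumulation rate, which shortens effort bouts relative to rest bouts – echoing the paper’s finding that actual difficulty affects effort duration, while the bounds and dissipation rate are the parameters that could be adjusted in anticipation.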


Filed under decision making computational models sensory perception neuroscience science


Ultrasound directed to the human brain can boost sensory performance

Whales, bats, and even praying mantises use ultrasound as a sensory guidance system – and now a new study has found that ultrasound can modulate brain activity to heighten sensory perception in humans.
Virginia Tech Carilion Research Institute scientists have demonstrated that ultrasound directed to a specific region of the brain can boost performance in sensory discrimination. The study, published online Jan. 12 in Nature Neuroscience, provides the first demonstration that low-intensity transcranial focused ultrasound can modulate human brain activity to enhance perception.
“Ultrasound has great potential for bringing unprecedented resolution to the growing trend of mapping the human brain’s connectivity,” said William “Jamie” Tyler, an assistant professor at the Virginia Tech Carilion Research Institute, who led the study. “So we decided to look at the effects of ultrasound on the region of the brain responsible for processing tactile sensory inputs.”
The scientists delivered focused ultrasound to an area of the cerebral cortex that corresponds to processing sensory information received from the hand. To stimulate the median nerve – a major nerve that runs down the arm and the only one that passes through the carpal tunnel – they placed a small electrode on the wrist of human volunteers and recorded their brain responses using electroencephalography, or EEG. Then, just before stimulating the nerve, they began delivering ultrasound to the targeted brain region.
The scientists found that the ultrasound both decreased the EEG signal and weakened the brain waves responsible for encoding tactile stimulation.
The scientists then administered two classic neurological tests: the two-point discrimination test, which measures a subject’s ability to distinguish whether two nearby objects touching the skin are truly two distinct points, rather than one; and the frequency discrimination task, a test that measures sensitivity to the frequency of a chain of air puffs.
What the scientists found was unexpected.
The subjects receiving ultrasound showed significant improvements in their ability to distinguish pins at closer distances and to discriminate small frequency differences between successive air puffs.
“Our observations surprised us,” said Tyler. “Even though the brain waves associated with the tactile stimulation had weakened, people actually got better at detecting differences in sensations.”
Why would suppression of brain responses to sensory stimulation heighten perception? Tyler speculates that the ultrasound affected an important neurological balance.
“It seems paradoxical, but we suspect that the particular ultrasound waveform we used in the study alters the balance of synaptic inhibition and excitation between neighboring neurons within the cerebral cortex,” Tyler said. “We believe focused ultrasound changed the balance of ongoing excitation and inhibition processing sensory stimuli in the brain region targeted and that this shift prevented the spatial spread of excitation in response to stimuli resulting in a functional improvement in perception.”
To understand how well they could pinpoint the effect, the research team moved the acoustic beam one centimeter in either direction of the original site of brain stimulation – and the effect disappeared.
“That means we can use ultrasound to target an area of the brain as small as the size of an M&M,” Tyler said. “This finding represents a new way of noninvasively modulating human brain activity with a better spatial resolution than anything currently available.”
Based on the findings of the current study and an earlier one, the researchers concluded that ultrasound has a greater spatial resolution than two other leading noninvasive brain stimulation technologies – transcranial magnetic stimulation, which uses magnets to activate the brain, and transcranial direct current stimulation, which uses weak electrical currents delivered directly to the brain through electrodes placed on the head.
“Gaining a better understanding of how pulsed ultrasound affects the balance of synaptic inhibition and excitation in targeted brain regions – as well as how it influences the activity of local circuits versus long-range connections – will help us make more precise maps of the richly interconnected synaptic circuits in the human brain,” said Wynn Legon, the study’s first author and a postdoctoral scholar at the Virginia Tech Carilion Research Institute. “We hope to continue to extend the capabilities of ultrasound for noninvasively tweaking brain circuits to help us understand how the human brain works.”
“The work by Jamie Tyler and his colleagues is at the forefront of the coming tsunami of developing new safe yet effective noninvasive ways to modulate the flow of information in cellular circuits within the living human brain,” said Michael Friedlander, executive director of the Virginia Tech Carilion Research Institute and a neuroscientist who specializes in brain plasticity. “This approach is providing the technology and proof of principle for precise activation of neural circuits for a range of important uses, including potential treatments for neurodegenerative disorders, psychiatric diseases, and behavioral disorders. Moreover, it arms the neuroscientific community with a powerful new tool to explore the function of the healthy human brain, helping us understand cognition, decision-making, and thought. This is just the type of breakthrough called for in President Obama’s BRAIN Initiative to enable dramatic new approaches for exploring the functional circuitry of the living human brain and for treating Alzheimer’s disease and other disorders.”
A team of Virginia Tech Carilion Research Institute scientists – including Tomokazu Sato, Alexander Opitz, Aaron Barbour, and Amanda Williams, along with Virginia Tech graduate student Jerel Mueller of Raleigh, N.C. – joined Tyler and Legon in conducting the research. In addition to his position at the institute, Tyler is an assistant professor of biomedical engineering and sciences at the Virginia Tech–Wake Forest University School of Biomedical Engineering and Sciences. In 2012, he shared a Technological Innovation Award from the McKnight Endowment for Neuroscience to work on developing ultrasound as a noninvasive tool for modulating brain activity.
“In neuroscience, it’s easy to disrupt things,” said Tyler. “We can distract you, make you feel numb, trick you with optical illusions. It’s easy to make things worse, but it’s hard to make them better. These findings make us believe we’re on the right path.”


Filed under somatosensory cortex ultrasound sensory perception brain activity neuroscience science


Missing “brake in the brain” can trigger anxiety states

Fear, at the right level, can increase alertness and protect against dangers. Disproportionate fear, on the other hand, can disrupt sensory perception, be disabling, reduce happiness and thereby become a danger in itself. Anxiety disorders are therefore psychiatric conditions that should not be underestimated. In these disorders, the fear is so strong that it causes tremendous psychological strain and makes living a normal life appear impossible. Researchers at the MedUni Vienna have now found a possible explanation of how social phobias and fear can be triggered in the brain: a missing inhibitory connection, or missing “brake”, in the brain.

Inside the brain, the amygdala and the orbitofrontal cortex in the frontal lobe form an important control circuit for regulating the emotions. This control circuit is termed the brain’s emotional control centre. In healthy subjects, this circuit shows “negative feedback” that keeps it calm. When the scientists used functional magnetic resonance imaging (fMRI) on people with social phobias, however, they found the opposite to be true: an important inhibitory connection is altered in these patients, which may explain why they are unable to control their fears.

In collaboration with the Centre for Medical Physics and Biomedical Technology and the University Department of Psychiatry and Psychotherapy at the MedUni Vienna, the research team led by Christian Windischberger was also able to discover, through its recent study at the MedUni Vienna’s High Field MR Centre of Excellence, how the areas of the brain involved in processing emotions are able to influence each other.

The study participants were shown a series of “emotional faces” while undergoing functional magnetic resonance imaging (fMRI), a non-invasive method that uses radio waves and magnetic fields to measure changes in blood oxygen levels, and therefore neuronal activity, in individual regions of the brain. An analysis method developed at University College London was used to provide new perspectives on the data obtained.

Breaking the circle of fear
When emotional facial expressions were shown - from laughing to crying, from happiness to anger - neuronal activity was triggered in the brain. Outwardly, the test subjects looked no different, but the healthy subjects stayed calm thanks to their automatic “brake”, despite the emotional nature of the images. For the people with social phobia, on the other hand, the photographs put their brains into “overdrive”, triggering very strong neuronal activity. The new analysis method demonstrated this very clearly: “We have the opportunity not only to localise brain activity and compare it between groups, but we can now also make statements regarding functional connections within the brain. In psychiatric conditions especially, we can assume that there are not complete failures of these connections going on, but rather imbalances in complex regulatory processes,” says Ronald Sladky, the study’s lead author.

This better understanding of the neuronal mechanisms involved will now be used to develop new approaches to treatment. The aim is to understand what effect medications and psychotherapeutic support have on the networks involved, in order to help patients break out of their circles of fear.

(Source: meduniwien.ac.at)

Filed under anxiety anxiety disorders sensory perception orbitofrontal cortex amygdala fear psychology neuroscience science

189 notes

Study Shows a Solitary Mutation Can Destroy Critical ‘Window’ of Early Brain Development 
Scientists from the Florida campus of The Scripps Research Institute (TSRI) have shown in animal models that brain damage caused by the loss of a single copy of a gene during very early childhood development can cause a lifetime of behavioral and intellectual problems.
The study, published this week in the Journal of Neuroscience, sheds new light on the early development of neural circuits in the cortex, the part of the brain responsible for functions such as sensory perception, planning and decision-making.
The research also pinpoints the mechanism responsible for the disruption of what are known as “windows of plasticity” that contribute to the refinement of the neural connections that broadly shape brain development and the maturing of perception, language, and cognitive abilities.
The key to normal development of these abilities is that the neural connections in the brain cortex—the synapses—mature at the right time.
In an earlier study, the team, led by TSRI Associate Professor Gavin Rumbaugh, found that in mice missing a single copy of the vital gene, certain synapses develop prematurely within the first few weeks after birth. This accelerated maturation dramatically increases “excitability”—how often brain cells fire—in the hippocampus, a part of the brain critical for memory. The delicate balance between excitability and inhibition is especially critical during early developmental periods. However, it remained a mystery how early maturation of brain circuits could lead to lifelong cognitive and behavioral problems.
The current study shows in mice that the interruption of the synapse-regulating gene known as SYNGAP1—which can cause a devastating form of intellectual disability and increase the risk for developing autism in humans—induces early functional maturation of neural connections in two areas of the cortex. The influence of this disruption is widespread throughout the developing brain and appears to degrade the duration of these critical windows of plasticity.
“In this study, we were able to directly connect early maturation of synapses to the loss of an important plasticity window in the cortex,” Rumbaugh said. “Early maturation of synapses appears to make the brain less plastic at critical times in development. Children with these mutations appear to have brains that were built incorrectly from the ground up.”
The accelerated maturation also appeared to occur surprisingly early in the developing cortex. That, Rumbaugh added, would correspond to the first two years of a child’s life, when the brain is expanding rapidly. “Our goal now is to figure out a way to prevent the damage caused by SYNGAP1 mutations. We would be more likely to help that child if we could intervene very early on—before the mutation has done its damage,” he said.

Filed under brain development neuroplasticity sensory perception hippocampus genetics neuroscience science

175 notes

A Confederacy of Senses
Research on multisensory speech perception in recent years has helped revolutionize our understanding of how the brain organizes the information it receives from our many different senses, UC Riverside psychology professor Lawrence D. Rosenblum writes in the January 2013 issue of Scientific American.
“Neuroscientists and psychologists have largely abandoned early ideas of the brain as a Swiss Army knife, in which many distinct regions are dedicated to different senses,” he says. “Instead scientists now think that the brain has evolved to encourage as much cross talk as possible between the senses — that the brain’s sensory regions are physically intertwined.”
The article, “A Confederacy of Senses,” explains how research in the past 15 years has demonstrated that no sense works alone. An abstract of the article can be read here.
“The multisensory revolution is also suggesting new ways to improve devices for the blind and deaf, such as cochlear implants,” Rosenblum writes. This research also has improved speech-recognition software, he says.
Researchers have discovered that the brain “does not channel visual information from the eyes into one neural container and auditory information from the ears into another, discrete, container as though it were sorting coins,” Rosenblum writes. “Rather our brains derive meaning from the world in as many ways as possible by blending the diverse forms of sensory perception.”
Rosenblum is the author of “See What I’m Saying: The Extraordinary Powers of Our Five Senses” (Norton, 2010), and has spent two decades studying multisensory perception, lipreading and hearing. His research has been supported by the National Science Foundation and the National Institutes of Health. He is known internationally for his research on the risks that the inaudibility of hybrid cars poses for blind and other pedestrians.

Filed under brain speech perception sensory perception psychology neuroscience science

188 notes


It Just Smells
If you play sounds of many different frequencies at the same time, they combine to produce neutral “white noise.” Neuroscientists say they have created an analogous generic scent by blending odors. Such “olfactory white” might rarely, if ever, be found in nature, but it could prove useful in research, other scientists say.
Using just a few hundred types of biochemical receptors, each of which responds to just a few odorants, the human nose can distinguish thousands of different odors. Yet humans can’t easily identify the individual components of a mixture, even when they can identify the odors alone, says Noam Sobel, a neuroscientist at the Weizmann Institute of Science in Rehovot, Israel. Now, he and his colleagues suggest, various blends made up of a large number of odors all begin to smell the same—even when the blends share no common components.
…
Although many scents—such as coffee, wine, roses, and dirty socks—are complex blends containing hundreds of components, they are very distinctive. At least two factors are responsible, Sobel says: The individual odorants are often chemically related, and often one or more of them is vastly more intense than the rest.
The team’s findings are “a clever piece of work that shows the olfactory system works exactly as we would predict from our current understanding of it,” says Tim Jacob, a neuroscientist at Cardiff University in the United Kingdom. “That is, if you stimulate every olfactory ‘channel’ to the same extent, the brain cannot characterize or identify a particular smell,” he notes.
“Olfactory white is a neat idea, and it draws interesting parallels to white light and white noise,” says Jay Gottfried, an olfactory neuroscientist at Northwestern University’s Feinberg School of Medicine in Chicago, Illinois. The new study “definitely adds new information about how the brain interprets odors,” he notes.
Even though olfactory white is not likely to be encountered in nature, the concept could be useful, Gottfried says. “Researchers have found that white noise is a useful stimulus in experiments to probe auditory responses,” he notes, and scientists probing the human sense of smell might find similar uses for olfactory white.
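
The sound analogy underlying “olfactory white” can be sketched numerically. The toy below (an illustration, not the study’s method; the frequency range, band count and component numbers are arbitrary choices) mixes equal-amplitude sinusoids drawn from two disjoint frequency sets and compares their coarse spectra, loosely analogous to perceptual channels. As the number of components grows, the two mixtures become increasingly similar even though they share no components, mirroring how dense odor blends converge toward a common smell.

```python
import numpy as np

def mixture_spectrum(freqs, fs=8000, dur=1.0):
    """Sum equal-amplitude sinusoids at the given frequencies (Hz)
    and return the magnitude spectrum of the resulting signal."""
    t = np.arange(0, dur, 1 / fs)
    sig = sum(np.sin(2 * np.pi * f * t) for f in freqs)
    return np.abs(np.fft.rfft(sig))

def band_energies(spectrum, n_bands=20):
    """Pool the fine-grained spectrum into broad bands, a crude
    stand-in for perceptual channels."""
    usable = len(spectrum) - len(spectrum) % n_bands
    return spectrum[:usable].reshape(n_bands, -1).sum(axis=1)

rng = np.random.default_rng(0)
pool = rng.permutation(np.arange(100, 3000))  # shuffled pool of frequencies

def similarity(n):
    """Correlation between two mixtures built from disjoint
    n-component frequency sets drawn from the pool."""
    a = band_energies(mixture_spectrum(pool[:n]))
    b = band_energies(mixture_spectrum(pool[n:2 * n]))
    return np.corrcoef(a, b)[0, 1]

# Sparse mixtures with no shared components sound distinct;
# dense ones converge toward the same "white" profile.
print(similarity(3), similarity(300))
```

Running this shows the 300-component mixtures correlating far more strongly than the 3-component ones, the same qualitative pattern the Weizmann team reported for odor blends.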

Filed under olfactory system olfactory white sensory perception smell odor neuroscience psychology science

92 notes

Learning a New Sense
Rats use a sense that humans don’t: whisking. They move their facial whiskers back and forth about eight times a second to locate objects in their environment. Could humans acquire this sense? And if they can, what could understanding the process of adapting to new sensory input tell us about how humans normally sense? At the Weizmann Institute, researchers explored these questions by attaching plastic “whiskers” to the fingers of blindfolded volunteers and asking them to carry out a location task. The findings, which recently appeared in the Journal of Neuroscience, have yielded new insight into the process of sensing, and they may point to new avenues in developing aids for the blind.

Filed under perception whiskers sensory perception neuroscience brain science
