Neuroscience

Articles and news from the latest research reports.

Posts tagged neuroscience

105 notes

A New Target for Alcoholism Treatment: Kappa Opioid Receptors

The list of brain receptor targets for opiates reads like a fraternity: Mu Delta Kappa. The mu opioid receptor is the primary target for morphine and endogenous opioids like endorphin, whereas the delta opioid receptor shows the highest affinity for endogenous enkephalins. The kappa opioid receptor (KOR) is perhaps the most intriguing, but the least understood, member of the opiate receptor family.

Until now, the mu opioid receptor has received the most attention in alcoholism research. Naltrexone, a drug approved by the U.S. Food and Drug Administration for the treatment of alcoholism, acts by blocking opiate action at brain receptors and is most potent at the mu opioid receptor. In addition, research has suggested that a variant of the gene that codes for the mu opioid receptor (OPRM1) may be associated with the risk for alcoholism and the response to naltrexone treatment.

However, naltrexone also acts at the kappa opioid receptor, and it has not been clear whether this action is relevant to alcoholism treatment.

A growing body of research in animals implicates the KOR in alcoholism. Stimulation of the KOR, which occurs with alcohol intake, is thought to produce unpleasant and aversive effects. This receptor is hypothesized to play a role in alcohol dependence, at least in part, by promoting negative reinforcement processes. In other words, the theory postulates that during development of alcohol dependence, the KOR system becomes overstimulated, producing dysphoria and anhedonia, which then leads to further alcohol seeking and escalation of alcohol intake that serves to self-medicate those negative symptoms.

A new study in Biological Psychiatry, led by Dr. Brendan Walker at Washington State University, used a rat model of alcohol dependence to directly investigate the KOR system following chronic alcohol exposure and withdrawal.

They found that the KOR system is dysregulated in the amygdala of alcohol-dependent rats. The amygdala is a vital brain region with many functions, including regulation of emotional behavior and decision-making, and chronic alcohol consumption is known to cause neuroadaptations there. In this study specifically, the researchers found increased dynorphin A levels and increased KOR signaling in that region.

When the rats were in acute alcohol withdrawal, the researchers administered different drugs, each of which targets the KOR system in a precise way, directly into the amygdala. Using this site-specific antagonism, they observed that alcohol dependence-related KOR dysregulation directly contributes to the excessive alcohol consumption that occurs during withdrawal.

“These data provide important new support for the hypothesis that kappa opioid receptor blockers might play a role in the treatment of alcoholism,” said Dr. John Krystal, Editor of Biological Psychiatry. “This study suggests that one role might be to prevent a relapse to alcohol use among patients recently withdrawn from alcohol.”

“This dataset demonstrates the extensive nature of the neuroadaptations the brain undergoes when chronically exposed to alcohol. The implications of these results are far reaching and should help guide pharmacotherapeutic development efforts for the treatment of alcohol use disorders,” said Walker. “Pharmacological compounds that alleviate the negative emotional / mood states that accompany alcohol withdrawal, by attenuating the excessive signaling in the dynorphin / kappa-opioid receptor system, should result in enhanced treatment compliance and facilitate the transition away from alcohol dependence.”

Additional extensive research will be necessary to identify and test the effectiveness of specific drugs that act on the KOR system, but these findings provide researchers with a potentially successful path forward to developing new drugs for the treatment of alcoholism.

Filed under alcohol alcohol dependence opioid receptors amygdala neuroscience science

139 notes

Visual hallucinations more common than previously thought

Vivid hallucinations experienced by people with sight loss last far longer and have more serious consequences than previously thought, according to new research from King’s College London and the Macular Society. 

The study is the largest survey of the phenomenon, known as Charles Bonnet Syndrome, and documented the experiences of 492 visually impaired people who had experienced visual hallucinations. The findings, published in the British Journal of Ophthalmology, show there is a serious discrepancy between medical opinion and the realities of the condition.

Charles Bonnet Syndrome is widely considered by the medical profession to be benign and short-lived. However, the new research shows that 80% of respondents had hallucinations for five years or more and 32% found them predominantly unpleasant, distressing and negative. 

The study described this group of people as having “negative outcome Charles Bonnet Syndrome”. The group was more likely to have frequent, fear-inducing, longer-lasting hallucinations, which affected daily activities. They were more likely to attribute hallucinations to serious mental illness and were less likely to have been warned about the possibility of hallucinations before they started.

Of respondents, 38% regarded their hallucinations as startling, terrifying or frightening when they first occurred, and 46% said hallucinations had an effect on their ability to complete daily tasks. Of those who discussed the issue with a medical professional, 36% said the professional was “unsure or did not know” about the diagnosis.
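
For a rough sense of scale, the percentages above can be converted back into approximate headcounts of the 492 respondents. This is a back-of-the-envelope sketch only; the published paper reports the exact subgroup sizes, which may differ slightly from these rounded figures:

```python
# Approximate headcounts implied by the survey percentages
# (illustrative only; the published subgroup sizes may differ slightly).
n = 492  # visually impaired respondents who had experienced hallucinations

percentages = {
    "hallucinations lasting five years or more": 0.80,
    "predominantly unpleasant/distressing": 0.32,
    "startling or frightening at onset": 0.38,
    "hallucinations affected daily tasks": 0.46,
}

for label, p in percentages.items():
    print(f"{label}: ~{round(n * p)} of {n} respondents")
```

The 36% figure is excluded here because it applies only to the subset who discussed the issue with a medical professional, and that subgroup's size is not given in the text.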

Dr Dominic ffytche, who led the research at the Institute of Psychiatry at King’s, says:  “Charles Bonnet Syndrome has been traditionally thought of as benign. Indeed, it has been questioned whether it should even be considered a medical condition given it does not cause problems and goes away by itself. The results of our survey paint a very different picture.

“With no specific treatments for Charles Bonnet Syndrome, the survey highlights the importance of raising awareness to reduce the distress it causes, particularly before symptoms start. All people with Charles Bonnet Syndrome are relieved or reassured to find out about the cause of their hallucinations and our evidence shows the knowledge may help reduce negative outcome.”

People with macular disease are particularly prone to Charles Bonnet hallucinations. They are thought to be a reaction of the brain to the loss of visual stimulation. More than half of people with severe sight loss experience them but many do not tell others for fear they will be thought to have a serious mental illness. 

Age-related macular degeneration (AMD) affects the central vision and is the most common cause of sight loss in the UK. Nearly 600,000 people have late-stage AMD today, and more people will become affected as our population ages. Around half will have hallucinations at some stage.

Tony Rucinski, Chief Executive, the Macular Society, said: “It is essential that people affected by sight loss are given information about Charles Bonnet Syndrome at diagnosis or as soon after as possible. 

“Losing your sight is bad enough without the fear that you have something like dementia as well. We need medical professionals to recognise the seriousness of Charles Bonnet Syndrome and ensure that people don’t suffer unnecessarily. More research is also needed to investigate Charles Bonnet Syndrome and possible ways of reducing its impact.”

Dr ffytche is also leading a large NIHR funded research programme on visual hallucinations to develop a much-needed evidence base to inform NHS practice in managing and treating the symptoms. 

Filed under hallucinations Charles Bonnet Syndrome vision visual impairment neuroscience science

654 notes

How the gut feeling shapes fear

An unlit, deserted car park at night, footsteps in the gloom. The heart beats faster and the stomach ties itself in knots. We often feel threatening situations in our stomachs. While the brain has long been viewed as the centre of all emotions, researchers are increasingly trying to get to the bottom of this proverbial gut instinct.

It is not only the brain that controls processes in our abdominal cavity; our stomach also sends signals back to the brain. At the heart of this dialogue between the brain and abdomen is the vagus nerve, which transmits signals in both directions – from the brain to our internal organs (via the so-called efferent nerves) and from the stomach back to our brain (via the afferent nerves). By cutting the afferent nerve fibres in rats, a team of scientists led by Urs Meyer, a researcher in the group of ETH Zurich professor Wolfgang Langhans, turned this two-way communication into a one-way street, enabling the researchers to isolate the role played by gut instinct. In the test animals, the brain was still able to control processes in the abdomen, but no longer received any signals from the other direction.

Less fear without gut instinct

In the behavioural studies, the researchers determined that the rats were less wary of open spaces and bright lights compared with control rats with an intact vagus nerve. “The innate response to fear appears to be influenced significantly by signals sent from the stomach to the brain,” says Meyer.

Nevertheless, the loss of their gut instinct did not make the rats completely fearless: the situation for learned fear behaviour looked different. In a conditioning experiment, the rats learned to link a neutral acoustic stimulus – a sound – to an unpleasant experience. Here, the signal path between the stomach and brain appeared to play no role, with the test animals learning the association as well as the control animals. If, however, the researchers switched from a negative to a neutral stimulus, the rats without gut instinct required significantly longer to associate the sound with the new, neutral situation. This also fits with the results of a recently published study conducted by other researchers, which found that stimulation of the vagus nerve facilitates relearning, says Meyer.
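The slower relearning seen in the vagotomised rats can be pictured with a textbook associative-learning model. The sketch below is not the authors' analysis; it uses the classic Rescorla–Wagner update rule, with a reduced learning rate as one purely hypothetical way body-to-brain signals might act, to show how extinction of a learned association can lag while acquisition is unaffected:

```python
# Rescorla-Wagner toy model of fear conditioning and extinction.
# V is the associative strength between the tone and the aversive outcome.
# The lower extinction learning rate for the "vagotomy" group is an
# illustrative assumption, not a measured parameter from the study.

def train(v, alpha, outcome, trials):
    """Apply the Rescorla-Wagner update v += alpha * (outcome - v) repeatedly."""
    for _ in range(trials):
        v += alpha * (outcome - v)
    return v

# Acquisition: tone paired with an aversive outcome (target strength = 1.0).
# Both groups learn equally well, matching the reported result.
v_control = train(0.0, alpha=0.3, outcome=1.0, trials=20)
v_vagotomy = train(0.0, alpha=0.3, outcome=1.0, trials=20)

# Extinction: tone now paired with a neutral outcome (target strength = 0.0).
v_control = train(v_control, alpha=0.3, outcome=0.0, trials=10)
v_vagotomy = train(v_vagotomy, alpha=0.1, outcome=0.0, trials=10)  # slower unlearning

print(f"after extinction: control={v_control:.3f}, vagotomy={v_vagotomy:.3f}")
```

After the same number of extinction trials, the simulated vagotomy group retains much more of the fear association, mirroring the pattern the researchers describe.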

These findings are also of interest to the field of psychiatry, as post-traumatic stress disorder (PTSD), for example, is linked to the association of neutral stimuli with fear triggered by extreme experiences. Stimulation of the vagus nerve could help people with PTSD to once more associate the triggering stimuli with neutral experiences. Vagus nerve stimulation is already used today to treat epilepsy and, in some cases, depression.

Stomach influences signalling in the brain

“A lower level of innate fear, but a longer retention of learned fear – this may sound contradictory,” says Meyer. However, innate and conditioned fear are two different behavioural domains that involve different signalling systems in the brain. On closer investigation of the rats’ brains, the researchers found that the loss of signals from the abdomen changes the production of certain signalling substances, so-called neurotransmitters, in the brain.

“We were able to show for the first time that the selective interruption of the signal path from the stomach to the brain changed complex behavioural patterns. This has traditionally been attributed to the brain alone,” says Meyer. The study shows clearly that the stomach also has a say in how we respond to fear; however, what it says, i.e. precisely what it signals, is not yet clear. The researchers hope, however, that they will be able to further clarify the role of the vagus nerve and the dialogue between brain and body in future studies.

Filed under fear anxiety gut feeling emotions vagus nerve neuroscience science

101 notes

Screening for Autism: There’s an App for That

Most schools across the United States provide simple vision tests to their students—not to prescribe glasses, but to identify potential problems and recommend a trip to the optometrist. Researchers are now on the cusp of providing the same kind of service for autism.

Researchers at Duke University have developed software that tracks and records infants’ activity during videotaped autism screening tests. Their results show that the program is as good at spotting behavioral markers of autism as experts giving the test themselves, and better than non-expert medical clinicians and students in training.

The results appear online in the journal Autism Research and Treatment.

“We’re not trying to replace the experts,” said Jordan Hashemi, a graduate student in computer and electrical engineering at Duke. “We’re trying to transfer the knowledge of the relatively few autism experts available into classrooms and homes across the country. We want to give people tools they don’t currently have, because research has shown that early intervention can greatly impact the severity of the symptoms common in autism spectrum disorders.”

The study focused on three behavioral tests that can help identify autism in very young children.

In one test, an infant’s attention is drawn to a toy being shaken on the left side and then redirected to a toy being shaken on the right side. Clinicians count how long it takes for the child’s attention to shift in response to the changing stimulus. The second test passes a toy across the infant’s field of view and looks for any delay in the child tracking its motion. In the last test, a clinician rolls a ball to a child and looks for eye contact afterward—a sign of the child’s engagement with their play partner.

In all of the tests, the person administering them isn’t just controlling the stimulus; he or she is also counting how long it takes for the child to react – an imprecise science at best. The new program frees testers from taking measurements while also providing more accuracy, recording reaction times down to tenths of a second.
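
The timing measurement itself is straightforward once stimulus and response events are tagged on the video timeline. A minimal sketch of that idea (the frame rate and frame numbers here are hypothetical; the Duke software's actual pipeline does automated tracking of the infant's movements):

```python
# Reaction time from video frame indices: once software tags the frame where
# the stimulus changes and the frame where the infant's gaze shifts, latency
# is simply the frame difference divided by the frame rate.

FPS = 30  # a common camcorder frame rate (assumed here, not from the study)

def reaction_time(stimulus_frame, response_frame, fps=FPS):
    """Latency in seconds between stimulus onset and the child's response."""
    if response_frame < stimulus_frame:
        raise ValueError("response cannot precede the stimulus")
    return (response_frame - stimulus_frame) / fps

# Hypothetical example: toy switches sides at frame 120, gaze shifts at frame 141.
latency = reaction_time(120, 141)
print(f"attention shift latency: {latency:.2f} s")  # 0.70 s at 30 fps
```

At 30 frames per second, each frame is about 33 ms, which is how frame-accurate tagging delivers the tenth-of-a-second precision mentioned above.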

“The great benefit of the video and software is for general practitioners who do not have the trained eye to look for subtle early warning signs of autism,” said Amy Esler, an assistant professor of pediatrics and autism researcher at the University of Minnesota, who participated in some of the trials highlighted in the paper.

“The software has the potential to automatically analyze a child’s eye gaze, walking patterns or motor behaviors for signs that are distinct from typical development,” Esler said. “These signs would signal to doctors that they need to refer a family to a specialist for a more detailed evaluation.”

According to Hashemi and his adviser, Guillermo Sapiro, professor of electrical and computer engineering and biomedical engineering at Duke, because the program is non-invasive, it could be useful immediately in homes and clinics. Neither, however, expects it to become widely used—not because clinicians, teachers and parents aren’t willing, but because the researchers are working on an even more practical solution.

Later this year, the Duke team (which includes students and faculty from engineering and psychiatry) plans to test a new tablet application that could do away with the need for a person to administer any tests at all. The program would watch for physical and facial responses to visual cues played on the screen, analyze the data and automatically report any potential red flags. Any parent, teacher or clinician would simply need to download the app and sit their child down in front of it for a few minutes.

The efforts are part of the Information Initiative at Duke, which connects researchers from disparate fields to experts in computer programming to help analyze large data sets.

“We’re currently working with autism experts at Duke Medicine to determine what sorts of easy tests could be used on just a computer or tablet screen to spot any potential concerns,” said Sapiro. “The goal is to mimic the same sorts of social interactions that the tests with the toys and balls measure, but without the toys and balls. The research has shown that the earlier autism can be spotted, the more beneficial intervention can be. And we want to provide everyone in the world with the ability to spot those signs as early as possible.”

(Source: pratt.duke.edu)

Filed under autism infants social interaction eye movements attention ASD neuroscience science

98 notes

New insights could boost treatment for P addiction

A Kiwi researcher’s discovery of new ways methamphetamine can alter the brain could help the development of new drug-based therapies for addiction treatment.

In 2009, New Zealand had one of the highest rates of P users in the world, and today more than 25,000 Kiwis are estimated to still be using the drug.

Now, new research by a Victoria University of Wellington graduate has provided valuable insights into how the brain’s natural reward pathways are strongly stimulated following exposure to methamphetamine.

Filed under methamphetamine addiction reward system genetics psychology neuroscience science

156 notes

Neuroscientists discover adaptation mechanisms of the brain when perceiving letters of the alphabet

The headlights – two eyes, the radiator cowling – a smiling mouth: this is how our brain sometimes creates a face out of a car front. The same happens with other objects: in house facades, trees or stones, a “human face” can often be detected as well. Prof. Dr. Gyula Kovács from Friedrich Schiller University Jena (Germany) knows the reason why. “Faces are of tremendous importance for human beings,” the neuroscientist explains. That is why, in the course of evolution, our visual perception has specialized in the recognition of faces in particular. “This sometimes even goes as far as us recognizing faces when there are none at all.”

Until now, researchers assumed that this phenomenon was an exception that applies only to faces. But, as Prof. Kovács and his colleague Mareike Grotheer were able to show in a new study, these distinct adaptation mechanisms are not restricted to the perception of faces. In The Journal of Neuroscience, the Jena researchers demonstrate that the effect can also occur in the perception of letters.

Filed under visual perception learning brain activity repetition suppression adaptation neuroscience science

146 notes

Biologists Identify New Neural Pathway in Eyes that Aids in Vision

A type of retina cell plays a more critical role in vision than previously known, a team led by Johns Hopkins University researchers has discovered.

Working with mice, the scientists found that the ipRGCs – an atypical type of photoreceptor in the retina – help detect contrast between light and dark, a crucial element in the formation of visual images. The key to the discovery is the fact that the cells express melanopsin, a type of photopigment that undergoes a chemical change when it absorbs light.

“We are quite excited that melanopsin signaling contributes to vision even in the presence of functional rods and cones,” postdoctoral fellow Tiffany M. Schmidt said.

Schmidt is lead author of a recently published study in the journal Neuron. The senior author is Samer Hattar, associate professor of biology in the university’s Krieger School of Arts and Sciences. Their findings have implications for future studies of blindness or impaired vision.

Rods and cones are the best-known photoreceptors in the retina, activating in different light environments. Rods, of which there are about 120 million in the human eye, are highly sensitive to light and turn on in dim or low-light environments. Meanwhile, the 6 million to 7 million cones in the eye are less sensitive to light; they drive vision in brighter light conditions and are essential for color detection.

Rods and cones were thought to be the only light-sensing photoreceptors in the retina until about a decade ago when scientists discovered a third type of retinal photoreceptor – the ipRGC, or intrinsically photosensitive retinal ganglion cell – that contains melanopsin. Those cells were thought to be needed exclusively for detecting light for non-image-dependent functions, for example, to control synchronization of our internal biological clocks to daytime and the constriction of our pupils in response to light.

“Rods and cones were thought to mediate vision and ipRGCs were thought to mediate these simple light-detecting functions that happen outside of conscious perception,” Schmidt said. “But our experiments revealed that ipRGCs influence a greater diversity of behaviors than was previously known and actually contribute to an important aspect of image-forming vision, namely contrast detection.”

The Johns Hopkins team, along with other scientists, conducted several experiments with mice and found that when melanopsin was present in the retinal ganglion cells, the mice were better able to see contrast in a Y-shaped maze, known as the visual water task test. In the test, mice are trained to associate a pattern with a hidden platform that allows them to escape the water. Mice that had the melanopsin gene intact had higher contrast sensitivity than mice that lacked the gene.

“Melanopsin signaling is essential for full contrast sensitivity in mouse visual functions,” said Hattar. “The ipRGCs and melanopsin determine the threshold for detecting edges in the visual scene, which means that visual functions that were thought to be solely mediated by rods and cones are now influenced by this system. The next step is to determine if melanopsin plays a similar role in the human retina for image-forming visual functions.”
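Contrast sensitivity in grating-based tasks like the visual water maze is conventionally defined via Michelson contrast. The sketch below shows that standard formula (it is not code from the study, and the luminance values are invented for illustration):

```python
# Michelson contrast of a grating: (Lmax - Lmin) / (Lmax + Lmin).
# Contrast sensitivity is conventionally the reciprocal of the lowest
# contrast the observer can still discriminate from uniform grey.

def michelson_contrast(l_max, l_min):
    """Contrast between the brightest and darkest bars of a grating."""
    if l_max < l_min or l_max + l_min == 0:
        raise ValueError("need l_max >= l_min and nonzero total luminance")
    return (l_max - l_min) / (l_max + l_min)

# Hypothetical faint grating: bars at 55 and 45 cd/m^2 -> 10% contrast.
c = michelson_contrast(55.0, 45.0)
print(f"contrast = {c:.2f}, sensitivity = {1.0 / c:.1f}")
```

In these terms, the finding is that mice lacking melanopsin need a higher minimum contrast (a lower sensitivity) to detect edges in the scene.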

(Source: releases.jhu.edu)

Filed under vision photoreceptors retina melanopsin retinal ganglion cell neuroscience science

127 notes

Rhythmic bursts of electrical activity from cells in ear teach brain how to hear

A precise rhythm of electrical impulses transmitted from cells in the inner ear coaches the brain how to hear, according to a new study led by researchers at the University of Pittsburgh School of Medicine. They report the first evidence of this developmental process today in the online version of Neuron.

The ear generates spontaneous electrical activity to trigger a response in the brain before hearing actually begins, said senior investigator Karl Kandler, Ph.D., professor of otolaryngology and neurobiology, Pitt School of Medicine. These patterned bursts start at inner hair cells in the cochlea, which is part of the inner ear, and travel along the auditory nerve to the brain.

"It’s long been speculated that these impulses are intended to ‘wire’ the brain auditory centers," he said. "Until now, however, no one has been able to provide experimental evidence to support this concept."

To map neural connectivity, Dr. Kandler’s team prepared sections of a mouse brain containing the auditory pathways in a chemical that is inert until UV light hits it. They then pulsed laser light at a neuron, activating the chemical and exciting the nerve cell to generate an electrical impulse. By tracking the spread of that impulse to connected cells, they could map the network one neuron at a time.
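The logic of that mapping step can be caricatured in a few lines (everything here is invented for illustration; the actual protocol and analysis are far more involved): photostimulate each grid point in turn while recording from one target neuron, and score points whose stimulation evokes a response above some threshold as presumed inputs.

```python
def map_inputs(grid_points, record_response, threshold_pa=5.0):
    """Uncaging-style mapping sketch: stimulate each grid point while
    recording from a single target neuron; keep points whose evoked
    current exceeds the (hypothetical) detection threshold in pA."""
    return [p for p in grid_points if record_response(p) > threshold_pa]

# Hypothetical evoked-current amplitudes (pA) at a 3x2 grid of points.
responses = {(0, 0): 0.4, (0, 1): 12.0, (1, 0): 0.9,
             (1, 1): 30.5, (2, 0): 1.1, (2, 1): 0.2}
inputs = map_inputs(responses, responses.get)
# Only the two points with clearly suprathreshold responses survive.
```

Repeating this for many target neurons is what builds up the wiring diagram whose sharpening the study then follows over development.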

All mice are born unable to hear; the sense develops around two weeks after birth. But even before hearing starts, the ear produces rhythmic bursts of electrical activity that cause a broad reaction in the brain’s auditory processing centers. As the beat goes on, the brain organizes itself, pruning unneeded connections and strengthening others. To investigate whether the beat is indeed important for this reorganization, the team used genetically engineered mice lacking a key receptor on the inner hair cells, which changes the rhythm of their bursts.

"In normal mice, the wiring diagram of the brain gets sharper and more efficient over time and they begin to hear," Dr. Kandler said. "But this doesn’t happen when the inner ear beats in a different rhythm, which means the brain isn’t getting the instructions it needs to wire itself correctly. We have evidence that these mice can detect sound, but they have problems perceiving the pitch of sounds."

In humans, such subtle hearing deficits are associated with central auditory processing disorder (CAPD), a difficulty processing the meaning of sound. About 2 to 3 percent of children are affected by CAPD, and these children often have speech and language disorders or delays, and learning disabilities such as dyslexia. In contrast to hearing impairments caused by defects of the ear itself, the causes underlying CAPD have remained obscure.

"Our findings suggest that an abnormal rhythm of electrical impulses early in life may be an important contributing factor in the development of CAPD. More research is needed to find out whether this also holds true for humans, but our results point to a new direction that is worth following up," Dr. Kandler said.

(Source: eurekalert.org)

Filed under nerve cells hair cells inner ear auditory cortex hearing neuroscience science

157 notes

Keeping to the beat is no mean feat: Scientists reveal how two tracks of music become one

How does a DJ mix two songs to make the beat seem common to both tracks? A successful DJ makes the transition between tracks appear seamless, while a bad mix is instantly noticeable and produces a ‘galloping horses’ effect that disrupts the dancing of the crowd. How accurate does beat mixing need to be to enhance, rather than disrupt, the perceived rhythm?

In a study published today (Wednesday 21 May 2014) in the journal Proceedings of the Royal Society B, scientists from the Universities of Birmingham and Cambridge present a new model that predicts whether or not two tracks will seem to share a common beat. This model also promises to help us understand how groups of people often start moving in synchrony, for example, football fans bouncing up and down at a stadium, or crowds falling into step when walking over a bridge.

‘We found that the time window in which two beat lines are heard as one isn’t fixed - it changes according to the statistical properties of each beat line, including how consistent or predictable they are,’ said Dr Mark Elliott, lead researcher on the study from the University of Birmingham’s School of Psychology. ‘For example, with two very consistent beat lines we only allow a very small time difference between them before we consider them to be separate. By analogy, given that DJs tend to play songs with a strong bass beat, they need to be very accurate in aligning the beats of the two songs if they are to be heard as one so as not to disrupt the flow of dancing. Our model and experiments reveal the timing properties of separate beat lines that determine whether they will be heard as one or two.’

Dr Elliott and his colleagues tested their model using a laboratory task in which people tapped their fingers in time with two similar beat lines played simultaneously, one defined by high-pitched tones, the other by low-pitched tones. The concurrency of the lines was varied so that the high and low tones were played close together in time or far apart. Furthermore, the separation between the high and low tones was either consistent or randomly varied across the experiment. The researchers determined when people changed from tapping along to a single beat formed from the two tones to targeting one of the tones while ignoring the other. They found that the time separation required for people to judge the tones as distinct beats varied with the consistency of the timings between them, and these judgments in turn influenced the timing of their movements.

Dr Elliott added, ‘People develop an expectation of when in time the next beat will occur. In defining the beat, they use the separation and consistency of the beat lines to determine whether the two tones should be combined together or whether just one tone should be attended to and the other ignored. Our model was able to predict the timing of participants’ movements based on the timing statistics of the tones we presented. Therefore, it not only allows us to calculate whether two beats will be heard as one, but also means we can predict the subtle effects the perception of an underlying rhythm can have on the movements people make to keep in synchrony with more complex beats.’
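One way to caricature that decision rule in code (a toy sketch, not the authors' model; the rule and the scaling constant are assumptions made here for illustration): let the tolerated asynchrony between the two beat lines scale with how inconsistent their timing is, so that very consistent lines fuse only when almost perfectly aligned.

```python
import statistics

def heard_as_one(high_onsets, low_onsets, k=2.0):
    """Toy integration rule: two beat lines fuse into one perceived beat
    when the mean high-low asynchrony is small relative to k standard
    deviations of that asynchrony. Very consistent lines therefore
    tolerate only a tiny offset before splitting into two beats."""
    gaps = [h - l for h, l in zip(high_onsets, low_onsets)]
    return abs(statistics.mean(gaps)) <= k * statistics.stdev(gaps)

low = [0.500, 1.000, 1.500, 2.000]            # low-tone onsets (seconds)
high_jittered = [0.504, 0.997, 1.506, 1.995]  # inconsistent 3-6 ms jitter
high_offset = [0.505, 1.005, 1.505, 2.005]    # rock-steady 5 ms lag

print(heard_as_one(high_jittered, low))  # jittery lines still fuse
print(heard_as_one(high_offset, low))    # a consistent offset splits them
```

This mirrors the DJ intuition from the article: two strongly regular beat lines demand near-perfect alignment, whereas looser timing leaves a wider window in which they are still heard as one.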

Dr Elliott is currently involved in a study, in collaboration with the University of Leeds, investigating the timing accuracy of movements in professional DJs compared to classical musicians and non-musicians. In addition, the findings of the current research are being applied to other areas: ‘We are currently investigating how spontaneous synchronisation of movements occurs within crowds. For example, in football stadiums the crowd sometimes starts to bounce up and down together. When the crowd moves together like this, it can create problems with structural vibration. Working with vibration engineers from the Universities of Sheffield and Exeter, we are applying our models to understand how such crowd dynamics might arise from the way each person adjusts their timing in relation to timing information from the people around them.’

(Source: birmingham.ac.uk)

Filed under synchronization music sensory integration neuroscience science

262 notes

Researchers examine how touch can trigger our emotions

While touch always involves awareness, it also sometimes involves emotion. For example, picking up a spoon triggers no real emotion, while feeling a gentle caress often does. Now, scientists writing in the Cell Press journal Neuron describe a system of slowly conducting nerves in the skin that respond to such gentle touch. Using a range of scientific techniques, investigators are beginning to characterize these nerves and to describe the fundamental role they play in our lives as a social species—from a nurturing touch to an infant to a reassuring pat on the back. Their work also suggests that this soft-touch wiring may go awry in disorders such as autism.

The nerves that respond to gentle touch, called c-tactile afferents (CTs), are similar to those that detect pain, but they serve an opposite function: they relay events that are neither threatening nor tissue-damaging but are instead rewarding and pleasant.

"The evolutionary significance of such a system for a social species is yet to be fully determined," says first author Francis McGlone, PhD, of Liverpool John Moores University in England. "But recent research is finding that people on the autistic spectrum do not process emotional touch normally, leading us to hypothesize that a failure of the CT system during neurodevelopment may impact adversely on the functioning of the social brain and the sense of self."

For some individuals with autism, the light touch of certain fabrics in clothing can cause distress. Temple Grandin, an activist and assistant professor of animal sciences at Colorado State University who has written extensively on her experiences as an individual with autism, has remarked that her lack of empathy in social situations may be partially due to a lack of “comforting tactual input.” Professor McGlone also notes that deficits in nurturing touch during early life could have negative effects on a range of behaviors and psychological states later in life.

Further research on CTs may help investigators develop therapies for autistic patients and individuals who lacked adequate nurturing touch as children. Also, a better understanding of how nerves that relay rewarding sensations interact with those that signal pain could provide insights into new treatments for certain types of pain.

Professor McGlone believes that possessing an emotional touch system in the skin is as important to well-being and survival as having a system of nerves that protect us from harm. “In a world where human touch is becoming more and more of a rarity with the ubiquitous increase in social media leading to non-touch-based communication, and the decreasing opportunity for infants to experience enough nurturing touch from a carer or parent due to the economic pressures of modern living, it is becoming more important to recognize just how vital emotional touch is to all humankind.”

Filed under touch emotions autism C-tactile fibers somatosensory cortex neuroscience science
