Neuroscience

Articles and news from the latest research reports.

This Robotic Mouse Was Designed to Stress Out Real Mice 
Lab rats have a new companion, but it’s not friendly. Researchers at Waseda University in Tokyo, Japan, have developed a robotic rat called WR-3 whose job is to induce stress and depression in lab animals, creating models of psychological conditions on which new drugs can be tested.
Animals are used throughout medicine as models to test treatments for human conditions, including mental disorders like depression. Rats and mice have the nerves serving their sense of smell severed to induce something like depression, or are forced to swim for long periods, for instance. Other methods rely on genetic modification and environmental stress, but none is entirely satisfactory in recreating a human-like version of depression for treatment. Hiroyuki Ishii and his team aim to do better with WR-3.
The researchers tested WR-3’s ability to depress two groups of 12 rats, measured by the somewhat crude assumption that a depressed rat moves around less. Rats in group A were constantly harassed by their robot counterpart, while rats in group B were attacked automatically by WR-3 but only intermittently, whenever they moved. Ishii’s team found that the deepest depression was triggered by intermittent attacks on a mature rat that had been constantly harassed in its youth.
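The activity proxy described above is simple enough to sketch in a few lines. In the minimal Python sketch below, the group sizes match the study's two groups of 12, but all movement counts are invented purely to illustrate the comparison:

```python
# Toy sketch of the activity-based depression proxy: a "depressed" rat is
# assumed to move less. All movement counts below are hypothetical.

def mean_activity(activity_counts):
    """Average movement count per rat; lower is read as 'more depressed'."""
    return sum(activity_counts) / len(activity_counts)

# Hypothetical movement counts for two groups of 12 rats
group_a = [42, 38, 35, 40, 37, 41, 36, 39, 43, 38, 40, 37]  # constant harassment
group_b = [25, 22, 28, 24, 26, 23, 27, 25, 24, 26, 22, 28]  # intermittent attacks

more_depressed = "B" if mean_activity(group_b) < mean_activity(group_a) else "A"
print(f"Group {more_depressed} moves less under this crude proxy")
```

Under this proxy the result is just a difference in group means; the real study, of course, controlled attack timing and life stage far more carefully.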
The team say they plan to test their new model of depression against more conventional systems, like forced swimming.
The robot has been developed just as new research by Junhee Seok of Stanford University in Palo Alto, California, and colleagues shows that the use of mouse models for human conditions has led researchers seeking treatments for sepsis, burns and trauma astray, at a cost of billions of tax dollars.

Filed under robots robotics robotic mouse depression WR-3 animal models neuroscience science

Roots of language in human and bird biology
The genes activated for human speech are similar to the ones used by singing songbirds, new experiments suggest.
These results, which are not yet published, show that gene products produced for speech in the cortical and basal ganglia regions of the human brain correspond to similar molecules in the vocal communication areas of the brains of zebra finches and budgerigars. But these molecules aren’t found in the brains of doves and quails — vocal birds that do not learn their sounds.
"The results suggest that similar behavior and neural connectivity for a convergent complex trait like speech and song are associated with many similar genetic changes," said Duke neurobiologist Erich Jarvis, a Howard Hughes Medical Institute investigator.
Jarvis studies the molecular pathways that songbirds use while learning to sing. In past experiments, he and his collaborators found that songbirds have a direct connection between the front part of the brain and neurons in the brainstem that control the muscles used to produce song. They have seen this circuit in a more primitive form, related to ultrasonic mating calls, in mice. Humans also have this motor learning pathway for speech.
From this and other work, Jarvis developed the motor theory for the origin of vocal learning, which describes how ancient brain systems used to control movement and motor learning evolved into brain systems for learning and producing song and spoken language.
Gustavo Arriaga, Eric P. Zhou, Erich D. Jarvis. Of Mice, Birds, and Men: The Mouse Ultrasonic Song System Has Some Features Similar to Humans and Song-Learning Birds. PLoS ONE
Gustavo Arriaga, Erich D. Jarvis. Mouse vocal communication system: Are ultrasounds learned or innate? Brain and Language
(Image: iStock)

Filed under language language production speech vocalizations songbirds vocal learning neuroscience science

Bilingual babies know their grammar by 7 months
Babies as young as seven months can distinguish between, and begin to learn, two languages with vastly different grammatical structures, according to new research from the University of British Columbia and Université Paris Descartes.
Published today in the journal Nature Communications and presented at the 2013 Annual Meeting of the American Association for the Advancement of Science (AAAS) in Boston, the study shows that infants in bilingual environments use pitch and duration cues to discriminate between languages – such as English and Japanese – with opposite word orders.
In English, a function word comes before a content word (the dog, his hat, with friends, for example) and the content word is longer in duration, while in Japanese or Hindi the order is reversed and the content word is higher in pitch.
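The cue pattern described in that paragraph can be written down as a toy rule. The sketch below is mine, not the authors': it tags a hypothetical two-word phrase by comparing word durations and pitches, with all numbers invented for illustration.

```python
# Toy discriminator using the prosodic cues described above:
# English-like phrases put the (shorter) function word first, so the second
# word is longer; Japanese/Hindi-like phrases reverse the order and give the
# first (content) word higher pitch.

def guess_word_order(words):
    """words: two (duration_ms, pitch_hz) pairs for a two-word phrase."""
    (d1, p1), (d2, p2) = words
    if d2 > d1:
        return "function-first"   # English-like: "the DOG"
    if d1 > d2 and p1 > p2:
        return "content-first"    # Japanese/Hindi-like
    return "ambiguous"

print(guess_word_order([(80, 180), (220, 170)]))   # English-like phrase
print(guess_word_order([(220, 210), (80, 170)]))   # Japanese-like phrase
```

The study's claim is that seven-month-olds in bilingual homes exploit roughly these duration and pitch cues to keep the two grammars apart.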
"By as early as seven months, babies are sensitive to these differences and use these as cues to tell the languages apart," says UBC psychologist Janet Werker, co-author of the study.
Previous research by Werker and Judit Gervain, a linguist at the Université Paris Descartes and co-author of the new study, showed that babies use frequency of words in speech to discern their significance.
"For example, in English the words ‘the’ and ‘with’ come up a lot more frequently than other words – they’re essentially learning by counting," says Gervain. "But babies growing up bilingual need more than that, so they develop new strategies that monolingual babies don’t necessarily need to use."
"If you speak two languages at home, don’t be afraid, it’s not a zero-sum game," says Werker. "Your baby is very equipped to keep these languages separate and they do so in remarkable ways."

Filed under infants bilingual language language acquisition prosodic cues psychology neuroscience science

Why ‘Good Hair’ Matters: The first animal model of recent human evolution reveals that a mutation for thick hair does much more
The first animal model of recent human evolution reveals that a single mutation produced several traits common in East Asian peoples, from thicker hair to denser sweat glands, an international team of researchers reports.
The team, led by researchers from Harvard Medical School, Harvard University, the Broad Institute of MIT and Harvard, Massachusetts General Hospital, Fudan University and University College London, also modeled the spread of the gene mutation across Asia and North America, concluding that it most likely arose about 30,000 years ago in what is today central China. The findings are reported in the cover story of the Feb. 14 issue of Cell.
“This interdisciplinary approach yields unique insight into the generation of adaptive variation among modern humans,” said Pardis Sabeti, associate professor in the Center for Systems Biology and Department of Organismic and Evolutionary Biology at Harvard University, and one of the paper’s senior authors. Sabeti is also a senior associate member at the Broad Institute.
“This paper tells a story about human evolution in three parts,” said Cliff Tabin, head of the HMS Department of Genetics and co-senior author. “The mouse model links multiple traits to a single mutation, the related association study finds these traits in humans, and computer models tell us where and when the mutation likely arose and spread.”

Filed under evolution human evolution gene mutations genetics animal model science

Retinal implant wins FDA approval
The U.S. Food and Drug Administration (FDA) approved the Argus II retinal prosthesis system for use in the United States.
Mark Humayun, who holds joint appointments at the Keck School of Medicine of USC and the USC Viterbi School of Engineering, was a key member of the team that developed the device, which will be available to qualified patients at the Keck Medical Center of USC.
The Argus II, which received a unanimous recommendation for approval by the FDA’s Ophthalmic Devices Advisory Panel in September, restores some visual capabilities for patients whose blindness is caused by Retinitis Pigmentosa (RP), an inherited retinal degenerative disease that affects about 100,000 people nationwide.
“It is incredibly exciting to have FDA approval to begin implanting the Argus II and provide some restoration of vision to patients blinded from RP,” said Humayun, Cornelius Pings Professor of Biomedical Sciences and professor of ophthalmology, biomedical engineering, cell and neurobiology at USC. “In the patients that have been implanted to date, the improvement in the quality of life has been invaluable.
“The fact that many patients can use the Argus implant in their activities of daily living, such as recognizing large letters, locating the position of objects and more, has been beyond our wildest dreams,” Humayun added, “yet the promise to the patients is real, and we expect it only to improve over time.”
The Argus II, which is manufactured by Sylmar, Calif.-based Second Sight, was approved for use in Europe in 2011 and has been implanted in 30 patients in a clinical trial that began in 2007. Humayun performed many of the surgeries to implant the device.
The FDA approval paves the way for Second Sight to build a surgical network in the United States to implant the device, as well as to recruit hospitals to offer it, according to Robert Greenberg, president and CEO of the company.
The Argus II system uses a camera mounted on special glasses that sends a signal to an electronic receiver with 60 electrodes implanted inside the eye.
The receiver sends signals to the retina that travel through the optic nerve to the brain, where they can be interpreted as a visual picture. The researchers hope that one day the device can be improved to also help individuals with age-related macular degeneration, a similar but far more common disease.
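As a rough sketch of that signal path, a camera frame can be reduced to 60 stimulation levels by block-averaging. The 6×10 electrode layout and the averaging scheme here are assumptions for illustration, not Second Sight's actual image processing:

```python
import numpy as np

def frame_to_electrodes(frame, rows=6, cols=10):
    """Downsample a grayscale frame to one intensity per electrode
    by averaging rectangular blocks of pixels (rows * cols = 60)."""
    h, w = frame.shape
    trimmed = frame[:h - h % rows, :w - w % cols]   # drop ragged edges
    blocks = trimmed.reshape(rows, trimmed.shape[0] // rows,
                             cols, trimmed.shape[1] // cols)
    return blocks.mean(axis=(1, 3))                 # shape (rows, cols)

frame = np.random.rand(120, 160)     # hypothetical grayscale camera frame
levels = frame_to_electrodes(frame)
print(levels.shape, levels.size)     # (6, 10) 60
```

Each of the 60 values would then set one electrode's stimulation strength; the real device's mapping from image to current is considerably more sophisticated.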
Public inquiries regarding the Argus II can be directed to the Second Sight public information line at (855) 756-3703.
As the Argus II retinal implant is refined, it will be housed in the USC Institute of Biomedical Therapeutics. The new $60 million endowed interdisciplinary institute will bring together scientists, engineers and clinicians from around the world to study neural networks to develop bioelectronic solutions for the millions of people impacted by traumatic brain injury, stroke and debilitating eye diseases.

Filed under eye disease retinitis pigmentosa Argus II bionic eye retina implants neuroscience science

Vision restored with total darkness
Restoring vision might sometimes be as simple as turning out the lights. That’s according to a study reported on February 14 in Current Biology, a Cell Press publication, in which researchers examined kittens with a visual impairment known as amblyopia before and after they spent 10 days in complete darkness.
Researchers Kevin Duffy and Donald Mitchell of Dalhousie University in Canada believe that exposure to darkness causes some parts of the visual system to revert to an early stage in development, when there is greater flexibility.
"There may be ways to increase brain plasticity and recover from disorders such as amblyopia without drug intervention," Duffy says. "Immersion in total darkness seems to reset the visual brain to enable remarkable recovery."
Amblyopia affects about four percent of the general population and is thought to develop when the two eyes do not see equally well in early life, as the connections from the eyes to visual areas in the brain are still being refined. Left untreated, that imbalance of vision can lead to permanent vision loss.
In the new study, the researchers examined kittens with amblyopia induced by experimentally depriving them of visual input to one eye. After those animals were plunged into darkness, their vision made a profound and rapid recovery. Further examination suggested that the restoration of vision depends on the loss of neurofilaments that hold the visual system in place. With those stabilizing elements gone, the visual system becomes free to correct itself.
Darkness therapy holds promise for the treatment of children with amblyopia, the researchers say, but don’t try this at home. They think that the darkness must be absolute to work, with no stray light at any time. It is also important to address the original cause of the amblyopia first, and to ensure that a period of darkness will not harm an individual’s good eye.
The researchers are still working out just how much darkness is required, and for how long. Regardless, they say it is unlikely that a drug could ever adequately mimic the effects of darkness that they’ve seen.
"The advantage of a simple nonpharmacological sensory manipulation, such as a period of darkness, is that it may initiate changes in a constellation of molecules in a beneficial temporal order and in appropriate brain regions," they write.

Filed under vision amblyopia brain plasticity vision loss kittens neurofilaments neuroscience science

The good side of the prion: A molecule that is not only dangerous, but can help the brain grow
A few years ago it was found that certain proteins, the prions, are dangerous when defective, as they are involved in neurodegenerative syndromes such as Creutzfeldt-Jakob disease and Alzheimer’s disease. But research is now showing their good side, too: when functioning normally, prions may be crucial to the development of the brain during childhood, according to a study carried out by a team of neuroscientists at Trieste’s SISSA, which appeared yesterday in the Journal of Neuroscience.
Doctor Jekyll and Mr. Hyde: the metaphor of the good man who hides an evil side suits the prion (PrPC in its physiological, cellular form), a protein abundant in our brain. Unlike Doctor Jekyll, the prion first drew attention for its disturbing properties: when the molecule folds abnormally, it plays a crucial role in neurodegenerative processes that lead to dreadful syndromes such as mad cow disease.
In their normal form, however, prions abound at synapses, the contact points where the nervous signal passes from one neuron to the next. The protein is relatively abundant in the brains of very young children, which is why scientists have assumed it may play a role in nervous system development, and in particular in neurogenesis, in the formation of new synaptic connections and in plasticity.
More in detail
Maddalena Caiati, Victoria Safiulina, Sudhir Sivakumaran, Giuseppe Legname and Enrico Cherubini, all researchers at SISSA, together with Giorgia Fattorini of the Università Politecnica delle Marche, have verified at the molecular level the effects of PrPC on cell plasticity in the hippocampus, a brain structure with important functions related to memory. Caiati and her colleagues demonstrated that PrPC controls synaptic plasticity (the growth capacity of nervous tissue) through a transduction pathway that also involves another protein, the enzyme protein kinase A (PKA). The recently published research is only a starting point: in the future, it will be interesting to look more closely at the role the prion protein plays in the development of neuronal circuits, both under physiological conditions and in the pathological conditions of neurodegenerative diseases.

Filed under neurodegenerative diseases mad cow disease prions protein nervous tissue protein kinase neuroscience science

Love of musical harmony is not nature but nurture

Our love of music and appreciation of musical harmony is learnt and not based on natural ability – a new study by University of Melbourne researchers has found.

Associate Professor Neil McLachlan from the Melbourne School of Psychological Sciences said previous theories about how we appreciate music were based on the physical properties of sound, the ear itself and an innate ability to hear harmony.

“Our study shows that musical harmony can be learnt and it is a matter of training the brain to hear the sounds,” Associate Professor McLachlan said. “So if you thought that the music of some exotic culture (or jazz) sounded like the wailing of cats, it’s simply because you haven’t learnt to listen by their rules.”

The researchers used 66 volunteers with a range of musical training and tested their ability to hear combinations of notes to determine if they found the combinations familiar or pleasing.

“What we found was that people needed to be familiar with sounds created by combinations of notes before they could hear the individual notes. If they couldn’t find the notes they found the sound dissonant or unpleasant,” he said. “This finding overturns centuries of theories that physical properties of the ear determine what we find appealing.”
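Physically, the “combinations of notes” at issue are just sums of sinusoids. A small sketch (equal-tempered frequencies; the analysis choices are mine, not the study’s) builds a C-major triad from three sine tones and recovers the component pitches from its spectrum:

```python
import numpy as np

sr = 8000                               # sample rate in Hz
t = np.arange(sr) / sr                  # one second of samples
freqs = [261.63, 329.63, 392.00]        # C4, E4, G4 (equal temperament)
chord = sum(np.sin(2 * np.pi * f * t) for f in freqs)

# The ear receives only `chord`; "finding the notes" means finding its peaks.
spectrum = np.abs(np.fft.rfft(chord))
bins = np.fft.rfftfreq(chord.size, 1 / sr)
top3 = bins[np.argsort(spectrum)[-3:]]  # three strongest frequency bins
print(sorted(int(round(f)) for f in top3))  # [262, 330, 392]
```

The component tones are all present in the waveform; whether a listener can pick them out, the study suggests, depends on training rather than on the ear alone.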

Co-author Associate Professor Sarah Wilson, also from the Melbourne School of Psychological Sciences, said the study found that trained musicians were much more sensitive to dissonance than non-musicians.

“When they couldn’t find the note, the musicians reported that the sounds were unpleasant, whereas non-musicians were much less sensitive,” Assoc. Prof Wilson said. “This highlights the importance of training the brain to like particular variations of combinations of sounds like those found in jazz or rock.”

Depending on their training, a strange chord or a gong sound was accurately pitched and pleasant to some musicians, but impossible to pitch and very unpleasant to others. “This showed us that even the ability to hear a musical pitch (or note) is learnt,” Assoc. Prof Wilson said.

To confirm this finding they trained 19 non-musicians to find the pitches of a random selection of western chords. Not only did the participants’ ability to hear notes improve rapidly over ten short sessions, afterward they reported that the chords they had learnt sounded more pleasant – regardless of how the chords were tuned.

The question of why some combinations of musical notes are heard as pleasant or unpleasant has long been debated. “We have shown in this study that for music, beauty is in the brain of the beholder,” Assoc. Prof McLachlan said. The study was published in the Journal of Experimental Psychology: General.

Love of musical harmony is not nature but nurture

Our love of music and appreciation of musical harmony are learnt and not based on natural ability – a new study by University of Melbourne researchers has found.

Associate Professor Neil McLachlan from the Melbourne School of Psychological Sciences said previous theories about how we appreciate music were based on the physical properties of sound, the ear itself and an innate ability to hear harmony.

“Our study shows that musical harmony can be learnt and it is a matter of training the brain to hear the sounds,” Associate Professor McLachlan said. “So if you thought that the music of some exotic culture (or jazz) sounded like the wailing of cats, it’s simply because you haven’t learnt to listen by their rules.”

The researchers used 66 volunteers with a range of musical training and tested their ability to hear combinations of notes to determine if they found the combinations familiar or pleasing.

“What we found was that people needed to be familiar with sounds created by combinations of notes before they could hear the individual notes. If they couldn’t find the notes, they found the sound dissonant or unpleasant,” he said. “This finding overturns centuries of theories that the physical properties of the ear determine what we find appealing.”

Study co-author Associate Professor Sarah Wilson, also from the Melbourne School of Psychological Sciences, said the study found that trained musicians were much more sensitive to dissonance than non-musicians.

“When they couldn’t find the note, the musicians reported that the sounds were unpleasant, whereas non-musicians were much less sensitive,” Assoc. Prof Wilson said. “This highlights the importance of training the brain to like particular combinations of sounds, like those found in jazz or rock.”

Depending on their training, a strange chord or a gong sound was accurately pitched and pleasant to some musicians, but impossible to pitch and very unpleasant to others. “This showed us that even the ability to hear a musical pitch (or note) is learnt,” Assoc. Prof Wilson said.

To confirm this finding, they trained 19 non-musicians to find the pitches of a random selection of Western chords. The participants’ ability to hear notes improved rapidly over ten short sessions; what is more, they afterwards reported that the chords they had learnt sounded more pleasant, regardless of how the chords were tuned.

The question of why some combinations of musical notes are heard as pleasant or unpleasant has long been debated. “We have shown in this study that for music, beauty is in the brain of the beholder,” Assoc. Prof McLachlan said. The study was published in the Journal of Experimental Psychology: General.


Newborn babies walk the walk

Before you can run, you have to walk, and before you can walk well, you have to walk like a brand-new baby. A new study uncovers the logistics of newborns’ herky-jerky, Frankensteinian stepping action and how this early reflex morphs into refined adult locomotion.

In the study, electrodes on infants’ chubby legs picked up signals from neurons that tell muscles to fire, revealing that three-day-old babies tense up many of their leg muscles all at once. Toddlers, preschoolers and adults, by contrast, showed a progressively more sophisticated, selective pattern of neuron activity.

From birth to adulthood, motor neurons in the spine get an overhaul as neurons in different locations along the spine become specialized for various aspects of walking, such as foot position, balance and direction, Yuri Ivanenko of the Santa Lucia Foundation in Rome and colleagues conclude in the Feb. 13 Journal of Neuroscience.



UCSB Study of Cocaine Addiction Reveals Targets for Treatment
Scientists at UC Santa Barbara are researching cocaine addiction, a widespread problem that, along with other addictions, costs billions of dollars in damage to individuals, families, and society. Laboratory studies at UCSB have revealed that the diminished brain function and learning impairment that result from cocaine addiction can be treated, and that learning can be restored.
Karen Szumlinski, a professor in the Department of Psychological & Brain Sciences at UCSB, and her colleagues Osnat Ben-Shahar and Tod Kippin have worked in the field of addiction for many years. Senior author of a paper on the topic published recently in The Journal of Neuroscience, Szumlinski is particularly interested in the prefrontal cortex, the part of the brain where “executive function,” or decision-making, is seated. This area is involved in directing one’s behavior appropriately and in controlling it.
With her research team, Szumlinski discovered that a drug that stimulates a certain type of glutamate receptor, when delivered to the prefrontal cortex, could reverse the learning impairment in rats with simulated cocaine addiction.
“Needless to say, this (the prefrontal cortex) is one of the last parts of the brain to develop, and, of relevance to our students, continues to develop through about age 25 to 28,” said Szumlinski.
Szumlinski explained that in the prefrontal cortex there seems to be “hypo-frontality,” or reduced functioning, in drug addicts, as well as in patients with a range of neuropsychiatric diseases, including schizophrenia, depression, and attention deficit disorder.
Szumlinski calls the prefrontal cortex a late-developing brain area that is critical for making proper decisions and inhibiting behavior. “You damage this brain region and you lose the ability to self-regulate, you make impulsive decisions like engaging in risky sexual behavior or drug-taking, you basically go off the deep end in terms of function,” she said. “So we were very much interested in how drugs of abuse impact the prefrontal cortex, given that human drug addicts show deficits in this brain area when you put them into a scanner. They show hypo-activity.” She said this hypo-activity, or hypo-frontality, might relate to a neurotransmitter that scientists know is involved in exciting the brain.
A key question, according to Szumlinski, is this: “Was that hypo-frontality there in the first place, and that’s why they became an addict; or did the drugs change their prefrontal cortex, to cause it to become hypo-functioning and thus they’re not able to control their drug use? You can’t parse that out in humans. So that’s why we turn then to animal models of the disorder, and we do have this rat model that we use in the paper.”
Szumlinski pointed out a key difficulty in the development of treatments for addiction: There is little money targeted to the study of this disease. Hence, in addition to studying the brain mechanisms that are involved, she is joining forces with researchers who study other neurological diseases that are well-funded, to help find cures. She hopes that government approval of new drugs for these other diseases would eventually make the drugs available for clinical trials to study their effects on cocaine addiction.
(Image: iStock)


