Neuroscience

Articles and news from the latest research reports.

207 notes

New Mapping Approach Lets Scientists Zoom In And Out As The Brain Processes Sound
Researchers at Johns Hopkins have mapped the sound-processing part of the mouse brain in a way that keeps both the proverbial forest and the trees in view. Their imaging technique allows zooming in and out on views of brain activity within mice, and it enabled the team to watch brain cells light up as mice “called” to each other. The results, which represent a step toward better understanding how our own brains process language, appear online July 31 in the journal Neuron.
In the past, researchers often studied sound processing in various animal brains by poking tiny electrodes into the auditory cortex, the part of the brain that processes sound. They then played tones and observed the response of nearby neurons, laboriously repeating the process over a gridlike pattern to figure out where the active neurons were. The neurons seemed to be laid out in neatly organized bands, each responding to a different tone. More recently, a technique called two-photon microscopy has allowed researchers to zoom in on minute slices of the live mouse brain, observing activity in unprecedented detail. This newer approach has suggested that the well-manicured arrangement of bands might be an illusion. But, says David Yue, M.D., Ph.D., a professor of biomedical engineering and neuroscience at the Johns Hopkins University School of Medicine, “You could lose your way within the zoomed-in views afforded by two-photon microscopy and not know exactly where you are in the brain.” Yue led the study along with Eric Young, Ph.D., also a professor of biomedical engineering and a researcher in Johns Hopkins’ Institute for Basic Biomedical Sciences.
To get the bigger picture, John Issa, a graduate student in Yue’s lab, used a mouse genetically engineered to produce a molecule that glows green in the presence of calcium. Since calcium levels rise in neurons when they become active, neurons in the mouse’s auditory cortex glow green when activated by various sounds. Issa used a two-photon microscope to peer into the brains of live mice as they listened to sounds and saw which neurons lit up in response, piecing together a global map of a given mouse’s auditory cortex. “With these mice, we were able to both monitor the activity of individual populations of neurons and zoom out to see how those populations fit into a larger organizational picture,” he says.
With these advances, Issa and the rest of the research team were able to see the tidy tone bands identified in earlier electrode studies. In addition, the new imaging platform quickly revealed more sophisticated properties of the auditory cortex, particularly as mice listened to the chirps they use to communicate with each other. “Understanding how sound representation is organized in the brain is ultimately very important for better treating hearing deficits,” Yue says. “We hope that mouse experiments like this can provide a basis for figuring out how our own brains process language and, eventually, how to help people with cochlear implants and similar interventions hear better.”
Yue notes that the same approach could also be used to understand other parts of the brain as they react to outside stimuli, such as the visual cortex and the parts of the brain responsible for processing stimuli from limbs.

Filed under: sound processing, brain activity, auditory cortex, hearing, neuroscience, science
