Posts tagged neuroscience

A Chimp’s Point Of View: Goggles simultaneously monitor a chimpanzee’s eyes and field of view
Chimps with camera goggles on their heads are helping scientists learn how the apes literally see the world.
From a scientific perspective, the eyes are windows to the mind. What people watch is one key sign of what they might be thinking, so monitoring their gazes can help researchers learn about what is going on inside people’s heads.
Scientists have conducted eye-tracking studies on people for more than 100 years. However, comparatively little such work has been done with other primates. Such work promises to shed light on humanity’s closest living relatives, and how they might perceive the world differently.
"If we know the differences between chimpanzees and humans, we will have an insight into how human perception has evolved," said comparative psychologist Fumihiro Kano at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.
Until recently, eye-tracking research involved desk-sized machines confined to labs. Investigators now have access to portable, wearable eye-trackers, enabling scientists to learn how people look at and interact with the world in a more natural way. This enables them to research topics such as how experts look at the world differently from novices. Now Kano and his colleagues are using these devices to study chimps.
"Everybody wants to see the world through chimpanzee eyes, right?" Kano said. "That’s one of my childhood dreams. How do chimpanzees, the closest relatives of humans, see the world?"
The researchers fitted a 27-year-old female chimpanzee named Pan with lightweight goggles carrying one camera that monitored her right eye and another aimed at her field of view, both of which sent data to a portable recorder. The mobile setup allowed the chimp to move and behave freely.
"We modified the eye-tracker goggle shape so that the chimpanzee could wear it and like it," Kano said. "If the chimpanzee felt uncomfortable wearing the goggles, she wouldn’t care about throwing it away!"
When Pan wore the eye-trackers, the scientists ran a two-minute gestural task with her, one she had practiced for several years. The researchers performed one of three gestures (touching their noses, touching their palms, or clapping their hands) and rewarded Pan with pieces of apple from a transparent box whenever she copied the gesture. The goggles also captured the greetings Pan often gave people before tasks, such as pant-grunting or swaying.
"No researcher has been successful in recording the natural gaze of chimpanzees before," Kano said.
The researchers found that Pan looked at the world differently depending on what she was doing. For instance, when greeting experimenters, the chimpanzee focused on their faces and feet (the latter presumably to see where they were going), but during the gestural task, she gazed at the experimenters’ faces and hands. In addition, while Pan mostly ignored the fruit reward before the gestural task, she looked at it 30 times more during the task. Kano said this focus on the fruit suggests that Pan was thinking ahead, anticipating the reward to come.
"This work builds toward an understanding not just of how chimpanzees learn about the world, but how they want to influence it," said neuroethologist Stephen Shepherd at Rockefeller University in New York, who did not take part in this research. "We can use gaze as a readout of what chimpanzees think is important to attend and affect."
Moreover, past research with desk-mounted eye-trackers had hinted that chimps do not look at familiar faces any longer than unfamiliar ones, but the new findings suggest otherwise: Pan looked at unfamiliar experimenters longer than familiar ones.
The researchers suspect one reason for the difference is that the previous studies used pictures of faces, shown for a shorter amount of time. In the new experiment, Pan also looked longer at familiar people when they were not in rooms where she was accustomed to seeing them.
The researchers plan on testing more chimpanzees with these wearable eye-trackers. They also want to compare the apes with people and other primates.
"It will be very interesting to see how humans, chimpanzees and other primates use gaze while performing the same real-world tasks," Shepherd said. "I would love to know if chimpanzees are intermediate between humans and monkeys, or if they’re just like humans."
In addition, future research will analyze how chimpanzees predict the actions of people and other chimpanzees. How the apes predict the actions of others in real time, “that is, within a fraction of a second, is largely unknown,” Kano said.
Kano and his colleague Masaki Tomonaga detailed their findings online March 27 in the journal PLOS ONE.
People often think that other people are staring at them even when they aren’t, vision scientists have found.
In a new article in Current Biology, researchers at The Vision Centre reveal that, when in doubt, the human brain is more likely to tell its owner that they’re under the gaze of another person.
“Gaze perception – the ability to tell what a person is looking at – is a social cue that people often take for granted,” says Professor Colin Clifford of The Vision Centre and The University of Sydney.
“Judging if others are looking at us may come naturally, but it’s actually not that simple – our brains have to do a lot of work behind the scenes.”
To tell if they’re under someone’s gaze, people look at the position of the other person’s eyes and the direction of their heads, Prof. Clifford explains. These visual cues are then sent to the brain where there are specific areas that compute this information.
However, the brain doesn’t just passively receive information from the eyes, Prof. Clifford says. The new study shows that when people have limited visual cues, such as in dark conditions or when the other person is wearing sunglasses, the brain takes over with what it ‘knows’.
In their study, the Vision Centre researchers created images of faces and asked people to judge where the faces were looking.
“We made it difficult for the observers to see where the eyes were pointed so they would have to rely on their prior knowledge to judge the faces’ direction of gaze,” Prof. Clifford explains. “It turns out that we’re hard-wired to believe that others are staring at us, especially when we’re uncertain.
“So gaze perception doesn’t only involve visual cues – our brains generate assumptions from our experiences and match them with what we see at a particular moment.”
There are several speculations as to why humans have this bias, Prof. Clifford says. “Direct gaze can signal dominance or a threat, and if you perceive something as a threat, you would not want to miss it. So assuming that the other person is looking at you may simply be a safer strategy.”
“Also, direct gaze is often a social cue that the other person wants to communicate with us, so it’s a signal for an upcoming interaction.”
There is also evidence that babies have a preference for direct gaze, which suggests that this bias is innate, Prof. Clifford says. “It’s important that we find out whether it’s innate or learned – and how this might affect people with certain mental conditions.
“Research has shown, for example, that people who have autism are less able to tell whether someone is looking at them. People with social anxiety, on the other hand, have a higher tendency to think that they are under the stare of others.
“So if it is a learned behaviour, we could help them practice this task – one possibility is letting them observe a lot of faces with different eyes and head directions, and giving them feedback on whether their observations are accurate.”
New learning and memory neurons uncovered
A University of Queensland study has identified precisely when new neurons become important for learning.
Lead researcher Dr Jana Vukovic from UQ’s Queensland Brain Institute (QBI) said the study highlighted the importance of new neuron development.
“New neurons are continually produced in the brain, passing through a number of developmental stages before becoming fully mature,” Dr Vukovic said.
“Using a genetic technique to delete immature neurons in animal models, we found they had great difficulty learning a new spatial task.
“There are ways to encourage the production of new neurons – including physical exercise – to improve learning.
“The new neurons appear particularly important for the brain to detect subtle but critical differences in the environment that can impact on the individual.”
The study, performed in QBI Director Professor Perry Bartlett’s laboratory, also demonstrates that immature neurons, born in a region of the brain known as the hippocampus, are required for learning but not for the retrieval of past memories.
“On the other hand, if the animals needed to remember a task they had already mastered in the past, before these immature neurons were deleted, their ability to perform the task was the same – so, they’ve remembered the task they learned earlier,” Dr Vukovic said.
This research allows for better understanding of the processes underlying learning and memory formation.
(Image Caption: Newly generated doublecortin-positive neurons in the dentate gyrus of a degenerating hippocampus in mutant mice lacking the transcription factor TIF-IA. Credit: Rosanna Parlato (AG Schütz, DKFZ-ZMBH Alliance))
Protein spheres in the nucleus give wrong signal for cell division

RUB researchers develop new hypothesis for the degeneration of nerve cells
A new hypothesis has been developed by researchers in Bochum on how Alzheimer’s disease could occur. They analysed the interaction of the proteins FE65 and BLM that regulate cell division. In the cell culture model, they discovered spherical structures in the nucleus that contained FE65 and BLM. The interaction of the proteins triggered a wrong signal for cell division. This may explain the degeneration and death of nerve cells in Alzheimer’s patients. The team led by Dr. Thorsten Müller and Prof. Dr. Katrin Marcus from the Department of Functional Proteomics in cooperation with the RUB’s Medical Proteome Centre headed by Prof. Helmut E. Meyer reported on the results in the “Journal of Cell Science”.
Components of spherical structures in the nucleus identified
The so-called amyloid precursor protein APP is central to Alzheimer’s disease. It spans the cell membrane, and its cleavage products are linked to protein deposits that form in Alzheimer patients outside the nerve cells. APP anchors the protein FE65 to the membrane, which was the focus of the current study. FE65 can migrate into the nucleus, where it plays a role in DNA replication and repair. Based on cells grown in the laboratory, the team led by Dr. Müller established that FE65 can unite with other proteins in the cell nucleus to form spherical structures, so-called “nuclear spheres”. Video microscopy showed that these ring-like structures merge with each other and can thus grow. “By using a special cell culture model, we were able to identify additional components of these spheres”, says Andreas Schrötter, PhD student in the working group Morbus Alzheimer at the Institute for Functional Proteomics. Among other things, the scientists found the protein BLM, which is known from Bloom’s syndrome – an extremely rare hereditary disease, which is associated with dwarfism, immunodeficiency, and an increased risk of cancer. BLM is involved in DNA replication and repair in the nucleus.
The amount of FE65 determines the amount of BLM in the cell nucleus
Müller’s team took a closer look at the function of FE65. Through genetic manipulation, the researchers generated cell cultures in which FE65 production was reduced. A smaller amount of FE65 led to a smaller amount of the protein BLM in the nucleus; instead, BLM collected in another area of the cell, the endoplasmic reticulum. In addition, the researchers found a lower rate of DNA replication in the genetically modified cells. In this way, FE65 influences the replication of the genetic material via the BLM protein. When the researchers ramped FE65 production back up, the amount of BLM in the nucleus also increased again.
FE65 as a possible trigger for Alzheimer’s
In patients with Alzheimer’s disease, the protein APP, an interaction partner of FE65, changes. The interaction of the two molecules is important for the transport of FE65 into the nucleus, where it regulates cell division in combination with BLM. Müller’s team assumes that the altered APP-FE65 interaction mistakenly sends the cells the signal to divide. Since nerve cells normally cannot divide, they degenerate instead and die. “This hypothesis, which we pursue in the working group Morbus Alzheimer, also delivers new starting points for potential therapies, which are urgently needed for Alzheimer’s disease,” says Dr. Müller. In the future, the team will also investigate whether and how the amount of BLM is altered in Alzheimer’s patients compared to healthy subjects.
(Source: alphagalileo.org)
We’ve all been there: You’re at work deeply immersed in a project when suddenly you start thinking about your weekend plans. It happens because behind the scenes, parts of your brain are battling for control.

Now, University of Florida researchers and their colleagues are using a new technique that allows them to examine how parts of the brain battle for dominance when a person tries to concentrate on a task. Addressing these fluctuations in attention may help scientists better understand many neurological disorders such as autism, depression and mild cognitive impairment.
Mingzhou Ding, a professor of biomedical engineering, and Xiaotong Wen, an assistant research scientist of biomedical engineering, both of the University of Florida; Yijun Liu of the McKnight Brain Institute of the University of Florida and Peking University, Beijing; and Li Yao of Beijing Normal University, report their findings in the current issue of The Journal of Neuroscience.
Scientists know different networks within the brain have distinct functions. Ding, Wen and their colleagues used a brain imaging technique called functional magnetic resonance imaging and biostatistical methods to examine interactions between a set of areas they call the task control network and another set of areas known as the default mode network.
The task control network regulates attention to surroundings, controlling concentration on a task such as doing homework, or listening for emotional cues during a conversation. The default mode network is thought to regulate self-reflection and emotion, and often becomes active when a person seems to be doing nothing else.
“We knew that the default mode network decreases in activity when a task is being performed, but we didn’t know why or how,” said Ding, of the J. Crayton Pruitt Family Department of Biomedical Engineering. “We also wanted to know what is driving that activity decrease.
“For a long time, the questions we are asking could not be answered.”
In the past, researchers could not distinguish between directions of interactions between regions of the brain, and could come up with only one number to represent an average of the back-and-forth interactions. Ding and his colleagues used a new technique to untangle the interactions in each direction to show how the different brain regions interact with one another.
In their study, the researchers used fMRI to examine the brains of people performing a task that required concentration. fMRI lets the scientists see activity in certain areas of the brain while a person performs a given task, observe which parts of the brain are active and which are not, and correlate this with how successful the person is at the task. They then applied the Granger causality technique to the fMRI data. Named for Nobel Prize-winning economist Clive Granger, this technique allows scientists to examine how one variable affects another; in this case, how one region of the brain influences another.
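The intuition behind Granger causality can be sketched in a few lines: a signal x “Granger-causes” a signal y if x’s past improves predictions of y beyond what y’s own past already provides. Here is a minimal illustration on simulated data (hypothetical toy signals, not the study’s fMRI time series; the coupling strengths are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
n, lag = 500, 2

# Simulate a driving signal x and a signal y that depends on x's recent past.
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(lag, n):
    y[t] = 0.6 * x[t - 1] + 0.3 * y[t - 1] + 0.1 * rng.standard_normal()

def rss(target, predictors):
    """Residual sum of squares of an ordinary least-squares fit."""
    X = np.column_stack(predictors + [np.ones(len(target))])
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    r = target - X @ beta
    return float(r @ r)

Y = y[lag:]
own = [y[lag - k:-k] for k in range(1, lag + 1)]           # y's own past
cross = own + [x[lag - k:-k] for k in range(1, lag + 1)]   # plus x's past

rss_restricted = rss(Y, own)    # y predicted from its own history only
rss_full = rss(Y, cross)        # ... plus x's history

# F-statistic for the improvement gained by adding x's past
df1, df2 = lag, len(Y) - 2 * lag - 1
F = ((rss_restricted - rss_full) / df1) / (rss_full / df2)
print(f"F = {F:.1f}")  # a large F suggests x Granger-causes y
```

Running the comparison in the other direction (does y’s past help predict x?) would yield a small F here, which is the directional asymmetry the researchers exploit to tell which brain network is driving which.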
“People have hypothesized different functions for signals going in different directions,” Ding said. “We show that when the task control network suppresses the default mode network, the person can do the task better and faster. The better the default mode network is shut down, the better a person performs.”
However, when the default mode network is not sufficiently suppressed, it sends signals to the task control network that effectively distract the person, causing his or her performance to drop. So while the task control network suppresses the default mode network, the default mode network also interferes with the task control network.
“Your brain is a constant seesaw back and forth,” even when trying to concentrate on a task, Ding said.
The Granger causality technique may help researchers learn more about how neurological disorders work. Researchers have found that the default mode network remains unchanged in people with autism whether they are performing a task or interacting with the environment, which could explain symptoms such as difficulty reading social cues or being easily overwhelmed by sensory stimulation. Scientists have made similar findings with depression and mild cognitive impairment. However, until now no one has been able to address what areas of the brain might be regulating the default mode network and which might be interfering with that regulation.
“Now we are able to address these questions,” Ding said.
(Source: news.ufl.edu)

Despite what you may think, your brain is a mathematical genius
The irony of getting away to a remote place is you usually have to fight traffic to get there. After hours of dodging dangerous drivers, you finally arrive at that quiet mountain retreat, stare at the gentle waters of a pristine lake, and congratulate your tired self on having “turned off your brain.”
"Actually, you’ve just given your brain a whole new challenge," says Thomas D. Albright, director of the Vision Center Laboratory at the Salk Institute and an expert on how the visual system works. "You may think you’re resting, but your brain is automatically assessing the spatio-temporal properties of this novel environment: what objects are in it, are they moving, and if so, how fast are they moving?"
The dilemma is that our brains can only dedicate so many neurons to this assessment, says Sergei Gepshtein, a staff scientist in Salk’s Vision Center Laboratory. “It’s a problem in economy of resources: If the visual system has limited resources, how can it use them most efficiently?”
Albright, Gepshtein and Luis A. Lesmes, a specialist in measuring human performance and a former Salk Institute postdoctoral researcher now at the Schepens Eye Research Institute, proposed an answer to the question in a recent issue of Proceedings of the National Academy of Sciences. It may reconcile the puzzling contradictions in many previous studies.
Previously, scientists expected that extended exposure to a novel environment would make you better at detecting its subtle details, such as the slow motion of waves on that lake. Yet those who tried to confirm that idea were surprised when their experiments produced contradictory results. “Sometimes people got better at detecting a stimulus, sometimes they got worse, sometimes there was no effect at all, and sometimes people got better, but not for the expected stimulus,” says Albright, holder of Salk’s Conrad T. Prebys Chair in Vision Research.
The answer, according to Gepshtein, came from asking a new question: What happens when you look at the problem of resource allocation from a system’s perspective?
It turns out something’s got to give.
"It’s as if the brain’s on a budget; if it devotes 70 percent here, then it can only devote 30 percent there," says Gepshtein. "When the adaptation happens, if now you’re attuned to high speeds, you’ll be able to see faster moving things that you couldn’t see before, but as a result of allocating resources to that stimulus, you lose sensitivity to other things, which may or may not be familiar."
Summing up, Albright says, “Simply put, it’s a tradeoff: The price of getting better at one thing is getting worse at another.”
Gepshtein, a computational neuroscientist, analyzes the brain from a theoretician’s point of view, and the PNAS paper details the computations the visual system uses to accomplish the adaptation. The computations are similar to the method of signal processing known as Gabor transform, which is used to extract features in both the spatial and temporal domains.
Yes, while you may struggle to balance your checkbook, it turns out your brain is using operations it took a Nobel Laureate to describe. Dennis Gabor won the 1971 Nobel Prize in Physics for his invention and development of holography. But that wasn’t his only accomplishment. Like his contemporary Claude Shannon, he worked on some of the most fundamental questions in communications theory, such as how a great deal of information can be compressed into narrow channels.
"Gabor proved that measurements of two fundamental properties of a signal, its location and its frequency content, are not independent of one another," says Gepshtein.
The location of a signal is simply that: where the signal is at a given point in time. The content, the “what” of a signal, is “written” in the language of frequencies and is a measurement of the amount of variation, such as the different shades of gray in a photograph.
The challenge comes when you’re trying to measure both location and frequency, because location is more accurately determined in a short time window, while variation needs a longer time window (imagine how much more accurately you can guess a song the longer it plays).
The obvious answer is that you’re stuck with a compromise: You can get a precise measurement of one or the other, but not both. But how can you be sure you’ve come up with the best possible compromise? Gabor’s answer was what’s become known as a “Gabor Filter” that helps obtain the most precise measurements possible for both qualities. Our brains employ a similar strategy, says Gepshtein.
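This compromise is quantitative: for any window, the product of its effective spread in time and in frequency cannot fall below 1/(4π), and a Gaussian (Gabor) window meets that bound exactly. A short numpy sketch (illustrative only, not taken from the PNAS paper) measures both spreads for Gaussian windows of different widths:

```python
import numpy as np

def widths(sigma, n=4096, dt=0.01):
    """Effective time- and frequency-domain spreads (std devs) of a Gaussian window."""
    t = (np.arange(n) - n / 2) * dt
    g = np.exp(-t**2 / (2 * sigma**2))
    # Time-domain spread, weighted by signal energy |g|^2
    p_t = g**2 / np.sum(g**2)
    dt_eff = np.sqrt(np.sum(p_t * t**2))
    # Frequency-domain spread of the window's Fourier transform
    G = np.abs(np.fft.fftshift(np.fft.fft(g)))
    f = np.fft.fftshift(np.fft.fftfreq(n, dt))
    p_f = G**2 / np.sum(G**2)
    df_eff = np.sqrt(np.sum(p_f * f**2))
    return dt_eff, df_eff

for sigma in (0.05, 0.2, 0.8):
    dt_eff, df_eff = widths(sigma)
    print(f"sigma={sigma}: time spread={dt_eff:.3f}, "
          f"freq spread={df_eff:.3f}, product={dt_eff * df_eff:.4f}")
```

Widening the window (larger sigma) tightens the frequency spread but loosens the time spread, and vice versa; the product stays pinned near the Gabor limit of 1/(4π) ≈ 0.0796. Allocating sensory resources, on this view, is choosing where to sit on that tradeoff.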
"In human vision, stimuli are first encoded by neural cells whose response characteristics, called receptive fields, have different sizes," he explains. "The neural cells that have larger receptive fields are sensitive to lower spatial frequencies than the cells that have smaller receptive fields. For this reason, the operations performed by biological vision can be described by a Gabor wavelet transform."
In essence, the first stages of the visual process act like a filter. “It describes which stimuli get in, and which do not,” Gepshtein says. “When you change the environment, the filter changes, so certain stimuli, which were invisible before, become visible, but because you moved the filter, other stimuli, which you may have detected before, no longer get in.”
"When you see only small parts of this filter, you find that visual sensitivity sometimes gets better and sometimes worse, creating an apparently paradoxical picture," Gepshtein continues. "But when you see the entire filter, you discover that the pieces, the gains and losses, add up to a coherent pattern."
From a psychological point of view, according to Albright, what makes this especially intriguing is that the assessing and adapting happen automatically: all of this processing happens whether or not you consciously ‘pay attention’ to the change in scene.
Yet, while the adaptation happens automatically, it does not appear to happen instantaneously. Their current experiments take approximately thirty minutes to conduct, but the scientists believe the adaptation may take less time in nature.
(Image: Gary Meader)

Restoring paretic hand function via an artificial neural connection bridging spinal cord injury
Functional loss of limb control in individuals with spinal cord injury or stroke can be caused by interruption of the neural pathways between brain and spinal cord, even though the neural circuits located above and below the lesion remain functional. An artificial neural connection that bridges the lost pathway and connects the brain to spinal circuits has the potential to ameliorate this functional loss. Yukio Nishimura, Associate Professor at the National Institute for Physiological Sciences, Japan, together with Eberhard Fetz, Professor, and Steve Perlmutter, Research Associate Professor, at the University of Washington, United States, investigated the effects of introducing a novel artificial neural connection that bridged a spinal cord lesion in a paretic monkey. This allowed the monkey to electrically stimulate the spinal cord through volitionally controlled brain activity and thereby to regain volitional control of the paretic hand. The study demonstrates that artificial neural connections can compensate for interrupted descending pathways and promote volitional control of upper limb movement after damage to neural pathways, such as in spinal cord injury or stroke. The study will be published online in Frontiers in Neural Circuits on April 11.
"The important point is that individuals who are paralyzed want to be able to move their own bodies by their own will. This study was different from what other research groups have done up to now; we didn’t use any prosthetic limbs like robotic arms to replace the original arm. What’s new is that we have been able to use this artificial neuronal connection bypassing the lesion site to restore volitional control of the subject’s own paretic arm. I think that for lesions of the corticospinal pathway this might even have a better chance of becoming a real prosthetic treatment rather than the sort of robotic devices that have been developed recently", Associate professor Nishimura said.
Scientists create phantom sensations in non-amputees
The sensation of having a physical body is not as self-evident as one might think. Almost everyone who has had an arm or leg amputated experiences a phantom limb: a vivid sensation that the missing limb is still present. A new study by neuroscientists at the Karolinska Institutet in Sweden shows that it is possible to evoke the illusion of having a phantom hand in non-amputated individuals.
In an article in the scientific periodical Journal of Cognitive Neuroscience, the researchers describe a perceptual illusion in which healthy volunteers experience having an invisible hand. The experiment involves the participant sitting at a table with their right arm hidden from their view behind a screen. To evoke the illusion, the scientist touches the right hand of the participant with a small paintbrush while imitating the exact movements with another paintbrush in mid-air within full view of the participant.
"We discovered that most participants, within less than a minute, transfer the sensation of touch to the region of empty space where they see the paintbrush move, and experience an invisible hand in that position," says Arvid Guterstam, lead author of the study. "Previous research has shown that non-bodily objects, such as a block of wood, cannot be experienced as one’s own hand, so we were extremely surprised to find that the brain can accept an invisible hand as part of the body."
The study comprised eleven experiments that explored the illusory experience in detail and included 234 volunteers. To demonstrate that the illusion actually worked, the researchers would make a stabbing motion with a knife towards the empty space ‘occupied’ by the invisible hand and measure the participant’s sweat response to the perceived threat. They found that the participants’ stress responses were elevated while they experienced the illusion but absent when the illusion was broken.
In another experiment, the volunteers were asked to close their eyes and quickly point with their left hand to their right hand (or to where they perceived it to be). After having experienced the illusion for a while, they would point to the location of the invisible hand rather than to their real hand.
The researchers also measured the brain activity of the participants using functional magnetic resonance imaging (fMRI). Perceiving the invisible hand illusion led to increased activity in the same parts of the brain that are normally active when individuals see their real hand being touched or when participants experience a prosthetic hand as their own.
"Taken together, our results show that the sight of a physical hand is remarkably unimportant to the brain for creating the experience of one’s physical self," says Arvid Guterstam.
The researchers hope that the results of their study will offer insight into future research on phantom pain in amputees.
"This illusion suggests that the experience of phantom limbs is not unique to amputated individuals, but can easily be created in non-amputees," says the principal investigator, Dr Henrik Ehrsson, Docent at the Department of Neuroscience. "These results add to our understanding of how phantom sensations are produced by the brain, which can contribute to future research on alleviating phantom pain in amputees."

Mutations found in individuals with autism interfere with endocannabinoid signaling in the brain
Mutations found in individuals with autism block the action of molecules made by the brain that act on the same receptors that marijuana’s active chemical acts on, according to new research reported online April 11 in the Cell Press journal Neuron. The findings implicate specific molecules, called endocannabinoids, in the development of some autism cases and point to potential treatment strategies.
"Endocannabinoids are molecules that are critical regulators of normal neuronal activity and are important for many brain functions," says first author Dr. Csaba Földy, of Stanford University Medical School. "By conducting studies in mice, we found that neuroligin-3, a protein that is mutated in some individuals with autism, is important for relaying endocannabinoid signals that tone down communication between neurons."
When the researchers introduced different autism-associated mutations in neuroligin-3 into mice, this signaling was blocked and the overall excitability of the brain was changed.
"These findings point out an unexpected link between a protein implicated in autism and a signaling system that previously had not been considered to be particularly important for autism," says senior author Dr. Thomas Südhof, also of Stanford. "Thus, the findings open up a new area of research and may suggest novel strategies for understanding the underlying causes of complex brain disorders."
The results also indicate that targeting components of the endocannabinoid signaling system may help reverse autism symptoms.
The study’s findings resulted from a research collaboration between the Stanford laboratories of Dr. Südhof and Dr. Robert Malenka, who is also an author on the paper.
Using a miniature electronic device implanted in the brain, scientists have tapped into the internal reward system of mice, prodding neurons to release dopamine, a chemical associated with pleasure.

The researchers, at Washington University School of Medicine in St. Louis and the University of Illinois at Urbana-Champaign, developed tiny devices, containing light emitting diodes (LEDs) the size of individual neurons. The devices activate brain cells with light. The scientists report their findings April 12 in the journal Science.
“This strategy should allow us to identify and map brain circuits involved in complex behaviors related to sleep, depression, addiction and anxiety,” says co-principal investigator Michael R. Bruchas, PhD, assistant professor of anesthesiology at Washington University. “Understanding which populations of neurons are involved in these complex behaviors may allow us to target specific brain cells that malfunction in depression, pain, addiction and other disorders.”
For the study, Washington University neuroscientists teamed with engineers at the University of Illinois to design microscale LED devices thinner than a human hair. This was the first application of the devices in optogenetics, an area of neuroscience that uses light to stimulate targeted pathways in the brain. The scientists implanted them into the brains of mice that had been genetically engineered so that some of their brain cells could be activated and controlled with light.
Although a number of important pathways in the brain can be studied with optogenetics, many neuroscientists have struggled with the engineering challenge of delivering light to precise locations deep in the brain. Most methods have tethered animals to lasers with fiber optic cables, limiting their movement and altering natural behaviors.
But with the new devices, the mice freely moved about and were able to explore a maze or scamper on a wheel. The electronic LEDs are housed in a tiny fiber implanted deep in the brain. That’s important to the device’s ability to activate the proper neurons, according to John A. Rogers, PhD, professor of materials science and engineering at the University of Illinois.
“You want to be able to deliver the light down into the depth of the brain,” Rogers says. “We think we’ve come up with some powerful strategies that involve ultra-miniaturized devices that can deliver light signals deep into the brain and into other organs in the future.”
Using light from the cellular-scale LEDs to stimulate dopamine-producing cells in the brain, the investigators taught the mice to poke their noses through a specific hole in a maze. Each nose poke triggered the system to wirelessly activate the LEDs in the implanted device, which then emitted light, causing neurons to release dopamine, a chemical central to the brain’s natural reward system.
“We used the LED devices to activate networks of brain cells that are influenced by the things you would find rewarding in life, like sex or chocolate,” says co-first author Jordan G. McCall, a neuroscience graduate student in Washington University’s Division of Biology and Biomedical Sciences. “When the brain cells were activated to release dopamine, the mice quickly learned to poke their noses through the hole even though they didn’t receive any food as a reward. They also developed an associated preference for the area near the hole, and they tended to hang around that part of the maze.”
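The experiment above is a closed loop: a behavioral event (a nose poke) is detected, a wireless command pulses the implanted LED, and the light-evoked dopamine release acts as the reward. Here is a minimal Python sketch of that control logic; all class and function names are hypothetical illustrations, not the actual rig's software, and the hardware and timing are simulated.

```python
# Hypothetical sketch of the closed-loop conditioning loop described above.
# Names (WirelessLED, run_session) are illustrative, not the real system's API.

class WirelessLED:
    """Simulated micro-LED implant: counts stimulation pulses it receives."""
    def __init__(self):
        self.pulses = 0

    def pulse(self, duration_ms=20):
        # In the real device, a wireless command drives the implanted LED,
        # optically activating light-sensitive dopamine neurons.
        self.pulses += 1


def run_session(poke_events, led):
    """Closed loop: every detected nose poke triggers one light pulse."""
    rewards = 0
    for event in poke_events:
        if event == "poke":
            led.pulse()  # light -> dopamine release -> internal reward
            rewards += 1
    return rewards


led = WirelessLED()
# A simulated session: the mouse pokes three times among other behaviors.
session = ["groom", "poke", "run", "poke", "poke"]
print(run_session(session, led))  # → 3 (one LED pulse per poke)
```

The key design point, as the article notes, is that the reward is delivered with no food at all: closing the loop on the neural signal itself is enough for the mice to learn the behavior.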
The researchers believe the LED implants may be useful in other types of neuroscience studies or may even be applied to different organs. Related devices already are being used to stimulate peripheral nerves for pain management. Other devices with LEDs of multiple colors may be able to activate and control several neural circuits at once. In addition to the tiny LEDs, the devices also carry miniaturized sensors for detecting temperature and electrical activity within the brain.
Bruchas and his colleagues already have begun other studies of mice, using the LED devices to manipulate neural circuits that are involved in social behaviors. This could help scientists better understand what goes on in the brain in disorders such as depression and anxiety.
“We believe these devices will allow us to study complex stress and social interaction behaviors,” Bruchas explains. “This technology enables us to map neural circuits with respect to things like stress and pain much more effectively.”
The wireless micro-LED implants represent the combined efforts of Bruchas and Rogers. Last year, along with Robert W. Gereau IV, PhD, professor of anesthesiology, they were awarded an NIH Director’s Transformative Research Project award to develop novel devices and conduct studies using optogenetics, which involves activating or inhibiting brain cells with light.
(Source: newswise.com)