Neuroscience

Articles and news from the latest research reports.


Research Provides Clues to Alcohol Addiction Vulnerability
A Wake Forest Baptist Medical Center team studying alcohol addiction has new research that might shed light on why some drinkers are more susceptible to addiction than others.
Jeff Weiner, Ph.D., professor of physiology and pharmacology at Wake Forest Baptist, and colleagues used an animal model to look at the early stages of the addiction process and focused on how individual animals responded to alcohol. Their findings may lead not only to a better understanding of addiction, but to the development of better drugs to treat the disease as well, Weiner said.
"We know that some people are much more vulnerable to alcoholism than others, just like some people have a vulnerability to cancer or heart disease," Weiner said. "We don’t have a good understanding of what causes this vulnerability, and that’s a big question. But if we can figure it out, we may be able to better identify people at risk, as well as gain important clues to help develop better drugs to treat the disease."
The findings are published in the March 13 issue of the Journal of Neuroscience. Weiner, who directs the Translational Studies on Early-Life Stress and Vulnerability to Alcohol Addiction project at Wake Forest Baptist, said the study protocol was developed by the first author of the paper, Karina Abrahao, a graduate student visiting from the collaborating lab of Souza-Formigoni, Ph.D., of the Department of Psychobiology at the Federal University of Sao Paulo, Brazil.
Weiner said the study model focused on how individual animals responded to alcohol. Typically, when a drug like alcohol is given to a mouse every day, the animal's locomotor response increases: it becomes more stimulated and runs around more. “In high doses, alcohol is a depressant, but in low doses, it can have a mellowing effect that results in greater activity,” he said. “Those low-dose effects tend to increase over time, and this increase in activity in response to repeated alcohol exposure is called locomotor sensitization.”
Prior studies with other drugs, such as cocaine and amphetamine, have suggested that animals that show the greatest increases in locomotor sensitization are also the animals most likely to seek out or consume these drugs. However, the relationship between locomotor sensitization and vulnerability to high levels of alcohol drinking is not as well established, Weiner said.
Usually when researchers are studying a drug, they give it to one test group while the other group gets a control solution, and then they look for behavioral differences between the two, Weiner said. But in this study, the researchers focused on individual differences in how each animal responded to the alcohol. A control group received a saline injection while another was injected with the same amount of alcohol every day for three weeks. Weiner said they used mice bred to be genetically variable like humans to make the research more relevant.
"We found large variations in the development of locomotor sensitization to alcohol in these mice, with some showing robust sensitization and others showing no more of a change in locomotor activity than control mice given daily saline injections," Weiner said. "Surprisingly, when all of the alcohol-exposed mice were given an opportunity to voluntarily drink alcohol, those that had developed sensitization drank more than those that did not. In fact, the alcohol-treated mice that failed to develop sensitization drank no more alcohol than the saline-treated control group."
The authors also conducted a series of neurobiological studies and discovered that mice that showed robust locomotor sensitization had deficits in a form of brain neuroplasticity - how experiences reorganize neural pathways in the brain - that has been linked with cocaine addiction in other animal models.
"We found that this loss of the ability of brain cells to change the way that they communicate with each other only occurred in the animals that showed the behavioral response to alcohol," he said. "What this suggests for the first time in the alcohol addiction field is that this particular deficit may represent an important brain correlate of vulnerability to alcoholism. It’s a testable hypothesis. That’s why I think it’s an important finding."

Filed under alcohol addiction alcohol animal model drug development neuroscience science

Hunger-spiking neurons could help control autoimmune diseases
Neurons that control hunger in the central nervous system also regulate immune cell functions, implicating eating behavior as a defense against infections and autoimmune disease development, Yale School of Medicine researchers have found in a new study published in the Proceedings of the National Academy of Sciences (PNAS).
Autoimmune diseases have been on a steady rise in the United States. These illnesses develop when the body’s immune system turns on itself and begins attacking its own tissues. The interactions between different kinds of T cells are at the heart of fighting infections, but they have also been linked to autoimmune disorders.
“We’ve found that if appetite-promoting AgRP neurons are chronically suppressed, leading to decreased appetite and a leaner body weight, T cells are more likely to promote inflammation-like processes enabling autoimmune responses that could lead to diseases like multiple sclerosis,” said lead author Tamas Horvath, the Jean and David W. Wallace Professor of Biomedical Research and chair of comparative medicine at Yale School of Medicine.
“If we can control this mechanism by adjusting eating behavior and the kinds of food consumed, it could lead to new avenues for treating autoimmune diseases,” he added.
Horvath and his research team conducted their study in two sets of transgenic mice. In one set, they knocked out Sirt1, a signaling molecule that controls the activity of hunger-promoting AgRP neurons in the hypothalamus. These Sirt1-deficient mice had decreased regulatory T cell function and enhanced effector T cell activity, making them more vulnerable in an animal model of multiple sclerosis.
“This study highlights the important regulatory role of the neurons that control appetite in peripheral immune functions,” said Horvath. “AgRP neurons represent an important site of action for the body’s immune responses.”
The team’s data support the idea that achieving weight loss through the use of drugs that promote a feeling of fullness “could have unwanted effects on the spread of autoimmune disorders,” he notes.

Filed under hunger neurons autoimmune diseases immune system eating behavior neuroscience science

Brain scans predict which criminals are more likely to reoffend
In a twist that evokes the dystopian science fiction of writer Philip K. Dick, neuroscientists have found a way to predict whether convicted felons are likely to commit crimes again by looking at their brain scans. Convicts showing low activity in a brain region associated with decision-making and action are more likely to be arrested again, and sooner.
Kent Kiehl, a neuroscientist at the non-profit Mind Research Network in Albuquerque, New Mexico, and his collaborators studied a group of 96 male prisoners just before their release. The researchers used functional magnetic resonance imaging (fMRI) to scan the prisoners’ brains during computer tasks in which subjects had to make quick decisions and inhibit impulsive reactions.
The scans focused on activity in a section of the anterior cingulate cortex (ACC), a small region in the front of the brain involved in motor control and executive functioning. The researchers then followed the ex-convicts for four years to see how they fared.
Among the subjects of the study, men who had lower ACC activity during the quick-decision tasks were more likely to be arrested again after getting out of prison, even after the researchers accounted for other risk factors such as age, drug and alcohol abuse and psychopathic traits. Men who were in the lower half of the ACC activity ranking had a 2.6-fold higher rate of rearrest for all crimes and a 4.3-fold higher rate for nonviolent crimes. The results are published in the Proceedings of the National Academy of Sciences.
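The "fold" figures above are rate ratios between the lower and upper halves of the ACC-activity ranking. A minimal sketch of that unadjusted arithmetic follows, with invented rearrest counts chosen for illustration; only the group size of 96 comes from the article, and the study's actual estimates were adjusted for age, substance abuse, and other risk factors.

```python
# Illustrative only: the rearrest counts are invented so that the
# unadjusted ratio matches the 2.6-fold figure quoted for all crimes.

def rearrest_rate(rearrests: int, released: int) -> float:
    """Fraction of released men rearrested during the follow-up period."""
    return rearrests / released

# Median split of the 96 men into lower and upper halves of ACC activity.
low_acc = rearrest_rate(26, 48)    # hypothetical: 26 of 48 rearrested
high_acc = rearrest_rate(10, 48)   # hypothetical: 10 of 48 rearrested

rate_ratio = low_acc / high_acc
print(f"rearrest rate ratio (low vs high ACC): {rate_ratio:.1f}")  # -> 2.6
```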
There is growing interest in using neuroimaging to predict specific behaviour, says Tor Wager, a neuroscientist at the University of Colorado in Boulder. He says that studies such as this one, which tie brain imaging to concrete clinical outcomes, “provide a new and so far very promising way” to find patterns of brain activity that have broader implications for society.

But the authors themselves stress that much more work is needed to prove that the technique is reliable and consistent, and that it is likely to flag only the truly high-risk felons and leave the low-risk ones alone. “This isn’t ready for prime time,” says Kiehl.
Wager adds that the part of the ACC examined in this study “is one of the most frequently activated areas in the human brain across all kinds of tasks and psychological states”. Low ACC activity could have a variety of causes — impulsivity, caffeine use, vascular health, low motivation or better neural efficiency — and not all of these are necessarily related to criminal behaviour.
Crime prediction was the subject of Dick’s 1956 short story “The Minority Report” (adapted for the silver screen by Steven Spielberg in 2002), which highlighted the thorny ethics of arresting people for crimes they had yet to commit.
Brain scans are of course a far cry from the clairvoyants featured in that science-fiction story. But even if the science turns out to be reliable, the legal and social implications remain to be explored, the authors warn. Perhaps the most appropriate use for neurobiological markers would be for helping to make low-stakes decisions, such as which rehabilitation treatment to assign a prisoner, rather than high-stakes ones such as sentencing or releasing on parole.
“A treatment of [these clinical neuroimaging studies] that is either too glibly enthusiastic or over-critical,” Wager says, “will be damaging for this emerging science in the long run.”

Filed under brain brain activity brain scans neuroimaging anterior cingulate cortex neuroscience science

New mechanism for long-term memory formation discovered
UC Irvine neurobiologists have found a novel molecular mechanism that helps trigger the formation of long-term memory. The researchers believe the discovery of this mechanism adds another piece to the puzzle in the ongoing effort to uncover the mysteries of memory and, potentially, certain intellectual disabilities.
In a study led by Marcelo Wood of UC Irvine’s Center for the Neurobiology of Learning & Memory, the team investigated the role of this mechanism – a gene designated Baf53b – in long-term memory formation. The protein Baf53b encodes is one of several making up a molecular complex called nBAF.
Mutations in the proteins of the nBAF complex have been linked to several intellectual disorders, including Coffin-Siris syndrome, Nicolaides-Baraitser syndrome and sporadic autism. One of the key questions the researchers addressed is how mutations in components of the nBAF complex lead to cognitive impairments.
In their study, Wood and his colleagues used mice bred with mutations in Baf53b. While this genetic modification did not affect the mice’s ability to learn, it did notably inhibit long-term memories from forming and severely impaired synaptic function.
“These findings present a whole new way to look at how long-term memories form,” said Wood, associate professor of neurobiology & behavior. “They also provide a mechanism by which mutations in the proteins of the nBAF complex may underlie the development of intellectual disability disorders characterized by significant cognitive impairments.”
How does this mechanism regulate gene expression required for long-term memory formation? Most genes are tightly packaged by a chromatin structure – chromatin being what compacts DNA so that it fits inside the nucleus of a cell. That compaction mechanism represses gene expression. Baf53b, and the nBAF complex, physically open the chromatin structure so specific genes required for long-term memory formation are turned on. The mutated forms of Baf53b did not allow for this necessary gene expression.
“The results from this study reveal a powerful new mechanism that increases our understanding of how genes are regulated for memory formation,” Wood said. “Our next step is to identify the key genes the nBAF complex regulates. With that information, we can begin to understand what can go wrong in intellectual disability disorders, which paves a path toward possible therapeutics.”
Findings appear online today in Nature Neuroscience.

Filed under brain memory formation LTM genes mutations cognitive impairment neuroscience psychology science

Developing Our Sense of Smell
When our noses pick up a scent, whether the aroma of a sweet rose or the sweat of a stranger at the gym, two types of sensory neurons are at work in sensing that odor or pheromone. These sensory neurons are particularly interesting because they are the only neurons in our bodies that regenerate throughout adult life—as some of our olfactory neurons die, they are soon replaced by newborn cells. Just where those neurons come from in the first place has long perplexed developmental biologists.
Previous hypotheses about the origin of these olfactory nerve cells have given credit to embryonic cells that develop into skin or the central nervous system, where ear and eye sensory neurons, respectively, are thought to originate. But biologists at the California Institute of Technology (Caltech) have now found that neural-crest stem cells—multipotent, migratory cells unique to vertebrates that give rise to many structures in the body such as facial bones and smooth muscle—also play a key role in building olfactory sensory neurons in the nose.
"Olfactory neurons have long been thought to be solely derived from a thickened portion of the ectoderm; our results directly refute that concept," says Marianne Bronner, the Albert Billings Ruddock Professor of Biology at Caltech and corresponding author of a paper published in the journal eLIFE on March 19 that outlines the findings.
The two main types of sensory neurons in the olfactory system are ciliated neurons, which detect volatile scents, and microvillous neurons, which usually sense pheromones. Both of these types are found in the tissue lining the inside of the nasal cavity and transmit sensory information to the central nervous system for processing.
In the new study, the researchers showed that during embryonic development, neural-crest stem cells differentiate into the microvillous neurons, which had long been assumed to arise from the same source as the odor-sensing ciliated neurons. Moreover, they demonstrated that different factors are necessary for the development of these two types of neurons. By eliminating a gene called Sox10, they were able to show that formation of microvillous neurons is blocked whereas ciliated neurons are unaffected.
They made this discovery by studying the development of the olfactory system in zebrafish—a useful model organism for developmental biology studies due to the optical clarity of the free-swimming embryo. Understanding the origins of olfactory neurons and the process of neuron formation is important for developing therapeutic applications for conditions like anosmia, or the inability to smell, says Bronner.
"A key question in developmental biology—the extent of neural-crest stem cell contribution to the olfactory system—has been addressed in our paper by multiple lines of experimentation," says Ankur Saxena, a postdoctoral scholar in Bronner’s laboratory and lead author of the study. "Olfactory neurons are unique in their renewal capacity across species, so by learning how they form, we may gain insights into how neurons in general can be induced to differentiate or regenerate. That knowledge, in turn, may provide new avenues for pursuing treatment of neurological disorders or injury in humans."
Next, the researchers will examine what other genes, in addition to Sox10, play a role in the process by which neural-crest stem cells differentiate into microvillous neurons. They also plan to look at whether or not neural-crest cells give rise to new microvillous neurons during olfactory regeneration that happens after the embryonic stage of development.

Filed under olfactory system nerve cells sensory cells stem cells neurons neuroscience science

377 notes

Meet London’s Babylab, where scientists experiment on babies’ brains
In the laboratories of the Henry Wellcome Building at Birkbeck, University of London, children’s squeaky toys lie scattered on the floor. Brightly coloured posters of animals are pasted on the walls and picture books are stacked on the low tables. This is the Babylab — a research centre that experiments on children aged one month to three years, to understand how they learn, develop and think. “The way babies’ brains change is an amazing and mysterious process,” says the lab director, psychologist Mark Johnson. “The brain increases in size by three- to four-fold between birth and teenage years, but we don’t understand how that relates to its function.”
The Birkbeck neuroscientists are interested in finding out how babies recognise faces, how they learn to pay attention to some things and not others, how they perceive emotion and how their language develops. Studies published by the lab have shown that babies prefer to look at faces over objects. They have also found that differences in the dopamine-producing gene can affect babies’ attention span and that at six to eight months of age, there are detectable differences in the brain patterns of babies who were later diagnosed with autism.
The biggest obstacle is designing the right kinds of experiment. “There aren’t many methods for getting inside the mind of an infant or a toddler,” Johnson explains. Graduate students at the Babylab have teamed up with technology companies, using a €1.9 million (£1.7 million) grant from the European Union, to develop tools such as EEG head nets that record electrical brain activity, helmets that use light to measure blood flow in different parts of the brain, and eye-trackers that help study attention. Eventually, they want to create wireless systems so babies can react and play naturally during experiments. But despite the wires, “all our studies are geared towards making sure our babies are contented,” says Johnson. “If we want data, we need happy babies.”

Filed under babies babylab brain research facial recognition attention EEG neuroscience psychology science

73 notes

Parkinson’s drug helps older people to make decisions

A drug widely used to treat Parkinson’s Disease can help to reverse age-related impairments in decision making in some older people, a study from researchers at the Wellcome Trust Centre for Neuroimaging has shown.

The study, published today in the journal Nature Neuroscience, also describes changes in the patterns of brain activity of adults in their seventies that help to explain why they are worse at making decisions than younger people.

Poorer decision-making is a natural part of the ageing process that stems from a decline in our brains’ ability to learn from our experiences. Part of the decision-making process involves learning to predict the likelihood of getting a reward from the choices that we make.

An area of the brain called the nucleus accumbens is responsible for interpreting the difference between the reward that we’re expecting to get from a decision and the reward that is actually received. These so called ‘prediction errors’, reported by a brain chemical called dopamine, help us to learn from our actions and modify our behaviour to make better choices the next time.
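The prediction-error arithmetic described above can be sketched with a simple delta rule; the function name, learning rate, and reward sequence below are illustrative assumptions, not parameters from the study:

```python
# Delta-rule learning: the expected value of a choice is nudged toward
# the reward actually received, by a fraction of the prediction error
# (the dopamine-like signal described above).
def update_value(expected, reward, learning_rate=0.1):
    prediction_error = reward - expected  # positive: better than expected
    return expected + learning_rate * prediction_error

# A choice rewarded on three of four trials drifts toward a higher value.
value = 0.0
for outcome in [1, 1, 0, 1]:
    value = update_value(value, outcome)
```

With repeated rewards the prediction error shrinks, so the value estimate stabilises rather than growing without bound.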

Dr Rumana Chowdhury, who led the study at the Wellcome Trust Centre for Neuroimaging at UCL, said: “We know that dopamine decline is part of the normal aging process so we wanted to see whether it had any effect on reward-based decision making. We found that when we treated older people who were particularly bad at making decisions with a drug that increases dopamine in the brain, their ability to learn from rewards improved to a level comparable to somebody in their twenties and enabled them to make better decisions.”

The team used a combination of behavioural testing and brain imaging techniques to investigate the decision-making process in 32 healthy volunteers in their early seventies, compared with 22 volunteers in their mid-twenties. Older participants were tested on and off L-DOPA, a drug that increases levels of dopamine in the brain. L-DOPA, more commonly known as levodopa, is widely used in the clinic to treat Parkinson’s disease.

The participants were asked to complete a behavioural learning task called the two-arm bandit, which mimics the decisions that gamblers make while playing slot machines. Players were shown two images and had to choose the one that they thought would give them the biggest reward. Their performance before and after drug treatment was assessed by the amount of money they won in the task.
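A toy version of such a two-armed bandit can be simulated; the reward probabilities, epsilon-greedy choice rule, and learning rate below are illustrative assumptions, not the study's actual task parameters:

```python
import random

# Two-armed bandit: two "slot machines" with hidden reward probabilities.
# A simple learner tracks each arm's value with a delta rule and mostly
# picks whichever arm currently looks better; total winnings measure
# how well it learned to predict reward, as in the task described above.
def run_bandit(trials=1000, probs=(0.3, 0.7), lr=0.1, seed=0):
    rng = random.Random(seed)
    values = [0.0, 0.0]
    winnings = 0
    for _ in range(trials):
        if rng.random() < 0.1:              # occasional exploration
            choice = rng.randrange(2)
        else:                               # otherwise exploit
            choice = 0 if values[0] > values[1] else 1
        reward = 1 if rng.random() < probs[choice] else 0
        winnings += reward
        values[choice] += lr * (reward - values[choice])
    return winnings, values

winnings, values = run_bandit()
```

A learner with a weaker reward-prediction signal (a smaller effective learning rate) converges more slowly and wins less, loosely mirroring the performance gap the study measured.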

"The older volunteers who were less able to predict the likelihood of a reward from their decisions, and so performed worst in the task, showed a significant improvement following drug treatment," Dr Chowdhury explains.

The team then used functional Magnetic Resonance Imaging (fMRI) to look at brain activity in the participants as they played the game, and measured connections between areas of the brain involved in reward prediction using a technique called Diffusion Tensor Imaging (DTI).

The findings reveal that the older adults who performed best in the gambling game before drug treatment had greater integrity of their dopamine pathways. Older adults who performed poorly before drug treatment were not able to adequately signal reward expectation in the brain – this was corrected by L-DOPA and their performance improved on the drug.

Dr John Williams, Head of Neuroscience and Mental Health at the Wellcome Trust, said: “This careful investigation into the subtle cognitive changes that take place as we age offers important insights into what may happen at both a functional and anatomical level in older people who have problems with making decisions. That the team were able to reverse these changes by manipulating dopamine levels offers the hope of therapeutic approaches that could allow older people to function more effectively in the wider community.”

(Source: eurekalert.org)

Filed under brain brain activity parkinson's disease nucleus accumbens aging neuroimaging neuroscience science

100 notes

Unraveling the molecular roots of Down syndrome
Researchers discover that the extra chromosome inherited in Down syndrome impairs learning and memory because it leads to low levels of SNX27 protein in the brain.
What is it about the extra chromosome inherited in Down syndrome—chromosome 21—that alters brain and body development? Researchers have new evidence that points to a protein called sorting nexin 27, or SNX27. SNX27 production is inhibited by a molecule encoded on chromosome 21. The study, published March 24 in Nature Medicine, shows that SNX27 is reduced in human Down syndrome brains. The extra copy of chromosome 21 means a person with Down syndrome produces less SNX27 protein, which in turn disrupts brain function. What’s more, the researchers showed that restoring SNX27 in Down syndrome mice improves cognitive function and behavior.
“In the brain, SNX27 keeps certain receptors on the cell surface—receptors that are necessary for neurons to fire properly,” said Huaxi Xu, Ph.D., Sanford-Burnham professor and senior author of the study. “So, in Down syndrome, we believe lack of SNX27 is at least partly to blame for developmental and cognitive defects.”
SNX27’s role in brain function
Xu and colleagues started out working with mice that lack one copy of the snx27 gene. They noticed that the mice were mostly normal, but showed some significant defects in learning and memory. So the team dug deeper to determine why SNX27 would have that effect. They found that SNX27 helps keep glutamate receptors on the cell surface in neurons. Neurons need glutamate receptors in order to function correctly. With less SNX27, these mice had fewer active glutamate receptors and thus impaired learning and memory.
SNX27 levels are low in Down syndrome
Then the team got thinking about Down syndrome. The SNX27-deficient mice shared some characteristics with Down syndrome, so they took a look at human brains with the condition. This confirmed the clinical significance of their laboratory findings—humans with Down syndrome have significantly lower levels of SNX27.
Next, Xu and colleagues wondered how Down syndrome and low SNX27 are connected—could the extra chromosome 21 encode something that affects SNX27 levels? They suspected microRNAs, small pieces of genetic material that don’t code for protein, but instead influence the production of other genes. It turns out that chromosome 21 encodes one particular microRNA called miR-155. In human Down syndrome brains, the increase in miR-155 levels correlates almost perfectly with the decrease in SNX27.
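An almost-perfect inverse relationship of that kind corresponds to a Pearson correlation coefficient close to -1; here is a minimal sketch with made-up numbers (the paired values below are hypothetical, not the study's measurements):

```python
import math

# Pearson correlation: covariance of the two series divided by the
# product of their standard deviations; -1 means a perfect inverse
# linear relationship.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired measurements: as miR-155 rises, SNX27 falls.
mir155 = [1.0, 1.4, 1.9, 2.5, 3.1]
snx27 = [1.0, 0.8, 0.6, 0.45, 0.3]
r = pearson_r(mir155, snx27)
```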
Xu and his team concluded that, due to the extra chromosome 21 copy, the brains of people with Down syndrome produce extra miR-155, which by indirect means decreases SNX27 levels, in turn decreasing surface glutamate receptors. Through this mechanism, learning, memory, and behavior are impaired.
Restoring SNX27 function rescues Down syndrome mice
If people with Down syndrome simply have too much miR-155 or not enough SNX27, could that be fixed? The team explored this possibility. They used a noninfectious virus as a delivery vehicle to introduce new human SNX27 in the brains of Down syndrome mice.
“Everything goes back to normal after SNX27 treatment. It’s amazing—first we see the glutamate receptors come back, then memory deficit is repaired in our Down syndrome mice,” said Xin Wang, a graduate student in Xu’s lab and first author of the study. “Gene therapy of this sort hasn’t really panned out in humans, however. So we’re now screening small molecules to look for some that might increase SNX27 production or function in the brain.”

Filed under down syndrome chromosome 21 cognitive function brain function neuroscience science

104 notes

DNA damage occurs as part of normal brain activity
Scientists at the Gladstone Institutes have discovered that a certain type of DNA damage long thought to be particularly detrimental to brain cells can actually be part of a regular, non-harmful process. The team further found that disruptions to this process occur in mouse models of Alzheimer’s disease—and identified two therapeutic strategies that reduce these disruptions.
Scientists have long known that DNA damage occurs in every cell, accumulating as we age. But a particular type of DNA damage, known as a double-strand break, or DSB, has long been considered a major force behind age-related illnesses such as Alzheimer’s. Today, researchers in the laboratory of Gladstone Senior Investigator Lennart Mucke, MD, report in Nature Neuroscience that DSBs in neuronal cells in the brain can also be part of normal brain functions such as learning—as long as the DSBs are tightly controlled and repaired in good time. Further, the accumulation of the amyloid-beta protein in the brain—widely thought to be a major cause of Alzheimer’s disease—increases the number of neurons with DSBs and delays their repair.
"It is both novel and intriguing team’s finding that the accumulation and repair of DSBs may be part of normal learning," said Fred H. Gage, PhD, of the Salk Institute who was not involved in this study. "Their discovery that the Alzheimer’s-like mice exhibited higher baseline DSBs, which weren’t repaired, increases these findings’ relevance and provides new understanding of this deadly disease’s underlying mechanisms."
In laboratory experiments, two groups of mice explored a new environment filled with unfamiliar sights, smells and textures. One group was genetically modified to simulate key aspects of Alzheimer’s, and the other was a healthy, control group. As the mice explored, their neurons became stimulated as they processed new information. After two hours, the mice were returned to their familiar, home environment.
The investigators then examined the neurons of the mice for markers of DSBs. The control group showed an increase in DSBs right after they explored the new environment—but after being returned to their home environment, DSB levels dropped.
"We were initially surprised to find neuronal DSBs in the brains of healthy mice," said Elsa Suberbielle, DVM, PhD, Gladstone postdoctoral fellow and the paper’s lead author. "But the close link between neuronal stimulation and DSBs, and the finding that these DSBs were repaired after the mice returned to their home environment, suggest that DSBs are an integral part of normal brain activity. We think that this damage-and-repair pattern might help the animals learn by facilitating rapid changes in the conversion of neuronal DNA into proteins that are involved in forming memories."
The group of mice modified to simulate Alzheimer’s had higher DSB levels at the start—levels that rose even higher during neuronal stimulation. In addition, the team noticed a substantial delay in the DNA-repair process.
To counteract the accumulation of DSBs, the team first used a therapeutic approach built on two recent studies—one of which was led by Dr. Mucke and his team—that showed the widely used anti-epileptic drug levetiracetam could improve neuronal communication and memory in both mouse models of Alzheimer’s and in humans in the disease’s earliest stages. The mice they treated with the FDA-approved drug had fewer DSBs. In their second strategy, they genetically modified mice to lack the brain protein called tau—another protein implicated in Alzheimer’s. This manipulation, which they had previously found to prevent abnormal brain activity, also prevented the excessive accumulation of DSBs.
The team’s findings suggest that restoring proper neuronal communication is important for staving off the effects of Alzheimer’s—perhaps by maintaining the delicate balance between DNA damage and repair.
"Currently, we have no effective treatments to slow, prevent or halt Alzheimer’s, from which more than 5 million people suffer in the United States alone," said Dr. Mucke, who directs neurological research at Gladstone and is a professor of neuroscience and neurology at the University of California, San Francisco, with which Gladstone is affiliated. "The need to decipher the causes of Alzheimer’s and to find better therapeutic solutions has never been more important—or urgent. Our results suggest that readily available drugs could help protect neurons against some of the damages inflicted by this illness. In the future, we will further explore these therapeutic strategies. We also hope to gain a deeper understanding of the role that DSBs play in learning and memory—and in the disruption of these important brain functions by Alzheimer’s disease."
(Image courtesy: Lulu Qian, Erik Winfree & Jehoshua Bruck | California Institute of Technology)

Filed under brain activity brain function brain cells dna damage neurons animal model neuroscience science

712 notes

Farsighted engineer invents bionic eye to help the blind
For UCLA bioengineering professor Wentai Liu, more than two decades of visionary research burst into the headlines last month when the FDA approved what it called “the first bionic eye for the blind.”
The Argus II Retinal Prosthesis System — developed by a team of physicians and engineers from around the country — aids adults who have lost their eyesight due to retinitis pigmentosa (RP), age-related macular degeneration or other eye diseases that destroy the retina’s light-sensitive photoreceptors.
At the heart of the device is a tiny yet powerful computer chip developed by Liu that, when implanted in the retina, effectively sidesteps the damaged photoreceptors to “trick” the eye into seeing. The Argus II operates with a miniature video camera mounted on a pair of eyeglasses that sends information about images it detects to a microprocessor worn on the user’s waistband. The microprocessor wirelessly transmits electronic signals to the computer chip, a fingernail-size grid made up of 60 circuits. These chips stimulate the retina’s nerve cells with electronic impulses which head up the optic nerve to the brain’s visual cortex. There, the brain assembles them into a composite image.
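That camera-to-electrode pipeline can be pictured as aggressive downsampling of each video frame to 60 values, one per electrode. In this toy sketch the 6x10 grid shape and simple block-averaging are illustrative assumptions; the actual Argus II signal processing is far more sophisticated:

```python
# Reduce a grayscale camera frame (a 2-D list of brightness values) to a
# 60-"pixel" grid standing in for the 60-electrode array: each electrode
# receives the mean brightness of its block of the image.
def to_electrode_grid(frame, rows=6, cols=10):
    h, w = len(frame), len(frame[0])
    bh, bw = h // rows, w // cols  # block size per electrode
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            block = [frame[r * bh + i][c * bw + j]
                     for i in range(bh) for j in range(bw)]
            row.append(sum(block) / len(block))
        grid.append(row)
    return grid

# A synthetic 48x60 test frame with a brightness gradient.
frame = [[(x + y) % 256 for x in range(60)] for y in range(48)]
grid = to_electrode_grid(frame)
```

Collapsing thousands of camera pixels to 60 stimulation sites is why recipients perceive coarse outlines and movement rather than full-resolution images.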
Recipients of the retinal implant can read oversized letters of the alphabet, discern objects and movement, and even see the outlines and some details of faces. And while the picture is far from perfect — the healthy human eye sees at a much higher resolution — it’s a breakthrough for people like the first patient, a man in his 70s who was blinded at age 20 by RP, to receive the implant in clinical trials. “It was the first time he’d seen light in a half-century,” said Liu, adding that “it feels good as the engineer” to have helped make this possible.
Liu joined the Artificial Retina Project in 1988 as a professor of computer and electrical engineering at North Carolina State University. The multidisciplinary research project was funded by the U.S. Department of Energy’s Office of Science because it envisioned a potential pandemic of eyesight loss in America’s aging population. Leading the project was Duke University ophthalmologist and neurosurgeon Dr. Mark Humayun, now on faculty at USC. He tapped Liu to engineer the artificial retina.
“I thought it was a great idea,” Liu said. “But I asked, ‘What can I do?’ because I didn’t know much about biology.” Humayun handed him a six-inch-thick medical manual on the retina. “The learning curve was very steep,” Liu recalled with a laugh.
However, Liu’s fellow engineers questioned his sanity. “I was working on integrated chip design and had just gotten tenure when I signed on to this project. They said, ‘You’re crazy!’ But I’m glad I made that choice, getting into this new field.”
How the bionic eye works

Filed under Argus II prosthetics retina retinal implant photoreceptors neuroscience science
