Neuroscience

Articles and news from the latest research reports.

Posts tagged brain cells

88 notes

Technique inactivates Down-causing chromosome

Borrowing a trick from nature, researchers have switched off the extra chromosome that causes Down syndrome in cells taken from patients with the condition.

Though not a cure, the technique, reported July 17 in Nature, has already produced insights into the disorder. In the long run it might even make the flaw that causes Down syndrome correctable through gene therapy.

“Gene therapy is now on the horizon,” says Elizabeth Fisher, a molecular geneticist at University College London. “But that horizon is very far away.”

Down syndrome, also called trisomy 21, occurs when people inherit three copies of chromosome 21 instead of the usual two. It is the most common chromosomal condition, affecting around one in every 700 babies born in the United States. People with the disorder typically have both physical and cognitive complications from the extra chromosome.

“Down syndrome has been one of those disorders where people say, ‘Oh, there’s nothing you can do about it,’ ” says Jeanne Lawrence, a chromosome biologist and genetic counselor at the University of Massachusetts Medical School in Worcester, who led the study with colleagues Lisa Hall and Jun Jiang.

The researchers decided to see whether they could shut down the extra chromosome by drawing on a biological process called X inactivation. Women have two X chromosomes and men have only one X and a Y. To halve the amount of X chromosome products, female cells shut down one copy. Cells do that using a chunk of RNA called XIST, which is made by one X chromosome but not the other. The RNA works by pulling in proteins that essentially board up the chromosome like an abandoned building. The other X stays on by making a different RNA.

Lawrence and Hall thought that if they put XIST on another chromosome, it might shut that one down too. So Jiang put the gene for XIST onto one of the three copies of chromosome 21 carried by stem cells grown from a man with Down syndrome. That copy of the chromosome got switched off.

“It’s kind of surprising that it wasn’t done before. I’m smacking my own forehead and saying, ‘duh,’ ” says Roger Reeves, a geneticist at Johns Hopkins University.

One idea about why an extra chromosome 21 causes cognitive problems is that it may slow down the growth of brain cells. Jiang grew nerve cells from the Down patient’s stem cells to see how cells with one shut-down chromosome developed compared with cells bearing three active copies. The cells with only two working chromosomes grew faster, forming clusters of neurons in a day or two, while the uncorrected cells needed four or five days.

The work is an enormous step forward in Down syndrome research, Fisher says, and “may take us much closer to understanding the molecular basis of the disorder.” The technique could allow researchers to figure out which genes are involved in Down syndrome and how extra copies affect cells and ultimately the body, she says.

Reeves wants to use the technology in animal experiments, a critical step in determining whether it could find use as gene therapy for people with Down syndrome. He plans to work with Lawrence’s group to switch off the extra chromosome in mice engineered to have a disorder that simulates some features of Down syndrome.

But Reeves doubts that scientists could use the method to switch off the extra chromosome in every cell in the body. Doing so would probably require gene therapy at a very early stage of pregnancy, something scientists don’t know how to do. “I just don’t see how we would get there from where we are today,” Reeves says.

Such universal silencing of the extra chromosome may be necessary to forestall developmental problems. But other problems associated with Down syndrome might be prevented or reversed by shutting down the extra chromosome after birth. For instance, people with Down syndrome are at high risk of developing childhood leukemia and of getting Alzheimer’s disease. Gene therapy to turn off the extra chromosome in the bone marrow or the brain might prevent those problems.

Therapeutic possibilities are still far in the future and may never pan out, says William Mobley, a neurologist and neuroscientist at the University of California, San Diego. “We have to move cautiously and deliberately and not say that a cure for Down syndrome is on the horizon,” he says. “It’s not true, but gosh is there excitement that progress is being made.”

(Source: sciencenews.org)

Filed under down syndrome gene therapy trisomy chromosome 21 brain cells genetics science

165 notes

Unique Epigenomic Code Identified During Human Brain Development

Changes in the epigenome, including chemical modifications of DNA, can act as an extra layer of information in the genome, and are thought to play a role in learning and memory, as well as in age-related cognitive decline. The results of a new study by scientists at the Salk Institute for Biological Studies show that the landscape of DNA methylation, a particular type of epigenomic modification, is highly dynamic in brain cells during the transition from birth to adulthood, shedding light on how the information in brain cells’ genomes is controlled from fetal development to adulthood. The brain is far more complex than any other organ in the body, and this discovery opens the door to a deeper understanding of how its intricate patterns of connectivity are formed.

“These results extend our knowledge of the unique role of DNA methylation in brain development and function,” says senior author Joseph R. Ecker, professor and director of Salk’s Genomic Analysis Laboratory and holder of the Salk International Council Chair in Genetics. “They offer a new framework for testing the role of the epigenome in healthy function and in pathological disruptions of neural circuits.”

A healthy brain is the product of a long process of development. The front-most part of our brain, called the frontal cortex, plays a key role in our ability to think, decide and act. The brain accomplishes all of this through the interaction of special cells such as neurons and glia. We know that these cells have distinct functions, but what gives these cells their individual identities? The answer lies in how each cell expresses the information contained in its DNA. Epigenomic modifications, such as DNA methylation, can control which genes are turned on or off without changing letters of the DNA alphabet (A-T-C-G), and thus help distinguish different cell types.

In this new study, published July 4 in Science, the scientists found that the patterns of DNA methylation undergo widespread reconfiguration in the frontal cortex of mouse and human brains during a time of development when synapses, or connections between nerve cells, are growing rapidly. The researchers identified the exact sites of DNA methylation throughout the genome in brains from infants through adults. They found that one form of DNA methylation is present in neurons and glia from birth. Strikingly, a second form of “non-CG” DNA methylation that is almost exclusive to neurons accumulates as the brain matures, becoming the dominant form of methylation in the genome of human neurons. These results help us to understand how the intricate DNA landscape of brain cells develops during the key stages of childhood.

The genetic code in DNA is made up of four chemical bases: adenine (A), guanine (G), cytosine (C), and thymine (T). DNA methylation typically occurs at so-called CpG sites, where C (cytosine) sits next to G (guanine) in the DNA alphabet. About 80 to 90 percent of CpG sites are methylated in human DNA. Salk researchers previously discovered that in human embryonic stem cells and induced pluripotent stem cells, a type of artificially derived stem cell, DNA methylation can also occur when G does not follow C, hence “non-CG methylation.” Originally, they thought that this type of methylation disappeared when stem cells differentiate into specific tissue types, such as lung or fat cells. The current study finds this is not the case in the brain, where non-CG methylation appears after cells differentiate, usually during childhood and adolescence when the brain is maturing.
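The distinction between the two forms of methylation comes down to sequence context: a methylated cytosine either is or is not followed by a guanine. As a hypothetical illustration (not part of the study’s analysis), a short script can classify every cytosine in a DNA string this way:

```python
def classify_cytosines(seq):
    """Split cytosine positions into CpG (C followed by G) and non-CG contexts."""
    seq = seq.upper()
    cpg, non_cg = [], []
    for i, base in enumerate(seq):
        if base != "C":
            continue
        # A cytosine sits in a CpG context when the very next base is G.
        if i + 1 < len(seq) and seq[i + 1] == "G":
            cpg.append(i)
        else:
            non_cg.append(i)
    return cpg, non_cg

# In "ACGTCCGA": the C at index 1 and the C at index 5 precede a G (CpG),
# while the C at index 4 precedes another C (non-CG).
cpg_sites, non_cg_sites = classify_cytosines("ACGTCCGA")
```

Methylation calls from bisulfite sequencing are typically binned into exactly these two context classes, which is what lets the study track CG and non-CG methylation separately across developmental stages.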

By sequencing the genomes of mouse and human brain tissue as well as neurons and glia (from the frontal cortex of the brain) during early postnatal, juvenile, adolescent and adult stages, the Salk team found that non-CG methylation accumulates in neurons through early childhood and adolescence, and becomes the dominant form of DNA methylation in mature human neurons. “This shows that the period during which the neural circuits of the brain mature is accompanied by a parallel process of large-scale reconfiguration of the neural epigenome,” says Ecker, who is a Howard Hughes Medical Institute and Gordon and Betty Moore Foundation investigator.

The study provides the first comprehensive maps of how DNA methylation patterns change in the mouse and human brain during development, forming a critical foundation to now explore whether changes in methylation patterns may be linked to human diseases, including psychiatric disorders. Recent studies have demonstrated a possible role for DNA methylation in schizophrenia, depression, suicide and bipolar disorder. “Our work will let us begin to ask more detailed questions about how changes in the epigenome sculpt the complex identities of brain cells through life,” says co-first author Eran Mukamel, from Salk’s Computational Neurobiology Laboratory.

“The human brain has been called the most complex system that we know of in the universe,” says Ryan Lister, co-corresponding author on the new paper, previously a postdoctoral fellow in Ecker’s laboratory at Salk and now a group leader at The University of Western Australia. “So perhaps we shouldn’t be so surprised that this complexity extends to the level of the brain epigenome. These unique features of DNA methylation that emerge during critical phases of brain development suggest the presence of previously unrecognized regulatory processes that may be critically involved in normal brain function and brain disorders.”

At present, there is consensus among neuroscientists that many mental disorders have a neurodevelopmental origin and arise from an interaction between genetic predisposition and environmental influences (for example, early-life stress or drug abuse), the outcome of which is altered activity of brain networks. The building and shaping of these brain networks requires a long maturation process in which central nervous system cell types (neurons and glia) need to fine-tune the way they express their genetic code.

“DNA methylation fulfills this role,” says study co-author Terrence J. Sejnowski, a Howard Hughes Medical Institute Investigator, holder of the Francis Crick Chair and head of Salk’s Computational Neurobiology Laboratory. “We found that patterns of methylation are dynamic during brain development, in particular for non-CG methylation during early childhood and adolescence, which changes the way that we think about normal brain function and dysfunction.”

By disrupting the transcriptional expression of neurons, adds co-corresponding author M. Margarita Behrens, a staff scientist in the Computational Neurobiology Laboratory, “the alterations of these methylation patterns will change the way in which networks are formed, which could, in turn, lead to the appearance of mental disorders later in life.”

Filed under brain cells dna methylation brain development cognitive function frontal cortex epigenetics neuroscience science

58 notes

Defects in brain cell migration linked to mental retardation

A rare, inherited form of mental retardation has led scientists at Washington University School of Medicine in St. Louis to three important “travel agents” at work in the developing brain.

The agents — two individual proteins and a tightly bound cluster of four additional proteins — make it possible for brain neurons to travel from the area where they are born to other brain regions where they will reside permanently and integrate into neuronal circuits. Inhibiting any of these proteins in embryonic mice reduces the ability of neurons, which process and transmit information, to reach their final destinations and, presumably, to hardwire the brain.

“That kind of misplacement of brain cells is likely to seriously disrupt mental functions,” said Azad Bonni, MD, PhD, the Edison Professor and chairman of the Department of Anatomy and Neurobiology. “This is just one of many ways that brain development can go awry. To understand intellectual disability and develop treatments, we need to understand the many problems that can arise as the brain develops and its circuitry is established.”

The results appeared June 19 in Neuron.

The new work began as an inquiry into PHF6, a gene that is mutated in patients with Börjeson-Forssman-Lehmann syndrome. This disorder causes mental retardation, developmental delays and skeletal abnormalities. More than a decade ago, scientists identified a link between the condition and PHF6, but they did not know what the gene did in the brain.

Bonni’s laboratory added green fluorescent protein to brain cells to track their development and movement in embryonic mice. Then the researchers inhibited PHF6 in some mice.

In normal mice, as expected, brain neurons migrated from the ventricular zone, where they were born, to the cortical plate, the precursor site of the cerebral cortex. In the mature brain, the cerebral cortex is responsible for higher brain functions such as processing of sensory data, attention and decision-making. In mice whose brain cells lacked PHF6, many brain cells either stayed in the ventricular zone or only completed part of their journey.

In a series of additional experiments, Bonni’s research group showed that the PHF6 protein operates in the nucleus of brain neurons, the command center of the cell. The scientists found that the PHF6 protein interacts with the PAF1 complex, a tightly bound cluster of four proteins that regulates programs of gene expression. This cluster then turns on a cell surface protein called neuroglycan C in brain neurons.

If any of these factors were inhibited, mouse brain neurons were unable to complete their normal migration. The researchers could “rescue” the neurons by restoring the missing protein, allowing the cells to complete their journey.

Disrupting proper brain structure and organization may not be the only problem caused by the PHF6 mutation. A portion of patients with Börjeson-Forssman-Lehmann syndrome also have epilepsy.

In tests in mice, Bonni’s group found that the misplaced brain neurons were more excitable. This might result from changes in the activity of other proteins regulated by PHF6 and could make the brain more susceptible to seizures.

The researchers also learned that increasing the production of neuroglycan C in brain neurons overcomes the harmful effects of PHF6 loss on the migration of neurons.

“Cell surface proteins such as neuroglycan C are in good position to help cells move through their environment,” Bonni said. “The protein’s position on the cell surface of neurons also one day might make it an accessible target for drug treatments for developmental cognitive disorders.”

Bonni suspects there might be additional problems in brain cells that develop without normal PHF6 and that errors in the gene might even impair function in neurons that make it to their final destinations. Further studies are underway.

(Source: genetics.wustl.edu)

Filed under mental retardation proteins brain cells brain circuitry PHF6 gene cerebral cortex neuroscience genetics science

44 notes

The discerning fruit fly: Linking brain-cell activity and behavior in smell recognition

Behind the common expression “you can’t compare apples to oranges” lies a fundamental question of neuroscience: How does the brain recognize that apples and oranges are different? A group of neuroscientists at Cold Spring Harbor Laboratory (CSHL) has published new research that provides some answers.

In the fruit fly, the ability to distinguish smells lies in a region of the brain called the mushroom body (MB). Prior research has demonstrated that the MB is associated with learning and memory, especially in relation to the sense of smell, also known as olfaction.

CSHL Associate Professor Glenn Turner and colleagues have now mapped the activity of brain cells in the MB, in flies conditioned to have Pavlovian behavioral responses to different odors. Their results, outlined in a paper published today by the Journal of Neuroscience, suggest that the activity of a remarkably small number of neurons — as few as 25 — is required to be able to distinguish between different odors.

They also found that a similarly small number of nerve cells are involved in grouping alike odors. This means, for instance, that “if you’ve learned that oranges are good, the smell of a tangerine will also get you thinking about food,” says Robert Campbell, a postdoctoral researcher in the Turner lab and lead author on the new study.

These intriguing new findings are part of a broad effort in contemporary neuroscience to determine how the brain, easily the most complex organ in any animal, manages to make a mass of raw sensory data intelligible to the individual — whether a person or a fly — in order to serve as a basis for making vital decisions.

Looking closely at Kenyon cells

The neurons in the fly MB are known as Kenyon cells, named after their discoverer, the neuroscientist Frederick Kenyon, who was the first person to stain and visualize individual neurons in the insect brain. Kenyon cells receive sensory inputs from organs that perceive smell, taste, sight and sound. This confluence of sensory input in the MB is important for memory formation, which comes about through a linking of different types of information.

Kenyon cells make up only about 4% of the entire fly brain and are extremely sensitive to inputs triggered by odors: only two connections between neurons, called synapses, separate them from the receptor cells at the “front end” of the olfactory system.

But in contrast to other regions of the brain, such as the vertebrate hippocampus, the sensory responses in the MB are few in number and relatively weak. It is the sparseness of the signals in the Kenyon cell neurons that makes studying memory formation in flies so promising to Turner and his team. “We set out to learn if these signals were really informative to the animal’s learning and memory with regard to smell,” Turner says.

In particular, Turner’s group wanted to see if they could link these signals with actual behavior in flies. The team used an imaging technique that allowed them to view the responses of over 100 Kenyon cells at a time and, importantly, quantify their results. They found that even the very sparse responses in these cells that are triggered by odors provide a large amount of information about odor identity. Turner suspects the very selectiveness of the response helps in the accurate formation and recall of memories.

When the researchers used two odors blended together in a series of increasingly similar concentrations, they found that two very similar smells could be distinguished as a result of the activity of as few as 25 Kenyon cells. This correlated well with the behavior of the flies: when brain activity suggested the flies had difficulty discerning the odors, their behavior also showed they could not choose between them.

The activity of these cells also accounts for flies’ ability to discern novel odors and group them together. This was determined in a “generalization” test, in which the degree to which flies learned a generalized aversion to unfamiliar test odors could be predicted based upon the relatively similar activity patterns of Kenyon cells that the odors induced.
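One way to picture why similar activity patterns predict generalization is to treat each odor’s Kenyon-cell response as a sparse binary vector and measure how much two patterns overlap. The sketch below is a toy illustration under our own assumptions (the cell counts and the Jaccard overlap measure are invented for the example, not taken from the paper):

```python
import random

def sparse_pattern(n_cells=100, n_active=25, seed=0):
    """A sparse binary activity vector: only a handful of Kenyon cells
    respond to any given odor. Counts here are illustrative."""
    rng = random.Random(seed)
    active = set(rng.sample(range(n_cells), n_active))
    return [1 if i in active else 0 for i in range(n_cells)]

def overlap(a, b):
    """Similarity of two activity patterns: shared active cells divided
    by the union of active cells (Jaccard index)."""
    shared = sum(1 for x, y in zip(a, b) if x and y)
    union = sum(1 for x, y in zip(a, b) if x or y)
    return shared / union

odor_a = sparse_pattern(seed=1)
odor_b = sparse_pattern(seed=2)  # a different odor recruits mostly different cells
# A pattern is maximally similar to itself; distinct odors overlap far less,
# and the more two patterns overlap, the more behavior should generalize.
```

Under this picture, a fly trained to avoid one odor would also avoid a novel odor roughly in proportion to the overlap between the two patterns.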

“Being able to do this type of ‘mind-reading’ means we really understand what signals these activity patterns are sending,” says Turner. Ultimately, he and colleagues hope to be able to relate their findings in the fly brain with the operation of the brain in mammals.

Filed under fruit flies brain cells kenyon cells learning memory olfaction odor detection neurons neuroscience science

50 notes

Gustatory Tug-of-war Key To Whether Salty Foods Taste Good

Fruit fly’s salt taste sensation strategy may apply to other animals, including humans

As anyone who’s ever mixed up the sugar and salt while baking knows, too much of a good thing can be inedible. What hasn’t been clear, though, is how our tongues and brains can tell when the saltiness of our food has crossed the line from yummy to yucky — or, worse, something dangerous.

Now researchers at the Johns Hopkins University School of Medicine and the University of California, Santa Barbara report that in fruit flies, at least, that process is controlled by competing input from two different types of taste-sensing cells: one that attracts flies to salty foods, and one that repels them. Results of their research are described in the June 14 issue of Science.

“The body needs sodium for crucial tasks like putting our muscles into action and letting brain cells communicate with each other, but too much sodium will cause heart problems and other health concerns,” explains Yali Zhang, Ph.D., who led the recent study as part of his graduate work at Johns Hopkins. To maintain health, Zhang says, humans and other animals perceive foods with relatively low salt concentrations as tasty, but avoid eating things with very high salt content.

To find out how the body pulls off this balancing act, Zhang worked with his adviser, Craig Montell, Ph.D., a leading scientist in the field of sensory biology and now a professor at UC Santa Barbara, and graduate student Jinfei Ni to get an up-close view of the fly equivalent of a tongue: its long, curly proboscis. They zoomed in on the proboscis’ so-called sensilla, hair-like structures that serve as the fly’s taste buds.

Previous research had identified several distinct types of sensilla, one of which attracts flies to a taste, while another repels them. Zhang loaded an electrode with a mixture of water and different concentrations of salt, and touched it to each type of sensilla, using the same electrode to detect the electrical signals fired by the sensilla in response to the salt. He found that up to a point, increasing salt concentrations would produce increasingly strong electrical signals in the attractive sensilla, but after that point, the electrical signals dropped off as the concentration continued to rise. In contrast, the repellant sensilla gave off stronger and stronger electrical signals as the salt concentration rose.

Zhang said the team realized that the taste receptor cells in the attractive and repellant sensilla were likely locked in a tug-of-war over whether the fly would continue eating or go off in search of better food. At lower concentrations, the attractive signal would dominate the repellant signal, sending a cumulative message of “yum!” But at high concentrations, the repellant signal would overwhelm the attractive signal, sending the signal “yuck!”

To further test this conclusion, the team mutated a gene called Ir76b that codes for a protein they suspected was involved in the action of the attractive sensilla. To the team’s surprise, loss of Ir76b function caused flies to avoid the otherwise attractive low-salt food. The reason, Zhang found, was that mutating Ir76b impaired only the responses of the attractive sensilla, leaving the repellant sensilla to win the day.

Looking further into the action of the protein produced by Ir76b, the team found that it is a channel with a pore that lets sodium pass into the taste cells of the sensilla. Unlike most pores of this type, which have gates that must be opened by certain key chemical or voltage changes in their environment, this gate is always open, meaning that at any time, sodium can flood into the cell and spark an electrical signal. “It’s an unusual setup, but it makes sense because the local sodium concentration outside taste receptor cells appears to be a lot lower than that surrounding most cells. The taste receptor cells don’t need to keep the gate closed to protect themselves from that excess sodium,” Zhang says.
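The tug-of-war can be sketched as two opposing response curves whose sum sets behavior. The curve shapes and constants below are invented for illustration (only the qualitative shapes — a peaked attractive response and a monotonically rising repellent one — come from the measurements described above):

```python
import math

def attractive_response(salt):
    """Assumed attractive-sensilla signal: strengthens with salt up to a
    peak, then drops off as the concentration keeps rising (as observed)."""
    return salt * math.exp(-salt / 100.0)   # arbitrary units; peak near 100

def repellent_response(salt):
    """Assumed repellent-sensilla signal: keeps rising with concentration."""
    return 0.25 * salt

def net_drive(salt, ir76b_functional=True):
    """Positive net drive reads as "yum", negative as "yuck". Losing Ir76b
    silences only the attractive channel, so even low-salt food reads as
    aversive, matching the mutant flies."""
    attract = attractive_response(salt) if ir76b_functional else 0.0
    return attract - repellent_response(salt)

for salt in (25, 100, 400):   # low, moderate, high salt
    print(salt, "yum" if net_drive(salt) > 0 else "yuck")
print("low salt, Ir76b mutant:",
      "yum" if net_drive(25, ir76b_functional=False) > 0 else "yuck")
```

With these toy curves, low and moderate salt come out “yum”, high salt comes out “yuck”, and the simulated Ir76b mutant reads “yuck” even at low salt.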

Long before we humans started worrying about regulating our sodium intake, it was a problem all animals had to deal with, Zhang says, and thus his research has implications for other animals, including humans. Although animal taste buds and insect sensilla have different makeups, he suspects that the tug-of-war principle may apply to salt-tasting throughout the animal kingdom, given that different species behave similarly when it comes to salty foods. Identifying the low-salt sensor in humans could be particularly useful, he says, as it could lead to the development of better salt substitutes to help people control their sodium intake.

Filed under fruit flies brain cells salt taste receptors Ir76b gene neuroscience science

51 notes

Gene sequencing project finds new mutations to blame for a majority of brain tumor subtype

The St. Jude Children’s Research Hospital – Washington University Pediatric Cancer Genome Project has identified mutations responsible for more than half of a subtype of childhood brain tumor that takes a high toll on patients. Researchers also found evidence the tumors are susceptible to drugs already in development.

The study focused on a family of brain tumors known as low-grade gliomas (LGGs). These slow-growing cancers are found in about 700 children annually in the U.S., making them the most common childhood tumors of the brain and spinal cord. For patients whose tumors cannot be surgically removed, the long-term outlook remains bleak due to complications from the disease and its ongoing treatment. Nationwide, surgery alone cures only about one-third of patients.

Using whole genome sequencing, researchers identified genetic alterations in two genes that occurred almost exclusively in a subtype of LGG termed diffuse LGG. This subtype cannot be cured surgically because the tumor cells invade the healthy brain. Together, the mutations accounted for 53 percent of the diffuse LGG in this study. Researchers also demonstrated that one of the mutations, which had not previously been linked to brain tumors, caused tumors when introduced into the glial brain cells of mice.

The findings appear in the April 14 advance online edition of the scientific journal Nature Genetics.

“This subtype of low-grade glioma can be a nasty chronic disease, yet prior to this study we knew almost nothing about its genetic alterations,” said David Ellison, M.D., Ph.D., chair of the St. Jude Department of Pathology and the study’s corresponding author. The first author is Jinghui Zhang, Ph.D., an associate member of the St. Jude Department of Computational Biology.

The Pediatric Cancer Genome Project is using next-generation whole genome sequencing to determine the complete normal and cancer genomes of children and adolescents with some of the least understood and most difficult to treat cancers. Scientists believe that studying differences in the 3 billion chemical bases that make up the human genome will provide the scientific foundation for the next generation of cancer care.

“We were surprised to find that many of these tumors could be traced to a single genetic alteration,” said co-author Richard K. Wilson, Ph.D., director of The Genome Institute at Washington University School of Medicine in St. Louis. “This is a major pathway through which low-grade gliomas develop and it provides new clues to explore as we search for better treatments.”

The study involved whole genome sequencing of 39 paired tumor and normal tissue samples from 38 children and adolescents with different subtypes of LGG and related tumors called low-grade glioneuronal tumors (LGGNTs). Although many cancers develop following multiple genetic abnormalities, 62 percent of the 39 tumors in this study stemmed from a single genetic alteration.

Previous studies have linked LGGs to abnormal activation of the MAPK/ERK pathway. The pathway is involved in regulating cell division and other processes that are often disrupted in cancer. Until now, however, the genetic alterations involved in driving this pathway were unknown for some types of LGG and LGGNT.

This study linked activation in the pathway to duplication of a key segment of the FGFR1 gene, which investigators discovered in brain tumors for the first time. The segment is called a tyrosine kinase domain. It functions like an on-off switch for several cell signaling pathways, including the MAPK/ERK pathway. Investigators also demonstrated that experimental drugs designed to block activity along two altered pathways worked in cells with the FGFR1 tyrosine kinase domain duplication. “The finding suggests a potential opportunity for using targeted therapies in patients whose tumors cannot be surgically removed,” Ellison said.

Researchers also showed that the FGFR1 abnormality triggered an aggressive brain tumor in glial cells from mice that lacked the tumor suppressor gene Trp53.

Whole-genome sequencing found previously undiscovered rearrangements in the MYB and MYBL1 genes in diffuse LGGs. These newly identified abnormalities were also implicated in switching on the MAPK/ERK pathway.

Researchers checked an additional 100 LGGs and LGGNTs for the same FGFR1, MYB and MYBL1 mutations. Overall, MYB was altered in 25 percent of the diffuse LGGs, and 24 percent had alterations in FGFR1. Researchers also turned up numerous other mutations that occurred in just a few tumors. The affected genes included BRAF, RAF1, H3F3A, ATRX, EP300, WHSC1 and CHD2.

“The Pediatric Cancer Genome Project has provided a remarkable opportunity to look at the genomic landscape of this disease and really put the alterations responsible on the map. We can now account for the genetic errors responsible for more than 90 percent of low-grade gliomas,” Ellison said. “The discovery that FGFR1 and MYB play a central role in childhood diffuse LGG also serves to distinguish the pediatric and adult forms of the disease.”

(Source: stjude.org)

Filed under brain tumors brain cells genetic alterations whole-genome sequencing genetics neuroscience science

291 notes

Low Doses of THC Can Halt Brain Damage

Extremely low doses of marijuana’s psychoactive component protect brain before and after injury, says TAU researcher

Though marijuana is a well-known recreational drug, extensive scientific research has been conducted on the therapeutic properties of marijuana in the last decade. Medical cannabis is often used by sufferers of chronic ailments, including cancer and post-traumatic stress disorder, to combat pain, insomnia, lack of appetite, and other symptoms.

Now Prof. Yosef Sarne of Tel Aviv University’s Adelson Center for the Biology of Addictive Diseases at the Sackler Faculty of Medicine says that the drug has neuroprotective qualities as well. He has found that extremely low doses of THC — the psychoactive component of marijuana — protect the brain from long-term cognitive damage in the wake of injury from hypoxia (lack of oxygen), seizures, or toxic drugs. Brain damage can have consequences ranging from mild cognitive deficits to severe neurological damage.

Previous studies focused on injecting high doses of THC within a very short time frame — approximately 30 minutes — before or after injury. Prof. Sarne’s current research, published in the journals Behavioural Brain Research and Experimental Brain Research, demonstrates that even extremely low doses of THC — around 1,000 to 10,000 times less than that in a conventional marijuana cigarette — administered over a wide window of 1 to 7 days before or 1 to 3 days after injury can jumpstart biochemical processes which protect brain cells and preserve cognitive function over time.

This treatment, especially in light of the long time frame for administration and the low dosage, could be applicable to many cases of brain injury and be safer over time, Prof. Sarne says.

Conditioning the brain

While performing experiments on the biology of cannabis, Prof. Sarne and his fellow researchers discovered that low doses of the drug had a big impact on cell signalling, preventing cell death and promoting growth factors. This finding led to a series of experiments designed to test the neuroprotective ability of THC in response to various brain injuries.

In the lab, the researchers injected mice with a single low dose of THC either before or after exposing them to brain trauma. A control group of mice sustained brain injury but did not receive the THC treatment. When the mice were examined 3 to 7 weeks after initial injury, recipients of the THC treatment performed better in behavioral tests measuring learning and memory. Additionally, biochemical studies showed heightened amounts of neuroprotective chemicals in the treatment group compared to the control group.

The use of THC can prevent long-term cognitive damage that results from brain injury, the researchers conclude. One explanation for this effect is pre- and post-conditioning, whereby the drug causes minute damage to the brain to build resistance and trigger protective measures in the face of much more severe injury, explains Prof. Sarne. The low dosage of THC is crucial to initiating this process without causing too much initial damage.

Preventative and long-term use

According to Prof. Sarne, there are several practical benefits to this treatment plan. Due to the long therapeutic time window, this treatment can be used not only to treat injury after the fact, but also to prevent injury that might occur in the future. For example, cardiopulmonary heart-lung machines used in open heart surgery carry the risk of interrupting the blood supply to the brain, and the drug can be delivered beforehand as a preventive measure. In addition, the low dosage makes it safe for regular use in patients at constant risk of brain injury, such as epileptics or people at a high risk of heart attack.

Prof. Sarne is now working in collaboration with Prof. Edith Hochhauser of the Rabin Medical Center to test the ability of low doses of THC to prevent damage to the heart. Preliminary results indicate that they will find the same protective phenomenon in relation to cardiac ischemia, in which the heart muscle receives insufficient blood flow.

(Source: aftau.org)

Filed under cannabis brain injury brain cells brain damage PTSD neuroscience science

39 notes

A molecular chain reaction in Alzheimer’s disease

Researchers at Lund University in Sweden have identified the molecular mechanism behind the transformation of one of the components in Alzheimer’s disease. They identified the crucial step leading to formations that kill brain cells.

Alzheimer’s disease is associated with memory loss and personality changes. It is still not known what causes the onset of the disease, but once started it cannot be stopped. The accumulation of plaques in the brain is widely considered a hallmark of the disease. The key discovery identified the chemical reaction that causes the plaques to grow exponentially.

Amyloid beta, a protein fragment that occurs naturally in the fluid around the brain, is one of the building blocks of plaques. However, the processes leading from soluble amyloid beta to the form found in the plaques, known as amyloid fibril, have not been known. In the very early part of the process, two protein fragments can create a nucleus that then grows into a fibril.

In solution this is a slow process, but the rate can be enhanced on surfaces. The current study shows that fibrils present a catalytic surface on which new nuclei form, and this reaction speeds up the process. As soon as the first fibrils are formed, amyloid-beta fragments attach to their surfaces and form new fibrils that subsequently detach.

“This process is thus self-perpetuating and autocatalytic: the more fibrils are present, the quicker new ones are created,” says Sara Snogerup Linse, Professor of Chemistry at Lund University and one of the researchers behind the study.
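The self-perpetuating loop can be sketched with a minimal kinetic model. The rate constants, units, and reaction order below are arbitrary illustrations, not the study's fitted parameters; the model only captures the described mechanism, in which fibril ends elongate by consuming soluble amyloid beta while the fibril surface catalyzes the creation of new ends:

```python
def simulate(surface_catalysis=True, steps=2000, dt=0.01):
    """Euler integration of a toy secondary-nucleation model."""
    m = 1.0      # soluble amyloid-beta (arbitrary units)
    P = 1e-6     # growing fibril ends, seeded by a few primary nuclei
    M = 0.0      # total fibril mass
    k_elong, k_surface = 50.0, 100.0   # illustrative rate constants
    for _ in range(steps):
        # new nuclei form on fibril surfaces -- the autocatalytic step
        new_ends = k_surface * m**2 * M if surface_catalysis else 0.0
        growth = k_elong * m * P        # ends grow by adding soluble monomer
        P += new_ends * dt
        M += growth * dt
        m = max(1.0 - M, 0.0)           # soluble amyloid beta is consumed
    return M

# With surface catalysis on, fibril mass takes off explosively; without it,
# almost nothing forms in the same time.
print(simulate(True), simulate(False))
```

Because each new fibril adds catalytic surface, growth feeds on itself and the fibril mass rises exponentially until the soluble supply runs out, which is the behavior the study attributes to the plaques.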

The findings also show that the chemical reaction on the fibril surface creates cell-killing formations. It is hoped that the research could lead to a new type of medication targeting early stages of the disease in the future.

The results have emerged from several years of laboratory work by Professor Snogerup Linse and her colleague in Lund, Erik Hellstrand, including the development of extensive methods to obtain amyloid beta in highly pure form and to study its transformation in a highly reproducible manner. Additional methodology based on isotope labelling and spin filters was developed to monitor the surface catalysis and pinpoint the origin of the forms that kill brain cells. Collaboration with a theoretical group and cell biologists at Cambridge University was crucial to all of the findings.

(Source: alphagalileo.org)

Filed under alzheimer’s disease amyloid beta amyloid fibril brain cells neuroscience science

172 notes

Down syndrome neurons grown from stem cells show signature problems

Down syndrome, the most common genetic form of intellectual disability, results from an extra copy of one chromosome. Although people with Down syndrome experience intellectual difficulties and other problems, scientists have had trouble identifying why that extra chromosome causes such widespread effects.

In new research published this week, Anita Bhattacharyya, a neuroscientist at the Waisman Center at UW-Madison, reports on brain cells that were grown from skin cells of individuals with Down syndrome.

"Even though Down syndrome is very common, it’s surprising how little we know about what goes wrong in the brain," says Bhattacharyya. "These new cells provide a way to look at early brain development."

The study began when those skin cells were transformed into induced pluripotent stem cells, which can be grown into any type of specialized cell. Bhattacharyya’s lab, working with Su-Chun Zhang and Jason Weick, then grew those stem cells into brain cells that could be studied in the lab.

One significant finding was a reduction in connections among the neurons, Bhattacharyya says. “They communicate less, are quieter. This is new, but it fits with what little we know about the Down syndrome brain.”  Brain cells communicate through connections called synapses, and the Down neurons had only about 60 percent of the usual number of synapses and synaptic activity. “This is enough to make a difference,” says Bhattacharyya. “Even if they recovered these synapses later on, you have missed this critical window of time during early development.”

The researchers looked at genes that were affected in the Down syndrome stem cells and neurons, and found that genes on the extra chromosome were expressed at about 150 percent of normal levels, consistent with the presence of three copies of the chromosome rather than two.

However, the output of about 1,500 genes elsewhere in the genome was strongly affected. “It’s not surprising to see changes, but the genes that changed were surprising,” says Bhattacharyya. The predominant increase was seen in genes that respond to oxidative stress, which occurs when molecular fragments called free radicals damage a wide variety of tissues.

"We definitely found a high level of oxidative stress in the Down syndrome neurons," says Bhattacharyya. "This has been suggested before from other studies, but we were pleased to find more evidence for that. We now have a system we can manipulate to study the effects of oxidative stress and possibly prevent them."

Down syndrome includes a range of symptoms that could result from oxidative stress, Bhattacharyya says, including accelerated aging. “In their 40s, Down syndrome individuals age very quickly. They suddenly get gray hair; their skin wrinkles; there is rapid aging in many organs, and a quick appearance of Alzheimer’s disease. Many of these processes may be due to increased oxidative stress, but it remains to be directly tested.”

Oxidative stress could be especially significant, because it appears right from the start in the stem cells. “This suggests that these cells go through their whole life with oxidative stress,” Bhattacharyya adds, “and that might contribute to the death of neurons later on, or increase susceptibility to Alzheimer’s.”

Other researchers have created neurons with Down syndrome from induced pluripotent stem cells, Bhattacharyya notes. “However, we are the first to report this synaptic deficit, and to report the effects on genes on other chromosomes in neurons. We are also the first to use stem cells from the same person that either had or lacked the extra chromosome. This allowed us to look at the differences caused just by the extra chromosome, not due to genetic differences among people.”

The research, published the week of May 27 in the Proceedings of the National Academy of Sciences, was a basic exploration of the roots of Down syndrome. Bhattacharyya says that while she did not intend to explore treatments in the short term, “we could potentially use these cells to test or intelligently design drugs to target symptoms of Down syndrome.”

(Source: news.wisc.edu)

Filed under down syndrome stem cells brain cells brain development synapses oxidative stress neuroscience science

63 notes

Finding a family for a pair of orphan receptors in the brain

Researchers at Emory University have identified a protein that stimulates a pair of “orphan receptors” found in the brain, solving a long-standing biological puzzle and possibly leading to future treatments for neurological diseases.

The results are published in the Proceedings of the National Academy of Sciences, Early Edition.

The human genome is littered with orphans: proteins that look like they will bind and respond to a hormone or a brain chemical, based on the similarity of their sequences to other proteins. However, scientists haven’t figured out what each orphan’s partner chemical is yet.

Orphans that look like GPCRs (G protein-coupled receptors) currently number about 100. GPCRs are the targets of many drugs and are involved in vision, smell and brain cells’ responses to a host of hormones and neurotransmitters. One orphan GPCR, called GPR37, has attracted interest from researchers because it is connected with an inherited form of Parkinson’s disease. It is abundant in the dopamine-producing neurons that degenerate in Parkinson’s. But its partner chemical, or “ligand,” has not been found.

"We reasoned that GPR37 had to be doing something important, besides becoming misfolded in some forms of Parkinson’s," says senior author Randy Hall, PhD, professor of pharmacology at Emory University School of Medicine.

Working with Hall, graduate student Rebecca Meyer devised a way to detect when cells producing GPR37 were reacting with GPR37’s ligand.

"Usually, cells remove GPCRs from their surfaces when they encounter their ligand," Meyer says. "So we set things up so that GPR37 would be labeled red on the surface of the cell, but would appear green once internalized."
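The red-outside/green-inside scheme reduces to a simple ratio. The sketch below is a hypothetical quantification with made-up fluorescence counts (the study's actual readout may differ); it only shows how a jump in the internalized fraction would flag a candidate ligand:

```python
def internalized_fraction(red_surface, green_internal):
    """Fraction of labeled receptor that has moved inside the cell:
    red marks GPR37 still on the surface, green marks internalized GPR37."""
    total = red_surface + green_internal
    return green_internal / total if total else 0.0

# Invented per-cell fluorescence counts before and after adding a
# candidate ligand; a jump in the internalized fraction flags a hit.
baseline = internalized_fraction(red_surface=90, green_internal=10)
after_prosaposin = internalized_fraction(red_surface=30, green_internal=70)
print(baseline, after_prosaposin)   # 0.1 0.7
```

In this toy example the internalized fraction jumps from 0.1 to 0.7, the kind of shift that would indicate the cells had encountered GPR37's ligand.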

They discovered that cells producing GPR37 — and also a close relative, GPR37L1 — respond to a protein known as prosaposin, which was discovered by John O’Brien of the University of California, San Diego in the 1990s.

Prosaposin is a growth factor for brain cells and protects them from stress. Scientists studying it had worked out that it stimulates cells via a GPCR — but which one was unclear until now. In animal models, prosaposin has shown potential for treating conditions such as stroke, Parkinson’s and neuropathic pain. An artificial fragment of prosaposin called prosaptide has been tested in clinical studies, but it quickly breaks down in the body.

"That’s the reason why it was so important to find the receptor," Hall says. "Then we can actually do some pharmacology."

Now, Hall’s laboratory is planning to look for other compounds that can activate GPR37 as well. These could be more stable in the body than the previously studied protein fragment and thus better potential drugs.

Doctors have reported a few cases of genetic deficiency in prosaposin, leading to severe neurodegeneration. Mice engineered to lack GPR37 have more subtle brain perturbations, so Hall also plans to test the hypothesis that prosaposin acts through both GPR37 and GPR37L1 by “knocking out” both receptors in mice, potentially duplicating the severe effects seen in the human cases of prosaposin deficiency.

Filed under neurological disorders brain cells receptors proteins ligands neuroscience science
