Neuroscience

Articles and news from the latest research reports.

161 notes

Pleasant Smells Increase Facial Attractiveness

New research from the Monell Chemical Senses Center reveals that women’s faces are rated as more attractive in the presence of pleasant odors. In contrast, odor pleasantness had less effect on the evaluation of age. The findings suggest that the use of scented products such as perfumes may, to some extent, alter how people perceive one another.

“Odor pleasantness and facial attractiveness integrate into one joint emotional evaluation,” said lead author Janina Seubert, PhD, a cognitive neuroscientist who was a postdoctoral fellow at Monell at the time the research was conducted. “This may indicate a common site of neural processing in the brain.”

Perfumes and scented products have been used for centuries to enhance personal appearance. Previous studies had shown that perceptions of facial attractiveness can be influenced by unpleasant versus pleasant odors. However, it was not known whether odors change the actual visual perception of facial features or, alternatively, how faces are emotionally evaluated by the brain.

The current study design centered on the principle that judgments of attractiveness and age involve two distinct types of perceptual processing: attractiveness is regarded as an emotional process, while judgments of age are believed to be cognitive, or rationally based.

In the study, published in open access journal PLOS ONE, 18 young adults, two thirds of whom were female, were asked to rate the attractiveness and age of eight female faces, presented as photographs. The images varied in terms of natural aging features.

While the participants evaluated the images, one of five odors was simultaneously released. These were blends of fish oil (unpleasant) and rose oil (pleasant), ranging from predominantly fish oil to predominantly rose oil. The subjects were asked to rate the age of the face in the photograph, the attractiveness of the face, and the pleasantness of the odor.

Across the range of odors, odor pleasantness directly influenced ratings of facial attractiveness. This suggests that olfactory and visual cues are combined into a single emotional evaluation rather than weighed independently.
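As a toy illustration of the kind of relationship being reported (not the authors' actual analysis, which is in the PLOS ONE paper), a least-squares slope of attractiveness ratings on odor-pleasantness ratings summarizes how strongly the two co-vary across trials. The rating data below are invented.

```python
def least_squares_slope(xs, ys):
    """Slope of the best-fit line of ys on xs (ordinary least squares)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

# Hypothetical trial data: one (odor pleasantness, attractiveness) pair per
# trial, on a 1-9 scale. A positive slope mirrors the reported pattern that
# pleasanter odors accompany higher attractiveness ratings.
pleasantness = [2, 3, 4, 5, 6, 7, 8]
attractiveness = [3, 3, 4, 5, 5, 6, 7]
print(round(least_squares_slope(pleasantness, attractiveness), 2))
```

A slope near zero would instead indicate that odor pleasantness carried no information about the attractiveness judgment.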

With regard to the cognitive task of age evaluation, visual age cues (more wrinkles and blemishes) were linked to older age perception. However, odor pleasantness had a mixed effect. Visual age cues strongly influenced age perception during pleasant odor stimulation, making older faces look older and younger faces look younger. This effect was weakened in the presence of unpleasant odors, so that younger and older faces were perceived to be more similar in age.

Jean-Marc Dessirier, Lead Scientist at Unilever and a co-author on the study said, “These findings have fascinating implications in terms of how pleasant smells may help enhance natural appearance within social settings. The next step will be to see if the findings extend to evaluation of male facial attractiveness.”

(Source: monell.org)

Filed under facial attractiveness smell odor pleasantness sensory perception face perception psychology neuroscience science

303 notes

A ‘hands-on’ approach could help babies develop spatial awareness

A study from the Department of Psychology published today found:

  • Changes in the way the brain processes touch in the first year of life
  • Babies start keeping track of where their hands are when their arms move around, from around 8 months
  • Crossing the hands confuses the mind in young babies
  • The way we perceive touch in the outside world develops in the first year of life

The research, from Goldsmiths’ InfantLab, suggested that babies’ tactile experiences could be important for developing their sense of place in the world around them.

The InfantLab research team carried out their study on 66 babies aged from six to ten months old.

Babies felt harmless ‘buzzes’ on their arms

In the study, babies felt little tactile ‘buzzes’ on their hands first with their arms in an uncrossed position and then in a crossed position, while their brain activity was recorded through an EEG (electroencephalography) sensor net.
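EEG studies of this kind typically average many stimulus-locked epochs per condition and then compare the averaged waveforms between postures. The sketch below uses made-up microvolt values and is not the InfantLab's actual analysis pipeline.

```python
def grand_average(trials):
    """Average a list of equal-length EEG epochs, sample by sample."""
    n = len(trials)
    return [sum(samples) / n for samples in zip(*trials)]

def difference_wave(cond_a, cond_b):
    """Sample-wise difference between two averaged waveforms."""
    return [a - b for a, b in zip(cond_a, cond_b)]

# Hypothetical microvolt values for a few tactile-buzz trials per posture.
uncrossed = [[0.0, 1.0, 3.0, 1.0], [0.0, 1.2, 2.8, 1.0]]
crossed = [[0.0, 0.9, 2.0, 1.1], [0.0, 1.1, 1.8, 0.9]]

avg_u = grand_average(uncrossed)  # roughly [0.0, 1.1, 2.9, 1.0]
avg_c = grand_average(crossed)    # roughly [0.0, 1.0, 1.9, 1.0]
print(difference_wave(avg_u, avg_c))
```

A posture-dependent difference wave like this is what would indicate that the brain's touch response changes when the arms are crossed.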

This is one of the first pieces of research to focus on the development of ‘touch perception’, which is crucial for investigating how babies learn to perceive how their own bodies fit into the world around them.

Dr Andy Bremner, InfantLab Director, explained: “We discovered that it takes time for babies to build up good mechanisms for perceiving how they fit into the outside world. Specifically, early on they do not appear to perceive the ways in which the body changes when their limbs, in this case their arms, move around.” 

Dr Silvia Rigato, researcher on the project, commented: “The vast majority of previous studies on infant perception has focussed on what babies perceive of a visual environment on a screen and out of reach, giving us a picture of what babies can do and understand when in couch potato mode.”

“Our research has taken this a step further. As adults we need good maps of where our bodies and limbs are in order to be able to act and move around competently. It seems these take time to develop in the first year, and we didn’t know that before.”

The full research paper ‘The neural basis of somatosensory remapping develops in human infancy’ was published in the journal Current Biology.

Filed under brain activity EEG infants somatosensory remapping brain development psychology neuroscience science

159 notes

Citizens help researchers to challenge scientific theory

Science crowdsourcing was used to disprove a widely held theory that “supertasters” owe their special sensitivity to bitter tastes to an unusually high density of taste buds on their tongue, according to a study published in the open-access journal Frontiers in Integrative Neuroscience.

Supertasters are people who can detect and are extremely sensitive to phenylthiocarbamide and propylthiouracil, two compounds related to the bitter molecules in certain foods such as broccoli and kale. Supertasting has been used to explain why some people don’t like spicy foods or “hoppy” beers, or why some kids are picky eaters.

The sensitivity to these bitter tastants is partly due to a variation in the taste receptor gene TAS2R38. But some scientists believe that the ability to supertaste is also boosted by a greater-than-average number of “papillae”, bumps on the tongue that contain taste buds. Nicole Garneau, Curator and Chair of the Department of Health Sciences, Denver Museum of Nature & Science, and colleagues tested if this is true.

"There is a long-held belief that if you stick out your tongue and look at the bumps on it, then you can predict how sensitive you are to strong tastes like bitterness in vegetables and strong sensations like spiciness," says Garneau. "The commonly accepted theory has been that the more bumps you have, the more taste buds you have and therefore the more sensitive you are."

Over 3000 visitors to the museum’s Genetics of Taste Lab volunteered to stick their tongues out so that their papillae could be counted and their sensitivity to phenylthiocarbamide and propylthiouracil measured. In total, 394 study subjects were included in the analysis. Cell swabs from volunteers were taken to determine their DNA sequence at TAS2R38. Results confirmed that certain variations in TAS2R38 make it more likely that somebody is sensitive to bitter tastes, but also showed that the number of papillae on the tongue is not associated with increased taste sensitivity.
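As a sketch of the genotype side of this analysis (the study's own statistics are in the Frontiers paper), bitterness ratings can be grouped by TAS2R38 diplotype, here the common PAV (taster) and AVI (non-taster) haplotypes, and compared by group mean; the ratings below are invented. The study's point was that an analogous grouping by papillae count showed no such difference.

```python
from statistics import mean

def group_means(records):
    """Mean bitterness rating per TAS2R38 diplotype group.

    `records` is a list of (diplotype, rating) pairs -- a made-up
    encoding for illustration, not the study's data format.
    """
    groups = {}
    for diplotype, rating in records:
        groups.setdefault(diplotype, []).append(rating)
    return {d: mean(r) for d, r in groups.items()}

# Hypothetical ratings on a 1-9 bitterness scale.
records = [
    ("PAV/PAV", 8), ("PAV/PAV", 9),   # two taster alleles: very sensitive
    ("PAV/AVI", 6),                   # one taster allele: intermediate
    ("AVI/AVI", 2), ("AVI/AVI", 3),   # no taster allele: insensitive
]
print(group_means(records))
```

The separation between genotype groups is the kind of association the study confirmed; a flat set of group means across papillae-count bins is what led the authors to reject the bump-counting theory.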

"No matter how we looked at the data, we couldn’t replicate this long held assumption that a high number of papillae equals supertasting," says Garneau.

The authors argue against the continued misuse of the term supertaster, and for the use of the more objective term hypergeusia – abnormally sensitized taste – to describe people who are sensitive to all tastes and sensations from food.

"What we know and understand about how our bodies work improves greatly when we challenge central dogmas of our knowledge. This is the nature of science itself," adds Garneau. "As techniques improve, so too does our ability to do science, and we find that what we accepted as truth 20, 30, or 100 years ago gets replaced with better theories as we gather new data, which advances science. In this case, we’ve proven that with the ‘Denver Papillae Protocol’, our new method for objective analysis for papillae density, we were unable to replicate well-known studies about supertasting."

What makes this study unique is that most of the results were collected by citizen scientists, including over 130 volunteers who had been specially trained by Garneau and her colleagues. The Genetics of Taste Lab is located in the heart of the museum, uniquely situated to attract volunteers and dedicated citizen scientists who conduct population-based research about human genetics, taste, and health.

Filed under taste supertasting hypergeusia TAS2R38 genetics neuroscience science

92 notes

Memory Problems After Chemo Linked to Brain Changes

Breast cancer survivors who had chemotherapy show changes in brain activity during multitasking chores, according to a new Belgian study.

These findings may partly explain the phenomenon dubbed “chemo brain.” For years, people who’ve had chemotherapy have reported changes in thinking and memory, especially when doing more than one thing at once.

"Before you can fix a problem, you need to know what the problem is. And this study demonstrates what the problem may be. It’s a really good first step to understanding the what. Now we need to understand the why and how to fix it," said Dr. Courtney Vito, a breast surgeon and assistant clinical professor of surgical oncology at the City of Hope Comprehensive Cancer Center in Duarte, Calif. Vito was not involved in the current study, but reviewed the study’s findings.

In her experience, Vito said, women tend to be affected more by chemo brain than men are after chemotherapy. However, she said, “women tend to multitask more, so this might explain part of it.”

The new study was published online May 27 in the Journal of Clinical Oncology.

Filed under breast cancer memory chemo brain chemotherapy health science

618 notes

Cynical? You May Be Hurting Your Brain Health

People with high levels of cynical distrust may be more likely to develop dementia, according to a study published in the May 28, 2014, online issue of Neurology®, the medical journal of the American Academy of Neurology.

Cynical distrust, which is defined as the belief that others are mainly motivated by selfish concerns, has been associated with other health problems, such as heart disease. This is the first study to look at the relationship between cynicism and dementia.

“These results add to the evidence that people’s view on life and personality may have an impact on their health,” said study author Anna-Maija Tolppanen, PhD, of the University of Eastern Finland in Kuopio. “Understanding how a personality trait like cynicism affects risk for dementia might provide us with important insights on how to reduce risks for dementia.”

For the study, 1,449 people with an average age of 71 were given tests for dementia and a questionnaire to measure their level of cynicism. The questionnaire has been shown to be reliable, and people’s scores tend to remain stable over periods of several years. People are asked how much they agree with statements such as “I think most people would lie to get ahead,” “It is safer to trust nobody” and “Most people will use somewhat unfair reasons to gain profit or an advantage rather than lose it.” Based on their scores, participants were grouped in low, moderate and high levels of cynical distrust.

A total of 622 people completed two tests for dementia, with the last one an average of eight years after the study started. During that time, 46 people were diagnosed with dementia. Once researchers adjusted for other factors that could affect dementia risk, such as high blood pressure, high cholesterol and smoking, people with high levels of cynical distrust were three times more likely to develop dementia than people with low levels of cynicism. Of the 164 people with high levels of cynicism, 14 people developed dementia, compared to nine of the 212 people with low levels of cynicism.
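The raw counts above give a crude, unadjusted comparison; the threefold figure is the estimate after adjusting for blood pressure, cholesterol, smoking and other factors, so it differs from this back-of-the-envelope ratio.

```python
# Crude dementia risk in each cynicism group, using the counts in the text.
high_risk = 14 / 164   # high cynical distrust: 14 of 164 developed dementia
low_risk = 9 / 212     # low cynical distrust: 9 of 212 developed dementia

# Unadjusted risk ratio, before any covariate adjustment.
crude_risk_ratio = high_risk / low_risk
print(round(high_risk, 3), round(low_risk, 3), round(crude_risk_ratio, 2))
```

That the adjusted estimate (about 3) is larger than the crude ratio (about 2) simply reflects how the covariates were distributed across groups in this cohort.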

The study also looked at whether people with high levels of cynicism were more likely to die sooner than people with low levels of cynicism. A total of 1,146 people were included in this part of the analysis, and 361 people died during the average of 10 years of follow-up. High cynicism was initially associated with earlier death, but after researchers accounted for factors such as socioeconomic status, behaviors such as smoking and health status, there was no longer any link between cynicism and earlier death.

(Image: Shutterstock)

Filed under cynical distrust aging dementia memory cynicism neuroscience science

130 notes

The claustrum’s proposed role in consciousness is supported by the effect and target localization of Salvia divinorum

This article brings together three findings and ideas relevant for the understanding of human consciousness: (I) Crick’s and Koch’s theory that the claustrum is a “conductor of consciousness” crucial for subjective conscious experience. (II) Subjective reports of the consciousness-altering effects of the plant Salvia divinorum, whose primary active ingredient is salvinorin A, a κ-opioid receptor agonist. (III) The high density of κ-opioid receptors in the claustrum. Fact III suggests that the consciousness-altering effects of S. divinorum/salvinorin A (II) are due to a κ-opioid receptor mediated inhibition of primarily the claustrum and, additionally, the deep layers of the cortex, mainly in prefrontal areas. Consistent with Crick and Koch’s theory that the claustrum plays a key role in consciousness (I), the subjective effects of S. divinorum indicate that salvia disrupts certain facets of consciousness much more than the largely serotonergic hallucinogen lysergic acid diethylamide (LSD). Based on this data and on the relevant literature, we suggest that the claustrum does indeed serve as a conductor for certain aspects of higher-order integration of brain activity, while integration of auditory and visual signals relies more on coordination by other areas including parietal cortex and the pulvinar.

Full Article

Filed under consciousness claustrum salvinorin A brain activity neuroscience science

274 notes

Uncovering Clues to the Genetic Cause of Schizophrenia

The overall number and nature of mutations—rather than the presence of any single mutation—influences an individual’s risk of developing schizophrenia, as well as its severity, according to a discovery by Columbia University Medical Center researchers published in the latest issue of Neuron. The findings could have important implications for the early detection and treatment of schizophrenia.

Maria Karayiorgou, MD, professor of psychiatry and Joseph Gogos, MD, PhD, professor of physiology and cellular biophysics and of neuroscience, and their team sequenced the “exome”—the region of the human genome that codes for proteins—of 231 schizophrenia patients and their unaffected parents. Using this data, they demonstrated that schizophrenia arises from collective damage across several genes.

“This study helps define a specific genetic mechanism that explains some of schizophrenia’s heritability and clinical manifestation,” said Dr. Karayiorgou, who is acting chief of the Division of Psychiatric and Medical Genetics at the New York State Psychiatric Institute. “Accumulation of damaged genes inherited from healthy parents leads to higher risk not only to develop schizophrenia but also to develop more severe forms of the disease.”

Schizophrenia is a severe psychiatric disorder in which patients experience hallucination, delusion, apathy and cognitive difficulties. The disorder is relatively common, affecting around 1 in every 100 people, and the risk of developing schizophrenia is strongly increased if a family member has the disease. Previous research has focused on the search for individual genes that might trigger schizophrenia. The availability of new high-throughput DNA sequencing technology has contributed to a more holistic approach to the disease.

The researchers compared sequencing data to look for genetic differences and identify new loss-of-function mutations—which are rarer, but have a more severe effect on ordinary gene function—in cases of schizophrenia that had not been inherited from the patients’ parents. They found an excess of such mutations in a variety of genes across different chromosomes.

Using the same sequencing data, the researchers also looked at what types of mutations are commonly passed on to schizophrenia patients from their parents. It turns out that many of these are “loss-of-function” types. These mutations were also found to occur more frequently in genes with a low tolerance for genetic variation.
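The burden-style logic described here, where risk tracks the accumulated load of damaging variants rather than any single culprit gene, can be sketched as a per-individual count. The variant encoding below is hypothetical, for illustration only, not the paper's data format.

```python
# Variant classes that typically abolish a gene's protein product;
# loss-of-function variants are the rarest but most damaging class.
LOSS_OF_FUNCTION = {"nonsense", "frameshift", "splice_site"}

def lof_burden(variants):
    """Count loss-of-function variants in one individual's exome calls.

    `variants` is a list of (gene, variant_class) tuples -- a made-up
    encoding for illustration only.
    """
    return sum(1 for _, vclass in variants if vclass in LOSS_OF_FUNCTION)

# Hypothetical exome calls for one individual.
calls = [
    ("SETD1A", "nonsense"),    # loss-of-function: counted
    ("GENE_A", "missense"),    # possibly damaging, but not LoF here
    ("GENE_B", "frameshift"),  # loss-of-function: counted
    ("GENE_C", "synonymous"),  # usually benign
]
print(lof_burden(calls))
```

Comparing such counts between patients and unaffected relatives, rather than testing one gene at a time, is the kind of collective-damage analysis the study describes.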

“These mutations are important signposts toward identifying the genes involved in schizophrenia,” said Dr. Karayiorgou.

The researchers then looked more deeply into the sequencing data to try to determine the biological functions of the disrupted genes involved in schizophrenia. They were able to verify two key damaging mutations in a gene called SETD1A, suggesting that this gene contributes significantly to the disease.

SETD1A is involved in a process called chromatin modification. Chromatin is the molecular apparatus that packages DNA into a smaller volume so it can fit into the cell and physically regulates how genes are expressed. Chromatin modification is therefore a crucial cellular activity.

The finding fits with accumulating evidence that damage to chromatin regulatory genes is a common feature of various psychiatric and neurodevelopmental disorders. By combining the mutational data from this and related studies on schizophrenia, the authors found that “chromatin regulation” was the most common description for genes that had damaging mutations.

“A clinical implication of this finding is the possibility of using the number and severity of mutations involved in chromatin regulation as a way to identify children at risk of developing schizophrenia and other neurodevelopmental disorders,” said Dr. Gogos. “Exploring ways to reverse alterations in chromatin modification and restore gene expression may be an effective path toward treatment.”

In further sequencing studies, the researchers hope to identify and characterize more genes that might play a role in schizophrenia and to elucidate common biological functions of the genes.

Filed under schizophrenia genetics genomics neuroscience science

297 notes

Learning Early in Life May Help Keep Brain Cells Alive

Using your brain – particularly during adolescence – may help brain cells survive and could impact how the brain functions after puberty.

In a study recently published in Frontiers in Neuroscience, Rutgers behavioral and systems neuroscientist Tracey Shors and her co-authors found that newborn brain cells in young rats that were successful at learning survived, while the same brain cells in animals that didn’t master the task died quickly.

“In those that didn’t learn, three weeks after the new brain cells were made, nearly one-half of them were no longer there,” said Shors, professor in the Department of Psychology and Center for Collaborative Neuroscience at Rutgers. “But in those that learned, it was hard to count. There were so many that were still alive.”

The study is important, Shors says, because it suggests that the massive proliferation of new brain cells most likely helps young animals leave the protectiveness of their mothers and face the dangers, challenges and opportunities of adulthood.

Scientists have known for years that learning can rescue new neurons in adult rats – substantial in number, though fewer than during puberty – but they did not know whether the same would hold for young rats, which produce two to four times more new neurons than adult animals.

By examining the hippocampus – a portion of the brain associated with learning – after the rats learned to associate a sound with a motor response, scientists found that the new brain cells, labeled with a dye injected a few weeks earlier, were still alive in the rats that had learned the task, while the cells in those that had failed did not survive.

“It’s not that learning makes more cells,” says Shors. “It’s that the process of learning keeps new cells alive that are already present at the time of the learning experience.”

Since the cellular process of producing new brain cells is similar across animals, including humans, Shors says ensuring that adolescent children learn at optimal levels is critical.

“What it has shown me, especially as an educator, is how difficult it is to achieve optimal learning for our students. You don’t want the material to be too easy to learn and yet still have it too difficult where the student doesn’t learn and gives up,” Shors says.

So, what does this mean for the 12-year-old adolescent boy or girl?

While scientists can’t measure individual brain cells in humans, Shors says this study provides a cellular-level look at what is happening in the adolescent brain and a window into the brain’s remarkable ability to reorganize itself and form new neural connections at such a transformational time in our lives.

“Adolescents are trying to figure out who they are now, who they want to be when they grow up and are at school in a learning environment all day long,” says Shors. “The brain has to have a lot of strength to respond to all those experiences.”

Filed under brain cells puberty adolescence hippocampus dentate gyrus neuroscience science

117 notes

New epilepsy treatment offers ‘on demand’ seizure suppression

UCL researchers funded by the Wellcome Trust have developed a new treatment for drug-resistant epilepsy with the potential to suppress seizures ‘on demand’ with a pill, much as you might take a painkiller when you feel a headache coming on.


The treatment, described in Nature Communications, combines genetic and chemical approaches to suppress seizures without disrupting normal brain function. The technique was demonstrated in rodents, but in the future it could allow people to control seizures on demand with a simple pill.

Epilepsy affects around 50 million people worldwide, including 600,000 in the UK, and around a quarter of cases are resistant to conventional treatments. Many of these cases could be addressed by the new treatment method, which relies on genetic modification of brain cells to make them sensitive to a normally inactive compound.

“First, we inject a modified virus into the area of the brain where seizures arise,” explains Professor Dimitri Kullmann of the UCL Institute of Neurology, senior author of the research. “This virus instructs the brain cells to make a protein that is activated by CNO (clozapine-N-oxide), a compound that can be taken as a pill. The activated protein then suppresses the over-excitable brain cells that trigger seizures, but only in the presence of CNO.

“At the moment, severe seizures are treated with drugs that suppress the excitability of all brain cells, and patients therefore experience side effects. Sometimes the dose required to stop seizures is so high that patients need to be sedated and taken to intensive care. If we can take our new method into the clinic, which we hope to do within the next decade, we could treat patients who are susceptible to severe seizures with a one-off injection of the modified virus, and then use CNO only when needed.

“CNO would be given as a pill in the event that patients could predict when seizures were likely to occur. For example, many people with treatment-resistant epilepsy experience clusters of seizures, where severe seizures are preceded by smaller ones. Seizure risk is also high when people are ill, sleep deprived, or at certain times of the menstrual cycle, so these would all be good times to take the pill as a preventative measure. In urgent situations, the compound could be given as an injection. We could even consider a fully automatic delivery system, where CNO was given by a pump, as is done for insulin in some people with diabetes.”

As CNO has a half-life of a few hours and only affects the pre-treated epileptic parts of the brain, the new method avoids the need to permanently alter the brain or treat the whole brain with seizure-suppressing drugs. It builds on similar work by Professor Kullmann’s group using gene therapy to ‘calm down’ brain cells, or using light pulses to activate seizure-suppressing receptors in the brain. The new technique works in a similar way but is reversible and avoids the need for invasive devices to deliver light to the brain.
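The practical meaning of that short half-life can be sketched with the standard first-order elimination formula: the fraction of a dose remaining halves with every half-life that passes. The 2-hour value below is purely illustrative, since the article says only that CNO’s half-life is a few hours.

```python
# Fraction of a dose remaining after t hours, assuming first-order
# (exponential) elimination: remaining = 0.5 ** (t / half_life).
# The 2-hour half-life is an illustrative assumption, not a measured value.
def fraction_remaining(t_hours, half_life_hours=2.0):
    return 0.5 ** (t_hours / half_life_hours)

for t in (0, 2, 4, 8):
    print(f"{t:>2} h: {fraction_remaining(t):.3f} of dose remaining")
```

Under this assumption, most of the drug is gone within a few half-lives, which is what makes an on-demand, reversible dosing scheme plausible.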

“After the one-off injection into affected areas of the brain, our new technique would require nothing beyond CNO, administered as an injection or a pill, to suppress seizures when required,” says Professor Kullmann. “This makes it more attractive than alternative forms of targeted therapy such as surgery to remove the brain region where seizures arise, or gene therapy that permanently alters the excitability of brain cells.

“Although there is currently no evidence that permanently suppressing excitability in a small area affects brain function, we cannot be sure that it would have no impact long-term. Our new method is completely reversible, so if there were any side-effects then people could simply stop taking the CNO pill.”

(Source: ucl.ac.uk)

Filed under epilepsy seizure suppression brain cells gene therapy optogenetics neuroscience science

102 notes

Drug used to treat multiple sclerosis may have beneficial effects on memory

Virginia Commonwealth University School of Medicine researchers have uncovered a new mechanism of action of fingolimod, a drug widely used to treat multiple sclerosis: elimination of adverse or traumatic memories.

The findings shed light on how the drug works on the molecular level – something that has not been well understood until now.

Fingolimod, or FTY720, which is the first orally available drug for treatment of multiple sclerosis, works by suppressing the immune system. Fingolimod is a prodrug that is phosphorylated in the body to its active form, FTY720-phosphate.

In a study published by the journal Nature Neuroscience on May 25 as an Advance Online Publication, researchers used a mouse model to show that fingolimod accumulates in the brain and inhibits histone deacetylases, enzymes important for regulating gene expression. The team observed increased expression of a limited number of genes important for certain memory processes. Fingolimod acted similarly to the natural signaling lipid sphingosine-1-phosphate, which it closely resembles.

“Our work suggests that some of the beneficial effects of FTY720/fingolimod that are not well understood might be mediated by this new activity that we have discovered,” said first author Sarah Spiegel, Ph.D., an internationally renowned researcher and professor and chair of the Department of Biochemistry and Molecular Biology in the VCU School of Medicine.

“It will be important in the future to determine whether this prodrug can reduce loss of cognitive functions and can erase adverse memories,” she said.

Spiegel added that other histone deacetylase inhibitors have long been used for treatment of psychiatric and neurological disorders, yet the mechanism of their effectiveness is not fully understood.

“FTY720/fingolimod may be a useful adjuvant therapy to help stop aversive memories such as in post-traumatic stress disorder and other anxiety disorders,” Spiegel said.

“The work has not been extended to show effectiveness in humans at this time. We are still working to fully understand the molecular underpinnings of the drug and its link to memory,” she said.

The work is based on previous findings by Spiegel’s group that were published in Science in 2009. They had reported that sphingosine-1-phosphate formed in the nucleus of cells is a natural inhibitor of histone deacetylases and a regulator of gene expression.

(Source: spectrum.vcu.edu)

Filed under MS fingolimod memory histone deacetylase gene expression neuroscience science
