Neuroscience

Articles and news from the latest research reports.

101 notes

Size at birth affects risk of adolescent mental health disorders

New research from the Copenhagen Centre for Social Evolution and Yale University offers compelling support for the general evolutionary theory that birth weight and birth length can partially predict the likelihood of being diagnosed with mental health disorders such as autism and schizophrenia later in life. The study analyzed the medical records of 1.75 million Danish births and subsequent hospital diagnoses for up to 30 years, adjusting for almost all other known risk factors. The study is published today in Proceedings of the Royal Society B.

The number of people diagnosed with mental health disorders is on the rise in most affluent countries, but we do not yet have a comprehensive understanding of the factors that make people vulnerable to these disorders.

A new analysis of the extensive Danish public health database suggests that part of the answer may reside in genetic imprints established at conception that influence both size at birth and mental health during childhood and early adolescence.

The study tests predictions of the evolutionary theory of genomic imprinting – the idea that during fetal development some genes inherited from the mother are expressed differently to those inherited from the father. The potential consequence of this asymmetry is that maternal and paternal genes in a fetus will not cooperate fully during this period, even though they subsequently have shared interests due to their lifetime commitment to the same body.

Opposite forces balance each other

The reason for the conflict is that some of the genes known to be expressed in the placenta and the brain carry imprints that affect resource provisioning of the unborn child. When such genes come from the father, they favor investment of more of the mother’s resources in the developing fetus, whereas the maternally-imprinted genes will normally compensate for such paternally-influenced manipulative effects to lessen the drain on maternal resources. These opposite forces balance each other in most pregnancies, with the result that most children are born with close to average length and weight and with a high likelihood of balanced mental health development.

Small deviations may well be favorable in human populations: somewhat heavier babies, for instance, may be more likely to develop abstract talents, and somewhat lighter babies above-average social talents. However, this incurs the risk of increasing the frequency of autistic- and schizophrenic-spectrum disorders in the rare cases where imprinting imbalances are larger. The theory may explain why natural selection did not remove this portion of the burden of mental disease in our ancestors.

The new study tests these predictions and its results are remarkably consistent. They show that the changes in the risk of developing mental disorders when born smaller or larger than average are relatively small but very consistent, clearly diametrical, and part of the single continuum that the theory predicts.

“When we started this large-scale analysis four years ago, we hoped to find evidence that genetic imprinting happens, but we did not expect that the results would match the predictions as consistently as we found”, explains Professor Jacobus Boomsma, Director of the Centre for Social Evolution, University of Copenhagen, who coordinated the work.

Boomsma adds: “Our study confirms that larger babies have a higher risk for incurring autism-spectrum diagnoses later in life and lower risk for schizophrenia-spectrum disorders. For example, Danish newborns are on average 52 cm long and being born at 54 cm increases the autism risk by 20%. However, these are relative risks and these disorders remain rare: in this example the absolute risk increases from 0.65% to 0.78%. Risk patterns are opposite in smaller newborns, who have higher risks for schizophrenia and lower risks for autism. Only for the smallest, prematurely-born babies does this diametric pattern disappear, because they have elevated risks for almost all disease categories”.
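
As a quick check on the arithmetic in the quote above, the relative and absolute risks relate as follows (the numbers are the ones quoted; the script is purely illustrative):

```python
# Numbers quoted above for Danish newborns born at 54 cm instead of the 52 cm average.
baseline_risk = 0.0065   # absolute autism-spectrum risk at average birth length (0.65%)
elevated_risk = 0.0078   # absolute risk quoted for a 54 cm newborn (0.78%)

relative_increase = (elevated_risk - baseline_risk) / baseline_risk
absolute_increase = elevated_risk - baseline_risk

print(f"relative increase: {relative_increase:.0%}")   # the quoted 20%
print(f"absolute increase: {absolute_increase * 100:.2f} percentage points")
```

The same 20% relative increase corresponds to an absolute change of only 0.13 percentage points, which is why Boomsma stresses that these disorders remain rare.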

Evolutionary conflicts

Boomsma also underlines that focused genomic studies will be needed to find out which genes are involved and how they affect brain function: ”Our Centre’s main objective is to develop and test evolutionary theory about the ways in which gene-level conflicts can corrupt even the most sophisticated forms of naturally evolved cooperation. It is no surprise that humans are vulnerable to such deep evolutionary conflicts, as are other mammals, and it is both useful and interesting to be aware of this part of our biological heritage”, says Professor Boomsma.

(Source: science.ku.dk)

Filed under birth weight birth size mental health schizophrenia autism neuroscience science

176 notes

Study links physical activity in older adults to brain white-matter integrity

Like everything else in the body, the white-matter fibers that allow communication between brain regions also decline with age. In a new study, researchers found a strong association between the structural integrity of these white-matter tracts and an older person’s level of daily activity – not just the degree to which he or she engaged in moderate or vigorous exercise, but also whether the person was sedentary the rest of the time.

The study, reported in the journal PLOS ONE, tracked physical activity in 88 healthy but “low-fit” participants aged 60 to 78. The participants agreed to wear accelerometers during most of their waking hours over the course of a week, and also submitted to brain imaging.

“To our knowledge, this is the first study of its kind that uses an objective measure of physical activity along with multiple measures of brain structure,” said University of Illinois postdoctoral researcher Agnieszka Burzynska, who conducted the research with U. of I. Beckman Institute director Arthur Kramer and kinesiology and community health professor Edward McAuley.

Most studies ask subjects to describe how much physical activity they get, which is subjective and imprecise, Burzynska said. The accelerometer continuously tracks a person’s movement, “so it’s not what they say they do or what they think they do, but we have measured what they are actually doing,” she said.

The researchers assumed that participants’ activity levels over a week accurately reflected their overall engagement, or lack of engagement, in physical activity.
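
Accelerometer output of this kind is typically summarized by binning each minute of wear time into an intensity category. A minimal sketch, assuming the widely used Freedson counts-per-minute cut-points rather than the study's actual thresholds:

```python
from collections import Counter

def classify_minute(counts_per_minute):
    # Freedson-style cut-points (an assumption for illustration;
    # the PLOS ONE study may have used different thresholds).
    if counts_per_minute < 100:
        return "sedentary"
    if counts_per_minute < 1952:
        return "light"
    return "moderate-to-vigorous"

# A few hypothetical minutes of accelerometer counts.
minutes = [40, 85, 300, 1200, 2500, 60, 2100]
print(Counter(classify_minute(c) for c in minutes))
```

Summing the minutes in each bin over the week of wear time yields the objective activity measures the researchers related to brain structure.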

The study also relied on two types of brain imaging. The first, diffusion tensor imaging, offers insight into the structural integrity of a tissue by revealing how water is diffused in the tissue. The second method looks for age-related changes in white matter, called lesions. Roughly 95 percent of adults aged 65 and older have such lesions, Burzynska said. While they are a normal part of aging, their early onset or rapid accumulation may spell trouble, she said.
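
Diffusion tensor imaging is commonly summarized as fractional anisotropy (FA), a 0-to-1 index computed from the diffusion tensor's eigenvalues: values near 1 indicate strongly directional water diffusion, as along intact fiber tracts. A sketch of the standard formula, with made-up eigenvalues rather than data from this study:

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """Standard FA formula from the three diffusion-tensor eigenvalues."""
    mean = (l1 + l2 + l3) / 3
    numerator = math.sqrt((l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2)
    denominator = math.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    return math.sqrt(1.5) * numerator / denominator

print(fractional_anisotropy(1.7, 0.3, 0.3))  # elongated tensor: FA close to 1
print(fractional_anisotropy(1.0, 1.0, 1.0))  # isotropic diffusion: FA = 0
```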

The team found that the brains of older adults who regularly engaged in moderate-to-vigorous exercise generally “showed less of the white-matter lesions,” Burzynska said.

The association between physical activity and white-matter structural integrity was region-specific, the researchers reported. Older adults who engaged more often in light physical activity had greater structural integrity in the white-matter tracts of the temporal lobes, which lie behind the ears and play a key role in memory, language, and the processing of visual and auditory information.

In contrast, those who spent more time sitting had lower structural integrity in the white-matter tracts connecting the hippocampus, “a structure crucial for learning and memory,” Burzynska said.

“This relationship between the integrity of tracts connecting the hippocampus and sedentariness is significant even when we control for age, gender and aerobic fitness,” she said. “It suggests that the physiological effect of sitting too much, even if you still exercise at the end of the day for half an hour, will have a detrimental effect on your brain.”

The findings suggest that engaging in physical activity and avoiding a sedentary lifestyle are both important for brain health in older age, Burzynska said.

“We hope that this will encourage people to take better care of their brains by being more active,” she said.

Filed under physical activity exercise white matter brain structure neuroimaging aging neuroscience science

264 notes

Hypersensitivity to Non-Painful Events May Be Part of Pathology in Fibromyalgia

New brain-imaging research shows that patients with fibromyalgia are hypersensitive to non-painful events: images of the patients’ brains show reduced activation in primary sensory regions and increased activation in sensory integration areas. Findings published in Arthritis & Rheumatology, a journal of the American College of Rheumatology (ACR), suggest that brain abnormalities in response to non-painful sensory stimulation may cause the increased unpleasantness that patients experience in response to daily visual, auditory and tactile stimulation.

Fibromyalgia is a chronic musculoskeletal syndrome characterized by widespread pain, affecting roughly two percent of the world population, experts say. According to the ACR, five million people in the U.S. have fibromyalgia, which is more prevalent among women. In previous studies, fibromyalgia patients reported reduced tolerance to normal sensory (auditory, visual, olfactory, and tactile) stimulation in addition to greater sensitivity to pain.

For the present study, researchers used functional magnetic resonance imaging (fMRI) to assess brain response to sensory stimulation in 35 women with fibromyalgia and 25 healthy, age-matched controls. Patients had an average disease duration of 7 years and a mean age of 47.

According to the study, patients reported increased unpleasantness in response to multisensory stimulation in daily life activities. Furthermore, fMRI revealed reduced activation of both the primary and secondary visual and auditory areas of the brain, and increased activation in sensory integration regions. These brain abnormalities mediated the increased unpleasantness to visual, auditory and tactile stimulation that patients reported experiencing in daily life.

Lead study author Dr. Marina López-Solà, from the Institute of Cognitive Science, University of Colorado Boulder, said, “Our study provides new evidence that fibromyalgia patients display altered central processing of multisensory stimulation, alterations that are linked to core fibromyalgia symptoms and may be part of the disease pathology. The finding of reduced cortical activation in the visual and auditory brain areas that were associated with patient pain complaints may offer novel targets for neurostimulation treatments in fibromyalgia patients.”

(Source: eu.wiley.com)

Filed under fibromyalgia chronic pain neuroimaging sensory sensitivity insular cortex neuroscience science

147 notes

Researchers debunk myth about Parkinson’s disease

Using advanced computer models, neuroscience researchers at the University of Copenhagen have gained new knowledge about the complex processes that cause Parkinson’s disease. The findings have recently been published in the prestigious Journal of Neuroscience.

The defining symptoms of Parkinson’s disease are slow movements, muscular stiffness and shaking. There is currently no cure for the condition, so it is essential to conduct innovative research with the potential to shed some light on this terrible disruption to the central nervous system that affects one person in a thousand in Denmark.

Dopamine is an important neurotransmitter which affects physical and psychological functions such as motor control, learning and memory. Levels of this substance are regulated by special dopamine cells. When the level of dopamine drops, nerve cells that constitute part of the brain’s ‘stop signal’ are activated.

“This stop signal is rather like the safety lever on a motorised lawn mower: if you take your hand off the lever, the mower’s motor stops. Similarly, dopamine must always be present in the system to block the stop signal. Parkinson’s disease arises because for some reason the dopamine cells in the brain are lost, and it is known that the stop signal is being over-activated somehow or other. Many researchers have therefore considered it obvious that long-term lack of dopamine must be the cause of the distinctive symptoms that accompany the disease. However, we can now use advanced computer simulations to challenge the existing paradigm and put forward a different theory about what actually takes place in the brain when the dopamine cells gradually die,” explains Jakob Kisbye Dreyer, Postdoc at the Department of Neuroscience and Pharmacology, University of Copenhagen.

A thorn in the side

Scanning the brain of a patient suffering from Parkinson’s disease reveals that in spite of dopamine cell death, there are no signs of a lack of dopamine – even at a comparatively late stage in the process.

“The inability to detect a lack of dopamine until Parkinson’s disease is well advanced has been a thorn in the side of researchers for many years. On the one hand, the symptoms indicate that the stop signal is over-activated, and patients are treated accordingly with a fair degree of success. On the other hand, data show that they are not lacking dopamine,” says Postdoc Jakob Kisbye Dreyer.

Computer models predict the progress of the disease

“Our calculations indicate that cell death only affects the level of dopamine very late in the process, but that symptoms can arise long before the level of the neurotransmitter starts to decline. The reason for this is that the fluctuations that normally make up a signal become weaker. In the computer model, the brain compensates for the shortage of signals by creating additional dopamine receptors. This has a positive effect initially, but as cell death progresses further, the correct signal may almost disappear. At this stage, the compensation becomes so overwhelming that even small variations in the level of dopamine trigger the stop signal – which can therefore cause the patient to develop the disease.”
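
The compensation mechanism described in the quote can be caricatured in a few lines: as dopamine cells die, the phasic signal shrinks, extra receptors restore the average response, and that same gain then amplifies even small dips enough to trip the stop signal. A toy sketch with invented numbers, not the authors' published model:

```python
def perceived_dip(surviving_fraction, dip=0.02):
    """How strongly a small downward dopamine fluctuation registers after
    compensatory receptor upregulation (toy numbers, purely illustrative)."""
    signal = surviving_fraction           # phasic signal shrinks with cell death
    gain = 1.0 / max(signal, 0.05)        # extra receptors restore the mean response
    return gain * dip                     # the same small dip, amplified

print(perceived_dip(1.0))   # healthy: the dip registers at its true size, 0.02
print(perceived_dip(0.1))   # late stage: tenfold amplification, 0.2
```

In this caricature the stop signal fires whenever the perceived dip exceeds a fixed threshold, so late in the cell-loss process even normal variation in dopamine crosses it.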

The new research findings may pave the way for earlier diagnosis of Parkinson’s disease.

(Source: healthsciences.ku.dk)

Filed under parkinson's disease dopamine dopamine neurons cell death neuroscience science

71 notes

(Image caption: The hair cells of mice missing just Hey2 are neatly lined up in four rows (left) while those missing Hey1 and Hey2 are disorganized (right). The cells’ hairlike protrusions (pink) can be misoriented, too. Credit: Angelika Doetzlhofer)

Hey1 and Hey2 ensure inner ear ‘hair cells’ are made at the right time, in the right place

Two Johns Hopkins neuroscientists have discovered the “molecular brakes” that time the generation of important cells in the inner ear cochleas of mice. These “hair cells” translate sound waves into electrical signals that are carried to the brain and are interpreted as sounds. If the arrangement of the cells is disordered, hearing is impaired.

A summary of the research will be published in The Journal of Neuroscience on Sept. 16.

“The proteins Hey1 and Hey2 act as brakes to prevent hair cell generation until the time is right,” says Angelika Doetzlhofer, Ph.D., an assistant professor of neuroscience. “Without them, the hair cells end up disorganized and dysfunctional.”

The cochlea is a coiled, fluid-filled structure bordered by a flexible membrane that vibrates when sound waves hit it. This vibration is passed through the fluid in the cochlea and sensed by specialized hair cells that line the tissue in four precise rows. Their name comes from the cells’ hairlike protrusions that detect movement of the cochlear fluid and create electrical signals that relay the sound to the brain.

During development, “parent cells” within the cochlea gradually differentiate into hair cells in a precise sequence, starting with the cells at the base of the cochlea and progressing toward its tip. The signaling protein Sonic Hedgehog was known to be released by nearby nerve cells in a time- and space-dependent pattern that matches that of hair cell differentiation. But the mechanism of Sonic Hedgehog’s action was unclear.

Doetzlhofer and postdoctoral fellow Ana Benito Gonzalez bred mice whose inner ear cells were missing Hey1 and Hey2, two genes known to be active in the parent cells but turned off in hair cells. They found that, without those genes, the cells were generated too early and were abnormally patterned: Rows of hair cells were either too many or too few, and their hairlike protrusions were often deformed and pointing in the wrong direction.

“While these mice didn’t live long enough for us to test their hearing, we know from other studies that mice with disorganized hair cell patterns have serious hearing problems,” says Doetzlhofer.

Further experiments demonstrated the role of Sonic Hedgehog in regulating the two key genes.

“Hey1 and Hey2 stop the parent cells from turning into hair cells until the time is right,” explains Doetzlhofer. “Sonic Hedgehog applies those ‘brakes,’ then slowly releases pressure on them as the cochlea develops. If the brakes stop working, the hair cells are generated too early and end up misaligned.”

She adds that Sonic Hedgehog, Hey1 and Hey2 are found in many other parent cell types throughout the developing nervous system and may play similar roles in timing the generation of other cell types.

Filed under inner ear hair cells hearing genes cochlea neuroscience science

157 notes

Brain Structure of Kidney Donors May Make Them More Altruistic

That’s the finding of a study published in today’s Proceedings of the National Academy of Sciences (PNAS) by Georgetown researchers.

Georgetown College psychology professor Abigail Marsh worked with John VanMeter, director of the Center for Functional and Molecular Imaging at Georgetown University Medical Center, to scan the brains of 19 altruistic kidney donors.

More Sensitive to Distress

“The results of brain scans and behavioral testing suggest that these donors have some structural and functional brain differences that may make them more sensitive, on average, to other people’s distress,” Marsh explains.

The Georgetown researchers used functional MRI to record the neural activity of the kidney donors and 20 control subjects who had never donated an organ as they viewed faces with fearful, angry or neutral expressions.

Underlying Neural Basis

In the right amygdala, an emotion-sensitive brain region, altruists displayed greater neural activity while viewing fearful expressions than did control subjects.

When asked to identify the emotional expressions presented in the face images, altruists recognized fearful facial expressions relatively more accurately than the control subjects.

“The brain scans revealed that the right amygdala volume of altruists is larger than that of non-altruists,” Marsh says. “The findings suggest that individual differences in altruism may have an underlying neural basis.”

Opposite From Psychopaths?

These findings dovetail with previous research by the professor showing structural and functional brain differences that appear to make people with psychopathic traits less sensitive to others’ fear and distress.

These differences include amygdalas that are smaller and less responsive to fearful expressions. People who are unusually altruistic may therefore be the opposite in some ways from people who are psychopathic.

To find kidney donors, the researchers reached out to the Washington Regional Transplant Community (WRTC), a federally designated organ procurement organization.

A Donor’s Story

Harold Mintz, a former northern Virginian who volunteered with the WRTC and agreed to participate in the Georgetown study, donated a kidney to an anonymous stranger he later learned was an Ethiopian refugee who had settled in Washington, D.C.

Mintz, who now lives in California and speaks to high school students about his 2000 donation, says a series of events over time led him to supply the kidney, including his father dying of cancer diagnosed too late at the age of 56.

One Valentine’s Day in 1988, Mintz and his wife were shopping separately for presents and Mintz noticed parents in a mall with a sign saying “Please Save Our Daughter’s Life.” He walked past them, then turned around and asked what they needed. It turned out the daughter had leukemia and needed a bone marrow transplant.

The couple decided to forget about the holiday and donated blood to see if either of them were a match. But no match was found and Mintz later noticed the daughter’s obituary in the newspaper.

Stories Taken to Heart

Mintz also was surprised to hear that although the couple’s daughter had just died, they thanked everyone who tried to help and expressed hope that they might help someone else.

“All these stories just kind of stuck inside my head and every time I’d see a story about a medical story of distress, it would just kind of get put away in a file inside my heart,” Mintz says.

Marsh notes that kidney disease is now the eighth-leading cause of death in the U.S., and that living kidney donations are the best hope of restoring people with kidney disease to health.

“Dr. Marsh’s work is a great example of how fMRI can be used to provide insight into how differences in the brain’s response can lead individuals to perform such magnanimous acts,” VanMeter says.

Filed under altruism prosocial behavior amygdala fMRI psychopathy brain structure psychology neuroscience science

205 notes

No sedative necessary: Scientists discover new “sleep node” in the brain
A sleep-promoting circuit located deep in the primitive brainstem has revealed how we fall into deep sleep. Discovered by researchers at Harvard School of Medicine and the University at Buffalo School of Medicine and Biomedical Sciences, this is only the second “sleep node” identified in the mammalian brain whose activity appears to be both necessary and sufficient to produce deep sleep.

Published online in August in Nature Neuroscience, the study demonstrates that fully half of all of the brain’s sleep-promoting activity originates from the parafacial zone (PZ) in the brainstem. The brainstem is a primordial part of the brain that regulates basic functions necessary for survival, such as breathing, blood pressure, heart rate and body temperature.

“The close association of a sleep center with other regions that are critical for life highlights the evolutionary importance of sleep in the brain,” says Caroline E. Bass, assistant professor of Pharmacology and Toxicology in the UB School of Medicine and Biomedical Sciences and a co-author on the paper.

The researchers found that a specific type of neuron in the PZ that makes the neurotransmitter gamma-aminobutyric acid (GABA) is responsible for deep sleep. They used a set of innovative tools to precisely control these neurons remotely, in essence giving them the ability to turn the neurons on and off at will.

“These new molecular approaches allow unprecedented control over brain function at the cellular level,” says Christelle Ancelet, postdoctoral fellow at Harvard School of Medicine. “Before these tools were developed, we often used ‘electrical stimulation’ to activate a region, but the problem is that doing so stimulates everything the electrode touches and even surrounding areas it didn’t. It was a sledgehammer approach, when what we needed was a scalpel.”

“To get the precision required for these experiments, we introduced a virus into the PZ that expressed a ‘designer’ receptor on GABA neurons only but didn’t otherwise alter brain function,” explains Patrick Fuller, assistant professor at Harvard and senior author on the paper. “When we turned on the GABA neurons in the PZ, the animals quickly fell into a deep sleep without the use of sedatives or sleep aids.”

How these neurons interact with other sleep- and wake-promoting brain regions still needs to be studied, the researchers say, but eventually these findings may translate into new medications for treating sleep disorders, including insomnia, and into the development of better and safer anesthetics.

“We are at a truly transformative point in neuroscience,” says Bass, “where the use of designer genes gives us unprecedented ability to control the brain. We can now answer fundamental questions of brain function, which have traditionally been beyond our reach, including the ‘why’ of sleep, one of the more enduring mysteries in the neurosciences.”
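One common way to picture a sleep node whose activation is sufficient for sleep is as one side of a mutually inhibitory “flip-flop” switch between sleep-promoting and wake-promoting populations. The toy model below is purely illustrative (the equations and parameters are invented here, not taken from the paper): adding drive to the PZ-like population, as chemogenetic activation of the designer receptor would, flips the simulated state from wake-dominant to sleep-dominant.

```python
def simulate(drive_to_pz, t_max=100.0, dt=0.1):
    """Toy mutually inhibitory "flip-flop" model of sleep-wake switching.

    pz   -- activity of a sleep-promoting, PZ-like GABA population
    wake -- activity of a wake-promoting population
    Each population inhibits the other; `drive_to_pz` stands in for
    chemogenetic activation of the designer receptor.
    """
    pz, wake = 0.1, 1.0
    for _ in range(int(t_max / dt)):
        # Euler integration of two rate units with rectified inputs
        pz += dt * (-pz + max(0.0, 1.0 + drive_to_pz - 2.0 * wake))
        wake += dt * (-wake + max(0.0, 1.0 - 2.0 * pz))
    return pz, wake

pz0, wake0 = simulate(drive_to_pz=0.0)  # baseline: settles wake-dominant
pz1, wake1 = simulate(drive_to_pz=2.0)  # receptor "on": settles sleep-dominant
print(wake0 > pz0, pz1 > wake1)  # prints: True True
```

The design point is the bistability: mutual inhibition makes intermediate states unstable, so a modest, sustained drive to one side produces a complete switch rather than a graded change, which matches the observation that activating the PZ neurons put animals quickly into deep sleep.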


Filed under sleep slow wave sleep brainstem brain activity GABA parafacial zone neuroscience science

251 notes

Neuroscientists decode conscious experiences with Hitchcock film

Western researchers have extended their game-changing brain scanning techniques by showing that a short Alfred Hitchcock movie can be used to detect consciousness in vegetative state patients. The study included a Canadian participant who had been entirely unresponsive for 16 years, but is now known to be aware and able to follow the plot of movies.

Lorina Naci, a postdoctoral fellow from Western’s Brain and Mind Institute, and her Western colleagues, Rhodri Cusack, Mimma Anello and Adrian Owen, reported their findings today in the Proceedings of the National Academy of Sciences (PNAS), in a study titled “A common neural code for similar conscious experiences in different individuals.”

While inside the 3T Magnetic Resonance Imaging (MRI) Scanner at Western’s Centre for Functional and Metabolic Mapping, participants watched a highly engaging short film by Alfred Hitchcock. Movie viewing elicited a common pattern of synchronized brain activity. The long-time unresponsive participant’s brain response during the same movie strongly resembled that of the healthy participants, suggesting not only that he was consciously aware, but also that he understood the movie.
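The comparison behind this result is a form of inter-subject correlation: the patient’s movie-driven fMRI time course is correlated with the average response of the healthy group. The sketch below is only illustrative (the function, toy sinusoidal “movie signal,” and noise levels are invented here, not taken from the study):

```python
import math
import random

def intersubject_correlation(patient_ts, healthy_group_ts):
    """Pearson correlation between a patient's regional fMRI time course
    and the average time course of a healthy reference group."""
    n = len(patient_ts)
    # Average the healthy group at each timepoint (transpose, then mean)
    group_mean = [sum(col) / len(healthy_group_ts) for col in zip(*healthy_group_ts)]
    mx = sum(patient_ts) / n
    my = sum(group_mean) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(patient_ts, group_mean))
    sx = math.sqrt(sum((x - mx) ** 2 for x in patient_ts))
    sy = math.sqrt(sum((y - my) ** 2 for y in group_mean))
    return cov / (sx * sy)

# Toy demonstration: a shared "movie-driven" component plus per-subject noise
random.seed(0)
n_time = 200
movie = [math.sin(2 * math.pi * t / 25) for t in range(n_time)]
healthy = [[s + random.gauss(0, 0.3) for s in movie] for _ in range(10)]
patient = [s + random.gauss(0, 0.3) for s in movie]
r = intersubject_correlation(patient, healthy)
print(round(r, 2))  # high r: the patient's response tracks the group's
```

A high correlation is taken as evidence that the same stimulus is driving the same processing in the patient as in healthy viewers, which is what licenses the inference about awareness.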

“For the first time, we show that a patient with unknown levels of consciousness can monitor and analyze information from their environment, in the same way as healthy individuals,” said Naci, lead researcher on the new study. “We already know that up to one in five of these patients are misdiagnosed as being unconscious and this new technique may reveal that that number is even higher.”

Owen, the Canada Excellence Research Chair in Cognitive Neuroscience and Imaging, explained, “This approach can detect not only whether a patient is conscious, but also what that patient might be thinking. Thus, it has important practical and ethical implications for the patient’s standard of care and quality of life.”

The researchers hope that this novel method will enable better understanding of behaviorally unresponsive patients, who may be misdiagnosed as lacking consciousness.

Filed under consciousness vegetative state brain activity neuroimaging neuroscience science

135 notes

This Is Your Brain on Snacks—Brain Stimulation Affects Craving and Consumption

Magnetic stimulation of a brain area involved in “executive function” affects cravings for and consumption of calorie-dense snack foods, reports a study in the September issue of Psychosomatic Medicine: Journal of Biobehavioral Medicine, the official journal of the American Psychosomatic Society. The journal is published by Lippincott Williams & Wilkins, a part of Wolters Kluwer Health.


After stimulation of the dorsolateral prefrontal cortex (DLPFC), young women experience increased cravings for high-calorie snacks—and eat more of those foods when given the opportunity, according to the study by researchers at the University of Waterloo in Ontario, Canada. “These findings shed a light on the role of the DLPFC in food cravings (specifically reward anticipation), the consumption of appealing high caloric foods, and the relation between self-control and food consumption,” the researchers write. The senior author was Peter Hall, PhD.

Brain Stimulation Affects Cravings and Consumption for ‘Appetitive’ Snacks

The study included 21 healthy young women, selected because they reported strong and frequent cravings for chocolate and potato chips. Such “appetitive,” calorie-dense snack foods are often implicated in the development of obesity.

The women were shown pictures of these foods to stimulate cravings. The researchers then applied a type of magnetic stimulation, called continuous theta-burst stimulation, to decrease activity in the DLPFC. Previous studies have suggested that DLPFC activity plays a role in regulating food cravings.
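This summary does not give the stimulation parameters; the conventional continuous theta-burst protocol delivers bursts of three pulses at 50 Hz, with bursts repeated at 5 Hz, for 600 pulses in about 40 seconds. A small sketch of that standard timing (the standard protocol is assumed here, not confirmed by this article):

```python
def ctbs_pulse_times(n_pulses=600, burst_hz=5.0, pulse_hz=50.0, pulses_per_burst=3):
    """Pulse onset times (s) for a continuous theta-burst protocol:
    bursts of `pulses_per_burst` pulses at `pulse_hz`, with burst
    onsets repeating at `burst_hz`."""
    burst_interval = 1.0 / burst_hz   # 0.2 s between burst onsets
    pulse_interval = 1.0 / pulse_hz   # 0.02 s between pulses within a burst
    times = []
    for b in range(n_pulses // pulses_per_burst):
        for p in range(pulses_per_burst):
            times.append(b * burst_interval + p * pulse_interval)
    return times

times = ctbs_pulse_times()
print(len(times), round(times[-1], 2))  # 600 pulses, last onset at 39.84 s
```

The nested theta (5 Hz) and gamma (50 Hz) rhythms are the defining feature of the protocol; delivered continuously like this, the effect on the stimulated cortex is typically inhibitory, consistent with the study's use of it to decrease DLPFC activity.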

After theta-burst stimulation, the women reported stronger food cravings—specifically for “appetitive” milk chocolate and potato chips. During a subsequent “taste test,” they consumed more of these foods, rather than alternative, less-appetitive foods (dark chocolate and soda crackers).

Stimulation to weaken DLPFC activity was also associated with lower performance on a test of inhibitory control strength (the Stroop test). Decreased DLPFC activity appeared to be associated with increased “reward sensitivity”—it made the participants “more sensitive to the rewarding properties of palatable high caloric foods,” the researchers write.
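The Stroop measure referenced here scores inhibitory control as the reaction-time cost of incongruent trials (e.g. the word “RED” printed in blue ink) relative to congruent ones. A minimal scoring sketch with made-up data (the trial format and values are illustrative, not from the study):

```python
def stroop_interference(trials):
    """Mean reaction time (ms) on incongruent trials minus mean RT on
    congruent trials; a larger score indicates weaker inhibitory control."""
    def mean_rt(condition):
        rts = [t["rt_ms"] for t in trials if t["condition"] == condition]
        return sum(rts) / len(rts)
    return mean_rt("incongruent") - mean_rt("congruent")

# Illustrative trials: color-naming responses under each condition
trials = [
    {"condition": "congruent", "rt_ms": 520},    # "RED" in red ink
    {"condition": "congruent", "rt_ms": 540},
    {"condition": "incongruent", "rt_ms": 650},  # "RED" in blue ink
    {"condition": "incongruent", "rt_ms": 690},
]
print(stroop_interference(trials))  # 140.0
```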

Weak Executive Function May Contribute to Obesity Risk

The results highlight the role of executive function in governing “dietary self-restraint,” the researchers believe. Executive function, which involves the DLPFC, refers to a set of cognitive functions that enable “top-down” control of action, emotion, and thought.

At the “basic neurobiological level,” the study provides direct evidence that the DLPFC is involved in one specific aspect of food cravings: reward anticipation. People with weak executive function may lack the dietary self-control necessary to regulate snack food consumption in “the modern obesogenic environment.” Faced with constant cues and opportunities to consume energy-dense foods, such individuals may be more likely to become overweight or obese.

The results suggest that interventions aimed at enhancing or preserving DLPFC function may help to prevent obesity and related diseases. In conditions such as type 2 diabetes, where healthy dietary habits are essential for effective disease control, “Interventions focused on enhancing DLPFC activity, through aerobic exercise or other means, may result in increased dietary self-control and subsequently improve disease management,” Dr Hall and coauthors add.

(Source: newswise.com)

Filed under food consumption prefrontal cortex executive function brain stimulation self-control psychology neuroscience science

85 notes

Researcher Develops and Proves Effectiveness of New Drug for Spinal Muscular Atrophy
According to recent studies, approximately one out of every 40 individuals in the United States is a carrier of the gene responsible for spinal muscular atrophy (SMA), a neurodegenerative disease that causes muscles to weaken over time. Now, researchers at the University of Missouri have developed a new compound found to be highly effective in animal models of the disease. In April, a patent was filed for the compound for use in SMA.

“The strategy our lab is using to fight SMA is to ‘repress the repressor,’” said Chris Lorson, a researcher in the Bond Life Sciences Center and professor in the MU Department of Veterinary Pathobiology. “It’s a lot like reading a book, but in this case, the final chapter of the book—or the final exon of the genetic sequence—is omitted. The exciting part is that the important chapter is still there—and can be tricked into being read correctly, if you know how. The new SMA therapeutic compound, an antisense oligonucleotide, repairs expression of the gene affected by the disease.”

In individuals affected by SMA, the spinal motor neuron-1 (SMN1) gene is mutated and lacks the ability to process a key protein that helps muscle neurons function. Muscles in the lower extremities are usually affected first, followed by muscles in the upper extremities, including areas around the neck and spine.

Fortunately, humans have a nearly identical copy gene called SMN2. Lorson’s drug targets that specific genetic sequence and allows proper “editing” of the SMN2 gene. The drug allows the SMN2 gene to bypass the defective gene and process the protein that helps the muscle neurons function.

Lorson’s research found that the earlier the treatment can be administered in mice with SMA, the better the outcome. In mouse studies, the drug improved the survival rate by 500 to 700 percent, with a 90 percent improvement demonstrated in severe SMA cases, according to the study.

Although there is currently no cure for SMA, the National Institutes of Health (NIH) has listed SMA as the neurological disease closest to finding a cure, due in part to effective drugs like the one developed in Lorson’s lab.
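Antisense oligonucleotides work by Watson-Crick base pairing: the compound is the reverse complement of its RNA target, so it hybridizes to that site (for example, a splice-silencing element) and changes how the transcript is processed. A minimal sketch of the pairing rule; the target sequence below is made up, not the actual SMN2 site or the therapeutic sequence:

```python
def antisense(rna_target):
    """Reverse complement of an RNA target: the sequence an antisense
    oligonucleotide would need in order to base-pair with it."""
    pair = {"A": "U", "U": "A", "G": "C", "C": "G"}
    return "".join(pair[base] for base in reversed(rna_target))

# Hypothetical silencer motif -- NOT the actual therapeutic sequence
target = "AUUCACUUUCAUAAUGCUGG"
aso = antisense(target)
print(aso)

# Sanity check: read 5'->3' against the target 3'->5', every position pairs
pair = {"A": "U", "U": "A", "G": "C", "C": "G"}
assert all(pair[a] == t for a, t in zip(aso, reversed(target)))
```

Masking a silencer this way is how an oligonucleotide can “trick” the splicing machinery into keeping an exon it would otherwise omit, which matches Lorson’s book-chapter analogy above.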


Filed under spinal muscular atrophy spinal motor neuron gene mutation SMN1 neuroscience science
