Neuroscience

Articles and news from the latest research reports.

Posts tagged neuroscience

171 notes

New Medtronic Deep Brain Stimulation System Is the First to Sense and Record Brain Activity While Delivering Therapy

Medtronic, Inc. (NYSE: MDT) announced the first implant of a novel deep brain stimulation (DBS) system, the first to enable the sensing and recording of select brain activity while simultaneously delivering targeted DBS therapy. This opens the way for research into how the brain responds to the therapy and could yield insights that one day significantly change the way people with devastating neurological and psychological disorders, such as Parkinson’s disease, essential tremor, dystonia, and treatment-resistant obsessive-compulsive disorder, are treated.

The Activa® PC+S DBS system delivers proven Medtronic DBS Therapy while at the same time sensing and recording electrical activity in key areas of the brain using sensing technology and an adjustable algorithm, which enable the system to gather brain signals at various moments as selected by a physician. Initially, this new technology will be made available to a select group of physicians worldwide for use in clinical studies. These physicians will use the system to map the brain’s responses to Medtronic DBS Therapy and explore applications for the therapy across a range of neurological and psychological conditions.

The Activa PC+S system, which delivers stimulation to targeted areas of the brain like existing Medtronic DBS systems, was implanted for the first time at Ludwig Maximilian University in Munich, Germany, in a person with Parkinson’s disease. The patient will be treated by a team that includes neurologist Kai Bötzel of the university’s department of neurology and neurosurgeon Jan Mehrkens, M.D., head of functional neurosurgery, who implanted the system.

Dr. Bötzel will be the first to use data gathered by the Activa PC+S system to gain unprecedented insight into how the brain responds to DBS therapy.

“DBS therapy works for people with Parkinson’s disease and other movement disorders, but there is much to learn about how the brain responds to the therapy,” said Dr. Bötzel. “This new system will allow us to treat patients with conventional DBS therapy, while at the same time opening the door for research that was not possible until now. We hope these insights will lead to the development of effective new treatments tailored to the needs of individuals.”

“Devastating conditions like Parkinson’s disease and obsessive-compulsive disorder take a significant toll on countless people, as well as their loved ones,” said Lothar Krinke, Ph.D., vice president and general manager of the Deep Brain Stimulation business in Medtronic’s Neuromodulation division. “Medtronic is excited to provide this new system to researchers worldwide, and we expect that their respective studies will lead to accelerated understanding of how neurological and psychological conditions develop and progress. This represents a significant milestone for DBS therapy and the long-term journey toward a closed-loop DBS system, which could personalize therapy by using device data to automatically adjust to the needs of individual patients.”
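Dr. Krinke’s closed-loop idea can be pictured as a simple sense-compare-adjust loop. The sketch below is a hypothetical illustration, not Medtronic’s algorithm: the biomarker (normalized beta-band power is assumed here), the thresholds, step size, and amplitude limits are all invented for the example.

```python
# Hypothetical sketch of closed-loop stimulation adjustment, NOT
# Medtronic's algorithm: sense a biomarker, nudge the stimulation
# amplitude toward a target band, and clamp it to safe limits.
def adjust_amplitude(amplitude_ma, biomarker, low=0.2, high=0.4,
                     step_ma=0.1, min_ma=0.0, max_ma=3.0):
    """Return the next stimulation amplitude (mA) given a sensed biomarker.

    If the biomarker (e.g. normalized beta-band power, an assumption)
    rises above the target band, stimulation is increased; if it falls
    below, stimulation is decreased; otherwise it is left unchanged.
    """
    if biomarker > high:
        return min(max_ma, amplitude_ma + step_ma)
    if biomarker < low:
        return max(min_ma, amplitude_ma - step_ma)
    return amplitude_ma
```

The point is only the control structure: each update reads a sensed signal and adjusts one therapy parameter toward a target band, which is what would let device data personalize therapy automatically.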

Medtronic’s Activa PC+S system received the CE (Conformité Européenne) mark in January 2013. It is not approved by the U.S. Food and Drug Administration for commercial use in the United States, and will be made available to select physicians for investigational use only. Additional implants of the Activa PC+S system, including the first implant in the United States, will take place in the coming months.

Filed under deep brain stimulation brain activity Activa PC+S system parkinson's disease neuroscience science

106 notes

High Blood Sugar Linked to Dementia

People with diabetes face an increased risk of Alzheimer’s disease and other forms of dementia, a connection scientists and physicians have worried about for years. They still can’t explain it.

Now comes a novel observational study of patients at a large health care system in Washington State showing that higher blood glucose levels are associated with a greater risk of dementia — even among people who don’t have diabetes. The results, published Thursday in The New England Journal of Medicine, “may have influence on the way we think about blood sugar and the brain,” said Dr. Paul Crane, the lead author and associate professor of medicine at the University of Washington.

The researchers tracked the blood glucose levels of 2,067 members of Group Health, a nonprofit HMO, for nearly seven years on average. Some patients had Type 2 diabetes when the study began, but most didn’t. None had dementia.

Over the years, as they saw doctors at Group Health, the participants received blood glucose tests. “It’s a common test in routine clinical practice,” Dr. Crane said. “We had an amazing opportunity with all this data. All the lab results since 1988 were available to us.”

The participants (average age at the start: 76) also reported to Group Health every other year for cognitive screening and, if their results were below normal, further testing and evaluation. Over the course of the study, about a quarter developed dementia of some kind, primarily Alzheimer’s disease or vascular dementia.

To measure blood sugar levels, the researchers combined glucose measurements, both fasting and nonfasting, with the HbA1c glycated hemoglobin assay, which provides a more accurate long-term picture. They also adjusted the data for other cardiovascular factors already linked to dementia, like high blood pressure and smoking.
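To see why HbA1c “provides a more accurate long-term picture”: the assay reflects average glucose over roughly the preceding three months. A widely used conversion from the ADAG study (background context, not part of this paper) estimates average glucose from an HbA1c percentage, as sketched below.

```python
# Estimated average glucose (eAG, mg/dL) from HbA1c (%), via the ADAG
# study regression eAG = 28.7 * A1c - 46.7. Illustrative context only;
# the dementia study combined direct glucose measurements with HbA1c.
def estimated_average_glucose(hba1c_percent):
    """Convert an HbA1c percentage to estimated average glucose in mg/dL."""
    return 28.7 * hba1c_percent - 46.7
```

For example, an HbA1c of 5.0 percent corresponds to an estimated average glucose of about 97 mg/dL, and 7.0 percent to about 154 mg/dL.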

“We found a steadily increasing risk associated with ever-higher blood glucose levels, even in people who didn’t have diabetes,” Dr. Crane said. Of particular interest: “There’s no threshold, no place where the risk doesn’t go up any further or down any further.” The association with dementia kept climbing with higher blood sugar levels and, at the other end of the spectrum, continued to decrease with lower levels.

This held true even at glucose levels considered normal. Among those whose blood sugar averaged 115 milligrams per deciliter, the risk of dementia was 18 percent higher than among those at 100 mg/dL, just slightly lower. The effects were also pronounced among those with diabetes: patients with average glucose levels of 190 mg/dL had a 40 percent higher risk of dementia than those whose levels averaged 160 mg/dL.
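A quick back-of-the-envelope check (mine, not the paper’s) shows that the two comparisons imply a similar risk gradient per mg/dL in both groups, assuming the reported relative risk compounds smoothly across each glucose interval:

```python
# Illustrative calculation, not from the study: if the reported relative
# risk compounds smoothly over the glucose interval, the implied
# per-mg/dL gradient is similar with and without diabetes.
def per_mgdl_gradient(relative_risk, delta_mgdl):
    """Implied multiplicative risk increase per 1 mg/dL of average glucose."""
    return relative_risk ** (1.0 / delta_mgdl)

without_diabetes = per_mgdl_gradient(1.18, 115 - 100)  # 18% higher over 15 mg/dL
with_diabetes = per_mgdl_gradient(1.40, 190 - 160)     # 40% higher over 30 mg/dL
```

Both gradients come out near 1.011, roughly a 1.1 percent higher dementia risk per additional mg/dL of average glucose, consistent with Dr. Crane’s observation that there is no threshold.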

Though a longitudinal study like this one provides insight into the differences between people, it can’t explain why higher blood glucose might be connected to dementia, or tell individuals whether lower blood glucose is protective.

“People shouldn’t run for the hills or try crazy diets,” Dr. Crane cautioned. While an epidemiological study like this one can guide further exploration, he said, “This doesn’t show that changes in behavior that lower your individual blood sugar would decrease your individual risk of dementia.”

As for the blood glucose levels the study recorded, “clinically, they’re not big differences,” said Dr. Medha Munshi, a geriatrician and endocrinologist who directs the geriatric diabetes program at the Joslin Diabetes Center in Boston, who was not involved in the study. “I wouldn’t change my goals for diabetes management based on this study.” Nor would she warn someone whose blood glucose hits 115 mg/dL that he or she faces a greater risk of dementia.

But because diabetes itself can pose such a threat to health and quality of life, she still urges patients to adopt healthy practices like exercising regularly and maintaining a normal weight to try to avoid the disease. If by doing so they also lower their dementia risk — and knowing that would require a different study, focused on interventions — that would be a bonus.

This research “offers more evidence that the brain is a target organ for damage by high blood sugar,” said Dr. Munshi. “And everyone is still working on the ‘why.’”

Filed under alzheimer's disease dementia diabetes glucose levels medicine neuroscience science

352 notes

Chocolate may help keep brain healthy, sharp in old age, study says

Older chocoholics may have a new excuse to indulge their cravings: The dark stuff not only soothes the soul, but might also sharpen the mind. 

In a study published Wednesday in the journal Neurology, researchers reported that chocolate may help improve brain health and thinking skills in the elderly. The Boston-based team found that older people who initially performed poorly on a memory and reasoning test and also had reduced blood flow to their brains showed improvement after drinking two cups of cocoa every day for a month.  

The researchers had set out to test whether chocolate could increase blood flow to the brain during problem solving, boosting performance, after finding in earlier studies that consuming chocolate high in the antioxidant flavanol was associated with better brain and blood vessel functioning. They recruited 60 elderly subjects for the new study. Since they suspected that flavanol would improve the subjects’ thinking skills and blood flow, they randomly assigned subjects to drink either flavanol-rich or flavanol-poor hot chocolate.

The participants drank two cups of hot chocolate every day for 30 days. Before and after the study period, they completed a memory and reasoning test, which assessed their ability to recognize patterns in a series of letters on a computer screen. Additionally, the researchers used ultrasound to indirectly measure the blood flow to subjects’ brains, as well as magnetic resonance imaging, or MRI, to examine subjects’ white matter — the nerve fibers that connect different parts of the brain.

People who performed poorly on the initial cognitive test — about a third of the participants — also had reduced blood flow to their brains and widespread white matter damage. Those who scored high on the test had significantly better blood flow and more intact white matter, indicating that blood flow, cognitive functioning and brain structure were linked.

At the end of the 30 days, the team found that drinking hot chocolate benefited only the subjects who had poor cognitive and neurovascular function to begin with. After the hot cocoa regimen, those individuals showed an 8% improvement in blood flow and reaction times on the cognitive task that were roughly one minute faster. There was barely any improvement among those who had started out with normal blood flow and cognitive skills.

To the scientists’ surprise, there weren’t significant differences in the neurovascular or cognitive changes between the flavanol-rich and flavanol-poor groups — suggesting that something else in the chocolate was causing the improvements. The researchers plan to identify and test this component in future trials, said study leader Dr. Farzaneh A. Sorond, a neurologist at Brigham and Women’s Hospital in Boston.

After identifying the substance, the researchers may even be able to produce it in pill form, said Dr. Costantino Iadecola, a neurologist at Weill Cornell Medical College in New York City, who was not involved in the study.

By showing that blood flow to the brain is associated with cognitive function, the study helps explain earlier findings that people with high blood pressure and other cardiovascular conditions were prone to developing dementia. This, in turn, suggests that the cognitive functioning test and other measures used in the trial may one day serve as cheap, noninvasive methods to screen people for risk of dementia.

Scientists have focused more on treating than on preventing age-related cognitive decline, Sorond said.

“By the time people develop these problems, it’s too late to initiate the drugs we have,” she said. “If we could diagnose them earlier, before they have clinical symptoms, using physiological markers … maybe we could prevent the disease or lessen its impact.”

The study has its limitations. The ultrasound technique the researchers used offered only an estimate of blood flow to the brain – a precise measurement would require a more invasive method. “This was an easy way to get this information, but not the most accurate way,” Iadecola said.

He added that the study was small, and that it was unclear how long the chocolate’s effects would last.

“Will these changes persist after a month of cocoa or go back to where they were before? Would you take the cocoa forever?” Iadecola said. “We don’t know.”

Although the study results may tempt some to add chocolate to their diet, Sorond noted that the participants’ food intake was strictly regulated to offset the excess fat and sugar in hot chocolate. For people seeking to keep their brains healthy, she recommends an intervention already known to improve cognitive function: exercise.

Filed under brain function cognitive function cocoa consumption white matter blood flow neuroscience science

200 notes

Religious Factors and Hippocampal Atrophy in Late Life

Despite a growing interest in the ways spiritual beliefs and practices are reflected in brain activity, there have been relatively few studies using neuroimaging data to assess potential relationships between religious factors and structural neuroanatomy. This study examined prospective relationships between religious factors and hippocampal volume change using high-resolution MRI data from a sample of 268 older adults. Religious factors assessed included life-changing religious experiences, spiritual practices, and religious group membership. Hippocampal volumes were analyzed using the GRID program, which is based on a manual point-counting method and allows for semi-automated determination of region-of-interest volumes. Significantly greater hippocampal atrophy was observed for participants reporting a life-changing religious experience. Significantly greater hippocampal atrophy was also observed from baseline to final assessment among born-again Protestants, Catholics, and those with no religious affiliation, compared with Protestants not identifying as born-again. These associations were not explained by psychosocial or demographic factors, or baseline cerebral volume. Hippocampal volume has been linked to clinical outcomes, such as depression, dementia, and Alzheimer’s disease. The findings of this study indicate that hippocampal atrophy in late life may be uniquely influenced by certain types of religious factors.

Filed under hippocampal atrophy hippocampus brain function religious beliefs neuroscience science

171 notes

Autism affects different parts of the brain in women and men

Autism affects different parts of the brain in females than in males, a new study reveals. The research is published today in the journal Brain as an open-access article.

Scientists at the Autism Research Centre at the University of Cambridge used magnetic resonance imaging to examine whether autism affects the brain of males and females in a similar or different way. They found that the anatomy of the brain of someone with autism substantially depends on whether an individual is male or female, with brain areas that were atypical in adult females with autism being similar to areas that differ between typically developing males and females. This was not seen in men with autism.

“One of our new findings is that females with autism show neuroanatomical ‘masculinization’,” said Professor Simon Baron-Cohen, senior author of the paper. “This may implicate physiological mechanisms that drive sexual dimorphism, such as prenatal sex hormones and sex-linked genetic mechanisms.”

Autism affects 1% of the general population and is more prevalent in males. Most studies have therefore focused on male-dominant samples. As a result, our understanding of the neurobiology of autism is male-biased.

“This is one of the largest brain imaging studies of sex/gender differences yet conducted in autism. Females with autism have long been under-recognized and probably misunderstood,” said Dr Meng-Chuan Lai, who led the research project. “The findings suggest that we should not blindly assume that everything found in males with autism applies to females. This is an important example of the diversity within the ‘spectrum’.”

Dr Michael Lombardo, who co-led the study, added that although autism manifests itself in many different ways, grouping by gender may help provide a better understanding of this condition.

He said: “Autism as a whole is complex and vastly diverse, or heterogeneous, and this new study indicates that there are ways to subgroup the autism spectrum, such as whether an individual is male or female. Reducing heterogeneity via subgrouping will allow research to make significant progress towards understanding the mechanisms that cause autism.”

Filed under autism sex differences MRI brain neuroscience psychology science

211 notes

How parents see themselves may affect their child’s brain and stress level

Self-perceived social status predicts hippocampal function and stress hormones

A mother’s perceived social status predicts her child’s brain development and stress indicators, finds a study at Boston Children’s Hospital. While previous studies going back to the 1950s have linked objective socioeconomic factors — such as parental income or education — to child health, achievement and brain function, the new study is the first to link brain function to maternal self-perception.

In the study, children whose mothers saw themselves as having a low social status were more likely to have increased cortisol levels, an indicator of stress, and less activation of their hippocampus, a structure in the brain responsible for long-term memory formation (required for learning) and reducing stress responses.

Findings were published online August 6th by the journal Developmental Science, and will be part of a special issue devoted to the effects of socioeconomic status on brain development.

"We know that there are big disparities among people in income and education," says Margaret Sheridan, PhD, of the Labs of Cognitive Neuroscience at Boston Children’s Hospital, the study’s first author. "Our results indicate that a mother’s perception of her social status ‘lives’ biologically in her children."

Sheridan, senior investigator Charles Nelson, PhD, of Boston Children’s Hospital and colleagues studied 38 children aged 8.3 to 11.8 years. The children gave saliva samples to measure levels of cortisol, and 19 also underwent functional MRI of the brain, focusing on the hippocampus.

Mothers, meanwhile, rated their own social standing on a ladder scale of 1 to 10, comparing themselves with others in the United States. Findings were as follows:

  • After controlling for gender and age, the mother’s self-perceived social status was a significant predictor of cortisol levels in the child. This finding is consistent with studies in animals. “In animal research, your stress response is related to your relative standing in the hierarchy,” Sheridan says.
  • Similarly, the mother’s perceived social status significantly predicted the degree of hippocampal activation in their children during a learning task.
  • In contrast, actual maternal education or income-to-needs ratio (income relative to family size) did not significantly predict cortisol levels or hippocampal activation.

The findings suggest that while actual socioeconomic status varies, how people perceive and adapt to their situation is an important factor in child development. Some of this may be culturally determined, Sheridan notes. She is currently participating in a much larger international study of childhood poverty, the Young Lives Project, that is looking at objective and subjective measures of social status along with health measures and cognitive function. The study will capture much wider extremes of socioeconomic status than would a U.S.-based study.

What the current study didn’t find was evidence that stress itself alters hippocampal function; no relationship was found between cortisol and hippocampal function, as has been seen in animals, perhaps because of the small number of children who had brain fMRIs. “This needs further exploration,” says Sheridan. “There may be more than one pathway leading to differences in long-term memory, or there may be an effect of stress on the hippocampus that comes out only in adulthood.”

(Source: eurekalert.org)

Filed under social status hippocampus memory formation stress hormones brain development neuroscience psychology science

77 notes

Researchers find caffeine during pregnancy negatively impacts mice brains

A team of European researchers has found that mice that consume caffeine while pregnant give birth to pups with negative changes to their brains. In their paper published in the journal Science Translational Medicine, the team reports its findings after examining the brains of mouse pups whose mothers were given caffeine during pregnancy.

Medical researchers have shown that drugs such as cocaine, heroin, or even marijuana can have a negative impact on fetal development. In contrast, most believe that moderate caffeine consumption during pregnancy is “safe,” meaning it has little or no adverse impact on fetal development. This new study doesn’t change that view, but it does suggest that more research may need to be done.

In their study, the researchers administered the equivalent of 4 or 5 cups of coffee a day to pregnant mice, then studied the brains of the pups that were born. They found that GABA neurons didn’t migrate during brain development to their proper location in the hippocampus at the same rate as in pups of untreated mice. GABA neurons are responsible for controlling the flow of information in the brain. Subsequent tests found the treated pups to be more susceptible to seizures.

The team also found that treated pups allowed to grow to adulthood tended to show memory problems: instead of playing with new objects placed in their cages, for example, they were satisfied with playing with objects they already knew, a trait that is uncommon in mice. Autopsies of adult brains also showed fewer neurons in the hippocampus.

The researchers point out that their results in mice are not necessarily applicable to humans. To reinforce that point, another team of researchers published a Focus piece in the same journal noting that there are significant differences between human and mouse fetal development, so the mouse study has no real bearing on whether caffeine may or may not cause developmental problems in human babies.

Still, the results do indicate that more research should be done to find out whether caffeine does indeed have an unrecognized negative impact on human fetal development.

Filed under caffeine fetal development brain development animal model pregnancy neuroscience science

85 notes

Making connections in the eye

Wiring diagram of retinal neurons is first step toward mapping the human brain.

The human brain has 100 billion neurons, connected to each other in networks that allow us to interpret the world around us, plan for the future, and control our actions and movements. MIT neuroscientist Sebastian Seung wants to map those networks, creating a wiring diagram of the brain that could help scientists learn how we each become our unique selves.

In a paper appearing in the Aug. 7 online edition of Nature, Seung and collaborators at MIT and the Max Planck Institute for Medical Research in Germany have reported their first step toward this goal: Using a combination of human and artificial intelligence, they have mapped all the wiring among 950 neurons within a tiny patch of the mouse retina.

Composed of neurons that process visual information, the retina is technically part of the brain and is a more approachable starting point, Seung says. By mapping all of the neurons in this 117-micrometer-by-80-micrometer patch of tissue, the researchers were able to classify most of the neurons they found, based on their patterns of wiring. They also identified a new type of retinal cell that had not been seen before.

“It’s the complete reconstruction of all the neurons inside this patch. No one’s ever done that before in the mammalian nervous system,” says Seung, a professor of computational neuroscience at MIT.

Other MIT authors of the paper are former postdoc Srinivas Turaga and former graduate student Viren Jain. The Max Planck team was led by Winfried Denk, a physicist and the Max Planck Institute’s director. Moritz Helmstaedter, a research group leader at the Max Planck Institute, is the lead author of the paper, and Kevin Briggman, a former postdoc at Max Planck, is also an author.

Tracing connections
Neurons in the retina are classified into five classes: photoreceptors, horizontal cells, bipolar cells, amacrine cells and ganglion cells. Within each class are many types, classified by shape and by the connections they make with other neurons.

“Neurons come in many types, and the retina is estimated to contain 50 to 100 types, but they’ve never been exhaustively characterized. And their connections are even less well known,” Seung says.

In this study, the research team focused on a section of the retina known as the inner plexiform layer, one of several layers sandwiched between the photoreceptors, which receive visual input, and the ganglion cells, which relay visual information to the brain via the optic nerve. The neurons of the inner plexiform layer help to process visual information as it passes from the surface of the eye to the optic nerve.

To map all of the connections in this small patch of retina, the researchers first took electron micrographs of the targeted section. The Max Planck researchers obtained these images using a technique called serial block face scanning electron microscopy, which they invented to generate high-resolution three-dimensional images of biological samples.

Developing a wiring diagram from these images required both human and artificial intelligence. First, the researchers hired about 225 German undergraduates to trace the “skeleton” of each neuron, which took more than 20,000 hours of work (a little more than two years).

To flesh out the bodies of the neurons, the researchers fed these traced skeletons into a computer algorithm developed in Seung’s lab, which expands the skeletons into full neuron shapes. The researchers used machine learning to train the algorithm, known as a convolutional network, to detect the boundaries between neurons. Using those as reference points, the algorithm can fill in the entire body of each neuron.

“Tracing neurons in these images is probably one of the world’s most challenging computer vision problems. Our convolutional networks are actually deep artificial neural networks designed with inspiration from how our own visual system processes visual information to solve these difficult problems,” Turaga says.

If human workers were to fill in the entire neuron body, it would take 10 to 100 times longer than just drawing the skeleton. “This speeds up the whole process,” Seung says. “It’s a way of combining human and machine intelligence.”
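The two-stage reconstruction described above (hand-traced skeletons grown out to machine-detected boundaries) can be illustrated in miniature. This is a toy sketch, not the authors’ pipeline: in the real system the boundary map comes from the trained convolutional network operating on three-dimensional electron micrographs, not a hand-built 2-D grid.

```python
import numpy as np
from collections import deque

# Toy boundary map: 0 = cell interior, 1 = boundary between neurons.
# In practice this map would be the convolutional network's output.
boundary = np.zeros((7, 7), dtype=int)
boundary[:, 3] = 1                       # a wall splitting two "neurons"

def fill_from_skeleton(boundary, seed):
    """Flood-fill the region containing `seed`, stopping at boundaries.

    This plays the role of expanding a traced skeleton point into the
    full neuron body, using the boundary map as the reference.
    """
    h, w = boundary.shape
    body = np.zeros_like(boundary)
    body[seed] = 1
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w \
                    and not boundary[nr, nc] and not body[nr, nc]:
                body[nr, nc] = 1
                queue.append((nr, nc))
    return body

left = fill_from_skeleton(boundary, (3, 1))   # skeleton point of neuron A
right = fill_from_skeleton(boundary, (3, 5))  # skeleton point of neuron B
print(left.sum(), right.sum())  # each body fills one side of the wall
```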
The only previous complete wiring diagram, which mapped all of the connections between the 302 neurons found in the worm Caenorhabditis elegans, was reported in 1986 and required more than a dozen years of tedious labor.

“I think this is going to be a really significant paper in the history of how we study complex systems,” says Richard Masland, a professor of ophthalmology at the Massachusetts Eye and Ear Infirmary, who was not part of the research team. “This paper identifies circuit motifs that are interesting but really are just symbolic of the many types of questions that could be answered using these techniques.”

Classifying neurons

Wiring diagrams allow scientists to see where neurons connect with each other to form synapses — the junctions that allow neurons to relay messages. By analyzing how neurons are connected to each other, researchers can classify different types of neurons.

The researchers were able to identify most of the 950 neurons included in the new retinal-wiring diagram based on their connections with other neurons, as well as the shape of the neuron. A handful of neurons could not be classified because there was only one of their type, or because only a fragment of the neuron was included in the imaged sample.

“We haven’t completed the project of classifying types but this shows that it should be possible. This method should be able to do it, in principle, if it’s scaled up to a larger piece of tissue,” Seung says.

In this study, the researchers identified a new class of bipolar cells, which relay information from photoreceptors to ganglion cells. However, further study is needed to determine this cell type’s exact function.

Seung’s lab is now working on a wiring diagram of a larger piece of the retina — 0.3 millimeter by 0.3 millimeter — using a slightly different approach. In that study, the researchers first feed their electron micrographs into the computer algorithm, then ask human volunteers to check over the computer’s work and correct mistakes through a crowd-sourcing project known as EyeWire.

Filed under mouse retina retinal cells ganglion cells EyeWire wiring diagram neuroscience science

162 notes

Our brains can (unconsciously) save us from temptation

Inhibitory self-control – not picking up a cigarette, not having a second drink, not spending when we should be saving – can operate without our awareness or intention.

That was the finding of scientists at the University of Pennsylvania’s Annenberg School for Communication and the University of Illinois at Urbana-Champaign. They demonstrated through neuroscience research that inaction-related words in our environment can unconsciously influence our self-control. Although we may mindlessly eat cookies at a party, stopping ourselves from over-indulging may seem impossible without a deliberate, conscious effort. However, it turns out that overhearing someone – even in a completely unrelated conversation – say something as simple as “calm down” might trigger us to stop our cookie-eating frenzy without realizing it.

The findings were reported in the journal Cognition by Justin Hepler, M.A., University of Illinois; and Dolores Albarracín, Ph.D., the Martin Fishbein Chair of Communication and a Professor of Psychology at Penn.

Volunteers completed a study where they were given instructions to press a computer key when they saw the letter “X” on the computer screen, or not press a key when they saw the letter “Y.” Their actions were affected by subliminal messages flashing rapidly on the screen. Action messages (“run,” “go,” “move,” “hit,” and “start”) alternated with inaction messages (“still,” “sit,” “rest,” “calm,” and “stop”) and nonsense words (“rnu,” or “tsi”). The participants were equipped with electroencephalogram recording equipment to measure brain activity.
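The trial structure of this go/no-go design can be sketched as follows. This is a hypothetical reconstruction of the procedure from the description above, not the authors’ experiment code; the prime and target words are taken from the article, but the trial count and randomization scheme are assumptions:

```python
import random

# Word lists as reported in the article.
ACTION = ["run", "go", "move", "hit", "start"]
INACTION = ["still", "sit", "rest", "calm", "stop"]
NONSENSE = ["rnu", "tsi"]

def build_trials(n_trials, seed=0):
    """Build a randomized list of prime + target trials.

    Each trial pairs a briefly flashed (subliminal) prime word with a
    go target ("X", press the key) or a no-go target ("Y", withhold).
    """
    rng = random.Random(seed)
    pools = {"action": ACTION, "inaction": INACTION, "nonsense": NONSENSE}
    trials = []
    for _ in range(n_trials):
        prime_type = rng.choice(list(pools))
        target = rng.choice(["X", "Y"])
        trials.append({
            "prime": rng.choice(pools[prime_type]),
            "prime_type": prime_type,
            "target": target,
            "correct_response": target == "X",  # True = press, False = withhold
        })
    return trials

trials = build_trials(200)
go = sum(t["target"] == "X" for t in trials)
print(f"{go} go trials, {len(trials) - go} no-go trials")
```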
The unique aspect of this test is that the action or inaction messages had nothing to do with the actions or inactions volunteers were doing, yet Hepler and Albarracín found that the action/inaction words had a definite effect on the volunteers’ brain activity. Unconscious exposure to inaction messages increased the activity of the brain’s self-control processes, whereas unconscious exposure to action messages decreased this same activity.

“Many important behaviors such as weight loss, giving up smoking, and saving money involve a lot of self-control,” the researchers noted. “While many psychological theories state that actions can be initiated automatically with little or no conscious effort, these same theories view inhibition as an effortful, consciously controlled process. Although reaching for that cookie doesn’t require much thought, putting it back on the plate seems to require a deliberate, conscious intervention. Our research challenges the long-held assumption that inhibition processes require conscious control to operate.”

The full article, “Complete unconscious control: Using (in)action primes to demonstrate completely unconscious activation of inhibitory control mechanisms,” will be available in the September issue of the journal.

(Image: Getty Images)

Filed under brain activity self-control EEG inhibition neuroscience science

80 notes

Scientists watch live brain cell circuits spark and fire

NIH-funded scientists show new genetically engineered proteins may be important tool for the President’s BRAIN Initiative


Scientists used fruit flies to show for the first time that a new class of genetically engineered proteins can be used to watch electrical activity in individual brain cells in live brains. The results, published in Cell, suggest these proteins may be a promising new tool for mapping brain cell activity in multiple animals and for studying how neurological disorders disrupt normal nerve cell signaling. Understanding brain cell activity is a high priority of the President’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative.

Brain cells use electricity to control thoughts, movements and senses. Ever since the late eighteenth century, when Dr. Luigi Galvani induced frog legs to move with electric shocks, scientists have been trying to watch nerve cell electricity to understand how it is involved in these actions. Usually they directly monitor electricity with cumbersome electrodes or toxic voltage-sensitive dyes, or indirectly with calcium detectors. This study, led by Michael Nitabach, Ph.D., J.D., and Vincent Pieribone, Ph.D., at the Yale School of Medicine, New Haven, CT, shows that a class of proteins, called genetically encoded fluorescent voltage indicators (GEVIs), may allow researchers to watch nerve cell electricity in a live animal.

Dr. Pieribone and his colleagues helped develop ArcLight, the protein used in this study. ArcLight fluoresces, or glows, as a nerve cell’s voltage changes and enables researchers to watch, in real time, the cell’s electrical activity. In this study, Dr. Nitabach and his colleagues engineered fruit flies to express ArcLight in brain cells that control the fly’s sleeping cycle or sense of smell. Initial experiments in which the researchers simultaneously watched brain cell electricity with a microscope and recorded voltage with electrodes showed that ArcLight can accurately monitor electricity in a living brain. Further experiments showed that ArcLight illuminated electricity in parts of the brain that were previously inaccessible using other techniques. Finally, ArcLight allowed the researchers to watch brain cells spark and fire while the flies were awakening and smelling. These results suggest that in the future neuroscientists may be able to use ArcLight and similar GEVIs in a variety of ways to map brain cell circuit activity during normal and disease states.
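A typical first step in analyzing a fluorescent voltage indicator like ArcLight is converting the raw trace to a fractional change in fluorescence (dF/F) and flagging deflections. The sketch below is a generic, simulated illustration of that idea, not code or data from the study; note that ArcLight dims when a cell depolarizes, so events appear as downward deflections:

```python
import numpy as np

# Simulated fluorescence trace with three depolarization events
# (values and thresholds are illustrative assumptions).
rng = np.random.default_rng(1)
f = 100 + rng.normal(0, 0.2, 1000)    # baseline fluorescence plus noise
f[[200, 500, 800]] -= 5               # three simulated voltage events

f0 = np.median(f)                     # baseline fluorescence estimate
dff = (f - f0) / f0                   # fractional change, dF/F
events = np.flatnonzero(dff < -0.03)  # downward deflections beyond 3%
print(f"detected {events.size} candidate events")
```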

(Source: ninds.nih.gov)

Filed under brain cells fruit flies brain mapping GEVIs ArcLight cell activity neuroscience science
