Neuroscience

Articles and news from the latest research reports.

Posts tagged neuroscience

9 notes

Imaging Study Sheds New Light on Alcohol-Related Birth Defects

A collaborative research effort by scientists at the University of North Carolina School of Medicine, Duke University, and University College London in the UK sheds new light on alcohol-related birth defects.

The project, led by Kathleen K. Sulik, PhD, a professor in the Department of Cell and Developmental Biology and the Bowles Center for Alcohol Studies at UNC, could help enhance how doctors diagnose birth defects caused by alcohol exposure in the womb. The findings also illustrate how the precise timing of that exposure could determine the specific kinds of defects.

“We now know that maternal alcohol use is the leading known and preventable cause of birth defects and mental disability in the United States,” Sulik said. “Alcohol’s effects can cause a range of cognitive, developmental and behavioral problems that typically become evident during childhood, and last a lifetime.”

Fetal alcohol syndrome (FAS) is at the severe end of fetal alcohol spectrum disorders (FASD). First described in 1972, FAS is recognized by a specific pattern of facial features: small eyelid openings, a smooth philtrum (absence of the central groove between the nose and upper lip), and a thin upper lip border.

In its full-blown state, FAS affects roughly 1 in 750 live births in the U.S. And while clinicians typically look for those classical facial features in making a diagnosis, within the broader classification of FASD “adverse outcomes vary considerably and most individuals don’t exhibit the facial characteristics that currently define FAS,” said the study’s lead author Robert J. Lipinski, PhD, a postdoctoral scientist in Sulik’s lab. “This study could expand the base of diagnostic criteria used by clinicians who suspect problems caused by maternal alcohol use.”

In their animal-based studies, the Sulik lab team has collaborated with co-author G. Allan Johnson, PhD, and his group at Duke University’s Center for In Vivo Microscopy. Johnson, professor of radiology and physics, has developed new imaging tools with spatial resolution up to a million times higher than clinical magnetic resonance imaging (MRI). These include small-bore tools suitable for imaging fetal mice that are only 15 mm long.

To quantify facial shape from MRI data, the study team turned to co-author Peter Hammond, a professor of computational biology at UCL’s Institute of Child Health, in London. Hammond invented powerful new techniques for 3D shape analysis that have already proven successful in objectively defining facial shape changes in humans.

In the study, described in the August 22, 2012 issue of the online journal PLOS ONE, Lipinski and Sulik treated one group of mice with alcohol on their seventh day of pregnancy, a time corresponding to the third week of pregnancy in humans. A second group of mice was treated just 36 hours later, approximating the fourth week of human pregnancy. The amount of alcohol given was large, “high doses that most women wouldn’t achieve unless they were alcoholic and had a tolerance for alcohol,” Sulik said.

Near the end of pregnancy, the fetuses were then imaged at Duke University. These 3D data sets showed individual brain regions, as well as accurate and detailed facial surfaces, from which Hammond and research assistant and co-author Michael Suttie performed shape analyses.

The team found that the earlier alcohol exposure elicited the classic FAS facial features, including characteristic abnormalities of the upper lip and eyes. What they observed in fetuses exposed just 36 hours later, however, was a surprise. These mice exhibited unique and in some cases opposing facial patterns, such as a shortened upper lip and an intact philtrum, and a brain that, instead of appearing too narrow at the front, appeared wide.

“Overall, the results of our studies show that alcohol can cause more than one pattern of birth defects, and that the type and extent of brain abnormalities—which are the most devastating manifestation of prenatal alcohol exposure—in some cases may be predicted by specific facial features,” Sulik said. “And, importantly, alcohol can cause tremendously devastating and permanent damage at a time in development when most women don’t recognize that they’re pregnant.”

Source: Newswise

Filed under alcohol science neuroscience psychology birth defects FAS pregnancy

44 notes

Intensive preparation for the Law School Admission Test (LSAT) actually changes the microscopic structure of the brain, physically bolstering the connections between areas of the brain important for reasoning, according to neuroscientists at the University of California, Berkeley.

The results suggest that training people in reasoning skills – the main focus of LSAT prep courses – can reinforce the brain’s circuits involved in thinking and reasoning and could even raise people’s IQ scores.

“The fact that performance on the LSAT can be improved with practice is not new. People know that they can do better on the LSAT, which is why preparation courses exist,” said Allyson Mackey, a graduate student in UC Berkeley’s Helen Wills Neuroscience Institute who led the study. “What we were interested in is whether and how the brain changes as a result of LSAT preparation, which we think is, fundamentally, reasoning training. We wanted to show that the ability to reason is malleable in adults.”

The new study shows that reasoning training does alter brain connections, which is good news for the test prep industry, but also for people who have poor reasoning skills and would like to improve them. The findings are reported today (Wednesday, Aug. 22) in the open access journal Frontiers in Neuroanatomy.

Filed under science neuroscience brain LSAT reasoning psychology intelligence

36 notes

With a little training, signs of schizophrenia are averted

August 22, 2012

Animals that literally have holes in their brains can go on to behave as normal adults if they’ve had the benefit of a little cognitive training in adolescence. That’s according to new work in the August 23 issue of Neuron, a Cell Press publication, featuring an animal model of schizophrenia in which rats with particular neonatal brain injuries develop schizophrenia-like symptoms.

"The brain can be loaded with all sorts of problems," said André Fenton of New York University. "What this work shows is that experience can overcome those disabilities."

Fenton’s team made the discovery completely by accident. They were interested in what Fenton considers a core problem in schizophrenia: the inability to sift through confusing or conflicting information and focus on what’s relevant.

"As you walk through the world, you might be focused on a phone conversation, but there are also kids in the park and cars and other distractions," he explained. "These information streams are all competing for our brain to process them. That’s a really challenging situation for someone with schizophrenia."

Fenton and his colleagues developed a laboratory test of cognitive control needed for that kind of focus. In the test, rats had to learn to avoid a foot shock while they were presented with conflicting information. Normal rats can manage that task quickly. Rats with brain lesions can also manage this task, but only up until they become young adults—the equivalent of an 18- or 20-year-old person—when signs of schizophrenia typically set in.

While that was good to see, Fenton says, it wasn’t really all that surprising. But then some unexpected circumstances in the lab led them to test animals with adolescent experience in the cognitive control test again, once they had grown into adults.

These rats should have shown cognitive control deficits similar to those of rats that had not received prior cognitive training, or so the researchers thought. Instead, they were just fine. Their schizophrenia-like symptoms had somehow been averted.

Fenton believes their early training for focus forged some critical neural connections, allowing the animals to compensate for the injury still present in their brains in adulthood. Not only were the animals’ behaviors normalized with training, but so were the patterns of activity in their brains.

The findings are consistent with the notion that mental disorders are the consequence of problems in brain development that may have begun years earlier. They raise the hope that the right kinds of experiences at the right time could change the future by enabling people to better manage their diseases and function better in society. Adolescence, when the brain undergoes significant change and maturation, might be a prime time for such training.

"You may have a damaged brain, but the consequences of that damage might be overcome without changing the damage itself," Fenton says. "You could target schizophrenia, but other disorders aren’t very different," take autism or depression, for example.

And really, in this world of infinite distraction, couldn’t we all use a little more cognitive control?

Source: medicalxpress.com

Filed under science neuroscience brain psychology schizophrenia cognitive training

31 notes

A study in mice conducted by researchers at Tufts University School of Medicine suggests that a woman’s risk of anxiety and dysfunctional social behavior may depend on the experiences of her parents, particularly fathers, when they were young.

The study, published online in Biological Psychiatry, suggests that stress caused by chronic social instability during youth contributes to epigenetic changes in sperm cells that can lead to psychiatric disorders in female offspring across multiple generations.

Filed under science neuroscience brain psychology stress

26 notes

Fathers bequeath more mutations as they age

22 August 2012 by Ewen Callaway

Genome study may explain links between paternal age and conditions such as autism.

Older fathers’ sperm have more mutations — as do their children.
V. Peñafiel/Flickr/GETTY

In the 1930s, the pioneering geneticist J. B. S. Haldane noticed a peculiar inheritance pattern in families with long histories of haemophilia. The faulty mutation responsible for the blood-clotting disorder tended to arise on the X chromosomes that fathers passed to their daughters, rather than on those that mothers passed down. Haldane subsequently proposed that children inherit more mutations from their fathers than their mothers, although he acknowledged that “it is difficult to see how this could be proved or disproved for many years to come”.

(Source: nature.com)

Filed under science neuroscience psychology genomics autism mutations genetics

20 notes

'Genomic CSI' Helps Contain a Killer

22 August 2012 

In June of last year, a 43-year-old woman was admitted to the Clinical Center of the National Institutes of Health in Bethesda, Maryland, for a lung disease. Doctors knew she was carrying a highly resistant form of a deadly bacterium known as Klebsiella pneumoniae—although it didn’t make her sick—and they placed her in isolation. When the woman was discharged, no one else appeared to have become infected. A few weeks later, however, another patient was found to be carrying the bacterium, and over the next 3 months, 12 more intensive care patients contracted it. Six died as a direct result of the infection.

Doctors could not make sense of the outbreak with the usual methods: A survey of bed locations showed that the first patient had had no direct contact with any of the others and, in theory, Klebsiella might have been introduced into the hospital multiple times. So physicians turned to the bacterium’s genome for answers. The approach, known as genomic epidemiology, helped them track the path of the microbe, contain the disease, and save lives, according to a new study.

Tracking a killer. Full-genome sequencing revealed the movements of Klebsiella (shown) within one hospital. Credit: Image courtesy of Adrian Zelazny

Genomic epidemiology makes use of the fact that when bacteria divide, they accumulate mutations. As a result, the bacterial genome differs slightly—often by just one or two letters of genetic code, or base pairs—from one patient to the next. By fully sequencing the genomes of patients’ bacteria and finding these minute differences, researchers can track microbial movements with unprecedented precision. The technique has already been used to reconstruct the spread of methicillin-resistant Staphylococcus aureus (MRSA) around the world and to pinpoint the origin of a cholera outbreak in Haiti.
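The comparison at the heart of genomic epidemiology can be sketched in a few lines: count the single-letter (SNP) differences between aligned genomes, then link each isolate to the earlier isolate it most closely resembles. This is only an illustrative toy; the sequences and patient labels below are hypothetical, not data from the study, and real analyses use full phylogenetic methods rather than nearest-neighbor matching.

```python
# Toy sketch of SNP-distance-based transmission inference.

def snp_distance(a, b):
    """Count single-letter differences between two aligned genome strings."""
    if len(a) != len(b):
        raise ValueError("sequences must be aligned to the same length")
    return sum(1 for x, y in zip(a, b) if x != y)

def infer_sources(isolates):
    """For each isolate after the first, pick the earlier isolate with the
    smallest SNP distance as its most plausible source."""
    links = {}
    for i in range(1, len(isolates)):
        name_i, seq_i = isolates[i]
        best = min(isolates[:i], key=lambda it: snp_distance(it[1], seq_i))
        links[name_i] = best[0]
    return links

# Hypothetical isolates in order of collection; patient1 is the index case.
isolates = [
    ("patient1", "ACGTACGTAC"),
    ("patient2", "ACGTACGTAT"),  # 1 SNP from patient1
    ("patient3", "ACGTACGAAT"),  # 1 SNP from patient2, 2 from patient1
    ("patient4", "ACTTACGTAC"),  # 1 SNP from patient1 (independent branch)
]

print(infer_sources(isolates))
# {'patient2': 'patient1', 'patient3': 'patient2', 'patient4': 'patient1'}
```

Because each division adds only a mutation or two, the isolate pairs separated by the fewest SNPs are the most likely direct transmission links, which is how the NIH team could show three independent transmissions from the index patient.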

It also helped the doctors at the hospital in Bethesda. Comparing the genomes from all patients showed that the female patient admitted in June had indeed initiated the outbreak; the researchers showed that the bacteria had been transmitted from her to other patients three times independently. Apparently, transmission occurred in ways the researchers didn’t understand, says Tara Palmore, an infectious disease physician at the hospital. “When we realized there was more than met the eye, we started testing everyone in the hospital,” she says. That helped identify four more infected patients outside the intensive care unit, the scientists report online today in Science Translational Medicine. They were quickly isolated, which Palmore believes prevented further spread.

Just how the microbes were transmitted is still unclear. Palmore assumes that the bacteria mainly traveled on the hands of doctors. But the clinic had stationed a person outside the isolation rooms to make sure everyone who entered followed a hygiene regimen 24/7. That suggests that bacteria might have established colonies on surfaces or medical equipment and spread that way as well. “The conventional wisdom is that Klebsiellas do not really survive in the environment, but we found them in six sink drains and a ventilator,” Palmore says.

"This small study demonstrates the potential power of whole genome sequencing for outbreak investigation and surveillance," says Sharon Peacock, a microbiologist at the University of Cambridge in the United Kingdom who was not involved in the work. And infectious disease specialist Dag Harmsen of the University Clinic of Münster in Germany says it is "further proof that the time is ripe for using genomic sequencing of pathogens in a hospital setting." The paper also highlights the dangers of resistant Gram-negative bacteria like Klebsiella p., he adds. In many patients, the bacteria were not susceptible to any available antibiotic; not even to colistin, an old compound used only when all else fails. “This is even more dramatic than MRSA, because you have nothing left to treat the patients with,” Harmsen says. Since the outbreak, every patient at the hospital is checked for such dangerous pathogens; one more resistant Klebsiella case—although a different strain—has been found so far.

Genomic epidemiology could make it easier for hospitals to deal with similar outbreaks, Palmore says. “A lot of academic centers have the ability to do this now,” she says. The cost is becoming less of an issue; during last year’s outbreak, scientists still paid about $2000 per genome sequenced; now that would be closer to $500. But Peacock cautions that it still takes bioinformatics specialists several weeks to interpret the data. “This technology will not be applicable to routine clinical practice until automated interpretation tools become available.”

(Source: news.sciencemag.org)

Filed under bacteria disease genomic epidemiology genomics microbes neuroscience science Klebsiella pneumoniae

34 notes

Most babies born in developed countries share a common painful experience — a heel prick that is done soon after birth. Blood from this is deposited onto a slip of paper, called a Guthrie card, which doctors use to screen for devastating and sometimes fatal diseases. A study published today in Genome Research suggests that these cards, which are sometimes stored for decades, could provide an early snapshot of an individual’s epigenome, the chemical changes that influence gene expression and are likely to have a role in heart disease, diabetes, cancer and other diseases.

Filed under science neuroscience genetics genomics epigenome diseases

19 notes

Low-Dose Sedative Alleviates Autistic-Like Behavior in Mice With Dravet Syndrome Mutation

ScienceDaily (Aug. 22, 2012) — A low dose of the sedative clonazepam alleviated autistic-like behavior in mice with a mutation that causes Dravet syndrome in humans, University of Washington researchers have shown.

Dravet syndrome is an infant seizure disorder accompanied by developmental delays and behavioral symptoms that include autistic features. It usually arises from a spontaneous gene mutation in the affected child that is found in neither parent.

Studies of mice with a similar gene mutation are revealing the overly excited brain circuits behind the autistic traits and cognitive impairments common in this condition. The research report appears in the Aug. 23 issue of Nature. Dr William Catterall, professor and chair of pharmacology at the UW, is the senior author.

Dravet syndrome mutations cause loss of function of the human gene called SCN1A. People or mice with two copies of the mutation do not survive infancy; one copy results in major disability and sometimes early death. The mutation causes malformation of one type of sodium ion channel, the tiny pores in nerve cells that produce electrical signals by gating the flow of sodium ions.

The Catterall lab is studying these defective ion channels and their repercussions for cell-to-cell signaling in the brain. The researchers are also documenting the behavior of mice with this mutation compared with that of their unaffected peers. Their findings may help explain how the sporadic gene mutations that cause Dravet syndrome lead to its symptoms of cognitive deficit and autistic behaviors.

Filed under science neuroscience dravet syndrome genetics autistic traits autism mutation SCN1A cognitive deficit

17 notes

A new UCLA study pinpoints uniquely human patterns of gene activity in the brain that could shed light on how we evolved differently from our closest relatives. Published Aug. 22 in the advance online edition of Neuron, the study’s identification of these genes could improve understanding of human brain diseases like autism and schizophrenia, as well as learning disorders and addictions.

(Image by Michael Nichols)

Filed under science neuroscience brain psychology evolution genetics disorder addiction

12 notes

Rewired visual input to sound-processing part of the brain leads to compromised hearing

Scientists at Georgia State University have found that the ability to hear is lessened when, as a result of injury, a region of the brain responsible for processing sounds receives both visual and auditory inputs.

Yu-Ting Mao, a former graduate student under Sarah L. Pallas, professor of neuroscience, explored how neuroplasticity, the brain’s ability to change, affects sound processing when both visual and auditory information are sent to the auditory thalamus.

The study was published in the Journal of Neuroscience.

The auditory thalamus is the region of the brain responsible for carrying sound information to the auditory cortex, where sound is processed in detail.

When a person or animal loses input from one of the senses, such as hearing, the region of the brain that processes that information does not become inactive, but instead gets rewired with input from other sensory systems.

In the case of this study, early brain injury resulted in visual inputs into the auditory thalamus, which altered how the auditory cortex processes sounds.

The cortical “map” for discriminating different sound frequencies was significantly disrupted, Mao explained.

“One of the possible reasons the sound frequency map is so disrupted is that visual responsive neurons are sprinkled here and there, and we also have a lot of single neurons that respond to both light and sound,” Pallas said. “So those strange neurons sprinkled there probably keeps the map from forming properly.”

Mao also discovered reduced sensitivity and slower responses of neurons in the auditory cortex to sound.

Finally, the neurons in the auditory cortex were less sharply tuned to different frequencies of sound.

“Generally, individual neurons will be pretty sensitive to one sound frequency that we call their ‘best frequency,’” Pallas said. “We found that they would respond to a broader range of frequencies after the rewiring with visual inputs.”

While Pallas’ research seeks to build a basic understanding of brain development, knowledge gained from her lab’s studies may help people who are deaf, blind, or have suffered brain injuries keep their visual and auditory functions from being compromised.

“Usually we think of plasticity as a good thing, but in this case, it’s a bad thing,” she said. “We would like to limit the plasticity so that we can keep the function that’s supposed to be there.”

Source: Georgia State University

Filed under science neuroscience brain neuroplasticity auditory thalamus hearing
