Neuroscience

Articles and news from the latest research reports.

Posts tagged psychology

37 notes

Primitive consciousness emerges first as you awaken from anesthesia

April 4, 2012

Awakening from anesthesia is often associated with an initial phase of delirious struggle before the full restoration of awareness and orientation to one’s surroundings. Scientists now know why this may occur: primitive consciousness emerges first. Using brain imaging techniques in healthy volunteers, a team of scientists led by Adjunct Professor Harry Scheinin, M.D., from the University of Turku, Finland, in collaboration with investigators from the University of California, Irvine, has now imaged the process of returning consciousness after general anesthesia. The emergence of consciousness was found to be associated with activations of deep, primitive brain structures rather than the evolutionarily younger neocortex.

This image shows one returning from oblivion — imaging the neural core of consciousness. Positron emission tomography (PET) findings show that the emergence of consciousness after anesthesia is associated with activation of deep, phylogenetically old brain structures rather than the neocortex. Left: Sagittal (top) and axial (bottom) sections show activation in the anterior cingulate cortex (i), thalamus (ii) and the brainstem (iii) locus coeruleus/parabrachial area overlaid on magnetic resonance image (MRI) slices. Right: Cortical renderings show no evident activations. Credit: Turku PET Center

These results may represent an important step forward in the scientific explanation of human consciousness.

"We expected to see that the outer bits of brain, the cerebral cortex (often thought to be the seat of higher human consciousness), would turn back on when consciousness was restored following anesthesia. Surprisingly, that is not what the images showed us. In fact, the more primitive central core structures of the brain, including the thalamus and parts of the limbic system, appeared to become functional first, suggesting that a foundational primitive conscious state must be restored before higher order conscious activity can occur," Scheinin said.

Twenty young healthy volunteers were put under anesthesia in a brain scanner using either dexmedetomidine or propofol anesthetic drugs. The subjects were then woken up while brain activity pictures were being taken. Dexmedetomidine is used as a sedative in the intensive care unit setting and propofol is widely used for induction and maintenance of general anesthesia. Dexmedetomidine-induced unconsciousness has a close resemblance to normal physiological sleep, as it can be reversed with mild physical stimulation or loud voices without requiring any change in the dosing of the drug. This unique property was critical to the study design, as it enabled the investigators to separate the brain activity changes associated with the changing level of consciousness from the drug-related effects on the brain. The state-related changes in brain activity were imaged with positron emission tomography (PET).

The emergence of consciousness, as assessed with a motor response to a spoken command, was associated with the activation of a core network involving subcortical and limbic regions that became functionally coupled with parts of frontal and inferior parietal cortices upon awakening from dexmedetomidine-induced unconsciousness. This network thus enabled the subjective awareness of the external world and the capacity to behaviorally express the contents of consciousness through voluntary responses.

Interestingly, the same deep brain structures, i.e. the brain stem, thalamus, hypothalamus and the anterior cingulate cortex, were also activated upon emergence from propofol anesthesia, suggesting a common, drug-independent mechanism of arousal. For both drugs, activations seen upon regaining consciousness were thus mostly localized in deep, phylogenetically old brain structures rather than in the neocortex.

The researchers speculate that because current depth-of-anesthesia monitoring technology is based on cortical electroencephalography (EEG) measurement (i.e., measuring electrical signals on the surface of the scalp that arise from the brain’s cortical surface), their results help to explain why these devices fail in differentiating the conscious and unconscious states and why patient awareness during general anesthesia may not always be detected. The results presented here also add to the current understanding of anesthesia mechanisms and form the foundation for developing more reliable depth-of-anesthesia technology.

The anesthetized brain provides new views into the emergence of consciousness. Anesthetic agents are clinically useful for their remarkable property of being able to manipulate the state of consciousness. When given a sufficient dose of an anesthetic, a person will lose the precious but mysterious capacity of being aware of one’s own self and the surrounding world, and will sink into a state of oblivion. Conversely, when the dose is lightened or wears off, the brain almost magically recreates a subjective sense of being as experience and awareness return. The ultimate nature of consciousness remains a mystery, but anesthesia offers a unique window for imaging internal brain activity when the subjective phenomenon of consciousness first vanishes and then re-emerges. This study was designed to give the clearest picture so far of the internal brain processes involved in this phenomenon.

The results may also have broader implications. The demonstration of which brain mechanisms are involved in the emergence of the conscious state is an important step forward in the scientific explanation of consciousness. Yet, much harder questions remain. How and why do these neural mechanisms create the subjective feeling of being, the awareness of self and environment, the state of being conscious?

Provided by Academy of Finland

Source: medicalxpress.com

Filed under science neuroscience brain psychology

0 notes

Light Switch Added to Gene Tool Opens New View of Cell Development

ScienceDaily (Apr. 4, 2012) — University of Oregon scientists collaborating with an Oregon company that synthesizes antisense Morpholinos for genetic research have developed a UV light-activated on-off switch for the vital gene-blocking molecule. Based on initial testing in zebrafish embryos, the enhanced molecule promises to deliver new insights for developmental biologists and brain researchers.

The seven-member team describes the advancement in an open-access paper published in the May issue of the journal Development. UO neuroscientist Philip Washbourne, a professor of biology, says the paper is a “proof-of-concept” on an idea he began discussing with scientists at Gene Tools LLC in Philomath, Ore., about four years ago. Gene Tools was founded in the 1980s by James Summerton, who first invented Morpholino oligos. The company holds the exclusive license to distribute these molecules to researchers around the world.

Morpholinos are short-chain, artificially produced oligomers that bind to RNA in cells and block protein synthesis. For a decade, biologists have used them in zebra fish, mice and African clawed toads to study development, but they remained in the active, or on, position. Gene Tools created and introduced a light-sensitive linker, allowing researchers to control the molecule — even leaving one on in one cell and off in an adjacent cell — with a pinpoint UV laser beam.

Researchers in Washbourne’s lab — led by neuroscience research associate Alexandra Tallafuss — were challenged to give the new molecules a test run. They applied them to their work in zebra fish. “Now we can turn them on and off,” Washbourne said. “You can insert them and then manipulate them to learn just when a gene is important, and we learned two things right away.”

Researchers have known that if a gene known as “no tail” is blocked in development, zebra fish fail to grow tails. They now know that the no-tail gene does not need to produce protein for tail formation until about 10 hours, or very late, into an embryo’s development.

Secondly, the researchers looked at the gene sox10, which is vital in the formation of neural crest cells, which give rise to dorsal root ganglion cells — neurons that migrate out of the spinal cord — and pigment cells. “Again, we found that sox10 is not needed as early in development as theorized,” Washbourne said.

"These light-sensitive molecules significantly expand the power and precision of molecular genetic studies in zebrafish," said Robert Riddle, a program director at the National Institute of Neurological Disorders and Stroke (NINDS). "Researchers from many fields will be able to use these tools to explore the function of different genes in embryonic regions, specific cell types and at precise times in an animal’s lifespan."

The NINDS and National Institute of Child Health and Human Development, both at the National Institutes of Health, supported the research through grants to Washbourne and Eisen.

"This successful collaboration between our scientists and this Oregon-based company shows that commercial innovation can come quickly by jointly addressing common needs," said Kimberly Andrews Espy, vice president for research and innovation at the UO. "This is a remarkable example of turning a concept into a working tool that likely will benefit many researchers around the world."

Source: Science Daily

Filed under science neuroscience brain psychology

4 notes

Study shows why some pain drugs become less effective over time

April 3, 2012

Researchers at the University of Montreal’s Sainte-Justine Hospital have identified how neural cells like those in our bodies build up resistance to opioid pain drugs within hours. Humans have known about the usefulness of opioids, which are often harvested from poppy plants, for centuries, but we have had very little insight into how they lose their effectiveness in the hours, days and weeks following the first dose.

"Our study revealed cellular and molecular mechanisms within our bodies that enable us to develop resistance to this medication, or what scientists call drug tolerance," lead author Dr. Graciela Pineyro explained. "A better understanding of these mechanisms will enable us to design drugs that avoid tolerance and produce longer therapeutic responses."

The research team looked at how drug molecules would interact with molecules called “receptors” that exist in every cell in our body. Receptors, as the name would suggest, receive “signals” from the chemicals that they come into contact with, and the signals then cause the various cells to react in different ways. They sit on the cell membrane and wait for corresponding chemicals known as receptor ligands to interact with them. “Until now, scientists have believed that ligands acted as ‘on-off’ switches for these receptors, all of them producing the same kind of effect with variations in the magnitude of the response they elicit,” Pineyro explained. “We now know that drugs that activate the same receptor do not always produce the same kind of effects in the body, as receptors do not always recognize drugs in the same way. Receptors will configure different drugs into specific signals that will have different effects on the body.”

Pineyro is attempting to tease the “painkilling” function of opioids from the part that triggers mechanisms that enable tolerance to build up. “My laboratory and my work are mostly structured around rational drug design, and trying to define how drugs produce their desired and non-desired effects, so as to avoid the second,” Pineyro said. “If we can understand the chemical mechanisms by which drugs produce therapeutic and undesired side effects, we will be able to design better drugs.”

Once activated by a drug, receptors move from the surface of the cell to its interior, and once they have completed this ‘journey’, they can either be destroyed or returned to the surface and used again through a process known as “receptor recycling.” By comparing two types of opioids – DPDPE and SNC-80 – the researchers found that the ligands that encouraged recycling produced less analgesic tolerance than those that didn’t. “We propose that the development of opioid ligands that favour recycling could be a way of producing longer-acting opioid analgesics,” Pineyro said.

Provided by University of Montreal

Source: medicalxpress.com

Filed under science neuroscience brain psychology

2 notes

Early warning system for seizures could cut false alarms

April 3, 2012

Epilepsy affects 50 million people worldwide, but in a third of these cases, medication cannot keep seizures from occurring. One solution is to shoot a short pulse of electricity to the brain to stamp out the seizure just as it begins to erupt. But brain implants designed to do this have run into a stubborn problem: too many false alarms, triggering unneeded treatment. To solve this, Johns Hopkins biomedical engineers have devised new seizure detection software that, in early testing, significantly cuts the number of unneeded pulses of current that an epilepsy patient would receive.

Sridevi Sarma’s research focuses on a system with three components: electrodes implanted in the brain; a neurostimulator, or battery pack, connected to the electrodes by wires; and a sensing device, also located in the brain implant, that detects when a seizure is starting and activates the current to stop it. Credit: Greg Stanley/JHU

Sridevi V. Sarma, an assistant professor of biomedical engineering, is leading this effort to improve anti-seizure technology that sends small amounts of current into the brain to control seizures.

"These devices use algorithms — a series of mathematical steps — to figure out when to administer the treatment," Sarma said. "They’re very good at detecting when a seizure is about to happen, but they also produce lots of false positives, sometimes hundreds in one day. If you introduce electric current to the brain too often, we don’t know what the health impacts might be. Also, too many false alarms can shorten the life of the battery that powers the device, which must be replaced surgically."

Her new software was tested on real-time brain activity recordings collected from four patients with drug-resistant epilepsy who experienced seizures while being monitored. In a study published recently in the journal Epilepsy & Behavior, Sarma’s team reported that its system yielded superior results, including flawless detection of actual seizures and up to 80 percent fewer alarms when a seizure was not occurring. Although the testing was not conducted on patients in a clinical setting, the results were promising.
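The article does not describe the team’s actual algorithm, but the trade-off it targets — perfect seizure detection with far fewer false alarms — can be illustrated with a minimal, hypothetical detector. The sketch below uses an invented persistence rule: stimulation fires only when signal energy stays above a baseline-relative threshold for several consecutive windows, which suppresses brief noise spikes at the cost of a small detection delay. All parameter names and values here are illustrative assumptions, not the published method.

```python
import numpy as np

def detect_seizure(signal, fs, threshold=3.0, min_duration=1.0, window=0.5):
    """Flag a seizure only when windowed signal energy stays above
    `threshold` times the baseline for `min_duration` seconds.
    The persistence rule trades a short delay for fewer false alarms."""
    win = int(window * fs)
    n_windows = len(signal) // win
    # short-time energy per window
    energy = np.array([np.mean(signal[i * win:(i + 1) * win] ** 2)
                       for i in range(n_windows)])
    # median energy serves as a crude baseline estimate
    baseline = np.median(energy) + 1e-12
    above = energy / baseline > threshold
    needed = max(1, int(min_duration / window))
    # fire only after `needed` consecutive supra-threshold windows
    run = 0
    for i, flag in enumerate(above):
        run = run + 1 if flag else 0
        if run >= needed:
            return i * window  # time (s) at which stimulation would fire
    return None
```

On pure noise this detector stays silent, while a sustained high-amplitude burst triggers it shortly after onset; lowering `min_duration` makes it faster but reintroduces false positives, which is exactly the tension the Johns Hopkins work addresses.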


Filed under science neuroscience brain psychology epilepsy

9 notes

Activity in Brain Networks Related to Features of Depression

ScienceDaily (Apr. 3, 2012) — Depressed individuals with a tendency to ruminate on negative thoughts, i.e. to repeatedly think about particular negative thoughts or memories, show different patterns of brain network activation compared to healthy individuals, report scientists of a new study in Biological Psychiatry.

The risk for depression is increased in individuals with a tendency towards negative ruminations, but patterns of autobiographic memory also may be predictive of depression.

When asked to recall specific events, some individuals have a tendency to recall broader categories of events instead of specific events. This is termed overgeneral memory and, like those who tend to ruminate, these individuals also have a higher risk of developing depression.

These self-referential activities engage a network of brain regions called the default mode network, or DMN. Prior studies using imaging techniques have already shown that the DMN activates abnormally in individuals with depression, but the relationship between DMN activity and depressive ruminations was not clear.

In this new report, Dr. Shuqiao Yao of Central South University in Hunan, China and colleagues evaluated DMN functional connectivity in untreated young adults experiencing their first episode of major depression and healthy volunteers. Each participant underwent a brain scan and completed tests to measure their levels of rumination and overgeneral memory.

As expected, the depressed patients exhibited higher levels of rumination and overgeneral memory than did the control subjects. They also observed increased functional connectivity in the anterior medial cortex regions and decreased functional connectivity in the posterior medial cortex regions in depressed patients compared with control subjects.

Among the depressed subjects, an interesting pattern of dissociation emerged. The increased connectivity in anterior regions was positively associated with rumination, while the decreased connectivity in posterior regions was negatively associated with overgeneral memory.

Dr. Yao commented on the importance of these findings: “In the future, resting-state network activity in the brain will provide useful models for investigating network features of cognitive dysfunction in psychopathology.”

"As we dig deeper in brain imaging studies, we are becoming increasingly interested in the activity of brain circuits rather than single brain regions," said Dr. John Krystal, Editor of Biological Psychiatry. “Although it is a more complicated process, studying brain circuits may provide greater insight into symptoms, such as depressive ruminations. The current study nicely illustrates how altered activity at different sites within a brain network may be related to different features of depression.”

Source: Science Daily

Filed under science neuroscience brain psychology depression

0 notes

Northwestern study compares endovascular brain aneurysm repair devices

April 3, 2012

Approximately 6 million Americans have brain aneurysms, a condition that occurs when a weak or thin spot develops on a blood vessel in the brain causing it to balloon. Often, these do not cause symptoms and go undetected, but every year an estimated 30,000 Americans experience a ruptured aneurysm that bleeds into the brain causing a life-threatening injury. Immediate medical treatment is necessary to prevent stroke, nerve damage or death, and includes surgery or coiling. Coiling is an approach that blocks blood flow to the aneurysm by filling it with platinum coils. While less invasive than surgery, the likelihood of future aneurysm recurrence and subsequent treatment is higher with coiling. In an effort to lower the risk for repeat aneurysm treatment after coiling, Northwestern Medicine researchers are examining a new type of gel-coated coil to determine if it is more effective than the standard bare coils in preventing aneurysm recurrence.

Aneurysms can be a very serious health threat, according to Bernard R. Bendok, MD, a neurosurgeon at Northwestern Memorial Hospital, who is the principal investigator for the new generation Hydrogel Endovascular Aneurysm Treatment Trial (HEAT). “When an aneurysm needs treatment, it is important to perform the safest, most effective and most durable treatment. This clinical research trial, called HEAT, will help us determine whether bare platinum coils, which have been used for years, or the newer gel-coated coils are more effective long-term,” said Bendok, who is also an associate professor of neurological surgery and radiology at Northwestern University Feinberg School of Medicine.

Coiling involves inserting a catheter into an artery and threading it through the body using live x-rays as a guide to the site of the aneurysm. Coils are passed through the catheter and released into the aneurysm filling it to block blood from entering. Blood clots then form around the coil preventing the vessels from rupturing or leaking and destroying the aneurysm.

"Coils are not always able to fill the aneurysm completely, which leaves dead space in the aneurysm. This space has been associated with a higher rate of aneurysm recurrence," explained Bendok. "The new coils are made with platinum and a hydrogel that expands over time to eliminate the space between the coils, potentially limiting the need for future treatment."

HEAT is an international randomized study that seeks to determine how the gel-packed coils measure up to the standard option in preventing future aneurysm recurrence. Northwestern is the lead site for the trial. Patients may be eligible for the trial if they are between the ages of 18 and 75 years with aneurysms 3 to 14 mm in size, amenable to coiling. An estimated 30 sites around the world are expected to join the trial, which has an enrollment goal of 600 participants.

On average, aneurysms affect about one percent of the adult population. Understanding symptoms and risk factors can be potentially lifesaving. Small aneurysms may not be associated with symptoms, but a larger, growing aneurysm may cause pressure on tissues and nerves, leading to symptoms including headache, pain above and behind the eye, a dilated pupil, double vision, and weakness, numbness or paralysis on one side of the face or body.

"In many cases, brain aneurysms remain silent until there’s a major problem," said Bendok. "Most are not found until they rupture or are found incidentally on brain images taken to assess another condition. The number one sign to look for is a sudden and extremely severe headache. If this occurs, one should seek immediate medical attention."

Other indicators that a person may have a ruptured aneurysm include double vision, nausea, vomiting, stroke-like symptoms, stiff neck, loss of consciousness and in some cases, seizure and changes in memory. Risk factors include hypertension, alcohol and drug abuse, and smoking. Aneurysms can be influenced by genetic factors and family history may be an indication for screening. People with certain hereditary diseases including connective tissue disorders or polycystic kidney disease can have a higher occurrence. Other associations include arteriovenous malformation (AVM) and blockage of certain blood vessels in the brain. Women are more likely than men to have brain aneurysms. It’s estimated about 10 in every 100,000 people will experience a ruptured aneurysm each year.

"Brain aneurysm rupture can be very devastating," said H. Hunt Batjer, MD, chairman of the department of neurological surgery at Northwestern Memorial and Michael J. Marchese Professor of neurological surgery at the Feinberg School. "It’s important to know what to look for and who might be at increased risk for aneurysm disease. While current treatments are effective, trials like HEAT have the potential to advance the art and science of brain aneurysm treatment and lead to even better treatment options in the future."

Provided by Northwestern Memorial Hospital

Source: medicalxpress.com

Filed under science neuroscience brain psychology

6 notes

Seeing Beyond the Visual Cortex

April 3, 2012 By Miles O’Brien and Jon Baime

(Medical Xpress) — It’s a chilling thought—losing the sense of sight because of severe injury or damage to the brain’s visual cortex. But, is it possible to train a damaged or injured brain to “see” again after such a catastrophic injury? Yes, according to Tony Ro, a neuroscientist at the City College of New York, who is artificially recreating a condition called blindsight in his lab.

"Blindsight is a condition that some patients experience after having damage to the primary visual cortex in the back of their brains. What happens in these patients is they go cortically blind, yet they can still discriminate visual information, albeit without any awareness," explains Ro.

[Video]

While no one is ever going to say blindsight is 20/20, Ro says it holds tantalizing clues to the architecture of the brain. “There are a lot of areas in the brain that are involved with processing visual information, but without any visual awareness,” he points out. “These other parts of the brain receive input from the eyes, but they’re not allowing us to access it consciously.”

With support from the National Science Foundation’s (NSF) Directorate for Social, Behavioral and Economic Sciences, Ro is developing a clearer picture of how other parts of the brain, besides the visual cortex, respond to visual stimuli.

In order to recreate blindsight, Ro must find a volunteer who is willing to temporarily be blinded by having a powerful magnetic pulse shot right into their visual cortex. The magnetic blast disables the visual cortex and blinds the person for a split second. “That blindness occurs very shortly and very rapidly—on the order of one twentieth of a second or so,” says Ro.

On the day of Science Nation’s visit to Ro’s lab in the Hamilton Heights section of Manhattan, volunteer Lei Ai is seated in a small booth in front of a computer with instructions to keep his eyes on the screen. A round device is placed on the back of Ai’s head. Then, the booth is filled with the sound of consistent clicks, about two seconds apart. Each click is a magnetic pulse disrupting the activity in his visual cortex, blinding him. Just as the pulse blinds him, a shape, such as a diamond or a square, flashes onto a computer screen in front of him.

Ro says that 60 to nearly 100 percent of the time, test subjects report back the shape correctly. “They’ll be significantly above chance levels at discriminating those shapes, even though they’re unaware of them. Sometimes they’re nearly perfect at it,” he adds.
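“Significantly above chance” has a concrete meaning here: with two candidate shapes, pure guessing succeeds 50% of the time, and a one-sided binomial test asks how likely a given hit rate would be under guessing alone. The sketch below is a generic illustration of that calculation, not an analysis from Ro’s study; the 75-of-100 figure is an invented example within the 60–100% range the article reports.

```python
from math import comb

def p_above_chance(correct, trials, chance=0.5):
    """One-sided binomial p-value: the probability of scoring
    `correct` or better in `trials` forced-choice trials if the
    subject were merely guessing at rate `chance`."""
    return sum(comb(trials, k) * chance**k * (1 - chance)**(trials - k)
               for k in range(correct, trials + 1))

# Hypothetical example: 75 correct out of 100 two-alternative trials.
p = p_above_chance(75, 100)  # far below the conventional 0.05 cutoff
```

A subject at exactly 50 of 100 yields a p-value near 0.5 (no evidence against guessing), which is why reports of blindsight rest on accuracies well above the chance rate.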

Ro observes what happens to other areas of Ai’s brain during the instant he is blinded and a shape is flashed on the screen. While the blindness wears off immediately with no lasting effects, according to Ro, the findings are telling. “There are likely to be a lot of alternative visual pathways that go into the brain from our eyes that process information at unconscious levels,” he says.

Ro believes understanding and mapping those alternative pathways might be the key to new rehabilitative therapies. “We have a lot of soldiers returning home who have a lot of brain damage to visual areas of the brain. We might be able to rehabilitate these patients,” he says. And that’s something worth looking into.

Provided by National Science Foundation

Source: medicalxpress.com

Filed under science neuroscience brain psychology

3 notes

Our brains on food: From anorexia to obesity and everything in between

April 3, 2012

The brains of people with anorexia and obesity are wired differently, according to new research. Neuroscientists for the first time have found that how our brains respond to food differs across a spectrum of eating behaviors – from extreme overeating to food deprivation. This study is one of several new approaches to help better understand and ultimately treat eating disorders and obesity.

Eating disorders have the highest mortality rate of any mental illness. And more than two-thirds of the U.S. population are overweight or obese – a health factor associated with cardiovascular issues, diabetes, and cancer. “This body of work not only increases our understanding of the relationship between food and brain function but can also inform weight loss programs,” says Laura Martin of Hoglund Brain Imaging Center at the University of Kansas Medical Center, one of several researchers whose work is being presented today at a meeting of cognitive neuroscientists in Chicago.

"One of the most intriguing aspects of these studies of the brain on food," Martin says, is that they show "consistent activations of reward areas of the brain that are also implicated in studies of addiction." However, how those reward areas respond to food differs between people depending on their eating behaviors, according to the new brain imaging study by Laura Holsen of Harvard Medical School and Brigham and Women’s Hospital and colleagues.

Holsen’s team conducted fMRI brain scans of individuals with one of three eating conditions – anorexia nervosa, simple obesity, and Prader-Willi syndrome (extreme obesity) – as well as healthy control subjects. When hungry, those with anorexia, who severely restrict their food intake, showed substantially decreased responses to various pictures of food in regions of their brains associated with reward and pleasure. For those who chronically overeat, there were significantly increased responses in those same brain regions.

"Our findings provide evidence of an overall continuum relating food intake behavior and weight outcomes to food reward circuitry activity," Holsen says. Her work also has implications, she says, for everyday eating decisions in healthy individuals. "Even in individuals who do not have eating disorders, there are areas of the brain that assist in evaluating the reward value of different foods, which in turn plays a role in the decisions we make about which foods to eat."

Kyle Simmons of the Laureate Institute studies the neural mechanisms that govern such everyday eating decisions. His work with fMRI scans has found that as soon as people see food, their brains automatically gather information about how they think it will taste and how that will make them feel. The brain scans showed an apparent overlap between the region of the insula that responds to seeing food pictures and the region of the insula that processes taste, the “primary gustatory cortex.”

Simmons is currently expanding this work to better understand the differences in taste preferences between lean, healthy individuals and obese ones. “We simply don’t know yet if differences exist between lean and obese participants,” he says. “And knowing which brain regions underlie inferences about food taste and reward is critical if we are going to develop efficacious interventions for obesity and certain eating disorders, both of which are associated with enormous personal and public health costs.”

Provided by Cognitive Neuroscience Society

Source: medicalxpress.com

Filed under science neuroscience psychology brain eating disorders obesity

0 notes

Autistic Kids Born Preterm, Post-term Have More Severe Symptoms

April 3, 2012

For children with autism, being born several weeks early or several weeks late tends to increase the severity of their symptoms, according to new research out of Michigan State University.

Additionally, autistic children who were born either preterm or post-term are more likely to self-injure than autistic children born on time, revealed the study by Tammy Movsas of MSU’s Department of Epidemiology.

Though the study did not uncover why there is an increase in autistic symptoms, the reasons may be tied to some of the underlying causes of why a child is born preterm (prior to 37 weeks) or post-term (after 42 weeks) in the first place.

The research appears online in the Journal of Autism and Developmental Disorders.

Movsas, a postdoctoral epidemiology fellow in MSU’s College of Human Medicine, said the study reveals there are many different manifestations of autism spectrum disorder, a collection of developmental disorders including both autism and Asperger syndrome. It also shows the length of the mother’s pregnancy is one factor affecting the severity of the disorder.

While previous research has linked premature birth to higher rates of autism, this is one of the first studies to look at the severity of the disorder among autistic children who had been born early, on time and late.

“We think about autism being caused by a combination of genetic and environmental factors,” she said. “With preterm and post-term babies, there is something underlying that is altering the genetic expression of autism.

“The outside environment in which a preterm baby continues to mature is very different than the environment that the baby would have experienced in utero. This change in environment may be part of the reason why there is a difference in autistic severity in this set of infants.”

Movsas added that for post-term babies, the longer exposure to hormones in utero, the higher chance of placental malfunction, and the increased rate of C-section and instrument-assisted births may all play a role.

The study also found that babies born outside of normal gestational age (40 weeks) – specifically very preterm babies – showed an increase in stereotypical autistic mannerisms.

“Normal gestation age of birth seems to mitigate the severity of autism spectrum disorder symptoms, and the types of autistic traits tend to be different depending on age at birth,” she said.

The study analyzed an online database compiled by the Kennedy Krieger Institute at Johns Hopkins University of nearly 4,200 mothers – with autistic children ages 4-21 – between 2006 and 2010. It divided the data on births into four categories: very preterm (born prior to 34 weeks); preterm (34 to 37 weeks); standard (37 to 42 weeks); and post-term (born after 42 weeks).
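The four gestational-age categories can be expressed as a simple classification rule. The sketch below is illustrative only: the function name is ours, and how the study assigned births falling exactly on a boundary (34, 37, or 42 weeks) is not stated, so the boundary handling here is an assumption.

```python
def gestational_category(weeks: float) -> str:
    """Map gestational age in weeks to the study's four birth categories.

    Boundary handling (exactly 34, 37, or 42 weeks) is assumed,
    not specified by the study.
    """
    if weeks < 34:
        return "very preterm"
    elif weeks < 37:
        return "preterm"
    elif weeks <= 42:
        return "standard"
    else:
        return "post-term"


# Example: a birth at 35 weeks falls in the "preterm" group.
print(gestational_category(35))
```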

The mothers filled out a pair of questionnaires regarding the symptoms of their autistic children, and the results revealed very preterm, preterm and post-term autistic children had significantly higher screening scores for autism spectrum disorder than autistic children born full term.

“The findings point to the fact that although autism has a strong genetic component, something about pregnancy or the perinatal period may affect how autism manifests,” said Nigel Paneth, an MSU epidemiologist who worked with Movsas on the paper. “This adds to our earlier finding that prematurity is a major risk factor for autism spectrum disorder and may help us understand if anything can be done during early life to prevent or alleviate autism spectrum disorder.”

Source: Neuroscience News

Filed under science neuroscience psychology brain autism

1 note

Aging accelerates brain abnormalities in childhood onset epilepsy patients

April 2, 2012

New research confirms that childhood onset temporal lobe epilepsy has a significant impact on brain aging. Study findings published in Epilepsia, a peer-reviewed journal of the International League Against Epilepsy (ILAE), show age-accelerated ventricular expansion outside the normal range in this patient population.

According to the Centers for Disease Control and Prevention (CDC), epilepsy affects nearly 2 million Americans. Temporal lobe epilepsy is the most common form of partial epilepsy, with 60% of all patients having this form of the disease. Previous evidence suggests that patients with childhood onset epilepsy have significant cognitive and developmental deficiencies, which continue into adulthood, particularly in those resistant to antiepileptic drugs.

Prior imaging studies of patients with temporal lobe epilepsy have shown abnormalities in the structure of the hippocampus, the thalamus, and other subcortical structures, as well as in cortical and white matter volume. However, there is limited knowledge of the effects of aging on these structural changes.

To characterize differences in brain structure and patterns of age-related change, Dr. Bruce Hermann and colleagues from the University of Wisconsin-Madison recruited 55 patients with chronic temporal lobe epilepsy and 53 healthy controls for their study. Participants were between the ages of 14 and 60, with patients having a mean age of epilepsy onset in childhood or adolescence. Magnetic resonance imaging (MRI) was used to measure cortical thickness, area and volume in the brains of all subjects.

In participants with epilepsy, there were extensive abnormalities in brain structure, involving subcortical regions, cerebellum and cortical gray matter thickness and volume in the temporal and extratemporal lobes. Furthermore, researchers found that increasing chronological age was associated with progressive changes in cortical, subcortical and cerebellar regions for both epilepsy subjects and healthy controls. The pattern of change was similar for both groups, but epilepsy patients always showed more extensive abnormalities. In particular, epilepsy patients displayed age-accelerated expansion of the lateral and third ventricles. “The anatomic abnormalities in patients with epilepsy indicate a significant neurodevelopmental impact,” said Dr. Hermann.

"Patients with epilepsy are burdened with significant neurodevelopmental challenges due to these cumulative brain abnormalities," concludes Dr. Hermann. "The consequences of these anatomical changes for epilepsy patients as they progress into elder years remain unknown and further study of the adverse effects in those of older chronological age is needed."

Provided by Wiley

Source: medicalxpress.com

Filed under science neuroscience brain epilepsy psychology
