Neuroscience


Psychologists reveal how the brain performs 'motor chunking' tasks

June 12, 2012

You pick up your cell phone and dial the new number of a friend. Ten numbers. One. Number. At. A. Time. Because you haven’t actually typed the number before, your brain handles each button press separately, as a sequence of distinct movements.


This image shows identified brain regions linked to the parsing (left) and concatenation (right) processes involved in motor chunking. Trials with greater parsing showed increased activation of the left prefrontal and parietal cortex, and trials with greater concatenation showed increased activation of the putamen. Credit: Photo by Nicholas Wymbs

After dialing the number a few more times, you find yourself typing it out as a series of three successive bursts of movement: the area code, the first three numbers, the last four numbers. Those three separate chunks allow you to type the number faster, and with greater precision. Eventually, dialed often enough, the number is stored in your brain as one chunk. Who needs speed dial?

"You can think about a chunk as a rhythm," said Nicholas Wymbs, a postdoctoral researcher in UC Santa Barbara’s Department of Psychological and Brain Sciences, and the lead author of a new study on motor chunking in the journal Neuron, published by Cell Press. “We highlight the two-part process that seems to occur when we are chunking. This is demonstrated by the rhythm we use when typing the phone number: rapid bursts of finger movements that are interspersed by pauses.”

The rhythm is the human brain taking information and processing it in an efficient way, according to Wymbs. “On one level, the brain is going to try to divide up, or parse, long sequences of movement,” he said. “This parsing process functions to group or cluster movements in the most efficient way possible.”

But it is also in our brain’s best interest to assemble single or short strings of movements into longer, integrated sequences so that a complex behavior can be performed with as little effort as possible. “The motor system in the brain wants to output movement in the most computationally low-cost way possible,” Wymbs said. “With this integrative process, it’s going to try to bind as many individual motor movements into a fluid, uniform movement as it possibly can.”
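As a toy illustration (mine, not the study's), the two competing processes can be sketched on a phone number represented as a sequence of key presses: parsing splits the sequence into small chunks, while concatenation binds the chunks back into one fluid sequence.

```python
# Hypothetical sketch of the two competing processes described above;
# the function names and chunk sizes are illustrative, not from the study.

def parse(seq, sizes):
    """Parsing: split a long movement sequence into smaller chunks."""
    chunks, i = [], 0
    for size in sizes:
        chunks.append(seq[i:i + size])
        i += size
    return chunks

def concatenate(chunks):
    """Concatenation: bind chunks back into one continuous sequence."""
    return [press for chunk in chunks for press in chunk]

number = list("8055551234")            # ten key presses

# Early practice: area code, next three digits, last four digits.
early = parse(number, [3, 3, 4])
print(early)   # [['8', '0', '5'], ['5', '5', '5'], ['1', '2', '3', '4']]

# Dialed often enough, the number collapses back into a single chunk.
late = concatenate(early)
print(late == number)   # True
```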


This diagram illustrates how the subjects in the experiment used their left hands to respond to the “notes” on a button box. Credit: Illustration by Nicholas Wymbs

The two processes are at odds with each other, and it’s how the brain reconciles this struggle during motor learning that intrigues Wymbs and the study’s other authors, including Scott Grafton, professor of psychology and director of the UCSB Brain Imaging Center. “What we are interested in is functional plasticity of the brain –– how the brain changes when we learn actions, or motor sequences as we refer to them in this paper,” Wymbs said.

The study was conducted using human subjects in the Magnetic Resonance Imaging (MRI) scanner in the Brain Imaging Center. The experiment involved three days of training, with participants performing and practicing three separate motor sequences for up to 200 trials each while functional MRI data were collected. The subjects were all right-handed, but they were asked to learn the sequences using the four fingers of their left hands. Participants practiced the sequences while the scanner was running by tapping out responses on a button box that looked like a set of piano keys, with long, rectangular buttons.

"People would see a static image shown on a video screen that detailed the sequence to be typed out," Wymbs said. "They’re lying down inside the scanner and they see this image above their eyes. Interestingly, some people reported that the images looked like something out of (the video game) Guitar Hero, and, indeed, it does look a bit like guitar tablature. They would have to type out the ‘notes’ from left to right, as you normally would when reading music.

"After practicing a sequence for 200 trials, they would get pretty good at it," Wymbs added. "After a while, the note patterns become familiar. At the start of the training, it would take someone about four and a half seconds to complete each sequence of 12 button presses. By the end of the experiment, the average participant could produce the same sequence in under three seconds."
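A quick back-of-the-envelope check on those figures (press count and timings taken from the article):

```python
# Per-press timing implied by the numbers quoted above.
presses = 12          # button presses per sequence
before_s = 4.5        # seconds per sequence at the start of training
after_s = 3.0         # seconds per sequence by the end ("under three seconds")

per_press_before = before_s / presses
per_press_after = after_s / presses
speedup = 1 - after_s / before_s

print(f"{per_press_before:.3f} s/press -> {per_press_after:.3f} s/press")
# 0.375 s/press -> 0.250 s/press
print(f"about {speedup:.0%} faster overall")
# about 33% faster overall
```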

The researchers’ goal was to look at which areas of the brain support the two-part process of chunking. “We feel that the motor process, or the concatenation process as we refer to it in the paper, tends to take over as you continue to practice and continue to learn the sequences,” Wymbs said. “That’s the one that’s tied to the motor output system –– the thing that’s actually accomplishing what we set out to do.”

With the experience of repeating a motor sequence, such as typing out a phone number, speaking, typing on a computer, or even texting, it becomes more automatic. “With automaticity comes the recruitment of core motor output regions,” Wymbs said.

The scientists discovered that the putamen –– a brain region that is critically important to movement –– supports the concatenation process of motor chunking, with robust connectivity to parts of the brain that are intimately tied to the output of skilled motor behavior. On the other hand, they found that cortical regions in the left hemisphere respond more during the parsing process of motor chunking. “These regions have been linked to the manipulation of motor information, which is something that we probably do more of when we just begin to learn the sequences as chunks,” Wymbs said.

"Initially, when you’re doing one of these 12-element sequences, you want to pause,” Wymbs added. “That would evoke more of the parsing mechanism. But then, over time, as you learn a sequence and it becomes more automatic, the concatenation process takes over and wants to put all of these individual elements into a single fluid behavior.”

According to Wymbs, the findings could have implications for the study and diagnosis of Parkinson’s and other diseases of the motor system. “We show here that there are two potentially competing processes, supported by different brain systems, that both work to allow us to process things efficiently when we’re learning,” Wymbs said.

Provided by University of California - Santa Barbara

Source: medicalxpress.com

Jun 13, 2012 · 28 notes
#science #neuroscience #psychology #brain
Fruit Flies Reveal Mechanism Behind ALS-Like Disease

ScienceDaily (June 12, 2012) — Studying how nerve cells send and receive messages, Johns Hopkins scientists have discovered new ways that genetic mutations can disrupt functions in neurons and lead to neurodegenerative disease, including amyotrophic lateral sclerosis (ALS).


Neurons are shown in green. A normal neuron is on the left and a p150glued mutant neuron is on the right. The red cargo accumulates in the mutant but not in the normal neuron. Areas with the highest cargo accumulation appear yellow at the tip of the neuron. (Credit: Image courtesy of Johns Hopkins Medicine)

In a report published April 26 in Neuron, the research team says it has discovered that a mutation responsible for a rare, hereditary motor neuron disease called hereditary motor neuropathy 7B (HMN7B) disrupts the link between molecular motors and the nerve cell tip where they reside. The mutation produces a faulty protein that prevents material from being transported from the cell’s tip, located at the muscle, back toward the cell “body” in the central nervous system. In pinpointing how and where this cargo transport is disrupted, the scientists are now closer to understanding the mechanisms underlying this condition and ALS.

"An important question we need to answer is how defects in proteins that normally perform important cellular functions for neurons lead to disease," says Alex Kolodkin, Ph.D., a Howard Hughes Medical Institute Investigator and professor of neuroscience at the Johns Hopkins University School of Medicine. "A major issue in understanding neurodegenerative diseases is determining how certain proteins that are expressed in all types of neurons, or even in all cells in the body, can lead to devastating effects in one, or a few, subsets of neurons." Kolodkin notes that many neurodegenerative diseases involve proteins that serve general functions required in nearly every type of cell in the body, including the transport of material between different parts of a cell, yet certain alterations in these proteins can result in specific neurological disorders.

One particular protein, p150glued, is known to play a role in at least two of these disorders, HMN7B, which is similar to ALS, and Perry syndrome, which leads to symptoms similar to Parkinson’s disease. p150glued is part of a larger complex of proteins that forms a molecular “motor” capable of transporting various molecules and other “cargo” from the nerve end toward the cell body. To better understand how mutations in p150glued lead to HMN7B and Perry syndrome, the researchers turned to fruit flies, which are easy to genetically manipulate and where the same protein has been well studied.

They engineered the fruit fly p150glued protein to contain the same mutations as those implicated in the two diseases, and used microscopy techniques that enabled them to follow, in live cells, the movement of fluorescently tagged cargo along motor neurons.

They found, surprisingly, that the movement of cargo along the length of the cell was normal. However, at the far end of the cell, they found that the HMN7B-associated mutation caused an unusually large accumulation of cargo. “This was an unexpected finding,” says Thomas Lloyd, M.D., Ph.D., an assistant professor in neurology and neuroscience at the Johns Hopkins School of Medicine. “We need to better understand how this is causing disease.”

Using flies engineered to contain mutations in other motor proteins, and again watching cargo transport in live cells, the team found that p150glued works in concert with another motor to control cargo transport. Their results suggest that when p150glued is compromised, this control is lost and cargo accumulates at the nerve end, leading to disease.

"It’s still unclear how these two different mutations in different regions of the same protein cause very distinct neurodegenerative diseases," Lloyd says. Encouraged by their results, the team plans to continue using fruit flies to unravel the molecular mechanisms underlying these diseases.

Source: Science Daily

Jun 13, 2012 · 12 notes
#science #neuroscience #psychology #neuron
Kill the Germs, Spare the Ears: Encouraging Study Shows How

ScienceDaily (June 11, 2012) — The world needs new antibiotics to overcome the ever increasing resistance of disease-causing bacteria — but it doesn’t need the side effect that comes with some of the most powerful ones now available: hearing loss. Today, researchers report they have developed a new approach to designing antibiotics that kill even “superbugs” but spare the delicate sensory cells of the inner ear.


These delicate hair cells from the inner ear of mice were tested to see the effects of powerful antibiotics on structures that are crucial to hearing. At left, cells that were exposed to the antibiotic gentamicin showed signs of high levels of damaging free radicals (seen in green). But cells treated with the veterinary drug apramycin, shown at right, didn’t show these effects — adding to evidence that this drug could be used to treat humans without damaging hearing. (Credit: University of Michigan, Schacht laboratory)

Surprisingly, they have found that apramycin, an antibiotic already used in veterinary medicine, fits this bill — setting the stage for testing in humans.

In a paper published online in the Proceedings of the National Academy of Sciences, a team from Switzerland, England and the University of Michigan shows apramycin’s high efficacy against bacteria, and low potential for causing hearing loss, through a broad range of tests in animals. That testing platform is now being used to evaluate other potential antibiotics that could tackle infections such as multidrug-resistant tuberculosis.

The research aims to overcome a serious limitation of aminoglycoside antibiotics, a class of drugs which includes the widely used kanamycin, gentamicin and amikacin.

While great at stopping bacterial infections, these drugs also cause permanent partial hearing loss in 20 percent of people who take them for a short course, and up to 100 percent of people who take them over months or years, for example to treat tuberculosis or lung infections in cystic fibrosis.

U-M researcher Jochen Schacht, Ph.D., a professor of biological chemistry and otolaryngology and director of the Kresge Hearing Research Institute at the U-M Medical School, has spent decades studying why these drugs cause this “ototoxicity” — a side effect that makes doctors hesitant to prescribe them. Hearing damage has also caused patients to discontinue treatment before their antibiotic prescription is over, potentially allowing drug-resistant strains of bacteria to flourish.

Schacht has found that the drugs produce damaging free radicals inside the hair cells of the inner ear. Hair cells, named for the tiny sound-sensing hairs on their surface, are the linchpin of hearing — and once destroyed, cannot be regrown.

In the new paper, Schacht and his research group joined teams led by University of Zurich microbiologist Erik Böttger, and structural biologist and Nobel Prize winner Venkatraman Ramakrishnan of England’s Medical Research Council Laboratory of Molecular Biology, as well as scientists from ETH Zurich. Each team brought its particular expertise to the issue, and after four years of work they developed and tested this new approach to designing antibiotics.

"Aminoglycosides are some of the most valuable broad-spectrum antibiotics and indispensable drugs today, but we need new options to combat drug-resistant bacteria. Importantly, we must find ways to overcome their ototoxicity," Schacht says. "Instead of the trial-and-error approach of the past, this new hypothesis-driven tactic allows us to design drugs with simultaneous attention toward both antibacterial action and impact on hair cells."

According to the World Health Organization, about 440,000 new cases of multidrug-resistant tuberculosis emerge annually, causing at least 150,000 deaths worldwide. Aminoglycoside antibiotics, while carefully controlled in the U.S., Europe, and other developed countries, are available over the counter in many developing nations, leading to overuse that makes it even easier for drug-resistant strains of many kinds of bacteria to emerge and spread.

The new paper outlines a rational approach to designing drugs to combat this threat without ototoxicity, based on a theoretical framework that emerged from the work of the three laboratories and centers on the role of ribosomes, the structures inside the cell that “read” messenger RNA and translate the genetic message into proteins. Böttger’s lab, at the Institut für Medizinische Mikrobiologie, which he directs, studies aminoglycoside effects on mitochondrial ribosomes and antibacterial activity with an eye toward designing new aminoglycosides. Ramakrishnan’s lab studies ribosome structure, and partners from ETH Zurich also collaborated.

Aminoglycosides bind to the ribosomes inside bacterial cells, blocking their ability to produce proteins. But while the drugs spare most human ribosomes, they can also attach to the ribosomes inside mitochondria, which resemble bacterial ribosomes.

Consistent with U-M-generated theories about ototoxicity, the drugs then cause the production of free radicals in such quantities that they overwhelm the hair cells’ defense mechanisms — destroying the cells and causing hearing loss.

The team’s approach is to design drugs that more specifically target bacterial ribosomes over mitochondrial ribosomes, simultaneously testing the impact on hair cells as well as the ability to kill bacteria. In this way, the researchers try to avoid creating antibiotics that harm hearing.

They are already using the platform employed for this study — which involves cells from mouse ears, and tests of hearing and hair cell damage in guinea pigs — to test other promising novel drugs synthesized based on the theoretical framework that was driving the current research.

Meanwhile, the team hopes to launch a clinical trial of apramycin, an antibiotic that could prove immediately useful because multidrug-resistant TB and lung-infecting bacteria have not shown resistance to the drug yet.

The research also lends more evidence to support the use of antioxidants to protect the hearing of patients taking current aminoglycoside antibiotics. Schacht has already led a clinical trial in China that showed a major reduction in hearing loss if aspirin was given at the same time as aminoglycoside antibiotics. “This kind of protection is important, while we search for the long-term answer to drug resistance without ototoxicity,” he says.

Source: Science Daily

Jun 13, 2012 · 5 notes
#science #neuroscience #hearing #psychology
Scientists identify brain area that determines distance from which sound originates

June 11, 2012

Researchers at the Martinos Center for Biomedical Imaging at Massachusetts General Hospital have identified a portion of the brain responsible for determining how far away a sound originates, a process that does not rely solely on how loud the sound is. The investigators’ report, which will appear in the early edition of the Proceedings of the National Academy of Sciences, is receiving early online release this week.


This is an image of human cerebral cortex, digitally “inflated” to smooth out normal folds and ridges, showing in red the portion of auditory cortex that responds to the distance from which sounds arrive. Credit: Jyrki Ahveninen, Ph.D., Martinos Center for Biomedical Imaging, Massachusetts General Hospital

"Although sounds get louder when the source approaches us, humans are able to discriminate between loud sounds that come from far away and softer sounds from a closer source, suggesting that our brains use distance cues independent of loudness," says Jyrki Ahveninen, PhD, of the Martinos Center, senior author of the PNAS report. "Using functional MRI we found a group of neurons in the auditory cortex sensitive to the distance of sound sources and different from those that process changes in loudness. In addition to providing basic scientific information, our results could help future studies of hearing disorders.”

The human brain has distinct areas for processing sensory information – the signals responsible for vision, hearing, taste, and so on. Studies of the visual cortex, located at the back of the brain, have produced detailed maps of the areas handling particular portions of the visual field. But understanding of the auditory cortex, located on the side of the head above and behind the ear, is quite limited. While it is known that the portion of the auditory cortex extending toward the back of the head determines where a sound comes from, exactly how the brain translates complex auditory signals to determine both the location and distance from which a sound originates is not yet known.

In their search for auditory neurons that process sound distance, the research team faced some particular challenges. In research laboratories that study hearing, sounds must be delivered to study participants through headphones, which means the acoustical “space” in which a sound is generated must be simulated. This must be done with exquisite accuracy, since environmental aspects causing sound to reverberate probably contribute to distance perception. Since the MRI equipment itself generates a loud noise, the researchers scanned participants’ brains once every 12 seconds to measure responses to sounds presented during intervening quiet periods.

In the first experiment, study participants – 12 adults with normal hearing – listened to a series of paired sounds of varying degrees of loudness and at simulated distances ranging from 15 to 100 cm and were asked to indicate whether the second sound was closer or farther away than the first. Although the differences in loudness varied randomly, participants were quite accurate in distinguishing the simulated distances of the sounds. Acoustical analysis of the particular sound cues presented indicated that the reverberations produced by a sound, which are more pronounced in a closed environment and for sounds traveling farther, may be more important distance cues than are the differences between sounds perceived by a participant’s two ears.

After the first experiment confirmed the accuracy of the simulated acoustical environment, functional MR images taken while participants listened to another series of paired sounds recorded how activity in the auditory cortex changed in response to sounds of varying loudness and distance, as well as to sounds of constant level and to silence. The resulting images identified a small area that appears to be sensitive to cues indicating distance but not loudness. As far as the investigators know, this is the first time neurons sensitive to sound-source distance have been identified.

"The identified area is located near other auditory cortical areas that process spatial information," says corresponding author Norbert Kopco, PhD. "This is consistent with a general model of perceptual processing in the brain, suggesting that in hearing, as in vision and other senses, spatial information is processed separately from information about the object’s identity or characteristics such as the musical pitch of sound. Our study also illustrates how important it is to combine expertise from different fields – in our case imaging/physiology, psychology, and computational neuroscience – to advance our understanding of such a complex system as the human brain.”

Provided by Massachusetts General Hospital

Source: medicalxpress.com

Jun 13, 2012 · 16 notes
#science #neuroscience #brain #psychology #hearing
New Molecules Important for Vision and Brain Function Identified

ScienceDaily (June 11, 2012) — In a pair of related studies, scientists from the Florida campus of The Scripps Research Institute have identified several proteins that help regulate cells’ response to light and that are linked to the development of night blindness, a rare disease that abolishes the ability to see in dim light.

In the new studies, published recently in the journals Proceedings of the National Academy of Sciences (PNAS) and The Journal of Cell Biology, Scripps Florida scientists were able to show that a family of proteins known as Regulator of G protein Signaling (RGS) proteins plays an essential role in vision in a dim-light environment.

"We were looking at the fundamental mechanisms that shape our light sensation," said Kirill Martemyanov, a Scripps Research associate professor who led the studies. "In the process, we discovered a pair of molecules that are indispensable for our vision and possibly play critical roles in the brain."

In the PNAS study, Martemyanov and his colleagues identified a pair of regulator proteins known as RGS7 and RGS11 that are present specifically in the main relay neurons of the retina, called the ON-bipolar cells. “The ON-bipolar cells provide an essential link between the retinal light detectors (the photoreceptors) and the neurons that send visual information to the brain,” explained Martemyanov. “Stimulation with light excites these neurons by opening the channel that is normally kept shut by the G proteins in the dark. RGS7 and RGS11 facilitate the G protein inactivation, thus promoting the opening of the channel and allowing the ON-bipolar cells to transmit the light signal. It really takes a combined effort of two RGS proteins to help the light overcome the barrier for propagating the excitation that makes our dim vision possible.”

In the Journal of Cell Biology study, Martemyanov and his colleagues unraveled another key aspect of the RGS7/RGS11 regulatory response — they identified a previously unknown pair of orphan G protein-coupled receptors (GPCRs) that interact with these RGS proteins and dictate their biological function.

GPCRs are a large family of more than 700 proteins, which sit in the cell membrane and sense various molecules outside the cell, including odors, hormones, neurotransmitters, and light. After binding these molecules, GPCRs trigger the appropriate response inside the cell. However, for many GPCRs the activating molecules have not yet been identified and these are called “orphan” receptors.

The Martemyanov group has found that two orphan GPCRs — GPR158 and GPR179 — recruit RGS proteins and thus help serve as brakes for the conventional GPCR signaling rather than play an active signaling role.

In the case of retinal ON-bipolar cells, GPR179 is required for the correct localization of RGS7 and RGS11. Their mistargeting in animal models lacking GPR179 or in human patients with mutations in the GPR179 gene may account for their night blindness, according to the new study. Intriguingly, in the brain GPR158 appears to play a similar role in localizing RGS proteins, but instead of contributing to vision, it helps RGS proteins regulate the μ-opioid receptor, a GPCR that mediates the pleasurable and pain-killing effects of opioids.

"We are really at the very beginning of unraveling this new biology and understanding the role of the newly discovered orphan receptors GPR158 and GPR179 in the regulation of neurotransmitter signaling in the brain and retina," Martemyanov said. "The hope is that better understanding of these new molecules will lead to the design of better treatments for addictive disorders, pain, and blindness."

Source: Science Daily

Jun 13, 2012 · 13 notes
#science #neuroscience #brain #psychology #proteins
New Stroke Treatment Could Prevent and Reduce Brain Damage

ScienceDaily (June 11, 2012) — Researchers at the University of Missouri have demonstrated the effectiveness of a potential new therapy for stroke patients in an article published in the journal Molecular Neurodegeneration. Created to target a specific enzyme known to affect important brain functions, the new compound being studied at MU is designed to stop the spread of brain bleeds and protect brain cells from further damage in the crucial hours after a stroke.


In a model of induced stroke in mice, MU researchers have shown the success of a treatment in stopping further bleeding in the brain after a stroke (above). The outlined area shows the stroke damage. (Credit: Image courtesy of University of Missouri School of Medicine)

Stroke is a leading cause of death in the U.S., with more than 800,000 deaths occurring each year from stroke and other cardiovascular events. Other than surgery, existing emergency treatments for stroke victims, such as the use of a tissue plasminogen activator (tPA), must be administered within hours of stroke onset because of the risk of brain hemorrhaging. The injectable medication can be used only to treat the most common type of stroke, ischemic stroke, which occurs when blood clots block blood flow to the brain.

"For a stroke victim, time is a matter of life and death. While we are still in the research phase for this type of compound, we believe it could be combined with tPA in the future to buy ischemic stroke patients a longer window of time to receive emergency treatment," said Zezong Gu, MD, PhD, the article’s corresponding author and assistant professor of pathology and anatomical sciences at the MU School of Medicine. The new compound being studied also has potential for use in patients experiencing hemorrhagic stroke, which is a less common type of stroke caused by bleeding within the brain, Gu said.

MU researchers collaborated with a team at the University of Notre Dame to study the effects of the new compound, a thiirane class of gelatinase selective inhibitors, on the function of a type of matrix metalloproteinase (MMP) enzyme, particularly MMP-9. MMP-9 is part of a group of more than 20 enzymes or MMPs that are known to contribute to many key pathological events in the brain after stroke, traumatic brain injury and other neurodegenerative events.

In 2005, Gu served as a lead author on a research paper published in the Journal of Neuroscience that identified MMP-9 as a promising target for development of therapeutic drugs for stroke patients. Since then, his lab at MU medical school’s Center for Translational Neuroscience has been studying the function of MMP enzymes and how to inhibit the harmful effects of MMP-9.

"MMPs play a role in the structure of blood vessels in the brain and are also needed in the interactions between cells during development and tissue remodeling," Gu said. "Unregulated, the activity of these enzymes contributes to neurological disorders and stroke. With this compound, we’ve now confirmed a potential method to rescue the blood vessels from the damaging effects of MMP-9 and protect neurons at the same time."

MU researchers successfully used a model of ischemic stroke in mice and studied the effects of the MMP-9 inhibitor compound on brain activity after a stroke.

"Our lab at the Center for Translational Neuroscience is one of only a few in the United States that has successfully induced a blood clot in the brains of mice," said Jiankun Cui, MD, the article’s lead author and assistant professor of pathology and anatomical sciences at the MU School of Medicine. "To be able to study the effectiveness of this potential new treatment under these conditions provides us with a highly unique set of data showing this compound can disrupt key harmful pathological events that occur after a stroke."

Source: Science Daily

Jun 12, 2012 · 3 notes
#science #neuroscience #brain #psychology #stroke
Molecular Imaging Finds Link Between Low Dopamine Levels and Aggression

ScienceDaily (June 11, 2012) — Out of control competitive aggression could be a result of a lagging neurotransmitter called dopamine, say researchers presenting a study at the Society of Nuclear Medicine’s 2012 Annual Meeting. During a computer game against a putative cheating adversary, participants who had a lower capacity to synthesize this neurotransmitter in the brain were more distracted from their basic motivation to earn money and were more likely to act out with aggression.


Out of control competitive aggression could be a result of a lagging neurotransmitter called dopamine, say researchers. During a computer game against a putative cheating adversary, participants who had a lower capacity to synthesize this neurotransmitter in the brain were more distracted from their basic motivation to earn money and were more likely to act out with aggression. (Credit: © lassedesignen / Fotolia)

For many people, anger is an almost automatic response to life’s challenges. In clinical psychiatry, scientists look at not only the impact of aggressive behavior on the individual, their loved ones and the community but also the triggers in the brain that lead to aggressive response. The neurobiology of aggression is not well understood, but scientists are aware of a relationship between the neurotransmitter serotonin and certain aggressive behaviors. The objective of this study was to explore whether higher levels of another brain chemical called dopamine, involved in pleasure and reward, increased aggressive response in its subjects. To scientists’ surprise, it was not as they first theorized.

"The results of this study were astonishingly opposite of what was previously hypothesized," says Ingo Vernaleken, M.D., lead author of the study and research scientist for the department of psychiatry at RWTH Aachen University in Aachen, Germany. "Subjects with more functional dopaminergic reward-systems were not more aggressive in competitive situations and could concentrate even more on the game. Subjects with lower dopaminergic capacity were more likely to be distracted by the cheating behavior."

In this study, 18 healthy adults in their twenties were tested for aggression using the psychological behavioral task known as the point subtraction aggression paradigm (PSAP). Participants were asked to play a computer game that required them to press a bar multiple times for the chance to win money, but they were also told that an adversary in the next room, who was able to cheat, might steal some of their winnings. What the participants did not know was that there was no adversary: the computer program was designed to make randomized deductions from their monetary reward to simulate a cheating competitor. Participants had three ways to react: punish the cheater, shield against the adversary by repeatedly pressing a defense button, or continue playing the game in order to maximize their winnings, a response that indicated resilience.
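The logic of a PSAP-style session can be sketched in a few lines of code. The payoff per press sequence, the trial count and the deduction probability below are illustrative assumptions, not the study's actual parameters:

```python
import random

# Illustrative sketch of a PSAP-style trial loop (all parameters are
# hypothetical, not those used in the study).
REWARD_PER_COMPLETION = 0.10   # assumed payoff per completed press sequence
STEAL_PROBABILITY = 0.2        # assumed chance of a simulated "cheat" per trial

def run_psap(n_trials, choose_response, seed=0):
    """Simulate trials; choose_response maps a 'was_provoked' flag to
    one of 'earn', 'punish', or 'defend'."""
    rng = random.Random(seed)
    earnings = 0.0
    counts = {"earn": 0, "punish": 0, "defend": 0}
    provoked = False
    for _ in range(n_trials):
        response = choose_response(provoked)
        counts[response] += 1
        if response == "earn":
            earnings += REWARD_PER_COMPLETION
        # A randomized deduction simulates the nonexistent cheating adversary;
        # defending on that trial shields the winnings.
        provoked = rng.random() < STEAL_PROBABILITY
        if provoked and response != "defend":
            earnings = max(0.0, earnings - REWARD_PER_COMPLETION)
    return earnings, counts

# A "resilient" strategy ignores provocation and keeps earning.
total, tally = run_psap(100, lambda provoked: "earn")
```

The researchers' aggression measure compares how often a participant chooses the punish or defend options over simply continuing to earn.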

"The PSAP focuses on aggressive reaction within a competitive situation," says Vernaleken. "Aggression and its neurobiological mechanisms in humans have been only moderately investigated in the past. Furthermore, most of the previous studies mainly covered the more reactive part of aggression, which merely reflects impulsive behavior and appears to be associated merely with the serotonin system. This investigation focuses on the association with the dopaminergic reward-system, which reflects goal-directed aggression."

Subjects’ brains were imaged using positron emission tomography, which provides a range of information about physiological functions inside the body, depending on the imaging probe used. In this investigation, the researchers used F-18 FDOPA, a biomarker that reveals the activity of the enzymes that synthesize dopamine, and analyzed its uptake in the brain to gauge the correlation between the participants’ dopamine synthesis capacity and their aggressive behavior.

Results of the study showed a significant relationship between dopamine synthesis capacity and aggressive response, especially in the basal ganglia, which, among other functions, house the brain’s motivation center. Reduced aggression was associated with higher dopamine synthesis in both the midbrain and the striatum, which plays a role in planning and executive function. People with a greater capacity for dopamine synthesis were more invested in the monetary-reward aspect of the PSAP, rather than acting in defense of or with aggression against their perceived adversary, whereas subjects with lower capacities were more likely to react aggressively, defensively, or both.

"Thus, we think that a well-functioning reward system causes more resilience against provocation," says Vernaleken. "However, we cannot exclude that in a situation where the subject would directly profit from aggressive behavior, in absence of alternatives, the correlation might be the other way around."

Further research is required to explore the link between dopamine and a range of aggressive behavior. More insight into these relationships could potentially lead to new psychological therapies and drug treatments to moderate or prevent aggressive response.

Source: Science Daily

Jun 12, 2012
#science #neuroscience #brain #psychology #aggression #dopamine
Molecular Imaging Detects Signs of Alzheimer's in Healthy Patients

ScienceDaily (June 11, 2012) — An arsenal of Alzheimer’s research revealed at the Society of Nuclear Medicine’s 59th Annual Meeting indicates that beta-amyloid plaque in the brain not only is involved in the pathology of Alzheimer’s disease but may also precede even mild cognitive decline. These and other studies advance molecular imaging for the early detection of beta-amyloid, for which one product is now approved in the United States, as a major push forward in the race for better treatments.

"Diagnosis of Alzheimer’s disease can now be made when the patient first presents symptoms and still has largely preserved mental function," says Christopher Rowe, M.D., a lead investigator for the Australian Imaging, Biomarkers and Lifestyle study of aging (AIBL) and professor of nuclear medicine at Austin Hospital in Melbourne, Australia. "Previously there was an average delay of three years between consulting a doctor over memory concerns and the diagnosis of Alzheimer’s, as the diagnosis required the presence of dementia. When used as an adjunct to other diagnostic measures, molecular imaging can help lead to earlier diagnosis. This may give the patient several years to prepare for dementia while they still have control over their destiny."

According to the World Health Organization, Alzheimer’s disease affects an estimated 18 million people worldwide, and the number of people affected is expected to nearly double to 34 million by the year 2025. The National Institute on Aging estimates that as many as 50 percent of Americans aged 85 or older are affected.

Alzheimer’s disease is a chronic and currently incurable neurodegenerative disease. Beta-amyloid burden can begin to build in the brain several years, if not more than a decade, before an individual shows any sign of dementia. Those who go on to develop Alzheimer’s disease not only lose their ability to remember their loved ones but also have difficulty with essential bodily functions such as breathing and swallowing in the late stages of disease.

In one study, researchers used a molecular imaging technique called positron emission tomography (PET), which images physiological patterns in the body. PET was combined with an imaging agent called F-18 florbetaben, which binds to amyloid in the brain. This and other PET agents are drawn to targets in the body and emit a positron signal that is picked up by a scanner. Here molecular imaging was performed in conjunction with clinical and neuropsychological testing in order to better understand the long-term effects of beta amyloid plaques in the brains of older individuals with mild cognitive impairment. Of the 45 subjects in the study, those who showed high levels of imaging agent binding together with atrophy of the hippocampus, the brain’s memory center, had an 80 percent chance of developing Alzheimer’s disease within two years, researchers said.

"Molecular imaging is proving to be an essential part of Alzheimer’s disease detection," says Rowe. "This and other amyloid imaging techniques will have an increasing role in the earlier and more accurate diagnosis of neurodegenerative conditions such as Alzheimer’s disease due to their ability to measure the actual underlying disease process."

Another AIBL study included 194 healthy participants, 92 people with mild cognitive impairment and 70 subjects with Alzheimer’s disease, and used another imaging agent called C-11 PiB (Pittsburgh compound B) with PET to gauge amyloid burden in the brain. Researchers showed that, in this study group, widespread amyloid plaque build-up preceded cognitive impairment, and those with extensive amyloid burden were at higher risk of cognitive decline.

This and another study mark two of the first of their kind to focus on beta amyloid in healthy subjects. In the other study, 137 adults with normal cognitive function, aged 30 to 89 years, were imaged using PET with F-18 florbetapir (now FDA-approved for the detection of beta amyloid plaques) as well as functional magnetic resonance imaging, in order to explore how amyloid build-up affects connections in specific areas of the brain involved in cognition, namely the default mode and salience networks, which are responsible for different states of wakeful rest and alertness. Those with increased amyloid burden in these neural networks were prone to impaired cognitive performance.

"The effect of beta amyloid in healthy aging is of great interest since this protein is strongly associated with Alzheimer’s disease and may be predictive of the transition from mild cognitive impairment to Alzheimer’s disease," says Michael Devous, Sr., Ph.D., director of neuroimaging at the Alzheimer’s Disease Center at UT Southwestern Medical Center in Dallas, Texas. "Less is known about its impact on cognition in otherwise healthy aging individuals. In addition, brain connectivity in these areas is thought to be sensitive to early changes in brain function caused both by aging itself and by disease processes such as Alzheimer’s disease."

Another study assessed the PET imaging agent C-11 PiB for its ability to detect amyloid plaque in comparison to another imaging agent, F-18 fluorodeoxyglucose (F-18 FDG). The latter acts like glucose, the brain’s primary energy source, to map out the metabolic functioning of the brain. Results of the study showed C-11 PiB amyloid imaging to be a better means of evaluating amyloid patterns in the brain than F-18 FDG imaging. In addition, of the 100 healthy participants, 15 percent were shown to have some amyloid build-up when molecular imaging was performed.

"We are using state-of-the-art, noninvasive PET and MRI technologies to look at some of the earliest developments of Alzheimer’s disease onset in the brains of normal middle-aged people," says Guofan Xu, M.D., Ph.D., lead author of the study and research scientist at the department of nuclear medicine and radiology at the University of Wisconsin-Madison. "With this we can evaluate whether pathological changes associated with Alzheimer’s disease are happening many years before onset of significant clinical symptoms."

No treatments are currently available to cure or prevent Alzheimer’s disease. With advances in molecular imaging to detect beta amyloid plaques, researchers have an important new tool that may bring the medical community one step closer to making therapies and vaccines a reality for the disease.

Source: Science Daily

Jun 12, 2012
#science #neuroscience #brain #psychology #alzheimer
Chinese mindfulness meditation prompts double positive punch in brain white matter

June 11, 2012

Scientists studying the Chinese mindfulness meditation known as integrative body-mind training (IBMT) say they’ve confirmed and expanded their findings on changes in structural efficiency of white matter in the brain that can be related to positive behavioral changes in subjects practicing the technique regularly for a month.

In a paper appearing this week in the online Early Edition of the Proceedings of the National Academy of Sciences, scientists Yi-Yuan Tang and Michael Posner report improved mood changes coincided with increased axonal density — more brain-signaling connections — and an expansion of myelin, the protective fatty tissue that surrounds the axons, in the brain’s anterior cingulate region.

Deficits in activation of the anterior cingulate cortex have been associated with attention deficit disorder, dementia, depression, schizophrenia and many other disorders.

IBMT was adapted from traditional Chinese medicine in the 1990s in China, where it is practiced by thousands of people. It differs from other forms of meditation because it depends heavily on the inducement of a high degree of awareness and balance of the body, mind and environment. The meditative state is facilitated through training and trainer-group dynamics, harmony and resonance.

In 2010, research led by Tang, a visiting research professor at the University of Oregon, and Michael I. Posner, professor of psychology at the UO, first reported positive structural changes in brain connectivity, based on functional magnetic resonance imaging, that correlated to behavioral regulation. The study was done in the UO’s Robert and Beverly Lewis Center for Neuroimaging with 45 participating UO undergraduate students.

The new findings came from additional scrutiny of the 2010 study and another that involved 68 undergraduate students at China’s Dalian University of Technology. The researchers revisited data obtained from using an MRI technique known as diffusion tensor imaging. The research team found improved density of the axons involved in brain connections but no change in myelin formation after two weeks. After a month, or about 11 hours of IBMT, both increases in axon density and myelin formation were found as measured by fractional anisotropy, axial diffusivity and radial diffusivity — the important indexes for measuring the integrity of white matter fibers.
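For readers curious about the measures, fractional anisotropy and the axial and radial diffusivities are standard quantities derived from the three eigenvalues of the diffusion tensor at each voxel. The sketch below uses typical illustrative eigenvalues, not data from the study:

```python
import math

def dti_metrics(l1, l2, l3):
    """Compute standard white-matter indexes from the diffusion tensor's
    eigenvalues (l1 >= l2 >= l3)."""
    md = (l1 + l2 + l3) / 3.0                      # mean diffusivity
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    fa = math.sqrt(1.5 * num / den)                # fractional anisotropy, 0..1
    return {"FA": fa, "AD": l1, "RD": (l2 + l3) / 2.0}

# Illustrative healthy white-matter eigenvalues (units of 10^-3 mm^2/s).
m = dti_metrics(1.7, 0.4, 0.3)
```

Higher FA generally indicates more coherent, better-myelinated fiber bundles, which is why these indexes serve as proxies for white-matter integrity.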

"This dynamic pattern of white matter change involving the anterior cingulate cortex, a part of the brain network related to self-regulation, could provide a means for intervention to improve or prevent mental disorders," the authors concluded.

"When we got the results, we all got very excited because all of the other training exercises, like working-memory training or computer-based training, only have been shown to change myelination," Tang said. "We believe these changes may be reflective of the time of training involved in IBMT. We found a different pattern of neural plasticity induced by the training."

"This study gives us a much more detailed picture of what it is that is actually changing," Posner said. "We did confirm the exact locations of the white-matter changes that we had found previously. And now we show that both myelination and axon density are improving. The order of changes we found may be similar to changes found during brain development in early childhood, allowing a new way to reveal how such changes might influence emotional and cognitive development.”

The improved mood changes noted in this and earlier studies are based on self-ratings of subjects based on a standard six-dimensional mood-state measure, said Tang, who is now the director of Texas Tech University’s Neuroimaging Institute and holder of the Presidential Endowed Chair in Neuroscience in TTU’s psychology department.

Tang and Posner first reported findings related to IBMT in 2007, also in PNAS. They found that doing IBMT for five days prior to a mental math test led to low levels of the stress hormone cortisol among Chinese students. The experimental group also showed lower levels of anxiety, depression, anger and fatigue than students in a relaxation control group.

In 2009 in PNAS, Tang and his Chinese colleagues, with assistance from Posner and UO psychology professor Mary K. Rothbart, found that IBMT subjects in China had increased blood flow in the right anterior cingulate cortex after receiving training for 20 minutes a day over five days. Compared with the relaxation group, IBMT subjects also had lower heart rates and skin conductance responses, increased belly breathing amplitude and decreased chest respiration rates.

"These new findings provide fundamental new insights on how the brain responds in positive ways to new inputs and reflect the excellence in cognitive neuroscience research that has defined Michael Posner’s work at the University of Oregon," said Kimberly Andrews Espy, vice president for research and innovation. "The research by professors Posner and Tang also reflects the university’s long-running commitment to collaborate with institutions in Pacific Rim countries."

Provided by University of Oregon

Source: medicalxpress.com

Jun 12, 2012
#science #neuroscience #brain #psychology
Keeping pace: Walking speed may signal thinking problems ahead

June 11, 2012

A new study shows that changes in walking speed in late life may signal the early stages of dementia known as mild cognitive impairment (MCI). The research is published in the June 12, 2012, print issue of Neurology, the medical journal of the American Academy of Neurology.

"In our study, we used a new technique that included installing infrared sensors in the ceilings of homes, a system designed to detect walking movement in hallways,” said study author Hiroko Dodge, PhD, with Oregon Health and Science University in Portland and a member of the American Academy of Neurology. “By using this new monitoring method, we were able to get a better idea of how even subtle changes in walking speed may correlate with the development of MCI.”

The study involved 93 people age 70 or older who lived alone. Of those, 54 participants had no cognitive impairment, 31 had non-memory related MCI and eight had memory-related MCI. Participants were given memory and thinking tests and had their walking speed monitored at their homes unobtrusively over a three-year period. Participants were placed in groups of slow, moderate or fast based on their average weekly walking speed and how much their walking speed fluctuated at home.
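The grouping step can be illustrated with a short sketch. The participant IDs, speeds and tertile cut-offs below are hypothetical, since the study's sensor data and exact criteria are not reproduced here:

```python
import statistics

# Hypothetical weekly average walking speeds (cm/s) per participant.
weekly_speeds = {
    "P01": [62, 60, 65], "P02": [95, 99, 97], "P03": [78, 80, 76],
    "P04": [55, 58, 54], "P05": [102, 98, 101], "P06": [84, 81, 83],
}

# Each participant's overall mean speed, and week-to-week fluctuation.
means = {pid: statistics.mean(v) for pid, v in weekly_speeds.items()}
fluctuation = {pid: statistics.stdev(v) for pid, v in weekly_speeds.items()}

# Tertile cut points over the group's mean speeds.
cut_low, cut_high = statistics.quantiles(list(means.values()), n=3)

def speed_group(mean_speed):
    if mean_speed <= cut_low:
        return "slow"
    elif mean_speed <= cut_high:
        return "moderate"
    return "fast"

groups = {pid: speed_group(m) for pid, m in means.items()}
```

Both the mean speed (the group label) and the fluctuation measure feed into the analysis, mirroring the two walking variables the study tracked.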

The study found that people with non-memory related MCI were nine times more likely to be slow walkers than moderate or fast walkers and the amount of the fluctuation in walking speed was also associated with MCI.

"Further studies need to be done using larger groups of participants to determine whether walking speed and its fluctuations could be a predictor of future memory and thinking problems in the elderly,” said Dodge. “If we can detect dementia at its earliest phases, then we can work to maintain people’s independence, provide treatments and ultimately develop ways to prevent the disease from developing. Our in-home monitoring approach has a lot of potential to be used for sustaining independence of the elderly.”

Provided by American Academy of Neurology

Source: medicalxpress.com

Jun 12, 2012
#science #neuroscience #brain #psychology
Treating Childhood Anxiety With Computers, Not Drugs

ScienceDaily (June 11, 2012) — According to the Anxiety and Depression Association of America, one in eight children suffers from an anxiety disorder. And because many anxious children turn into severely anxious adults, early intervention can have a major impact on a patient’s life trajectory. The understandable reluctance to use psychiatric medications when it comes to children means child psychologists are always searching for viable therapeutic alternatives.

Now Prof. Yair Bar-Haim of Tel Aviv University’s School of Psychological Sciences and his fellow researchers are pursuing a new method to address childhood anxiety. Based on a computer program, the treatment uses a technique called Attention Bias Modification (ABM) to reduce anxiety by drawing children away from their tendency to dwell on potential threats, ultimately changing their thought patterns. In its initial clinical trial, the program was as effective as medication and cognitive therapy for children — with several distinct advantages.

The results of the trial were reported in the American Journal of Psychiatry.

Computers instead of capsules

Children are comfortable with computers, explains Prof. Bar-Haim. And because of the potential side effects of medications or the difficulty in obtaining cognitive behavioral therapy, such as the need for highly trained professionals, it is good to have an alternative treatment method. ABM treatments can be disseminated over the Internet or administered by personnel who don’t have to be Ph.D.s. “This could be a game-changer for providing treatment,” he says.

Anxious individuals have a heightened sensitivity towards threats that the average person would ignore, a sensitivity which creates and maintains anxiety, says Prof. Bar-Haim. One of the ways to measure a patient’s threat-related attention patterns is called the dot-probe test. The patient is presented with two pictures or words, one threatening and one neutral. The stimuli then disappear and a dot appears where one of them had been, and the patient is asked to press a button to indicate the dot’s location. A fast response time to a dot that appears in the place of the threatening picture or word indicates a bias towards threat.

To turn this test into a therapy, the location of the dot target is manipulated to appear more frequently beneath the neutral word or picture. Gradually, the patient begins to focus on that stimulus instead, predicting that this is where the dot will appear — helping to normalize the attention bias pattern and reduce anxiety.
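A minimal sketch of how such a training schedule differs from the placebo condition, assuming an illustrative 80/20 split for the active sessions (the study's actual ratio is not given here):

```python
import random

def make_trials(n, p_dot_under_neutral, seed=0):
    """Generate dot-probe trials. Each trial pairs a threat and a neutral
    stimulus; the probe (dot) replaces the neutral one with the given
    probability. Stimulus names are placeholders."""
    rng = random.Random(seed)
    return [
        {"threat": "angry_face", "neutral": "calm_face",
         "dot_under": "neutral" if rng.random() < p_dot_under_neutral
                      else "threat"}
        for _ in range(n)
    ]

# 480 trials per session, as in the study; the bias toward the neutral
# stimulus is what distinguishes active ABM from the placebo schedule.
abm_session = make_trials(480, p_dot_under_neutral=0.8)      # active training
placebo_session = make_trials(480, p_dot_under_neutral=0.5)  # placebo schedule

neutral_share = sum(t["dot_under"] == "neutral" for t in abm_session) / 480
```

Over many trials the patient implicitly learns that the neutral stimulus predicts the dot's location, pulling attention away from the threat.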

Prof. Bar-Haim and his colleagues enlisted the participation of 40 pediatric patients with ongoing anxiety disorders and divided them into three groups. The first received the new ABM treatment; the second served as a placebo group, in which the dot appeared equally often behind threatening and neutral images; and the third group was shown only neutral stimuli. Patients participated in one session a week for four weeks, completing 480 dot probe trials each session.

The children’s anxiety levels were measured before and after the training sessions using interviews and questionnaires. In both the placebo group and neutral images group, researchers found no significant change in the patients’ bias towards threatening stimuli. However, in the ABM group, there were marked differences in the participants’ threat bias. By the end of the trial, approximately 33 percent of the patients in this group no longer met the diagnostic criteria for anxiety disorder.

New methods for personalized treatment

These indications of the method’s success in treating children warrant further investigation, says Prof. Bar-Haim. In collaboration with the National Institute of Mental Health in the US, a large international trial involving his computer program is now being carried out at more than 20 sites across five continents.

The more options that exist for patients, the better that clinicians can tailor treatment for their patient’s individual needs, Prof. Bar-Haim observes. There are always patients for whom medication or cognitive therapy is not a viable option, he explains. “Psychological disorders are complex, and not every patient will respond well to every treatment. It’s great to have new methods that have a basis in neuroscience and clinical evidence.”

Source: Science Daily

Jun 12, 2012
#science #neuroscience #brain #psychology #anxiety
Painkiller Abuse Linked to Depression, Suicide in College Students

ScienceDaily (June 11, 2012) — Non-medical prescription drug use by college students is a growing trend on most campuses, according to the U.S. Department of Education’s Higher Education Center for Alcohol, Drug Abuse and Violence Prevention. Due to this trend, Western Illinois University Department of Health Sciences Assistant Professor Amanda Divin and her colleague, Keith Zullig, an associate professor in the West Virginia University School of Public Health, recently conducted and published a study that explores non-medical prescription drug use and depressive symptoms in college students.

Divin and Zullig utilized data from the fall 2008 American College Health Association National College Health Assessment (ACHA-NCHA), a national research survey that addresses seven areas of health and behavior of college students, one of which is alcohol, tobacco and other drug use. The sample used for the study (from the ACHA-NCHA data) contained 26,600 randomly selected college students from 40 campuses in the U.S. The student respondents were asked about their non-medical prescription drug use (including painkillers, stimulants, sedatives and antidepressants) and mental health symptoms within the last year.

According to Divin and Zullig’s results, approximately 13 percent of the college-student respondents reported non-medical prescription drug use, with those who reported feeling hopeless, sad or depressed, or who had considered suicide, being significantly more likely to report non-medical use of any prescription drug. The results also showed this relationship was more pronounced for females who reported painkiller use. The study — which is titled, “The association between non-medical prescription drug use, depressive symptoms, and suicidality among college students” — will appear in the August 2012 issue of Addictive Behaviors: An International Journal.

"Because prescription drugs are tested by the U.S. Food and Drug Administration and prescribed by a doctor, most people perceive them as ‘safe’ and don’t see the harm in sharing with friends or family if they have a few extra pills left over," Divin explained. "Unfortunately, all drugs potentially have dangerous side effects. As our study demonstrates, use of prescription drugs — particularly painkillers like Vicodin and Oxycontin — is related to depressive symptoms and suicidal thoughts and behaviors in college students. This is why use of such drugs needs to be monitored by a doctor and why mental health outreach on college campuses is particularly important."

Divin and Zullig believe the results suggest that students are self-medicating their psychological distress with prescription medications.

"Considering how common prescription sharing is on college campuses and the prevalence of mental health issues during the college years, more investigation in this area is definitely warranted," Divin added. "Our study is just one of the many first steps in exploring the relationship between non-medical prescription drug use and mental health."

Source: Science Daily

Jun 12, 2012
#science #neuroscience #psychology #brain
The Doping-Drug Epo Has an Impact in the Brain

ScienceDaily (June 11, 2012) — Sportsmen and women have been known to dope with the blood hormone Epo to enhance their performance. Researchers from the University of Zurich have now discovered, through animal testing, that Epo has a performance-enhancing effect in the brain shortly after an injection, before it has had time to improve oxygen transport in the blood. As Epo also increases motivation, it could be useful in treating depression, experts say.

The well-known blood hormone Epo is not only used for medicinal purposes; some athletes misuse it for doping. It boosts the number of red blood cells, thereby increasing the transport of oxygen to the muscles. This leads to improvements in performance, which can especially give endurance athletes such as cyclists or marathon runners the edge.

Epo has immediate impact on exercise performance

In a recently published study, Max Gassmann, a veterinary physiologist from the University of Zurich, proved that Epo also drastically increases motivation in the brain as soon as it has been injected, without the number of red blood cells increasing.

Gassmann’s team tested the exercise performance of differently treated mice, studying genetically modified mice that produce human Epo solely in the brain and mice injected with Epo, in which the hormone reached the brain via the bloodstream. Both mouse groups exhibited increased performance on the treadmill compared with the untreated control animals. “We assume that Epo in the brain triggers a motivation boost to increase physical performance,” explains Professor Gassmann. He and his team are now testing the performance-enhancing effect of Epo on volunteers.

Epo probably has an impact on people’s moods, too. It might thus be used in patients who suffer from depression. The latest experiments conducted by a German-Danish research group reveal that Epo can also alleviate the condition of patients suffering from schizophrenia by improving their mental performance.

Source: Science Daily

Jun 12, 2012
#neuroscience #performance #psychology #science #brain
Early menopause linked to increased risk of brain aneurysm

June 11, 2012

The younger a woman is when she goes through the menopause, the greater may be her risk of having a brain (cerebral) aneurysm, suggests research published online first in the Journal of NeuroInterventional Surgery.

A cerebral aneurysm refers to an abnormal bulging of one of the arteries in the brain, which is often only discovered when it ruptures, causing a potentially fatal and/or disabling bleed.

Women are more prone to cerebral aneurysms than men. And fluctuations in the female hormone oestrogen have been implicated in the development of aneurysms, the incidence of which, along with cardiovascular disease, rises sharply after menopause.

The authors base their findings on 76 postmenopausal women who had had a cerebral aneurysm, which in most cases had not ruptured, and who were subsequently quizzed about their medical and reproductive histories.

Conditions such as high blood pressure, diabetes, high cholesterol and an underactive thyroid gland (hypothyroidism) can all boost the risk of a stroke, while the number of pregnancies and the ages at which periods start and stop determine lifetime exposure to oestrogen.

This information was then compared with that taken from more than 4,500 women participants of the 2002 National Institute of Child Health and Human Development Contraceptive and Reproductive Experiences Study, and matched for age and educational attainment.

The average age at which women in both groups had started the menopause was similar, and analysis of the results showed that later menopause and use of hormone replacement therapy (HRT) protected against the risk of a cerebral aneurysm, lessening the risk by 21% and 77%, respectively.

Premature menopause - before the age of 40 - had occurred in one in four (26%) of the women who had had an aneurysm compared with around one in five (19%) of those in the comparison group.

And each successive four year increase in the age at which a woman went through the menopause lessened the likelihood of a cerebral aneurysm by around 21%.
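If the reported reduction applied multiplicatively per four-year increment (an assumption for illustration; the paper's statistical model may differ), the relative odds for a given delay in menopause could be sketched as:

```python
def relative_risk(years_later, reduction_per_4yr=0.21):
    """Illustrative compounding of the reported ~21% reduction per
    four-year delay in menopause onset (hypothetical model form)."""
    return (1.0 - reduction_per_4yr) ** (years_later / 4.0)

# Under this assumption, an eight-year later menopause would correspond
# to roughly 0.79 * 0.79 of the baseline risk.
rr_8yr = relative_risk(8)
```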

Smoking did not seem to be linked to an increase in risk, while alcohol consumption was of borderline significance.

The outcomes for ruptured cerebral aneurysms are poor: around one in two people who have one die, and one in 10 die before they reach hospital. Of those who survive, one in five is left severely disabled, say the authors, so finding a potential marker may help to detect the condition earlier.

"Loss of oestrogen earlier in a woman’s life may contribute to the [development] of cerebral aneurysm," conclude the authors, adding that HRT may protect against this. And they suggest: "These data may identify a risk factor for [the development of this condition] and also a potential target for future therapies."

Provided by British Medical Journal

Source: medicalxpress.com

Jun 12, 2012
#science #neuroscience #brain #aneurysm
AAN issues new guideline for treating rare seizure disorder in babies, young children

June 11, 2012

The American Academy of Neurology has issued an updated guideline outlining the best treatments for infantile spasms, a rare type of seizure that can occur in infants and young children. The guideline, which was co-developed with the Child Neurology Society, is published in the June 12, 2012, print issue of Neurology, the medical journal of the American Academy of Neurology.

Infantile spasms is a rare disorder that usually begins in infants aged four to six months. The spasms are a type of seizure that mainly consists of a sudden bending forward of the body with stiffening of the arms and legs or arching of the back while the arms and legs are extended. Infantile spasms rarely respond to the usual anti-seizure medications. Most children with infantile spasms have developmental disabilities later in life.

The guideline found that the hormone therapy adrenocorticotropic hormone, also known as ACTH, may be effective for treatment of infantile spasms. The seizure drug vigabatrin may also be considered for treatment, although evidence suggests ACTH may be more effective than vigabatrin. For children with seizures caused by the genetic disorder tuberous sclerosis complex, however, vigabatrin may be more effective.

The guideline, which is based on a review of all available evidence on treatment for infantile spasms and is an update of a guideline published in 2004, also found that low-dose ACTH is probably as effective as high-dose ACTH and it may lower the risk of side effects.

There is not enough evidence to know whether other treatments, alone or combined, are effective in treating infantile spasms, according to the guideline.

The guideline recommends early diagnosis and early treatment, as these may lead to better long-term outcomes for children’s development and learning skills.

"It is important for parents to talk to their child’s doctor if they suspect their child may be having seizures or spasms because early diagnosis and treatment may help in the long term with educational and learning skills," said guideline author Cristina Go, MD, of the Hospital for Sick Children in Toronto.

Go noted that children with the syndrome also have a specific pattern that shows up in tests of brain waves, and an EEG (electroencephalogram) is required for confirmation of the diagnosis.

Provided by American Academy of Neurology

Source: medicalxpress.com

Jun 12, 2012 · 1 note
#science #neuroscience #brain
Brain scans show specific neuronal response to junk food when sleep-restricted

June 10, 2012

The sight of unhealthy food during a period of sleep restriction activated reward centers in the brain that were less active when participants had adequate sleep, according to a new study using brain scans to better understand the link between sleep restriction and obesity.

Researchers from St. Luke’s – Roosevelt Hospital Center and Columbia University in New York performed functional magnetic resonance imaging (fMRI) on 25 men and women of normal weights while they looked at images of healthy and unhealthy foods. The scans were taken after five nights in which sleep was either restricted to four hours or allowed to continue up to nine hours. Results were compared.

"The same brain regions activated when unhealthy foods were presented were not involved when we presented healthy foods," said Marie-Pierre St-Onge, PhD, the study’s principal investigator. "The unhealthy food response was a neuronal pattern specific to restricted sleep. This may suggest greater propensity to succumb to unhealthy foods when one is sleep restricted."

Previous research has shown that restricted sleep leads to increased food consumption in healthy people, and that a self-reported desire for sweet and salty food increases after a period of sleep deprivation. St-Onge said the new study’s results provide additional support for a role of short sleep in appetite modulation and obesity.

"The results suggest that, under restricted sleep, individuals will find unhealthy foods highly salient and rewarding, which may lead to greater consumption of those foods," St-Onge said. "Indeed, food intake data from this same study showed that participants ate more overall and consumed more fat after a period of sleep restriction compared to regular sleep. The brain imaging data provided the neurocognitive basis for those results.”

Provided by American Academy of Sleep Medicine

Source: medicalxpress.com

Jun 10, 2012 · 24 notes
#science #neuroscience #brain
MRI scans show how sleep loss affects the ability to choose proper foods

June 10, 2012

MRI scans from a study being presented today at SLEEP 2012 reveal how sleep deprivation impairs the higher-order regions in the human brain where food choices are made, possibly helping explain the link between sleep loss and obesity that previous research has uncovered.

Twenty-three healthy adults participated in two sessions using functional magnetic resonance imaging (fMRI), one after a normal night’s sleep and a second after a night of sleep deprivation. In both sessions, participants rated how much they wanted various food items shown to them while they were inside the scanner.

"Our goal was to see if specific regions of the brain associated with food processing were disrupted by sleep deprivation," said lead author Stephanie Greer, a graduate student at the Sleep and Neuroimaging Laboratory at the University of California, Berkeley.

Results show that sleep deprivation significantly impaired brain activity in the frontal lobe, a region critical for controlling behavior and making complex choices, such as the selection of food to eat. The study suggests that sleep loss may prevent the higher brain functions normally critical for making appropriate food choices, rather than necessarily changing activity in deeper brain structures that react to basic desire.

"We did not find significant differences following sleep deprivation in brain areas traditionally associated with basic reward reactivity,” Greer said. “Instead, it seems to be about the regions higher up in the brain, specifically within the frontal lobe, failing to integrate all the different signals that help us normally make wise choices about what we should eat.”

She added that this failure of the frontal lobe to optimally gather the information needed to decide on the right types of foods to eat – such as how healthy an item is relative to how tasty it may be – may represent one brain mechanism explaining the link between sleep loss and obesity.

"These results shed light on how the brain becomes impaired by sleep deprivation, leading to improper food choices," Greer said.

Provided by American Academy of Sleep Medicine

Source: medicalxpress.com

Jun 10, 2012 · 24 notes
#science #neuroscience #brain
The balancing act to regulate the brain machinery

June 8, 2012

Molecular imbalance lies at the root of many psychiatric disorders. Current EU-funded research has discovered a major RNA molecular player in neurogenesis and has characterised its action and targets in the zebrafish embryo.

image

Credit: Thinkstock

Neural circuits are constantly in the process of modification according to experience and changes in the environment, a phenomenon known as plasticity. Classical Hebbian plasticity is crucial for encoding information whereas homeostatic plasticity stabilises neuronal activity in the face of changes that disturb excitability.

Homeostatic plasticity plays a big role in activity-dependent development of neural circuits. Interestingly, this type of homeostasis is frequently distorted in psychiatric disorders such as schizophrenia and autism.

Unlike the molecular basis of Hebbian plasticity, the biochemistry behind homeostatic plasticity is relatively unknown. The ‘MicroRNAs and neurogenesis control’ (Neuromir) project set about investigating neural development in the zebrafish embryo to unravel the action of one class of gene regulator in particular – microRNAs.

The microRNA machinery is potentially very powerful in cell regulation. It influences many developmental processes, and each microRNA molecule can regulate hundreds of target genes.

Numerous microRNAs are expressed in the development of the vertebrate central nervous system (CNS). Results from the in vivo study of the zebrafish revealed that miR-9 plays an important role in balancing the production of neurons during development of the embryo.

Neuromir researchers have successfully identified the molecular targets of miR-9. Future research may exploit this knowledge base by assessing their importance in disease and using their molecular format for drug therapy design.

Provided by CORDIS

Source: medicalxpress.com

Jun 8, 2012 · 15 notes
#science #neuroscience #brain
Novel brain imaging technique explains why concussions affect people differently

June 8, 2012

Patients vary widely in their response to concussion, but scientists haven’t understood why. Now, using a new technique for analyzing data from brain imaging studies, researchers at Albert Einstein College of Medicine of Yeshiva University and Montefiore Medical Center have found that concussion victims have unique spatial patterns of brain abnormalities that change over time.

The new technique could eventually help in assessing concussion patients, predicting which head injuries are likely to have long-lasting neurological consequences, and evaluating the effectiveness of treatments, according to lead author Michael L. Lipton, M.D., Ph.D., associate director of the Gruss Magnetic Resonance Research Center at Einstein and medical director of magnetic resonance imaging (MRI) services at Montefiore. The findings are published today in the online edition of Brain Imaging and Behavior.

The Centers for Disease Control and Prevention estimates that more than one million Americans sustain a concussion (also known as mild traumatic brain injury, or mTBI) each year. Concussions in adults result mainly from motor vehicle accidents or falls. At least 300,000 adults and children are affected by sports-related concussions each year. While most people recover from concussions with no lasting ill effects, as many as 30 percent suffer permanent impairment – undergoing a personality change or being unable to plan an event. A 2003 federal study called concussions “a serious public health problem” that costs the U.S. an estimated $80 billion a year.

Previous imaging studies found differences between the brains of people who have suffered concussions and normal individuals. But those studies couldn’t assess whether concussion victims differ from one another. “In fact, most researchers have assumed that all people with concussions have abnormalities in the same brain regions,” said Dr. Lipton, who is also associate professor of radiology, of psychiatry and behavioral sciences, and in the Dominick P. Purpura Department of Neuroscience at Einstein. “But that doesn’t make sense, since it is more likely that different areas would be affected in each person because of differences in anatomy, vulnerability to injury and mechanism of injury.”

In the current study, the Einstein researchers used a recently developed MRI technique called diffusion tensor imaging (DTI) on 34 consecutive patients (19 women and 15 men aged 19 to 64) diagnosed with mTBI at Montefiore in the Bronx and on 30 healthy controls. The patients were imaged within two weeks of injury and again three and six months afterward.

The imaging data were then analyzed using a new software tool called Enhanced Z-score Microstructural Assessment Pathology (EZ-MAP), which allows researchers for the first time to examine microstructural abnormalities across the entire brain of individual patients. EZ-MAP was developed by Dr. Lipton and his colleagues at Einstein.
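EZ-MAP’s internals are not described in the article, but the core idea of flagging abnormalities in an individual patient by comparing each voxel against the distribution of healthy controls can be sketched with a plain z-score. This is a deliberate simplification (the “Enhanced” part of EZ-MAP presumably adds corrections beyond a raw z-score), and the voxel names and threshold below are hypothetical:

```python
import statistics

def voxel_z_scores(patient, controls_by_voxel, threshold=2.0):
    """For each voxel, z-score the patient's value against the same
    voxel's distribution in healthy controls, and flag voxels that
    deviate by more than `threshold` standard deviations.
    Illustrative only; not the actual EZ-MAP algorithm."""
    flagged = {}
    for voxel, value in patient.items():
        ctrl = controls_by_voxel[voxel]
        mu = statistics.mean(ctrl)
        sd = statistics.stdev(ctrl)
        z = (value - mu) / sd
        if abs(z) > threshold:
            flagged[voxel] = round(z, 2)
    return flagged

# Toy data: "v1" is abnormally low in the patient, "v2" is normal.
controls = {"v1": [0.50, 0.52, 0.48, 0.51, 0.49],
            "v2": [0.70, 0.72, 0.69, 0.71, 0.68]}
patient = {"v1": 0.30, "v2": 0.71}
flags = voxel_z_scores(patient, controls)
print(flags)  # only "v1" deviates from the control distribution
```

Because each patient is scored against the control distribution individually, this kind of analysis can reveal the patient-specific spatial patterns the study reports, rather than only group-average differences.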

DTI detects subtle damage to the brain by measuring the direction of diffusion of water in white matter. The same technology was used by Dr. Lipton and his team in widely publicized research on more than 30 amateur soccer players who had all played the sport since childhood. They found that players who headed the ball frequently showed brain injury similar to that seen in patients with concussion.

The uniformity of diffusion direction – an indicator of whether tissue has maintained its microstructural integrity – is measured on a zero-to-one scale called fractional anisotropy (FA). In the latest study, areas of abnormally low FA (reflecting abnormal brain regions) were observed in concussion patients but not in controls. Each concussion patient had a unique spatial pattern of low FA that evolved over the study period.
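The FA measure itself is a standard formula over the three eigenvalues of the diffusion tensor at each voxel. As an illustrative sketch (not the study’s actual processing pipeline), it can be computed in a few lines:

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """Fractional anisotropy from the three diffusion-tensor
    eigenvalues: 0 means perfectly isotropic diffusion; values
    approach 1 as diffusion becomes confined to one direction."""
    mean = (l1 + l2 + l3) / 3.0
    num = (l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    if den == 0:
        return 0.0  # degenerate tensor; treat as isotropic
    return math.sqrt(1.5 * num / den)

# Isotropic diffusion (e.g., free fluid) gives FA = 0; strongly
# directional diffusion (intact white matter) gives FA near 1.
print(fractional_anisotropy(1.0, 1.0, 1.0))  # 0.0
print(fractional_anisotropy(1.7, 0.2, 0.2))  # high, close to 1
```

Loss of microstructural integrity disperses the diffusion directions, which is why injured white matter tends to show abnormally low FA.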

Surprisingly, each patient also had a unique, evolving pattern of abnormally high FA distinct from the areas of low FA. “We found widespread high FA at every time point, all the way out to six months and even in patients more than one year out from their injury,” said Dr. Lipton. “We suspect that high FA represents a response to the injury. In other words, the brain may be trying to compensate for the injury by developing and enhancing other neural connections. This is a new and unexpected finding.”

At present, diagnosis of concussions is based mainly on the nature of the patient’s accident and the presence of symptoms including headache, dizziness and behavioral abnormalities. DTI, combined with EZ-MAP analysis, might offer a more objective tool for diagnosing concussion injuries and for predicting which patients will have persistent and progressive symptoms.

Provided by Albert Einstein College of Medicine

Source: medicalxpress.com

Jun 8, 2012 · 15 notes
#science #neuroscience #brain #psychology
Mapping Genes: New Risk Factors for Neurodegenerative Diseases Found

ScienceDaily (June 7, 2012) — Using a new and powerful approach to understand the origins of neurodegenerative disorders such as Alzheimer’s disease, researchers at Mayo Clinic in Florida are building the case that these diseases are primarily caused by genes that are too active or not active enough, rather than by harmful gene mutations.

In the June 7 online issue of PLoS Genetics, they report that several hundred genes within almost 800 brain samples from patients with Alzheimer’s disease or other disorders had altered expression levels that did not result from neurodegeneration; instead, genetic variants were the likely cause of many of those changes.

"We now understand that disease likely develops from gene variants that have modest effects on gene expression, and which are also found in healthy people. But some of the variants — elevating expression of some genes, reducing levels of others — combine to produce a perfect storm that leads to dysfunction," says lead investigator Nilufer Ertekin-Taner, M.D., Ph.D., a Mayo Clinic neurologist and neuroscientist.

"If we can identify the genes linked to a disease that are too active or too dormant, we might be able to define new drug targets and therapies," she says. "That could be the case for both neurodegenerative disease as well as disease in general."

Dr. Ertekin-Taner says no other lab has conducted a brain gene expression study of the scale performed at Mayo Clinic’s Florida campus. “The novelty, and the usefulness, of our study is the sheer number of brain samples that we looked at and the way in which we analyzed them. These results demonstrate the significant contribution of genetic factors that alter brain gene expression and increase risk of disease,” she says.

This form of data analysis measures gene expression levels by quantifying the amount of RNA produced in tissue and scans the genome of patients to identify genetic variants that associate with these levels.
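The core association test in such an analysis asks, gene by gene, whether expression level tracks a variant’s allele count. A minimal sketch of that test, hypothetical and greatly simplified relative to the study’s actual statistics, is a least-squares fit of expression against genotype dosage:

```python
def expression_association(genotypes, expression):
    """Least-squares slope and Pearson correlation of expression
    level on allele count (0, 1, or 2 copies of a variant allele).
    A strong correlation marks a candidate expression-regulating
    variant. Toy illustration, not the study's pipeline."""
    n = len(genotypes)
    mg = sum(genotypes) / n
    me = sum(expression) / n
    cov = sum((g - mg) * (e - me) for g, e in zip(genotypes, expression)) / n
    var_g = sum((g - mg) ** 2 for g in genotypes) / n
    var_e = sum((e - me) ** 2 for e in expression) / n
    slope = cov / var_g
    r = cov / (var_g ** 0.5 * var_e ** 0.5)
    return slope, r

# Toy cohort: expression rises with each copy of the variant allele.
geno = [0, 0, 1, 1, 2, 2]
expr = [5.1, 4.9, 6.0, 6.2, 7.1, 6.9]
slope, r = expression_association(geno, expr)
```

Repeating such a test across thousands of transcripts and genome-wide variants, with appropriate multiple-testing correction, is what distinguishes an expression GWAS from an ordinary GWAS on disease status.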

Mayo researchers measured the level of 24,526 transcripts (messenger RNA) for 18,401 genes using cerebellar autopsy tissue from 197 Alzheimer’s disease patients and from 177 patients with other forms of neurodegeneration. The researchers then validated the results by examining the temporal cortex from 202 Alzheimer’s disease patients and from 197 with other pathologies. The difference between these samples is that while the temporal cortex is affected by Alzheimer’s disease, the cerebellum is relatively spared.

From these analyses, the researchers identified more than 2,000 markers of altered expression in both groups of patients that were common between the cerebellum and temporal cortex. Some of these markers also influenced risk of human diseases, suggesting their contribution to development of neurodegenerative and other diseases regardless of their location in the brain.

They identified novel expression “hits” for genetic risk markers of diseases that included progressive supranuclear palsy, Parkinson’s disease, and Paget’s disease, and confirmed other known associations for lupus, ulcerative colitis, and type 1 diabetes.

"Altered expression of brain genes can be linked to a number of diseases that affect the entire body," Dr. Ertekin-Taner says.

They then compared their eGWAS (expression genome-wide association study) to GWAS data on Alzheimer’s disease, conducted by the federally funded Alzheimer’s Disease Genetics Consortium, to test whether some of the risk genes already identified promote disease through altered expression.

"We found that a number of genes already linked to Alzheimer’s disease do, in fact, have altered gene expression, but we also discovered that many of the variants in what we call the gray zone of the GWAS — genes whose contribution to Alzheimer’s disease was uncertain — were also influencing brain expression levels," Dr. Ertekin-Taner says. "That offers us new candidate risk genes to explore.

"This is a powerful approach to understanding disease," she says. "It can find new genes that contribute to risk, as well as new genetic pathways, and can also help us understand the function for a large number of genes and other molecular regulators in the genome that are implicated in very important diseases."

Source: Science Daily

Jun 8, 2012 · 7 notes
#science #neuroscience #brain #psychology
Scientists Identify First Gene in Programmed Axon Degeneration

ScienceDaily (June 7, 2012) — Degeneration of the axon and synapse, the slender projection through which neurons transmit electrical impulses to neighboring cells, is a hallmark of some of the most crippling neurodegenerative and brain diseases such as amyotrophic lateral sclerosis (ALS), Huntington’s disease and peripheral neuropathy. Scientists have worked for decades to understand axonal degeneration and its relation to these diseases. Now, researchers at the University of Massachusetts Medical School are the first to describe a gene — dSarm/Sarm1 — responsible for actively promoting axon destruction after injury. The research, published June 7 online by Science, provides evidence of an exciting new therapeutic target that could be used to delay or even stop axon decay.

"This discovery has the potential to have a profound impact on our understanding of neurodegenerative diseases, much like the discovery of apoptosis (programmed cell death) fundamentally changed our understanding of cancer," said Marc R. Freeman, PhD, associate professor of neurobiology at the University of Massachusetts Medical School and lead investigator on the study. "Identification of this gene allows us to start asking exciting new questions about the role of axon death in neurodegenerative diseases. For example, is it possible that these pathways are being inappropriately activated to cause premature axon death?"

For more than a century, scientists believed that injured axons severed from the neuron cell body passively wasted away due to a lack of nutrients. However, a mouse mutation identified in the early 1990s — called slow Wallerian degeneration (Wlds) — was able to suppress axon degeneration for weeks. This finding forced scientists to reassess Wallerian degeneration, the process through which an injured axon degenerates, as a passive process and consider the possibility that an active program of axon auto-destruction, akin to apoptotic death, was at work instead.

If Wallerian degeneration was an active process, hypothesized Dr. Freeman, a Howard Hughes Medical Institute Early Career Scientist, then it should be possible through forward genetic screens in Drosophila to identify mutants exhibiting Wlds-like axon protection. Freeman and colleagues screened more than 2,000 Drosophila mutants for ones that exhibited long-term survival of severed axons. Freeman says this was a heroic effort on the part of his colleagues. The screen took place over the next two and a half years, and involved seven students and post-docs in the Freeman lab — Jeannette M. Osterloh, A. Nicole Fox, PhD, Michelle A. Avery, PhD, Rachel Hackett, Mary A. Logan, PhD, Jennifer M. MacDonald, Jennifer S. Zeigenfuss — who performed the painstaking and labor-intensive experiments needed on each Drosophila mutant to identify flies that suppressed axonal degeneration after nerve injury.

Through these tests, they identified three mutants (out of the 2,000 screened) where severed axons survived for the lifespan of the fly. Next generation sequencing and chromosome deficiency mapping techniques were then used to isolate the single gene affected in all three — dSarm. These were loss-of-function alleles, meaning that Drosophila unable to produce the dSarm/Sarm1 molecule exhibited prolonged axon survival for as many as 30 days after injury. Freeman and colleagues went on to show that mice lacking Sarm1, the mammalian homolog of dSarm, also displayed remarkable preservation of injured axons. These findings provided the first direct evidence that Wallerian degeneration was driven by a conserved axonal death program and not a passive response to axon injury.

"For 20 years people have been looking for a gene whose normal function is to promote axon degeneration," said Osterloh, first author on the study. "Identification of the dSarm/Sarm1 gene has enormous therapeutic potential, for example as a knockdown target for patients suffering from diseases involving axonal loss."

The next step for Freeman and colleagues is to identify additional genes in the axon death pathway and investigate whether any have links with specific neurodegenerative diseases. “We’re already working with scientists at UMMS to understand the role axon death plays in ALS and Huntington’s disease,” said Freeman. “We are very excited about the possibility that these findings could have broad therapeutic potential in many neurodegenerative diseases.”

Source: Science Daily

Jun 8, 2012 · 9 notes
#science #neuroscience #brain #psychology
Newly Identified Protein Function Protects Cells During Injury

ScienceDaily (June 7, 2012) — Scientists have discovered a new function for a protein that protects cells during injury and could eventually translate into treatment for conditions ranging from cardiovascular disease to Alzheimer’s.

Researchers report online June 7 in the journal Cell that a type of protein called thrombospondin activates a protective pathway that prevents heart cell damage in mice undergoing simulated extreme hypertension, cardiac pressure overload and heart attack.

"Our results suggest that medically this protein could be targeted as a way to help people with many different disease states where various organs are under stress,” said Jeffery Molkentin, PhD, lead investigator and a researcher at Cincinnati Children’s Hospital Medical Center and the Howard Hughes Medical Institute. "Although more study is needed to determine how our findings might be applied clinically, a possible therapeutic strategy could include a drug or gene therapy that induces overexpression of the protein in tissues or organs undergoing injury."

Thrombospondin (Thbs) proteins are produced by the body in cells where tissues are being injured, reconfigured or remodeled, such as in chronic cardiac disease. They appear in part of the cell’s internal machinery called the endoplasmic reticulum. There, Thbs triggers a stress response process to regulate production of other proteins and help correct or rid cells of proteins that misfold and lose their form and intended function. Misfolded proteins help drive tissue damage and organ dysfunction.

The researchers zeroed in on how one thrombospondin protein (Thbs4) activates cellular stress responses in mice bred to overexpress the protein in heart cells. They compared the responses of these Thbs4-positive hearts to simulated stress and injury with those of mice not bred to overexpress cardiac-specific Thbs4.

Overexpression of Thbs4 had no effect on the animals prior to cardiac stress — although during simulated hypertension and cardiac infarction the protein reduced injury and protected them from death. Mice not bred for Thbs4 overexpression were extremely sensitive to cardiac injury, according to Molkentin, a member of the Division of Molecular Cardiovascular Biology and Cincinnati Children’s Heart Institute.

The researchers reported that overexpressed Thbs4 enhanced the ability of heart cells to secrete helpful proteins, resolve misfolded proteins and properly reconstruct extracellular matrix — connective tissues that help give the heart functional form and structural integrity.

Critical to the stress response process was Thbs4 activating and regulating a transcription factor called Atf6alpha. Transcription factors help decode genetic instructions of other genes to control their expression. In the case of Atf6alpha in the heart, it helps mediate repair processes. When Atf6alpha is activated by Thbs4, the endoplasmic reticulum in cells expands and the production of chaperone molecules and other repair proteins is enhanced.

Mice bred not to overexpress cardiac Thbs4 did not exhibit activated Atf6alpha or robust repair processes following cardiac injury, leading to their poor outcomes.

Molkentin said the research team continues to examine the Thbs-dependent stress response pathway to better understand the involved processes. This includes seeing how the pathway affects laboratory models of neurodegenerative diseases like Parkinson’s, Alzheimer’s and amyotrophic lateral sclerosis.

Source: Science Daily

Jun 8, 2012 · 7 notes
#alzheimer #brain #neuroscience #psychology #science #protein
Reach2HD, a Phase II study in Huntington's disease, launched

June 7, 2012

The Huntington Study Group (HSG), under the leadership of Ray Dorsey, M.D. with Johns Hopkins Medical and Diana Rosas, M.D. with Massachusetts General Hospital, is conducting a clinical trial in Huntington’s disease (HD) throughout the United States and Australia, “A randomized, double-blind, placebo-controlled, study to assess the safety and tolerability, and efficacy of PBT2 in patients with early to mid-stage Huntington’s disease” comparing a 100 mg dose or 250 mg dose versus placebo. The HSG is a not-for-profit group of physicians and other clinical researchers who are experienced in the care of HD patients and dedicated to clinical research of the disease. This trial is sponsored by Prana Biotechnology Limited (Melbourne, Australia) and is being managed by the University of Rochester Medical Center.

Huntington’s disease is an inherited neurodegenerative disease which affects over 30,000 people in the United States and Australia. HD is characterized by brain cell death that usually begins between the ages of 30 to 50, and results in motor, cognitive and behavioral signs and symptoms. While there are medications to help relieve some of the disease symptoms, there is no known treatment to address the cognitive impairment associated with HD.

Research has shown that normally occurring metals in the brain play a significant role in diseases such as Alzheimer’s disease and, more recently, HD. Researchers at Prana Biotechnology are identifying drugs designed to interrupt interactions between these biological metals and target proteins in the brain, to prevent deterioration of brain cells. One of the chemical compounds, called PBT2, has shown in animal models, as well as in a small group of patients with Alzheimer’s disease, that it may improve cognition. There is some indication in animal models of HD that the drug may improve motor function and control, increase life span and reduce the amount of brain cell degeneration. Based on these results, Prana is investigating whether the drug will have similar effects in HD patients.

Reach2HD will evaluate how safe and well tolerated PBT2 is at a dose of 100 mg or 250 mg a day compared to a placebo over six months. The trial will also measure whether there is an effect on cognitive abilities as well as other HD symptoms including motor and overall functioning of individuals with HD.

"We are excited to work with Prana to investigate the safety and tolerability of an interesting and innovative experimental treatment for Huntington’s disease, PBT2," said Dorsey. "We have few treatment options for Huntington disease, and none for cognition. We hope this is a step to addressing this large unmet need for patients and their families."

Provided by University of Rochester Medical Center

Source: medicalxpress.com

Jun 8, 2012 · 5 notes
#science #neuroscience #brain #psychology #huntington
Data release from the Allen Institute for Brain Science expands online atlas offerings

June 7, 2012

The Allen Institute for Brain Science announced today its latest public data release, enhancing online resources available via the Allen Brain Atlas data portal and expanding its application programming interface (API).

With this release, the Allen Institute has expanded access to its data and services via the Allen Brain Atlas API and added new data and feature enhancements to four atlas resources: the Allen Human Brain Atlas, the Allen Mouse Brain Connectivity Atlas, the Allen Developing Mouse Brain Atlas, and the Allen Mouse Brain Atlas. In addition, two new video tutorials have been added to the Institute’s tutorial library.

The Allen Human Brain Atlas, a multi-modal, three-dimensional map of the human brain that integrates anatomical and gene expression data throughout the adult human brain, has been expanded to include gene expression data from brains of autistic individuals, allowing scientists to compare disease and control states. In addition, the Atlas contains new features to facilitate search, navigation, and download of data.

The Allen Mouse Brain Connectivity Atlas is a three-dimensional, high-resolution map of neural connections throughout the mouse brain. Today’s data release expands the set of available high-resolution images of axonal projections and adds multiplanar viewing capabilities, offering a first step towards three-dimensional visualization of neural connectivity throughout the mouse brain. This foundational map will help scientists understand how the brain is wired, offering new insights into how the brain works and what goes awry in brain diseases and disorders.

Additionally, the Allen Mouse Brain Atlas and the Allen Developing Mouse Brain Atlas have been updated with new search capabilities based on additional data annotation, allowing users to explore the gene expression data in new ways.

Application Programming Interface (API)

To broaden the user community and enable further innovation, the Allen Institute has expanded access to its data and services via the Allen Brain Atlas API, now offering access to data from across the suite of Allen Brain Atlas resources. The Allen Brain Atlas API provides the programming community with under-the-hood access to the Allen Institute’s vast datasets, sample applications and programming solutions for data searches and download, as well as opportunities for discovery and creation of new applications or data representations. This release coincides with the Allen Brain Atlas Hackathon, an elite programming event to be held later this month.

Available data in the Allen Brain Atlas API includes high-resolution images, 3-D expression summaries, primary microarray and RNA-sequencing results, and MRI and DTI files from across the Institute’s suite of atlas resources. Services offered by the Allen Brain Atlas API include RESTful model access to retrieve all experimental information; image download service for all gene expression, connectivity, histology and atlas data; as well as API access to various integrated search services.
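The API’s RESTful Model Access (RMA) queries are ordinary HTTP requests against `api.brain-map.org`. The sketch below only builds such a query URL as a string (no request is made); the criteria syntax shown follows the API’s documented pattern, but the exact parameters should be treated as an assumption to verify against the current documentation:

```python
def rma_query_url(model, acronym):
    """Build an RMA query URL for the Allen Brain Atlas API that
    retrieves records of `model` (e.g. 'Gene') whose acronym matches
    `acronym`. Illustrative sketch; check the criteria syntax against
    the live API docs before relying on it."""
    base = "http://api.brain-map.org/api/v2/data/query.json"
    criteria = "model::%s,rma::criteria,[acronym$eq'%s']" % (model, acronym)
    return base + "?criteria=" + criteria

# Look up the Gene record for Sox2; the returned URL can be fetched
# with any HTTP client to get JSON (not done here, to stay offline).
url = rma_query_url("Gene", "Sox2")
```

Because the interface is plain HTTP returning JSON, the same pattern works from any language, which is what makes the datasets accessible to the broader programming community the release targets.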

Provided by Allen Institute for Brain Science

Source: medicalxpress.com

Jun 8, 2012 · 4 notes
#science #neuroscience #brain
Skin Cells Reprogrammed Into Brain Cells

ScienceDaily (June 7, 2012) — Scientists at the Gladstone Institutes have for the first time transformed skin cells — with a single genetic factor — into cells that develop on their own into an interconnected, functional network of brain cells. The research offers new hope in the fight against many neurological conditions because scientists expect that such a transformation — or reprogramming — of cells may lead to better models for testing drugs for devastating neurodegenerative conditions such as Alzheimer’s disease.

image

Rendering of neural network. Scientists at the Gladstone Institutes have for the first time transformed skin cells — with a single genetic factor — into cells that develop on their own into an interconnected, functional network of brain cells. (Credit: © nobeastsofierce / Fotolia)

This research comes at a time of renewed focus on Alzheimer’s disease, which currently afflicts 5.4 million people in the United States alone — a figure expected to nearly triple by 2050. Yet there are no approved medications to prevent or reverse the progression of this debilitating disease.

In findings appearing online June 7 in Cell Stem Cell, researchers in the laboratory of Gladstone Investigator Yadong Huang, MD, PhD, describe how they transferred a single gene called Sox2 into both mouse and human skin cells. Within days the skin cells transformed into early-stage brain stem cells, also called induced neural stem cells (iNSCs). These iNSCs began to self-renew, soon maturing into neurons capable of transmitting electrical signals. Within a month, the neurons had developed into neural networks.

"Many drug candidates — especially those developed for neurodegenerative diseases — fail in clinical trials because current models don’t accurately predict the drug’s effects on the human brain," said Dr. Huang, who is also an associate professor of neurology at the University of California, San Francisco (UCSF), with which Gladstone is affiliated. "Human neurons — derived from reengineered skin cells — could help assess the efficacy and safety of these drugs, thereby reducing risks and resources associated with human trials."

Dr. Huang’s findings build on the work of other Gladstone scientists, starting with Gladstone Investigator Shinya Yamanaka, MD, PhD. In 2007, Dr. Yamanaka used four genetic factors to turn adult human skin cells into cells that act like embryonic stem cells — called induced pluripotent stem cells.

Also known as iPS cells, these cells can become virtually any cell type in the human body — just like embryonic stem cells. Then last year, Gladstone Senior Investigator Sheng Ding, PhD, announced that he had used a combination of small molecules and genetic factors to transform skin cells directly into neural stem cells. Today, Dr. Huang takes a new tack by using one genetic factor — Sox2 — to directly reprogram one cell type into another without reverting to the pluripotent state.

Avoiding the pluripotent state as Drs. Ding and Huang have done is one approach to avoiding the potential danger that “rogue” iPS cells might develop into a tumor if used to replace or repair damaged organs or tissue.

"We wanted to see whether these newly generated neurons could result in tumor growth after transplanting them into mouse brains," said Karen Ring, UCSF Biomedical Sciences graduate student and the paper’s lead author. "Instead we saw the reprogrammed cells integrate into the mouse’s brain — and not a single tumor developed."

This research, which was performed at the Roddenberry Center for Stem Cell Biology and Medicine at Gladstone, has also revealed the precise role of Sox2 as a master regulator that controls the identity of neural stem cells. In the future, Dr. Huang and his team hope to identify similar regulators that guide the development of specific neural progenitors and subtypes of neurons in the brain.

"If we can pinpoint which genes control the development of each neuron type, we can generate them in the petri dish from a single sample of human skin cells," said Dr. Huang. "We could then test drugs that affect different neuron types — such as those involved in Parkinson’s disease — helping us to put drug development for neurodegenerative diseases on the fast track."

Source: Science Daily

Jun 7, 2012
#science #neuroscience #brain #psychology #alzheimer
Alzheimer’s Vaccine Trial a Success

ScienceDaily (June 7, 2012) — A study led by Karolinska Institutet in Sweden reports for the first time the positive effects of an active vaccine against Alzheimer’s disease. The new vaccine, CAD106, could prove to be a breakthrough in the search for a cure for this seriously debilitating dementia disease. The study is published in the scientific journal Lancet Neurology.

image

A study led by Karolinska Institutet in Sweden reports for the first time the positive effects of an active vaccine against Alzheimer’s disease. (Credit: © Tyler Olson / Fotolia)

Alzheimer’s disease is a complex neurological dementia disease that is the cause of much human suffering and a great cost to society. According to the World Health Organisation, dementia is the fastest growing global health epidemic of our age. The prevailing hypothesis about its cause involves APP (amyloid precursor protein), a protein that resides in the outer membrane of nerve cells and that, instead of being broken down, forms a harmful substance called beta-amyloid, which accumulates as plaques and kills brain cells.

There is currently no cure for Alzheimer’s disease, and the medicines in use can only mitigate the symptoms. In the hunt for a cure, scientists are following several avenues of attack, of which vaccination is currently the most popular. The first human vaccination study, which was done almost a decade ago, revealed too many adverse reactions and was discontinued. The vaccine used in that study activated certain white blood cells (T cells), which started to attack the body’s own brain tissue.

The new treatment, which is presented in Lancet Neurology, involves active immunisation, using a type of vaccine designed to trigger the body’s immune defence against beta-amyloid. In this second clinical trial on humans, the vaccine was modified to affect only the harmful beta-amyloid. The researchers found that 80 per cent of the patients involved in the trials developed their own protective antibodies against beta-amyloid without suffering any side-effects over the three years of the study. The researchers believe that this suggests that the CAD106 vaccine is a tolerable treatment for patients with mild to moderate Alzheimer’s. Larger trials must now be conducted to confirm the CAD106 vaccine’s efficacy.

Source: Science Daily

Jun 7, 2012
#science #neuroscience #brain #psychology #alzheimer
New Brain Target for Appetite Control Identified

ScienceDaily (June 7, 2012) — Researchers at Columbia University Medical Center (CUMC) have identified a brain receptor that appears to play a central role in regulating appetite. The findings, published June 7 in the online edition of Cell, could lead to new drugs for preventing or treating obesity.

"We’ve identified a receptor that is intimately involved in regulating food intake," said study leader Domenico Accili, MD, professor of Medicine at CUMC. "What is especially encouraging is that this receptor belongs to a class of receptors that turn out to be good targets for drug development, making it a highly ‘druggable’ target. In fact, several existing medications already seem to interact with this receptor. So, it’s possible that we could have new drugs for obesity sooner rather than later."

In their search for new targets for obesity therapies, scientists have focused on the hypothalamus, a tiny brain structure that regulates appetite. Numerous studies suggest that the regulatory mechanism is concentrated in neurons that express a neuropeptide, or brain modulator, called AgRP. But the specific factors that influence AgRP expression are not known.

The CUMC researchers found new clues to appetite control by tracing the actions of insulin and leptin. Both hormones are involved in maintaining the body’s energy balance, and both are known to inhibit AgRP. “Surprisingly, blocking either the insulin or leptin signaling pathway has little effect on appetite,” says Dr. Accili. “We hypothesized that both pathways have to be blocked simultaneously in order to influence feeding behavior.”

To test their hypothesis, the researchers created a strain of mice whose AgRP neurons lack a protein that is integral to both insulin and leptin signaling. As the researchers hypothesized, removing this protein — FoxO1 — had a profound effect on the animals’ appetite. “Mice that lack FoxO1 ate less and were leaner than normal mice,” said lead author Hongxia Ren, PhD, associate research scientist in Medicine. “In addition, the FoxO1-deficient mice had better glucose balance and leptin and insulin sensitivity — all signs of a healthier metabolism.”

Since FoxO1 is a poor drug target, the researchers searched for other ways to inhibit the action of this protein. Using gene-expression profiling, they found a gene that is highly expressed in mice with normal AgRP neurons but is effectively silenced in mice with FoxO1-deficient neurons. That gene is Gpr17 (for G-protein coupled receptor 17), which produces a cell-surface receptor called Gpr17.

To confirm that the receptor is involved in appetite control, the researchers injected a Gpr17 activator into normal mice, and their appetite increased. Conversely, when the mice were given a Gpr17 inhibitor, their appetite decreased. Similar injections had no effect on FoxO1-deficient mice.

According to Dr. Accili, there are several reasons why Gpr17, which is also found in humans, would be a good target for anti-obesity medications. Since Gpr17 is part of the so-called G-protein-coupled receptor family, it is highly druggable. About a third of all existing drugs work through G-protein-coupled receptors. In addition, the receptor is abundant in AgRP neurons but not in other neurons, which should limit unwanted drug side effects.

Source: Science Daily

Jun 7, 2012
#science #neuroscience #brain #psychology #obesity
Wider Letter Spacing Helps Dyslexic Children

ScienceDaily (June 7, 2012) — Increasing the spacing between characters and words in a text improves the speed and quality of dyslexic children’s reading, without prior training. They read 20% faster on average and make half as many errors. This is the conclusion reached by a French-Italian research team, jointly headed by Johannes Ziegler of the Laboratoire de Psychologie Cognitive (CNRS/Aix-Marseille Université).

image

Increasing the spacing between characters and words in a text improves the speed and quality of dyslexic children’s reading, without prior training. (Credit: © Johannes Ziegler, courtesy CNRS)

These results were published 4 June 2012 in the Proceedings of the National Academy of Sciences (PNAS). In parallel, the team has developed an iPad/iPhone application, available under the name “DYS.” It allows both parents and children to modify the spacing between letters and thus test the benefits of these changes on reading. This will enable researchers to collect large-scale, real-time data, which they will then analyze and study.

Dyslexia is a learning disability that impairs an individual’s capacity to read and is linked to difficulty in identifying letters, syllables and words — despite suitable schooling and in the absence of intellectual or sensory deficiencies. Dyslexia, which often causes writing problems, affects on average one child in every class and 5% of the world’s population.

In this study, the researchers tested the effects of letter spacing on the reading ability of 54 dyslexic Italian and 40 dyslexic French children aged between 8 and 14 years. The children had to read a text composed of 24 sentences, in which the spacing was either normal or wider than usual. The results showed that wider spacing enabled the children to improve their reading both in terms of speed and precision. On average, they read 20% faster and made half as many errors. This progress could stem from the fact that dyslexic children are particularly sensitive to “perceptual crowding,” in other words the visual masking of each individual letter by those surrounding it. The results of this study show that this crowding effect may be reduced by spacing letters apart.
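The manipulation the study tested is simple to reproduce in code. The sketch below widens the gaps between letters and between words in a string; the gap sizes are illustrative placeholders, not the exact typographic values the researchers used:

```python
def widen_spacing(text: str, letter_gap: int = 2, word_gap: int = 6) -> str:
    """Insert extra spaces between letters and between words.

    Mimics the wide-spacing reading condition described in the article;
    letter_gap and word_gap are arbitrary illustrative values.
    """
    # Join the characters of each word with letter_gap spaces,
    # then join the words with word_gap spaces.
    spaced_words = [(" " * letter_gap).join(word) for word in text.split()]
    return (" " * word_gap).join(spaced_words)

print(widen_spacing("the cat sat"))
```

In practice the app presumably also enlarges line spacing and controls the font, but the core idea — reducing “perceptual crowding” by pushing neighboring letters apart — is just this transformation.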

This finding opens up interesting prospects for dyslexia treatment techniques. Indeed, reading better means reading more — yet it takes one year for a dyslexic child to read what a “normal reader” reads in two days. This is because reading can be “torture” for dyslexic children, whose decoding difficulties cause them to stumble, putting them off reading on a regular basis. The researchers have found a simple and efficient “trick” that helps these children break the vicious circle and correctly read more words in less time.

An iPad/iPhone application known as “DYS” has been developed in parallel with these research results by Stéphane Dufau, CNRS research engineer at the Laboratoire de Psychologie Cognitive. Available initially in French and English and downloadable free of charge from the Apple App Store, it enables both parents and children to adjust the spacing between letters and to test the benefits of such modifications on reading. The researchers, for their part, hope to be able to collect large-scale data that will allow them to quantify and analyze whether optimal spacing exists as a function of the subject’s age and reading level.

Download available: http://itunes.apple.com/us/app/dys-help-people-with-dyslexia/id529867852?mt=8

Source: Science Daily

Jun 7, 2012
#science #neuroscience #brain #psychology #dyslexia
Study Links PTSD to Hidden Head Injuries Suffered in Combat

ScienceDaily (June 6, 2012) — Even when brain injury is so subtle that it can only be detected by an ultra-sensitive imaging test, the injury might predispose soldiers in combat to post-traumatic stress disorder, according to a University of Rochester Medical Center study.

The research is important for physicians who are caring for troops in the years following deployment, as they try to untangle the symptom overlap between PTSD and mild traumatic brain injury (mild TBI) and provide the appropriate treatment. Until now, the nature of the interaction between TBI and PTSD was unclear. URMC researchers believe they are the first to find an association that can be demonstrated with advanced imaging techniques.

The study is published online by the Journal of Head Trauma Rehabilitation.

"Most people believe that, to a large extent, chronic stress from intense combat experiences triggers PTSD. Our study adds more information by suggesting that a physical force such as exposure to a bomb blast also may play a role in the genesis of the syndrome," said lead author Jeffrey J. Bazarian, M.D., M.P.H., associate professor of Emergency Medicine at URMC, and a member of the 2007 Institute of Medicine committee that investigated brain injuries among war veterans.

By 2008 it was estimated that 320,000 troops suffered concussions in Iraq and Afghanistan. Bazarian’s research involved 52 war veterans from western New York who served in combat areas between 2001 and 2008. Approximately four years after their final tour of duty, researchers asked each veteran about PTSD symptoms, blast exposures, mild concussions, and combat experiences.

Researchers measured combat stress in the study participants with a standard Walter Reed Institute of Research Combat Experiences Survey, which asks about the intensity of deployment duties (such as handling or uncovering remains), exposure to explosive devices, vehicle accidents, falls or assaults, and events such as being ambushed or knowing someone who was seriously injured or killed. The investigators also performed standard MRI testing, as well as a more sensitive test called diffusion tensor imaging, or DTI. The latter has been used to detect axonal injury, a type of neuronal damage that occurs during a concussion.

Results showed that 30 of the 52 New York veterans suffered at least one mild traumatic brain injury, and seven reported having more than one. Sixty percent of the veterans were exposed to one or more explosive blasts.

All 52 veterans had one or more PTSD symptoms, and 15 met the formal criteria for PTSD, which is a devastating psychiatric illness. The severity of veterans’ PTSD symptoms correlated with the amount of axonal injury seen on the DTI scans.
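The article does not specify which statistic the team used, but the standard way to quantify such a relationship is a Pearson correlation coefficient. The sketch below computes it from scratch on entirely synthetic numbers (a hypothetical DTI injury index against hypothetical symptom scores), purely to illustrate what “severity correlated with axonal injury” means quantitatively:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Synthetic, made-up values -- NOT data from the study.
axonal_injury = [0.1, 0.3, 0.2, 0.5, 0.4]   # hypothetical DTI injury index
ptsd_score    = [15, 30, 28, 45, 33]        # hypothetical symptom severity

print(round(pearson_r(axonal_injury, ptsd_score), 2))
```

A value near +1 would indicate that veterans with more axonal injury on DTI also reported more severe PTSD symptoms, which is the pattern the study describes.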

In addition, five of the 52 veterans showed abnormalities on standard MRI scans, and their PTSD severity was much worse than that of the 46 veterans with normal MRIs.

Interestingly, PTSD severity did not correlate with the clinical diagnosis of mild TBI. This suggests that subtle brain injury can occur without producing the loss of consciousness or amnesia that is typically associated with diagnosis of mild TBI, and that this injury may make a person more vulnerable to psychiatric illness when coupled with extreme chronic stress.

"Based on our results, it looks like the only way to detect this injury is with DTI/MRI," Bazarian said. "While it may not be feasible due to costs and limited availability of some neuro-imaging tests to screen thousands of service members for brain injury, our study highlights the pressing need to develop simpler tests that are accurate and practical, that correlate with brain injury."

Source: Science Daily

Jun 7, 2012
#science #neuroscience #brain
Fish Show Autism-Like Gene Expression in Water With Psychoactive Pharmaceuticals

ScienceDaily (June 6, 2012) — Psychoactive medications in water affect the gene expression profiles of fathead minnows in a way that mimics the gene expression patterns associated with autism spectrum disorder in genetically susceptible humans, according to research published June 6 in the open access journal PLoS ONE. These results suggest a potential environmental trigger for autism spectrum disorder in this vulnerable population, the authors write.

The researchers, led by Michael A. Thomas of Idaho State University, exposed the fish to three psychoactive pharmaceuticals — fluoxetine, a selective serotonin reuptake inhibitor, or SSRI; venlafaxine, a serotonin-norepinephrine reuptake inhibitor; and carbamazepine, used to control seizures — at concentrations comparable to the highest estimated environmental levels.

They found that the only gene expression patterns affected were those associated with idiopathic autism spectrum disorders, caused by genetic susceptibility interacting with unknown environmental triggers. These results suggest that exposure to environmental psychoactive pharmaceuticals may play a role in the development of autism spectrum disorder in genetically predisposed individuals.

Lead researcher Michael A. Thomas remarks, “While others have envisioned a causal role for psychotropic drugs in idiopathic autism, we were astonished to find evidence that this might occur at very low dosages, such as those found in aquatic systems.”

Source: Science Daily

Jun 7, 2012
#science #neuroscience #psychology #autism
Variations in Sex Steroid Gene Expression Can Predict Aggressive Behaviors, Bird Study Shows

ScienceDaily (June 6, 2012) — An Indiana University biologist has shown that natural variation in measures of the brain’s ability to process steroid hormones predicts functional variation in aggressive behavior.

image

Researchers studied the behaviors of free-living dark-eyed juncos during breeding season to measure variations in aggressiveness. (Credit: Image courtesy of Indiana University)

The new work led by Kimberly A. Rosvall, a postdoctoral fellow and assistant research scientist in the IU Bloomington College of Arts and Sciences’ Department of Biology, has found strong and significant relationships between aggressive behavior in free-living birds and the abundance of messenger RNA in behaviorally relevant brain areas for three major sex steroid processing molecules: androgen receptor, estrogen receptor and aromatase.

"Individual variation is the raw material of evolution, and in this study we report that free-living birds vary in aggression and that more aggressive individuals express higher levels of genes related to testosterone processing in the brain," she said. "We’ve long hypothesized that the brain’s ability to process steroids may account for individual differences in hormone-mediated behaviors, but direct demonstrations are rare, particularly in unmanipulated or free-living animals."

Rosvall said the study shows that aggression is strongly predicted by individual variation in gene expression of the molecules that initiate the genomic effects of testosterone. The new work, “Neural sensitivity to sex steroids predicts individual differences in aggression: implications for behavioral evolution,” was published June 6 in Proceedings of The Royal Society B.

The findings are among the first to show that individual variation in neural gene expression for three major sex steroid processing molecules predicts individual variation in aggressiveness in both sexes in nature, results that should have broad implications for understanding the mechanisms by which aggressive behavior may evolve.

"On the one hand, we have lots of evidence to suggest that testosterone is important in the evolution of all kinds of traits," Rosvall noted. "On the other hand, we know that individual variation is a requirement for natural selection, but individual variation in testosterone does not always predict behavior. This conundrum has led to debate among researchers about how hormone-mediated traits evolve."

To find such strong relationships between behavior and individual variation in the expression of genes related to hormone-processing is exciting because it tells scientists that evolution could shape behavior via changes in the expression of these genes, as well as via changes in testosterone levels themselves.

The team measured natural variation in aggressiveness toward same-sex intruders in male and female free-living dark-eyed juncos (Junco hyemalis) early in the breeding season. The dark-eyed junco is a North American sparrow that is well studied with respect to hormones, behavior and sex differences. By comparing individual differences in aggressiveness (flyovers or songs directed at intruders) to circulating levels of testosterone and to neural gene expression for the three major sex steroid processing molecules, the researchers were able to quantify measures of sensitivity to testosterone in socially relevant brain areas: the hypothalamus, the ventromedial telencephalon and the right posterior telencephalon.

Their results suggest selection could shape the evolution of aggression through changes in the expression of androgen receptor, estrogen receptor and aromatase in both males and females, to some degree independently of circulating levels of testosterone. They found, for example, that males that sing more songs at an intruder have more mRNA for aromatase and estrogen receptor in the posterior telencephalon, and also that males and females that dive-bomb an intruder more frequently have more androgen receptor, estrogen receptor and aromatase mRNA in brain tissues including the medial amygdala, an area of the brain that’s known to control aggression in rodents and other birds. mRNA are single-stranded copies of genes that are translated into protein molecules.

The work reveals there is ample variation in hormone signal and in gene expression on which selection may act to affect aggressiveness. It also establishes a prerequisite for the evolution of testosterone-mediated characteristics through changes in localized gene expression for the key molecules that process sex steroids, and suggests that trait evolution can occur with some degree of independence from circulating testosterone levels.

"Researchers have thought this was probably the case for about a hundred years, based on a lot of really important work that uses experimental manipulations like castration or hormone replacement," Rosvall said. "But very few people have looked to see if individuals actually do vary in expression of these genes, and whether this individual variation means anything, in terms of an animal’s behavior. Our work shows that it does."

The new insights into how neuroendocrine mechanisms of aggression may be modified as populations diverge into species also offer opportunities for future research, including trying to determine whether genes that are up- or down-regulated in response to environmental stimuli may be the same genes that contribute to the evolution of certain traits and characteristics.

Source: Science Daily

Jun 7, 2012
#science #neuroscience #brain #psychology
This Is Your Brain On No Self-Control

ScienceDaily (June 6, 2012) — New pictures from the University of Iowa show what it looks like when a person runs out of patience and loses self-control.

image

Brain activity when people exert self-control. (Credit: Image courtesy of University of Iowa)

A study by University of Iowa neuroscientist and neuro-marketing expert William Hedgcock confirms previous studies that show self-control is a finite commodity that is depleted by use. Once the pool has dried up, we’re less likely to keep our cool the next time we’re faced with a situation that requires self-control.

But Hedgcock’s study is the first to actually show it happening in the brain using fMRI images that scan people as they perform self-control tasks. The images show the anterior cingulate cortex (ACC) — the part of the brain that recognizes a situation in which self-control is needed and says, “Heads up, there are multiple responses to this situation and some might not be good” — fires with equal intensity throughout the task.

However, the dorsolateral prefrontal cortex (DLPFC) — the part of the brain that manages self-control and says, “I really want to do the dumb thing, but I should overcome that impulse and do the smart thing” — fires with less intensity after prior exertion of self-control.

He said that loss of activity in the DLPFC might be the person’s self-control draining away. The stable activity in the ACC suggests people have no problem recognizing a temptation. Although they keep fighting, they have a harder and harder time not giving in.

That would explain why someone who works very hard not to take seconds of lasagna at dinner winds up taking two pieces of cake at dessert. The study could also modify previous thinking that considered self-control to be like a muscle. Hedgcock says his images seem to suggest that it’s like a pool that can be drained by use, then replenished through time in a lower-conflict environment, away from temptations that require its use.

The researchers gathered their images by placing subjects in an MRI scanner and then had them perform two self-control tasks — the first involved ignoring words that flashed on a computer screen, while the second involved choosing preferred options. The study found the subjects had a harder time exerting self-control on the second task, a phenomenon called “regulatory depletion.” Hedgcock says that the subjects’ DLPFCs were less active during the second self-control task, suggesting it was harder for the subjects to overcome their initial response.

Hedgcock says the study is an important step in trying to determine a clearer definition of self-control and to figure out why people do things they know aren’t good for them. One possible implication is crafting better programs to help people who are trying to break addictions to things like food, shopping, drugs, or alcohol. Some therapies now help people break addictions by focusing on the conflict recognition stage and encouraging the person to avoid situations where that conflict arises. For instance, an alcoholic should stay away from places where alcohol is served.

But Hedgcock says his study suggests new therapies might be designed by focusing on the implementation stage instead. For instance, he says dieters sometimes offer to pay a friend if they fail to implement control by eating too much food, or the wrong kind of food. That penalty adds a real consequence to their failure to implement control and increases their odds of choosing a healthier alternative.

The study might also help people who suffer from a loss of self-control due to birth defect or brain injury.

"If we know why people are losing self-control, it helps us design better interventions to help them maintain control," says Hedgcock, an assistant professor in the Tippie College of Business marketing department and the UI Graduate College’s Interdisciplinary Graduate Program in Neuroscience.

Source: Science Daily

Jun 7, 2012
#science #neuroscience #brain #psychology
Stress may delay brain development in early years

June 6, 2012 by Chris Barncard

Stress may affect brain development in children — altering growth of a specific piece of the brain and abilities associated with it — according to researchers at the University of Wisconsin–Madison.

"There has been a lot of work in animals linking both acute and chronic stress to changes in a part of the brain called the prefrontal cortex, which is involved in complex cognitive abilities like holding on to important information for quick recall and use,” says Jamie Hanson, a UW–Madison psychology graduate student. “We have now found similar associations in humans, and found that more exposure to stress is related to more issues with certain kinds of cognitive processes.”

Children who had experienced more intense and lasting stressful events in their lives posted lower scores on tests of what the researchers refer to as spatial working memory. They had more trouble navigating tests of short-term memory such as finding a token in a series of boxes, according to the study, which will be published in the June 6 issue of the Journal of Neuroscience.
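To make the token-search task concrete, here is a toy simulation — a deliberately simplified analogue of the test described above, not the actual instrument, with all names and parameters invented. A token hides in one of several boxes; once found, a new token hides in a box not yet used; reopening a box where a token was already found counts as a working-memory error:

```python
import random

def run_token_search(n_boxes: int = 6, seed: int = 0) -> int:
    """Simulate a memoryless searcher on a token-search task.

    Returns the number of errors (revisits to boxes where a token
    was already found). Purely illustrative of the task structure.
    """
    rng = random.Random(seed)
    remaining = list(range(n_boxes))  # boxes that may still hide a token
    found = set()                     # boxes where tokens were already found
    errors = 0
    while remaining:
        token = rng.choice(remaining)     # hide the next token
        while True:
            pick = rng.randrange(n_boxes)
            if pick in found:             # forgot this box was already used
                errors += 1
            elif pick == token:           # token found; it won't reappear here
                found.add(pick)
                remaining.remove(pick)
                break
    return errors

print(run_token_search())
```

A child with intact spatial working memory avoids the boxes in `found` and scores near zero; the random searcher above, which remembers nothing, racks up many errors — the kind of deficit the stressed children in the study showed to a milder degree.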

Brain scans revealed that the anterior cingulate, a portion of the prefrontal cortex believed to play key roles in spatial working memory, takes up less space in children with greater exposure to very stressful situations.

"These are subtle differences, but differences related to important cognitive abilities," Hanson says.

But they may not be irreversible differences.

"We’re not trying to argue that stress permanently scars your brain. We don’t know if and how it is that stress affects the brain," Hanson says. "We only have a snapshot — one MRI scan of each subject — and at this point we don’t understand whether this is just a delay in development or a lasting difference. It could be that, because the brain is very plastic, very able to change, children who have experienced a great deal of stress catch up in these areas."

The researchers determined stress levels through interviews with children ages 9 to 14 and their parents. The research team, which included UW–Madison psychology professors Richard Davidson and Seth Pollak and their labs, collected expansive biographies of stressful events from slight to severe.

"Instead of focusing in on one specific type of stress, we tried to look at a range of stressors," Hanson says. "We wanted to know as much as we could, and then use all this information later to get an idea of how challenging and chronic and intense each experience was for the child."

Interestingly, there was little correlation between cumulative life stress and age. That is, children who had several more years of life in which to experience stressful episodes were no more likely than their younger peers to have accumulated a lengthy stress resume. Puberty, on the other hand, typically went hand-in-hand with heavier doses of stress.

The researchers, whose work was funded by the National Institutes of Health, also took note of changes in brain tissue known as white matter and gray matter. In the important brain areas that varied in volume with stress, the white and gray matter volumes were lower in tandem.

White matter, Hanson explained, is like the long-distance wiring of the brain. It connects separated parts of the brain so that they can share information. Gray matter “does the math,” Hanson says. “It takes care of the processing, using the information that gets shared along the white matter connections.”

Gray matter early in development appears to enable flexibility; children can play and excel at many different activities. But as kids age and specialize, gray matter thins. It begins to be “pruned” after puberty, while the amount of white matter grows into adulthood.

"For both gray and white matter, we actually see smaller volumes associated with high stress," Hanson says. "Those kinds of effects across different kinds of tissue, those are the things we would like to study over longer periods of time. Understanding how these areas change can give you a better picture of whether this is just a delay in development or more lasting."

More study could also show the researchers how to help children who have experienced an inordinate amount of stress.

"There are groups around the country doing working memory interventions to try to train or retrain people on this particular cognitive ability and improve performance," Hanson says. "Understanding if and how stress affects these processes could help us know whether there may be similar interventions that could aid children living in stressful conditions, and how this may affect the brain.”

Provided by University of Wisconsin-Madison

Source: medicalxpress.com

Jun 7, 2012
#science #neuroscience #brain #psychology #stress
Brain cell activity imbalance may account for seizure susceptibility in Angelman syndrome

June 6, 2012

New research by scientists at the University of North Carolina School of Medicine may have pinpointed an underlying cause of the seizures that affect 90 percent of people with Angelman syndrome (AS), a neurodevelopmental disorder.

image

This image shows inhibitory neurons (red) and cell bodies (blue) in the cerebral cortex of an Angelman syndrome model mouse. Credit: Philpot Lab, UNC School of Medicine

Published online Thursday June 7, 2012 in the journal Neuron, researchers led by Benjamin D. Philpot, PhD, professor of cell and molecular physiology at UNC, describe how seizures in individuals with AS could be linked to an imbalance in the activity of specific types of brain cells.

"Our study indicates that a common abnormality that may apply to many neurodevelopmental disorders is an imbalance between neuronal excitation and inhibition," Philpot said. This imbalance has been observed in several genetic disorders including Fragile X and Rett syndromes, both of these, like AS, can be associated with autism.

Angelman syndrome occurs in one in 15,000 live births. The syndrome often is misdiagnosed as cerebral palsy or autism. Its characteristics, along with seizures, include cognitive delay, severe intellectual disability, lack of speech (minimal or no use of words), sleep disturbance, hand flapping and motor and balance disorders.

The most common genetic defect of the syndrome is the lack of expression of the maternally inherited allele of gene UBE3A on chromosome 15.

This loss of gene function in AS animal models has been linked to decreased release of an excitatory neurotransmitter, which normally increases the activity of other neurons. But that seems at odds with the high seizure activity observed in AS patients. The new study may clarify this issue.

In his lab in UNC’s Neuroscience Research Center, Philpot and graduate student Michael L. Wallace, the study’s first author, explored the neurocircuitry of an Angelman syndrome mouse model. These mice show behavioral features similar to humans with AS, including seizures.

The researchers used electrophysiological methods to record excitatory and inhibitory activity from individual neurons. These involved highly precise recording electrodes whose microscopic tips attach to individual neurons. “In this way you can record from precise neuron types and tell which neuron you’re recording from and what its activity is,” explained Philpot.

"You can stimulate it to drive other neurons and also record the activity on other neurons onto it."

The researchers found that the chemical messages sent from inhibitory neurons, signals meant to stop excitatory neurons from increasing their activity, were defective.

In addition, they found that AS model mice have a defect in their inhibitory neurons which decreases their ability to recover from high levels of activity. “One of the reasons why inhibition is so important is that it’s needed to ensure that brain activity is regulated,” Philpot said. “Inhibition plays an important role in timing of information transfer between neurons, and if the timing is messed up, as you might observe if you had a decrease in inhibition, then a lot of information is lost in that transfer.”

"We found a disproportionately large decrease in inhibition to excitation," Wallace said. "We think that the circuit we investigated is in a hyperexcitable state and may be underlying some of the epileptic problems observed in the AS animal model. This improperly regulated brain activity might also underlie cognitive impairments in AS.”

Philpot says one of their goals is to understand exactly how these changes in the connections between neurons underlie seizures in AS. “A very long term goal is to try to get better treatments for these individuals because their epilepsy is very hard to treat.”

Provided by University of North Carolina Health Care

Source: medicalxpress.com

Jun 6, 2012 · 3 notes
#science #neuroscience #psychology #brain
Statistical Model Attempting to Estimate Level of Alcohol Consumption That Is 'Optimal' for Health

ScienceDaily (June 6, 2012) — Cutting the amount we drink to just over half a unit a day could save 4,600 lives a year in England, according to a modelling study by Oxford University researchers published in the journal BMJ Open.

image

Half a unit of alcohol is as little as a quarter of a glass of wine, or a quarter of a pint. (Credit: © G.G. Lattek / Fotolia)

Scientists have carried out a complex analysis in an attempt to determine the “optimal” level of alcohol consumption that is associated with the lowest rates of chronic disease in the UK. They conclude that an intake of about one-half of a typical drink per day would result in the healthiest outcomes, and argue that the recommended alcohol intake for the UK should be reduced from the current advised level.

Half a unit of alcohol is as little as a quarter of a glass of wine, or a quarter of a pint. That’s much lower than current government recommendations of 3-4 units a day for men and 2-3 units for women.

The researchers set out to find the optimum daily amount of alcohol that would see fewest deaths across England from a whole range of diseases connected to drink. Previous studies have often looked at the separate effects of alcohol on heart disease, liver disease or cancers in isolation.

'Although there is good evidence that moderate alcohol consumption protects against heart disease, when all of the chronic disease risks are balanced against each other, the optimal consumption level is much lower than many people believe,' says lead author Dr Melanie Nichols of the BHF Health Promotion Research Group in the Department of Public Health at Oxford University.

The team used a mathematical model to assess what impact changing average alcohol consumption would have on deaths from 11 conditions known to be at least partially linked to drink.

These included coronary heart disease, stroke, high blood pressure, diabetes, cirrhosis of the liver, epilepsy, and five cancers. Over 170,000 people in England died from these 11 conditions in 2006, and ill health linked to alcohol is estimated to cost the NHS in England £3.3 billion every year.

The researchers used information from the 2006 General Household Survey on levels of alcohol consumption among adults in England. They combined this with the disease risks for differing levels of alcohol consumption as established in large analyses of published research.

They found that just over half a unit of alcohol a day was the optimal level of consumption among current drinkers.

They calculate that this level of drinking would prevent around 4,579 premature deaths, or around 3% of all deaths from the 11 conditions.

The number of deaths from heart disease would increase by 843, but this would be more than offset by around 2,600 fewer cancer deaths and almost 3,000 fewer liver cirrhosis deaths.
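The figures quoted above imply simple bookkeeping: the net number of deaths prevented is the reduction in cancer and cirrhosis deaths minus the additional heart-disease deaths, with the unlisted conditions (stroke, diabetes, epilepsy and so on) making up the remainder. A minimal sketch of that arithmetic, using only the rounded numbers in the article:

```python
# Net premature deaths prevented at the "optimal" intake, from the
# article's quoted figures. The other conditions in the model are not
# itemised here, so this is only the balance of the three named ones.
extra_heart_deaths = 843        # heart-disease deaths would increase
fewer_cancer_deaths = 2600      # "around 2,600 fewer cancer deaths"
fewer_cirrhosis_deaths = 3000   # "almost 3,000 fewer liver cirrhosis deaths"

net_from_quoted = fewer_cancer_deaths + fewer_cirrhosis_deaths - extra_heart_deaths
print(net_from_quoted)  # 4757, close to the 4,579 total once the
                        # remaining conditions are accounted for
```

The small gap between 4,757 and the study's 4,579 reflects the eight conditions whose changes the article does not quote individually.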

'Moderating your alcohol consumption overall, and avoiding heavy-drinking episodes, is one of several things, alongside a healthy diet and regular physical activity, that you can do to reduce your risk of dying early of chronic diseases,' says Dr Nichols.

She adds: ‘We are not telling people what to do, we are just giving them the best balanced information about the different health effects of alcohol consumption, so that they can make an informed decision about how much to drink.

'People who justify their drinking with the idea that it is good for heart disease should also consider how alcohol is increasing their risk of other chronic diseases. A couple of pints or a couple of glasses of wine per day is not a healthy option.'

Although this study in BMJ Open did not look at patterns of drinking, Dr Nichols says: ‘Regardless of your average intake, if you want to have the best possible health, it is also very important to avoid episodes of heavy drinking (“binge drinking”) as there is very clear evidence that this will increase your risks of many diseases, as well as your risk of injuries.’

Source: Science Daily

Jun 6, 2012 · 4 notes
#science #neuroscience #psychology #alcohol
Using rabies virus, researcher tracks inputs to dopamine neurons

June 6, 2012

A genetically-modified version of the rabies virus is helping scientists at Harvard to trace neural pathways in the brain, a research effort that could one day lead to treatments for Parkinson’s disease and addiction.

As described in a paper published on June 7 in the journal Neuron, a team of researchers led by Associate Professor of Molecular and Cellular Biology Naoshige Uchida used the virus to create the first-ever comprehensive list of inputs that connect directly to dopamine neurons in two regions of the brain, the ventral tegmental area (VTA), known for processing reward, and the substantia nigra (SNc), known for motor control.

"You may be familiar with the term connectome," Uchida explained. "The basic idea is we want to understand the brain in terms of connectivity and the various cell types. In this case, we’re examining long-range connections; that is, how other parts of the brain connect directly to dopamine neurons.

Dopamine neurons are thought to be important for processing reward and regulating motor output.

"By understanding their inputs, we might be able to better understand how the function of dopamine neurons is regulated, and, in turn, how addiction happens, and how Parkinson’s disease and other motor-control disorders are affected by problems with dopamine neurons,” Uchida continued. “And because this application provides us with very quantitative data, it’s possible that this is a technique that might be useful in attacking the causes of those diseases.”

Creating that connectivity diagram, however, is anything but easy.

While both the VTA and SNc are known to have high concentrations of dopamine neurons, Uchida chose to examine both areas because the cells in the two regions fire differently.

"We wanted to know what the difference was, generally," Uchida said. "That’s why we compared the inputs to both structures. Based on how other neurons are connected there, we can start to explain why these two regions of the brain do different things."

The challenge, however, is that dopamine neurons are packed into relatively small regions with several other cell types. To ensure they were only observing dopamine neurons, researchers turned to an organism more typically known for damaging neurons – the rabies virus.

Before they infect genetically-engineered mice with the rabies virus, however, they first inject the animals with a pair of “helper” viruses. The first causes dopamine neurons to produce a receptor protein, meaning the rabies virus can only infect dopamine neurons, while the second restores the virus’ ability to “hop” from one neuron to another.

The mice are then infected with a version of the rabies virus that has been genetically-modified to produce a fluorescent protein, allowing researchers to track the virus as it binds with dopamine neurons, and then jumps to the cells that link directly to those neurons.

The results, as depicted in images of a mouse’s brain showing the wealth of connections to dopamine neurons, show that a number of brain regions – including some previously unknown areas – are connected to dopamine neurons.

"We found some new connections, and we found some that we suspected were there, but that were not well understood," Uchida said. "For example, we found that there are connection between the motor cortex and the SNc, which may be related to SNc dopamine neurons’ role in motor control.

"Other connections, though, were more intriguing," he continued. "We found that the subthalmic nucleus preferentially connects to SNc neurons – that’s particularly important because that region is a popular target for deep brain stimulation as a treatment for Parkinson’s."

Often used as a treatment for Parkinson’s and a variety of other disorders, deep brain stimulation involves implanting a device, called a brain pacemaker, into a patient’s brain. The device then electrically stimulates specific regions of the brain, helping to mitigate symptoms of the disease.

"The mechanism for why deep brain stimulation works is not completely understood," Uchida said. "There was speculation that it might have been inhibiting neurons in the subthalmic nucleus, but our findings suggest, because there is a direct connection between those neurons and dopamine neurons in the SNc, that it is actually activating those neurons. I don’t think this explains the entire mechanism for why deep brain stimulation works, but this may be part of it.”

"This work also offers us a roadmap for other areas we might investigate, so now we can target those areas and record from them," Uchida added. "This is a critical step for future investigations."

Provided by Harvard University

Source: medicalxpress.com

Jun 6, 2012 · 14 notes
#science #neuroscience #brain #psychology #dopamine
Research shows mice brains are 'very wired up' at birth, suggests experience selects which connections to keep

June 6, 2012

Ask the average person on the street how the brain develops, and they’ll likely tell you that the brain’s wiring is built as newborns first begin to experience the world. With more experience, those connections are strengthened, and new branches are built as they learn and grow.

A new study conducted in a Harvard lab, however, suggests that just the opposite is true.

As reported on June 7 in the journal Neuron, a team of researchers led by Jeff Lichtman, the Jeremy R. Knowles Professor of Molecular and Cellular Biology, has found that just days before birth mice undergo an explosion of neuromuscular branching. At birth, the research showed, some muscle fibers are contacted by as many as 10 nerve cells. Within days, however, all but one of those connections had been pruned away.

"By the time mammals – and humans would certainly be included – are first coming into the world, when they can do almost nothing, the brain is probably very wired up," Lichtman said. "Through experience, the brain works to select, out of this mass of possible circuits, a very small subset…and everything else that could have been there is gone.

"I don’t think anyone suspected that this was taking place – I certainly didn’t," he continued. "In some simple muscles, every nerve cell branches out and contacts every muscle fiber. That is, the wiring diagram is as diffuse as possible. But by the end, only two weeks later, every muscle fiber is the lifelong partner of a single nerve cell, and 90 percent of the wires have disappeared."

Though researchers, including Lichtman, had shown as early as the 1970s that mice undergo an early developmental period in which target cells, including muscle fibers and some neurons, are contacted by multiple nerve cells before being reduced to a single connection, those early studies and his current work were hampered by the same problem: technological challenges make it difficult to identify individual nerve cells at earlier and earlier stages of life.

And though the use of mice that have been genetically-engineered to express fluorescent protein molecules in nerve cells has made it easier for researchers to identify nerve cells, it remains challenging to study early stages of development because the fluorescent labeling in the finest nerve cell wires often becomes so weak as to be invisible.

Read More →

Jun 6, 2012 · 12 notes
#science #neuroscience #brain #psychology #neuron
The Real Culprit Behind Hardened Arteries? Stem Cells, Says Landmark Study

ScienceDaily (June 6, 2012) — One of the top suspects behind killer vascular diseases is the victim of mistaken identity, according to researchers from the University of California, Berkeley, who used genetic tracing to help hunt down the real culprit.

image

Within the walls of blood vessels are smooth muscle cells and newly discovered vascular stem cells. The stem cells are multipotent and are not only able to differentiate into smooth muscle cells, but also into fat, cartilage and bone cells. UC Berkeley researchers provide evidence that the stem cells are contributing to clogged and hardened arteries. (Credit: Song Li illustration)

The guilty party is not the smooth muscle cells within blood vessel walls, which for decades were thought to combine with the cholesterol and fat that can clog arteries. Blocked vessels can eventually lead to heart attacks and strokes, which account for one in three deaths in the United States.

Instead, a previously unknown type of stem cell — a multipotent vascular stem cell — is to blame, and it should now be the focus in the search for new treatments, the scientists report in a new study appearing June 6 in the journal Nature Communications.

"For the first time, we are showing evidence that vascular diseases are actually a kind of stem cell disease," said principal investigator Song Li, professor of bioengineering and a researcher at the Berkeley Stem Cell Center. "This work should revolutionize therapies for vascular diseases because we now know that stem cells rather than smooth muscle cells are the correct therapeutic target."

The finding that a stem cell population contributes to artery-hardening diseases, such as atherosclerosis, provides a promising new direction for future research, the study authors said.

"This is groundbreaking and provocative work, as it challenges existing dogma," said Dr. Deepak Srivastava, who directs cardiovascular and stem cell research at the Gladstone Institutes in San Francisco, and who provided some of the mouse vascular tissues used by the researchers. "Targeting the vascular stem cells rather than the existing smooth muscle in the vessel wall might be much more effective in treating vascular disease."

Read More →

Jun 6, 2012 · 7 notes
#science #neuroscience
Jun 6, 2012 · 694 notes
Alzheimer Plaques in 3D

ScienceDaily (June 6, 2012) — Swiss researchers have succeeded in generating detailed three-dimensional images of the spatial distribution of amyloid plaques in the brains of mice afflicted with Alzheimer’s disease. These plaques are accumulations of small pieces of protein in the brain and are a typical characteristic of Alzheimer’s. The new technique used in the investigations provides an extremely precise research tool for a better understanding of the disease. In the future, scientists hope that it will also provide the basis for a new and reliable diagnosis method.

image

Virtual cut. (Credit: Image courtesy of Paul Scherrer Institut (PSI))

The results were achieved within a joint project of two teams of researchers — one from the Paul Scherrer Institute (PSI) and ETH Zurich, the other from the École Polytechnique Fédérale de Lausanne (EPFL). They have been published in the journal Neuroimage.

Alzheimer’s disease is responsible for about 60% to 80% of all cases of dementia. The disease affects people differently, but the most common initial symptom is difficulty remembering new information, because the disease first affects brain regions involved in forming new memories. Alzheimer’s dementia is characterized by typical brain lesions that spread to other brain regions as the disease progresses. One of these lesions, the so-called amyloid plaque, is composed of accumulated extracellular protein aggregates. These lesions appear early in the course of the disease, and there is great interest in detecting them in patients in order to diagnose the disease or evaluate its progression. Recently, medical imaging methods have been developed and validated for this purpose. These allow the regional amount of amyloid deposits to be measured, but individual plaques cannot be quantified.

The latest results obtained by researchers from the Paul Scherrer Institute (PSI), ETH Zurich and the École Polytechnique Fédérale de Lausanne (EPFL) show that imaging single plaques is feasible under certain conditions. “This achievement could help to advance the development and evaluation of new imaging diagnostic markers for ultimately improving the diagnosis of Alzheimer’s disease,” explains Matthias Cacquevel, one of the authors at EPFL.

Precise plaque distribution in 3D

Using a method known as Phase Contrast Imaging, the researchers were able, within a short time, to make visible the exact three-dimensional distribution of amyloid plaques in the brains of mice with Alzheimer’s. Before this achievement, the only possibility of studying the distribution of amyloid plaques at the single-plaque level was to perform time-consuming studies. “Until now, for such an investigation, the brain had to be cut into slices and the slices coloured so that the plaques became visible,” explains Bernd Pinzer, from the Paul Scherrer Institute, who carried out the investigations. “This process is the gold standard amongst such investigations. It is, however, very time-consuming, as everything has to be done by hand. At the same time, it provides much less information than our new method. Naturally, we compared the results from our new method with those obtained using this traditional method, and they showed excellent agreement.”

As a first concrete result, the researchers determined the distribution of plaques in the brains of a number of mice with different stages of the disease. For each brain, the scientists obtained a three-dimensional image of the overall plaque distribution so that the development of the disease could be followed in detail. With conventional processes, it would hardly have been possible to gather such comprehensive information.

Developments for reliable diagnostic techniques

"One goal is to use the phase contrast technique to help improve imaging methods which make visible the plaques in the brain of a living patient, and thereby allow a reliable diagnosis of Alzheimer’s disease to be made," explains Pinzer. "These methods are under constant development and it is important to compare their results with those achieved using a known and reliable method. Now it will be possible to directly compare the two sets of 3D images of a mouse brain produced both by a diagnostic method and by our phase contrast technique. One of the diagnostic methods available is Positron Emission Tomography (PET), in which special molecules are attached to the plaques and, after some time, emit gamma radiation, which can be ascertained externally."

Although the deposited radiation dose required — which is high, in order to generate the necessary high resolution — prevents measurements being made on living animals at the moment, the method is already an outstanding research tool, which will lead to a better understanding of Alzheimer’s disease. “This tool will allow much more precise studies on how amyloid plaques are distributed,” explains Cacquevel. “The relationship between plaques and the symptoms of the disease is still unclear, and information on how these plaques spread throughout the brain is also missing.”

Comprehensive information from changes in the light

These investigations were carried out at the Swiss Light Source (SLS) at PSI. The SLS generates synchrotron light — X-rays that are very intensive and well focused. The investigation is similar to a conventional X-ray examination — the scientists pass the X-rays through the object under investigation and determine how they have changed on their way. A normal X-ray picture, however, only shows how strongly the light is attenuated by the object; in a sense, it shows the shadow of the object. The problem is that various kinds of soft tissue attenuate X-rays in approximately the same way, which makes it difficult to distinguish between them.

"With the phase contrast method that we are using here, we also take into consideration the fact that different tissues deviate the light slightly from its original direction by a different amount. In physics, this effect is known to generate a so-called X-ray phase shift," explains Marco Stampanoni, Professor of X-Ray Microscopy at the Institute of Biomedical Technology at the ETH Zürich and Project Manager at PSI. The team he is leading built up the measuring station and designed the experiment. "Our instrument is able to measure such subtle shifts very precisely and transform this information into understandable images."

Phase Contrast Imaging for various medical applications

"While we cannot carry out an investigation on patients using the phase contrast method to detect Alzheimer’s disease, we are close to developing diagnostic tools for other diseases," emphasises Stampanoni. "We have already shown, in a pilot study on the imaging of tumours in the female breast, how useful the additional information can be. A first step in the direction of the hospital is the development of a mammography facility, the first prototype of which can be used in a doctor’s practice."

Source: Science Daily

Jun 6, 2012 · 10 notes
#science #neuroscience #brain #psychology #alzheimer
Study reports seizure-freedom in 68 percent of juvenile myoclonic epilepsy patients

June 6, 2012

A 25-year follow-up study reveals that 68% of patients with juvenile myoclonic epilepsy (JME) became seizure-free, with nearly 30% no longer needing antiepileptic drug (AED) treatment. Findings published today in Epilepsia, a journal of the International League Against Epilepsy (ILAE), report that two factors significantly predicted poor long-term seizure outcome: the occurrence of generalized tonic-clonic seizures preceded by bilateral myoclonic seizures, and AED polytherapy.

Patients with JME experience “jerking” of the arms, shoulders, and sometimes the legs. Previous evidence suggests that JME is a common type of epilepsy (accounting for up to 11% of people with epilepsy), occurring more frequently in females than in males, and with onset typically in adolescence. There is still much debate among experts over the long-term outcome of JME, and about which factors predict seizure outcome.

To further investigate JME outcomes and predictive factors, Dr. Felix Schneider and colleagues from the Epilepsy Center at the University of Greifswald in Germany studied data from 12 male and 19 female patients with JME. All participants had a minimum of 25 years follow-up which included review of medical records, and telephone or in-person interviews.

Sixty-eight percent of the 31 JME patients became free of seizures, and 28% discontinued AED treatment due to seizure-freedom. Significant predictors of poor long-term seizure outcome included the occurrence of generalized tonic-clonic seizures (GTCS, formerly known as grand mal seizures), which affect the entire brain, preceded by bilateral myoclonic seizures (abnormal movements on both sides of the body), and a regimen of AED polytherapy.

Researchers also determined that remission of GTCS using AED therapy significantly increased the possibility of complete seizure-freedom. However, once AED therapy is discontinued, the occurrence of photoparoxysmal responses (brain discharges in response to brief flashes of light) significantly predicted an increased risk of seizure recurrence.

"Our findings confirm the feasibility of personalized treatment of the individual JME patient," concludes Dr. Schneider. "Life-long AED therapy is not necessarily required in many patients to maintain seizure freedom. Understanding the predictors for successful long-term seizure outcome will aid clinicians in their treatment options for those with JME.”

Provided by Wiley

Source: medicalxpress.com

Jun 6, 2012 · 1 note
#science #neuroscience #brain #psychology #epilepsy
Anxious Girls' Brains Work Harder

ScienceDaily (June 5, 2012) — In a discovery that could help in the identification and treatment of anxiety disorders, Michigan State University scientists say the brains of anxious girls work much harder than those of boys.

image

This electrode cap was worn by participants in an MSU experiment that measured how people responded to mistakes. Female subjects who identified themselves as big worriers recorded the highest brain activity. (Credit: G.L. Kohuth)

The finding stems from an experiment in which college students performed a relatively simple task while their brain activity was measured by an electrode cap. Only girls who identified themselves as particularly anxious or big worriers recorded high brain activity when they made mistakes during the task.

Jason Moser, lead investigator on the project, said the findings may ultimately help mental health professionals determine which girls may be prone to anxiety problems such as obsessive compulsive disorder or generalized anxiety disorder.

"This may help predict the development of anxiety issues later in life for girls," said Moser, assistant professor of psychology. "It’s one more piece of the puzzle for us to figure out why women in general have more anxiety disorders."

The study, reported in the International Journal of Psychophysiology, is the first to measure the correlation between worrying and error-related brain responses in the sexes using a scientifically viable sample (79 female students, 70 males).

Participants were asked to identify the middle letter in a series of five-letter groups on a computer screen. Sometimes the middle letter was the same as the other four (“FFFFF”) while sometimes it was different (“EEFEE”). Afterward they filled out questionnaires about how much they worry.
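The letter task described here is a version of the classic flanker paradigm: on congruent trials all five letters match, and on incongruent trials the flanking letters conflict with the middle target. As an illustration only (the letters and trial structure below are a sketch, not the study's actual stimulus set), generating such trials might look like:

```python
import random

def make_trial(letters=("E", "F"), congruent=True):
    """Build a five-letter flanker string; the middle letter (index 2) is the target."""
    target = random.choice(letters)
    if congruent:
        flanker = target  # e.g. "FFFFF"
    else:
        # pick a different letter for the flankers, e.g. "EEFEE"
        flanker = random.choice([l for l in letters if l != target])
    return flanker * 2 + target + flanker * 2

random.seed(0)
# Alternate congruent and incongruent trials.
trials = [make_trial(congruent=(i % 2 == 0)) for i in range(4)]
```

Participants respond to the middle letter on each trial; the brain response of interest is the activity recorded when a response turns out to be an error.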

Although the anxious female subjects performed about the same as the males on simple portions of the task, their brains had to work harder at it. Then, as the test became more difficult, the anxious females performed worse, suggesting that worrying got in the way of completing the task, Moser said.

"Anxious girls’ brains have to work harder to perform tasks because they have distracting thoughts and worries," Moser said. "As a result their brains are being kind of burned out by thinking so much, which might set them up for difficulties in school. We already know that anxious kids — and especially anxious girls — have a harder time in some academic subjects such as math."

Currently Moser and other MSU researchers are investigating whether estrogen, a hormone more common in women, may be responsible for the increased brain response. Estrogen is known to affect the release of dopamine, a neurotransmitter that plays a key role in learning and processing mistakes in the front part of the brain.

"This may end up reflecting hormone differences between men and women," Moser said.

In addition to traditional therapies for anxiety, Moser said other ways to potentially reduce worry and improve focus include journaling — or “writing your worries down in a journal rather than letting them stick in your head” — and doing “brain games” designed to improve memory and concentration.

Source: Science Daily

Jun 6, 2012 · 34 notes
#science #neuroscience #brain #psychology #anxiety
Mothers' Teen Cannabinoid Exposure May Increase Response of Offspring to Opiate Drugs

ScienceDaily (June 5, 2012) — Mothers who use marijuana as teens — long before having children — may put their future children at a higher risk of drug abuse, new research suggests.

Researchers in the Neuroscience and Reproductive Biology section at the Cummings School of Veterinary Medicine conducted a study to determine the transgenerational effects of cannabinoid exposure in adolescent female rats. For three days, adolescent rats were administered the cannabinoid receptor agonist WIN 55,212-2, a drug that has effects in the brain similar to those of THC, the active ingredient in marijuana. After this brief exposure, they remained untreated until being mated in adulthood.

The male offspring of the female rats were then measured against a control group for a preference between chambers that were paired with either saline or morphine. The rats whose mothers had adolescent exposure to WIN 55,212-2 were significantly more likely to opt for the morphine-paired chamber than those whose mothers abstained. The results suggest that these animals had an increased preference for opiate drugs.
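Conditioned place preference of this kind is typically quantified as the extra time an animal spends in the drug-paired chamber relative to the saline-paired one. A minimal sketch of that score, with made-up session times (the study reports no raw chamber data, so these numbers are purely illustrative):

```python
def preference_score(drug_chamber_s, saline_chamber_s):
    """CPP score: extra seconds spent in the morphine-paired chamber."""
    return drug_chamber_s - saline_chamber_s

# Hypothetical session times in seconds, not taken from the study.
control = preference_score(drug_chamber_s=420, saline_chamber_s=400)
exposed = preference_score(drug_chamber_s=520, saline_chamber_s=300)
# A larger score for the exposed group would indicate an increased
# preference for the morphine-paired chamber.
```

Group comparisons in the actual study would then test whether the exposed offspring's scores are significantly higher than the controls'.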

The study was published in the Journal of Psychopharmacology and funded by the National Institutes of Health.

"Our main interest lies in determining whether substances commonly used during adolescence can induce behavioral and neurochemical changes that may then influence the development of future generations," said Research Assistant Professor John J. Byrnes, the study’s lead author, "We acknowledge that we are using rodent models, which may not fully translate to the human condition. Nevertheless, the results suggest that maternal drug use, even prior to pregnancy, can impact future offspring."

Byrnes added that much research is needed before a definitive connection is made between adolescent drug use and possible effects on future children.

The study builds on earlier findings by the Tufts group, most notably a study published last year in Behavioral Brain Research by Assistant Professor Elizabeth Byrnes showing that morphine exposure in adolescent female rats induces changes similar to those observed in the present study.

Other investigators in the field have previously reported that cannabinoid exposure during pregnancy (in both rats and humans) can affect offspring development, including impairment of cognitive function and increased risk of depression and anxiety.

Source: Science Daily

Jun 5, 2012 · 23 notes
#science #neuroscience #brain #psychology #marijuana
Noninvasive Genetic Test for Down Syndrome and Edwards Syndrome Highly Accurate

ScienceDaily (June 5, 2012) — Using a noninvasive test on maternal blood that deploys a novel biochemical assay and a new algorithm for analysis, scientists can detect, with a high degree of accuracy, the risk that a fetus has the chromosomal abnormalities that cause Down syndrome and a genetic disorder known as Edwards syndrome. The new approach is more scalable than other recently developed genetic screening tests and has the potential to reduce unnecessary amniocentesis or CVS.

Two studies evaluating this approach are available online in advance of publication in the April issue of the American Journal of Obstetrics & Gynecology (AJOG).

Diagnosis of fetal chromosomal abnormalities, or aneuploidies, relies on invasive testing by chorionic villus sampling or amniocentesis in pregnancies identified as high-risk. Although accurate, the tests are expensive and carry a risk of miscarriage. A technique known as massively parallel shotgun sequencing (MPSS) that analyzes cell-free DNA (cfDNA) from the mother’s plasma for fetal conditions has been used to detect trisomy 21 (T21) pregnancies, those with an extra copy of chromosome 21 that leads to Down syndrome, and trisomy 18 (T18), the chromosomal defect underlying Edwards syndrome. MPSS accurately identifies the conditions by analyzing the entire genome, but it requires a large amount of DNA sequencing, limiting its clinical usefulness.

Scientists at Aria Diagnostics in San Jose, CA developed a novel assay, Digital Analysis of Selected Regions (DANSR™), which sequences loci from only the chromosomes under investigation. The assay requires 10 times less DNA sequencing than MPSS approaches.

In the current study, the researchers report on a novel statistical algorithm, the Fetal-fraction Optimized Risk of Trisomy Evaluation (FORTE™), which considers age-related risks and the percentage of fetal DNA in the sample to provide an individualized risk score for trisomy. Explains author Ken Song, MD, “The higher the fraction of fetal cfDNA, the greater the difference in the number of cfDNA fragments originating from trisomic versus disomic [normal] chromosomes and hence the easier it is to detect trisomy. The FORTE algorithm explicitly accounts for fetal fraction in calculating trisomy risk.”
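The counting logic behind fetal-fraction-aware scoring can be illustrated with a toy model. This is an illustrative sketch only, not Aria's actual DANSR/FORTE implementation; the baseline chromosome-21 fragment fraction, noise level, and age-related prior used below are made-up numbers. The key idea is that under trisomy, the fetal contribution raises the expected share of cfDNA fragments from the affected chromosome by a factor of (1 + f/2), where f is the fetal fraction, so a likelihood ratio against the disomic expectation can be combined with the prior risk:

```python
from statistics import NormalDist

def trisomy_risk(chr21_frac, baseline_frac, sd, fetal_frac, prior):
    """Toy likelihood-ratio risk score for trisomy 21 (NOT the FORTE algorithm).

    Under disomy, the chr21 fragment fraction is modeled as Normal(baseline_frac, sd).
    Under trisomy, the fetus carries 3 copies of chr21 instead of 2, shifting the
    expected fraction by a factor of (1 + fetal_frac / 2).
    """
    disomy = NormalDist(baseline_frac, sd)
    trisomy = NormalDist(baseline_frac * (1 + fetal_frac / 2), sd)
    # Likelihood ratio of the observation under the two hypotheses
    lr = trisomy.pdf(chr21_frac) / disomy.pdf(chr21_frac)
    # Combine with the prior (e.g. age-related) risk via Bayes' rule in odds form
    posterior_odds = lr * prior / (1 - prior)
    return posterior_odds / (1 + posterior_odds)  # posterior probability

# Hypothetical sample: 4% fetal fraction, chr21 normally ~1.30% of fragments.
# A fraction elevated by the expected trisomic shift yields a very high risk score:
risk = trisomy_risk(chr21_frac=0.01326, baseline_frac=0.0130, sd=0.00005,
                    fetal_frac=0.04, prior=1 / 100)
print(f"trisomy 21 risk: {risk:.4f}")
```

This also shows why a higher fetal fraction makes detection easier, as Dr. Song notes: the trisomic shift grows with f while the sequencing noise does not, so the two distributions separate more cleanly.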

To test the performance of the DANSR/FORTE assay, Dr. Song and his colleagues evaluated a set of subjects consisting of 123 normal, 36 T21, and 8 T18 pregnancies. All samples were assigned FORTE odds scores for chromosome 18 and chromosome 21. In a blinded analysis, the combination of DANSR and FORTE correctly identified all 36 cases of T21 and all 8 cases of T18 as having a greater than 99% risk for the respective trisomy, with at least a 1,000-fold separation in risk scores between trisomic and disomic samples.

In a related study, researchers from the Harris Birthright Research Centre for Fetal Medicine, King's College Hospital, University of London and the University College London Hospital, University College London, provided 400 maternal plasma samples to Aria for analysis using the DANSR assay with the FORTE algorithm. The subjects were all at risk for aneuploidies, and they had been tested by chorionic villus sampling. The analysis distinguished all cases of T21 and 98% of T18 cases from euploid pregnancies. In all cases of T21, the estimated risk for this aneuploidy was greater than or equal to 99%, whereas in all normal pregnancies and those with T18, the risk score for T21 was less than or equal to 0.01%.

"Combining the DANSR assay with the FORTE algorithm provides a robust and accurate assessment of fetal trisomy risk," says Dr. Song. "Because DANSR allows analysis of specific genomic regions, it could be potentially used to evaluate genetic conditions other than trisomy. The incorporation of additional risk information, such as from ultrasonography, into the FORTE algorithm warrants investigation."

Kypros H. Nicolaides, MD, senior author of the University of London study, suggests that fetal trisomy evaluation with cfDNA testing will inevitably be introduced into clinical practice. “It would be useful as a secondary test contingent upon the results of a more universally applicable primary method of screening. The extent to which it could be applied as a universal screening tool depends on whether the cost becomes comparable to that of current methods of sonographic and biochemical testing.”

Dr. Nicolaides also notes that the plasma samples were obtained from high-risk pregnancies where there is some evidence of impaired placental function. It would also be necessary to demonstrate that the observed accuracy with cfDNA testing obtained from the investigation of pregnancies at high-risk for aneuploidies is applicable to the general population where the prevalence of fetal trisomy 21 is much lower. “This may well prove to be the case because the ability to detect aneuploidy with cfDNA is dependent upon assay precision and fetal DNA percentage in the sample rather than the prevalence of the disease in the study population,” he concludes.

Source: Science Daily

Jun 5, 2012 · 7 notes
#science #neuroscience #brain #psychology #biology
How Immune System, Inflammation May Play Role in Lou Gehrig's Disease

ScienceDaily (June 5, 2012) — In an early study, UCLA researchers found that the immune cells of patients with amyotrophic lateral sclerosis (ALS), or Lou Gehrig’s disease, may play a role in damaging the neurons in the spinal cord. ALS is a disease of the nerve cells in the brain and spinal cord that control voluntary muscle movement.

image

In the ALS spinal cord, a patient’s own immune cells, called macrophages (green), impact neurons (live neurons = red, marked by an asterisk; dead neurons = magenta, marked by an arrow). (Credit: University of California, Los Angeles)

Specifically, the team found that inflammation instigated by the immune system in ALS can trigger macrophages — cells responsible for gobbling up waste products in the brain and body — to also ingest healthy neurons. During the inflammation process, motor neurons, whether healthy or not, are marked for clean-up by the macrophages.

In addition, the team found that a lipid mediator called resolvin D1, which is made in the body from the omega-3 fatty acid DHA, was able to “turn off” the inflammatory response that made the macrophages so dangerous to the neurons. Resolvin D1 blocked the inflammatory proteins being produced by the macrophages, curbing the inflammation process that marked the neurons for clean-up. It inhibited key inflammatory proteins like IL-6 with a potency 1,100 times greater than the parent molecule, DHA. DHA has been shown in studies to be neuroprotective in a number of conditions, including stroke and Alzheimer’s disease.

For the study, the team isolated macrophages from blood samples taken from both ALS patients and controls, and obtained spinal cord cells from deceased donors.

The study findings on resolvin D1 may offer a new approach to attenuating the inflammation in ALS. Currently, there is no effective way of administering resolvins to patients, so clinical research with resolvin D1 is still several years away. The parent molecule, DHA, is available in stores, although it has not been tested in clinical trials for ALS. Studies with DHA are in progress for Alzheimer’s disease, stroke and brain injury and have been mostly positive.

Source: Science Daily

Jun 5, 2012 · 3 notes
#science #neuroscience #psychology #neuron
Ear delivers sound information to brain in surprisingly organized fashion: study

June 5, 2012

The brain receives information from the ear in a surprisingly orderly fashion, according to a University at Buffalo study scheduled to appear June 6 in the Journal of Neuroscience.

image

Light microscope image of a bushy neuron in the cochlear nucleus, with a glass microelectrode for recording electrical activity inside the cell. The cell is about 12 micrometers in diameter. New research, published in the Journal of Neuroscience, shows that the synapses onto these cells are sorted according to their plasticity. Credit: Dr. L. Pliss

The research focuses on a section of the brain called the cochlear nucleus, the first way-station in the brain for information coming from the ear. In particular, the study examined tiny biological structures called synapses that transmit signals from the auditory nerve to the cochlear nucleus.

The major finding: The synapses in question are not grouped randomly. Instead, like orchestra musicians sitting in their own sections, the synapses are bundled together by a key trait: plasticity.

Plasticity relates to how quickly a synapse runs down the supply of neurotransmitter it uses to send signals, and plasticity can affect a synapse’s sensitivity to different qualities of sound. Synapses that unleash supplies rapidly may provide good information on when a sound began, while synapses that release neurotransmitter at a more frugal pace may provide better clues on traits like timbre that persist over the duration of a sound.
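The depletion dynamics described here can be sketched with a standard toy vesicle-depletion model of short-term synaptic depression. This is a simplified illustration under assumed parameters, not the model or data from the UB study: each spike releases a fraction of the available transmitter resources, which recover between spikes, so a high-release-probability synapse responds strongly at sound onset but runs down quickly, while a "frugal" low-probability synapse sustains its output across the sound's duration.

```python
import math

def epsp_train(p_release, n_spikes=10, interval_ms=10.0, tau_ms=100.0):
    """Toy vesicle-depletion model of synaptic depression (illustrative only).

    Each spike releases a fraction p_release of the available resources R;
    R then recovers toward 1 with time constant tau_ms before the next spike.
    Returns the relative response amplitude for each spike in the train.
    """
    R = 1.0
    amps = []
    for _ in range(n_spikes):
        amps.append(p_release * R)  # response proportional to released resources
        R -= p_release * R          # depletion on each spike
        R += (1.0 - R) * (1.0 - math.exp(-interval_ms / tau_ms))  # recovery
    return amps

# A high-p synapse starts strong but depresses fast (good for marking onset);
# a low-p synapse is weaker but holds steady (good for tracking ongoing sound).
fast = epsp_train(p_release=0.7)
frugal = epsp_train(p_release=0.1)
print(f"high-p: first={fast[0]:.2f}, last={fast[-1]:.2f}")
print(f"low-p:  first={frugal[0]:.2f}, last={frugal[-1]:.2f}")
```

Running the sketch shows the high-probability synapse losing most of its initial amplitude over the train, while the low-probability synapse retains a much larger share of its starting response.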

UB Associate Professor Matthew Xu-Friedman, who led the study, said the findings raise new questions about the physiology of hearing. The research shows that synapses in the cochlear nucleus are arranged by plasticity, but doesn’t yet explain why this arrangement is beneficial, he said.

"It’s clearly important, because the synapses are sorted based on this. What we don’t know is why," said Xu-Friedman, a member of UB’s Department of Biological Sciences. "If you look inside a file cabinet and find all these pieces of paper together, you know it’s important that they’re together, but you may not know why."

In the study, Xu-Friedman and Research Assistant Professor Hua Yang used brain slices from mice to study about 20 cells in the cochlear nucleus called bushy cells, which receive information from synapses attached to auditory nerve fibers.

The experiments revealed that each bushy cell was linked to a network of synapses with similar plasticity. This means that bushy cells themselves may become specialized, developing unique sensitivities to particular characteristics of a sound, Xu-Friedman said.

The study hints that the cochlear nucleus may not be the only part of the brain where synapses are organized by plasticity. The researchers observed the phenomenon in the excitatory synapses of the cerebellum as well.

"One reason this may not have been noticed before is that measuring the plasticity of two different synapses onto one cell is technically quite difficult," Xu-Friedman said.

Provided by University at Buffalo

Source: medicalxpress.com

Jun 5, 2012 · 10 notes
#science #neuroscience #brain #psychology
Magnetic stimulation to improve visual perception

June 5, 2012

(Medical Xpress) — Using transcranial magnetic stimulation (TMS), an international team led by French researchers from the Centre de Recherche de l’Institut du Cerveau (CNRS) has succeeded in enhancing the visual abilities of a group of healthy subjects. Following stimulation of an area of the brain’s right hemisphere involved in perceptual awareness and in orienting spatial attention, the subjects appeared more likely to perceive a target appearing on a screen. This work, published in the journal PLoS ONE, could lead to the development of novel rehabilitation techniques for certain visual disorders. In addition, it could help improve the performance of individuals whose tasks require very high precision.

TMS is a non-invasive technique that consists of sending a magnetic pulse into a given area of the brain. This activates the cortical neurons within range of the magnetic field, modifying their activity in a painless and temporary manner. For several years, scientists have been looking at the possibility of using this technique to enhance certain brain functions in healthy subjects.

In this respect, the team led by Antoni Valero-Cabré has carried out research involving the stimulation of a region of the right cerebral hemisphere known as the frontal eye field. Strictly speaking, this is not a primary visual area, but it participates in the planning of ocular movements and the orientation of each individual’s attention in the visual space. In a first experiment, a group of healthy subjects tried to distinguish a very low contrast target appearing on a screen for just 30 ms. In some of the tests, the subjects received a magnetic pulse to this frontal region between 80 and 140 ms before the target appeared. The researchers found that the success rate was higher when using TMS: the visual sensitivity of healthy subjects was temporarily increased by around 12%. In a second experiment, the subjects were shown a fleeting visual cue indicating the spot where the target could appear. In this configuration, the enhancement of visual sensitivity, which remained of the same order, was only apparent when the cue indicated the correct location of the target.

Although cerebral functions such as conscious vision are highly optimized in healthy adults, these results show that there is a significant margin for improvement, which can be “enhanced” by TMS. This technique could be tested for the rehabilitation of patients suffering from cortical damage, due for example to a cerebrovascular accident, and for that of patients with retinal disorders. The second experiment suggests that rehabilitation based on both TMS and visual cues could be more selective than the use of stimulation alone. The researchers want to further explore this possibility using repetitive TMS, which, in this case, could make it possible to obtain long-lasting modification of cerebral activity.

Furthermore, according to the researchers, TMS could be used in the near future to increase the attentional abilities of individuals performing tasks that require good visual skills.

Provided by CNRS

Source: medicalxpress.com

Jun 5, 2012 · 7 notes
#brain #neuroscience #psychology #science #perception
Post-stroke depression linked to functional brain impairment

June 5, 2012

Researchers studying stroke patients have found a strong association between impairments in a network of the brain involved in emotional regulation and the severity of post-stroke depression. Results of the study are published online in the journal Radiology.

"A third of patients surviving a stroke experience post-stroke depression (PSD),” said lead researcher Igor Sibon, M.D., Ph.D., professor of neurology at the University of Bordeaux in Bordeaux, France. “However, studies have failed to identify a link between lesions in the brain caused by ischemia during a stroke and subsequent depression.”

Instead of looking for dysfunction in a specific area of the brain following a stroke, Dr. Sibon’s study was designed to assess a group of brain structures organized in a functional network called the default-mode network (DMN). Modifications of connectivity in the DMN, which is associated with internally generated thought processes, have been observed in depressive patients.

"The default-mode network is activated when the brain is at rest," Dr. Sibon said. "When the brain is not actively involved in a task, this area of the brain is engaged in internal thoughts involving self-related memory retrieval and processing.”

In the study, 24 patients between the ages of 18 and 80 underwent resting-state functional magnetic resonance imaging (fMRI) 10 days after having mild to moderate ischemic stroke. An fMRI imaging study measures metabolic changes in specific areas of the brain. Although many fMRI exams are designed to measure brain changes while a patient performs a specific task, during a resting-state fMRI exam, patients lie motionless.

The patients, who included 19 men and five women, were also clinically evaluated 10 days and three months post-stroke to determine the presence and severity of depression and anxiety symptoms. At three months post-stroke, patients were evaluated for depression using the DSM-IV diagnostic classification system.

Using the DSM-IV criteria, 10 patients had minor to moderate depression, and 14 patients had no depression. Results of the fMRI exams revealed an association between modifications of connectivity in the DMN 10 days after stroke and the severity of depression three months post-stroke.

"We found a strong association between early resting-state network modifications and the risk of post-stroke mood disorders," Dr. Sibon said. "These results support the theory that functional brain impairment following a stroke may be more critical than structural lesions."

According to Dr. Sibon, the widespread chemical changes that result from a stroke may lead to the modification of connectivity in brain networks such as the DMN. He said results of his study may contribute to the clinical management of stroke patients by providing an opportunity to investigate the effects of a variety of treatments on patients whose fMRI results immediately post-stroke indicate impaired connectivity in the DMN.

Provided by Radiological Society of North America

Source: medicalxpress.com

Jun 5, 2012 · 6 notes
#science #neuroscience #psychology #brain #stroke #depression
Hands-on research: Neuroscientists show how brain responds to sensual caress

June 4, 2012

A nuzzle of the neck, a stroke of the wrist, a brush of the knee—these caresses often signal a loving touch, but can also feel highly aversive, depending on who is delivering the touch, and to whom. Interested in how the brain makes connections between touch and emotion, neuroscientists at the California Institute of Technology (Caltech) have discovered that the association begins in the brain’s primary somatosensory cortex, a region that, until now, was thought only to respond to basic touch, not to its emotional quality.

image

The new finding is described in this week’s issue of the Proceedings of the National Academy of Sciences (PNAS).

The team measured brain activation while self-identified heterosexual male subjects lay in a functional MRI scanner and were each caressed on the leg under two different conditions. In the first condition, they saw a video of an attractive female bending down to caress them; in the second, they saw a video of a masculine man doing the same thing. The men reported the experience as pleasurable when they thought the touch came from the woman, and aversive when they thought it came from the man. And their brains backed them up: this difference in experience was reflected in the activity measured in each man’s primary somatosensory cortex.

"We demonstrated for the first time that the primary somatosensory cortex—the brain region encoding basic touch properties such as how rough or smooth an object is—also is sensitive to the social meaning of a touch," explains Michael Spezio, a visiting associate at Caltech who is also an assistant professor of psychology at Scripps College in Claremont, California. "It was generally thought that there are separate brain pathways for how we process the physical aspects of touch on the skin and for how we interpret that touch emotionally—that is, whether we feel it as pleasant, unpleasant, desired, or repulsive. Our study shows that, to the contrary, emotion is involved at the primary stages of social touch."

Unbeknownst to the subjects, the actual touches on their leg were always exactly the same—and always from a woman. Yet, it felt different to them when they believed a man versus a woman was doing the touching.

"The primary somatosensory cortex responded more to the ‘female’ touch than to the ‘male’ touch condition, even while subjects were only viewing a video showing a person approach their leg," says Ralph Adolphs, Bren Professor of Psychology and Neuroscience at Caltech and director of the Caltech Brain Imaging Center, where the research was done. "We see responses in a part of the brain thought to process only basic touch that were elicited entirely by the emotional significance of social touch prior to the touch itself, simply in anticipation of the caress that our participants would receive."

The study was carried out in collaboration with the husband-and-wife team of Valeria Gazzola and Christian Keysers, who were visiting Caltech from the University of Groningen in the Netherlands.

"Intuitively, we all believe that when we are touched by someone, we first objectively perceive the physical properties of the touch—its speed, its gentleness, the roughness of the skin," says Gazzola. "Only thereafter, in a separable second step based on who touched us, would we value this touch more or less."

The experiment showed that this two-step vision is incorrect, at least in terms of separation between brain regions, she says, and who we believe is touching us distorts even the supposedly objective representation of what the touch was like on the skin.

"Nothing in our brain is truly objective," adds Keysers. "Our perception is deeply and pervasively shaped by how we feel about the things we perceive."

One possible practical implication of the work is to help reshape social responses to touch in people with autism.

"Now that we have clear evidence that primary somatosensory cortex encodes emotional significance of touch, it may be possible to work with early sensory pathways to help children with autism respond more positively to the gentle touch of their parents and siblings," says Spezio.

The work also suggests that it may be possible to use film clips or virtual reality to reestablish positive responses to gentle touch in victims of sexual and physical abuse, and torture.

Next, the researchers hope to test whether the effect is as robust in women as in men, and in both sexes across sexual orientation. They also plan to explore how these sensory pathways might develop in infants or children.

Provided by California Institute of Technology

Source: medicalxpress.com

Jun 4, 2012 · 262 notes
#science #neuroscience #brain #psychology
High Blood Caffeine Levels in Older Adults Linked to Avoidance of Alzheimer’s Disease

ScienceDaily (June 4, 2012) — Those cups of coffee that you drink every day to keep alert appear to have an extra perk — especially if you’re an older adult. A recent study monitoring the memory and thinking processes of people older than 65 found that all those with higher blood caffeine levels avoided the onset of Alzheimer’s disease in the two-to-four years of study follow-up. Moreover, coffee appeared to be the major or only source of caffeine for these individuals.

image

Credit: © Yuri Arcurs / Fotolia

Researchers from the University of South Florida and the University of Miami say the case-control study provides the first direct evidence that caffeine/coffee intake is associated with a reduced risk of dementia or delayed onset. Their findings will appear in the online version of an article to be published June 5 in the Journal of Alzheimer’s Disease. The collaborative study involved 124 people, ages 65 to 88, in Tampa and Miami.

"These intriguing results suggest that older adults with mild memory impairment who drink moderate levels of coffee — about 3 cups a day — will not convert to Alzheimer’s disease — or at least will experience a substantial delay before converting to Alzheimer’s," said study lead author Dr. Chuanhai Cao, a neuroscientist at the USF College of Pharmacy and the USF Health Byrd Alzheimer’s Institute. "The results from this study, along with our earlier studies in Alzheimer’s mice, are very consistent in indicating that moderate daily caffeine/coffee intake throughout adulthood should appreciably protect against Alzheimer’s disease later in life."

The study shows this protection probably occurs even in older people with early signs of the disease, called mild cognitive impairment, or MCI. Patients with MCI already experience some short-term memory loss and initial Alzheimer’s pathology in their brains. Each year, about 15 percent of MCI patients progress to full-blown Alzheimer’s disease. The researchers focused on study participants with MCI, because many were destined to develop Alzheimer’s within a few years.

Blood caffeine levels at the study’s onset were substantially lower (51 percent less) in participants diagnosed with MCI who progressed to dementia during the two-to-four year follow-up than in those whose mild cognitive impairment remained stable over the same period.

No one with MCI who later developed Alzheimer’s had initial blood caffeine levels above a critical level of 1200 ng/ml — equivalent to drinking several cups of coffee a few hours before the blood sample was drawn. In contrast, many with stable MCI had blood caffeine levels higher than this critical level.

"We found that 100 percent of the MCI patients with plasma caffeine levels above the critical level experienced no conversion to Alzheimer’s disease during the two-to-four year follow-up period," said study co-author Dr. Gary Arendash.

The researchers believe higher blood caffeine levels indicate habitually higher caffeine intake, most probably through coffee. Caffeinated coffee appeared to be the main, if not exclusive, source of caffeine in the memory-protected MCI patients, because they had the same profile of blood immune markers as Alzheimer’s mice given caffeinated coffee. Alzheimer’s mice given caffeine alone or decaffeinated coffee had a very different immune marker profile.

Since 2006, USF’s Dr. Cao and Dr. Arendash have published several studies investigating the effects of caffeine/coffee administered to Alzheimer’s mice. Most recently, they reported that caffeine interacts with a yet unidentified component of coffee to boost blood levels of a critical growth factor that seems to fight off the Alzheimer’s disease process.

"We are not saying that moderate coffee consumption will completely protect people from Alzheimer’s disease," Dr. Cao cautioned. "However, we firmly believe that moderate coffee consumption can appreciably reduce your risk of Alzheimer’s or delay its onset."

Alzheimer’s pathology is a process in which plaques and tangles accumulate in the brain, killing nerve cells, destroying neural connections, and ultimately leading to progressive and irreversible memory loss. Since the neurodegenerative disease starts one or two decades before cognitive decline becomes apparent, the study authors point out, any intervention to cut the risk of Alzheimer’s should ideally begin that far in advance of symptoms.

"Moderate daily consumption of caffeinated coffee appears to be the best dietary option for long-term protection against Alzheimer’s memory loss," Dr. Arendash said. "Coffee is inexpensive, readily available, easily gets into the brain, and has few side-effects for most of us. Moreover, our studies show that caffeine and coffee appear to directly attack the Alzheimer’s disease process."

In addition to Alzheimer’s disease, moderate caffeine/coffee intake appears to reduce the risk of several other diseases of aging, including Parkinson’s disease, stroke, Type II diabetes, and breast cancer. However, supporting studies for these benefits have all been observational (uncontrolled), and controlled clinical trials are needed to definitively demonstrate therapeutic value.

A study tracking the health and coffee consumption of more than 400,000 older adults for 13 years, and published earlier this year in the New England Journal of Medicine, found that coffee drinkers reduced their risk of dying from heart disease, lung disease, pneumonia, stroke, diabetes, infections, and even injuries and accidents.

With new Alzheimer’s diagnostic guidelines encompassing the full continuum of the disease, approximately 10 million Americans now fall within one of three developmental stages of Alzheimer’s disease — Alzheimer’s disease brain pathology only, MCI, or diagnosed Alzheimer’s disease. That number is expected to climb even higher as the baby-boomer generation continues to enter older age, unless an effective and proven preventive measure is identified.

"If we could conduct a large cohort study to look into the mechanisms of how and why coffee and caffeine can delay or prevent Alzheimer’s disease, it might result in billions of dollars in savings each year in addition to improved quality of life," Dr. Cao said.

Source: Science Daily

Jun 4, 2012 · 30 notes
#science #neuroscience #brain #psychology #caffeine #alzheimer