Neuroscience


July 2012

Training Improves Recognition of Quickly Presented Objects

ScienceDaily (July 9, 2012) — “Attentional blink” is the term psychologists use to describe our inability to recognize a second important object if we see it less than half a second after a first one. It always seemed impossible to overcome, but in a new paper in the Proceedings of the National Academy of Sciences, Brown University psychologists report they’ve found a way.

Until now it has seemed an irreparable limitation of human perception that we struggle to perceive things presented in very rapid succession, say, less than half a second apart. Psychologists call this deficit "attentional blink." We'll notice that first car spinning out in our path, but maybe not register the one immediately beyond it. It turns out we can learn to do better after all. In a new study, researchers now based at Brown University overcame the blink with a small amount of training that had never been tried before.

"A color change can be very conspicuous. If all items are black and white and all of a sudden a color item is shown, you pay attention to that." Credit: Mike Cohea/Brown University

"Attention is a very important component of visual perception," said Takeo Watanabe, professor of cognitive, linguistic and psychological sciences at Brown. "One of the best ways to enhance our visual ability is to improve our attentional function."

Watanabe and his team were at Boston University when they performed experiments described in a paper published the week of July 9 in the Proceedings of the National Academy of Sciences. The bottom line of the research is that making the second target object a distinct color is enough to train people to switch their attention more quickly than they could before. After that, they can perceive a second target object presented as quickly as a fifth of a second later, even when it isn’t distinctly colored.


Jul 10, 2012 · 18 notes
#science #neuroscience #brain #psychology #memory #object recognition
Small Molecule May Play Big Role in Alzheimer's Disease

ScienceDaily (July 9, 2012) — Alzheimer’s disease is one of the most dreaded and debilitating illnesses one can develop. Currently, the disease afflicts 6.5 million Americans, and the Alzheimer’s Association projects that number to increase to between 11 million and 16 million, or 1 in 85 people, by 2050.


Cell death in the brain causes one to grow forgetful, confused and, eventually, catatonic. Recently approved drugs provide mild relief for symptoms but there is no consensus on the underlying mechanism of the disease.

"We don’t know what the problem is in terms of toxicity," said Joan-Emma Shea, professor of chemistry and biochemistry at the University of California, Santa Barbara (UCSB). "This makes the disease difficult to cure."

Accumulations of amyloid plaques have long been associated with the disease and were presumed to be its cause. These long knotty fibrils, formed from misfolded protein fragments, are almost always found in the brains of diseased patients. Because of their ubiquity, amyloid fibrils were considered a potential source of the toxicity that causes cell death in the brain. However, the quantity of fibrils does not correspond with the degree of dementia and other symptoms.

New findings support a hypothesis that fibrils are a by-product of the disease rather than the toxic agent itself. This paradigm shift changes the focus of inquiry to smaller, intermediate molecules that form and dissipate quickly. These molecules are difficult to perceive in brain tissue.


Jul 10, 2012 · 14 notes
#science #neuroscience #brain #psychology #alzheimer
Nutrient mixture improves memory in patients with early Alzheimer's

July 10, 2012 by Anne Trafton

A clinical trial of an Alzheimer’s disease treatment developed at MIT has found that the nutrient cocktail can improve memory in patients with early Alzheimer’s. The results confirm and expand the findings of an earlier trial of the nutritional supplement, which is designed to promote new connections between brain cells.


A graphic depicting a synapse, a connection between brain cells. Graphic: Christine Daniloff

Alzheimer’s patients gradually lose those connections, known as synapses, leading to memory loss and other cognitive impairments. The supplement mixture, known as Souvenaid, appears to stimulate growth of new synapses, says Richard Wurtman, a professor emeritus of brain and cognitive sciences at MIT who invented the nutrient mixture.

“You want to improve the numbers of synapses, not by slowing their degradation — though of course you’d love to do that too — but rather by increasing the formation of the synapses,” Wurtman says.

To do that, Wurtman came up with a mixture of three naturally occurring dietary compounds: choline, uridine and the omega-3 fatty acid DHA. Choline can be found in meats, nuts and eggs, and omega-3 fatty acids are found in a variety of sources, including fish, eggs, flaxseed and meat from grass-fed animals. Uridine is produced by the liver and kidney, and is present in some foods as a component of RNA.

These nutrients are precursors to the lipid molecules that, along with specific proteins, make up brain-cell membranes, which form synapses. To be effective, all three precursors must be administered together.

Results of the clinical trial, conducted in Europe, appear in the July 10 online edition of the Journal of Alzheimer’s Disease. The new findings are encouraging because very few clinical trials have produced consistent improvement in Alzheimer’s patients, says Jeffrey Cummings, director of the Cleveland Clinic’s Lou Ruvo Center for Brain Health.

“Memory loss is the central characteristic of Alzheimer’s, so something that improves memory would be of great interest,” says Cummings, who was not part of the research team.

Plans for commercial release of the supplement are not finalized, according to Nutricia, the company testing and marketing Souvenaid, but it will likely be available in Europe first. Nutricia is the specialized health care division of the food company Danone, known as Dannon in the United States.

Making connections

Wurtman first came up with the idea of targeting synapse loss to combat Alzheimer’s about 10 years ago. In animal studies, he showed that his dietary cocktail boosted the number of dendritic spines, or small outcroppings of neural membranes, found in brain cells. These spines are necessary to form new synapses between neurons.

Following the successful animal studies, Philip Scheltens, director of the Alzheimer Center at VU University Medical Center in Amsterdam, led a clinical trial in Europe involving 225 patients with mild Alzheimer’s. The patients drank Souvenaid or a control beverage daily for three months.

That study, first reported in 2008, found that 40 percent of patients who consumed the drink improved in a test of verbal memory, while 24 percent of patients who received the control drink improved their performance.

The new study, performed in several European countries and overseen by Scheltens as principal investigator, followed 259 patients for six months. Patients taking either Souvenaid or a placebo improved their verbal-memory performance for the first three months, but the placebo patients deteriorated during the following three months, while the Souvenaid patients continued to improve. For this trial, the researchers used more comprehensive memory tests taken from a neuropsychological test battery often used to assess Alzheimer’s patients in clinical research.

Patients showed a very high compliance rate: About 97 percent of the patients followed the regimen throughout the study, and no serious side effects were seen.

Both clinical trials were sponsored by Nutricia. MIT has patented the mixture of nutrients used in the study, and Nutricia holds the exclusive license on the patent.

Brain patterns

In the new study, the researchers used electroencephalography (EEG) to measure how patients’ brain-activity patterns changed throughout the study. They found that as the trial went on, the brains of patients receiving the supplements started to shift from patterns typical of dementia to more normal patterns. Because EEG patterns reflect synaptic activity, this suggests that synaptic function increased following treatment, the researchers say.

Patients entering this study were in the early stages of Alzheimer’s disease, averaging around 25 on a scale of dementia that ranges from 1 to 30, with 30 being normal. A previous trial found that the supplement cocktail does not work in patients with Alzheimer’s at a more advanced stage. This makes sense, Wurtman says, because patients with more advanced dementia have probably already lost many neurons, so they can’t form new synapses.

A two-year trial involving patients who don’t have Alzheimer’s, but who are starting to show mild cognitive impairment, is now underway. If the drink seems to help, it could be used in people who test positive for very early signs of Alzheimer’s, before symptoms appear, Wurtman says. Such tests, which include PET scanning of the hippocampus, are now rarely done because there are no good Alzheimer’s treatments available.

Provided by Massachusetts Institute of Technology

Source: medicalxpress.com

Jul 10, 2012 · 26 notes
#science #neuroscience #brain #psychology #alzheimer #memory
Jul 8, 2012 · 31 notes
#science #neuroscience #brain #development #psychology
What Makes Us Musical Animals

ScienceDaily (July 6, 2012) — In a forthcoming issue of Topics in Cognitive Science researchers from the University of Amsterdam (UvA) argue that at least two, seemingly trivial musical skills can be considered fundamental to the evolution of music: relative pitch — the skill to recognise a melody independent of its pitch level — and beat induction — the skill to pick up regularity (the beat) from a varying rhythm. Both are considered cognitive mechanisms that are essential to perceive, make and appreciate music, and, as such, could be argued to be conditional to the origin of music.

While it recently became quite popular to address the study of the origins of music from an evolutionary perspective, there is still little agreement on the idea that music is in fact an adaptation, that it influenced our survival, or that it made us sexually more attractive. Music appears to be of little use. It doesn’t quell our hunger, nor do we live a day longer because of it. So why argue that music is an adaptation? There are even researchers who claim that studying the evolution of cognition is virtually impossible (Lewontin, 1998; Bolhuis & Wynne, 2009).

Distinguishing between music and musicality

The alternative that Henkjan Honing and Annemie Ploeger of the UvA propose is, first, to distinguish between the notion of ‘music’ and ‘musicality’, with musicality being defined as a natural, spontaneously developing trait based on and constrained by our cognitive system, and music as a social and cultural construct based on that very musicality. And secondly, to collect accumulative evidence from a variety of sources (e.g., psychological, physiological, genetic, phylogenetic, and cross-cultural evidence) to be able to show that a specific cognitive trait is indeed an adaptation.

Both relative pitch and beat induction are suggested as primary candidates for such cognitive traits, musical skills that are considered trivial by most humans, but that turn out to be quite special in the rest of the animal world.

Once these fundamental cognitive mechanisms are identified, it becomes possible to see how these might have evolved. In short: the study of the evolution of music cognition is conditional on a characterisation of the basic mechanisms that make up musicality.

Source: Science Daily

Jul 7, 2012 · 51 notes
#science #neuroscience #psychology #music #brain
Can You Hear Me Now? New Strategy Discovered to Prevent Hearing Loss

ScienceDaily (July 6, 2012) — If you’re concerned about losing your hearing because of noise exposure (earbud deafness syndrome), a new discovery published online in the FASEB Journal offers some hope. Scientists from Germany and Canada show that the protein AMPK, which protects cells during a lack of energy, also activates a channel protein in the cell membrane that allows potassium to leave the cell. This activity is important because the mechanism helps protect sensory cells in the inner ear from permanent damage following acoustic noise exposure.

This information could lead to new strategies and therapies to prevent and treat trauma resulting from extreme noise, especially in people with AMPK gene variants that may make them more vulnerable to hearing loss.

"Future research on the basis of the present study may lead to the development of novel strategies preventing noise-induced hearing loss or accelerating recovery from acoustic trauma," said Florian Lang, Ph.D., a researcher involved in the work from the Department of Physiology at the University of Tübingen, in Tübingen, Germany.

To make this discovery, Lang and colleagues compared two groups of mice. The first group was normal and the second lacked the AMPK protein. Hearing of the mice was tested by measuring sound-induced brain activity. All mice were exposed to well-defined noise causing an acoustic trauma and leading to hearing impairment. Prior to noise exposure, the hearing ability was similar in normal mice and mice lacking AMPK. After exposure, the hearing of the normal mice mostly recovered after two weeks, but the recovery of hearing in AMPK-deficient mice remained significantly impaired.

"When it comes to preventing hearing loss, keeping the volume down is still the best strategy, and this discovery doesn’t prevent loud music from beating on our ear drums," said Gerald Weissmann, M.D., Editor-in-Chief of the FASEB Journal. “This discovery does help explain why some people seem more likely to lose their hearing than others. At the same time, it also provides a target for new preventive strategies — and perhaps even a treatment — for earbud deafness syndrome.”

Source: Science Daily

Jul 7, 2012 · 22 notes
#science #neuroscience #brain #psychology #hearing
'Stoned' gene key to maintaining normal brain function

July 6, 2012

(Medical Xpress) — Scientists at the University of Liverpool have found that a protein produced by a gene first identified in fruitflies is responsible for communication between nerve cells in the brain.


Dr Stephen Royle: “This research is another step towards fully understanding the complexities of the human brain.”

The ‘stoned’ gene was discovered in fruitflies by scientists in the 1970s. When this gene was mutated, the flies had problems walking and flying, giving rise to the term ‘stoned’ gene. The same gene was found in mammals some years later, but until now scientists have not known precisely what this gene is responsible for and why it causes problems with physical functions when it mutates.

‘Packets of chemicals’

Scientists at Liverpool have found that the protein the gene expresses in mammals, called stonin2, is responsible for retrieving ‘packets’ of chemicals that nerve cells in the brain release in order to communicate with each other. The mutated gene’s failure to express this protein in the fruitfly study suggests why the insects could not walk or fly normally.

The team used advanced techniques to inactivate stonin2 for short and long periods of time in animal cells grown in the laboratory. The cells used were from an area of the brain associated with learning and memory. They showed that without stonin2 the nerve cells could not retrieve the ‘packets’ needed to transport the chemicals required for communication between nerve cells.

Dr Stephen Royle, from the University’s Institute of Translational Medicine, explains: “Nerve cells in the brain communicate by releasing ‘packets’ of chemicals.  These ‘packets’ must be retrieved and refilled with chemicals so that they can be used once again. This recycling programme is very important for nerve cells to keep communicating with each other. 

“We have shown that a protein called stonin 2 is needed for the packets to be retrieved. There is currently no evidence to suggest that the gene which expresses this protein is mutated in human disease, but any failure in its function would be disastrous.  The research is another step towards fully understanding the complexities of the human brain.”

The research is published in the journal, Current Biology.

Provided by University of Liverpool

Source: medicalxpress.com

Jul 7, 2012 · 25 notes
#science #neuroscience #brain #genes #biology #fruitflies
Zebrafish Reveal Promising Process for Healing Spinal Cord Injury

ScienceDaily (July 6, 2012) — Yona Goldshmit, Ph.D., is a former physical therapist who worked in rehabilitation centers with spinal cord injury patients for many years before deciding to switch her focus to the underlying science.

"After a few years in the clinic, I realized that we don’t really know what’s going on," she said.

Now a scientist working with Peter Currie, Ph.D., at Monash University in Australia, Dr. Goldshmit is studying the mechanisms of spinal cord repair in zebrafish, which, unlike humans and other mammals, can regenerate their spinal cord following injury. On June 23 at the 2012 International Zebrafish Development and Genetics Conference in Madison, Wisconsin, she described a protein that may be a key difference between regeneration in fish and mammals.

One of the major barriers to spinal regeneration in mammals is a natural protective mechanism, which incongruously results in an unfortunate side effect. After a spinal injury, nervous system cells called glia are activated and flood the area to seal the wound to protect the brain and spinal cord. In doing so, however, the glia create scar tissue that acts as a physical and chemical barrier, which prevents new nerves from growing through the injury site.

One striking difference between the glial cells in mammals and fish is the resulting shape: mammalian glia take on highly branched, star-like arrangements that appear to intertwine into dense tissue. Fish glia cells, by contrast, adopt a simple elongated shape — called bipolar morphology — that bridges the injury site and appears to help new nerve cells grow through the damaged area to heal the spinal cord.

"Zebrafish don’t have so much inflammation and the injury is not so severe as in mammals, so we can actually see the pro-regenerative effects that can happen," Dr. Goldshmit explained.

Studies in mice have found that mammalian glia can take up the same elongated shape, but in response to the environment around the injury they instead mature into scar tissue that does not allow nerve regrowth.

Dr. Goldshmit and her colleagues have focused on a family of molecules called fibroblast growth factors (Fgf), which have shown some evidence of improving recovery in mice and humans with spinal cord damage. The Monash University group found that Fgf activity around the damage site promotes the bipolar glial shape and encourages nerve regeneration in zebrafish.

Preliminary results in mice show that Fgf injections near a spinal injury increase both the number of glia cells at the site and the elongated morphology. Their evidence suggests that Fgfs may work to create an environment more supportive of regeneration in mammals as well and could be a valuable therapeutic target.

Spinal injury patients usually have few options, Dr. Goldshmit emphasized, and development of new, biologically-based approaches will be critical.

"This is a nice example of how we can use the zebrafish model," she said. "When we learn from the zebrafish what to look at, we can find things that give us hope for finding therapeutic approaches for spinal cord injury in humans."

Source: Science Daily

Jul 7, 2012 · 14 notes
#science #neuroscience #spinal cord #zebrafish
Brain scanner, not joystick, is in human-robot future

July 6, 2012 by Nancy Owano

(Phys.org) — Talk about fMRI may not be entirely familiar to many people, but that could change with new events that are highlighting efforts to link up humans and machines. fMRI (functional magnetic resonance imaging) is a promising technology that can help humans move beyond joysticks and control robots via brain scanners instead. Now a research project exploring ways to develop robot surrogates with whom humans can interact has turned a corner: a university student successfully made his robot surrogate move around using fMRI technology. The experiment linked up Israeli student Tirosh Shapira, in a lab at Bar-Ilan University, Israel, with a small robot in another lab far away at the Beziers Technology Institute in France.

Shapira merely had to think about moving his arms or legs and the robot, with a camera on its head with an image displayed in front of Shapira, successfully would do the same. If Shapira thought about moving forward or backward, the robot responded accordingly.

fMRI monitors blood flow through the brain and can spot when areas associated with certain actions, such as movement, are in use. The fMRI read the student’s thoughts, which were translated via computer into commands relayed across the Internet to the robot in France.
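The pipeline just described can be sketched in a few lines. This is a minimal, hypothetical illustration, not the researchers' actual system: the region labels and command names are assumptions, and a real decoder would use a trained classifier on the fMRI signal rather than a simple argmax over regions.

```python
# Hypothetical sketch of the decode step: pick the robot command that
# corresponds to the most active motor region in an fMRI snapshot.

MOTOR_REGIONS = ["left_hand", "right_hand", "legs"]  # assumed ROI labels

COMMANDS = {  # assumed mapping from imagined movement to robot command
    "left_hand": "TURN_LEFT",
    "right_hand": "TURN_RIGHT",
    "legs": "MOVE_FORWARD",
}

def decode_command(activation):
    """Map a dict of per-region BOLD signal levels to a movement command.

    Regions missing from `activation` are treated as silent (0.0).
    """
    strongest = max(MOTOR_REGIONS, key=lambda r: activation.get(r, 0.0))
    return COMMANDS[strongest]
```

In the experiment described above, a command like this would then be relayed over the Internet to the robot, which executes the movement while streaming camera images back to the user.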

There is much more work to be done to advance this approach, however. The researchers seek to devise a different type of scanning. An fMRI scanner is an expensive piece of equipment but the scientists believe that improvements in software might allow for a head-mounted device. Another research goal is to see if they can get humans to speak via the robot. The size of the robot will need modification, closer to the size and movement of a human, and engineered with a wider range of movement that would include hand gestures. In sum, according to the researchers, this experiment is only one of many steps ahead.

Medical applications for this technology are seen as promising, especially as scientists explore how patients with paralysis can interface with robots so that the patients can reconnect to the world. Another suggested application has been in the military, where robot surrogates rather than soldiers would be sent into battle.

Source: PHYS.ORG

Jul 7, 2012 · 15 notes
#science #neuroscience #brain #fMRI #robotics
Researchers decode molecular mechanism that sheds light on how trauma can become engraved in the brain

July 6, 2012



Scientists at the Universities of Bonn and Berlin have discovered a mechanism which stops the process of forgetting anxiety after a stress event. In experiments they showed that feelings of anxiety don’t subside if too little dynorphin is released into the brain. The results can help open up new paths in the treatment of trauma patients. The study has been published in the current edition of the Journal of Neuroscience.

Feelings of anxiety very effectively prevent people from getting into situations that are too dangerous. Those who have had a terrible experience initially tend to avoid the place of the tragedy out of fear. If no other oppressive situation arises, the symptoms of fear normally subside over time. “The memory of the terrible events is not just erased,” states first author PD Dr. Andras Bilkei Gorzo, from the Institute for Molecular Psychiatry at the University of Bonn. “Those affected learn, rather, via an active learning process, that they no longer need to be afraid because the danger has passed.” But following extreme psychological stress resulting from wars, hostage-takings, accidents or catastrophes, chronic anxiety disorders can develop that don’t subside even after months.

Body’s own dynorphin weakens fears

Why is it that in some people terrible events are deeply engraved in their memory, while after a while others seem to have completely put aside any anxiety related to the incident? Scientists in the fields of psychiatry, molecular psychiatry and radiology at the University of Bonn are all involved in probing this issue. “We were able to demonstrate by way of a series of experiments that dynorphin plays an important role in weakening anxiety,” says Prof. Dr. Andreas Zimmer, Director of the Institute for Molecular Psychiatry at the University of Bonn. Dynorphin belongs to the opioid family of substances, which also includes, for instance, the endorphins. The latter are released in the bodies of athletes and have an analgesic and euphoric effect. The reverse is true of dynorphins: they are known for putting a damper on emotional moods.

Mice with disabled gene exhibit persistent anxiety

The team working with Prof. Zimmer tested the exact impact of dynorphins on the brain using mice whose gene for the formation of this substance had been disabled. After being exposed to a brief and unpleasant electric shock, the animals exhibited persistent anxiety symptoms, even if they hadn’t been confronted with the negative stimulus over a longer time. Mice exhibiting a normal amount of released dynorphin were anxious to begin with as well, but the symptoms quickly subsided. “This behavior is the same in humans: If you burn your hand on the stove once, you don’t forget the incident that quickly,” explains Prof. Zimmer. “Learning vocabulary, on the other hand, typically tends to be more tedious because it’s not tied to emotions.”

Results are transferrable to people

Next the researchers showed that these results can be transferred to people. “We took advantage of the fact that people exhibit natural variations of the dynorphin gene that lead to different levels of this substance being released in the brain,” reports Prof. Dr. Henrik Walter, Director of the Research Area Mind and Brain at the Psychiatric University Clinic at the Charité in Berlin, who previously performed research in this area at the University Clinic in Bonn. A total of 33 healthy participants were divided into two groups: one with genetically stronger dynorphin release and the other with less activity of the gene.

Unpleasant stimulus leads to stress reactions in the participants

Wearing video glasses inside a magnetic resonance imaging (MRI) scanner, the participants watched blue and green squares appear and then disappear again. Whenever the green square was visible, the scientists repeatedly applied an unpleasant laser stimulus to the hand. Increased sweating of the skin confirmed that these negative stimuli actually triggered a stress reaction. At the same time, the researchers recorded the activity of various brain areas with the scanner. After this conditioning stage came part two of the experiment: the researchers showed the colored squares without any unpleasant stimuli and recorded how long the previously acquired stress reaction lasted. The next day the experiment was continued without the laser stimulus in an effort to monitor the longer-term development.

New paths in the treatment of trauma patients

It became apparent that, as in the mice, human participants with lower dynorphin gene activity exhibited stress reactions lasting considerably longer than those who released considerably more of the substance. Moreover, brain scans showed that the amygdala, a brain structure in the temporal lobes that processes emotional content, remained active even when, in later testing rounds, a green square was shown without the subsequent laser stimulus.

“After the negative laser stimulus stopped this amygdala activity gradually became weaker. This means that the acquired anxiety reaction to the stimulus was forgotten,” reports Prof. Walter. This effect was not as pronounced in the group with less dynorphin activity and prolonged anxiety. “But the ‘forgetting’ of acquired anxiety reactions isn’t a fading, but, rather, an active process which involves the ventromedial prefrontal cortex,” emphasizes Prof. Walter. To corroborate this, researchers found that in the group with less dynorphin activity there was reduced coupling between the prefrontal cortex and the amygdala. “In all likelihood dynorphins affect fear forgetting in a crucial way through this structure,” says Prof. Walter. The scientists now hope that by using the results they will be able to develop long-term approaches for new strategies when it comes to the treatment of trauma patients.

Provided by University of Bonn

Source: medicalxpress.com

Jul 7, 2012 · 38 notes
#science #neuroscience #brain #psychology #anxiety
Gene Linked to Facial, Skull and Cognitive Impairment Identified

ScienceDaily (July 5, 2012) — A gene whose mutation results in malformed faces and skulls as well as mental retardation has been found by scientists.

They looked at patients with Potocki-Shaffer syndrome, a rare disorder that can result in significant abnormalities such as a small head and chin and intellectual disability, and found the gene PHF21A was mutated, said Dr. Hyung-Goo Kim, molecular geneticist at the Medical College of Georgia at Georgia Health Sciences University.

The scientists confirmed PHF21A’s role by suppressing it in zebrafish, which developed head and brain abnormalities similar to those in patients. “With less PHF21A, brain cells died, so this gene must play a big role in neuron survival,” said Kim, lead and corresponding author of the study published in The American Journal of Human Genetics. They reconfirmed the role by giving the gene back to the malformed fish — studied for their adeptness at regeneration — which then became essentially normal. They also documented the gene’s presence in the craniofacial area of normal mice.

While giving the normal gene unfortunately can’t cure patients as it does zebrafish, the scientists believe the finding will eventually enable genetic screening and possibly early intervention during fetal development, including therapy to increase PHF21A levels, Kim said. It also provides a compass for learning more about face, skull and brain formation.

The scientists zeroed in on the gene by using a distinctive chromosomal break found in patients with Potocki-Shaffer syndrome as a starting point. Chromosomes — packages of DNA and protein — aren’t supposed to break, and when they do, it can damage genes in the vicinity.

"We call this breakpoint mapping and the breakpoint is where the trouble is," said Dr. Lawrence C. Layman, study co-author and Chief of the MCG Section of Reproductive Endocrinology, Infertility and Genetics. Damaged genes may no longer function optimally; in PHF21A’s case it’s about half the norm.

"When you see the chromosome translocation, you don’t know which gene is disrupted," Layman said. "You use the break as a focus then use a bunch of molecular techniques to zoom in on the gene." Causes of chromosomal breaks are essentially unknown but likely are environmental and/or genetic, Kim said.

Little was known about PHF21A other than its role in determining how tightly DNA is wound in a package with proteins called histones. How tightly DNA is wound determines whether proteins called transcription factors have the access needed to regulate gene expression, which is important, for example, when a gene needs to be expressed only at a specific time or tissue. PHF21A is believed to primarily work by suppressing other genes, for example, ensuring that genes that should be expressed only in brain cells don’t show up in other cell types, Kim said.

Next steps include using PHF21A as a sort of geographic positioning system to identify other “depressor” genes it regulates then screening patients to look for mutations in those genes as well. “We want to find other people with different genes causing the same problem,” Layman said, and they suspect the genes PHF21A interacts with or regulates are the most likely suspects. It’s too early to know what percentage of Potocki-Shaffer syndrome patients have the PHF21A mutation, Kim noted. “Now that we know the causative gene, we can sequence the gene in more patients and see if they have a mutation,” Layman said.

They also want to look at less-severe forms of mental deficiency, including autism, for potentially milder mutations of PHF21A. More than a dozen of the 25,000 human genes are known to cause craniofacial defects and mental retardation, which often occur together, Kim said.

Source: Science Daily

Jul 6, 2012 · 9 notes
#science #neuroscience #psychology #gene #genetic disorders
Music to My Eyes: Device Converting Images Into Music Helps Visually Impaired Find Things With Ease

ScienceDaily (July 5, 2012) — Sensory substitution devices (SSDs) use sound or touch to help the visually impaired perceive the visual scene surrounding them. The ideal SSD would assist not only in sensing the environment but also in performing daily activities based on this input, such as accurately reaching for a coffee cup or shaking a friend’s hand. In a new study, scientists trained blindfolded sighted participants to perform fast and accurate movements using a new SSD, called EyeMusic. Their results are published in the July issue of Restorative Neurology and Neuroscience.


Left: An illustration of the EyeMusic SSD, showing a user with a camera mounted on the glasses, and scalp headphones, hearing musical notes that create a mental image of the visual scene in front of him. He is reaching for the red apple in a pile of green ones. Top right: close-up of the glasses-mounted camera and headphones; bottom right: hand-held camera pointed at the object of interest. (Credit: Maxim Dupliy, Amir Amedi and Shelly Levy-Tzedek)

The EyeMusic, developed by a team of researchers at the Hebrew University of Jerusalem, employs pleasant musical tones and scales to help the visually impaired “see” using music. This non-invasive SSD converts images into a combination of musical notes, or “soundscapes.”

The device was developed by the senior author Prof. Amir Amedi and his team at the Edmond and Lily Safra Center for Brain Sciences (ELSC) and the Institute for Medical Research Israel-Canada at the Hebrew University. The EyeMusic scans an image and represents pixels at high vertical locations as high-pitched musical notes and low vertical locations as low-pitched notes according to a musical scale that will sound pleasant in many possible combinations. The image is scanned continuously, from left to right, and an auditory cue is used to mark the start of the scan. The horizontal location of a pixel is indicated by the timing of the musical notes relative to the cue (the later it is sounded after the cue, the farther it is to the right), and the brightness is encoded by the loudness of the sound.

The EyeMusic’s algorithm uses a different musical instrument for each of the five colors: white (vocals), blue (trumpet), red (reggae organ), green (synthesized reed), and yellow (violin); black is represented by silence. Prof. Amedi notes that “The notes played span five octaves and were carefully chosen by musicians to create a pleasant experience for the users.” Sample sound recordings are available at http://brain.huji.ac.il/em/.
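The mapping described above is systematic: vertical position sets pitch, horizontal position sets onset time relative to the scan cue, brightness sets loudness, and color picks the instrument. A minimal sketch of such an encoding is below; the particular musical scale, scan duration, and pitch formula are illustrative assumptions, not the EyeMusic’s actual parameters.

```python
# Sketch of an EyeMusic-style image-to-sound mapping. The scale and
# timing constants are illustrative assumptions; only the color-to-
# instrument table and the general mapping come from the article.

# Higher rows (closer to the top of the image) map to higher notes.
SCALE = ["C3", "D3", "E3", "G3", "A3", "C4", "D4", "E4", "G4", "A4"]

# One instrument per color, as described in the article.
INSTRUMENTS = {
    "white": "vocals",
    "blue": "trumpet",
    "red": "reggae organ",
    "green": "synthesized reed",
    "yellow": "violin",
}

def encode_pixel(row, col, n_rows, n_cols, color, brightness, scan_ms=1000):
    """Map one pixel to a (note, onset, loudness, instrument) event.

    row 0 is the top of the image, so it gets the highest note;
    columns are scanned left to right, so a larger col means a
    later onset relative to the start-of-scan cue.
    """
    if color == "black":
        return None  # black pixels are silent
    pitch = SCALE[(n_rows - 1 - row) * len(SCALE) // n_rows]
    onset_ms = col * scan_ms / n_cols  # later onset = farther right
    return {"note": pitch, "onset_ms": onset_ms,
            "loudness": brightness, "instrument": INSTRUMENTS[color]}

# A bright red pixel near the top-left sounds early, high, and loud:
event = encode_pixel(row=1, col=4, n_rows=10, n_cols=40,
                     color="red", brightness=0.9)
```

Scanning every pixel of each video frame this way yields the “soundscape” the user learns to interpret.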

"We demonstrated in this study that the EyeMusic, which employs pleasant musical scales to convey visual information, can be used after a short training period (in some cases, less than half an hour) to guide movements, similar to movements guided visually," explain lead investigators Dr. Shelly Levy-Tzedek, an ELSC researcher at the Faculty of Medicine of the Hebrew University of Jerusalem, and Prof. Amir Amedi. "The level of accuracy reached in our study indicates that performing daily tasks with an SSD is feasible, and indicates a potential for rehabilitative use."

The study tested the ability of 18 blindfolded sighted individuals to perform movements guided by the EyeMusic, and compared those movements to those performed with visual guidance. At first, the blindfolded participants underwent a short familiarization session, where they learned to identify the location of a single object (a white square) or of two adjacent objects (a white and a blue square).

In the test sessions, participants used a stylus on a digitizing tablet to point to a white square located to the north, south, east, or west. In one block of trials they were blindfolded (SSD block), and in the other block (VIS block) the arm was placed under an opaque cover, so they could see the screen but did not have direct visual feedback from the hand. The endpoint location of their hand was marked by a blue square. In the SSD block, they received feedback via the EyeMusic. In the VIS block, the feedback was visual.

"Participants were able to use auditory information to create a relatively precise spatial representation," notes Dr. Levy-Tzedek.

The study lends support to the hypothesis that representation of space in the brain may not be dependent on the modality with which the spatial information is received, and that very little training is required to create a representation of space without vision, using sounds to guide fast and accurate movements. “SSDs may have great potential to provide detailed spatial information for the visually impaired, allowing them to interact with their external environment and successfully make movements based on this information, but further research is now required to evaluate the use of our device in the blind,” concludes Dr. Levy-Tzedek. These results demonstrate the potential application of the EyeMusic in performing everyday tasks — from accurately reaching for the red (but not the green!) apples in the produce aisle, to, perhaps one day, playing a Kinect/Xbox game.

Source: Science Daily

Jul 6, 2012 · 134 notes
#science #neuroscience #brain #psychology #vision
How a protein meal tells your brain you are full

July 5, 2012

Feeling full involves more than just the uncomfortable sensation that your waistband is getting tight. Investigators reporting online on July 5th in the Cell Press journal Cell have now mapped out the signals that travel between your gut and your brain to generate the feeling of satiety after eating a protein-rich meal. Understanding this back and forth loop between the brain and gut may pave the way for future approaches in the treatment and/or prevention of obesity.

(Credit: Duraffourd et al., Cell)

Food intake can be modulated through mu-opioid receptors (MORs, which also bind morphine) on nerves found in the walls of the portal vein, the major blood vessel that drains blood from the gut. Specifically, stimulating the receptors enhances food intake, while blocking them suppresses intake. Investigators have now found that peptides, the products of digested dietary proteins, block MORs, curbing appetite. The peptides send signals to the brain that are then transmitted back to the gut to stimulate the intestine to release glucose, suppressing the desire to eat.

Mice that were genetically engineered to lack MORs did not carry out this release of glucose, nor did they show signs of ‘feeling full’ after eating high-protein foods. Giving them MOR stimulators or inhibitors did not affect their food intake, unlike normal mice.

Because MORs are also present in the neurons lining the walls of the portal vein in humans, the mechanisms uncovered here may also take place in people.

"These findings explain the satiety effect of dietary protein, which is a long-known but unexplained phenomenon,” says senior author Dr. Gilles Mithieux of the Université de Lyon, in France. “They provide a novel understanding of the control of food intake and of hunger sensations, which may offer novel approaches to treat obesity in the future,” he adds.

Provided by Cell Press

Source: medicalxpress.com

Jul 6, 2012 · 42 notes
#science #neuroscience #brain #psychology #obesity #proteins
Diabetes Drug Makes Brain Cells Grow

ScienceDaily (July 5, 2012) — The widely used diabetes drug metformin comes with a rather unexpected and alluring side effect: it encourages the growth of new neurons in the brain. The study, reported in the July 6th issue of Cell Stem Cell, a Cell Press publication, also finds that these neural effects make mice smarter.


New research finds that the widely used diabetes drug metformin comes with a rather unexpected and alluring side effect: it encourages the growth of new neurons in the brain. (Credit: iStockphoto/Guido Vrola)

The discovery is an important step toward therapies that aim to repair the brain not by introducing new stem cells but rather by spurring those that are already present into action, says the study’s lead author Freda Miller of the University of Toronto-affiliated Hospital for Sick Children. The fact that the drug is so widely used and so safe makes the news all the more encouraging.

Earlier work by Miller’s team highlighted a pathway known as aPKC-CBP for its essential role in telling neural stem cells where and when to differentiate into mature neurons. As it happened, others had found before them that the same pathway is important for the metabolic effects of the drug metformin, but in liver cells.

"We put two and two together," Miller says. If metformin activates the CBP pathway in the liver, they thought, maybe it could also do that in neural stem cells of the brain to encourage brain repair.

The new evidence lends support to that promising idea in both mouse brains and human cells. Mice taking metformin not only showed an increase in the birth of new neurons, but they were also better able to learn the location of a hidden platform in a standard maze test of spatial learning.

While it remains to be seen whether the very popular diabetes drug might already be serving as a brain booster for those who are now taking it, there are already some early hints that it may have cognitive benefits for people with Alzheimer’s disease. It had been thought those improvements were the result of better diabetes control, Miller says, but it now appears that metformin may improve Alzheimer’s symptoms by enhancing brain repair.

Miller says they now hope to test whether metformin might help repair the brains of those who have suffered brain injury due to trauma or radiation therapies for cancer.

Source: Science Daily

Jul 6, 2012 · 61 notes
#science #neuroscience #psychology #diabetes #brain
Brain Center for Social Choices Discovered: Poker-Playing Subjects Seen Weighing Whether to Bluff

ScienceDaily (July 5, 2012) — Although many areas of the human brain are devoted to social tasks like detecting another person nearby, a new study has found that one small region carries information only for decisions during social interactions. Specifically, the area is active when we encounter a worthy opponent and decide whether to deceive them.


(Credit: © wtamas / Fotolia)

A brain imaging study conducted by researchers at the Duke Center for Interdisciplinary Decision Science (D-CIDES) put human subjects through a functional MRI brain scan while playing a simplified game of poker against a computer and human opponents. Using computer algorithms to sort out what amount of information each area of the brain was processing, the team found only one brain region — the temporal-parietal junction, or TPJ — carried information that was unique to decisions against the human opponent.

Some of the time, the subjects were dealt an obviously weak hand. The researchers wanted to see whether they could watch the player calculate whether to bluff his opponent. The brain signals in the TPJ told the researchers whether the subject would soon bluff against a human opponent, especially if that opponent was judged to be skilled. But against a computer, signals in the TPJ did not predict the subject’s decisions.

The TPJ is in a boundary area of the brain, and may be an intersection for two streams of information, said lead researcher McKell Carter, a postdoctoral fellow at Duke. It brings together a flow of attentional information and biological information, such as “is that another person?”

Carter observed that in general, participants paid more attention to their human opponent than their computer opponent while playing poker, which is consistent with humans’ drive to be social.

Throughout the poker game experiment, regions of the brain that are typically thought to be social in nature did not carry information specific to a social context. “The fact that all of these brain regions that should be specifically social are used in other circumstances is a testament to the remarkable flexibility and efficiency of our brains,” said Carter.

"There are fundamental neural differences between decisions in social and non-social situations," said D-CIDES Director Scott Huettel, the Hubbard professor of psychology & neuroscience at Duke and senior author of the study. "Social information may cause our brain to play by different rules than non-social information, and it will be important for both scientists and policymakers to understand what causes us to approach a decision in a social or a non-social manner.

"Understanding how the brain identifies important competitors and collaborators — those people who are most relevant for our future behavior — will lead to new insights into social phenomena like dehumanization and empathy," Huettel added.

Source: Science Daily

Jul 6, 2012 · 15 notes
#science #neuroscience #brain #psychology
Scientific Study Reveals That Individuals Cooperate According to Their Emotional State and Their Prior Experiences

ScienceDaily (July 4, 2012) — A study by researchers at Universidad Carlos III de Madrid and Universidad de Zaragoza has determined that when deciding whether to cooperate with others, people do not act out of calculation about their own reward, as had previously been believed. Rather, individuals are more influenced by their mood at the time and by the number of individuals with whom they have cooperated before.

In addition to previous studies, this research is based on an experiment carried out by the Institute for Biocomputation and Physics of Complex Systems (BIFI) at the Universidad de Zaragoza, together with the Fundación Ibercivis and Universidad Carlos III de Madrid (UC3M), the largest real-time study of its kind to date on cooperation in society. It took place this past December, with 1,200 Aragonese secondary students interacting electronically in real time in a prototypical social conflict known as the “Prisoner’s Dilemma.” In this game, the greatest joint benefit is produced when both players collaborate, but if one collaborates and the other does not, the defector receives more than the cooperator. On occasion this allows an individual to take advantage of the cooperation of others, but if the tendency spreads, in the end no one cooperates and nobody obtains rewards.
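The incentive structure just described can be made concrete with a standard Prisoner’s Dilemma payoff table. The point values below are the textbook illustration (temptation > reward > punishment > sucker’s payoff), not the scores actually used in the Aragon experiment, which the article does not give:

```python
# Illustrative Prisoner's Dilemma payoffs (T > R > P > S); the
# experiment's actual point values are an unknown here.
PAYOFFS = {
    ("C", "C"): (3, 3),  # both cooperate: good for both
    ("C", "D"): (0, 5),  # lone cooperator is exploited
    ("D", "C"): (5, 0),  # lone defector gets the most
    ("D", "D"): (1, 1),  # mutual defection: nobody gains much
}

def play(a, b):
    """Return the (player_a, player_b) payoffs for one round."""
    return PAYOFFS[(a, b)]

# Defecting against a cooperator pays best individually (5 > 3),
# but if everyone reasons this way, both players end up with 1
# point per round instead of 3.
```

This is exactly the tension the article describes: exploiting cooperators is individually tempting, but if the tendency spreads, rewards collapse for everyone.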

After analyzing the data, the main conclusion drawn by the researchers is that in a situation where cooperating with others is beneficial, the way the individuals involved are organized into one social structure or another is irrelevant. This finding contradicts what many researchers have held on the basis of theoretical studies.

In the experiment, the degree of cooperation in a network in which each subject interacts with four other individuals is compared to a network in which the number of connections varies between 2 and 16, that is, one more similar to a real social network. The results in the two networks turned out to be identical. “This happens because, contrary to what has been proposed in the majority of studies, people do not make their decisions based on the rewards obtained (by them or by their neighbors), but rather based on how many people have recently cooperated with them, as well as on their own mood at the time,” the researchers explained.

These results help explain how people make decisions, above all in contexts where one must choose between collaborating with others and taking advantage of them. “Understanding why we do one thing or another can help in designing incentives that induce people to cooperate,” the authors pointed out. The finding that networks are unimportant also has implications for organizational design, for example: the experiment revealed that people will not cooperate more simply because they are organized in a certain way. In this respect, the concern should not be the design of the organizational structure, but rather motivating people individually to cooperate.

Ruling out that network organization influences people’s cooperation, and discovering that what matters is reciprocity, that is, cooperating in proportion to the cooperation received, will radically change the focus of the many researchers developing theories on the emergence of cooperation among individuals.

Source: Science Daily

Jul 5, 2012 · 32 notes
#science #neuroscience #brain #psychology
Skin patch improves attention span in stroke patients

July 4, 2012

(Medical Xpress) — Researchers at the UCL Institute of Neurology have found that giving the drug rotigotine as a skin patch can improve inattention in some stroke patients.

Hemi-spatial neglect, a severe and common form of inattention that can be caused by brain damage following a stroke, is one of the most debilitating symptoms, frequently preventing patients from living independently. When the right side of the brain has suffered damage, the patient may have little awareness of their left-hand side and have poor memory of objects that they have seen, leaving them inattentive and forgetful. Currently there are few treatment options.

The randomised controlled trial took 16 patients who had suffered a stroke on the right-hand side of the brain and assessed whether giving the drug rotigotine improved their ability to attend to their left-hand side. The results showed that even with treatment lasting just over a week, patients who received the drug performed significantly better on attention tests than when they received the placebo treatment.

Rotigotine acts by stimulating receptors on nerve cells for dopamine, a chemical normally produced within the brain.

Professor Masud Husain who led the study at the Institute of Neurology at UCL says: “Inattention can have a devastating effect on stroke patients and their families. It impacts on all aspects of their lives. If the results of our clinical trial are replicated in further, larger studies, we will have overcome a major hurdle towards providing a new treatment for this important consequence of stroke.

“Milder forms of inattention occur in other brain disorders, across all ages - from ADHD (attention deficit hyperactivity disorder) to Parkinson’s disease. Our findings show that it is possible to alter attention by using a drug that acts at specific receptors in the brain, and therefore have implications for understanding the mechanisms that might cause inattention in conditions other than stroke.”

Provided by University College London

Source: medicalxpress.com

Jul 5, 2012 · 7 notes
#science #neuroscience #brain #psychology #stroke
Artificial Cerebellum That Enables Robotic Human-Like Object Handling Developed

ScienceDaily (July 3, 2012) — University of Granada researchers have developed an artificial cerebellum (a biologically-inspired adaptive microcircuit) that controls a robotic arm with human-like precision. The cerebellum is the part of the human brain that controls the locomotor system and coordinates body movements.

To date, although robot designers have achieved very precise movements, such movements are performed at very high speed, require strong forces, and consume considerable power. This approach cannot be applied to robots that interact with humans, as a malfunction might be potentially dangerous.

To solve this challenge, University of Granada researchers have implemented a new cerebellar spiking model that adapts to corrections and stores their sensorial effects; in addition, it records motor commands to predict the action or movement to be performed by the robotic arm. This cerebellar model allows the user to articulate a state-of-the-art robotic arm with extraordinary mobility.

Automatic Learning

The developers of the new cerebellar model have obtained a robot that performs automatic learning by extracting the input layer functionalities of the brain cortex. Furthermore, they have developed two control systems that enable accurate and robust control of the robotic arm during object handling.

The synergy between the cerebellum and the automatic control system enables the robot to adapt to changing conditions, i.e., to interact with humans. The biologically inspired architectures used in this model combine the error-training approach with predictive adaptive control.
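The combination of error training and predictive control can be sketched in a few lines: a feedback loop corrects each movement, while a learned feedforward term gradually absorbs those corrections so that later movements need less correcting. The toy plant, disturbance, and learning rate below are illustrative assumptions; the Granada model itself uses spiking neural microcircuits, not this scalar update.

```python
# Minimal sketch of error-driven adaptation of a predictive
# (feedforward) motor command, in the spirit of cerebellar adaptive
# control. Plant model, disturbance, and learning rate are
# illustrative assumptions, not the published spiking model.

def run_reaching_trials(target=1.0, disturbance=0.3, lr=0.5, trials=20):
    """Repeat a reach toward `target` under a constant disturbance.

    After each trial the feedforward command is updated from the
    observed error (the 'stored correction'), so the error shrinks
    trial by trial even though the disturbance never goes away.
    """
    feedforward = 0.0
    errors = []
    for _ in range(trials):
        position = feedforward - disturbance  # toy plant: command minus load
        error = target - position             # sensed movement error
        feedforward += lr * error             # store the correction
        errors.append(abs(error))
    return errors

errors = run_reaching_trials()
# Early trials show large errors; after adaptation the arm lands
# close to the target despite the constant disturbance.
```

The design point this illustrates: once the predictive term has learned, the feedback corrections become small and gentle, which is precisely what makes such control safe around humans.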

The designers of this model are Silvia Tolu, Jesús Garrido and Eduardo Ros Vidal, at the University of Granada Department of Computer Architecture and Technology, and the University of Almería researcher Richard Carrillo.

Source: Science Daily

Jul 5, 2012 · 39 notes
#science #neuroscience #brain #psychology
Childhood Adversity Increases Risk for Depression and Chronic Inflammation

ScienceDaily (July 3, 2012) — When a person injures their knee, it becomes inflamed. When a person has a cold, their throat becomes inflamed. This type of inflammation is the body’s natural and protective response to injury.

Interestingly, there is growing evidence that a similar process happens when a person experiences psychological trauma. Unfortunately, this type of inflammation can be destructive.

Previous studies have linked depression and inflammation, particularly in individuals who have experienced early childhood adversity, but overall, findings have been inconsistent. Researchers Gregory Miller and Steve Cole designed a longitudinal study in an effort to resolve these discrepancies, and their findings are now published in a study in Biological Psychiatry.

They recruited a large group of female adolescents who were healthy, but at high risk for experiencing depression. The volunteers were then followed for 2 ½ years, undergoing interviews and giving blood samples to measure their levels of C-reactive protein and interleukin-6, two types of inflammatory markers. Their exposure to childhood adversity was also assessed.

The researchers found that when individuals who suffered from early childhood adversity became depressed, their depression was accompanied by an inflammatory response. In addition, among subjects with previous adversity, high levels of interleukin-6 forecasted risk of depression six months later. In subjects without childhood adversity, there was no such coupling of depression and inflammation.

Dr. Miller commented on their findings: “What’s important about this study is that it identifies a group of people who are prone to have depression and inflammation at the same time. That group of people experienced major stress in childhood, often related to poverty, having a parent with a severe illness, or lasting separation from family. As a result, these individuals may experience depressions that are especially difficult to treat.”

Another important aspect to their findings is that the inflammatory response among the high-adversity individuals was still detectable six months later, even if their depression had abated, meaning that the inflammation is chronic rather than acute. “Because chronic inflammation is involved in other health problems, like diabetes and heart disease, it also means they have greater-than-average risk for these problems. They, along with their doctors, should keep an eye out for those problems,” added Dr. Miller.

"This study provides important additional support for the notion that inflammation is an important and often under-appreciated factor that compromises resilience after major life stresses. It provides evidence that these inflammatory states persist for long periods of time and have important functional correlates," said Dr. John Krystal, Editor of Biological Psychiatry.

Further research is necessary to extend the findings beyond female adolescents, and particularly to individuals with more severe, long-term depression. However, findings such as these may eventually help doctors and clinicians better manage depression and medical illness in particularly vulnerable patients.

Source: medicalxpress.com

Jul 5, 2012 · 31 notes
#science #neuroscience #depression #brain #psychology
Molecular Clues to Link Between Childhood Maltreatment and Later Suicide

ScienceDaily (July 3, 2012) — Exposure to childhood maltreatment increases the risk for most psychiatric disorders as well as many negative consequences of these conditions. This new study, by Dr. Gustavo Turecki and colleagues at McGill University, Canada, provides important insight into one of the most extreme outcomes, suicide.

"In this study, we expanded our previous work on the epigenetic regulation of the glucocorticoid receptor gene by investigating the impact of severe early-life adversity on DNA methylation," explained Dr. Turecki. The glucocorticoid receptor is important because it is a brain target for the stress hormone cortisol.

The researchers studied brain tissue from people who had committed suicide, some of whom had a history of childhood maltreatment, and compared that tissue to tissue from people who had died from other causes. They found that particular variants of the glucocorticoid receptor were less likely to be present in the limbic system, or emotion circuit, of the brain in people who had committed suicide and had been maltreated as children, compared to the other two groups.

This study also advances the understanding of how the altered pattern of glucocorticoid receptor regulation developed in the maltreated suicide completers. The authors found that the pattern of methylation of the gene coding for the glucocorticoid receptors was altered in those who completed suicide and who also had a history of abuse. These DNA methylation differences were associated with distinct gene expression patterns.

Since methylation is one way that genes are switched on or off for long periods of time, it appears that childhood adversity can produce long-lasting changes in the regulation of a key stress response system that may be associated with increased risk for suicide.

"Preventing suicide is a critical challenge for psychiatry. This study provides important new information about brain changes that may increase the risk of suicide," said Dr. John Krystal, Editor of Biological Psychiatry. "It is striking that early life maltreatment can produce these long-lasting changes in the control of specific genes in the brain. It is also troubling that the consequences of this process can be so dire. Thus, it is important that we continue to study these epigenetic processes that seem to underlie aspects of the lasting consequences of childhood adversity."

Source: Science Daily

Jul 5, 2012 · 34 notes
#science #neuroscience #brain #psychology
Adult Stem Cells from Bone Marrow: Cell Replacement and Tissue Repair Potential Shown in Animal Model

ScienceDaily (July 3, 2012) — Researchers from the University of Maryland School of Medicine report promising results from using adult stem cells from bone marrow in mice to help create tissue cells of other organs, such as the heart, brain and pancreas — a scientific step they hope may lead to potential new ways to replace cells lost in diseases such as diabetes, Parkinson’s or Alzheimer’s.

The research in collaboration with the University of Paris Descartes is published online in the June 29, 2012 edition of Comptes Rendus Biologies, a publication of the French Academy of Sciences.

"Finding stem cells capable of restoring function to different damaged organs would be the Holy Grail of tissue engineering," says lead author David Trisler, PhD, assistant professor of neurology at the University of Maryland School of Medicine.

He adds, “This research takes us another step in that process by identifying the potential of these adult bone marrow cells, or a subset of them known as CD34+ bone marrow cells, to be ‘multipotent,’ meaning they could transform and function as the normal cells in several different organs.”

University of Maryland researchers previously developed a special culturing system to collect a select sample of these adult stem cells in bone marrow, which normally makes red and white blood cells and immune cells. In this project, the team followed a widely recognized study model, used to prove the multipotency of embryonic stem cells, to prove that these bone marrow stem cells could make more than just blood cells. The investigators also found that the CD34+ cells had a limited lifespan and did not produce teratomas, tumors that sometimes form with the use of embryonic stem cells and adult stem cells cultivated from other methods that require some genetic manipulation.

"When taken at an early stage, we found that the CD34+ cells exhibited similar multipotent capabilities as embryonic stem cells, which have been shown to be the most flexible and versatile. Because these CD34+ cells already exist in normal bone marrow, they offer a vast source for potential cell replacement therapy, particularly because they come from a person’s own body, eliminating the need to suppress the immune system, which is sometimes required when using adult stem cells derived from other sources," explains Paul Fishman, MD, PhD, professor of neurology at the University of Maryland School of Medicine.

The researchers say that proving the potential of these adult bone marrow stem cells opens new possibilities for scientific exploration, but that more research will be needed to see how this science can be translated to humans.

Source: Science Daily

Jul 5, 2012 · 4 notes
#science #neuroscience #brain #parkinson #alzheimer
Why Current Strategies for Fighting Obesity Are Not Working

ScienceDaily (July 3, 2012) — As the United States confronts the growing epidemic of obesity among children and adults, a team of University of Colorado School of Medicine obesity researchers concludes that what the nation needs is a new battle plan — one that replaces the emphasis on widespread food restriction and weight loss with an emphasis on helping people achieve “energy balance” at a healthy body weight.

In a paper published in the July 3 issue of the journal Circulation, James O. Hill, PhD, and colleagues at the Anschutz Health and Wellness Center take on the debate over whether excessive food intake or insufficient physical activity causes obesity. They use the lens of energy balance — which combines food intake, energy expended through physical activity, and energy (fat) storage — to advance the concept of a “regulated zone,” in which the mechanisms by which the body establishes energy balance are managed to overcome the body’s natural defense of its existing weight. This is accomplished by strategies that match food and beverage intake to a higher level of energy expenditure than is typical in America today, enabling the biological system that regulates body weight to work more effectively. Additional support for this concept comes from many studies showing that higher levels of physical activity are associated with low weight gain, whereas comparatively low levels of activity are linked to high weight gain over time.

"A healthy body weight is best maintained with a higher level of physical activity than is typical today and with an energy intake that matches," explained Hill, professor of pediatrics and medicine and executive director of the Anschutz Health and Wellness Center at the University of Colorado Anschutz Medical Campus and the lead author of the paper. "We are not going to reduce obesity by focusing only on reducing food intake. Without increasing physical activity in the population we are simply promoting unsustainable levels of food restriction. This strategy hasn’t worked so far and it is not likely to work in the future."

As Dr. Hill explains, “What we are really talking about is changing the message from ‘Eat Less, Move More’ to ‘Move More, Eat Smarter.’”

The authors argue that preventing excessive weight gain is a more achievable goal than treating obesity once it is present. Here, the researchers stress that closing an energy gap of just 100 calories a day would prevent weight gain in 90 percent of the adult population, a goal achievable through small increases in physical activity and small changes in food intake.

People who have a low level of physical activity have trouble achieving energy balance because they must constantly use food restriction to match energy intake to a low level of energy expenditure. Constant food restriction is difficult to maintain long-term and when it cannot be maintained, the result is positive energy balance (when the calories consumed are greater than the calories expended) and an increase in body mass, of which 60 percent to 80 percent is usually body fat. The increasing body mass elevates energy expenditure and helps reestablish energy balance. In fact, the researchers speculate that becoming obese may be the only way to achieve energy balance when living a sedentary lifestyle in a food-abundant environment.
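As a back-of-the-envelope illustration of the positive-energy-balance arithmetic described above, here is a minimal sketch. The common 3,500-calories-per-pound-of-fat rule of thumb and the intake/expenditure figures are illustrative assumptions, not numbers taken from the paper:

```python
# Toy sketch of the energy-balance bookkeeping described above.
# Assumes the common ~3,500 kcal per pound of body fat rule of thumb;
# the intake/expenditure figures are illustrative, not from the study.

KCAL_PER_LB_FAT = 3500

def yearly_weight_change_lb(intake_kcal_day, expenditure_kcal_day):
    """Naively project a constant daily surplus into pounds of fat per year."""
    daily_surplus = intake_kcal_day - expenditure_kcal_day
    return daily_surplus * 365 / KCAL_PER_LB_FAT

# A sedentary person eating just 100 kcal/day above expenditure:
print(round(yearly_weight_change_lb(2400, 2300), 1))  # 10.4 (lb/year)

# The same intake matched by a higher activity level (energy balance):
print(yearly_weight_change_lb(2400, 2400))  # 0.0
```

Note that this projection deliberately ignores the compensatory feedback the authors describe: as body mass grows, expenditure rises and the surplus shrinks, which is why real-world weight gain is far smaller than such naive extrapolations suggest.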

Using an exhaustive review of the energy balance literature as their basis, the researchers also refuted the popular theory that escalating obesity rates can be attributed exclusively to changes in the American diet and a rise in overall energy intake without a compensatory increase in energy expenditure. Using rough estimates of increases in food intake and decreases in physical activity from 1971 to 2000, the researchers calculated that, were it not for the physiological processes that produce energy balance, American adults would have experienced a 30- to 80-fold greater weight gain during that period, which demonstrates why it is not realistic to attribute obesity solely to caloric intake or physical activity levels. In fact, energy expenditure has dropped dramatically over the past century as our lives now require much less physical activity just to get through the day. The authors argue that this drop in energy expenditure was a necessary prerequisite for the current obesity problem, and that a greater level of physical activity must therefore be added back into modern life.

"Addressing obesity requires attention to both food intake and physical activity," said co-author John Peters, PhD, assistant director of the Anschutz Health and Wellness Center. "Strategies that focus on either alone will not likely work."

In addition, the researchers conclude that food restriction alone is not effective in reducing obesity. Although caloric restriction produces weight loss, the process triggers hunger and the body’s natural defense of its existing weight, leading to a lower resting metabolic rate and notable changes in how the body burns calories. As a result, daily energy requirements can fall by 170 to 250 calories after a 10 percent weight loss and by 325 to 480 calories after a 20 percent weight loss. These findings provide insight into the weight-loss plateau and the common occurrence of regaining weight after completing a weight-loss regimen.

Recognizing that energy balance is a new concept to the public, the researchers call for educational efforts and new information tools to teach Americans about energy balance and how food and physical activity choices affect it.

Source: Science Daily

Jul 5, 2012 · 25 notes
#science #neuroscience #obesity #psychology
Jul 5, 2012 · 22 notes
#science #neuroscience #brain #toxoplasma #animals
Jul 5, 2012 · 57 notes
#science #neuroscience #brain #psychology
Bees Can 'Turn Back Time,' Reverse Brain Aging

ScienceDaily (July 3, 2012) — Scientists at Arizona State University have discovered that older honey bees effectively reverse brain aging when they take on nest responsibilities typically handled by much younger bees. While current research on human age-related dementia focuses on potential new drug treatments, researchers say these findings suggest that social interventions may be used to slow or treat age-related dementia.

Old bees collect nectar and pollen. Most bees start doing this job when they are 3-4 weeks old, and after that they age very quickly. Their bodies and wings become worn and they lose the ability to learn new things. Most food collector bees die after about 10 days. (Credit: Christofer Bang)

In a study published in the scientific journal Experimental Gerontology, a team of scientists from ASU and the Norwegian University of Life Sciences, led by Gro Amdam, an associate professor in ASU’s School of Life Sciences, presented findings that show that tricking older, foraging bees into doing social tasks inside the nest causes changes in the molecular structure of their brains.

"We knew from previous research that when bees stay in the nest and take care of larvae — the bee babies — they remain mentally competent for as long as we observe them," said Amdam. "However, after a period of nursing, bees fly out gathering food and begin aging very quickly. After just two weeks, foraging bees have worn wings, hairless bodies, and more importantly, lose brain function — basically measured as the ability to learn new things. We wanted to find out if there was plasticity in this aging pattern so we asked the question, ‘What would happen if we asked the foraging bees to take care of larval babies again?’"

During experiments, scientists removed all of the younger nurse bees from the nest — leaving only the queen and babies. When the older, foraging bees returned to the nest, activity diminished for several days. Then, some of the old bees returned to searching for food, while others cared for the nest and larvae. Researchers discovered that after 10 days, about 50 percent of the older bees caring for the nest and larvae had significantly improved their ability to learn new things.

Amdam’s international team not only saw a recovery in the bees’ ability to learn, they discovered a change in proteins in the bees’ brains. When comparing the brains of the bees that improved relative to those that did not, two proteins noticeably changed. They found Prx6, a protein also found in humans that can help protect against dementia — including diseases such as Alzheimer’s — and they discovered a second, well-documented “chaperone” protein that protects other proteins from being damaged when the brain or other tissues are exposed to cell-level stress.

In general, researchers are interested in creating a drug that could help people maintain brain function, yet they may be facing up to 30 years of basic research and trials.

"Maybe social interventions — changing how you deal with your surroundings — is something we can do today to help our brains stay younger," said Amdam. "Since the proteins being researched in people are the same proteins bees have, these proteins may be able to spontaneously respond to specific social experiences."

Amdam suggests further studies are needed on mammals such as rats in order to investigate whether the same molecular changes that the bees experience might be socially inducible in people.

Source: Science Daily

Jul 4, 2012 · 33 notes
#science #neuroscience #brain #animals #psychology
Road-mapping the Asian brain

July 3, 2012

Scientists at The University of Nottingham are leading research that will develop the world’s first ‘atlas’ of the Asian brain.

Working in collaboration with colleagues in South Korea, the project aims to build a detailed picture of how the Asian brain develops normally, taking into account the differences and variations which occur from person to person.

The resulting road-map of the brain could be used to help doctors in countries like South Korea, Japan and China to develop new diagnostic tools for age-related neurodegenerative diseases such as Alzheimer’s, Parkinson’s and dementia, allowing them to spot illnesses at a much earlier stage, thereby improving treatment options and outcomes.

The two-year project will marry the expertise of Nottingham academics in advanced brain imaging techniques, including ultra high field magnetic resonance imaging (MRI), with the clinical expertise and specialist computer software development skills of researchers at Korea University in Seoul.

Stephen Jackson, Professor of Cognitive Neuroscience in the University’s School of Psychology, said: “Developing this atlas of the Asian brain will be a major step forward in furthering the field of neuroscience, which is developing rapidly in the East.

"We hope this two-year project will also act as a template for further UK-South Korean collaboration and knowledge transfer, which has been highlighted by Government as a strategic priority."

The project, initially funded with a Global Partnership Fund grant from the British Foreign and Commonwealth Office, will see the Nottingham academics working with colleagues in the College of Medicine, Biomedical Engineering, and Psychology at Korea University, to scan the brains of healthy Asian adults using advanced MRI techniques.

Data from the hundreds of images produced will then be analysed and computer modelling techniques used to build up a detailed picture of how a normal Asian brain develops in adults, taking into account the slight variations that occur from person to person.

There are subtle differences in the size and genetics of the Asian brain compared to its Western cousin and the research will allow for the development of new diagnostic aids for age-related neuro-degenerative diseases which are specifically tailored to Asian patients.

The research will build on The University of Nottingham’s reputation as a world-leader in MRI research — the technique was invented there by Professor Sir Peter Mansfield, whose work jointly earned him the Nobel Prize for Medicine in 2003.

Biomedical imaging remains a strategic research priority for Nottingham through its Sir Peter Mansfield Magnetic Resonance Centre, which hosts the UK’s only 7 Tesla MRI scanner.

The University has recently established a UK Centre for Child Neuroimaging, a core theme of Nottingham’s Impact Campaign, the biggest fundraising campaign in The University of Nottingham’s 130 year history. It aims to raise £150m to transform research, enrich the student experience and enable the institution to make an even greater contribution to the global communities it serves.

The work to map the Asian brain will also involve collaboration with academics at other UK and European institutions, including University College London, the Institute of Neurology, Institute of Psychiatry, Imperial College and the University of Aachen in Germany.

The collaboration between The University of Nottingham and Korea University is the latest in a long-running relationship between the two higher education institutions and follows the signing of a Memorandum of Understanding, along with 12 other universities in the Universitas 21 group, in 2009 that aimed to offer postgraduate students international opportunities through a joint PhD programme.

Provided by University of Nottingham

Source: medicalxpress.com

Jul 4, 2012 · 18 notes
#science #neuroscience #brain #psychology
3-D Movies Linked to Increased Vision Symptoms

ScienceDaily (July 2, 2012) — Watching 3D movies can “immerse” you in the experience — but can also lead to visual symptoms and even motion sickness, reports a study — “Stereoscopic Viewing and Reported Perceived Immersion and Symptoms,” in the July issue of Optometry and Vision Science, official journal of the American Academy of Optometry.

The journal is published by Lippincott Williams & Wilkins, a part of Wolters Kluwer Health.

Symptoms related to 3D viewing are affected by where you sit while watching, and even how old you are. “Younger viewers incurred higher immersion but also greater visual and motion sickness symptoms in 3D viewing,” according to the authors, led by Shun-nan Yang, PhD, of Pacific University College of Optometry, Forest Grove, Ore. “Both [problems] will be reduced if a farther distance and a wider viewing angle are adopted.”

Greater ‘Immersion’ in 3D Also Associated With Increased Symptoms

The researchers performed experiments in which adults, from young adult to middle-aged, were invited to watch a movie (Cloudy with a Chance of Meatballs) in 2D or 3D while sitting at different angles and distances. Visual and other symptoms were assessed — including the role of factors including age, seating position, and level of “immersion” in the movie.

Twenty-one percent of participants reported symptoms while watching the movie in 3D, compared to twelve percent with 2D viewing. For younger study participants blurred vision, double vision, dizziness, disorientation, and nausea were all more frequent and severe when watching the movie in 3D.

3D viewing also led to a greater sense of immersion — “a greater sense of object motion and motion of the viewer in space” — compared to 2D viewing. Subjects sitting in more central or closer positions reported greater immersion as well as increased symptoms of motion sickness — that is, nausea. Sitting at an angle to the screen was associated with less immersion as well as reduced motion symptoms.

There were some differences by age, including a lower rate of blurred vision in older viewers (age 46 and older). Older viewers had more visual and motion sickness symptoms in 2D viewing, while younger viewers (age 24 to 34) had more symptoms in 3D viewing. The same age-related changes that lead to lower rates of blurred vision in older viewers may also explain their lower rates of symptoms during 3D viewing.

As 3D movies become more common, including on home screens, there are reports of visual and other symptoms among 3D viewers. Vision and orientation symptoms related to 3D viewing may be related to a “mismatch” between focusing and converging the eyes. Anthony Adams, OD, PhD, Editor-in-Chief of Optometry and Vision Science notes “the technology for reducing mismatch between where the eyes converge and where they focus is likely to improve rapidly.”

The study identifies several factors associated with symptoms during 3D viewing. “3D viewing is quite specific in causing blurred vision and double vision, and the resultant symptoms are greater for younger adults,” Dr Yang and colleagues write. 3D produces a greater sense of immersion than 2D viewing, which leads to more symptoms of motion sickness — especially for younger adults and when viewing from a closer distance and a more direct angle.

The study will help optometrists and other eye care professionals in talking to patients about visual and other symptoms related to today’s sophisticated 3D video setups.

Source: Science Daily

Jul 4, 2012 · 12 notes
#science #neuroscience #brain #psychology #vision
Novel Mechanism and Potential Link Responsible for Huntington's Disease

ScienceDaily (July 2, 2012) — Using an in vitro cell model of Huntington’s disease (HD), researchers at Florida Atlantic University’s Charles E. Schmidt College of Medicine have discovered a novel mechanism and potential link between mutant huntingtin, cell loss and cell death or apoptosis in the brain, which is responsible for the devastating effects of this disease. Apoptosis has been proposed as one of the mechanisms leading to neuronal death in HD.

Dr. Jianning Wei, Ph.D., assistant professor of biomedical science in the Schmidt College of Medicine, has received a $428,694 grant from the National Institutes of Health (NIH) for a project titled “Regulation of BimEL phosphorylation in the pathogenesis of Huntington’s disease.” With this grant, she will further her research and investigation of the molecular and physiological functions of BimEL, a protein known to promote cell death, in a rodent HD model to better understand the pathogenesis of this disease and develop treatments and therapies to prevent or slow down its progression. Wei’s previous findings may also represent a universal mechanism in the pathogenesis of neurodegenerative diseases that are involved with protein misfolding and aggregation — a phenomenon that occurs in many highly debilitating disorders including neurodegenerative diseases.

HD is a fatal, inherited disease caused by abnormal repeats of a small segment of an individual’s DNA, or genetic code. This mutation results in the production of malfunctioning proteins, and the more repeats the protein contains, the more severe the disease. A person who has the disease carries one normal copy of the gene and one mutated copy in his or her cells. Although the mutated forms of these genes are known for their devastating effects, their normal forms are critical for nerve function, embryonic development and other bodily processes. Similar mutations in other proteins are involved in several other neurodegenerative diseases.

"HD is a highly complex genetic, neurological disorder that causes certain nerve cells in the brain to waste away, and the underlying molecular mechanism of this disease still remains elusive," said Wei. "We are continuing our research to identify the pathways in the brain that are altered in response to mutant proteins, as well as to understand the cellular processes impacted by the disease in order to facilitate the development of effective pharmacological interventions."

Named after American physician George Huntington, HD is characterized by a selective loss of neurons in the brain and affects the basal ganglia, which governs motor control, cognition, learning and emotions. It also affects the outer surface of the brain, or cortex, which controls thought, perception, and memory. It is estimated that more than 250,000 Americans have HD or are at risk of inheriting the disease from an affected parent.

"The vital research that Dr. Wei and her colleagues are conducting at Florida Atlantic University will help to shed light on a very devastating and difficult disease for which there are currently no treatments available to stop or reverse its course," said Dr. David J. Bjorkman, M.D., M.S.P.H., dean of FAU’s Charles E. Schmidt College of Medicine.

Source: Science Daily

Jul 4, 2012 · 5 notes
#science #neuroscience #brain #psychology #huntington
Chronic Inflammation in the Brain Leads the Way to Alzheimer's Disease

ScienceDaily (July 2, 2012) — Research published July 2 in Biomed Central’s open access journal Journal of Neuroinflammation suggests that chronic inflammation can predispose the brain to develop Alzheimer’s disease.

To date it has been difficult to pin down the role of inflammation in Alzheimer’s disease (AD), especially because trials of NSAIDs appeared to have conflicting results. Although the ADAPT (Alzheimer’s Disease Anti-inflammatory Prevention Trial) trial was stopped early, recent results suggest that NSAIDs can help people with early stages of AD but that prolonged treatment is necessary to see benefit.

Researchers from the University of Zurich, in collaboration with colleagues from the ETH Zurich and University of Bern investigated what impact immune system challenges (similar to having a severe viral infection) would have on the development of AD in mice. Results showed that a single infection before birth (during late gestation) was enough to induce long-term neurological changes and significant memory problems at old age.

These mice had a persistent increase in inflammatory cytokines, increased levels of amyloid precursor protein (APP), and altered cellular localization of Tau. If this immune system challenge was repeated during adulthood the effect was strongly exacerbated, resulting in changes similar to those seen for pathological aging.

Dr Irene Knuesel, who led this research, explained, “The AD-like changes within the brain of these mice occurred without an increase in amyloid β (Aβ). However, in mice genetically modified to produce the human version of Aβ, the viral-like challenge drastically increased the amount of Aβ at precisely the sites of inflammation-induced APP deposits. Based on the similarity between these APP/Aβ aggregates in mice and those found in human AD, it seems likely that chronic inflammation due to infection could be an early event in the development of AD.”

Source: Science Daily

Jul 4, 2012 · 4 notes
#science #neuroscience #brain #psychology #alzheimer
Years Before Diagnosis, Quality of Life Declines for Parkinson's Disease Patients

ScienceDaily (July 2, 2012) — Growing evidence suggests that Parkinson’s disease (PD) often starts with non-motor symptoms that precede diagnosis by several years. In the first study to examine patterns in the quality of life of Parkinson’s disease patients prior to diagnosis, researchers have documented declines in physical and mental health, pain, and emotional health beginning several years before the onset of the disease and continuing thereafter.

Their results are reported in the latest issue of Journal of Parkinson’s Disease.

"We observed a decline in physical function in PD patients relative to their healthy counterparts beginning three years prior to diagnosis in men and seven and a half years prior to diagnosis in women," says lead investigator Natalia Palacios, PhD, Department of Nutrition, Harvard School of Public Health. "The decline continues at a rate that is five to seven times faster than the average yearly decline caused by normal aging in individuals without the disease."

The study included 51,350 male health professionals enrolled in the Health Professionals Follow Up Study (HPFS) and 121,701 female registered nurses enrolled in the Nurses’ Health Study (NHS). In both ongoing studies, participants fill out biannual questionnaires about a variety of lifestyle characteristics and document the occurrence of major chronic disease. In the NHS study, questionnaires measured health-related quality of life in eight areas: physical functioning, role limitations due to physical problems, role limitations due to emotional problems, vitality, bodily pain, social functioning, mental health, and general health perceptions. In the HPFS, only physical functioning was assessed.

Researchers identified 454 men and 414 women with PD in the two cohorts. At 7.5 years prior to diagnosis, physical function among PD cases, in both men and women, was comparable to that in the overall cohort. A decline began approximately 3 years prior to diagnosis in men and approximately 7.5 years prior to diagnosis in women. Physical function continued to decline thereafter at a rate of 1.43 and 2.35 points per year in men and women, respectively. In comparison, the average yearly decline in individuals without PD was 0.23 in men and 0.42 in women. Other measures of quality of life, available only in women, declined in a similar pattern.
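The "five to seven times faster" figure quoted earlier follows directly from these reported rates; a quick check, using Python purely as a calculator:

```python
# Ratio of post-onset decline to normal age-related decline,
# using the rates reported in the study (points per year).
decline_pd      = {"men": 1.43, "women": 2.35}
decline_healthy = {"men": 0.23, "women": 0.42}

for sex in ("men", "women"):
    ratio = decline_pd[sex] / decline_healthy[sex]
    print(sex, round(ratio, 1))
# men 6.2
# women 5.6
# i.e. roughly five to seven times the normal rate, as stated.
```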

Dr. Palacios notes that a strength of the study is the availability of prospective data on both PD patients and a healthy comparison group, and the ability to chart the deterioration in functioning and quality of life over the whole study follow-up, which included many years prior to diagnosis.

"This result provides support to the notion that the pathological process leading to PD may start several years before PD diagnosis," says Dr. Palacios. "Our hope is that, with future research, biological markers of the disease process may be recognizable in this preclinical phase."

Source: Science Daily

Jul 4, 2012 · 9 notes
#science #neuroscience #brain #psychology #parkinson
Premature Infants Do Feel Pain from Procedures: Physiological Markers for Neonate Pain Identified

ScienceDaily (July 2, 2012) — There was a time when it was widely believed that premature neonates did not perceive pain. That, of course, has been refuted, but measurements of neonate pain tend to rely on inexact measures, such as alertness and the ability to react expressively to pain sensations. Researchers at Loma Linda University reported in The Journal of Pain that there is a significant relationship between procedural pain and detectable oxidative stress in neonates.

Previous studies have shown that measuring systemic biochemical reactions to pain offers an objective method for assessing pain in premature neonates. Exposure to painful procedures often results in reductions in oxygen saturation and tachycardia, but few studies have quantified the effects of increased pain on oxygen consumption. No studies have examined the relationship between pain scores that reflect behavioral and physiological markers of pain and plasma markers of ATP utilization and oxidative stress.

In this study, 80 preterm neonates were evaluated. In about half, tape was taken off the skin following removal of catheters, and they were evaluated for oxidative stress by measuring uric acid and malondialdehyde (MDA) concentration in plasma before and after the procedure. These subjects were compared with a control group not experiencing tape removal. Pain scores were assessed using the Premature Infant Pain Profile. The data showed there was a significant relationship between procedural pain and MDA, which is a well accepted marker of oxidative stress.

There were increases in MDA in preterm neonates exposed to the single painful procedure and not in the control group. Since premature neonates undergo several painful procedures a day, the researchers concluded that if exposure to multiple painful procedures is shown to contribute to oxidative stress, biochemical markers might be useful in evaluating mechanism-based interventions that could decrease adverse effects of painful procedures.

Source: Science Daily

Jul 4, 2012 · 6 notes
#science #neuroscience #brain #psychology #pain
Childless Women With Fertility Problems at Higher Risk of Hospitalization for Psychiatric Disorders

ScienceDaily (July 2, 2012) — While many small studies have shown a relationship between infertility and psychological distress, reporting a high prevalence of anxiety, mood disorders and depressive symptoms, few have studied the psychological effect of childlessness on a large population basis. Now, based on the largest cohort of women with fertility problems compiled to date, Danish investigators have shown that women who remained childless after their first investigation for infertility had more hospitalisations for psychiatric disorders than women who had at least one child following their investigation.

The results of the study were presented July 1 at the annual meeting of ESHRE (European Society of Human Reproduction and Embryology) by Dr Birgitte Baldur-Felskov, an epidemiologist from the Danish Cancer Research Center in Copenhagen.

Most studies of this kind have been based on single clinics and self-reported psychological effects. This study, however, was a nationwide follow-up of 98,737 Danish women investigated for infertility between 1973 and 2008, who were then cross-linked via Denmark’s population-based registries to the Danish Psychiatric Central Registry. This provided information on hospitalisations for psychiatric disorders, which were divided into an inclusive group of “all mental disorders,” and six discharge sub-groups which comprised “alcohol and intoxicant abuse,” “schizophrenia and psychoses,” “affective disorders including depression,” “anxiety, adjustment and obsessive compulsive disorder,” “eating disorders,” and “other mental disorders.”

All women were followed from the date of their initial fertility investigation until the date of psychiatric event, date of emigration, date of death, date of hospitalisation or 31st December 2008, whichever came first. Such studies, said Dr Baldur-Felskov, could only be possible in somewhere like Denmark, where each citizen has a personal identification number which can be linked to any or all of the country’s diagnostic registries.

Results of the study showed that, over an average follow-up time of 12.6 years (representing 1,248,243 woman-years), 54% of the 98,737 women in the cohort did have a baby. Almost 5000 women from the entire cohort were hospitalised for a psychiatric disorder, the most common discharge diagnosis being “anxiety, adjustment and obsessive compulsive disorders” followed by “affective disorders including depression.”

However, those women who remained childless after their initial fertility investigation had a statistically significant (18%) higher risk of hospitalisations for all mental disorders than the women who went on to have a baby; the risk was also significantly greater for alcohol/substance abuse (by 103%), schizophrenia (by 47%) and other mental disorders (by 43%). The study also showed that childlessness increased the risk of eating disorders by 47%, although this was not statistically significant.

However, the most commonly seen discharge diagnosis in the entire cohort (anxiety, adjustment and obsessive compulsive disorders) was not affected by fertility status.

Commenting on the study’s results, Dr Baldur-Felskov said: “Our study showed that women who remained childless after fertility evaluation had an 18% higher risk of all mental disorders than the women who did have at least one baby. These higher risks were evident in alcohol and substance abuse, schizophrenia and eating disorders, although appeared lower in affective disorders including depression.

"The results suggest that failure to succeed after presenting for fertility investigation may be an important risk modifier for psychiatric disorders. This adds an important component to the counselling of women being investigated and treated for infertility. Specialists and other healthcare personnel working with infertile patients should also be sensitive to the potential for psychiatric disorders among this patient group."
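For readers translating the percentage figures above into risk ratios, the conversion is simply relative risk = 1 + (percent increase)/100; a small sketch using the numbers reported in the study:

```python
# Convert the percent-increase figures quoted above into relative risks.
percent_increase = {
    "all mental disorders": 18,
    "alcohol/substance abuse": 103,
    "schizophrenia": 47,
    "other mental disorders": 43,
    "eating disorders": 47,  # reported, but not statistically significant
}

relative_risk = {k: round(1 + v / 100, 2) for k, v in percent_increase.items()}
print(relative_risk["alcohol/substance abuse"])  # 2.03 -- roughly double the risk
print(relative_risk["all mental disorders"])     # 1.18
```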

Source: Science Daily

Jul 4, 2012 · 12 notes
#science #neuroscience #psychology #brain #disorders
Day Dreaming Good for You? Reflection Is Critical for Development and Well-Being

ScienceDaily (July 2, 2012) — As each day passes, the pace of life seems to accelerate — demands on productivity continue ever upward and there is hardly ever a moment when we aren’t, in some way, in touch with our family, friends, or coworkers. While moments for reflection may be hard to come by, a new article suggests that the long-lost art of introspection — even daydreaming — may be an increasingly valuable part of life.

The long-lost art of introspection — even daydreaming — may be an increasingly valuable part of life. (Credit: © HaywireMedia / Fotolia)

In the article, published in the July issue of Perspectives on Psychological Science, a journal of the Association for Psychological Science, psychological scientist Mary Helen Immordino-Yang and colleagues survey the existing scientific literature from neuroscience and psychological science, exploring what it means when our brains are ‘at rest.’

In recent years, researchers have explored the idea of rest by looking at the so-called ‘default mode’ network of the brain, a network that is noticeably active when we are resting and focused inward. Findings from these studies suggest that individual differences in brain activity during rest are correlated with components of socioemotional functioning, such as self-awareness and moral judgment, as well as different aspects of learning and memory. Immordino-Yang and her colleagues believe that research on the brain at rest can yield important insights into the importance of reflection and quiet time for learning.

"We focus on the outside world in education and don’t look much at inwardly focused reflective skills and attentions, but inward focus impacts the way we build memories, make meaning and transfer that learning into new contexts," says Immordino-Yang, a professor of education, psychology and neuroscience at the University of Southern California. "What are we doing in schools to support kids turning inward?"

Accumulated research suggests that the networks that underlie a focus inward versus outward likely are interdependent, and our ability to regulate and move between them probably improves with maturity and practice. While outward attention is essential for carrying out tasks and learning from classroom lessons, for example, the reflection and consolidation that may accompany mind wandering is equally important, fostering healthy development and learning in the longer term.

"Balance is needed between outward and inward attention, since time spent mind wandering, reflecting and imagining may also improve the quality of outward attention that kids can sustain," says Immordino-Yang.

She and her colleagues argue that mindful introspection can become an effective part of the classroom curriculum, providing students with the skills they need to engage in constructive internal processing and productive reflection. Research indicates that when children are given the time and skills necessary for reflecting, they often become more motivated, less anxious, perform better on tests, and plan more effectively for the future.

And mindful reflection is not just important in an academic context — it’s also essential to our ability to make meaning of the world around us. Inward attention is an important contributor to the development of moral thinking and reasoning and is linked with overall socioemotional well-being.

Immordino-Yang and her colleagues worry that the high attention demands of fast-paced urban and digital environments may be systematically undermining opportunities for young people to look inward and reflect, and that this could have negative effects on their psychological development. This is especially true in an age when social media seems to be a constant presence in teens’ day-to-day lives.

"Consistently imposing overly high-attention demands on children, either in school, through entertainment, or through living conditions, may rob them of opportunities to advance from thinking about ‘what happened’ or ‘how to do this’ to constructing knowledge about ‘what this means for the world and for the way I live my life,’ " Immordino-Yang writes.

According to the authors, perhaps the most important conclusion to be drawn from research on the brain at rest is the fact that all rest is not idleness. While some might be inclined to view rest as a wasted opportunity for productivity, the authors suggest that constructive internal reflection is critical for learning from past experiences and appreciating their value for future choices, allowing us to understand and manage ourselves in the social world.

Source: Science Daily

Jul 4, 2012 · 52 notes
#science #neuroscience #psychology #brain
Activity of Rare Genetic Variant in Glioma Validated

ScienceDaily (July 2, 2012) — Researchers at Moffitt Cancer Center working with colleagues at three other institutions have validated a link between a rare genetic variant and the risk of glioma, the most common and lethal type of brain tumor. The validation study also uncovered an association between the same rare genetic variant and improved rates of survival for patients with glioma.

The study, the first to confirm a rare susceptibility variant in glioma, appeared in a recent issue of the Journal of Medical Genetics, a journal published by the British Medical Association.

"Glioma is a poorly understood cancer with high morbidity and devastating outcomes," said study lead author Kathleen M. Egan, Sc.D., interim program leader of Cancer Epidemiology and vice chair of the Department of Cancer Epidemiology. "However, the discovery of the association of the TP53 genetic variant rs78378222 with glioma provides new insights into these tumors and offers better prospects for identifying people at risk."

According to the authors, their study “genotyped” the single nucleotide polymorphism (SNP, or “snip”) rs78378222 in TP53, an important tumor suppressor gene. The researchers said the SNP disrupts the TP53 signal and, because of its activity, has been linked to a variety of cancers. This study linked the presence of the rare form of rs78378222 to deadly glioma.

The researchers conducted a large, clinic-based, case-control study of individuals age 18 and older with a recent glioma diagnosis. A total of 566 glioma cases and 603 controls were genotyped for the rs78378222 variant.

Study results reveal that the odds of developing glioma were increased 3.5 times among the rare variant allele carriers. However, when researchers examined the impact of rs78378222 on survival, they found an approximately 50 percent reduction in death rates for those who were variant allele carriers.
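For readers unfamiliar with case-control statistics, the odds ratio behind a figure like "3.5 times" can be sketched in a few lines. The 2×2 counts below are invented purely for illustration; the article does not reproduce the study's actual allele counts.

```python
# Toy odds-ratio calculation for a case-control study. The counts are
# hypothetical, chosen only so the ratio lands near the reported ~3.5;
# they are NOT the study's actual allele counts.
cases_carrier, cases_noncarrier = 35, 531        # glioma patients
controls_carrier, controls_noncarrier = 11, 592  # controls

odds_in_cases = cases_carrier / cases_noncarrier
odds_in_controls = controls_carrier / controls_noncarrier
odds_ratio = odds_in_cases / odds_in_controls

print(f"odds ratio = {odds_ratio:.2f}")  # -> odds ratio = 3.55
```

An odds ratio above 1 means the variant is more common among cases than controls, which is what the study reports for the rare rs78378222 allele.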

"That the variant increased survival chances was an unexpected finding," Egan said. "It is tempting to speculate that the presence of the risk allele could direct tumor development into a less aggressive path."

The researchers concluded that their study results “may shed light on the etiology and progression of these tumors.”

Source: Science Daily

Jul 4, 2012 · 4 notes
#science #neuroscience #brain #glioma #psychology #genetics
New Brain Receptor for Drug 'Fantasy' Identified

ScienceDaily (July 2, 2012) — Researchers are closer to understanding the biology behind GHB, a transmitter substance in the brain, best known in its synthetic form as the illegal drug fantasy.

In the 1960s, gamma-hydroxybutyric acid (GHB) was first discovered as a naturally occurring substance in the brain. Since then it has been manufactured as a drug with a clinical application and has also developed a reputation as the illegal drug fantasy and as a date rape drug. Its physiological function is still unknown.

Now a team of researchers at the Department of Drug Design and Pharmacology at the University of Copenhagen has shown for the first time exactly where the transmitter substance binds in the brain under physiologically relevant conditions. The results have recently been published in the Proceedings of the National Academy of Sciences.

"We have discovered that GHB binds to a special protein in the brain — more specifically a GABAA-receptor. The binding is strong even at very low dosage. This suggests that we have found the natural receptor, which opens new and exciting research opportunities, in that we have identified an important unknown that can provide the basis for a full explanation of the biological significance of the transmitter," says Laura Friis Eghorn, PhD student.

Illegal use and possible antidote

Fantasy is also used as a so-called date rape drug, because in moderate amounts it has sedative, sexually stimulating and soporific effects. The compound is also abused for its euphoric effect, but in combination with alcohol, for example, it is a deadly cocktail that can lead to a state of deep unconsciousness or coma.

"GHB is registered for use as a drug to treat alcoholism and certain types of sleep disorders, but the risk of abuse presents difficulties. In the long-term, understanding how GHB works will enable us to develop new and better pharmaceuticals with a targeted effect in the brain, without the dangerous side-effects of fantasy," explains Laura Friis Eghorn, Department of Drug Design and Pharmacology.

Fantasy is an extremely toxic euphoriant, because the difference between a normal intoxicating dose and a fatal dose is so small. A better understanding of the biological mechanisms behind GHB-binding in the brain will benefit research into a life-saving antidote for this drug. Today there is no known antidote.

Statistics from Denmark in 2010 show that 8-10 percent of young people who frequent night clubs have had experience with Fantasy. However, since the drug is often also used in private for its sedative effect, it is difficult to estimate the extent of abuse.

Researchers on a targeted fishing expedition

The new research findings are the result of a collaboration between researchers at the University of Sydney in Australia and medicinal chemists at the Faculty of Health and Medical Sciences:

"Our chemist colleagues designed and produced special ligands that mimic GHB in several variations. This enabled us to go on a targeted fishing expedition in the brain. We have slowly found our way to the receptor, which we have also been able to test pharmacologically. In itself, it is not unusual to find new receptors in the brain for known compounds. However, when we find a natural match rooted in the brain’s transmitter system, the biological implications are extremely interesting," explains Petrine Wellendorph, associate professor and head of the research group responsible for the pioneering results.

Source: Science Daily

Jul 4, 2012 · 11 notes
#science #neuroscience #brain #psychology #receptors
Study examines fingolimod therapy in patients with multiple sclerosis

July 2, 2012

The medication fingolimod reduced inflammatory lesion activity and reduced brain volume loss in patients with multiple sclerosis who participated in a two-year placebo-controlled clinical trial and were assessed by magnetic resonance imaging (MRI) measures, according to a report published Online First by Archives of Neurology.

Fingolimod is the first in a new class of drugs called the sphingosine 1-phosphate receptor (S1PR) modulators that was recently approved at 0.5 mg once daily for the treatment of relapsing multiple sclerosis (MS), a debilitating disease of the central nervous system, according to the study background.

The inflammatory pathology of MS can be seen by counting gadolinium (Gd)-enhancing lesions on T1-weighted images or new and enlarging T2 lesions on serial MRI scans. The extent of hyperintense areas on T2-weighted images provides an indication of the overall burden of disease, the study background explains.

The study by Ernst-Wilhelm Radue, M.D., of the Medical Image Analysis Center, University Hospital, Basel, Switzerland, and colleagues included 1,272 patients who were part of the fingolimod FTY720 Research Evaluating Effects of Daily Oral Therapy in Multiple Sclerosis (FREEDOMS) clinical trial, a worldwide, multicenter effort. Patients received once-daily fingolimod capsules of 0.5 mg or 1.25 mg, or placebo.

"The anti-inflammatory effects of fingolimod therapy, as depicted by Gd-enhancing lesions and new/newly enlarged T2 lesions, were evident as early as 6 months after treatment initiation and were sustained over two years. Approximately half the patients receiving fingolimod therapy were free from any new inflammatory lesions throughout this 2-year study, compared with only 21 percent of patients receiving placebo," the authors comment.

Fingolimod, 0.5 mg (licensed dose), “significantly reduced” brain volume loss during the trial versus placebo, according to the study results. Brain atrophy is recognized as a useful way to monitor MS disease progression.

"These results, coupled with the significant reductions in relapse rates and disability progression reported previously, support the positive impact on long-term disease evolution," the study concludes.

Provided by JAMA and Archives Journals

Source: medicalxpress.com

Jul 4, 2012 · 4 notes
#science #neuroscience #brain #MS
Botulinum Toxin a Shot in the Arm for Preventing Multiple Sclerosis Tremor

ScienceDaily (July 2, 2012) — Botulinum toxin may help prevent shaking or tremor in the arms and hands of people with multiple sclerosis (MS), according to new research published in the July 3, 2012, print issue of Neurology®, the medical journal of the American Academy of Neurology.

"Treatments in use for tremor in MS are not sufficiently effective and new alternatives are needed," said study author Anneke van der Walt, MD, consultant neurologist at The Royal Melbourne Hospital and research fellow with the University of Melbourne in Australia.

For the study, 23 people with MS were given botulinum toxin type A injections or a saline placebo for three months. Then they received the opposite treatment for the next three months. Scientists measured tremor severity and the participants’ ability to write and draw before, during and after the treatments. Video assessments were also taken every six weeks for six months.

The study found that people saw significant improvement in tremor severity, writing and drawing at six weeks and three months after the botulinum toxin treatment compared to after placebo. In tremor severity, the participants improved an average of two points on a 10-point scale, bringing their tremor from moderate to mild. In writing and drawing, participants improved by an average of one point on a 10-point scale.

"Our study suggests a new way to approach arm tremor related to MS where there are currently major treatment challenges and it also sets the framework for larger studies," said van der Walt.

Muscle weakness developed in 42 percent of people after treatment with botulinum toxin compared to six percent after placebo. The weakness was generally mild and went away within two weeks.

Source: Science Daily

Jul 4, 2012 · 4 notes
#science #neuroscience #brain #MS
Abuse During Childhood May Contribute to Obesity in Adulthood

ScienceDaily (July 2, 2012) — Investigators from Boston University School of Medicine (BUSM) and Boston University’s Slone Epidemiology Center report research findings that may shed light on influences on obesity during adulthood. Appearing in the journal Pediatrics, the study found an association of severity of sexual and physical abuse during childhood and adolescence with obesity during adulthood.

The findings were based on the ongoing Black Women’s Health Study, which has followed a large cohort of African-American women since 1995. Information provided in 2005 by more than 33,000 participants on early life experiences of abuse was assessed in relation to two measures of obesity: body mass index of 30 kg/m2 or more as a measure of overall obesity and waist circumference greater than 35 inches as a measure of central obesity.
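The two obesity measures are simple enough to write out explicitly. In this minimal sketch, only the thresholds (BMI of 30 kg/m² or more; waist circumference greater than 35 inches) come from the article; the function names and example numbers are ours.

```python
# Sketch of the study's two obesity measures. Thresholds are from the
# article; function names and example values are illustrative only.
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def overall_obesity(weight_kg: float, height_m: float) -> bool:
    # Overall obesity: BMI of 30 kg/m^2 or more.
    return bmi(weight_kg, height_m) >= 30.0

def central_obesity(waist_inches: float) -> bool:
    # Central obesity: waist circumference greater than 35 inches.
    return waist_inches > 35.0

print(round(bmi(95.0, 1.65), 1))    # -> 34.9
print(overall_obesity(95.0, 1.65))  # -> True
print(central_obesity(34.0))        # -> False
```

Note that the two measures can disagree for a given person, which is why the study reports them separately.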

The risk of obesity in 2005 by either measure was estimated to be approximately 30 percent greater among women in the highest category of physical and sexual abuse than in women who reported no abuse. The association was dampened but not fully explained by allowance for reproductive history, diet, physical activity and depressive symptoms, which might have been intermediates between abuse and weight gain.

According to the researchers, the findings add to growing evidence that experiences during childhood may have long-term health consequences. “Abuse during childhood may adversely shape health behaviors and coping strategies, which could lead to greater weight gain in later life,” explained Renee Boynton-Jarrett, MD, the lead investigator of the study and a pediatric primary care physician at Boston Medical Center. She also noted that metabolic and hormonal disruptions resulting from abuse could have that effect and that childhood abuse could be a marker for other adversities. “Ultimately, greater understanding of pathways between early life abuse and adult weight status may inform obesity prevention and treatment approaches.” Boynton-Jarrett cautioned that further studies are needed to clarify just which factors are responsible for the association of abuse with obesity and noted there is a consensus that pediatric providers should screen for abuse.

Source: Science Daily

Jul 3, 2012 · 15 notes
#science #neuroscience #psychology #obesity
Genes May Play Role in Educational Achievement

ScienceDaily (July 2, 2012) — Researchers have identified genetic markers that may influence whether a person finishes high school and goes on to college, according to a national longitudinal study of thousands of young Americans.

The study is in the July issue of Developmental Psychology, a publication of the American Psychological Association.

"Being able to show that specific genes are related in any way to academic achievement is a big step forward in understanding the developmental pathways among young people," said the study’s lead author, Kevin Beaver, PhD, a professor at the College of Criminology and Criminal Justice at Florida State University.

The three genes identified in the study — DAT1, DRD2 and DRD4 — have been linked to behaviors such as attention regulation, motivation, violence, cognitive skills and intelligence, according to the study. Previous research has explored the genetic underpinnings of intelligence but virtually none has examined genes that potentially contribute to educational attainment in community samples, said Beaver.

He and his colleagues analyzed data from the National Longitudinal Study of Adolescent Health, also known as Add Health. Add Health is a four-wave study of a nationally representative sample of American youths who were enrolled in middle or high school in 1994 and 1995. The study continued until 2008, when most of the respondents were between the ages of 24 and 32. The participants completed surveys, provided DNA samples and were interviewed, along with their parents. The sample used for this analysis consisted of 1,674 respondents.

The genes identified in this research are known as dopamine transporter and receptor genes. Every person has the genes DAT1, DRD2 and DRD4, but what is of interest are molecular differences within the genes, known as alleles, according to Beaver. Subjects who possessed certain alleles within these genes achieved the highest levels of education, according to the findings.

Dopamine transporter genes assist in the production of proteins that regulate levels of the neurotransmitter dopamine in the brain, while dopamine receptor genes are involved in neurotransmission. Previous research has shown that dopamine levels play a role in regulating impulsive behavior, attention and intelligence.

The presence of the alleles alone did not guarantee higher levels of education, the study found. Having a lower IQ was more strongly associated with lower levels of education. Also, living in poverty and essentially “running with a bad crowd” resulted in lower levels of education despite the genetic effects.

Even though the genetic variants were found to be associated with educational levels, having a specific allele does not determine whether someone will graduate from high school or earn a college degree, according to Beaver. Rather, these genes work in a probabilistic way, with the presence of certain alleles simply increasing or decreasing the likelihood of educational outcomes, he said. “No one gene is going to say, ‘Sally will graduate from high school’ or ‘Johnny will earn a college degree,’” he said. “These genetic effects operate indirectly, through memory, violent tendencies and impulsivity, which are all known predictors of how well a kid will succeed in school. If we can keep moving forward and identify more genetic markers for educational achievement, we can begin to truly understand how genetics play a role in how we live and succeed in life.”

Source: Science Daily

Jul 3, 2012 · 25 notes
#science #neuroscience #brain #psychology #genetics
DNA Sequenced for Parrot’s Ability to Parrot

July 2nd, 2012

Third-generation sequencing debugged to glimpse parrots’ ability to imitate.

Scientists say they have assembled more completely the string of genetic letters that could control how well parrots learn to imitate their owners and other sounds.

The research team unraveled the specific regions of the parrots’ genome using a new technology, single molecule sequencing, and fixed its flaws with data from older DNA-decoding devices. The team also decoded hard-to-sequence genetic material from corn and bacteria as proof of their new sequencing approach.

The results of the study appeared online July 1 in the journal Nature Biotechnology.

Single molecule sequencing “got a lot of hype last year” because it generates long sequencing reads, “supposedly making it easier to assemble complex parts of the genome,” said Duke University neurobiologist Erich Jarvis, a co-author of the study.

He is interested in the sequences that regulate parrots’ imitation abilities because they could give neuroscientists information about the gene regions that control speech development in humans.

This male budgie from the Fort Worth Zoo is like the parrots Erich Jarvis uses to study vocal learning behaviors, but probably without the text bubble. Image adapted from an image credited to Jerry Tillery via Wikimedia Commons. More info in notes below.

Jarvis began his project with collaborators by trying to piece together the genome regions with what are known as next-generation sequencers, which read chunks of 100 to 400 DNA base pairs at a time and then take a few days to assemble them into a draft genome. After doing the sequencing, the scientists discovered that the read lengths were not long enough to assemble the regulatory regions of some of the genes that control brain circuits for vocal learning.

University of Maryland computational biologists Adam Phillippy and Sergey Koren — experts at assembling genomes — heard about Jarvis’s sequencing struggles at a conference and approached him with a possible solution of modifying the algorithms that order the DNA base pairs. But the fix was still not sufficient.

Last year, 1,000-base-pair reads from the Roche 454 platform became available, as did the single molecule sequencer from Pacific Biosciences. The PacBio technology generates strands of 2,250 to 23,000 base pairs at a time and can draft an entire genome in about a day.

Jarvis and others thought the new technologies would solve the genome-sequencing challenges. Through a competition called the Assemblathon, the scientists discovered that the PacBio machine had trouble accurately decoding complex regions of the genome of the parrot Melopsittacus undulatus. The machine had a high error rate, generating the wrong genetic letter at every fifth or sixth spot in a string of DNA. The mistakes made it nearly impossible to create a genome assembly with the very long reads, Jarvis said.

But with a team including scientists from the DOE Genome Science Institute and Cold Spring Harbor in New York, Phillippy, Koren and Jarvis corrected the PacBio sequencer’s errors using shorter, more accurate reads from the next-generation devices. The fix reduces the single-molecule, or third-generation, sequencing machine’s error rate from 15 percent to less than one-tenth of one percent.
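The core idea of this hybrid correction can be sketched in miniature: accurate short reads are placed against the noisy long read, and each position is corrected by a majority vote of the overlapping short reads. Everything below (sequences, read lengths, the alignment-by-mismatch-count) is an invented toy; the published pipeline uses far more sophisticated alignment and consensus algorithms.

```python
from collections import Counter

# Toy hybrid error correction: accurate short reads "vote" on each
# base of a noisy long read. All sequences here are invented.
true_seq = "ACGTACGGTACGTTAGCACGT"
long_read = "ACGAACGGTACCTTAGCACGA"  # noisy copy with 3 errors

# Error-free 7-mers sampled from the true sequence (the "short reads").
short_reads = [true_seq[i:i + 7] for i in range(0, 15, 2)]

def best_offset(read, reference):
    """Place a short read at the offset with the fewest mismatches."""
    scores = [(sum(a != b for a, b in zip(read, reference[o:o + len(read)])), o)
              for o in range(len(reference) - len(read) + 1)]
    return min(scores)[1]

# Each aligned short read casts one vote per base position it covers.
votes = [Counter() for _ in long_read]
for sr in short_reads:
    o = best_offset(sr, long_read)
    for j, base in enumerate(sr):
        votes[o + j][base] += 1

# Majority vote at each position; fall back to the long read's base
# wherever no short read provides coverage.
corrected = "".join(v.most_common(1)[0][0] if v else b
                    for v, b in zip(votes, long_read))
print(corrected == true_seq)  # -> True
```

In the real setting the long reads are far longer and the short reads themselves contain occasional errors, which is why deep short-read coverage and proper consensus calling are needed; the toy nonetheless shows why short accurate reads can repair a 15 percent error rate.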

“Finally we have been able to assemble the regulatory regions of genes, such as FoxP2 and egr1, that are of interest to us and others in vocal learning behavior,” Jarvis said.

He explained that FoxP2 is a gene required for speech development in humans and for vocal learning in birds that imitate sounds, like songbirds and parrots. Egr1 is a gene that controls the brain’s ability to reorganize itself based on new experiences.

By being able to decode and organize the DNA that regulates these regions, neuroscientists may be able to better understand what genetic mechanism causes birds to imitate and sing well. They may also be able to collect more information about genetic factors that affect a person’s ability to learn how to communicate well and to speak, Jarvis said. He and his team plan to describe the biology of the parrot’s genetic code they sequenced in more detail in an upcoming paper.

Jarvis added that as more scientists use the hybrid sequencing approach, they could possibly decode complex, elusive genes linked to how cancer cells develop and to the sequences that control other brain functions.

Source: Neuroscience News

Jul 3, 2012 · 15 notes
#science #neuroscience #animals
Electrical brain stimulation can alleviate swallowing disorders after stroke

July 2, 2012

After stroke, patients often suffer from dysphagia, a swallowing disorder that results in greater healthcare costs and higher rates of complications such as dehydration, malnutrition, and pneumonia. In a new study published in the July issue of Restorative Neurology and Neuroscience, researchers have found that transcranial direct current stimulation (tDCS), which applies weak electrical currents to the affected area of the brain, can enhance the outcome of swallowing therapy for post-stroke dysphagia.

"Our pilot study demonstrated that ten daily sessions of tDCS over the affected esophageal motor cortex of the brain hemisphere affected by the stroke, combined with swallowing training, improved post-stroke dysphagia. We observed long-lasting effects of anodal tDCS over three months," reports lead investigator Nam-Jong Paik, MD, PhD, of the Department of Rehabilitation Medicine, Seoul National University College of Medicine, Seoul, South Korea.

Sixteen patients with acute post-stroke dysphagia were enrolled in the trial. They showed signs of swallowing difficulties such as reduced tongue movements, coughing and choking during eating, and vocal cord palsy. Patients underwent ten 30-minute sessions of swallowing therapy and were randomly assigned to a treatment or control group. Both groups were fitted with an electrode on the scalp, on the side of the brain affected by the stroke, in the region associated with swallowing. In the treatment group, tDCS was administered for the first 20 minutes of each session, and swallowing training alone continued for the remaining 10 minutes. In the control group, the direct current was tapered down and turned off after thirty seconds. Outcomes were measured before, immediately after, and three months after the experiment. One patient from each group underwent a PET scan before and just after the treatment to view its effect on brain metabolism.

All patients underwent the interventions without discomfort or fatigue, and there were no significant differences between the groups in age, sex, stroke lesion site, or extent of brain damage. Evaluation just after the conclusion of the sessions found that dysphagia improved for all patients, without much difference between the two groups. However, at the three-month follow-up, the treatment group showed significantly greater improvement than the control group.

In the PET study, there were significant differences in cerebral metabolism between the first PET scan and the second PET scan in the patient who had received tDCS. Increased glucose metabolism was observed in the unaffected hemisphere, although tDCS was only applied to the affected hemisphere, indicating that tDCS might activate a large area of the cortical network engaged in swallowing recovery rather than just the areas stimulated under the electrode.

"The results indicate that tDCS can enhance the outcome of swallowing therapy in post-stroke dysphagia," notes Dr. Paik. "As is always the case in exploratory research, further investigation involving a greater number of patients is needed to confirm our results. It will be important to determine the optimal intensity and duration of the treatment to maximize the long-term benefits."

Provided by IOS Press

Source: medicalxpress.com

Jul 3, 2012 · 6 notes
#science #neuroscience #brain #stroke #dysphagia
Researchers Report Success in Treating Autism Spectrum Disorder

July 2nd, 2012

Using a mouse model of autism, researchers at the University of Cincinnati (UC) and Cincinnati Children’s Hospital Medical Center have successfully treated an autism spectrum disorder characterized by severe cognitive impairment.

The research team, led by Joe Clark, PhD, a professor of neurology at UC, reports its findings online July 2, 2012, in the Journal of Clinical Investigation, a publication of the American Society for Clinical Investigation.

The disorder, creatine transporter deficiency (CTD), is caused by a mutation in the creatine transporter protein that results in deficient energy metabolism in the brain. Linked to the X chromosome, CTD affects boys most severely; women are carriers and pass it on to their sons.

Using cyclocreatine, researchers successfully treated an autism spectrum disorder known as creatine transporter deficiency in a mouse model of autism.

The brains of boys with CTD do not function normally, resulting in severe speech deficits, developmental delay, seizures and profound mental retardation. CTD is estimated to currently affect about 50,000 boys in the United States and is the second-most common cause of X-linked mental retardation after Fragile X syndrome.

Following CTD’s discovery at UC in 2000, researchers at UC and Cincinnati Children’s led by Clark discovered a method to treat it with cyclocreatine—also known as CincY, and pronounced cinci-why—a creatine analogue originally developed as an adjunct to cancer treatment. They then treated genetically engineered mice as an animal model of the human disease.

“CincY successfully entered the brain and reversed the mental retardation-like symptoms in the mice, with benefits seen in nine weeks of treatment,” says Clark, adding that no harmful effects to the mice were observed in the study. “Treated mice exhibited a profound improvement in cognitive abilities, including recognition of novel objects, spatial learning and memory.”

As a repurposed drug (originally developed for another therapy), CincY has already been through part of the U.S. Food and Drug Administration (FDA) approval process. It is taken orally as a pill or powder.

UC’s Office of Entrepreneurial Affairs and Technology Commercialization has reached an agreement with Lumos Pharma, a privately held startup company in Austin, Texas, to develop and commercialize CincY. Lumos Pharma was created with technology licensed from UC. Its CEO is Rick Hawkins, a 30-year biotech industry veteran, and Jon Saxe is its chairman.

“It has taken many years to get here and I am happy that our efforts have led to this translational effort to make a therapy available to those afflicted with CTD,” says Clark. “We look forward with commitment and hope to the day when those patients will benefit from our work.”

The collaboration gained momentum when Lumos Pharma submitted a proposal based on Clark’s technology to the National Institutes of Health and was selected as a drug development project partner by the National Center for Advancing Translational Sciences’ Therapeutics for Rare and Neglected Diseases (TRND) program. Under TRND’s collaborative operational model, project partners form joint project teams with TRND and receive in-kind support from TRND drug development scientists, laboratory and contract resources.

Lumos Pharma plans to initiate a TRND-supported preclinical development plan, with TRND support continuing through the filing of an Investigational New Drug (IND) application with the FDA prior to beginning a clinical trial. Such a trial would be about three years away, Clark says.

Source: Neuroscience News

Jul 3, 2012 · 33 notes
#science #neuroscience #brain #psychology #autism
Charting Autism's Neural Circuitry: Deleting Single Gene Results in Autism-Like Behavior and Immunosuppressant Drug Prevents Symptoms

ScienceDaily (July 2, 2012) — Deleting a single gene in the cerebellum of mice can cause key autistic-like symptoms, researchers have found. They also discovered that rapamycin, a commonly used immunosuppressant drug, prevented these symptoms.

The deleted gene is associated with Tuberous Sclerosis Complex (TSC), a rare genetic condition. Since nearly 50 percent of all people with TSC develop autism, the researchers believe their findings will help us better understand the condition’s development.

"We are trying to find out if there are specific circuits in the brain that lead to autism-spectrum disorders in people with TSC," said Mustafa Sahin, Harvard Medical School associate professor of neurology at Boston Children’s Hospital and senior author on the paper. "And knowing that deleting the genes associated with TSC in the cerebellum leads to autistic symptoms is a vital step in figuring out that circuitry."

This is the first time researchers have identified a molecular component for the cerebellum’s role in autism. “What is so remarkable is that loss of this gene in a particular cell type in the cerebellum was sufficient to cause the autistic-like behaviors,” said Peter Tsai, HMS instructor of neurology and the first author of this particular study.

These findings were published online July 1 in Nature.

TSC is a genetic disease caused by mutations in either one of two genes, TSC1 and TSC2. Patients develop benign tumors in various organs in the body, including the brain, kidneys and heart, and often suffer from seizures, delayed development and behavioral problems.

Researchers have long known of a link between the TSC genes and autism, and had even identified the cerebellum as a key area in the development of autism and related conditions.

Deleting this gene in the mice’s cerebellar Purkinje cells caused the three main signs of autistic-like behavior:

  • Abnormal social interactions. The mice spent less time with each other and more with inanimate objects, compared to controls.
  • Repetitive behaviors. The mice spent extended amounts of time pursuing one activity or with one particular object far more than normal.
  • Abnormal communication. Ultrasonic vocalizations, a key mode of communication among rodents, were abnormal.

The researchers also tested learning. “These mice were able to learn new things normally,” said Tsai, “but they had trouble with ‘reversal learning,’ or re-learning what they had learned when their environment changed.”

Tsai and colleagues tested this by training the mice to swim to a resting platform placed on one side of a pool. When the researchers moved the platform to the other side, the mutant mice had greater difficulty than controls re-learning the route.

"These changes in behavior indicate that the TSC1 gene in Purkinje cells, and by extension the cerebellum, are part of the circuitry for autism disorders," emphasized Sahin.

The researchers also found that the drug rapamycin averted the effects of the deleted gene. Administering the drug to the mice during development prevented the formation of autistic-like behaviors.

Currently, Sahin is the sponsor-principal investigator for an ongoing Phase II clinical trial to test the efficacy of everolimus, a compound in the same family as rapamycin, in improving neurocognition in children with TSC. The trial will be open for enrollment until December 2013.

"Our next step will be to see how the abnormalities in Purkinje cells affect autism-like development. We don’t know how generalizable our current findings are, but understanding mechanisms beyond TSC genes might be useful to autism," said Tsai.

Source: Science Daily

Jul 3, 2012 · 16 notes
#science #neuroscience #brain #psychology #genetics #autism
Autism, Schizophrenia and Bipolar Disorder May Share Common Underlying Factors, Family Histories Suggest

ScienceDaily (July 2, 2012) — New research led by Patrick F. Sullivan, MD, FRANZCP, a medical geneticist at the University of North Carolina School of Medicine, points to an increased risk of autism spectrum disorders (ASDs) among individuals whose parents or siblings have been diagnosed with schizophrenia or bipolar disorder.

The findings were based on a case-control study using population registers in Sweden and Israel, and the degree to which these three disorders share a basis in causation “has important implications for clinicians, researchers and those affected by the disorders,” according to a report of the research published online July 2, 2012 in the Archives of General Psychiatry.

"The results were very consistent in large samples from several different countries and lead us to believe that autism and schizophrenia are more similar than we had thought," said Dr. Sullivan, professor in the department of genetics and director of psychiatric genomics at UNC.

Sullivan and colleagues found that the presence of schizophrenia in parents was associated with an almost three times increased risk for ASD in groups from both Stockholm and all of Sweden.

Schizophrenia in a sibling also was associated with roughly two and a half times the risk for autism in the Swedish national group and a 12 times greater risk in a sample of Israeli military conscripts. The authors speculate that the latter finding from Israel resulted from individuals with earlier onset schizophrenia, “which has a higher sibling recurrence.”
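The risk figures above are relative risks: the incidence of ASD among people with an affected relative divided by the incidence among people without one. A minimal sketch of that calculation, using made-up counts purely for illustration (the paper's actual tables are not reproduced here):

```python
def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Relative risk: disease incidence in the exposed group divided by
    disease incidence in the unexposed group."""
    risk_exposed = cases_exposed / n_exposed
    risk_unexposed = cases_unexposed / n_unexposed
    return risk_exposed / risk_unexposed

# Hypothetical counts, NOT the study's data: ASD diagnoses among
# people with vs. without a parent diagnosed with schizophrenia.
rr = relative_risk(cases_exposed=30, n_exposed=1000,
                   cases_unexposed=100, n_unexposed=10000)
print(round(rr, 1))  # 3.0, i.e. an "almost three times increased risk"
```

A ratio of about 3 in a registry of this kind is what the paper reports as a roughly threefold increase in risk.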

Bipolar disorder showed a similar pattern of association but of a lesser magnitude, study results indicate.

"Our findings suggest that ASD, schizophrenia and bipolar disorder share etiologic risk factors," the authors state. "We suggest that future research could usefully attempt to discern risk factors common to these disorders."

Source: Science Daily

Jul 3, 2012 · 40 notes
#science #neuroscience #brain #psychology #genetics
DNA Methylation Linked to Memory Loss

By Sabrina Richards | July 2, 2012

Scientists find that declining DNA methylation in mouse neurons may cause age-related memory deficits.

Image: an elderly man. Credit: Flickr, BLEU MAN

Research is increasingly connecting changes in the epigenetic regulation of gene expression to the aging process, and many studies demonstrate that DNA methylation declines with age. Now, new research published yesterday (July 1) in Nature Neuroscience links DNA methylation with brain aging. The researchers show that the level of an enzyme that attaches methyl groups to cytosine nucleotides throughout the genome is linked to cognitive decline, and that overexpressing it can restore the performance of aging mice on memory-related tasks.

“We already know normal aging is associated with cognitive decline, but this paper links that with expression of a specific DNA methyltransferase,” said Yuan Gao, an epigeneticist at the Lieber Institute for Brain Development in Maryland, who did not participate in the study. The current work also builds on other studies demonstrating that proper regulation of methylation in brain cells is critical to memory formation. Previous studies have suggested a connection between loss of DNA methylation and Alzheimer’s disease, said Gao, suggesting that if researchers could “restore [methyltransferase] activity and cure or delay dementia, it would make a nice model” for developing drugs to tackle age-related cognitive diseases.

DNA methylation, wherein a methyl group is attached to a cytosine that sits next to a guanosine, is one form of epigenetic regulation that can modulate how available genes are to the cell’s transcription machinery, and thus how highly they are expressed. Scientists already appreciate how differences in epigenetic regulation can affect the development of diseases like cancer, even without gene mutations. Studies are also accumulating that correlate declining methylation with aging, although the mechanism remains unclear.
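The "cytosine next to a guanosine" motif described above is the CpG dinucleotide, the usual substrate of DNA methyltransferases such as Dnmt3a2. Locating candidate methylation positions in a sequence is straightforward; a toy sketch with a made-up sequence:

```python
def cpg_sites(seq):
    """Return the 0-based positions of CpG dinucleotides (a cytosine
    immediately followed by a guanine) in a DNA sequence."""
    seq = seq.upper()
    return [i for i in range(len(seq) - 1) if seq[i:i + 2] == "CG"]

# Illustrative sequence, not from any real genome.
print(cpg_sites("ATCGGTACGCGT"))  # [2, 7, 9]
```

Real methylation analyses (e.g. bisulfite sequencing pipelines) work at genome scale, but the unit of interest is the same CpG site.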

Classically, DNA methylation has been considered a repressive modification, but that view is beginning to give way to a more nuanced picture of methylation’s role in gene regulation, explained senior author Hilmar Bading of the University of Heidelberg. The twist in Bading’s current research is that the methyltransferase his group focuses on, Dnmt3a2, may work to enable gene transcription rather than repress it.

This gene-activating role may stem from methylation that blocks repressors rather than activators, explained Trygve Tollefsbol, who investigates the role of epigenetics in cancer and aging at the University of Alabama and did not participate in the research. Whether methylation is located in the promoter or the body of a gene can also determine whether it inhibits or enhances transcription, explained Guoping Fan, who studies epigenetic regulation of neuron development at the University of California, Los Angeles.

Bading’s group identified Dnmt3a2 when looking for genes that are upregulated by neuronal activity. Knowing that DNA methylation decreases with age, first author Ana Oliveira compared Dnmt3a2 expression in 3-month-old and 18-month-old mice, and found lower levels of Dnmt3a2 in the older mice. Furthermore, learning tasks designed to stimulate hippocampus neurons failed to upregulate Dnmt3a2 expression in old mice as robustly as in young mice.

Theorizing that reduced Dnmt3a2-dependent DNA methylation contributed to older mice’s poorer performance on learning and memory tasks, the scientists used an adeno-associated virus to supplement Dnmt3a2 expression in their hippocampal neurons. Boosting its expression enhanced both brain methylation in the older mice and their ability to learn. Conversely, when the researchers used short hairpin RNA to knock down Dnmt3a2 expression in young mice, their performance on learning and memory tests worsened.

“I think Dnmt3a2 has a basic gating function,” said Bading. Neurons need to turn genes on and off quickly in response to changing stimulation. Bading hypothesizes that Dnmt3a2-dependent methylation helps keep genes—like brain-derived neurotrophic factor (BDNF) and Arc, both regulated by Dnmt3a2 and both involved in responses to signaling changes—receptive to changing stimulation, putting “the genome in the right state for being inducible,” Bading said. Genes like BDNF shouldn’t be transcribed all the time, but it may be that without Dnmt3a2-dependent methylation, “the door is closed” and neurons can’t express them when they need to.

This could set up a vicious cycle, Bading explained, because Dnmt3a2 is also induced by neuronal activity. Less Dnmt3a2 would result in less expression of methylation-dependent genes, possibly including Dnmt3a2 itself, and the effect would worsen over time. “It would take many years to add up, but aging takes years,” Bading noted.

Methylation is unlikely to be the only epigenetic factor in aging, said Tollefsbol, who anticipates similar investigations into other DNA and histone modifications. BDNF itself has already been linked to histone acetylation and brain aging. “A good paper like this raises more questions than it answers,” Tollefsbol noted. “DNA methylation is probably only about a half or third of the [epigenetics and aging] equation.”

Source: TheScientist

Jul 3, 2012 · 30 notes
#science #neuroscience #psychology #brain #memory
Why Chronic Pain Is All in Your Head: Early Brain Changes Predict Which Patients Develop Chronic Pain

ScienceDaily (July 1, 2012) — When people have similar injuries, why do some end up with chronic pain while others recover and are pain free? The first longitudinal brain imaging study to track participants with a new back injury has found that chronic pain is all in their heads — quite literally.

Image credit: © drubig-photo / Fotolia

A new Northwestern Medicine study shows for the first time that chronic pain develops the more two sections of the brain — related to emotional and motivational behavior — talk to each other. The more they communicate, the greater the chance a patient will develop chronic pain.

The finding provides a new direction for developing therapies to treat intractable pain, which affects 30 to 40 million adults in the United States.

Researchers were able to predict with 85 percent accuracy, based on the level of interaction between the frontal cortex and the nucleus accumbens at the beginning of the study, which participants would go on to develop chronic pain.
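This is not the authors' model (the study used fMRI-derived connectivity measures and more sophisticated statistics), but the logic of scoring such a prediction can be sketched with a toy threshold classifier on hypothetical connectivity scores:

```python
def accuracy_at_threshold(scores, outcomes, threshold):
    """Fraction of subjects correctly classified when everyone whose
    frontal-cortex/nucleus-accumbens connectivity score exceeds
    `threshold` is predicted to develop chronic pain."""
    correct = sum((s > threshold) == o for s, o in zip(scores, outcomes))
    return correct / len(scores)

# Hypothetical connectivity scores and outcomes (True = developed
# chronic pain); illustrative values only, not the study's data.
scores   = [0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
outcomes = [False, False, False, False, True, True, True, True]
print(accuracy_at_threshold(scores, outcomes, threshold=0.45))
# 0.875: one of eight subjects misclassified
```

An accuracy of 7 out of 8 here is merely meant to echo the scale of the paper's 85 percent figure; the real analysis validated its prediction across repeated scans over a year.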

The study is published in the journal Nature Neuroscience.

"For the first time we can explain why people who may have the exact same initial pain either go on to recover or develop chronic pain," said A. Vania Apkarian, senior author of the paper and professor of physiology at Northwestern University Feinberg School of Medicine.

"The injury by itself is not enough to explain the ongoing pain. It has to do with the injury combined with the state of the brain. This finding is the culmination of 10 years of our research."

The more emotionally the brain reacts to the initial injury, the more likely the pain will persist after the injury has healed. “It may be that these sections of the brain are more excited to begin with in certain individuals, or there may be genetic and environmental influences that predispose these brain regions to interact at an excitable level,” Apkarian said.

The nucleus accumbens is an important center for teaching the rest of the brain how to evaluate and react to the outside world, Apkarian noted, and this brain region may use the pain signal to teach the rest of the brain to develop chronic pain.

"Now we hope to develop new therapies for treatment based on this finding," Apkarian added.

Chronic pain participants in the study also lost gray matter density, which is likely linked to fewer synaptic connections or neuronal and glial shrinkage, Apkarian said. Brain synapses are essential for communication between neurons.

"Chronic pain is one of the most expensive health care conditions in the U.S., yet there still is not a scientifically validated therapy for this condition," Apkarian said. Chronic pain costs an estimated $600 billion a year, according to a 2011 National Academy of Sciences report. Back pain is the most prevalent chronic pain condition.

A total of 40 participants who had an episode of back pain that lasted four to 16 weeks — but with no prior history of back pain — were studied. All subjects were diagnosed with back pain by a clinician. Brain scans were conducted on each participant at study entry and for three more visits during one year.

Source: Science Daily

Jul 2, 2012 · 24 notes
#science #neuroscience #brain #psychology #pain
Jul 1, 2012 · 395 notes
#science #neuroscience #tractography #fiber tracking #psychology #brain