Neuroscience

Articles and news from the latest research reports.

Posts tagged psychology

23 notes

A different drummer: Neural rhythms drive physical movement

June 3, 2012

Unlike their visual cousins, the neurons that control movement are not a predictable bunch. Scientists working to decode how such neurons convey information to muscles have been stymied when trying to establish a one-to-one relationship between a neuron’s behavior and external factors such as muscle activity or movement velocity.

The 19th century mathematician Joseph Fourier showed that two rhythms could be summed to produce a third rhythm. Researchers at Stanford have shown that this principle is behind the brain activity that produces arm movements. Credit: Mark Churchland, Stanford School of Engineering
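The caption's claim that two rhythms sum to a third can be checked numerically. Below is a minimal sketch (not from the paper; frequencies and names are illustrative) using the trigonometric sum-to-product identity, which rewrites the sum of two sine rhythms as a single modulated rhythm:

```python
import math

def summed_rhythm(f1, f2, t):
    """Sum of two pure rhythms (sine waves) at frequencies f1 and f2 (Hz)."""
    return math.sin(2 * math.pi * f1 * t) + math.sin(2 * math.pi * f2 * t)

def product_form(f1, f2, t):
    """The same signal rewritten as one modulated rhythm via the identity
    sin(a) + sin(b) = 2 * sin((a + b) / 2) * cos((a - b) / 2)."""
    return 2 * math.sin(math.pi * (f1 + f2) * t) * math.cos(math.pi * (f1 - f2) * t)

# The two forms agree at every time point: two rhythms summed really do
# trace out a third, more complex rhythm.
for i in range(100):
    t = i * 0.01
    assert abs(summed_rhythm(2.0, 3.0, t) - product_form(2.0, 3.0, t)) < 1e-9
```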

In an article published online June 3rd by the journal Nature, a team of electrical engineers and neuroscientists working at Stanford University propose a new theory of the brain activity behind arm movements. Their theory is a significant departure from existing understanding and helps to explain, in relatively simple and elegant terms, some of the more perplexing aspects of the activity of neurons in motor cortex.

In their paper, electrical engineering Associate Professor Krishna Shenoy and post-doctoral researchers Mark Churchland, now a professor at Columbia, and John Cunningham of Cambridge University, now a professor at Washington University in Saint Louis, have shown that the brain activity controlling arm movement does not encode external spatial information—such as direction, distance and speed—but is instead rhythmic in nature.

Understanding the brain

Neuroscientists have long known that the neurons responsible for vision encode specific, external-world information—the parameters of sight. It had been theorized and widely suggested that motor cortex neurons function similarly, conveying specifics of movement such as direction, distance and speed, in the same way the visual cortex records color, intensity and form.

"Visual neurons encode things in the world. They are a map, a representation," said Churchland, who is first author of the paper. "It’s not a leap to imagine that neurons in the motor cortex should behave like neurons in the visual cortex, relating in a faithful way to external parameters, but things aren’t so concrete for movement."

Scientists have disagreed about which movement parameters are being represented by individual neurons. They could not look at a particular neuron firing in the motor cortex and determine with confidence what information it was encoding.

"Many experiments have sought such lawfulness and yet none have found it. Our findings indicate an alternative principle is at play," said co-first author Cunningham.


Filed under science neuroscience brain psychology neuron

100 notes

Neuroscientists Reach Major Milestone in Whole-Brain Circuit Mapping Project

ScienceDaily (June 1, 2012) — Neuroscientists at Cold Spring Harbor Laboratory (CSHL) just reached an important milestone, publicly releasing the first installment of data from the 500 terabytes so far collected in their pathbreaking project to construct the first whole-brain wiring diagram of a vertebrate brain, that of the mouse.

Composite image generated with Mouse Brain Architecture project data. Injections of two fluorescently marked (red and green) adeno-associated viral (AAV) tracers indicate neural pathways, superimposed upon a whole-brain image stained to reveal the protective sheathing around myelinated axons. Axonal paths leaving the injection site are seen, including horizontal ones crossing over to the other side of the brain along the Corpus Callosum. (Credit: Image courtesy of Cold Spring Harbor Laboratory)

The data consist of gigapixel images (each close to 1 billion pixels) of whole-brain sections that can be zoomed to show individual neurons and their processes, providing a “virtual microscope.” The images are integrated with other data sources from the web, and are being made fully accessible to neuroscientists as well as interested members of the general public (http://mouse.brainarchitecture.org). The data are being released pre-publication in the spirit of open science initiatives that have become familiar in digital astronomy (e.g., Sloan Digital Sky Survey) but are not yet as widespread in neurobiology.

Each sampled brain is represented in about 500 images, each image showing an optical section through a 20 micron-thick slice of brain tissue. A multi-resolution viewer permits users to journey through each brain from “front” to “back,” and thus enables them to follow the pathways taken through three-dimensional brain space by tracer-labeled neuronal pathways. The tracers were picked to follow neuronal inputs and outputs of given brain regions.

"We’re executing a grid-based ‘shotgun’ strategy for neuronal tract tracing that we first proposed a few years ago, and which I am pleased to note has gained acceptance elsewhere within the neuroscience community," says Partha P. Mitra, Ph.D., the Crick-Clay Professor of Biomathematics at CSHL and director of the Mouse Brain Architecture (MBA) Project. After the initial June 1 release, project data will be made public continuously on a monthly basis, Mitra says.

Project addresses a large gap in knowledge

"Our project seeks to address a remarkable gap in our knowledge of the brain," Mitra explains. "Our knowledge of how the brain is wired remains piecemeal and partial after a century of intense activity. Francis Crick and Ted Jones emphasized this in an article published in Nature nearly 20 years ago. Yet to understand how the brain works (or fails to work in neurological or neuropsychiatric disease), it is critical that we understand this wiring diagram more fully. Further, there remain fundamental questions about brain evolution that cannot be addressed without obtaining such wiring diagrams for the brains of different species.”

The MBA Project, which has received critical funding from the Keck Foundation and from the National Institutes of Health, is distinguished by the approach advocated by Mitra and colleagues in a position paper published in 2009. Mitra there proposed mapping vertebrate brains at what he calls the “mesoscopic” scale, a middle-range amenable to light microscopy, providing far more detail than, for instance, MRI-based methods, and yet considerably less detail than is achievable via electron microscopy (EM). The latter approach, while useful for mapping synaptic connections between individual neurons, is feasible on a whole-brain basis only for very small brains (e.g. that of the fruitfly) or very small portions of the mouse brain.

The pragmatic approach Mitra advocated, and which is realized in this first data release, is to image whole mouse brains in a semi-automated, quality-controlled process using light microscopy and injected neural tracers (both viruses and classically used tracer substances). While the basic methodology has been available for some time, systematically applying it to a grid of locations spanning the entire brain, and digitizing and re-assembling the resulting collection of brains, is a new approach made feasible by the rapidly falling costs of computer storage. A single mouse brain at light-microscope resolution produces about a terabyte (1 trillion bytes, or 1000 GB) of data; thus, generating and storing the data set currently being gathered would have been prohibitively expensive a decade or so ago.
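The data volumes quoted in the article lend themselves to a quick back-of-the-envelope check. The sketch below uses only the figures given above (1 TB per brain, ~500 images per brain at 20-micron spacing, 500 TB collected so far); the per-section file size is a rough average, since actual sizes depend on bit depth, channels, and compression:

```python
SECTIONS_PER_BRAIN = 500     # optical sections per sampled brain
BYTES_PER_BRAIN = 10**12     # ~1 terabyte of image data per mouse brain
SECTION_SPACING_UM = 20      # micron spacing between imaged sections

# Average bytes stored per whole-brain section image.
bytes_per_section = BYTES_PER_BRAIN / SECTIONS_PER_BRAIN
print(f"~{bytes_per_section / 10**9:.0f} GB per section image")   # ~2 GB

# Physical depth covered by 500 sections at 20-micron spacing.
depth_mm = SECTIONS_PER_BRAIN * SECTION_SPACING_UM / 1000
print(f"stack depth: {depth_mm:.0f} mm")                          # 10 mm

# Brains implied by the 500 terabytes collected so far.
brains = 500 * 10**12 / BYTES_PER_BRAIN
print(f"~{brains:.0f} brains' worth of data")                     # ~500
```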

Assembling the circuit diagram at a mesoscopic scale using ‘shotgun approach’

A key point is that at the mesoscopic scale, the team expects to assemble a picture of connections that are stereotypical — that is, essentially the same in different individuals, and probably genetically determined in a species-specific manner. By dividing the volume of a hemisphere of the mouse brain into 250 equidistant, predefined grid-points, and administering four different kinds of tracer injections at each grid point — in different animals of the same sex and age — the eight-member team at CSHL, assisted by collaborating scientists at Boston University, MIT and the University of California, San Diego, seeks to assemble a complete wiring diagram that will be stitched together from the full dataset.
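The injection scheme described above implies a fixed experimental budget: one animal per (grid point, tracer) pair. A minimal sketch of that combinatorial plan follows; the tracer labels are illustrative placeholders, not the project's actual tracer names:

```python
from itertools import product

GRID_POINTS = 250  # predefined, equidistant grid locations per hemisphere
TRACERS = [        # four tracer kinds per grid point (names are hypothetical)
    "viral-tracer-green",
    "viral-tracer-red",
    "classical-anterograde",
    "classical-retrograde",
]

# One animal of the same sex and age per (grid point, tracer) combination.
injection_plan = list(product(range(GRID_POINTS), TRACERS))
print(len(injection_plan))  # 1000 injections to cover one hemisphere
```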

The project in this sense is analogous to the Human Genome Project’s “shotgun” approach, in that its final product — a comprehensive wiring diagram — will be the product of many individually obtained data components, woven together thanks to the power of advanced computing and informatics. Indeed, Mitra says one of the genome project’s early advocates, Dr. James D. Watson (now CSHL Chancellor Emeritus), provided him with motivation and encouragement to pursue the project.

"We will never understand how the brain works until we have the wiring diagram," Dr. Watson comments today. "Mitra is on the right track and I’m impressed he’s gone from conception to putting out data in a couple of years on a quite modest budget. His approach deserves strong funding support."

The MBA Project was also inspired by early efforts of the Allen Institute, funded by Microsoft co-founder and philanthropist Paul Allen, which resulted in assembly of a comprehensive map of gene expression across the mouse brain. That effort was the product of standard molecular biology procedures iterated in a quasi-industrialized process. The resulting whole-brain gene-expression map, while a triumph, was not designed to shed light on connections in the brain, which became a point of departure for Mitra.

Since the 2009 publication of Mitra and colleagues’ proposal for meso-scale circuit-mapping projects for whole vertebrate brains, the approach has not only spawned Mitra’s CSHL project, but also other meso-scale circuit-mapping projects for the mouse at the Allen Institute and at UCLA. Each differs in aim and technical detail.

A number of features distinguish the “meso-scale” circuit project at CSHL. The 20-micron spacing between brain “slices” gives the CSHL results a particularly rich sense of three-dimensional depth and detail. The team’s use of four tracers, including both classical tracer substances and neurotropic viruses (attenuated or disabled viruses that infect nerve cells), provides redundancy and helps control for differing efficacies of the different tracer substances. The images one sees on the MBA Project website beginning today provide hard data on actual neuronal processes — the “ground truth” of neuroanatomy, in Mitra’s words — and do not rely on inferential methodologies such as functional MRI scans and diffusion tensor imaging to suggest areas in which connections occur. Finally, it is noteworthy that the slides generated by the project are being physically stored, to permit re-examination at a later date, using more refined imaging methods if necessary or as new methods become available.

"Our project is what I’d call a necessary first step in a much larger enterprise, that of understanding both structure and dynamics of the vertebrate, and ultimately, the human brain," says Mitra. "While facile comparisons with Genome projects should be avoided, the data sets generated by the MBA and similar projects will provide a useful framework — not unlike a reference genome — on which we can ‘hang’ all kinds of neuroscience knowledge, the body of which has always been notably fragmentary."

Source: Science Daily

Filed under science neuroscience brain psychology

11 notes

How Does Exercise Affect Nerve Pain?

ScienceDaily (June 1, 2012) — Exercise helps to alleviate pain related to nerve damage (neuropathic pain) by reducing levels of certain inflammation-promoting factors, suggests an experimental study in the June issue of Anesthesia & Analgesia, official journal of the International Anesthesia Research Society (IARS).

The results support exercise as a potentially useful nondrug treatment for neuropathic pain, and suggest that it may work by reducing inflammation-promoting substances called cytokines. The lead author was Yu-Wen Chen, PhD, of China Medical University, Taichung, Taiwan.

Exercise Reduces Nerve Pain and Cytokine Expression in Rats

Neuropathic pain is a common and difficult-to-treat type of pain caused by nerve damage, seen in patients with trauma, diabetes, and other conditions. Phantom limb pain after amputation is an example of neuropathic pain.

Dr Chen and colleagues examined the effects of exercise on neuropathic pain induced by sciatic nerve injury in rats. After nerve injury, some animals performed progressive exercise — either swimming or treadmill running — over a few weeks. The researchers assessed the effects of exercise on neuropathic pain severity by monitoring observable pain behaviors.

The results suggested significant reductions in neuropathic pain in rats assigned to swimming or treadmill running. Exercise reduced abnormal responses to temperature and pressure — both characteristic of neuropathic pain.

Exercise also led to reduced expression of inflammation-promoting cytokines in sciatic nerve tissue — specifically, tumor necrosis factor-alpha and interleukin-1-beta. That was consistent with previous studies suggesting that inflammation and pro-inflammatory cytokines play a role in the development of neuropathic pain in response to nerve injury.

Exercise also led to increased expression of a protein, called heat shock protein-27, which may have contributed to the reductions in cytokine expression.

Neuropathic pain causes burning pain and numbness that is not controlled by conventional pain medications. Antidepressant and antiepileptic drugs may be helpful, but have significant side effects. Exercise is commonly recommended for patients with various types of chronic pain, but there are conflicting data as to whether it is helpful in neuropathic pain.

The new results support the benefits of exercise in reducing neuropathic pain, though not eliminating it completely. In the experiments, exercise reduced abnormal pain responses by 30 to 50 percent.

The study also adds new evidence that inflammation contributes to the development of neuropathic pain, including the possible roles of pro-inflammatory cytokines. The results provide support for exercise as a helpful, nondrug therapy for neuropathic pain — potentially reducing the need for medications and resulting side effects.

Source: Science Daily

Filed under science neuroscience brain psychology pain

16 notes

Prototype Device Translates Sign Language

ScienceDaily (June 1, 2012) — Too often, communication barriers exist between those who can hear and those who cannot. Sign language has helped bridge such gaps, but many people are still not fluent in its motions and hand shapes.

During the past semester, students in UH’s engineering technology and industrial design programs teamed up to develop the concept and prototype for MyVoice, a device that reads sign language and translates its motions into audible words. (Credit: Image courtesy of University of Houston)

Thanks to a group of University of Houston students, the hearing impaired may soon have an easier time communicating with those who do not understand sign language. During the past semester, students in UH’s engineering technology and industrial design programs teamed up to develop the concept and prototype for MyVoice, a device that reads sign language and translates its motions into audible words. Recently, MyVoice earned first place among student projects at the American Society of Engineering Education (ASEE) — Gulf Southwest Annual Conference.

MyVoice was developed through a collaborative senior capstone project pairing engineering technology students (Anthony Tran, Jeffrey Seto, Omar Gonzalez and Alan Tran) with industrial design students (Rick Salinas, Sergio Aleman and Ya-Han Chen). Overseeing the student teams were Farrokh Attarzadeh, associate professor of engineering technology, and EunSook Kwon, director of UH’s industrial design program.

MyVoice’s concept focuses on a handheld tool with a built-in microphone, speaker, soundboard, video camera and monitor. Placed on a hard surface, it reads a user’s sign language movements. Once MyVoice processes the motions, it translates the sign language into speech through an electronic voice. Likewise, it captures a person’s voice and translates the words into sign language, which is projected on its monitor.

The industrial designers researched the application of MyVoice by reaching out to the deaf community to understand the challenges associated with others not understanding sign language. They then designed MyVoice, while the engineering technology students had the arduous task of programming the device to translate motion into sound.

"The biggest difficulty was putting together a database of images of the sign language. It involved 200-300 images per sign," Seto said. "The team was ecstatic when the prototype came together."

From its conceptual stage, MyVoice evolved into a prototype that could translate a single phrase: “A good job, Cougars.”

"This wasn’t just a project we did for a grade," said Aleman, who just graduated from UH. "While designing and developing it, it turned into something very personal. When we got to know members of the deaf community and really understood their challenges, it made MyVoice very important to all of us."

Since MyVoice’s creation and first place prize at the ASEE conference, all of the team members have graduated. Still, Aleman said that the project is not history.

"We got it to work, but we hope to work with someone to implement this as a product," Aleman said. "We want to prove to the community that this will work for the hearing impaired."

"We are proud of such a contribution to society through MyVoice, which breaks the barrier between the deaf community and society at large," added Attarzadeh.

Source: Science Daily

Filed under science neuroscience brain psychology language

6 notes

Noninvasive brain stimulation shown to impact walking patterns

June 1, 2012

In a step towards improving rehabilitation for patients with walking impairments, researchers from the Kennedy Krieger Institute found that non-invasive stimulation of the cerebellum, an area of the brain known to be essential in adaptive learning, helped healthy individuals learn a new walking pattern more rapidly. The findings suggest that cerebellar transcranial direct current stimulation (tDCS) may be a valuable therapy tool to aid people relearning how to walk following a stroke or other brain injury.

Previous studies in the lab of Amy Bastian, PhD, PT, director of the Motion Analysis Laboratory at Kennedy Krieger Institute, have shown that the cerebellum, a part of the brain involved in movement coordination, is essential for walking adaptation. In this new study, Dr. Bastian and her colleagues explored the impact of stimulation over the cerebellum on adaptive learning of a new walking pattern. Specifically, her team tested how anode (positive), cathode (negative) or sham (none) stimulation affected this learning process.

"We’ve known that the cerebellum is essential to adaptive learning mechanisms like reaching, walking, balance and eye movements,” says Dr. Bastian. “In this study, we wanted to examine the effects of direct stimulation of the cerebellum on locomotor learning utilizing a split-belt treadmill that separately controls the legs.”

The study, published today in the Journal of Neurophysiology, found that by placing electrodes on the scalp over the cerebellum and applying very low levels of current, the rate of walking adaptation could be increased or decreased. Dr. Bastian’s team studied 53 healthy adults in a series of split-belt treadmill walking tests. Rather than a single belt, a split-belt treadmill consists of two belts that can move at different speeds. During split-belt walking, one leg is set to move faster than the other. This initially disrupts coordination between the legs, so the user is not walking symmetrically; over time, however, the user learns to adapt to the disturbance.

The main experiment consisted of a two-minute baseline period of walking with both belts at the same slow speed, followed by a 15-minute period with the belts at two separate speeds. While people were on the treadmill, researchers stimulated one side of the cerebellum to assess the impact on the rate of re-adjustment to a symmetric walking pattern.

Dr. Bastian’s team found not only that cerebellar tDCS can change the rate of cerebellum-dependent locomotor learning, but specifically that the anode speeds up learning and the cathode slows it down. It was also surprising that the side of the cerebellum that was stimulated mattered; only stimulation of the side that controls the leg walking on the faster treadmill belt changed adaptation rate.

"It is important to demonstrate that we can make learning faster or slower, as it suggests that we are not merely interfering with brain function," says Dr. Bastian. "Our findings also suggest that tDCS can be selectively used to assess and understand motor learning."

The results from this study present an exciting opportunity to test cerebellar tDCS as a rehabilitation tool. Dr. Bastian says, “If anodal tDCS prompts faster learning, this may help reduce the amount of time needed for stroke patients to relearn to walk evenly. It may also be possible to use tDCS to help sustain gains made in therapy, so patients can retain and practice improved walking patterns for a longer period of time. We are currently testing these ideas in individuals who have had a stroke.”

Provided by Kennedy Krieger Institute

Source: medicalxpress.com

Filed under science neuroscience brain psychology

4 notes

Flies With Restless Legs Syndrome Point to a Genetic Cause

ScienceDaily (May 31, 2012) — When flies are made to lose a gene with links to Restless Legs Syndrome (RLS), they suffer the same sleep disturbances and restlessness that human patients do. The findings reported online on May 31 in Current Biology, a Cell Press publication, strongly suggest a genetic basis for RLS, a condition in which patients complain of an irresistible urge to move that gets worse as they try to rest.

"Although widely prevalent, RLS is a disorder whose pathophysiological basis remains very poorly understood," said Subhabrata Sanyal of Emory University School of Medicine. "The major significance of our study is to highlight the fact that there might be a genetic basis for RLS. Understanding the function of these genes also helps to understand and diagnose the disease and may offer more focused therapeutic options that are currently limited to very general approaches."

Sanyal’s team recognized that a number of genome-wide association studies in humans had suggested connections between RLS and variation in a single gene (BTBD9).

"BTBD9 function or its relationship to RLS and sleep were a complete mystery," Sanyal said.

His team realized that there might be a way to shed some light on that mystery in fruit flies. Flies have a single, highly conserved version of the human BTBD9. They decided to test whether the gene that had turned up in those human studies would have any effect on sleep in the insects. In fact, flies need sleep just like humans do, and their sleep patterns are influenced by the same kinds of brain chemistry.

The researchers now report that flies lacking their version of the RLS-associated gene do indeed lose sleep and move about more. When those flies were treated with a drug used for RLS, they showed improvements in their sleep.

The studies also yielded evidence about how the RLS gene works by controlling dopamine levels in the brain as well as iron balance in cells. Sanyal said his team will continue to explore other RLS-related genes that have been identified in human studies in search of more details of their interaction and function.

"Our results support the idea that genetic regulation of dopamine and iron metabolism constitute the core pathophysiology of at least some forms of RLS," the researchers write.

More broadly, they say, the study emphasizes the utility of simple animals such as fruit flies in unraveling the genetics of sleep and sleep disorders.

Source: Science Daily

Filed under neuroscience psychology science RLS genes

21 notes

Walking and Running Again After Spinal Cord Injury

ScienceDaily (May 31, 2012) — Rats with spinal cord injuries and severe paralysis are now walking (and running) thanks to researchers at EPFL. Published in the June 1, 2012 issue of Science, the results show that a severed section of the spinal cord can make a comeback when its own innate intelligence and regenerative capacity are awakened. The study, begun five years ago at the University of Zurich, points to a profound change in our understanding of the central nervous system. According to lead author Grégoire Courtine, it is not yet clear if similar rehabilitation techniques could work for humans, but the observed nerve growth hints at new methods for treating paralysis.

Test subject takes first steps up stairs after neurorehabilitation with a combination of robotic harness and electrical-chemical stimulation. (Credit: EPFL/Grégoire Courtine)

"After a couple of weeks of neurorehabilitation with a combination of a robotic harness and electrical-chemical stimulation, our rats are not only voluntarily initiating a walking gait, but they are soon sprinting, climbing up stairs and avoiding obstacles when stimulated," explains Courtine, who holds the International Paraplegic Foundation (IRP) Chair in Spinal Cord Repair at EPFL.

Waking up the spinal cord

It is well known that the brain and spinal cord can adapt and recover from moderate injury, a quality known as neuroplasticity. But until now the spinal cord expressed so little plasticity after severe injury that recovery was impossible. Courtine’s research proves that, under certain conditions, plasticity and recovery can take place in these severe cases — but only if the dormant spinal column is first woken up.

To do this, Courtine and his team injected a chemical solution of monoamine agonists into the rats. These chemicals trigger cell responses by binding to specific dopamine, adrenaline, and serotonin receptors located on the spinal neurons. This cocktail replaces neurotransmitters released by brainstem pathways in healthy subjects and acts to excite neurons and ready them to coordinate lower body movement when the time is right.

Five to 10 minutes after the injection, the scientists electrically stimulated the spinal cord with electrodes implanted in the outermost layer of the spinal canal, called the epidural space. “This localized epidural stimulation sends continuous electrical signals through nerve fibers to the chemically excited neurons that control leg movement. All that was left was to initiate that movement,” explains Rubia van den Brand, contributing author to the study.

The innate intelligence of the spinal column

In 2009, Courtine had already reported on restoring movement, albeit involuntary movement. He discovered that a stimulated rat spinal column — physically isolated from the brain from the lesion down — developed in a surprising way: It started taking over the task of modulating leg movement, allowing previously paralyzed animals to walk on treadmills. These experiments revealed that the movement of the treadmill created sensory feedback that initiated walking — the innate intelligence of the spinal column took over, and walking essentially occurred without any input from the rat’s actual brain. This surprised the researchers and led them to believe that only a very weak signal from the brain was needed for the animals to initiate movement of their own volition.

To test this theory, Courtine replaced the treadmill with a device that vertically supported the subjects: a mechanical harness that did not facilitate forward movement and only came into play when they lost balance, giving them the impression of having a healthy, working spinal column. This encouraged the rats to will themselves toward a chocolate reward at the other end of the platform. “What we deemed willpower-based training translated into a fourfold increase in nerve fibers throughout the brain and spine — a regrowth that proves the tremendous potential for neuroplasticity even after severe central nervous system injury,” says Janine Heutschi, co-author in the study.

First human rehabilitation on the horizon

Courtine calls this regrowth “new ontogeny,” a sort of duplication of an infant’s growth phase. The researchers found that the newly formed fibers bypassed the original spinal lesion and allowed signals from the brain to reach the electrochemically-awakened spine. And the signal was sufficiently strong to initiate movement over ground — without the treadmill — meaning the rats began to walk voluntarily towards the reward, entirely supporting their own weight with their hind legs.

"This is the world-cup of neurorehabilitation," exclaims Courtine. "Our rats have become athletes when just weeks before they were completely paralyzed. I am talking about 100% recuperation of voluntary movement."

In principle, the radical reaction of the rat spinal cord to treatment offers reason to believe that people with spinal cord injury will soon have some options on the horizon. Courtine is optimistic that human, phase-two trials will begin in a year or two at Balgrist University Hospital Spinal Cord Injury Centre in Zurich, Switzerland. Meanwhile, researchers at EPFL are coordinating a nine million Euro project called NeuWalk that aims at designing a fully operative spinal neuroprosthetic system, much like the one used here with rats, for implanting into humans.

Source: Science Daily

Filed under science neuroscience CNS psychology

35 notes

Alzheimer’s Protein Structure Suggests New Treatment Directions

ScienceDaily (May 31, 2012) — The molecular structure of a protein involved in Alzheimer’s disease — and the surprising discovery that it binds cholesterol — could lead to new therapeutics for the disease, Vanderbilt University investigators report in the June 1 issue of the journal Science.

Vanderbilt Center for Structural Biology investigators determined the structure of the C99 protein (shown in green and blue), which participates in triggering Alzheimer’s disease. Their discovery that C99 binds to cholesterol (shown in black, white and red) suggests a mechanism for cholesterol’s recognized role in promoting the memory-robbing disease and may lead to new therapeutics. (Credit: Charles Sanders and colleagues/Vanderbilt University)

Charles Sanders, Ph.D., professor of Biochemistry, and colleagues in the Center for Structural Biology determined the structure of part of the amyloid precursor protein (APP) — the source of amyloid-beta, which is believed to trigger Alzheimer’s disease. Amyloid-beta clumps together into oligomers that kill neurons, causing dementia and memory loss. The amyloid-beta oligomers eventually form plaques in the brain — one of the hallmarks of the disease.

"Anything that lowers amyloid-beta production should help prevent, or possibly treat, Alzheimer’s disease," Sanders said.

Amyloid-beta production requires two “cuts” of the APP protein. The first cut, by the enzyme beta-secretase, generates the C99 protein, which is then cut by gamma-secretase to release amyloid-beta. The Vanderbilt researchers used nuclear magnetic resonance and electron paramagnetic resonance spectroscopy to determine the structure of C99, which has one membrane-spanning region.

They were surprised to discover what appeared to be a “binding” domain in the protein. Based on previously reported evidence that cholesterol promotes Alzheimer’s disease, they suspected that cholesterol might be the binding partner. The researchers used a model membrane system called “bicelles” (that Sanders developed as a postdoctoral fellow) to demonstrate that C99 binds cholesterol.

"It has long been thought that cholesterol somehow promotes Alzheimer’s disease, but the mechanisms haven’t been clear," Sanders said. "Cholesterol binding to APP and its C99 fragment is probably one of the ways it makes the disease more likely."

Sanders and his team propose that cholesterol binding moves APP to special regions of the cell membrane called “lipid rafts,” which contain “cliques of molecules that like to hang out together,” he said.

Beta- and gamma-secretase are part of the lipid raft clique.

"We think that when APP doesn’t have cholesterol around, it doesn’t care what part of the membrane it’s in," Sanders said. "But when it binds cholesterol, that drives it to lipid rafts, where these ‘bad’ secretases are waiting to clip it and produce amyloid-beta."

The findings suggest a new therapeutic strategy to reduce amyloid-beta production, he said.

"If you could develop a drug that blocks cholesterol from binding to APP, then you would keep the protein from going to lipid rafts. Instead it would be cleaved by alpha-secretase — a ‘good’ secretase that isn’t in rafts and doesn’t generate amyloid-beta."

Drugs that inhibit beta- or gamma-secretase — to directly limit amyloid-beta production — have been developed and tested, but they have toxic side effects. A drug that blocks cholesterol binding to APP may be more specific and effective in reducing amyloid-beta levels and in preventing, or treating, Alzheimer’s disease.

The C99 structure had some other interesting details, Sanders said.

The membrane domain of C99 is curved, which was unexpected but fits perfectly into the predicted active site of gamma-secretase. Also, a certain sequence of amino acids (GXXXG) that usually promotes membrane protein dimerization (two of the same proteins interacting with each other) turned out to be central to the cholesterol-binding domain. This is a completely new function for GXXXG motifs, Sanders said.

"This revealing new information on the structure of the amyloid precursor protein and its interaction with cholesterol is a perfect example of the power of team science," said Janna Wehrle, Ph.D., who oversees grants focused on the biophysical properties of proteins at the National Institutes of Health’s National Institute of General Medical Sciences (NIGMS), which partially funded the work. "The researchers at Vanderbilt brought together biological and medical insight, cutting-edge physical techniques and powerful instruments, each providing a valuable tool for piecing together the puzzle."

"When we were developing bicelles 20 years ago, no one was saying, ‘someday these things are going to lead to discoveries in Alzheimer’s disease,’" Sanders said. "It was interesting basic science research that is now paying off."

Source: Science Daily

Filed under science neuroscience brain psychology alzheimer

4 notes

Memory Training Unlikely to Help in Treating ADHD, Boosting IQ

ScienceDaily (May 31, 2012) — Working memory training is unlikely to be an effective treatment for children suffering from disorders such as attention-deficit/hyperactivity disorder (ADHD) or dyslexia, according to a research analysis published by the American Psychological Association. In addition, memory training tasks appear to have limited effect on healthy adults and children looking to do better in school or improve their cognitive skills.

"The success of working memory training programs is often based on the idea that you can train your brain to perform better, using repetitive memory trials, much like lifting weights builds muscle mass," said the study’s lead author, Monica Melby-Lervåg, PhD, of the University of Oslo. "However, this analysis shows that simply loading up the brain with training exercises will not lead to better performance outside of the tasks presented within these tests." The article was published online in Developmental Psychology.

Working memory enables people to complete tasks at hand by allowing the brain to retain pertinent information temporarily. Tasks designed to enhance working memory usually ask people to remember information presented to them while they are performing distracting activities. For example, participants may be presented with a series of numbers one at a time on a computer screen. The computer presents a new digit and then prompts participants to recall the number immediately preceding. More difficult versions might ask participants to recall what number appeared two, three or four digits ago.
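The digit-recall procedure described above is a version of the classic n-back task. As a rough illustration only (not the exact protocol of any study in the analysis), this minimal Python sketch shows what counts as a correct response at each prompt:

```python
def n_back_answers(sequence, n):
    """For an n-back digit task, the correct response at each prompt
    (from position n onward) is the digit shown n steps earlier."""
    return [sequence[i - n] for i in range(n, len(sequence))]

# A participant sees digits one at a time; at each prompt they must
# recall the digit presented n positions back.
digits = [3, 7, 1, 4, 9, 2]
print(n_back_answers(digits, 1))  # immediately preceding digit: [3, 7, 1, 4, 9]
print(n_back_answers(digits, 2))  # two digits back: [3, 7, 1, 4]
```

Raising n increases the load on working memory, which is why harder versions ask for digits three or four positions back.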

In this meta-analysis, researchers from the University of Oslo and University College London examined 23 peer-reviewed studies yielding 30 separate group comparisons that met their criteria. All were randomized controlled trials or experiments that included a working memory training condition and a control group. The studies covered a wide range of participants, including young children, children with cognitive impairments such as ADHD, and healthy adults. Most had been published within the last 10 years.
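A meta-analysis of this kind pools a standardized effect size from each comparison, weighting more precise estimates more heavily. The sketch below is purely illustrative (the effect sizes and variances are invented, not the study’s data) and shows the simplest fixed-effect, inverse-variance pooling:

```python
def pooled_effect(effects, variances):
    """Inverse-variance weighted mean of per-study effect sizes
    (fixed-effect model; each study's weight is 1 / sampling variance)."""
    weights = [1.0 / v for v in variances]
    total = sum(w * d for w, d in zip(weights, effects))
    return total / sum(weights)

# Hypothetical standardized mean differences from three comparisons
effects = [0.45, 0.10, 0.20]    # training vs. control effect sizes
variances = [0.04, 0.01, 0.02]  # sampling variances (smaller = more precise)
print(round(pooled_effect(effects, variances), 3))  # → 0.179
```

Published meta-analyses typically also fit random-effects models, which add a between-study variance term so that true effects are allowed to differ across studies.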

Overall, working memory training improved performance on tasks related to the training itself but did not have an impact on more general cognitive performance such as verbal skills, attention, reading or arithmetic. “In other words, the training may help you improve your short-term memory when it’s related to the task implemented in training but it won’t improve reading difficulties or help you pay more attention in school,” said Melby-Lervåg.

In recent years, several commercial, computer-based working memory training programs have been developed and purport to benefit students suffering from ADHD, dyslexia, language disorders, poor academic performance or other issues. Some even claim to boost people’s IQs. These programs are widely used around the world in schools and clinics, and most involve tasks in which participants are given many memory tests that are designed to be challenging, the study said.

"In the light of such evidence, it seems very difficult to justify the use of working memory training programs in relation to the treatment of reading and language disorders," said Melby-Lervåg. "Our findings also cast strong doubt on claims that working memory training is effective in improving cognitive ability and scholastic attainment."

Source: Science Daily

Filed under science neuroscience brain psychology memory

5 notes

Fantasizing About Your Dream Vacation Could Lead to Poor Decision-Making

ScienceDaily (May 31, 2012) — Summer vacation time is upon us. If you have been saving up for your dream vacation for years, you may want to make sure your dream spot is still the best place to go. A new study has found that when we fantasize about such trips before they are possible, we tend to overlook the negatives — thus influencing our decision-making down the line.

(Credit: © XtravaganT / Fotolia)

"We were interested in the effects of positive fantasies — what happens when people imagine an idealized, best-case-scenario version of the future, compared to when they imagine a less idealized version," says Heather Barry Kappes of New York University, co-author of the study published online this week in Personality and Social Psychology Bulletin. “This is one of the first papers to examine selective information acquisition at this early stage, before people are seriously considering a possibility.”

Say, for example, that you would like to take a trip to Australia this year but think you are very unlikely to do so — you have no more vacation time left, cannot afford it, or would rather save up for a new car. But you still daydream about how nice it would be to see the Australian Outback and lie on the white sand beaches, perhaps without thinking about the long plane ride there or the poisonous animals. Those daydreams, Kappes says, have powerful effects.

To test those effects, Kappes and co-author Gabriele Oettingen asked people to imagine a particular future about one of three topics: wearing glamorous high-heeled shoes, making money in the stock market, or taking a vacation. To induce positive fantasies for each topic, the study participants were prompted to think about how great it would be to do each activity. In the control condition, participants also imagined experiencing the future, but were prompted to think about the negatives as well, with questions like “Would it really be so great?” In both conditions, participants wrote down what they were thinking, for the researchers to ensure they were engaged in the imagery.

After that exercise, the researchers offered the participants a choice of different types of information. For example, participants could browse a website describing the positive and negative health consequences of wearing high heels, and researchers noted how much more time they spent reading about positive versus negative consequences. Or, they could choose which of five (fictitious) tripadvisor.com reviews they wanted to read, and researchers recorded whether they chose one that was more pro-trip (i.e., five stars) or con-trip (i.e., one star).

Kappes’ team found that for each topic, imagining the idealized version made people prefer to learn about the pros rather than the cons of the future event. “These effects are pronounced when people are not seriously considering pursuing a given future,” Kappes says.

The work has important implications for even the most deliberate of decision-makers. “When people are seriously considering implementing a decision like taking a trip, they often engage in careful deliberations about the pros versus cons,” Kappes says. “Our work suggests that before getting to this point, positive fantasies might lead people to acquire biased information — to learn more about the pros rather than the cons. Thus, even if people deliberate very carefully on the information they’ve acquired, they could still make poor decisions.”

People need to be aware of these effects to ensure that they acquire balanced information before it is time to make a decision, she says. The study also contributes to a larger body of research about the powerful consequences of mental imagery — and shows that positive thinking may not always be best. “Although there are benefits to imagining a positive future, there are also drawbacks, and it’s important to recognize them in order to most effectively pursue our goals.”

Source: Science Daily

Filed under science neuroscience psychology brain
