Neuroscience


May 2012

Surgeons Restore Some Hand Function to Quadriplegic Patient

May 15th, 2012

Technique could help those with C6, C7 spinal cord injuries.

Surgeons at Washington University School of Medicine in St. Louis have restored some hand function in a quadriplegic patient with a spinal cord injury at the C7 vertebra, the lowest bone in the neck. Instead of operating on the spine itself, the surgeons rerouted working nerves in the upper arms. These nerves still “talk” to the brain because they attach to the spine above the injury.

Following the surgery, performed at Barnes-Jewish Hospital, and one year of intensive physical therapy, the patient regained some hand function, specifically the ability to bend the thumb and index finger. He can now feed himself bite-size pieces of food and write with assistance.

The case study, published online May 15 in the Journal of Neurosurgery, is, to the authors’ knowledge, the first reported case of using nerve transfer to restore the ability to flex the thumb and index finger after a spinal cord injury.

“This procedure is unusual for treating quadriplegia because we do not attempt to go back into the spinal cord where the injury is,” says surgeon Ida K. Fox, MD, assistant professor of plastic and reconstructive surgery at Washington University, who treats patients at Barnes-Jewish Hospital. “Instead, we go out to where we know things work — in this case the elbow — so that we can borrow nerves there and reroute them to give hand function.”

[Image]

To detour around the block in this patient’s C7 spinal cord injury and return hand function, Mackinnon operated in the upper arms. There, the working nerves that connect above the injury (green) and the non-working nerves that connect below the injury (red) run parallel to each other, making it possible to tap into a functional nerve and direct those signals to a non-functional neighbor (yellow arrow). Image adapted from an illustration by Eric Young in the original press release.

Although patients with spinal cord injuries at the C6 and C7 vertebrae have no hand function, they do have shoulder, elbow and some wrist function because the associated nerves attach to the spinal cord above the injury and connect to the brain. Since the surgeon must tap into these working nerves, the technique will not benefit patients who have lost all arm function due to higher injuries — in vertebrae C1 through C5.

The surgery was developed and performed by the study’s senior author Susan E. Mackinnon, MD, chief of the Division of Plastic and Reconstructive Surgery at Washington University School of Medicine. Specializing in injuries to peripheral nerves, she has pioneered similar surgeries to return function to injured arms and legs.

Mackinnon originally developed this procedure for patients with arm injuries specifically damaging the nerves that provide the ability to flex the thumb and index finger. This is the first time she has applied this peripheral nerve technique to return limb function after a spinal cord injury.

[Video: Surgeons restore some hand function to quadriplegic patient]

“Many times these patients say they would like to be able to do very simple things,” Fox says. “They say they would like to be able to feed themselves or write without assistance. If we can restore the ability to pinch between thumb and index finger, it can return some very basic independence.”

Mackinnon cautions that the hand function restored to the patient was not instantaneous and required intensive physical therapy. It takes time to retrain the brain to understand that nerves that used to bend the elbow now provide pinch, she says.

Though this study reports only one case, Mackinnon and her colleagues do not anticipate a limited window of time during which a patient with a similar spinal cord injury must be treated with this nerve transfer technique. This patient underwent the surgery almost two years after his injury. As long as the nerve remains connected to the support and nourishment of the spinal cord, even though it no longer “talks” to the brain, the nerve and its associated muscle remain healthy, even years after the injury.

“The spinal cord is the control center for the nerves, which run like spaghetti all the way out to the tips of the fingers and the tips of the toes,” says Mackinnon, the Sydney M. Shoenberg Jr. and Robert H. Shoenberg Professor and director of the School of Medicine’s Center for Nerve Injury and Paralysis. “Even nerves below the injury remain healthy because they are still connected to the spinal cord. The problem is that these nerves no longer ‘talk’ to the brain because the spinal cord injury blocks the signals.”

To detour around the block in this patient’s C7 spinal cord injury and return hand function below the level of the injury, Mackinnon operated in the upper arms. There, the working nerves that connect above the injury and the non-working nerves that connect below the injury run parallel to each other, making it possible to tap into a functional nerve and direct those signals to a non-functional neighbor.

In this case, Mackinnon took a non-working nerve that controls the ability to pinch and plugged it into a working nerve that drives one of two muscles that flex the elbow. After the surgery, the biceps still flexes the elbow, but a second muscle, called the brachialis, which also used to provide elbow flexion, now bends the thumb and index finger.

“This is not a particularly expensive or overly complex surgery,” Mackinnon says. “It’s not a hand or a face transplant, for example. It’s something we would like other surgeons around the country to do.”

By Julia Evangelou Strait

Source: Neuroscience News

May 16, 2012 · 10 notes
#science #neuroscience
This Is Your Brain On Sugar: Study in Rats Shows High-Fructose Diet Sabotages Learning, Memory

ScienceDaily (May 15, 2012) — Attention, college students cramming between midterms and finals: Binging on soda and sweets for as little as six weeks may make you stupid.

[Image]

New research suggests that binging on soda and sweets for as little as six weeks may make you stupid. (Credit: © RTimages / Fotolia)

A new UCLA rat study is the first to show how a diet steadily high in fructose slows the brain, hampering memory and learning — and how omega-3 fatty acids can counteract the disruption. The peer-reviewed Journal of Physiology publishes the findings in its May 15 edition.

"Our findings illustrate that what you eat affects how you think," said Fernando Gomez-Pinilla, a professor of neurosurgery at the David Geffen School of Medicine at UCLA and a professor of integrative biology and physiology in the UCLA College of Letters and Science. "Eating a high-fructose diet over the long term alters your brain’s ability to learn and remember information. But adding omega-3 fatty acids to your meals can help minimize the damage."

While earlier research has revealed how fructose harms the body through its role in diabetes, obesity and fatty liver, this study is the first to uncover how the sweetener influences the brain.

The UCLA team zeroed in on high-fructose corn syrup, an inexpensive liquid six times sweeter than cane sugar that is commonly added to processed foods, including soft drinks, condiments, applesauce and baby food. The average American consumes more than 40 pounds of high-fructose corn syrup per year, according to the U.S. Department of Agriculture. “We’re not talking about naturally occurring fructose in fruits, which also contain important antioxidants,” explained Gomez-Pinilla, who is also a member of UCLA’s Brain Research Institute and Brain Injury Research Center. “We’re concerned about high-fructose corn syrup that is added to manufactured food products as a sweetener and preservative.”

Gomez-Pinilla and study co-author Rahul Agrawal, a UCLA visiting postdoctoral fellow from India, studied two groups of rats that each consumed a fructose solution as drinking water for six weeks. The second group also received omega-3 fatty acids in the form of flaxseed oil and docosahexaenoic acid (DHA), which protects against damage to the synapses — the chemical connections between brain cells that enable memory and learning.

"DHA is essential for synaptic function — brain cells’ ability to transmit signals to one another," Gomez-Pinilla said. "This is the mechanism that makes learning and memory possible. Our bodies can’t produce enough DHA, so it must be supplemented through our diet."

The animals were fed standard rat chow and trained on a maze twice daily for five days before starting the experimental diet. The UCLA team tested how well the rats were able to navigate the maze, which contained numerous holes but only one exit. The scientists placed visual landmarks in the maze to help the rats learn and remember the way.

Six weeks later, the researchers tested the rats’ ability to recall the route and escape the maze. What they saw surprised them.

"The second group of rats navigated the maze much faster than the rats that did not receive omega-3 fatty acids," Gomez-Pinilla said. "The DHA-deprived animals were slower, and their brains showed a decline in synaptic activity. Their brain cells had trouble signaling each other, disrupting the rats’ ability to think clearly and recall the route they’d learned six weeks earlier."

The DHA-deprived rats also developed signs of resistance to insulin, a hormone that controls blood sugar and regulates synaptic function in the brain. A closer look at the rats’ brain tissue suggested that insulin had lost much of its power to influence the brain cells.

"Because insulin can penetrate the blood-brain barrier, the hormone may signal neurons to trigger reactions that disrupt learning and cause memory loss," Gomez-Pinilla said.

He suspects that fructose is the culprit behind the DHA-deficient rats’ brain dysfunction. Eating too much fructose could block insulin’s ability to regulate how cells use and store sugar for the energy required for processing thoughts and emotions.

"Insulin is important in the body for controlling blood sugar, but it may play a different role in the brain, where insulin appears to disturb memory and learning," he said. "Our study shows that a high-fructose diet harms the brain as well as the body. This is something new."

Gomez-Pinilla, a native of Chile and an exercise enthusiast who practices what he preaches, advises people to keep fructose intake to a minimum and swap sugary desserts for fresh berries and Greek yogurt, which he keeps within arm’s reach in a small refrigerator in his office. An occasional bar of dark chocolate that hasn’t been processed with a lot of extra sweetener is fine too, he said.

Still planning to throw caution to the wind and indulge in a hot-fudge sundae? Then also eat foods rich in omega-3 fatty acids, like salmon, walnuts and flaxseeds, or take a daily DHA capsule. Gomez-Pinilla recommends one gram of DHA per day.

"Our findings suggest that consuming DHA regularly protects the brain against fructose’s harmful effects," said Gomez-Pinilla. "It’s like saving money in the bank. You want to build a reserve for your brain to tap when it requires extra fuel to fight off future diseases."

Source: Science Daily

May 16, 2012 · 28 notes
#science #neuroscience #brain #memory #psychology
Chronic Child Abuse Strong Indicator of Negative Adult Experiences

ScienceDaily (May 15, 2012) — Child abuse or neglect is a strong predictor of major health and emotional problems, but little is known about how the chronicity of the maltreatment may increase future harm apart from other risk factors in a child’s life.

[Image]

This chart illustrates the individual childhood and adult outcomes according to the number of reports that occurred before the event of interest. Because it was possible for some children to enter the study period with a pre-existing condition, these are indicated as gray or black bars with the legend indicating the outcome occurred “before the study.” Chronicity is associated with increasing risk for all but child maltreatment perpetration, violent delinquency, and head or brain injury. In these cases, there is a slight decline in prevalence for the highest category compared with middle categories, but in all cases having reports was associated with higher rates of outcomes. (Credit: Image courtesy of Washington University in St. Louis)

In a new study published in the current issue of the journal Pediatrics, Melissa Jonson-Reid, PhD, child welfare expert and a professor at the Brown School at Washington University in St. Louis, looked at how chronic maltreatment impacted the future health and behavior of children and adults.

The study tracked children by number of child maltreatment reports (zero to four or more) and followed the children into early adulthood, by which time some of the children had become parents.

The study sought to determine how well the number of child maltreatment reports predicted poor outcomes in adolescence, such as delinquency, substance abuse in the teen years or getting a sexually transmitted disease.

"For every measure studied, a more chronic history of child maltreatment reports was powerfully predictive of worse outcomes," Jonson-Reid says.

"For most outcomes, having a single maltreatment report put children at a 20 percent to 50 percent higher risk than non-maltreated comparison children.

In addition, a series of adult outcomes were tracked to see if the chronicity of maltreatment still mattered after controlling for the poor outcomes in adolescence. Adult outcomes included adult substance abuse and growing up to have children whom they then maltreated.

"In models of adult outcomes, children with four or more reports were about least twice as likely to later abuse their own children and have contact with the mental health system, even when controlling for the negative outcomes during adolescence." Jonson-Reid says that there appears to be good reason to put resources into preventing ongoing maltreatment.

"Successfully interrupting chronic child maltreatment may well reduce risk of a wide range of other costly child and adolescent health and behavioral problems," she says.

Jonson-Reid cites a recently published Centers for Disease Control and Prevention study estimating lifetime costs for a single year’s worth of children reported for maltreatment at $242 billion.

"What our study illustrates is that these costs are even more likely to accrue for children who continue to be re-reported," she says.

The study also found that maltreatment predicts a range of negative adolescent outcomes, and those adolescent outcomes then predict poor adult outcomes.

"If the poor outcomes in adolescence can be dealt with effectively, then later adult outcomes may also be forestalled," Jonson-Reid says.

"Our findings could therefore be interpreted as supporting many current evidence-based interventions that seek to improve behavioral and social functioning among children and adolescents who have experienced trauma like abuse or neglect."

Source: Science Daily

May 15, 2012 · 13 notes
#science #neuroscience #psychology
Mystery Gene Reveals New Mechanism for Anxiety Disorders

ScienceDaily (May 15, 2012) — A novel mechanism for anxiety behaviors, including a previously unrecognized inhibitory brain signal, may inspire new strategies for treating psychiatric disorders, University of Chicago researchers report.

By testing the controversial role of a gene called Glo1 in anxiety, scientists uncovered a new inhibitory factor in the brain: the metabolic by-product methylglyoxal. The system offers a tantalizing new target for drugs designed to treat conditions such as anxiety disorder, epilepsy, and sleep disorders.

The study, published in the Journal of Clinical Investigation, found that animals with multiple copies of the Glo1 gene were more likely to exhibit anxiety-like behavior in laboratory tests. Further experiments showed that Glo1 increased anxiety-like behavior by lowering levels of methylglyoxal (MG). Conversely, inhibiting Glo1 or raising MG levels reduced anxiety behaviors.

"Animals transgenic for Glo1 had different levels of anxiety-like behavior, and more copies made them more anxious," said Abraham Palmer, PhD, assistant professor of human genetics at the University of Chicago Medicine and senior author of the study. "We showed that Glo1 was causally related to anxiety-like behavior, rather than merely correlated."

In 2005, a comparison of different mouse strains found a link between anxiety-like behaviors and Glo1, the gene encoding the metabolic enzyme glyoxalase 1. However, subsequent studies questioned the link, and the lack of an obvious connection between glyoxalase 1 and brain function or behavior made some scientists skeptical.


May 15, 2012 · 27 notes
#science #neuroscience #brain #psychology #anxiety
Drug from lizard saliva reduces cravings for food

May 15, 2012

A drug made from the saliva of the Gila monster lizard is effective in reducing the craving for food. Researchers at the Sahlgrenska Academy, University of Gothenburg, have tested the drug on rats, which after treatment stopped craving both food and chocolate.

[Image]

In a study with rats published in the Journal of Neuroscience, Assistant Professor Karolina Skibicka and her colleagues show that exendin-4 effectively reduces the cravings for food. Credit: Photo: University of Gothenburg

An increasing number of patients suffering from type 2 diabetes are offered a pharmaceutical preparation called Exenatide, which helps them to control their blood sugar. The drug is a synthetic version of a natural substance called exendin-4, which is obtained from a rather unusual source – the saliva of the Gila monster lizard (Heloderma suspectum), North America’s largest lizard.

Researchers at the Sahlgrenska Academy at the University of Gothenburg have now found an entirely new and unexpected effect of the lizard substance.

In a study with rats published in the Journal of Neuroscience, Assistant Professor Karolina Skibicka and her colleagues show that exendin-4 effectively reduces the cravings for food.

"This is both unknown and quite unexpected effect," comments an enthusiastic Karolina Skibicka:

" Our decision to eat is linked to the same mechanisms in the brain which control addictive behaviours. We have shown that exendin-4 affects the reward and motivation regions of the brain"

“The implications of the findings are significant,” states Suzanne Dickson, Professor of Physiology at the Sahlgrenska Academy: “Most dieting fails because we are obsessed with the desire to eat, especially tempting foods like sweets. As exendin-4 suppresses the cravings for food, it can help obese people to take control of their weight,” suggests Professor Dickson.

Research on exendin-4 also gives hope for new ways to treat diseases related to eating disorders, for example, compulsive overeating.

Another hypothesis for the Gothenburg researchers’ continuing studies is that exendin-4 may be used to reduce the craving for alcohol.

"It is the same brain regions which are involved in food cravings and alcohol cravings, so it would be very interesting to test whether exendin-4 also reduces the cravings for alcohol,” suggests Assistant Professor Skibicka.

Provided by University of Gothenburg

Source: medicalxpress.com

May 15, 2012 · 18 notes
#neuroscience #science #psychology
Active lifestyle in the elderly keeps their brains running

May 15, 2012

(Medical Xpress) — New research from Uppsala University, Sweden, suggests that an active lifestyle in late life protects grey matter and cognitive functions in humans. The findings are now published in the scientific journal Neurobiology of Aging.

In a new study, a multidisciplinary research team from Uppsala University systematically studied 331 men and women aged 75. The researchers examined whether an active lifestyle is tied to brain health in seniors living in Uppsala, Sweden. The brain structure of each participant was measured using magnetic resonance imaging (MRI), and various memory tests were administered in order to monitor the seniors’ cognitive status.

“We found that those elderly who reported being more active in their daily routine had larger grey and white matter volumes and showed better performance on various memory tests, compared to those who had a sedentary lifestyle. Interestingly, the active elderly also had more grey matter in the precuneus, a brain region that typically shrinks at the beginning of Alzheimer’s disease. Our findings suggest that an active lifestyle is a promising strategy for counteracting cognitive aging late in life,” says Christian Benedict.

The data for the study were taken from the major epidemiological study Prospective Investigation of the Vasculature in Uppsala Seniors (PIVUS). http://www.medsci.uu.se/pivus/

More information: Benedict C et al., Association between physical activity and brain health in older adults, Neurobiology of Aging, in press. http://www.sciencedirect.com/science/article/pii/S0197458012002618

Provided by Uppsala University

Source: medicalxpress.com

May 15, 2012 · 8 notes
#science #neuroscience #brain #psychology
First Gene Therapy Successful Against Aging-Associated Decline: Mouse Lifespan Extended Up to 24% With a Single Treatment

ScienceDaily (May 14, 2012) — A new study in which cells were induced to express telomerase, the enzyme that (metaphorically speaking) slows down the biological clock, was successful. The research provides a “proof-of-principle” that this “feasible and safe” approach can effectively “improve health span.”

[Image]

Pictured are Maria A. Blasco and Bruno M. Bernardes de Jesús (co-author) in the CNIO building in Madrid. (Credit: CNIO)

A number of studies have shown that it is possible to lengthen the average life of individuals of many species, including mammals, by acting on specific genes. To date, however, this has meant altering the animals’ genes permanently from the embryonic stage — an approach impracticable in humans. Researchers at the Spanish National Cancer Research Centre (CNIO), led by its director María Blasco, have demonstrated that the mouse lifespan can be extended by the application in adult life of a single treatment acting directly on the animal’s genes. And they have done so using gene therapy, a strategy never before employed to combat aging. The therapy has been found to be safe and effective in mice.

The results were recently published in the journal EMBO Molecular Medicine. The CNIO team, in collaboration with Eduard Ayuso and Fátima Bosch of the Centre of Animal Biotechnology and Gene Therapy at the Universitat Autònoma de Barcelona (UAB), treated adult (one-year-old) and aged (two-year-old) mice, with the gene therapy delivering a “rejuvenating” effect in both cases, according to the authors.

Mice treated at the age of one lived longer by 24% on average, and those treated at the age of two, by 13%. The therapy, furthermore, produced an appreciable improvement in the animals’ health, delaying the onset of age-related diseases — like osteoporosis and insulin resistance — and achieving improved readings on aging indicators like neuromuscular coordination.

The gene therapy consisted of treating the animals with a DNA-modified virus, the viral genes having been replaced by those of the telomerase enzyme, with a key role in aging. Telomerase repairs the extreme ends or tips of chromosomes, known as telomeres, and in doing so slows the cell’s and therefore the body’s biological clock. When the animal is infected, the virus acts as a vehicle depositing the telomerase gene in the cells.

This study “shows that it is possible to develop a telomerase-based anti-aging gene therapy without increasing the incidence of cancer,” the authors affirm. “Aged organisms accumulate damage in their DNA due to telomere shortening, [this study] finds that a gene therapy based on telomerase production can repair or delay this kind of damage,” they add.

'Resetting' the biological clock

Telomeres are the caps that protect the end of chromosomes, but they cannot do so indefinitely: each time the cell divides the telomeres get shorter, until they are so short that they lose all functionality. The cell, as a result, stops dividing and ages or dies. Telomerase gets around this by preventing telomeres from shortening or even rebuilding them. What it does, in essence, is stop or reset the cell’s biological clock.
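To make the clock metaphor concrete, here is a minimal toy model (my own illustration, not part of the CNIO study; every number is an arbitrary placeholder) of how telomeres erode with each cell division and how telomerase activity can offset that erosion and postpone senescence.

```python
# Toy model of telomere shortening (illustrative only; all values are arbitrary placeholders).

def divisions_until_senescence(initial_length=10000, loss_per_division=100,
                               telomerase_gain=0, critical_length=4000,
                               max_divisions=500):
    """Count cell divisions before telomere length drops below a critical value."""
    length = initial_length
    for division in range(1, max_divisions + 1):
        length -= loss_per_division      # telomere erosion at each division
        length += telomerase_gain        # partial re-elongation by telomerase
        if length <= critical_length:
            return division              # cell stops dividing (senescence)
    return max_divisions                 # still dividing when the simulation stops

print(divisions_until_senescence(telomerase_gain=0))    # no telomerase: clock runs out quickly
print(divisions_until_senescence(telomerase_gain=60))   # partial telomerase activity: clock slowed
print(divisions_until_senescence(telomerase_gain=100))  # full compensation: clock effectively reset
```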

But in most cells the telomerase gene is only active before birth; the cells of an adult organism, with few exceptions, have no telomerase. The exceptions in question are adult stem cells and cancer cells, which divide limitlessly and are therefore immortal — in fact several studies have shown that telomerase expression is the key to the immortality of tumour cells.

It is precisely this risk of promoting tumour development that has set back the investigation of telomerase-based anti-aging therapies.

In 2007, Blasco’s group demonstrated that it was feasible to prolong the lives of transgenic mice, whose genome had been permanently altered at the embryonic stage, by causing their cells to express telomerase and, also, extra copies of cancer-resistant genes. These animals live 40% longer than is normal and do not develop cancer.

The mice subjected to the gene therapy now under test are likewise free of cancer. Researchers believe this is because the therapy begins when the animals are adults, so they do not have time to accumulate a sufficient number of aberrant divisions for tumours to appear.

Also important is the kind of virus employed to carry the telomerase gene to the cells. The authors selected demonstrably safe viruses that have been successfully used in gene therapy treatment of hemophilia and eye disease. Specifically, they are non-replicating viruses derived from others that are non-pathogenic in humans.

This study is viewed primarily as “a proof-of-principle that telomerase gene therapy is a feasible and generally safe approach to improve healthspan and treat disorders associated with short telomeres,” state Virginia Boccardi (Second University of Naples) and Utz Herbig (New Jersey Medical School-University Hospital Cancer Centre) in a commentary published in the same journal.

Although this therapy may not find application as an anti-aging treatment in humans, in the short term at least, it could open up a new treatment option for ailments linked with the presence in tissue of abnormally short telomeres, as in some cases of human pulmonary fibrosis.

More healthy years

As Blasco says, “aging is not currently regarded as a disease, but researchers tend increasingly to view it as the common origin of conditions like insulin resistance or cardiovascular disease, whose incidence rises with age. In treating cell aging, we could prevent these diseases.”

With regard to the therapy under testing, Bosch explains: “Because the vector we use expresses the target gene (telomerase) over a long period, we were able to apply a single treatment. This might be the only practical solution for an anti-aging therapy, since other strategies would require the drug to be administered over the patient’s lifetime, multiplying the risk of adverse effects.”

Source: Science Daily

May 15, 2012 · 18 notes
#science #neuroscience #brain #psychology
Smoked Cannabis Reduces Some Symptoms of Multiple Sclerosis

May 14th, 2012

Controlled trial shows improved spasticity, reduced pain after smoking medical marijuana.

A clinical study of 30 adult patients with multiple sclerosis (MS) at the University of California, San Diego School of Medicine has shown that smoked cannabis may be an effective treatment for spasticity – a common and disabling symptom of this neurological disease.

The placebo-controlled trial also resulted in reduced perception of pain, although participants also reported short-term, adverse cognitive effects and increased fatigue. The study will be published in the Canadian Medical Association Journal on May 14.

Principal investigator Jody Corey-Bloom, MD, PhD, professor of neurosciences and director of the Multiple Sclerosis Center at UC San Diego, and colleagues randomly assigned participants to either the intervention group (which smoked cannabis once daily for three days) or the control group (which smoked identical placebo cigarettes, also once a day for three days). After an 11-day interval, the participants crossed over to the other group.

“We found that smoked cannabis was superior to placebo in reducing symptoms and pain in patients with treatment-resistant spasticity, or excessive muscle contractions,” said Corey-Bloom.

Earlier reports suggested that the active compounds of medical marijuana were potentially effective in treating neurologic conditions, but most studies focused on orally administered cannabinoids. There were also anecdotal reports from MS patients who endorsed smoking marijuana to relieve symptoms of spasticity.

However, this trial used a more objective measurement, a modified Ashworth scale, which graded the intensity of muscle tone by measuring such things as resistance in range of motion and rigidity. The secondary outcome, pain, was measured using a visual analogue scale. The researchers also looked at physical performance (using a timed walk) and cognitive function and – at the end of each visit – asked patients to assess their feeling of “highness.”

Although generally well tolerated, smoking cannabis did have mild effects on attention and concentration. The researchers noted that larger, long-term studies are needed to confirm their findings and determine whether lower doses can result in beneficial effects with less cognitive impact.

The current study is the fifth clinical test of the possible efficacy of cannabis for clinical use reported by the University of California Center for Medicinal Cannabis Research (CMCR). Four other human studies on control of neuropathic pain also reported positive results.

Source: Neuroscience News

May 15, 2012 · 8 notes
#science #neuroscience #brain #psychology
New Type of Retinal Prosthesis Could Better Restore Sight to Blind

May 14th, 2012

Using tiny solar-panel-like cells surgically placed underneath the retina, scientists at the Stanford University School of Medicine have devised a system that may someday restore sight to people who have lost vision because of certain types of degenerative eye diseases.

This device — a new type of retinal prosthesis — involves a specially designed pair of goggles, which are equipped with a miniature camera and a pocket PC that is designed to process the visual data stream. The resulting images would be displayed on a liquid crystal microdisplay embedded in the goggles, similar to what’s used in video goggles for gaming. Unlike the regular video goggles, though, the images would be beamed from the LCD using laser pulses of near-infrared light to a photovoltaic silicon chip — one-third as thin as a strand of hair — implanted beneath the retina.

Electric currents from the photodiodes on the chip would then trigger signals in the retina, which then flow to the brain, enabling a patient to regain vision.
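As a rough sketch of that signal chain (this is not the Stanford group's code; the grid size, current scale and threshold below are invented placeholders), a camera frame could be downsampled to the implant's pixel grid, beamed as near-infrared intensities, and converted by each photovoltaic pixel into a stimulation current:

```python
# Illustrative sketch of the camera -> NIR display -> photovoltaic chip chain.
# All parameters are hypothetical placeholders, not the actual device specifications.
import numpy as np

rng = np.random.default_rng(1)
camera_frame = rng.random((480, 640))          # stand-in for a grayscale camera image (0..1)

IMPLANT_GRID = (60, 80)                        # hypothetical photodiode array resolution
MAX_CURRENT_UA = 50.0                          # hypothetical peak current per pixel (microamps)
THRESHOLD_UA = 5.0                             # hypothetical current needed to evoke a response

# Downsample the camera image to the implant grid by block-averaging
block_h = camera_frame.shape[0] // IMPLANT_GRID[0]
block_w = camera_frame.shape[1] // IMPLANT_GRID[1]
blocks = camera_frame[:IMPLANT_GRID[0] * block_h, :IMPLANT_GRID[1] * block_w]
blocks = blocks.reshape(IMPLANT_GRID[0], block_h, IMPLANT_GRID[1], block_w)
nir_intensity = blocks.mean(axis=(1, 3))       # NIR brightness beamed to each chip pixel

# Photovoltaic conversion: current scales with incident NIR intensity
pixel_current_ua = nir_intensity * MAX_CURRENT_UA
stimulated = pixel_current_ua > THRESHOLD_UA   # pixels delivering suprathreshold current

print(f"{stimulated.mean():.0%} of implant pixels above stimulation threshold")
```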

A study, to be published online May 13 in Nature Photonics, discusses how scientists tested the photovoltaic stimulation using the prosthetic device’s diode arrays in rat retinas in vitro and how they elicited electric responses, which are widely accepted indicators of visual activity, from retinal cells. The scientists are now testing the system in live rats, taking both physiological and behavioral measurements, and are hoping to find a sponsor to support tests in humans.

“It works like the solar panels on your roof, converting light into electric current,” said Daniel Palanker, PhD, associate professor of ophthalmology and one of the paper’s senior authors. “But instead of the current flowing to your refrigerator, it flows into your retina.” Palanker is also a member of the Hansen Experimental Physics Laboratory at Stanford and of the interdisciplinary Stanford research program, Bio-X. The study’s other senior author is Alexander Sher, PhD, of the Santa Cruz Institute of Particle Physics at UC Santa Cruz; its co-first authors are Keith Mathieson, PhD, a visiting scholar in Palanker’s lab, and James Loudin, PhD, a postdoctoral scholar. Palanker and Loudin jointly conceived and designed the prosthesis system and the photovoltaic arrays.

[Image]

This pinpoint-sized photovoltaic chip (upper right corner) is implanted under the retina in a blind rat to restore sight. The center image shows how the chip is composed of an array of photodiodes, which can be activated by pulsed near-infrared light to stimulate neural signals in the eye that then propagate to the brain. A higher magnification view (lower left corner) shows a single pixel of the implant, which has three diodes around the perimeter and an electrode in the center. The diodes turn light into an electric current which flows from the chip into the inner layer of retinal cells. Adapted from Stanford image courtesy of the Daniel Palanker lab.


May 15, 2012 · 7 notes
#science #neuroscience #brain #vision #psychology
Sleepwalking more prevalent among US adults than previously suspected

May 14, 2012

What goes bump in the night? In many U.S. households: people. That’s according to new Stanford University School of Medicine research, which found that about 3.6 percent of U.S. adults are prone to sleepwalking. The work also showed an association between nocturnal wanderings and certain psychiatric disorders, such as depression and anxiety.

The study, the researchers noted, “underscores the fact that sleepwalking is much more prevalent in adults than previously appreciated.”

Maurice Ohayon, MD, DSc, PhD, professor of psychiatry and behavioral sciences, is the lead author of the paper, which will appear in the May 15 issue of Neurology, the medical journal of the American Academy of Neurology.

Sleepwalking is a disorder “of arousal from non-REM sleep.” While wandering around at night can be harmless and is often played for laughs — anyone remember the Simpsons episode where Homer began wandering around and doing silly things in his sleep? — sleepwalking can have serious consequences. Episodes can result in injuries to the wanderer or others and lead to impaired psychosocial functioning.

It is thought that medication use and certain psychological and psychiatric conditions can trigger sleepwalking, but the exact causes are unknown. Also unclear to experts in the field is the prevalence.

"Apart from a study we did 10 years ago in the European general population, where we reported a prevalence of 2 percent of sleepwalking," the researchers wrote in their paper, "there are nearly no data regarding the prevalence of nocturnal wanderings in the adult general population. In the United States, the only prevalence rate was published 30 years ago."

For this study, the first to use a large, representative sample of the U.S. general population to demonstrate the number of sleepwalkers, the researchers also aimed to evaluate the importance of medication use and mental disorders associated with sleepwalking. Ohayon and his colleagues secured a sample of 19,136 individuals from 15 states and then used phone surveys to gather information on participants’ mental health, medical history and medication use.

Participants were asked specific questions related to sleepwalking, including frequency of episodes during sleep, duration of the sleep disorder and any inappropriate or potentially dangerous behaviors during sleep. Those who didn’t report any episodes in the last year were asked if they had sleepwalked during their childhood. Participants were also queried about whether there was a family history of sleepwalking and whether they had other parasomnia symptoms, such as sleep terrors and violent behaviors during sleep.

The researchers determined that as many as 3.6 percent of the sample reported at least one episode of sleepwalking in the previous year, with 1 percent saying they had two or more episodes in a month. Because of the number of respondents who reported having episodes during childhood or adolescence, lifetime prevalence of sleepwalking was found to be 29.2 percent.

The study also showed that people with depression were 3.5 times more likely to sleepwalk than those without, and people with alcohol abuse/dependence or obsessive-compulsive disorder were also significantly more likely to have sleepwalking episodes. In addition, individuals taking SSRI antidepressants were three times more likely to sleepwalk twice a month or more than those who didn’t.

"There is no doubt an association between nocturnal wanderings and certain conditions, but we don’t know the direction of the causality," said Ohayon. "Are the medical conditions provoking sleepwalking, or is it vice versa? Or perhaps it’s the treatment that is responsible."

Although more research is needed, the work could help raise awareness of this association among primary care physicians. “We’re not expecting them to diagnose sleepwalking, but they might detect symptoms that could be indices of sleepwalking,” said Ohayon.

Among the researchers’ other findings:

  • The duration of sleepwalking was mostly chronic, with just over 80 percent of those who have sleepwalked reporting they’ve done so for more than five years.
  • Sleepwalking was not associated with gender and seemed to decrease with age.
  • Nearly one-third of individuals with nocturnal wandering had a family history of the disorder.
  • People using over-the-counter sleeping pills had a higher likelihood of reporting sleepwalking episodes at least two times per month. (Indeed, a sleeping pill was the trigger for Homer Simpson’s middle-of-the-night shenanigans.)

D. Léger, MD, PhD, from the Université Paris Descartes in France, was senior author of the study. Researchers from the University of Minnesota Medical School, the Hôpital Gui-de-Chauliac in Montpellier, France, and Duke University School of Medicine were also involved.

Provided by Stanford University Medical Center

Source: medicalxpress.com

May 15, 2012 · 3 notes
#science #neuroscience #brain #psychology #depression #anxiety
Brain circuitry is different for women with anorexia and obesity

May 14, 2012

Why does one person become anorexic and another obese? A study recently published by a University of Colorado School of Medicine researcher shows that reward circuits in the brain are sensitized in anorexic women and desensitized in obese women. The findings also suggest that eating behavior is related to brain dopamine pathways involved in addictions.

Guido Frank, MD, assistant professor and director of the Developmental Brain Research Program at the CU School of Medicine, and his colleagues used functional magnetic resonance imaging (fMRI) to examine brain activity in 63 women who were either anorexic or obese. Scientists compared them to women considered “normal” weight. The participants were visually conditioned to associate certain shapes with either a sweet or a non-sweet solution and then received the taste solutions expectedly or unexpectedly. This task has been associated with brain dopamine function in the past.

The authors found that during these fMRI sessions, an unexpected sweet-tasting solution resulted in increased neural activation of reward systems in the anorexic patients and diminished activation in obese individuals. In rodents, food restriction and weight loss have been associated with greater dopamine-related reward responses in the brain.

"It is clear that in humans the brain’s reward system helps to regulate food intake" said Frank. "The specific role of these networks in eating disorders such as anorexia nervosa and, conversely, obesity, remains unclear.”

Scientists agree that more research is needed in this area. The study was published in Neuropsychopharmacology.

Provided by University of Colorado Denver

Source: medicalxpress.com

May 15, 2012 · 25 notes
#science #neuroscience #brain #psychology #anorexia #obesity
How to minimize stroke damage

May 14, 2012

Following a stroke, factors as varied as blood sugar, body temperature and position in bed can affect patient outcomes, Loyola University Medical Center researchers report.

In a review article in the journal MedLink Neurology, first author Murray Flaster, MD, PhD, and colleagues summarize the latest research on caring for ischemic stroke patients. (Most strokes are ischemic, meaning they are caused by blood clots.)

"The period immediately following an acute ischemic stroke is a time of significant risk,” the Loyola neurologists write. “Meticulous attention to the care of the stroke patient during this time can prevent further neurologic injury and minimize common complications, optimizing the chance of functional recovery.”

Stroke care has two main objectives – minimizing injury to brain tissue and preventing and treating the many neurologic and medical complications that can occur just after a stroke.

The authors discuss the many complex factors that affect outcomes. For example, there is considerable evidence of a link between hyperglycemia (high blood sugar) and poor outcomes after stroke. The authors recommend strict blood sugar control, using frequent finger-stick glucose checks and aggressive insulin treatment.

For each 1 degree C increase in the body temperature of stroke patients, the risk of death or severe disability more than doubles. Therapeutic cooling has been shown to help cardiac arrest patients, and clinical trials are underway to determine whether such cooling could also help stroke patients. Until those trials are completed, the goal should be to keep normal temperatures (between 95.9 and 99.5 degrees F).
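As a back-of-the-envelope illustration (my own, not from the review article), a risk that at least doubles per degree Celsius compounds multiplicatively, so even a modest fever implies a much larger relative risk:

```python
# Compound risk multiplier, assuming a risk ratio of at least 2 per degree Celsius
# (the article says the risk "more than doubles" per degree, so 2**delta_t is a lower bound).
for delta_t in (0.5, 1.0, 1.5, 2.0):
    multiplier = 2 ** delta_t
    print(f"+{delta_t:.1f} deg C -> risk multiplied by at least {multiplier:.1f}x")
```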

Position in bed also is important, because sitting upright decreases blood flow in the brain. A common practice is to keep the patient lying flat for 24 hours. If a patient has orthopnea (difficulty breathing while lying flat), the head of the bed should be kept at the lowest elevation the patient can tolerate.

The authors discuss many other issues in stroke care, including blood pressure management; blood volume; statin therapy; management of complications such as pneumonia and sepsis; heart attack and other cardiac problems; blood clots; infection; malnutrition and aspiration; brain swelling; seizures; recurrent stroke; and brain hemorrhages.

Studies have shown that hospital units that specialize in stroke care decrease mortality, increase the likelihood of being discharged to home and improve functional status and quality of life.

All patients should receive supportive care — including those who suffer major strokes and the elderly. “Even in these populations, the majority of patients will survive their stroke,” the authors write. “The degree of functional recovery, however, may be dramatically impacted by the intensity and appropriateness of supportive care.”

Provided by Loyola University Health System

Source: medicalxpress.com

May 15, 2012 · 14 notes
#science #neuroscience #brain #stroke #psychology
Brain oscillations reveal that our senses do not experience the world continuously

May 14, 2012

(Medical Xpress) — It has long been suspected that humans do not experience the world continuously, but rather in rapid snapshots.

Now, researchers at the University of Glasgow have demonstrated this is indeed the case. Just as the body goes through a 24-hour sleep-wake cycle controlled by a circadian clock, brain function undergoes such cyclic activity – albeit at a much faster rate.

Professor Gregor Thut of the Institute of Neuroscience and Psychology, said: “Rhythms are intrinsic to biological systems. The circadian rhythm, with its very slow periodicity of sleep and wake cycles every 24 hours has an obvious, periodic effect on bodily functions.

“Brain oscillations – the recurrent neural activity that we see in the brain – also show periodicity but cycle at much faster speeds. What we wanted to know was whether brain function was affected in a cyclic manner by these rapid oscillations.”

The researchers studied a prominent brain rhythm associated with visual cortex functioning that cycles at a rate of 10 times per second (10Hz).

They used a ‘simple trick’ to affect the oscillations of this rhythm, which involved presenting a brief sound to ‘reset’ the oscillation.

Testing subsequent visual perception, by using transcranial magnetic stimulation of the visual cortex, revealed a cyclic pattern at the very rapid rate of brain oscillations, in time with the underlying brainwaves.

Prof Thut said: “Rhythmicity therefore is indeed omnipresent not only in brain activity but also brain function. For perception, this means that despite experiencing the world as a continuum, we do not sample our world continuously but in discrete snapshots determined by the cycles of brain rhythms.”
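A minimal simulation can illustrate the idea (an illustrative sketch only, not the authors' analysis; the detection probabilities below are invented): if a sound resets the phase of a 10 Hz oscillation and visual excitability waxes and wanes with that phase, then detection rates measured at different delays after the sound should rise and fall in roughly 100-millisecond cycles.

```python
# Sketch of phase-reset cyclic sampling (illustrative values, not experimental data).
import numpy as np

rng = np.random.default_rng(0)
freq_hz = 10.0                        # the alpha-band rhythm discussed in the article
delays = np.arange(0.0, 0.4, 0.01)    # probe delays after the sound, in seconds

# Hypothetical phase-dependent detection probability: the sound resets phase to 0
phase = 2 * np.pi * freq_hz * delays
p_detect = 0.5 + 0.2 * np.cos(phase)  # excitability waxes and wanes once per 100 ms cycle

# Simulate 200 trials per delay and estimate the detection rate at each delay
detected = rng.random((len(delays), 200)) < p_detect[:, None]
rates = detected.mean(axis=1)

for d, r in zip(delays[::4], rates[::4]):
    print(f"{d * 1000:5.0f} ms after sound: detection rate {r:.2f}")
```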

The research, ‘Sounds reset rhythms of visual cortex and corresponding human visual perception’ is published in the journal Current Biology.

Provided by University of Glasgow

Source: medicalxpress.com

May 15, 2012 · 68 notes
#science #neuroscience #brain #psychology
Let there be light: It's good for our brains

May 14, 2012 By Sandy Evangelista

(Medical Xpress) — Swiss scientists have proven that light intensity influences our cognitive performance and how alert we feel, and that these positive effects last until early evening.

[Image]

Credit: 2012 EPFL

Tests conducted in EPFL’s Solar Energy and Building Physics Laboratory (LESO) have confirmed the hypothesis that light influences our subjective feeling of sleepiness. The research team, led by Mirjam Münch, also showed that the effects of light exposure last until the early evening, and that light intensity has an impact on cognitive mechanisms. The results of this research were recently published in the journal Behavioral Neuroscience.

Light synchronizes our biological clocks. It is detected in the eye by photoreceptors that contain the photopigment melanopsin (a pigment that changes when exposed to light). These cells, which differ from rods and cones, are considered a third class of photoreceptor in the retina and were discovered just ten years ago. They are not there to form an image, but to perceive and absorb photons in the visible light spectrum. In addition, they are particularly sensitive to blue light.

Exploring office lighting

Münch and her team wanted to know how our circadian rhythm could be influenced by our perception of light during the daytime. They created realistic office lighting conditions and recruited 29 young participants. “For this study, we took into account the intensity of natural and artificial light without specifically evaluating their spectra.”

From daytime to dusk

To synchronize their internal biological clocks, the volunteers had to maintain a regular sleep schedule during the seven days leading up to the test. They wore bracelets equipped with light sensors and accelerometers, so that the scientists could monitor their movements.

The study itself took place over two eight-hour sessions. The participants spent the first six hours in an experiment room, first in well-lighted conditions (1000-2000 lux, more or less equivalent to natural light in a room). In the second session, the light intensity was about 170 lux, which is what the eye perceives in a room without a window, lit with artificial light. For this experiment, light intensity was measured at eye-level. Every 30 minutes, the subjects were asked to assess how alert or sleepy they felt.

Finally, at the end of each session, the participants underwent two hours of supplemental memory tests in a darkened room – less than 6 lux. During these last two hours, the researchers took saliva samples in order to measure cortisol and melatonin concentrations. These two hormones are produced in a 24-hour cycle by the human body.

Boosted by the light

The volunteers who were subjected to higher light intensity during the afternoon were more alert all the way into the early evening. When they were subjected to light intensity ten times weaker, however, they showed signs of sleepiness and obtained lower scores on the memory tests.

These results were observed even in the absence of changes in cortisol and melatonin concentrations in their saliva. “With this study, we have discovered that light intensity has a direct effect on the subjective feeling of sleepiness as well as on objective cognitive performance, and that the benefits of more intense light during the daytime last long past the time of exposure,” concludes Münch.

Provided by Ecole Polytechnique Federale de Lausanne

Source: medicalxpress.com

May 14, 2012 · 12 notes
#science #neuroscience #brain #psychology
Powerful Function of Single Protein That Controls Neurotransmission Discovered

ScienceDaily (May 13, 2012) — Scientists at Weill Cornell Medical College have discovered that a single protein — alpha 2 delta — exerts a spigot-like function, controlling the volume of neurotransmitters and other chemicals that flow between the synapses of brain neurons. The study, published online in Nature, shows how brain cells talk to each other through these signals, relaying thoughts, feelings and actions, and how this powerful molecule plays a crucial role in regulating effective communication.

In the study, the investigators also suggest how the widely used pain drug Lyrica might work. The alpha 2 delta protein is the target of this drug and the new work suggests an approach to how other drugs could be developed that effectively twist particular neurotransmitter spigots on and off to treat neurological disorders. The research findings surprised the research team, which includes scientists from University College London.

"We are amazed that any single protein has such power," says the study’s lead investigator Dr. Timothy A. Ryan, professor of Biochemistry and associate professor of Biochemistry in Anesthesiology at Weill Cornell Medical College. "It is indeed rare to identify a biological molecule’s function that is so potent, that seems to be controlling the effectiveness of neurotransmission."

The researchers found that alpha 2 delta determines how many calcium channels will be present at the synaptic junction between neurons. The transmission of chemical signals is triggered at the synapse by the entry of calcium into these channels, so the volume and speed of neurotransmission depends on the availability of these channels.

Researchers discovered that taking away alpha 2 delta from brain cells prevented calcium channels from getting to the synapse. “But if you add more alpha 2 delta, you can triple the number of channels at synapses,” Dr. Ryan says. “This change in abundance was tightly linked to how well synapses carry out their function, which is to release neurotransmitters.”

Before this study, it was known that Lyrica, which is used for neuropathic pain, seizures and fibromyalgia, binds to alpha 2 delta, but little was understood about how this protein works to control synapses.


May 14, 2012 · 9 notes
#science #neuroscience #brain #psychology
Vitamin K2: New Hope for Parkinson's Patients?

ScienceDaily (May 11, 2012) — Neuroscientist Patrik Verstreken, associated with VIB and KU Leuven, succeeded in undoing the effect of one of the genetic defects that leads to Parkinson’s using vitamin K2. His discovery gives hope to Parkinson’s patients.

[Image]

Male fruit fly (Drosophila melanogaster). Scientists have succeeded in undoing the effect of one of the genetic defects that leads to Parkinson’s using vitamin K2. The research was done in fruit flies. (Credit: © Studiotouch / Fotolia)

This research was done in collaboration with colleagues from Northern Illinois University (US) and was recently published in the journal Science.

"It appears from our research that administering vitamin K2 could possibly help patients with Parkinson’s. However, more work needs to be done to understand this better," says Patrik Verstreken.

Malfunctioning power plants are at the root of Parkinson’s.

If we looked at cells as small factories, then mitochondria would be the power plants responsible for supplying the energy for their operation. They generate this energy by transporting electrons. In Parkinson’s patients, the activity of mitochondria and the transport of electrons have been disrupted, resulting in the mitochondria no longer producing sufficient energy for the cell. This has major consequences as the cells in certain parts of the brain will start dying off, disrupting communication between neurons. The results are the typical symptoms of Parkinson’s: lack of movement (akinesia), tremors and muscle stiffness.

The exact cause of this neurodegenerative disease is not known. In recent years, however, scientists have been able to describe several genetic defects (mutations) found in Parkinson’s patients, including the so-called PINK1 and Parkin mutations, which both lead to reduced mitochondrial activity. By studying these mutations, scientists hope to unravel the mechanisms underlying the disease process.

Paralyzed fruit flies

Fruit flies (Drosophila) are frequently used in lab experiments because of their short life spans and breeding cycles, among other things. Within two weeks of emerging, every female is able to produce hundreds of offspring. By genetically modifying fruit flies, scientists can study the function of certain genes and proteins. Patrik Verstreken and his team used fruit flies with a genetic defect in PINK1 or Parkin that is similar to the one associated with Parkinson’s. They found that the flies with a PINK1 or Parkin mutation lost their ability to fly.

Upon closer examination, they discovered that the mitochondria in these flies were defective, just as in Parkinson’s patients. Because of this they generated less intracellular energy — energy the insects needed to fly. When the flies were given vitamin K2, the energy production in their mitochondria was restored and the insects’ ability to fly improved. The researchers were also able to determine that the energy production was restored because the vitamin K2 had improved electron transport in the mitochondria. This in turn led to improved energy production.

Conclusion

Vitamin K2 plays a role in the energy production of defective mitochondria. Because defective mitochondria are also found in Parkinson’s patients with a PINK1 or Parkin mutation, vitamin K2 potentially offers hope for a new treatment for Parkinson’s.

Source: Science Daily

May 14, 2012 · 8 notes
#science #neuroscience #brain #psychology #parkinson
Gene therapy for hearing loss: Potential and limitations

May 11, 2012

Regenerating sensory hair cells, which produce electrical signals in response to vibrations within the inner ear, could form the basis for treating age- or trauma-related hearing loss. One way to do this could be with gene therapy that drives new sensory hair cells to grow.

Researchers at Emory University School of Medicine have shown that introducing a gene called Atoh1 into the cochleae of young mice can induce the formation of extra sensory hair cells.

Their results show the potential of a gene therapy approach, but also demonstrate its current limitations. The extra hair cells produce electrical signals like normal hair cells and connect with neurons. However, after the mice are two weeks old, which is before puberty, inducing Atoh1 has little effect. This suggests that an analogous treatment in adult humans would also not be effective by itself.

The findings were published May 9 in the Journal of Neuroscience.

"We’ve shown that hair cell regeneration is possible in principle," says Ping Chen, PhD, associate professor of cell biology at Emory University School of Medicine. “In this paper, we have identified which cells are capable of becoming hair cells under the influence of Atoh1, and we show that there are strong age-dependent limitations on the effects of Atoh1 by itself.”

The first author of the paper, Michael Kelly, now a postdoctoral fellow at the National Institute on Deafness and Other Communication Disorders, was a graduate student in Emory’s Neuroscience program.

Kelly and his coworkers engineered mice to turn on the Atoh1 gene in the inner ear in response to the antibiotic doxycycline. Previous experimenters had used a virus to introduce Atoh1 into the cochleae of animals. This approach resembles gene therapy, but has the disadvantage of being slightly different each time, Chen says. In contrast, the mice have the Atoh1 gene turned on in specific cells along the lining of the inner ear, called the cochlear epithelium, but only when fed doxycycline.

Young mice given doxycycline for two days had extra sensory hair cells, both in the parts of the cochlea where developing hair cells usually appear and in additional locations (see accompanying image).

The extra hair cells could generate electrical signals, although those signals weren’t as strong as those of mature hair cells. The extra hair cells also appeared to attract neuronal fibers, which suggests that their signals could connect to the rest of the nervous system.

"They can generate electrical signals, but we don’t know if they can really function in the context of hearing.” Chen says. “For that to happen, the hair cells’ signals need to be coordinated and integrated.”

Although doxycycline could turn on Atoh1 all over the surface of the cochlea, extra sensory hair cells did not appear everywhere. When Chen’s team removed cochleae from the mice and grew them in culture dishes, they were able to provoke even more hair cells to grow by adding a drug that inhibits the Notch pathway.

Manipulating the Notch pathway affects several aspects of embryonic development and in some contexts appears to cause cancer, so the approach needs to be refined further. Chen says that it may be possible to unlock the age-related limits on hair cell regeneration by supplying additional genes or drugs in combination with Atoh1, and the results with the Notch drug provide an example.

"Our future goals are to develop approaches to stimulate hair cell formation in older animals, and to examine functional recovery after Atoh1 induction," she says.

Provided by Emory University

Source: medicalxpress.com

May 14, 2012 · 4 notes
#science #neuroscience #brain #psychology
Study raises questions about use of anti-epilepsy drugs in newborns

May 11, 2012

A brain study in infant rats demonstrates that the anti-epilepsy drug phenobarbital stunts neuronal growth, which could prompt new questions about using the first-line drug to treat epilepsy in human newborns.

In Annals of Neurology EarlyView posted online May 11, researchers at Georgetown University Medical Center (GUMC) report that the anti-epilepsy drug phenobarbital given to rat pups about a week old changed the way the animals’ brains were wired, causing cognitive abnormalities later in life.

The researchers say it has been known that some of the drugs used to treat epilepsy increase the number of neurons that die shortly after birth in the rat brain, but, until this study, no one had shown whether this action had any adverse impact on subsequent brain development.

"Our study is the first to show that the exposure to these drugs — and just a single exposure — can prevent brain circuits from developing their normal connectivity, meaning they may not be wired correctly, which can have long-lasting effects on brain function,” says the study’s senior investigator, Karen Gale, Ph.D., a professor of pharmacology at GUMC. “These findings suggest that in the growing brain, these drugs are not as benign as one would like to believe.”

For their study, the Georgetown researchers studied four agents including phenobarbital.

"The good news is not all anti-epilepsy drugs have this disruptive effect in the animal studies," Gale says.

The researchers found that the anti-epilepsy drug levetiracetam did not stunt synaptic growth. Animals treated with a third drug, lamotrigine, showed neural maturation, but it was delayed. An additional finding involved melatonin. When added to phenobarbital, it appeared to prevent the persistent adverse neural effects in the rat pups. Melatonin has been used clinically to protect cells from injury in humans.

"Many clinicians have been advocating for a reexamination of the use of these drugs in infants, and our findings provide experimental data to support that need," says the study’s co-lead investigator, Patrick A. Forcelli, Ph.D., a postdoctoral fellow in the department of pharmacology and physiology at GUMC. "Phenobarbital has been used to treat seizures for over 100 years — well before a Food and Drug Administration approval process was established— and for more than 50 years, it has been the first drug of choice in the treatment of seizures in neonates."

May 14, 2012 · 1 note
#science #neuroscience #brain #epilepsy #psychology
Confirmation of repeated patterns of neurons indicates stereotypical organization throughout brain's cerebral cortex

May 11, 2012

Neurons are arranged in periodic patterns that repeat over large distances in two areas of the cerebral cortex, suggesting that the entire cerebral cortex has a stereotyped organization, reports a team of researchers led by Toshihiko Hosoya of the RIKEN Brain Science Institute. The entire cortex has a stereotypical layered structure with the same cell types arranged in the same way, but how neurons are organized in the other orientation—parallel to the brain’s surface—is poorly understood.

image

Figure 1: In the mouse visual cortex, neurons expressing id2 mRNA (magenta) are found in regularly repeating clusters. Reproduced from Ref. 1 © 2011 Hisato Maruoka et al., RIKEN Brain Science Institute

Hosoya and his colleagues therefore examined layer V (5) of the mouse cortex, which contains two classes of large pyramidal neurons that look identical but differ in the connections they form. One projects axons straight down to regions beneath the cortex; the other projects to the cortex on the opposite side of the brain.

First, the researchers examined expression of the id2 gene in cells of the visual cortex, because these cells form clusters in that part of the brain. They found that id2 is expressed in nearly all cells that project axons downward, but not in those that cross over. Hosoya and colleagues verified this by visualizing the connections of cells using fluorescent cholera toxin, which binds to cell membranes and travels along the axons.

Further examination of gene expression patterns in tissue slices revealed that the cells are arranged in clusters aligned perpendicular to the brain’s surface, and that the clusters are organized in a regular pattern, with the same basic unit repeating every thirty micrometers (Fig. 1). They also observed the same pattern in layer V of the somatosensory cortex, suggesting that this organization is common to all other areas.

By generating a strain of mutant mice expressing green fluorescent protein in the progenitor cells that produce the cells in layer V during brain development, Hosoya and his colleagues investigated the embryonic origin of these cells. This revealed that each cluster contains neurons that are produced by different progenitor cells.

Finally, the researchers showed that the regular pattern persists in the adult visual cortex, and that neurons in each cluster show the same activity patterns in response to visual stimulation. “Our preliminary data suggest that at least several other areas in the cortex have the same structure,” says Hosoya. “It’s likely that the entire cortex has the same organization, and I expect that the human cortex has the same structure.”

Provided by RIKEN

Source: medicalxpress.com

May 14, 2012 · 7 notes
#science #neuroscience #brain #neuron #psychology
Astrocytes found to bridge gap between global brain activity and localized circuits

May 11, 2012

Global network activity in the brain modulates local neural circuitry via calcium signaling in non-neuronal cells called astrocytes (Fig. 1), according to research led by Hajime Hirase of the RIKEN Brain Science Institute. The finding clarifies the link between two important processes in the brain.

image

Figure 1: Astrocytes are star-shaped cells with numerous fine projections that ensheath synapses in the brain. © 2012 Hajime Hirase

Activity in large-scale brain networks is thought to modulate changes in neuronal connectivity, so-called ‘synaptic plasticity’, in the cerebral cortex. The neurotransmitter acetylcholine regulates global brain activity associated with attention and awareness, and is involved in plasticity.

To investigate how these processes are linked, Hirase and his colleagues simultaneously stimulated the whiskers of mice and the nucleus basalis of Meynert (NBM), a basal forebrain structure containing neurons that synthesize acetylcholine and project widely to the cortex. Using electrodes and an imaging technique called two-photon microscopy, performed through a ‘cranial window’, they monitored the responses of cells in the barrel cortex, which receives inputs from the whiskers.

Recordings from the electrodes showed that repeated co-stimulation of the whiskers and NBM induced plasticity in the barrel cortex. This plasticity depended on two types of receptors—muscarinic acetylcholine receptors (mAChRs) and N-methyl-D-aspartic acid receptors (NMDARs). Two-photon imaging further revealed that activation of the mAChRs during co-stimulation elevated the concentration of calcium ions within astrocytes of the barrel cortex.

The researchers repeated these experiments in mutant mice lacking the receptor that controls the release of calcium ions in astrocytes. Since co-stimulation of whiskers and NBM did not induce plasticity in the mutants, Hirase and colleagues concluded that calcium signaling in astrocytes acts as a ‘gate’ linking the changes in global brain state induced by acetylcholine to activity in local cortical circuits.

Furthermore, the researchers found that stimulation of the NBM led to an increase in the extracellular concentration of the amino acid D-serine in the normal, but not the mutant, mice. D-serine is secreted by astrocytes and activates NMDARs. Hirase’s team had previously shown that astrocytes are electrically silent in living rodents even in the presence of neural activity (Ref. 2). The new findings showed that the biochemical, as opposed to electrical, activation of astrocytes induces them to release the transmitter that modulates synaptic plasticity in the neuronal circuitry.

“Our study is probably the first to show that calcium signaling in astrocytes is related to neuronal circuit plasticity in living animals,” says Hirase. “We are now studying if this type of calcium signaling occurs in all parts of an astrocyte or is restricted to some parts of the cell.”

Provided by RIKEN

Source: medicalxpress.com

May 14, 2012 · 12 notes
#science #neuroscience #brain #psychology
Mild traumatic brain injury may alter brain's neuronal circuit excitability and contribute to brain network dysfunction

May 11, 2012

Even mild head injuries can cause significant abnormalities in brain function that last for several days, which may explain the neurological symptoms experienced by some individuals who have experienced a head injury associated with sports, accidents or combat, according to a study by Virginia Commonwealth University School of Medicine researchers.

These findings, published in the May issue of the Journal of Neuroscience, advance research in the field of traumatic brain injury (TBI), enabling researchers to better understand what brain structural or functional changes underlie posttraumatic disorders – a question that until now has remained unclear.

Previous research has shown that even a mild case of TBI can result in long-lasting neurological issues that include slowing of cognitive processes, confusion, chronic headache, posttraumatic stress disorder and depression.

The VCU team, led by Kimberle M. Jacobs, Ph.D., associate professor in the Department of Anatomy and Neurobiology, demonstrated for the first time, using sophisticated bioimaging and electrophysiological approaches, that mild injury can cause structural disruption of axons in the brain while also changing the way the neurons fire in areas where they have not been structurally altered. Axons are nerve fibers in the brain responsible for conducting electrical impulses. The team used models of mild traumatic brain injury and followed morphologically identified neurons in live cortical slices.

“These findings should help move the field forward by providing a unique bioimaging and electrophysiological approach to assess the evolving changes evoked by mild TBI and their potential therapeutic modulation,” said co-investigator, John T. Povlishock, Ph.D., professor and chair of the VCU School of Medicine’s Department of Anatomy and Neurobiology and director of the Commonwealth Center for the Study of Brain Injury.

According to Povlishock, additional benefit may also derive from the use of this model system with repetitive injuries to determine if repeated insults exacerbate the observed abnormalities.

Provided by Virginia Commonwealth University

Source: medicalxpress.com

May 14, 2012 · 5 notes
#science #neuroscience #brain #psychology
Maternal Antibodies to Gluten Linked to Schizophrenia Risk in Children

May 11th, 2012

Babies born to women with sensitivity to gluten appear to be at increased risk for certain psychiatric disorders later in life, according to research by scientists at Karolinska Institutet in Sweden and Johns Hopkins Children’s Center in Baltimore.

The team’s findings, published in The American Journal of Psychiatry, add to a growing body of evidence that many “adult” diseases may take root before and shortly after birth.

“Lifestyle and genes are not the only factors that shape disease risk, and factors and exposures before, during and after birth can help pre-program much of our adult health,” said investigator Robert Yolken, M.D., a neuro-virologist at Johns Hopkins Children’s Center. “Our study is an illustrative example suggesting that a dietary sensitivity before birth could be a catalyst in the development of schizophrenia or a similar condition 25 years later.”

Maternal infections and other inflammatory disorders during pregnancy have long been linked to greater risk for schizophrenia in the offspring but, the Swedish and U.S. investigators say, this is the first study that points to maternal food sensitivity as a possible culprit in the development of such disorders. The findings establish a strong link but do not mean that gluten sensitivity will invariably cause schizophrenia, the investigators caution. The research, however, does suggest an intriguing new mechanism that may drive up risk and illuminate possible prevention strategies.

“Our research not only underscores the importance of maternal nutrition during pregnancy and its lifelong effects on the offspring, but also suggests one potential cheap and easy way to reduce risk if we were to find further proof that gluten sensitivity exacerbates or drives up schizophrenia risk,” said study lead investigator Håkan Karlsson, M.D., Ph.D., a neuroscientist at Karolinska Institutet and former neuro-virology fellow at Johns Hopkins.

The team’s findings are based on an examination of 764 birth records and neonatal blood samples of Swedes born between 1975 and 1985. Some 211 of them subsequently developed non-affective psychoses, such as schizophrenia and delusional disorders.

Using stored neonatal blood samples, the investigators measured levels of IgG antibodies to milk and wheat. IgG antibodies are markers of immune system reaction triggered by the presence of certain proteins. Because a mother’s antibodies cross the placenta during pregnancy to confer immunity to the baby, a newborn’s elevated IgG levels are proof of protein sensitivity in the mother.

Children born to mothers with abnormally high levels of antibodies to the wheat protein gluten had nearly twice the risk of developing schizophrenia later in life, compared with children who had normal levels of gluten antibodies. The link persisted even after researchers accounted for other factors known to increase schizophrenia risk, including maternal age, gestational age, mode of delivery and the mother’s immigration status. The risk for psychiatric disorders was not increased among those with elevated levels of antibodies to milk protein.
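
To make that "nearly twice the risk" figure concrete, here is a minimal sketch, using entirely synthetic data, of how an odds ratio for high maternal anti-gluten IgG might be estimated and then adjusted for a covariate such as maternal age with logistic regression. It is not the study's actual data or analysis, and all variable names and numbers are hypothetical.

# Illustrative only: synthetic data, not the Karolinska/Hopkins cohort.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 764                                   # cohort size mentioned in the article
high_igg = rng.binomial(1, 0.2, n)        # hypothetical exposure: high anti-gluten IgG
maternal_age = rng.normal(28, 5, n)       # hypothetical confounder

# Simulate case status with a built-in odds ratio of about 2 for high IgG.
log_odds = -1.5 + np.log(2.0) * high_igg + 0.02 * (maternal_age - 28)
case = rng.binomial(1, 1.0 / (1.0 + np.exp(-log_odds)))

# Unadjusted odds ratio from the 2x2 exposure-by-outcome table.
a = np.sum((case == 1) & (high_igg == 1))
b = np.sum((case == 0) & (high_igg == 1))
c = np.sum((case == 1) & (high_igg == 0))
d = np.sum((case == 0) & (high_igg == 0))
print("unadjusted odds ratio:", (a * d) / (b * c))

# Adjusted odds ratio for high IgG from a logistic regression including maternal age.
X = sm.add_constant(np.column_stack([high_igg, maternal_age]))
fit = sm.Logit(case, X).fit(disp=0)
print("adjusted odds ratio  :", np.exp(fit.params[1]))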

The researchers say the suspicion that food sensitivity in the mother can affect her child’s risk for psychiatric disorders stems from an observation made in the wake of World War II by U.S. Army researcher F. Curtis Dohan, M.D. Dohan noted that food scarcity in post-war Europe and wheat-poor diets led to notably fewer hospital admissions for schizophrenia. The link was merely observational, but it has piqued the curiosity of scientists ever since.

Researchers in the past also have observed that people diagnosed with schizophrenia have disproportionately high rates of celiac disease, a rare autoimmune disorder characterized by gluten sensitivity. Although it is a hallmark of the condition, gluten sensitivity alone is not enough to diagnose celiac disease. Other studies have found that some people with schizophrenia have gluten sensitivity without other signs of celiac disease, the researchers note.

Yolken and Karlsson say the team already is conducting follow-up studies to clarify how gluten or sensitivity to it increases schizophrenia risk and whether it does so only in those genetically predisposed.

Source: Neuroscience News

May 14, 2012 · 21 notes
#science #neuroscience #brain #psychology #schizophrenia
Neurodegeneration 'Switched Off' in Mice

ScienceDaily (May 10, 2012) — Researchers at the Medical Research Council (MRC) Toxicology Unit at the University of Leicester have identified a major pathway leading to brain cell death in mice with neurodegenerative disease. The team was able to block the pathway, preventing brain cell death and increasing survival in the mice.

image

Scientists have identified a major pathway leading to brain cell death in mice with neurodegenerative disease. The team was able to block the pathway, preventing brain cell death and increasing survival in the mice. (Credit: © pressmaster / Fotolia)

In human neurodegenerative diseases, including Alzheimer’s, Parkinson’s and prion diseases, proteins “mis-fold” in a variety of different ways resulting in the build up of mis-shapen proteins. These form the plaques found in Alzheimer’s and the Lewy bodies found in Parkinson’s disease.

The researchers studied mice with neurodegeneration caused by prion disease. These mouse models currently provide the best animal representation of human neurodegenerative disorders, where it is known that the build up of mis-shapen proteins is linked with brain cell death.

They found that the build up of mis-folded proteins in the brains of these mice activates a natural defense mechanism in cells, which switches off the production of new proteins. This would normally switch back ‘on’ again, but in these mice the continued build-up of mis-shapen protein keeps the switch turned ‘off’. This is the trigger point leading to brain cell death, as those key proteins essential for nerve cell survival are not made.

By injecting a protein that blocks the ‘off’ switch of the pathway, the scientists were able to restore protein production, independently of the build up of mis-shapen proteins, and halt the neurodegeneration. The brain cells were protected, protein levels and synaptic transmission (the way in which brain cells signal to each other) were restored and the mice lived longer, even though only a very small part of their brain had been treated.

Mis-shapen proteins in human neurodegenerative diseases such as Alzheimer’s and Parkinson’s also over-activate this fundamental pathway controlling protein synthesis in patients’ brains, making it a common target underlying these different clinical conditions. The scientists’ results suggest that treatments focused on this pathway could be protective in a range of neurodegenerative diseases in which mis-shapen proteins are building up and causing neurons to die.

Professor Giovanna Mallucci, who led the team, said, “What’s exciting is the emergence of a common mechanism of brain cell death, across a range of different neurodegenerative disorders, activated by the different mis-folded proteins in each disease. The fact that, in mice with prion disease, we were able to manipulate this mechanism and protect the brain cells means we may have a way forward in how we treat other disorders. Instead of targeting individual mis-folded proteins in different neurodegenerative diseases, we may be able to target the shared pathways and rescue brain cell degeneration irrespective of the underlying disease.”

Professor Hugh Perry, chair of the MRC’s Neuroscience and Mental Health Board, said, “Neurodegenerative diseases such as Alzheimer’s and Parkinson’s are debilitating and largely untreatable conditions. Alzheimer’s disease and related disorders affect over seven million people in Europe, and this figure is expected to double every 20 years as the population ages across Europe. The MRC believes that research such as this, which looks at the fundamental mechanisms of these devastating diseases, is absolutely vital. Understanding the mechanism that leads to neuronal dysfunction prior to neuronal loss is a critical step in finding ways to arrest disease progression.”

Source: Science Daily

May 14, 2012 · 9 notes
#science #neuroscience #brain #psychology #alzheimer #parkinson
Glial Cells Supply Nerve Fibers with Energy-Rich Metabolic Products

May 10th, 2012

Glial cells pass on metabolites to neurons.

Around 100 billion neurons in the human brain enable us to think, feel and act. They transmit electrical impulses to remote parts of the brain and body via long nerve fibres known as axons. This communication requires enormous amounts of energy, which the neurons are thought to generate from sugar. Axons are closely associated with glial cells which, on the one hand, surround them with an electrically insulating myelin sheath and, on the other hand, support their long-term function. Klaus-Armin Nave and his research group from the Max Planck Institute of Experimental Medicine in Göttingen have now discovered a possible mechanism by which these glial cells in the brain can support their associated axons and keep them alive in the long term.

Oligodendrocytes are a group of highly specialised glial cells in the central nervous system. They are responsible for forming the fat-rich myelin sheath that surrounds the nerve fibres as an insulating layer. The comparison with the coating on electricity cables is an obvious one; however, myelin does much more than insulate: it increases the transmission speed of the axons and also reduces ongoing energy consumption. The extreme importance of myelin for a functioning nervous system is shown by the diseases that arise from a defective insulating layer, such as multiple sclerosis.

Interestingly, the function of the oligodendrocytes goes far beyond the mere provision of myelin. Klaus-Armin Nave and his team at the Max Planck Institute in Göttingen already succeeded in demonstrating years ago that healthy glial cells are also essential for the long-term function and survival of the axons themselves, irrespective of myelination. “The way in which the oligodendrocytes functionally support their associated axons was not clear to us up to now,” says Nave. In a new study, the researchers were able to show that the glial cells are involved in, among other things, the replenishment of energy in the nerve fibres. “They could be described as the petrol stations on the data highway of the axons,” says Nave, explaining the results.

image

Electron microscope cross-section image of the nerve fibres (axons) of the optic nerve. Axons are surrounded by special glial cells, the oligodendrocytes, wrapping themselves around the axons in several layers. Between the axons, there are extensions of astrocytes, another type of glial cells. © K.-A.Nave/MPI f. Experimental Medicine

But how does the energy refuelling work? Is there a metabolic connection between the oligodendrocytes and axons? To find out, Ursula Fünfschilling generated genetically modified mice in which mitochondrial function was deliberately disrupted in the oligodendrocytes by inactivating the Cox10 gene. This affects the final stage of sugar breakdown in the mitochondria, the respiratory chain, where most of a cell’s energy is harnessed. If a link in this chain is missing (in this case cytochrome oxidase, which is only functional in cells that have the enzyme Cox10), the glial cells gradually lose the capacity for cellular respiration in their mitochondria. “Without independent respiration, the manipulated glial cells of the nervous system should have died,” explains the scientist. That is, unless the comparatively small amount of energy harnessed by splitting glucose into pyruvate or lactic acid (glycolysis, which yields only about two ATP molecules per glucose, compared with roughly thirty or more from full mitochondrial respiration) is sufficient for them.

And this is precisely what the scientists observed in their mice: the animals’ myelin was initially formed in the normal way. The loss of the mitochondrial respiratory chain, which set in at this point, did not appear to affect the glial cells in the central nervous system. Even one year later, no neurodegenerative changes in the brain were observed. The scientists assume that in the early weeks of life, a phase of maximum energy requirement, the mutated oligodendrocytes still rely on many intact mitochondria. Mature oligodendrocytes later appear to reduce mitochondrial respiration and switch to generating energy through increased glycolysis. In healthy glial cells this has the advantage that the metabolic products arising from the breakdown of glucose can be used as building blocks for myelin synthesis. In addition, the lactic acid produced in the oligodendrocytes can be passed on to the axons, where it can be used to produce energy with the help of the axons’ own mitochondria.

“The complete loss of the respiratory chain in the deliberately modified oligodendrocytes probably exaggerates a developmental step that unfolds naturally,” explains Nave. Thus the loss of glial mitochondria does not result in a deterioration of the energy supply to the axons but, on the contrary, in an oversupply of usable lactic acid. The affected nerve pathways demonstrably have no problem metabolising the lactic acid from the oligodendrocytes. Transport proteins ensure the rapid transfer of the lactic acid between the oligodendrocytes and their myelinated axons.

This finding provides a new understanding of the role of oligodendrocytes: in addition to their known significance for myelination, they can directly provide the axons with glucose products that can be used as fuel, with the help of axonal mitochondria, in periods of high activity. This metabolic coupling between glial cells and axons could explain, among other things, why in many myelin diseases, for example multiple sclerosis, the affected demyelinated axons often suffer irreversible damage.

Source: Neuroscience News

May 14, 2012 · 16 notes
#science #neuroscience #brain #psychology #neuron
Key Cellular Mechanisms Behind the Onset of Tinnitus Identified

ScienceDaily (May 10, 2012) — Research into hearing loss after exposure to loud noises could lead to the first drug treatments to prevent the development of tinnitus.

Researchers in the University of Leicester’s Department of Cell Physiology and Pharmacology have identified a cellular mechanism that could underlie the development of tinnitus following exposure to loud noises. The discovery could lead to novel tinnitus treatments, and investigations into potential drugs to prevent tinnitus are currently underway.

Tinnitus is a sensation of phantom sounds, usually ringing or buzzing, heard in the ears when no external noise is present. It commonly develops after exposure to loud noises (acoustic over-exposure), and scientists have speculated that it results from damage to nerve cells connected to the ears.

Although hearing loss and tinnitus affect around ten percent of the population, there are currently no drugs available to treat or prevent tinnitus.

University of Leicester researcher Dr Martine Hamann, who led the study published in the journal Hearing Research, said: “We need to know the implications of acoustic over exposure, not only in terms of hearing loss but also what’s happening in the brain and central nervous system. It’s believed that tinnitus results from changes in excitability in cells in the brain — cells become more reactive, in this case more reactive to an unknown sound.”

Dr Hamann and her team, including PhD student Nadia Pilati, looked at cells in an area of the brain called the dorsal cochlear nucleus — the relay carrying signals from nerve cells in the ear to the parts of the brain that decode and make sense of sounds. Following exposure to loud noises, some of the nerve cells (neurons) in the dorsal cochlear nucleus start to fire erratically, and this uncontrolled activity eventually leads to tinnitus.

Dr Hamann said: “We showed that exposure to loud sound triggers hearing loss a few days after the exposure to the sound. It also triggers this uncontrolled activity in the neurons of the dorsal cochlear nucleus. This is all happening very quickly, in a matter of days.”

In a key breakthrough in collaboration with GSK who sponsored Dr Pilati’s PhD, the team also discovered the specific cellular mechanism that leads to the neurons’ over-activity. Malfunctions in specific potassium channels that help regulate the nerve cell’s electrical activity mean the neurons cannot return to an equilibrium resting state.

Ordinarily, these cells fire in a regular pattern and reliably return to a resting state between firings. However, if the potassium channels are not working properly, the cells cannot return to a resting state and instead fire continuously in random bursts, creating the sensation of constant noise when none exists.

Dr Hamann explained: “In normal conditions the channel helps to drag down the cellular electrical activity to its resting state and this allows the cell to function with a regular pattern. After exposure to loud sound, the channel is functioning less and therefore the cell is constantly active, being unable to reach its resting state and displaying those irregular bursts.”
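
As a rough intuition for that description, the toy simulation below uses a generic leaky integrate-and-fire neuron, not the biophysical model from the study, and treats the potassium channel as a simple adaptation current that pulls the cell back toward rest after each spike. Weakening that current leaves the cell firing almost continuously under the same input.

# Toy sketch only: a leaky integrate-and-fire cell with an adaptation current
# standing in for the potassium channel that normally drags the cell back to rest.
def simulate(g_adapt, t_ms=500.0, dt=0.1, drive=1.6):
    """Count spikes over t_ms milliseconds of constant drive."""
    v, w, spikes = 0.0, 0.0, 0        # membrane potential, adaptation ('K+ brake'), count
    for _ in range(int(t_ms / dt)):
        v += (-v - g_adapt * w + drive) * dt / 10.0   # membrane time constant ~10 ms
        w += (-w) * dt / 100.0                        # the brake decays slowly (~100 ms)
        if v >= 1.0:                  # threshold crossed: register a spike and reset
            spikes += 1
            v = 0.0
            w += 1.0                  # each spike strengthens the brake
    return spikes

print("spikes with a normal 'potassium' brake  :", simulate(g_adapt=1.0))
print("spikes with a weakened 'potassium' brake:", simulate(g_adapt=0.1))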

Although many researchers have investigated the mechanisms underlying tinnitus, this is the first time that cellular bursting activity has been characterised and linked to specific potassium channels. Identifying the potassium channels involved in the early stages of tinnitus opens up new possibilities for preventing tinnitus with early drug treatments.

Dr Hamann’s team is currently investigating potential drugs that could regulate the damaged cells, preventing their erratic firing and returning them to a resting state. If suitable drug compounds are discovered, they could be given to patients who have been exposed to loud noises to protect them against the onset of tinnitus.

These investigations are still in the preliminary stages, and any drug treatment would still be years away.

Source: Science Daily

May 10, 2012 · 6 notes
#science #neuroscience #hearing #psychology #brain
Testosterone-Fueled Infantile Males Might Be a Product of Mom's Behavior

ScienceDaily (May 10, 2012) — By comparing the testosterone levels of five-month old pairs of twins, both identical and non-identical, University of Montreal researchers were able to establish that testosterone levels in infancy are not inherited genetically but rather determined by environmental factors.

image

Angry boy. Testosterone levels in infancy are not inherited genetically but rather determined by environmental factors, new research suggests. (Credit: © crestajohnson / Fotolia)

"Testosterone is a key hormone for the development of male reproductive organs, and it is also associated with behavioural traits, such as sexual behaviour and aggression," said lead author Dr. Richard E. Tremblay of the university’s Research Unit on Children’s Psychosocial Maladjustment. "Our study is the largest to be undertaken with newborns, and our results contrast with the findings gained by scientists working with adolescents and adults, indicating that testosterone levels are inherited."

The findings were presented in an article published in Psychoneuroendocrinology on May 7, 2012.

The researchers took saliva samples from 314 pairs of twins and measured the levels of testosterone. They then compared the similarity in testosterone levels between identical and fraternal twins to determine the contribution of genetic and environmental factors. Results indicated that differences in levels of testosterone were due mainly to environmental factors. “The study was not designed to specifically identify these environmental factors, which could include a variety of environmental conditions, such as maternal diet, maternal smoking, breastfeeding and parent-child interactions.”
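
The logic of that comparison can be sketched with the classic twin-design calculation (Falconer's approximation): identical twins share essentially all their genes, fraternal twins about half, so the gap between the two correlations indexes the genetic contribution. The numbers below are made up for illustration, and the study's actual statistical modelling was more sophisticated.

def ace_estimates(r_mz, r_dz):
    """Crude ACE decomposition from identical (MZ) and fraternal (DZ) twin correlations."""
    a2 = 2.0 * (r_mz - r_dz)   # additive genetic variance (heritability)
    c2 = r_mz - a2             # shared (family) environment
    e2 = 1.0 - r_mz            # unique environment plus measurement error
    return a2, c2, e2

# Hypothetical pattern like the one reported: identical and fraternal pairs are about
# equally similar, so almost none of the variance is attributed to genes.
a2, c2, e2 = ace_estimates(r_mz=0.45, r_dz=0.42)
print(f"heritability ~{a2:.2f}, shared environment ~{c2:.2f}, unique environment ~{e2:.2f}")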

"Because our study suggests that testosterone levels in infants are determined by the circumstances in which the child develops before and after birth, further studies will be needed to find out exactly what these influencing factors are and to what extent they change from birth to puberty," Tremblay said.

Source: Science Daily

May 10, 2012 · 6 notes
#science #neuroscience #brain #psychology
Evolution’s Gift May Also Be at the Root of a Form of Autism

May 10th, 2012

A recently evolved pattern of gene activity in the language and decision-making centers of the human brain is missing in a disorder associated with autism and learning disabilities, a new study by Yale University researchers shows.

“This is the cost of being human,” said Nenad Sestan, associate professor of neurobiology, researcher at Yale’s Kavli Institute for Neuroscience, and senior author of the paper. “The same evolutionary mechanisms that may have gifted our species with amazing cognitive abilities have also made us more susceptible to psychiatric disorders such as autism.”

The findings are reported in the May 11 issue of the journal Cell.

In the Cell paper, Kenneth Kwan, the lead author, and other members of the Sestan laboratory identified the evolutionary changes that led the NOS1 gene to become active specifically in the parts of the developing human brain that form the adult centers for speech and language and decision-making. This pattern of NOS1 activity is controlled by a protein called FMRP and is missing in Fragile X syndrome, a disorder caused by a genetic defect on the X chromosome that disrupts FMRP production. Fragile X syndrome, the leading inherited form of intellectual disability, is also the most common single-gene cause of autism. The loss of NOS1 activity may contribute to some of the many cognitive deficits suffered by those with Fragile X syndrome, such as lower IQ, attention deficits, and speech and language delays, the authors say.

The pattern of NOS1 activity in these brain centers does not occur in the developing mouse brain — suggesting that it is a more recent evolutionary adaptation possibly involved in the wiring of neural circuits important for higher cognitive abilities. The findings of the Cell paper support this hypothesis. The study also provides insights into how genetic deficits in early development, a time when brain circuits are formed, can lead to disorders such as autism, in which symptoms appear after birth.

“This is an example of where the function of genetic changes that likely drove aspects of human brain evolution was disrupted in disease, possibly reverting some of our newly acquired cognitive abilities and thus contributing to a psychiatric outcome,” Kwan said.

image

Artist’s representation of early developmental brain cells that, when disrupted, cause Fragile X syndrome. Adapted from Yale University press release image.

By Bill Hathaway

Source: Neuroscience News

May 10, 2012 · 19 notes
#science #neuroscience #psychology #brain #autism
Researchers identify genetic mutation causing rare form of spinal muscular atrophy

May 10, 2012

Scientists have confirmed that mutations of a gene are responsible for some cases of a rare, inherited disease that causes progressive muscle degeneration and weakness: spinal muscular atrophy with lower extremity predominance, also known as SMA-LED.

"Typical spinal muscular atrophies begin in infancy or early childhood and are fatal, involving all motor neurons, but SMA-LED predominantly affects nerve cells controlling muscles of the legs. It is not fatal and the prognosis is good, although patients usually are moderately disabled and require assistive devices such as bracing and wheelchairs throughout their lives," said Robert H. Baloh, MD, PhD, director of Cedars-Sinai Medical Center’s Neuromuscular Division and senior author of a Neurology article describing the new findings on DYNC1H1.

DYNC1H1 encodes the heavy chain of dynein, a molecule inside cells that acts as a motor to transport cellular components. Using cells cultured from patients, Baloh’s group showed that the mutation disrupts this motor’s function. The researchers also found that some subjects with mutations had global developmental delay in addition to weakness, indicating that the brain is involved as well.

"Our observations suggest that a range of DYNC1H1-related disease exists in humans – from a widespread neurodevelopmental abnormality of the central nervous system to more selective involvement of certain motor neurons, which manifests as spinal muscular atrophy," Baloh said.

He pointed out that while this molecule is responsible for some inheritable cases of spinal muscular atrophy with lower extremity predominance, the genetic mutation is absent in others. The search continues, therefore, to find other culprit genetic mutations and develop biological therapies to correct them.

"Although this is a rare form of motor neuron disease, it tells us that dynein function – the molecular motor – is crucial for the development and maintenance of motor neurons, which we hope will provide insight into the common form of spinal muscular atrophy and also amyotrophic lateral sclerosis," Baloh said. ALS (also known as Lou Gehrig’s disease) is a progressive neurodegenerative disease that affects nerve cells in the brain and spinal cord.

Baloh, an expert in treating and studying neuromuscular and neurodegenerative diseases, joined Cedars-Sinai in early 2012, working with other physicians and scientists in the Department of Neurology and the Regenerative Medicine Institute to establish one of the most comprehensive neuromuscular disease treatment and research teams in California.

Provided by Cedars-Sinai Medical Center

Source: medicalxpress.com

May 10, 2012 · 3 notes
#science #neuroscience #psychology #biology #disease
Mathematical model unlocks key to brain wiring

May 10, 2012

(Medical Xpress) — A new mathematical model predicting how nerve fibres make connections during brain development could aid understanding of how some cognitive disorders occur.

The model, constructed by scientists at the Queensland Brain Institute (QBI) and School of Mathematics and Physics at the University of Queensland (UQ), gives new insight into how changing chemical levels in nerve fibres can modify nerve wiring underpinning connections in the brain.

Professor Geoff Goodhill says that while scientists have long known that changing these chemical levels can change where nerve fibres grow, only now are they understanding why this is the case.

“Our mathematical model allows us to predict precisely how these chemical levels control the direction in which nerve fibres grow, during both neural development and regeneration after injury,” he said.

Correct brain wiring is fundamental for normal brain function.

Recent discoveries suggest that wiring problems may underpin a number of nervous system disorders including autism, dyslexia, Down syndrome, Tourette’s syndrome and Parkinson’s disease.

The new model, published in the prestigious journal Neuron, demonstrates the important role mathematics can play in understanding how the brain develops, and perhaps ultimately in preventing such disorders.

Provided by University of Queensland 

Source: medicalxpress.com

May 10, 2012 · 7 notes
#science #neuroscience #brain #psychology
Researchers move closer to delaying dementia

May 10, 2012

(Medical Xpress) — Scientists at the Queensland Brain Institute (QBI) at the University of Queensland are one step closer to developing new therapies for treating dementia.

QBI’s Dr Jana Vukovic said the work was aimed at understanding the molecular mechanism that may impair learning and memory in the ageing population.

“Ageing slows the production of new nerve cells, reducing the brain’s ability to form new memories,” said Dr Vukovic, who performed the work in the laboratory of Professor Perry Bartlett, the Director of QBI at The University of Queensland.

"But our research shows for the first time that the brain cells usually responsible for mediating immunity, microglia, have an inhibitory effect on memory during ageing.

“Furthermore, they have shown that a molecule produced by nerve cells, fractalkine, can reverse this process and stimulate stem cells to produce new neurons.”

The discovery, published in The Journal of Neuroscience today, came after QBI scientists observed that the increased production of new neurons in mice that were actively running was due to the release of fractalkine in the hippocampus – the brain structure responsible for specific types of learning and memory.

Professor Bartlett said it had been known for some time that exercise increased the production of new nerve cells in the hippocampus in young and even aged mice.

“But this study found that it is fractalkine that appears to be specifically mediating this effect by making the microglia produce factors that activate the stem cells that produce new nerve cells,” he said.

“Once the cells are activated they divide and produce new cells, which underpin the animal’s ability to learn and form memories.

"This means that fractalkine may form the basis for the development of future therapies.

“The discovery is especially exciting because we have found that older animals suffering cognitive decline showed significantly lower levels of fractalkine.

“We are seeking ways of increasing fractalkine levels in patients with cognitive decline, and hoping this may be a new frontline therapy in treating dementia.”

Dr Vukovic said that until relatively recently, it was thought the adult brain was incapable of generating new neurons.

“But work from Professor Bartlett’s laboratory over the past 20 years has demonstrated that the brains of adult animals, including humans, retain the ability to make new nerve cells,” she said.

“The challenge is to find out how to stimulate this production in the aged animal and human where production has slowed.”

The latest work was a significant step toward achieving this goal, she said.

Provided by University of Queensland

Source: medicalxpress.com

May 10, 2012 · 8 notes
#science #neuroscience #brain #psychology
Think global, act local: New roles for protein synthesis at synapses

May 10, 2012

(Medical Xpress) — How do we build a memory in the brain? It is well known that for animals (and humans) new proteins are needed to establish long-term memories. During learning information is stored at the synapses, the junctions connecting nerve cells. Synapses also require new proteins in order to show changes in their strength (synaptic plasticity). Historically, scientists have focused on the cell body as the place where the required proteins are synthesized. However, in recent years there has been increasing focus on the dendrites and axons (the compartments that meet to form synapses) as a potential site for protein synthesis.

Protein synthesis machines have been observed there as well as a limited number of their templates, the messenger RNA molecules. The limited number of mRNAs observed in dendrites and axons placed constraints on the constellation of proteins that could be synthesized to help synapses work and change. Researchers from Erin Schuman’s lab at the Max Planck Institute (MPI) for Brain Research used new-generation sequencing to directly identify a very large number (over 2500) of new mRNA molecules that are present at the axons and dendrites. Using high-resolution imaging techniques they were able to both quantify and visualize individual mRNA molecules. They published their findings in the latest issue of Neuron.

[Video]
Erin Schuman and her colleagues describe how they were able to detect numerous new mRNAs in the processes of neurons with unprecedented sensitivity. Video: Neuron.

Using microarray approaches and/or in situ hybridization techniques, many different groups had each identified a hundred or so mRNAs that might reside in the dendrites. By analyzing and comparing these studies, the Schuman team discovered something surprising: not a single mRNA type was found in all of the studies. This observation made the scientists at the MPI for Brain Research wonder whether the mRNAs discovered so far are just the tip of the iceberg and whether many more mRNA molecules are waiting to be discovered.

In order to find out, the researchers dissected the neuropil layer of the rat hippocampus. This layer comprises a high concentration of axons and dendrites, but lacks the cell bodies of pyramidal neurons (the principal cell type in the hippocampus and other brain areas). By using sensitive high-resolution sequencing techniques, mRNAs could be detected which, due to their lower concentrations, had not been discovered before. The researchers found an impressive 2,550 unique mRNAs present in the dendrites and/or axons. To determine their relative abundance in the neuronal cells, the scientists at Erin Schuman’s lab used the Nanostring nCounter, a new technique allowing the high-resolution visualization and quantification of single mRNA molecules. They found that the concentration of mRNAs in the neuronal cells varies by three orders of magnitude. Additionally, the researchers were able to classify many of the mRNAs and determine their function in synaptic plasticity. These include signaling molecules, scaffolds and the receptors for neurotransmitter molecules. In addition, many mRNAs coding for proteins implicated in diseases like autism were discovered in the dendrites and axons. Finally, by using advanced imaging techniques, the researchers could directly visualize some of the mRNAs in the neuronal dendrites, hundreds of micrometers from the cell body.

These results reveal a previously unappreciated enormous potential for the local protein synthesis machinery to supply, maintain and modify the dendritic and synaptic protein population. It seems that neurons use a local control mechanism much in the same way that modern societies have learned that the most efficient means to distribute goods to the population is to use local distribution centers.

Provided by Max Planck Society

Source: medicalxpress.com

May 10, 2012 · 6 notes
#science #neuroscience #brain #memory #psychology
Researchers say genes and vascular risk modify effects of aging on brain and cognition

May 9, 2012 

Efforts to understand how the aging process affects the brain and cognition have expanded beyond simply comparing younger and older adults.

"Everybody ages differently. By looking at genetic variations and individual differences in markers of vascular health, we begin to understand that preventable factors may affect our chances for successful aging," said Wayne State University psychology doctoral student Andrew Bender, lead author of a study supported by the National Institute on Aging of the National Institutes of Health and now in press in the journal Neuropsychologia.

The report, “Age-related Differences in Memory and Executive Functions in Healthy APOE ε4 Carriers: The Contribution of Individual Differences in Prefrontal Volumes and Systolic Blood Pressure,” focuses on carriers of the ε4 variant of the apolipoprotein E (APOE) gene, present in roughly 25 percent of the population. Compared to those who possess other forms of the APOE gene, carriers of the ε4 allele are at significantly greater risk for Alzheimer’s, dementia and cardiovascular disease.

Many studies also have shown that nondemented carriers of the APOE ε4 variant have smaller brain volumes and perform less well on cognitive tests than carriers of other gene variants. Those findings, however, are not consistent, and a possible explanation may come from examining interactions between the risky genes and other factors, such as markers of cardiovascular health. Prior research in typical samples of older adults has shown that indeed other vascular risk factors — such as elevated cholesterol, hypertension or diabetes — can exacerbate the impact of the APOE ε4 variant on brain and cognition, but it is unclear if such synergy of risks is present in healthy adults.

Thus, Wayne State researchers evaluated a group of volunteers from 19 to 77 years of age who self-reported as exceptionally healthy on a questionnaire that screened for a number of conditions, representing a “best case scenario” of healthy aging. The research project, led by Naftali Raz, Ph.D., professor of psychology and director of the Lifespan Cognitive Neuroscience Research Program at WSU’s Institute of Gerontology, tested different cognitive abilities known for their sensitivity to aging and the effects of the APOE ε4 variant. Those abilities include speed of information processing, working memory (holding and manipulating information in one’s mind) and episodic memory (memory for events).

Researchers also measured participants’ blood pressure, performed genetic testing to determine which APOE variant participants carried, and measured the volumes of several critical brain regions using a high-resolution structural magnetic resonance imaging brain scan. Bender and Raz showed that for older APOE ε4 carriers, even minor increases in systolic blood pressure (the higher of the two numbers that are reported in blood pressure measures) were linked with smaller volumes of the prefrontal cortex and prefrontal white matter, slower speed of information processing, reduced working memory capacity and worse verbal memory. Notably, they said, that pattern was not evident in those who lacked the ε4 gene variant.
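
The kind of gene-by-blood-pressure interaction described here can be illustrated with a small synthetic regression example. The data, variable names and effect sizes below are invented and do not reproduce the study's analysis; they only show how an interaction term captures "pressure matters mainly in ε4 carriers."

# Illustrative only: synthetic data, not the Wayne State sample or its exact model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
e4 = rng.binomial(1, 0.25, n)              # hypothetical APOE e4 carrier status (~25%)
systolic = rng.normal(120.0, 12.0, n)      # hypothetical systolic blood pressure (mmHg)
bp = systolic - 120.0                      # centre blood pressure for readability

# Simulate a memory score that declines with blood pressure only in e4 carriers.
memory = 50.0 - 0.15 * e4 * bp + rng.normal(0.0, 2.0, n)

# Fit memory ~ carrier + blood pressure + carrier x blood pressure.
X = sm.add_constant(np.column_stack([e4, bp, e4 * bp]))
fit = sm.OLS(memory, X).fit()
print(fit.params)   # the last coefficient (the interaction) should come out near -0.15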

The study concludes that the APOE ε4 gene may make its carriers sensitive to negative effects of relatively subtle elevations in systolic blood pressure, and that the interplay between two risk factors, genetic and physiological, is detrimental to the key brain structures and associated cognitive functions.

"Although genes play a significant role in shaping the effects of age and vascular risk on the brain and cognition, the impact of single genetic variants is relatively small, and there are quite a few of them. Thus, one’s aging should not be seen through the lens of one’s genetic profile," cautioned the study’s authors. They continued, "The negative impact of many genetic variations needs help from other risk factors, and while there isn’t much one can do about genes, a lot can be done about vascular risk factors such as blood pressure or cholesterol."

"Everybody should try to keep those in check, although people with certain genetic variants more so than others." Raz said. "Practically speaking, even with the best deck of genetic cards dealt to you, it still makes sense to reduce risk through whatever works: exercise, diet or, if those fail, medication."

Because the study is part of a longitudinal project, he and Bender said the immediate task now is to determine how the interaction between risky genes and vascular risk factors affects the trajectory of age-related changes (not differences, as in this cross-sectional study) in brain and cognition.

Provided by Wayne State University

Source: medicalxpress.com

May 10, 2012 · 1 note
#science #neuroscience #brain #psychology
Chronic cocaine use triggers changes in brain's neuron structure

May 9, 2012

Chronic exposure to cocaine reduces the expression of a protein known to regulate brain plasticity, according to new, in vivo research on the molecular basis of cocaine addiction. That reduction drives structural changes in the brain, which produce greater sensitivity to the rewarding effects of cocaine.

image

The research, led by UB’s Dietz, suggests a potential new target for development of a treatment for cocaine addiction. Credit: Douglas Levere, UB Communications

The finding suggests a potential new target for development of a treatment for cocaine addiction. It was published last month in Nature Neuroscience by researchers at the University at Buffalo and Mount Sinai School of Medicine.

"We found that chronic cocaine exposure in mice led to a decrease in this protein’s signaling," says David Dietz, PhD, assistant professor of pharmacology and toxicology in the School of Medicine and Biomedical Sciences, who did the work while at Mt. Sinai. "The reduction of the expression of the protein, called Rac1, then set in motion a cascade of events involved in structural plasticity of the brain — the shape and growth of neuronal processes in the brain. Among the most important of these events is the large increase in the number of physical protrusions or spines that grow out from the neurons in the reward center of the brain.

"This suggests that Rac1 may control how exposure to drugs of abuse, like cocaine, may rewire the brain in a way that makes an individual more susceptible to the addicted state," says Dietz.

The presence of the spines reflects the increased reward effect that the individual obtains from exposure to cocaine. By changing the level of Rac1 expression, Dietz and his colleagues were able to control whether or not the mice became addicted, preventing the cocaine-induced enhancement of the brain’s reward center.

To do the experiment, Dietz and his colleagues used a novel tool, which allowed for light activation to control Rac1 expression, the first time that a light-activated protein has been used to modulate brain plasticity.

"We can now understand how proteins function in a very temporal pattern, so we could look at how regulating genes at a specific time point could affect behavior, such as drug addiction, or a disease state," says Dietz.

In his UB lab, Dietz is continuing his research on the relationship between behavior and brain plasticity, looking, for example, at how plasticity might determine how much of a drug an animal takes and how persistent the animal is in trying to get the drug.

Provided by University at Buffalo

Source: medicalxpress.com

May 10, 2012 · 3 notes
#science #neuroscience #brain #psychology
Scientists identify neurotransmitters that lead to forgetting

May 9, 2012

While we often think of memory as a way of preserving the essential idea of who we are, little thought is given to the importance of forgetting to our wellbeing, whether what we forget belongs in the “horrible memories department” or just reflects the minutia of day-to-day living.

Despite the fact that forgetting is normal, exactly how we forget—the molecular, cellular, and brain circuit mechanisms underlying the process—is poorly understood.

Now, in a study that appears in the May 10, 2012 issue of the journal Neuron, scientists from the Florida campus of The Scripps Research Institute have pinpointed a mechanism that is essential for forming memories in the first place and, as it turns out, is equally essential for eliminating them after memories have formed.

"This study focuses on the molecular biology of active forgetting," said Ron Davis, chair of the Scripps Research Department of Neuroscience who led the project. "Until now, the basic thought has been that forgetting is mostly a passive process. Our findings make clear that forgetting is an active process that is probably regulated."

The Two Faces of Dopamine

To better understand the mechanisms for forgetting, Davis and his colleagues studied Drosophila or fruit flies, a key model for studying memory that has been found to be highly applicable to humans. The flies were put in situations where they learned that certain smells were associated with either a positive reinforcement like food or a negative one, such as a mild electric shock. The scientists then observed changes in the flies’ brains as they remembered or forgot the new information.

The results showed that a small subset of dopamine neurons actively regulate the acquisition of memories and the forgetting of these memories after learning, using a pair of dopamine receptors in the brain. Dopamine is a neurotransmitter that plays an important role in a number of processes including punishment and reward, memory, learning and cognition.

But how can a single neurotransmitter, dopamine, have two seemingly opposite roles in both forming and eliminating memories? And how can these two dopamine receptors serve acquiring memory on the one hand, and forgetting on the other?

The study suggests that when a new memory is first formed, an active, dopamine-based forgetting mechanism (ongoing dopamine neuron activity) begins to erase it unless some importance is attached to the memory; that process of attaching importance, known as consolidation, may shield important memories from the dopamine-driven forgetting process.

The study shows that specific neurons in the brain release dopamine to two different receptors known as dDA1 and DAMB, located on what are called mushroom bodies because of their shape; these densely packed networks of neurons are vital for memory and learning in insects. The study found the dDA1 receptor is responsible for memory acquisition, while DAMB is required for forgetting.

When dopamine neurons begin the signaling process, the dDA1 receptor becomes overstimulated and begins to form memories, an essential part of memory acquisition. Once that memory is acquired, however, these same dopamine neurons continue signaling. Except this time, the signal goes through the DAMB receptor, which triggers forgetting of those recently acquired, but not yet consolidated, memories.

Jacob Berry, a graduate student in the Davis lab who led the experimentation, showed that inhibiting the dopamine signaling after learning enhanced the flies’ memory. Hyperactivating those same neurons after learning erased memory. And, a mutation in one of the receptors, dDA1, produced flies unable to learn, while a mutation in the other, DAMB, blocked forgetting.

Intriguing Issues

While Davis was surprised by the mechanisms the study uncovered, he was not surprised that forgetting is an active process. “Biology isn’t designed to do things in a passive way,” he said. “There are active pathways for constructing things, and active ones for degrading things. Why should forgetting be any different?”

The study also brings a number of intriguing issues into focus, Davis said, such as savant syndrome.

"Savants have a high capacity for memory in some specialized areas," he said. "But maybe it isn’t memory that gives them this capacity, maybe they have a bad forgetting mechanism. This also might be a strategy for developing drugs to promote cognition and memory—what about drugs that inhibit forgetting as cognitive enhancers?"

Provided by The Scripps Research Institute

Source: medicalxpress.com

May 10, 2012 · 114 notes
#science #neuroscience #brain #psychology
Why Do People Choke When the Stakes Are High? Loss Aversion May Be the Culprit

ScienceDaily (May 9, 2012) — In sports, on a game show, or just on the job, what causes people to choke when the stakes are high? A new study by researchers at the California Institute of Technology (Caltech) suggests that when there are high financial incentives to succeed, people can become so afraid of losing their potentially lucrative reward that their performance suffers.

image

In the study, each participant was asked to control this virtual object on a screen. The virtual object consisted of two weighted balls connected by a spring. The task was to place the object, which stretched and contracted as a weighted spring would in real life, into a square target within two seconds. (Credit: Image courtesy of California Institute of Technology)

It is a somewhat unexpected conclusion. After all, you would think that the more people are paid, the harder they will work, and the better they will do their jobs — until they reach the limits of their skills. That notion tends to hold true when the stakes are low, says Vikram Chib, a postdoctoral scholar at Caltech and lead author on a paper published in the May 10 issue of the journal Neuron. Previous research, however, has shown that if you pay people too much, their performance actually declines.

Some experts have attributed this decline to too much motivation: they think that, faced with the prospect of earning an extra chunk of cash, you might get so excited that you will fail to do the task properly. But now, after looking at brain-scan data of volunteers performing a specific motor task, the Caltech team says that what actually happens is that you become worried about losing your potential prize. The researchers also found that the more someone is afraid of loss, the worse they perform.

In the study, each participant was asked to control a virtual object on a screen by moving an index finger that had a tracking device attached to it. The virtual object consisted of two weighted balls connected by a spring. The task was to place the object, which stretched and contracted as a weighted spring would in real life, into a square target within two seconds.

The researchers controlled for individual skill levels by customizing the size of the target so that everyone would have the same success rate. That way, people who happened to be really good or bad at this task would not skew the data.

After a training period, the subjects were asked to perform the task while inside an fMRI machine, which measures blood flow in the brain — a proxy for brain activity, since wherever a brain is active, it needs extra oxygen, and thus a larger volume of blood. By monitoring blood flow, the researchers can pinpoint areas of the brain that turn on when a particular task is performed.

The task began with the researchers offering the participants a randomized range of rewards — from $0 to $100 — if they could successfully place the object into the square within the time limit. At the end of hundreds of trials — each with varying reward amounts — the participant was given their reward, based on the result of just one of the trials, picked at random.

As expected, the team found that performance improved as the incentives increased — but only when the cash reward amounts were at the low end of the spectrum. Once the rewards passed a certain threshold, which depended on the individual, performance began to fall off.

Incentives are known to activate a part of your brain called the ventral striatum, Chib says; the researchers thus expected to see the ventral striatum become increasingly active as they bumped up the prizes. And if the conventional thought were correct — that the reason for the observed performance decline was over-motivation — they would expect the striatum to continue showing a lot of activation when the incentives became high enough for performance to suffer.

What they found, instead, was that when the participants were shown their potential rewards, activity in the striatum did indeed increase with rising incentives. But once the volunteers started doing the task, striatal activity decreased with rising incentives. They also noticed that the less activity they saw in a participant’s striatum, the worse that person performed on the task.

Other studies have shown that decreasing striatal activity is related to fear or aversion to loss, Chib says. “When people see the incentive that they’re being offered, they initially encode it as a gain,” he explains. “But when they’re actually doing the task, the thing that causes them to perform poorly is that they worry about losing a potential incentive they haven’t even received yet.” He adds, “We’re showing loss aversion even though there are no explicit losses anywhere in the task — that’s very strange and something you really wouldn’t expect.”

To further test their hypothesis, Chib and his colleagues decided to measure how loss-averse each participant was. They had the participants play a coin-flip game in which there was an equal chance they could win or lose varying amounts of money.

Each participant was offered varying potential win-loss amounts ($20-$20, $20-$10, $20-$5, for example), and then given the opportunity to either accept each possible gamble or decline it. The win-loss ratio at which the subjects chose to take the gamble provided a measure of how loss-averse each person was; someone willing to gamble even when they might win or lose $20 is less loss-averse than someone who is only willing to gamble if they can win $20 but only lose $5.
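As a rough illustration of that measure (a hypothetical sketch, not the Caltech team’s actual analysis), the smallest gain-to-loss ratio a person is still willing to accept can serve as a simple loss-aversion index:

# Illustrative sketch: estimating a simple loss-aversion index from
# coin-flip gamble choices. Each gamble offers a 50/50 chance to win
# `gain` or lose `loss`; the smallest gain-to-loss ratio a person still
# accepts approximates how loss-averse they are (higher = more loss-averse).

def loss_aversion_index(choices):
    """choices: list of (gain, loss, accepted) tuples from the gamble task."""
    accepted_ratios = [gain / loss for gain, loss, accepted in choices if accepted]
    if not accepted_ratios:
        return float("inf")  # never gambled: extremely loss-averse
    return min(accepted_ratios)

# Example: this participant only gambles when the potential win is at least
# twice the potential loss, suggesting moderate loss aversion.
print(loss_aversion_index([(20, 20, False), (20, 10, True), (20, 5, True)]))  # prints 2.0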

Once the numbers had been crunched and compared to the original experiment, it turned out that the more loss-averse a participant was, the worse they did on the task when the stakes were high. And for a particularly loss-averse person, the threshold at which their performance started to decline did not have to be very high. “If you’re more loss-averse, it really hurts you,” Chib says. “You’re going to reach peak performance at a lower incentive level, and your performance is also going to be worse for higher incentives.”

"Previously, it’s been shown that the ventral striatum is involved in mediating performance increases in response to rising incentives," says John O’Doherty, professor of psychology and coauthor of the paper. "But our study shows that changes in activity in this same region can, under certain situations, also lead to worsening performance."

While this study only involved a specific motor task and financial incentives, these results may well be universal, says Shinsuke Shimojo, the Gertrude Baltimore Professor of Experimental Psychology and another coauthor of the study. “The implications and applications can include any sort of decision making that contains high stakes and uncertainties, such as business and politics.”

These findings, the researchers say, might be used to develop new ways to motivate people to perform better or to train them to be less loss-averse. “This loss aversion can be an important way of deciding how to set up incentive mechanisms and how to figure out who’s going to perform well and who isn’t,” Chib says. “If you can train somebody to be less loss-averse, maybe you can help them avoid performing poorly in stressful situations.”

Source: Science Daily

May 10, 2012 · 6 notes
#science #neuroscience #brain #psychology
Response to first drug treatment may signal likelihood of future seizures in people with epilepsy

May 9, 2012

How well people with newly diagnosed epilepsy respond to their first drug treatment may signal the likelihood that they will continue to have more seizures, according to a study published in the May 9, 2012, online issue of Neurology, the medical journal of the American Academy of Neurology.

"Our research shows a pattern based on how a person responds to initial treatment and specifically, to their first two courses of drug treatment," said study author Patrick Kwan, MD, PhD, with the University of Melbourne in Australia.

For the study, 1,098 people from Scotland between the ages of nine and 93 with newly diagnosed epilepsy were followed for as long as 26 years after being given their first drug therapy. Participants were considered seizure-free if they had no seizures for at least a year without changes in their treatment. If they had further seizures, a second drug was chosen to be given alone or to be added to the first. If seizures continued, a third drug regimen was selected, and the process continued for up to nine drug regimens.

The study found that 50 percent of the people were seizure-free after the first drug tried, 13 percent were seizure-free after the second drug regimen tried and 4 percent were seizure-free after the third drug regimen tried. Less than two percent of the participants stopped having seizures on additional drug treatment courses up to the seventh one tried, and none became seizure-free after that.

The research also found that 37 percent of people in the study became seizure-free within six months of treatment. Another 22 percent became seizure-free after more than six months of starting treatment. Both groups continued to be seizure-free. However, 16 percent had fluctuating periods of seizure freedom and relapses, and 25 percent were never seizure-free for one year.

At the end of the study, 749 people (68 percent) were seizure-free and 678 people (62 percent) were on only one drug. The results were independent of the age when the person had the first seizure or the type of epilepsy.

"A person who doesn’t respond well to two courses of epilepsy drug treatment should be further evaluated to verify an epilepsy diagnosis and to identify whether surgery is the best next step," said Patricia E. Penovich, MD, with the Minnesota Epilepsy Group PA and the University of Minnesota School of Medicine in St. Paul, Minn., and a Fellow with the American Academy of Neurology, who wrote an accompanying editorial on the study.

Provided by American Academy of Neurology

Source: medicalxpress.com

May 9, 2012 · 1 note
#science #neuroscience #brain #psychology
The music of the (hemi)spheres sheds new light on schizophrenia

May 9, 2012

In 1619, the pioneering astronomer Johannes Kepler published Harmonices Mundi, in which he analyzed data on the movement of planets and asserted that the laws of nature governing their movements show features of harmonic relationships in music. In so doing, Kepler provided important support for the then-controversial model of the universe proposed by Copernicus.

In the latest issue of Biological Psychiatry, researchers at the University of California, San Diego suggest that careful analyses of the electrical signals of brain activity, measured using electroencephalography (EEG), may reveal important harmonic relationships in the electrical activity of brain circuits.

The underlying premise is a simple one: brain function is expressed by circuits that fire, and therefore generate oscillating EEG signals, at different frequencies.

High frequency EEG activity called gamma, for example, might reflect the activity of fast-spiking cells which are often a subclass of inhibitory nerve cells containing parvalbumin. Represented musically, this would be a high pitch, i.e., toward the right side of the piano.

Lower frequency EEG activity, called theta, might come from cells that fire with a lower frequency.

As circuits interact with each other, one would see different “musical combinations”, like the chords of music, emerging in the EEG signal. Abnormalities in the structure and function of brain circuits would be reflected in cacophonous music, chords where the musical “voices” are firing at the wrong rate (pitch), volume (amplitude), or timing.

It is increasingly evident that schizophrenia is a disorder characterized by disturbances in the “music of the brain hemispheres.” This new report describes relationships between low- and high-frequency EEG oscillations in the human brain produced when high frequency auditory stimuli are presented to a research subject. The authors observed relatively slower oscillations and reduced cross-phase synchrony (for example, peak of theta coinciding with peak of gamma) in schizophrenia patients compared to healthy study participants.
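To make the idea of cross-frequency analysis concrete, here is a minimal sketch, assuming a standard filter-Hilbert approach rather than the authors’ exact method, of how theta-gamma phase synchrony in a single EEG channel might be quantified:

# Illustrative sketch (not the method used in the paper): bandpass each
# rhythm, extract instantaneous phase with the Hilbert transform, and
# measure how consistently the gamma phase aligns with the theta phase
# over time (an n:m phase-locking value between 0 and 1).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, low, high, fs, order=4):
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def theta_gamma_phase_sync(eeg, fs, n=1, m=8):
    """n:m phase locking between theta (~5 Hz) and gamma (~40 Hz)."""
    theta_phase = np.angle(hilbert(bandpass(eeg, 4, 8, fs)))
    gamma_phase = np.angle(hilbert(bandpass(eeg, 30, 50, fs)))
    return np.abs(np.mean(np.exp(1j * (m * theta_phase - n * gamma_phase))))

# Example with synthetic data: a 5 Hz theta rhythm plus a 40 Hz gamma rhythm
# locked to it (8:1), which yields a synchrony value close to 1.
fs = 500
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
print(theta_gamma_phase_sync(eeg, fs))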

Dr. John Krystal, Editor of Biological Psychiatry, commented, “The new findings highlight the importance of understanding the relationships between different circuits. It seems that cortical abnormalities in schizophrenia disturb brain function, in part, by disturbing the ‘tuning’ of brain circuits in relation to each other.”

Provided by Elsevier

Source: medicalxpress.com

May 9, 2012 · 11 notes
#science #neuroscience #brain #psychology
Researchers Discover a New Family of Key Mitochondrial Proteins for the Function and Variability of the Brain

May 9th, 2012

This family comprises a cluster of six genes that may be altered in neurological conditions, such as Parkinson’s and Charcot-Marie-Tooth disease.

A team headed by Eduardo Soriano at the Institute for Research in Biomedicine (IRB Barcelona) has published a study in Nature Communications describing a new family of six genes whose function regulates the movement and position of mitochondria in neurons. Many neurological conditions, including Parkinson’s and various types of Charcot-Marie-Tooth disease, are caused by alterations of genes that control mitochondrial transport, a process that provides the energy required for cell function.

“We have identified a set of new genes that are highly expressed in the nervous system and have a specific function in a biological process that is crucial for the activity and viability of the nervous system”, explains Eduardo Soriano, head of the Neurobiology and Cell Regeneration group at IRB Barcelona and full professor at the University of Barcelona (UB).

By means of comparative genomic analyses, the scientists have discovered that these genes are found only in more evolved mammals, the so-called Eutheria, which are characterized by internal fertilization and development. “This finding indicates the relevance of mitochondrial biology. When the brain evolved in size, function and structure, the mitochondrial transport process also became more complex and probably required additional regulatory mechanisms”, says Soriano. “Likewise, given the origin of the gene cluster, in the transition between primitive mammals, such as marsupials (kangaroos), and the remaining placental mammals, it is tempting to propose that the cluster is linked to the increased complexity of the cerebral cortex in the lineage that leads to humans”, adds the full UB professor Jordi Garcia-Fernàndez, collaborator in the study.

image

In the image, red indicates the localization of mitochondria in a neuron. The new proteins described help to regulate their positions in the cell. Image adapted from IRB Barcelona press release image.

Correct brain function is highly energy-demanding. However, this energy must be finely distributed throughout neurons, cells whose ramifications can reach up to tens of centimetres in length, from the brain to the limbs. This cluster of genes forms part of the “wheel” machinery of mitochondria and regulates their localization within each cell on the basis of its energy requirements. “These genes would be like an extra control in cellular mitochondrial trafficking and they interact with the major proteins associated with the regulation of mitochondrial transport”, explains Soriano.

Another striking characteristic of these new proteins is that they are found both in mitochondria, the function of which has already been described, and in the cell nucleus, where their function is unknown. “They may also be involved in the regulation of gene expression, a possibility that we are now studying”. In addition to their potential involvement in brain pathologies, the researchers believe that these proteins may be related to metabolic diseases and cancer.

Source: Neuroscience News

May 9, 2012 · 1 note
#science #neuroscience #brain #psychology
Virtual reality allows researchers to measure brain activity during behavior at unprecedented resolution

May 9, 2012

Researchers have developed a new technique which allows them to measure brain activity in large populations of nerve cells at the resolution of individual cells. The technique, reported today in the journal Nature, has been developed in zebrafish to represent a simplified model of how brain regions work together to flexibly control behaviour.

Our thoughts and actions are the product of large populations of nerve cells, called neurons, working in harmony, often millions at a time. Measuring brain activity at detailed resolution across these groups of cells during behaviour has proved extremely challenging. Currently, scientists are restricted to measuring activity in individual brain areas of, for example, moving rats, typically in no more than a few hundred neurons at a time.

Dr Misha Ahrens, a Sir Henry Wellcome Postdoctoral Fellow based at Harvard University and the University of Cambridge, worked with colleagues to develop a technique which allows neuroscientists to study as many as 2,000 neurons simultaneously, anywhere in the brain of a transparent zebrafish. Their work was funded by the Wellcome Trust and the National Institutes of Health.

Dr Ahrens and colleagues created a virtual environment for zebrafish, which allowed them to measure activity in the neurons as the fish ‘moved’. In reality, the zebrafish was paralysed to allow the researchers to image its brain; the fish ‘moved’ through the virtual environment by activating its motor neuron axons, the cells responsible for generating movement.

Zebrafish are often used as a simple organism to study genetics and characteristics of the nervous system that are conserved in humans. They are genetically modifiable, so by manipulating the fish’s genetic make-up, Dr Ahrens and colleagues created a fish in which all neurons contained a particular protein that increases its fluorescence when the cells are active. The fish are transparent, and so the team were able to use a laser-scanning microscope to see activity in any neuron in the brain of the fish, up to 2,000 neurons simultaneously.

Dr Ahrens explains: “Our behaviour is determined by thousands, possibly millions, of nerve cells working in harmony. The zebrafish performs complex behaviors, with a brain of about 100,000 neurons, almost all of which are accessible to optical recording of neural activity. Our new technique will help us examine how large networks mediate behaviour, while at the same time telling us what each individual cell is doing.”

Using the technique, Dr Ahrens and colleagues asked the question: do zebrafish adapt their behaviour in response to changes in their environment? To do this, they manipulated the virtual environment to simulate the fish suddenly becoming more “muscular”. This served as a simplified version of what happens when the brain needs to adapt the way it drives behaviour, for example, when water temperature changes the efficacy of the muscles, or when the fish gets injured.

Dr Ahrens adds: “The paralyzed fish in the virtual world do indeed adapt their behaviour, by adjusting the amount of impulses the brain sends to the muscles. They also ‘remember’ this change for a while. Imaging the brain everywhere during this behaviour, we identified certain brain regions that were involved, most notably the cerebellum and related structures. This technique opens the possibility that eventually, the behaviour may be used to gain insights into human motor control and motor control deficits.

"Our own motor control is continuously recalibrating itself in a similar way to the fish’s to cope with ever changing conditions of our body and environment, such as when we injure a leg, or if we’re walking on a slippery floor or carrying a heavy bag. The zebrafish’s behaviour is an ultra-simplified version of this and we have been able to gain some insight into how its brain structures drive behaviour. This might someday help us understand how damage to certain brain regions in humans affects the way in which the brain integrates sensory information to control body movements."

Understanding the brain is one of the Wellcome Trust’s five strategic challenges.

Provided by Wellcome Trust

Source: medicalxpress.com

May 9, 2012 · 3 notes
#science #neuroscience #brain #psychology
Reduction of excess brain activity improves memory in amnestic mild cognitive impairment

May 9, 2012

Research published in the May 10 issue of the journal Neuron describes a potential new therapeutic approach for improving memory and modifying disease progression in patients with amnestic mild cognitive impairment. The study finds that excess brain activity may be doing more harm than good in some conditions that cause mild cognitive decline and memory impairment.

Elevated activity in specific parts of the hippocampus, a brain region involved in memory, is often seen in disorders associated with an increased risk for Alzheimer’s disease. Amnestic mild cognitive impairment (aMCI), where memory is worse than would be expected for a person’s age, is one such disorder. “In the case of early aMCI, it has been suggested that the increased hippocampal activation may serve a beneficial function by recruiting additional neural resources to compensate for those that are lost,” explains senior study author, Dr. Michela Gallagher, from Johns Hopkins University. “However, animal studies have raised the alternative view that this excess activation may be contributing to memory impairment.”

Dr. Gallagher and colleagues tested how a reduction of hippocampal activity would affect human patients with aMCI. The researchers used a low dose of a drug clinically prescribed to treat epilepsy to reduce hippocampal activity in subjects with aMCI to levels similar to those seen in healthy, age-matched subjects in a control group. The researchers found that treatment with the drug improved performance on a memory task. These findings point to the therapeutic potential of reducing excess activation in the hippocampus in aMCI.

The results also have broader significance as elevated activity in the hippocampus is also observed in other conditions that are thought to precede Alzheimer’s disease, and may be one of the underlying mechanisms of neurodegeneration. “Apart from a direct role in memory impairment, there is concern that elevated activity in vulnerable neural networks could be causing additional damage and, possibly, widespread disease-related degeneration that underlies cognitive decline and the conversion to Alzheimer’s disease,” concludes Dr. Gallagher. “Therefore, reducing the elevated activity in the hippocampus may help to restore memory and protect the brain.”

Provided by Cell Press

More information: Bakker et al., “Reduction of hippocampal hyperactivity improves cognition in amnestic mild cognitive impairment,” DOI: 10.1016/j.neuron.2012.03.023

Source: medicalxpress.com

May 9, 2012 · 7 notes
#science #neuroscience #brain #psychology #memory
Babies’ Brains Benefit From Music Lessons

Released: 5/9/2012 11:20 AM EDT

Newswise — After completing the first study of its kind, researchers at McMaster University have discovered that very early musical training benefits children even before they can walk or talk.

They found that one-year-old babies who participate in interactive music classes with their parents smile more, communicate better and show earlier and more sophisticated brain responses to music.

The findings were published recently in the scientific journals Developmental Science and Annals of the New York Academy of Sciences.

“Many past studies of musical training have focused on older children,” says Laurel Trainor, director of the McMaster Institute for Music and the Mind. “Our results suggest that the infant brain might be particularly plastic with regard to musical exposure.”

Trainor, together with David Gerry, a music educator and graduate student, received an award from the Grammy Foundation in 2008 to study the effects of musical training in infancy. In the recent study, groups of babies and their parents spent six months participating in one of two types of weekly music instruction.

One music class involved interactive music-making and learning a small set of lullabies, nursery rhymes and songs with actions. Parents and infants worked together to learn to play percussion instruments, take turns and sing specific songs.

In the other music class, infants and parents played at various toy stations while recordings from the popular Baby Einstein series played in the background.

Before the classes began, all the babies had shown similar communication and social development and none had previously participated in other baby music classes.

“Babies who participated in the interactive music classes with their parents showed earlier sensitivity to the pitch structure in music,” says Trainor. “Specifically, they preferred to listen to a version of a piano piece that stayed in key, versus a version that included out-of-key notes. Infants who participated in the passive listening classes did not show the same preferences. Even their brains responded to music differently. Infants from the interactive music classes showed larger and/or earlier brain responses to musical tones.”

The non-musical differences between the two groups of babies were even more surprising, say researchers.

Babies from the interactive classes showed better early communication skills, like pointing at objects that are out of reach, or waving goodbye. Socially, these babies also smiled more, were easier to soothe, and showed less distress when things were unfamiliar or didn’t go their way.

While both class types included listening to music and all the infants heard a similar amount of music at home, a big difference between the classes was the interactive exposure to music.

“There are many ways that parents can connect with their babies,” says study coordinator Andrea Unrau. “The great thing about music is, everyone loves it and everyone can learn simple interactive musical games together.”

Source: newswise

May 9, 2012 · 7 notes
#science #brain #neuroscience #psychology
Cellist Achieves Optimal Performance Through Neurofeedback

Released: 5/9/2012 11:00 AM EDT 

Newswise — “Practice makes perfect,” the saying goes. Optimal performance, however, can require more than talent, effort, and repetition. Training the brain to reduce stress through neurofeedback can remove barriers and enhance one’s innate abilities.

image

An article in the journal Biofeedback presents the narrative of a young cellist who was able to realize the potential of his talent and eliminate debilitating migraine headaches. This case study is part of a special section in the Spring 2012 issue focusing on optimal functioning.

Enhancing people’s performance in business, performing and visual arts, academia, and sports can be realized through biofeedback and neurofeedback training. Tools of stress reduction, mental imagery training, psychology, and psycho-physiological technology are combined to help people reach their goals.

The author and practitioner in this case study has combined her work and study in the fields of theater, social work, and neurofeedback. In her practice, she coaches clients to achieve outstanding performances. For example, a singer can better understand and interpret a musical selection, allowing that singer to better convey the emotion of the music, resulting in a noticeably improved performance.

William, the young musician, sought relief from migraine headaches that were affecting him almost daily. His therapy, however, did not take the approach of treating the headaches, but of focusing on William as a person and as a performer. By improving his functionality, working through moments of obsessiveness, self-criticism, fear, and anxiety, the headaches could also be resolved.

William’s therapist conducted neurofeedback — using sensors to read his brainwaves, analyzing these with NeuroOptimal™ software, and then giving feedback to the brain through a visual display and sound. With this information, the brain can learn to self-correct. This technology assists in getting people past that moment when they obsess over whether they have given the correct answer or hit the right note.

NeuroOptimal feedback, guided imagery, and coaching about decisions regarding his music helped William move beyond the difficulties he encountered. During his senior recital at his college, he was able to give a relaxed, confident performance that was met with a standing ovation.

Full text of the article, “William’s Story: A Case Study in Optimal Performance,” Biofeedback, Volume 40, Issue 1, Spring 2012, is available at http://www.aapb-biofeedback.com/

Source: newswise

May 9, 2012 · 5 notes
#science #neuroscience #brain #psychology
Can new diagnostic approaches help assess brain function in unconscious, brain-injured patients?

May 9, 2012

Disorders of consciousness such as coma or a vegetative state caused by severe brain injury are poorly understood and their diagnosis has relied mainly on patient responses and measures of brain activity. However, new functional and imaging-based diagnostic tests that measure communication and signaling between different brain regions may provide valuable information about the potential for consciousness in patients unable to communicate. These innovative approaches are described and compared in a Review article in the groundbreaking neuroscience journal Brain Connectivity.

image

Brain Connectivity is the journal of record for researchers and clinicians interested in all aspects of brain connectivity. Credit: ©2012 Mary Ann Liebert, Inc., publishers

Mélanie Boly and coauthors from the University of Liège (Belgium), University of Milan (Italy), and University College London (UK) compare the benefits and limitations of three methods for studying the dynamics of brain communication and connectivity in response to internal and external stimulation: functional magnetic resonance imaging (fMRI); transcranial magnetic stimulation (TMS) combined with electroencephalography (EEG); and response to neuronal perturbation, measuring, for example, sensory evoked potentials (ERP). They report their findings and propose future research directions in the article “Brain Connectivity in Disorders of Consciousness.”

"In recent years, there has been a tremendous interest in gaining a better understanding of the various disorders of consciousness. A variety of methods including fMRI and PET have been used to study these disorders," says Bharat Biswal, PhD, Co-Editor-in-Chief of Brain Connectivity and Associate Professor, University of Medicine and Dentistry of New Jersey. “This article provides a comprehensive analysis using three new and innovative methods to study disorders of consciousness.”

More information: The article is available free on the Brain Connectivity website at http://online.liebertpub.com/doi/full/10.1089/brain.2011.0049

Provided by Mary Ann Liebert, Inc.

Source: medicalxpress.com

May 9, 2012
#science #neuroscience #brain #psychology #consciousness
Computer Scientists Show What Makes Movie Lines Memorable

ScienceDaily (May 8, 2012) — Whether it’s a line from a movie, an advertising slogan or a politician’s catchphrase, some statements take hold in people’s minds better than others. But why?

Cornell researchers who applied computer analysis to a database of movie scripts think they may have found the secret of what makes a line memorable.

The study suggests that memorable lines use familiar sentence structure but incorporate distinctive words or phrases, and they make general statements that could apply elsewhere. The latter may explain why lines such as, “You’re gonna need a bigger boat” or “These aren’t the droids you’re looking for” (accompanied by a hand gesture) have become standing jokes. You can use them in a different context and apply the line to your own situation.

While the analysis was based on movie quotes, it could have applications in marketing, politics, entertainment and social media, the researchers said.

"Using movie scripts allowed us to study just the language, without other factors. We needed a way of asking a question just about the language, and the movies make a very nice dataset," said graduate student Cristian Danescu-Niculescu-Mizil, first author of a paper to be presented at the 50th Annual Meeting of the Association for Computational Linguistics July 8-14 in Jeju, South Korea.

The study grows out of ongoing work on how ideas travel across networks.

"We’ve been looking at things like who talks to whom," said Jon Kleinberg, a professor of computer science who worked on the study, "but we hadn’t explored how the language in which an idea was presented might have an effect."

To address that, they collaborated with Lillian Lee, a professor of computer science who specializes in computer processing of natural human language.

They obtained scripts from about 1,000 movies, and a database of memorable quotes from those movies from the Internet Movie Database. Each quote was paired with another from the movie’s script, spoken by the same character in the same scene and about the same length, to eliminate every factor except the language itself. Obi-Wan Kenobi, for example, also said, “You don’t need to see his identification,” but you don’t hear that a lot.

They asked a group of people who had not seen the movies to choose which quote in the pairs was most memorable. Two patterns emerged to identify the memorable choice: distinctiveness and generality.

Then the researchers programmed a computer with linguistic rules reflecting these concepts. A line will be less general if it contains third-person pronouns and definite articles (which refer to people, objects or events in the scene) and uses past tense (usually referring to something that happened previously in the story). Distinctive language can be identified by comparison with a database of news stories. The computer was able to choose the memorable quote an average of 64 percent of the time.
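As a toy illustration of the “generality” rule described above (this is a hypothetical feature counter, not the Cornell classifier), one could score how scene-bound a quote is by counting third-person pronouns, definite articles and past-tense verbs:

# Illustrative sketch only: higher scores mean the line is more tied to a
# specific scene (less general), following the features named in the study.
import re

THIRD_PERSON = {"he", "she", "him", "her", "his", "hers", "they", "them", "their"}
DEFINITE_ARTICLES = {"the"}

def scene_specificity(quote):
    words = re.findall(r"[a-z']+", quote.lower())
    pronouns = sum(w in THIRD_PERSON for w in words)
    articles = sum(w in DEFINITE_ARTICLES for w in words)
    past_tense = sum(w.endswith("ed") for w in words)  # crude past-tense proxy
    return pronouns + articles + past_tense

print(scene_specificity("You're gonna need a bigger boat"))           # lower: more general
print(scene_specificity("You don't need to see his identification"))  # higher: more scene-bound

A real distinctiveness score would additionally compare the quote’s word sequences against a large reference corpus, such as news text, as the researchers describe.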

Later analysis also found subtle differences in sound and word choice: Memorable quotes use more sounds made in the front of the mouth, words with more syllables and fewer coordinating conjunctions.

In a further test, the researchers found that the same rules applied to popular advertising slogans.

Although teaching a computer how to write memorable dialogue is probably a long way off, applications might be developed to monitor the work of human writers and evaluate it in progress, Kleinberg suggested.

The researchers have set up a website where you can test your skill at identifying memorable movie quotes, and perhaps contribute some data to the research, at www.cs.cornell.edu/~cristian/memorability.html

Source: Science Daily

May 9, 2012 · 6 notes
#science #neuroscience #memory #psychology #brain
Future Treatment for Nearsightedness — Compact Fluorescent Light Bulbs?

ScienceDaily (May 8, 2012) — Researchers at the University of Alabama at Birmingham hope to one day use fluorescent light bulbs to slow nearsightedness, which affects 40 percent of American adults and can cause blindness.

In an early step in that direction, results of a study found that small increases in daily artificial light slowed the development of nearsightedness by 40 percent in tree shrews, which are close relatives of primates.

The team, led by Thomas Norton, Ph.D., professor in the UAB Department of Vision Sciences, presented the study results May 8 at the 2012 Association for Research in Vision and Ophthalmology annual meeting in Ft. Lauderdale.

People can see clearly because the front part of the eye bends light and focuses it on the retina in back. Nearsightedness, also called myopia, occurs when the physical length of the eye is too long, causing light to focus in front of the retina and blurring images.

Myopia has many causes, some related to inheritance and some to the environment. Research in recent years had, for instance, suggested that children who spent more time outdoors, presumably in brighter outdoor light, had less myopia as young adults. That raised the question of whether artificial light, like sunlight, could help reduce myopia development, without the risks of prolonged sun exposure, such as skin cancer and cataracts.

"Our hope is to develop programs that reduce the rate of myopia using energy efficient, fluorescent lights for a few hours each day in homes or classrooms," said John Siegwart, Ph.D., research assistant professor in UAB Vision Sciences and co-author of the study. "Trying to prevent myopia by fixing defective genes through gene therapy or using a drug is a multi-year, multimillion-dollar effort with no guarantee of success. We hope to make a difference just with light bulbs."

Sorting through theories

Work over 25 years had shown that putting a goggle over one eye of a study animal, one that lets in light but blurs images, causes the eye to grow too long, which in turn causes myopia. Other past studies had shown that elevated light levels could reduce myopia under these conditions, whether the light was produced by halogen lamps, metal halide bulbs or daylight. The current study is the first to show that the development of myopia can be slowed by increasing daily fluorescent light levels.

One prevailing theory on myopia-related shape changes in the eye is that they are caused by the blurriness of images experienced while reading or doing other near-work chores. Another holds that some people develop myopia because they have low levels of vitamin D, which goes up with exposure to sunlight and could explain the connection between outdoor light and reduced myopia. A third theory, one reinforced by the current results, is that bright light causes an increase in levels of dopamine, a signaling molecule in the retina.

To test the theories, the team used a goggle that lets in light but no images to produce myopia in one eye of each tree shrew. They found that a group exposed to elevated fluorescent light levels for eight hours per day developed 47 percent less myopia than a control group exposed to normal indoor lighting, even though the images were neither more nor less blurry. They also found that animals fed vitamin D supplements developed myopia just like ones without the supplement. Given these results, the team is now experimenting with light levels and treatment times to see if a short, bright light treatment could be effective. They have also begun studies looking at the effect of elevated light on retinal dopamine levels as it relates to the reduction of myopia.

"If we can find the best kind of light, treatment period and light level, we’ll have the scientific justification to begin studies raising light levels in schools, for instance," said Norton. "Compact fluorescent bulbs use much less electricity than standard light bulbs, and future programs raising light levels will have more impact the less expensive they are."

Source: Science Daily

May 9, 2012 · 4 notes
#science #neuroscience #vision
‘Blindness’ May Rapidly Enhance Other Senses

ScienceDaily (May 8, 2012) — Can blindness or other forms of visual deprivation really enhance our other senses such as hearing or touch? While this theory is widely regarded as being true, there are still many questions about the science behind it.

New findings from a Canadian research team investigating this link suggest that not only is there a real connection between vision and other senses, but that connection is important to better understand the underlying mechanisms that can quickly trigger sensory changes. This may demystify the true potential of human adaptation and, ultimately, help develop innovative and effective methods for rehabilitation following sensory loss or injury.

François Champoux, director of the University of Montreal’s Laboratory of Auditory Neuroscience Research, will present his team’s research and findings at the Acoustics 2012 meeting in Hong Kong, May 13-18, a joint meeting of the Acoustical Society of America (ASA), Acoustical Society of China, Western Pacific Acoustics Conference, and the Hong Kong Institute of Acoustics.

Studies have shown, in terms of hearing, that blind people are better at localizing sound. One study even suggested that blindness might improve the ability to differentiate between sound frequencies. “The supposed enhanced tactile abilities have been studied at a greater degree and can be seen as early as days or even minutes following blindness,” says Champoux. “This rapid change in auditory ability hasn’t yet been clearly demonstrated.”

Two big questions about blindness and enhanced abilities remain unanswered: Can blindness improve more complex auditory abilities and, if so, can these changes be triggered after only a few minutes of visual deprivation, similar to those seen with tactile abilities?

"When we speak or play a musical instrument, the sounds have specific harmonic relations. In other words, if we play a certain note on a piano, that note has many related ‘layers.’ However, we don’t hear all of these layers because our brain simply associates them all together and we only hear the lowest one," Champoux explains.

It’s through this complex computation based on specific components of the sound that the brain can interpret and distinguish auditory signals coming from different people or instruments. The ability to identify harmonicity — the harmonic relation between sounds — is one of the most powerful factors involved in interpreting our auditory surroundings.

"Harmonicity can easily be evaluated using a simple task in which similar harmonic layers are set up and one of them is gradually modified until the individual notices two layers instead of one," says Champoux. "In our study, healthy individuals completed such a task while blindfolded. This task was administered twice, separated by a 90-minute interval during which the participants conversed with the experimenter in a quiet room. Half of the participants kept the blindfold on during the interval period, depriving them of all visual input, while the other half removed their blindfolds."

They found no significant differences between the two groups in their ability to differentiate harmonicity prior to visual deprivation. However, the results of the testing session following visual deprivation revealed that visually deprived individuals performed significantly better than the group that took their blindfolds off.

"Regardless of the neural basis for such an enhancement, our results suggest that the potential for change in auditory perception is much greater than previously assumed," Champoux notes.

Source: Science Daily

May 9, 2012 · 6 notes
#science #neuroscience #brain #psychology
The Risk of Listening to Amplified Music

ScienceDaily (May 8, 2012) — Listening to amplified music for less than 1.5 hours produces measurable changes in hearing ability that may place listeners at risk of noise-induced hearing loss, new research shows. While further research is needed to firmly establish this risk, the investigation is significant because it provides the first acoustical data for a new method to assess the potential harm from a widespread cultural behavior: “leisure listening” to amplified music, whether in live environments or through headphones.

A team of Danish acoustics researchers present the results of their preliminary study at the Acoustics 2012 meeting in Hong Kong, May 13-18, a joint meeting of the Acoustical Society of America (ASA), Acoustical Society of China, Western Pacific Acoustics Conference, and the Hong Kong Institute of Acoustics. Their goal is to help develop recommendations for how sound engineers, musicians, event organizers, and the general public should safely enjoy amplified music so they are protected from hearing loss — just as workers are now protected by occupational health standards.

Explains Rodrigo Ordonez, Ph.D., lead scientist of the Danish team from Aalborg University’s Department of Electronic Systems: “Modern low-distortion, high-power loudspeaker systems and headphones make it easy for people to be exposed to potentially harmful sound levels at discotheques, concerts, or while using portable music players.”

He adds that in the realm of industrial noise and work-related sound exposures, decades of experience and personal tragedy (many workers lost hearing from factory conditions) have produced the hearing-damage risk criteria currently used. Based on well-documented acoustical parameters, these criteria outline measurement procedures and expected impact on hearing.

"Yet when it comes to musical sound exposure — and in particular, amplified music — it is not known if the same measures used for industrial noise will accurately describe the effects on hearing and the risk these behaviors pose," Dr. Ordonez says.

To investigate the potential health risk from amplified music, the team measured sounds known as “otoacoustic emissions” as an index of auditory function. These are sounds generated within the inner ear in response to sound stimuli, and they can be measured in the ear canals of people who have healthy hearing. Research shows that otoacoustic emissions disappear when the inner ear is damaged. In this study, the researchers measured otoacoustic emissions to gauge changes in hearing ability before and after exposure to amplified music, testing this method in a live concert environment. Comparing how these two sets of measures change after a sound exposure with the acoustical parameters of the amplified music can lead to a better understanding of how our hearing is affected.

Results revealed two main findings: One is that it is possible to measure changes in hearing after exposures of relatively short duration, less than 1.5 hours. The second is that there are noticeable individual differences in sound exposure levels, as well as in the changes on otoacoustic emissions produced by similar exposure conditions.

Next steps in the team’s work include refining their measurement methods and describing the biophysical effects and mechanics that music sound levels have on individuals. Ultimately they hope to provide data and a scientific rationale on which to establish damage risk criteria for music sound exposure.

Source: Science Daily

May 9, 2012 · 8 notes
#science #neuroscience #brain #psychology
Scientists Tuning in to How You Tune out Noise

ScienceDaily (May 8, 2012) — Although we have little awareness that we are doing it, we spend most of our lives filtering out many of the sounds that permeate our lives and acutely focusing on others — a phenomenon known as auditory selective attention. In research that could some day lead to the development of improved devices allowing users to control things like wheelchairs through thought alone, hearing scientists at the University of Washington (UW) are attempting to tease apart the process.

The work will be presented at the Acoustics 2012 meeting in Hong Kong, May 13-18, a joint meeting of the Acoustical Society of America (ASA), Acoustical Society of China, Western Pacific Acoustics Conference, and the Hong Kong Institute of Acoustics.

Auditory selective attention is extremely important in everyday life, notes UW postdoctoral researcher Ross Maddox. “In situations as mundane as ordering your morning cup of coffee, you must focus on the barista while tuning out the loud hiss of the espresso machine and the annoying cell phone conversation happening in line right behind you,” says Maddox. “However, the mechanisms behind selective attention are still not well understood.” In addition, some individuals suffer from Central Auditory Processing Disorder (CAPD), “which means they have normal hearing when tested by an audiologist,” he says, “but they are completely lost in loud settings like restaurants and airports.”

To determine how auditory selective attention works — and perhaps how it fails in people with CAPD — Maddox, along with Adrian K.C. Lee, an assistant professor of speech and hearing sciences, and colleague Willy Cheung, created laboratory situations that promoted the breakdown of the process. The researchers had 10 subjects try to focus their attention on just one target sound — a continuously repeating utterance of a single letter — among a total of 4, 6, 8, or 12 such sounds. The subjects had to determine when an “oddball” item (the letter “R,” chosen because it doesn’t rhyme with any other letter) was inserted into the target sound stream.

"Most studies systematically degrade sounds and measure the effects on listeners’ performance," Maddox explains. "Here, we made the target sound as easy to distinguish from all the other sounds present as possible, and tested the upper limit on the number of sounds a listener could tune out, given all these acoustical advantages."

Unsurprisingly, it is harder to tune in to just one stream when the number of streams increases. However, study subjects did better than expected — successfully identifying the target 70 percent of the time in the most difficult conditions. Repeating letters faster did make the task harder — although with faster repetition, listeners more quickly learn what the letter they’re listening to sounds like, “so there is a tradeoff involved when deciding on repetition speed,” Maddox says.

The work, Maddox and colleagues say, is a first step toward developing an auditory brain-computer interface (BCI) — a device that reads brain activity to allow users to control computers or machines such as wheelchairs. “We hope to create a system that presents a user with an auditory ‘menu’ of sounds — similar to the letter streams here — and allows the listener to make a choice by reading their brainwaves to determine which sound they are focusing on. The more sound streams a user is able to tune out, the more menu options we can present at a single time.”

Source: Science Daily

May 9, 2012 · 9 notes
#science #neuroscience #psychology #brain
Gestures Fulfill a Big Role in Language

ScienceDaily (May 8, 2012) — People of all ages and cultures gesture while speaking, some much more noticeably than others. But is gesturing uniquely tied to speech, or is it, rather, processed by the brain like any other manual action?

image

Scientists have discovered that actual actions on objects, such as physically stirring a spoon in a cup, have less of an impact on the brain’s understanding of speech than simply gesturing as if stirring a spoon in a cup. (Credit: Image courtesy of Acoustical Society of America (ASA))

A U.S.-Netherlands research collaboration delving into this tie discovered that actual actions on objects, such as physically stirring a spoon in a cup, have less of an impact on the brain’s understanding of speech than simply gesturing as if stirring a spoon in a cup. This is surprising because there is less visual information contained in gestures than in actual actions on objects. In short: Less may actually be more when it comes to gestures and actions in terms of understanding language.

Spencer Kelly, associate professor of Psychology, director of the Neuroscience program, and co-director of the Center for Language and Brain at Colgate University, and colleagues from the National Institutes of Health and Max Planck Institute for Psycholinguistics will present their research at the Acoustics 2012 meeting in Hong Kong, May 13-18, a joint meeting of the Acoustical Society of America (ASA), Acoustical Society of China, Western Pacific Acoustics Conference, and the Hong Kong Institute of Acoustics.

Among their key findings is that gestures — more than actions — appear to make people pay attention to the acoustics of speech. When we see a gesture, our auditory system expects to also hear speech. But this is not what the researchers found in the case of manual actions on objects.

Just think of all the actions you’ve seen today that occurred in the absence of speech. “This special relationship is interesting because many scientists have argued that spoken language evolved from a gestural communication system — using the entire body — in our evolutionary past,” points out Kelly. “Our results provide a glimpse into this past relationship by showing that gestures still have a tight and perhaps special coupling with speech in present-day communication. In this way, gestures are not merely add-ons to language — they may actually be a fundamental part of it.”

A better understanding of the role hand gestures play in how people understand language could lead to new audio and visual instruction techniques to help people overcome major challenges with language delays and disorders or learning a second language.

What’s next for the researchers? “We’re interested in how other types of visual inputs, such as eye gaze, mouth movements, and facial expressions, combine with hand gestures to impact speech processing. This will allow us to develop even more natural and effective ways to help people understand and learn language,” says Kelly.

Source: Science Daily

May 9, 2012 · 9 notes
#science #neuroscience #psychology #brain
Psychologists reveal how emotion can shut down high-level mental processes without our knowledge

May 8, 2012

Psychologists at Bangor University believe that they have glimpsed for the first time, a process that takes place deep within our unconscious brain, where primal reactions interact with higher mental processes. Writing in the Journal of Neuroscience, they identify a reaction to negative language inputs which shuts down unconscious processing.

For the last quarter of a century, psychologists have been aware of, and fascinated by, the fact that our brain can process high-level information such as meaning outside consciousness. What the psychologists at Bangor University have discovered is the reverse: that our brain can unconsciously ‘decide’ to withhold information by preventing access to certain forms of knowledge.

The psychologists extrapolate this from their most recent findings working with bilingual people. Building on their previous discovery that bilinguals subconsciously access their first language when reading in their second language, the psychologists at the School of Psychology and Centre for Research on Bilingualism have now made the surprising discovery that our brain shuts down that same unconscious access to the native language when faced with a negative word such as war, discomfort, inconvenience, or unfortunate.

They believe that this provides the first proven insight into a hitherto unproven process in which our unconscious mind blocks information from our conscious mind or higher mental processes.

This finding breaks new ground in our understanding of the interaction between emotion and thought in the brain. Previous work on emotion and cognition has already shown that emotion affects basic brain functions such as attention, memory, vision and motor control, but never at such a high processing level as language and understanding.

Key to this is the understanding that people have a greater reaction to emotional words and phrases in their first language, which is why people speak to their infants and children in their first language despite living in a country that speaks another language and despite fluency in the second. It has been recognised for some time that anger, swearing or discussing intimate feelings has more power in a speaker’s native language. In other words, emotional information lacks the same power in a second language as in a native language.

Dr Yan Jing Wu of the University’s School of Psychology said: “We devised this experiment to unravel the unconscious interactions between the processing of emotional content and access to the native language system. We think we’ve identified, for the first time, the mechanism by which emotion controls fundamental thought processes outside consciousness.

"Perhaps this is a process that resembles the mental repression mechanism that people have theorised about but never previously located."

So why would the brain block access to the native language at an unconscious level?

Professor Guillaume Thierry explains: “We think this is a protective mechanism. We know that in trauma, for example, people behave very differently. Surface conscious processes are modulated by a deeper emotional system in the brain. Perhaps this brain mechanism spontaneously minimises the negative impact of disturbing emotional content on our thinking, to prevent it from causing anxiety or mental discomfort.”

He continues: “We were extremely surprised by our finding. We were expecting to find modulation between the different words, and perhaps a heightened reaction to the emotional word, but what we found was the exact opposite of what we expected: a cancellation of the response to the negative words.”

The psychologists made this discovery by asking English-speaking Chinese people whether word pairs were related in meaning. Some of the word pairs were related in their Chinese translations. Although the participants did not consciously acknowledge a relation, measurements of electrical activity in the brain revealed that they were unconsciously translating the words. Uncannily, however, this activity was not observed when the English words had a negative meaning.
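To make the logic of that design concrete, here is a minimal, purely illustrative Python sketch. It is not the Bangor team’s stimuli or analysis code: the word pairs and ERP amplitudes below are invented placeholders. It only shows the shape of the comparison described in the article, contrasting a hidden-translation priming effect for neutral word pairs against the same effect for negative word pairs.

# Purely illustrative sketch, not the Bangor group's code: hypothetical stimuli
# and made-up ERP amplitudes, used only to show the shape of the comparison.
import numpy as np

# Each entry: (English word pair, emotional valence, whether the Chinese
# translations of the two words hide a relation between them).
stimuli = [
    (("spoon", "ladder"), "neutral", True),
    (("apple", "desk"), "neutral", False),
    (("war", "poster"), "negative", True),
    (("failure", "lamp"), "negative", False),
]

# Hypothetical per-pair ERP amplitudes (e.g. a priming-sensitive component,
# in microvolts); in a real experiment these come from the EEG recordings.
rng = np.random.default_rng(0)
amplitudes = {pair: rng.normal(-2.0, 1.0) for pair, _, _ in stimuli}

def priming_effect(valence):
    # Mean amplitude for hidden-related pairs minus hidden-unrelated pairs.
    related = [amplitudes[p] for p, v, hidden in stimuli if v == valence and hidden]
    unrelated = [amplitudes[p] for p, v, hidden in stimuli if v == valence and not hidden]
    return float(np.mean(related) - np.mean(unrelated))

# The reported finding corresponds to a measurable priming effect for neutral
# word pairs and a vanishing (cancelled) effect for negative word pairs.
print("neutral priming effect:", priming_effect("neutral"))
print("negative priming effect:", priming_effect("negative"))

Because the amplitudes here are random placeholders, the printed numbers are meaningless; the point is only the structure of the contrast, in which real data would show a reliable difference for neutral pairs and none for negative pairs.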

Provided by Bangor University

Source: medicalxpress.com

May 9, 2012
#science #neuroscience #brain #psychology