Neuroscience

Balancing connections for proper brain function

June 22, 2012

Neuropsychiatric conditions such as autism, schizophrenia and epilepsy involve an imbalance between two types of synapses in the brain: excitatory synapses that release the neurotransmitter glutamate, and inhibitory synapses that release the neurotransmitter GABA. Little is known about the molecular mechanisms underlying development of inhibitory synapses, but a research team from Japan and Canada has reported that a molecular signal between adjacent neurons is required for the development of inhibitory synapses.

Figure 1: Compared with the brains of normal animals (left), mice lacking the Slitrk3 gene (right) have a reduced density of inhibitory synapses in the hippocampus. Reproduced from Ref. 1 © 2012 Jun Aruga, RIKEN Brain Science Institute

In earlier work, the researchers—led by Jun Aruga of the RIKEN Brain Science Institute, Wako, and Ann Marie Craig of the University of British Columbia, Vancouver—showed that a membrane protein called Slitrk2 organizes signaling molecules at synapses. They therefore tested whether five related proteins are involved in inhibitory synapse development. They cultured immature hippocampal neurons with non-neural cells expressing each of the six Slitrk proteins, and found that Slitrk3, but not any other Slitrk protein, induced clustering of VGAT, a GABA transporter protein found only at inhibitory synapses.

The researchers also examined the localization of Slitrk3 by tagging it with yellow fluorescent protein and introducing it into cultured hippocampal cells. This revealed that Slitrk3 co-localizes in the dendrites of neurons with gephyrin, a scaffold protein found only in inhibitory synapses. They then blocked Slitrk3 synthesis, and found that it led to a significant reduction in the number of inhibitory synapses.

To confirm these findings, the researchers generated a strain of genetically engineered mice lacking the Slitrk3 gene. These animals had significantly fewer inhibitory synapses than normal animals (Fig. 1), and therefore impaired GABA neurotransmission. They were also susceptible to epileptic seizures. From a screen for proteins that bind to Slitrk3, Aruga, Craig and colleagues identified the protein PTPδ as its only binding partner. Introducing PTPδ fused to yellow fluorescent protein into cultured hippocampal neurons showed that it is expressed in axons, complementing the dendritic localization of Slitrk3. Blocking PTPδ synthesis prevented the induction of inhibitory synapses by the Slitrk3 protein.

These results demonstrated that the interaction between Slitrk3 on dendrites and PTPδ on axons of adjacent cells is required for the proper development of inhibitory synapses and for inhibitory neurotransmission in the brain.

“We are now examining whether the balance of excitatory and inhibitory synapses is affected by other members of the Slitrk protein family,” says Aruga. “It is possible that Slitrk3 and other Slitrk proteins are acting synergistically or antagonistically. We are also clarifying whether Slitrk3 is involved in any neurological disorders.”

Provided by RIKEN

Source: medicalxpress.com

Jun 23, 2012 · 29 notes
#science #neuroscience #brain #psychology #synapses
Preventing or Better Managing Diabetes May Prevent Cognitive Decline

ScienceDaily (June 21, 2012) — Preventing diabetes or delaying its onset has been thought to stave off cognitive decline — a connection strongly supported by the results of a 9-year study led by researchers at the University of California, San Francisco (UCSF) and the San Francisco VA Medical Center.

Earlier studies have looked at cognitive decline in people who already had diabetes. The new study is the first to demonstrate that the greater risk of cognitive decline is also present among people who develop diabetes later in life. It is also the first study to link the risk of cognitive decline to the severity of diabetes.

The result is the latest finding to emerge from the Health, Aging, and Body Composition (Health ABC) Study, which enrolled 3,069 adults over 70 at two community clinics in Memphis, TN and Pittsburgh, PA beginning in 1997. All the patients provided periodic blood samples and took regular cognitive tests over time.

When the study began, hundreds of those patients already had diabetes. A decade later, many more of them had developed diabetes, and many also suffered cognitive decline. As described this week in Archives of Neurology, those two health outcomes were closely linked.

People who had diabetes at the beginning of the study showed a faster cognitive decline than people who developed it during the course of the study — and these people, in turn, tended to be worse off than people who never developed diabetes at all. The study also showed that patients with more severe diabetes who did not control their blood sugar levels as well suffered faster cognitive declines.

"Both the duration and the severity of diabetes are very important factors," said Kristine Yaffe, MD, the lead author of the study. "It’s another piece of the puzzle in terms of linking diabetes to accelerated cognitive aging."

An important question for future studies, she added, would be to ask if interventions that would effectively prevent, delay or better control diabetes would also lower people’s risk of cognitive impairment later in life.

Yaffe is the Roy and Marie Scola Endowed Chair of Psychiatry; professor in the UCSF departments of Psychiatry, Neurology and Epidemiology and Biostatistics; and Chief of Geriatric Psychiatry and Director of the Memory Disorders Clinic at the San Francisco VA Medical Center.

Diabetes and Cognitive Decline

Diabetes is a chronic and complex disease marked by high levels of sugar in the blood that arise due to problems with the hormone insulin, which regulates blood sugar levels. It is caused by an inability to produce insulin (type 1) or an inability to respond correctly to insulin (type 2).

A major health concern in the United States, diabetes of all types affects an estimated 8.3 percent of the U.S. population — some 25.8 million Americans — and costs U.S. taxpayers more than $200 billion annually. In California alone, an estimated 4 million people (one out of every seven adults) have type 2 diabetes, and millions more are at risk of developing it. These numbers are poised to explode in the next half century if more is not done to prevent the disease.
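The cited figures hang together; a quick back-of-the-envelope check (this calculation is ours, not part of the original reporting):

```python
# 25.8 million Americans is said to be 8.3 percent of the U.S. population.
# Dividing back out should recover a total population of roughly 311 million,
# which matches the U.S. population around 2012.
affected = 25.8e6
implied_population = affected / 0.083
print(f"{implied_population / 1e6:.0f} million")
```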

Over the last several decades, scientists have come to appreciate that diabetes affects many tissues and organs of the body, including the brain and central nervous system — notably by placing people at risk of cognitive decline later in life.

In their study, the scientists looked at a blood marker known as glycosylated hemoglobin (HbA1c), a standard measure of the severity of diabetes and of how well it is controlled over time. The marker reflects chronically high blood sugar, because sugar molecules become permanently attached to hemoglobin proteins in the blood. Yaffe and her colleagues found that higher levels of this biomarker were associated with more severe cognitive dysfunction.
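Glycosylated hemoglobin is usually reported as HbA1c, a percentage that maps onto an estimated average blood glucose level. The sketch below uses the standard ADAG regression formula for that conversion; it is general clinical background, not a calculation from this study:

```python
def estimated_average_glucose(hba1c_percent: float) -> float:
    """Estimated average glucose in mg/dL from HbA1c (%), using the
    ADAG study regression: eAG = 28.7 * HbA1c - 46.7."""
    return 28.7 * hba1c_percent - 46.7

# An HbA1c of 7.0% corresponds to an average glucose of about 154 mg/dL;
# higher values indicate poorer long-term blood sugar control.
print(round(estimated_average_glucose(7.0)))
```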

While the underlying mechanism that accounts for the link between diabetes and risk of cognitive decline is not completely understood, Yaffe said, it may be related to a human protein known as insulin degrading enzyme, which plays an important role in regulating insulin, the key hormone linked to diabetes. This same enzyme also degrades a protein in the brain known as beta-amyloid, a brain protein linked to Alzheimer’s disease.

Source: Science Daily

Jun 22, 2012 · 13 notes
#science #neuroscience #diabetes #brain #dementia
New Candidate Drug Stops Cancer Cells, Regenerates Nerve Cells

ScienceDaily (June 21, 2012) — Scientists have developed a small-molecule-inhibiting drug that in early laboratory cell tests stopped breast cancer cells from spreading and also promoted the growth of early nerve cells called neurites.

Researchers from Cincinnati Children’s Hospital Medical Center report their findings online June 21 in Chemistry & Biology. The scientists named their lead drug candidate “Rhosin” and hope future testing shows it to be promising for the treatment of various cancers or nervous system damage.

The inhibitor overcomes a number of previous scientific challenges by precisely targeting a single component of a cell signaling protein complex called Rho GTPases. This complex regulates cell movement and growth throughout the body. Miscues in Rho GTPase processes are also widely implicated in human diseases, including various cancers and neurologic disorders.

"Although still years from clinical development, in principle Rhosin could be useful in therapy for many kinds of cancer or possibly neuron and spinal cord regeneration," said Yi Zheng, PhD, lead investigator and director of Experimental Hematology and Cancer Biology at Cincinnati Children’s. "We’ve performed in silico (computerized) rational drug design, pharmacological characterization and cell tests in the laboratory, and we are now starting to work with mouse models."

Because the role of Rho GTPases in cellular processes and cancer formation is well established, researchers have spent years trying to identify safe and effective therapeutic targets for specific parts of the protein complex. In particular, scientists have focused on the center protein in the complex called RhoA, which is essential for the signaling function of the complex. In breast cancer for example, increased RhoA activity makes the cancer cells more invasive and causes them to spread, while a deficiency of RhoA suppresses cancer growth and progression.

Despite this knowledge, past efforts to develop an effective small-molecule inhibitor for RhoA have failed, explained Zheng, who has studied Rho GTPases for over two decades. Most roadblocks stem from a lack of specificity in how researchers have been able to target RhoA, a resulting lack of efficiency in affecting molecular processes, problems with toxicity, and the inability to find a workable drug design.

For the current study, Zheng and his colleagues started with the extensive body of research from Cincinnati Children’s and other institutions describing the processes and functions of Rho GTPases. They then used high-throughput computerized molecular screening and computerized drug design to reveal a druggable target site. This also provided a preliminary virtual simulation of the potential effectiveness of candidate drugs.

A key challenge to binding a small-molecule inhibitor to RhoA is the protein’s globular structure and lack of surface pocket areas suitable for easy binding, Zheng said. The unique chemical structure of the lead compound identified by researchers, Rhosin, allows it to effectively bind to two shallow surface grooves on RhoA. This enables the candidate drug to take root and begin affecting cells. The two-legged configuration of Rhosin also suggests a useful drug-design strategy for more effectively targeting difficult molecular sites like RhoA.

The researchers also wanted to make sure Rhosin effectively blocked what are known as guanine nucleotide exchange factors (GEFs). Guanine nucleotides are a critical energy source and signaling component of cells, and activation of GEFs is required to set off the regulatory signaling of GTPases (GTP stands for guanosine triphosphate).

After conducting a series of laboratory cell tests to verify the targeting and binding capabilities of Rhosin to RhoA, the researchers then tested the candidate drug’s impact on cultured breast cancer cells and nerve cells.

In tests on human breast cancer cells, Rhosin inhibited cell growth and the formation of mammary spheres in a dose-dependent manner, acting specifically on RhoA molecular targets without disrupting other critical cellular processes. Rhosin did not affect non-cancerous breast cells. This, along with other tests the scientists performed, indicated Rhosin’s effectiveness in targeting RhoA-mediated breast cancer proliferation, according to the researchers.

Researchers also treated an extensively tested line of neuronal cells with Rhosin, along with nerve growth factor, a protein that is important to the growth and survival of neurons. Rhosin worked with nerve growth factor in a dose-dependent way to promote the proliferation of branching neurites from the neuronal cells. Neurites are young or early stage extensions from neurons required for neuronal communications.
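Dose dependence of the kind reported in both assays is conventionally summarized with a Hill-type curve, in which the effect rises smoothly with dose and reaches half-maximum at the IC50. The sketch below is purely illustrative; the IC50 and Hill coefficient are placeholder values, not measurements from the study:

```python
def fraction_inhibited(dose_um: float, ic50_um: float = 10.0, hill_n: float = 1.0) -> float:
    """Fraction of the maximal effect produced at a given dose (Hill equation).
    ic50_um and hill_n are hypothetical placeholders, not values from the study."""
    return dose_um**hill_n / (ic50_um**hill_n + dose_um**hill_n)

# The effect rises with dose and reaches half-maximum at the IC50.
for dose in (1.0, 10.0, 100.0):
    print(f"{dose:6.1f} uM -> {fraction_inhibited(dose):.2f}")
```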

Source: Science Daily

Jun 22, 2012 · 115 notes
#science #neuroscience
Eating Disorder Behaviors and Weight Concerns Are Common in Women Over 50

ScienceDaily (June 21, 2012) — Eating disorders are commonly seen as an issue faced by teenagers and young women, but a new study reveals that age is no barrier to disordered eating. Among women aged 50 and over, 3.5% report binge eating, nearly 8% report purging, and more than 70% are trying to lose weight. The study, published in the International Journal of Eating Disorders, also revealed that 62% of women claimed that their weight or shape negatively impacted their lives.

The researchers, led by Dr Cynthia Bulik, Director of the University of North Carolina Eating Disorders Program, reached 1,849 women from across the USA participating in the Gender and Body Image Study (GABI) with a survey titled, ‘Body Image in Women 50 and Over — Tell Us What You Think and Feel.’

"We know very little about how women aged 50 and above feel about their bodies," said Bulik. "An unfortunate assumption is that they ‘grow out of’ body dissatisfaction and eating disorders, but no one has really bothered to ask. Since most research focuses on younger women, our goal was to capture the concerns of women in this age range to inform future research and service planning."

The average age of the participants was 59, while 92% were white. More than a quarter, 27%, were obese, 29% were overweight, 42% were normal weight and 2% were underweight.

Results revealed that eating disorder symptoms were common. About 8% of women reported purging in the last five years and 3.5% reported binge eating in the last month. These behaviors were most prevalent in women in their early 50s, but also occurred in women over 75.

When it came to weight issues, 36% of the women reported spending at least half their time in the last five years dieting, 41% checked their body daily and 40% weighed themselves a couple of times a week or more.

62% of women claimed that their weight or shape negatively impacted their life, 79% said that it affected their self-perception and 64% said that they thought about it daily.

The women reported resorting to a variety of unhealthy methods to change their body, including diet pills (7.5%), excessive exercise (7%), diuretics (2.5%), laxatives (2%) and vomiting (1%).

Two-thirds of the women (66%) were unhappy with their overall appearance; dissatisfaction was highest with their stomach (84%) and body shape (73%).

"The bottom line is that eating disorders and weight and shape concerns don’t discriminate on the basis of age," concluded Bulik. "Healthcare providers should remain alert for eating disorder symptoms and weight and shape concerns that may adversely influence women’s physical and psychological wellbeing as they mature."

Source: Science Daily

Jun 22, 2012 · 21 notes
#science #neuroscience #psychology #eating disorders
Functional Links Between Autism and Genes Explained

ScienceDaily (June 21, 2012) — A pioneering report of genome-wide gene expression in autism spectrum disorders (ASDs) finds genetic changes that help explain why one person has an ASD and another does not. The study, published by Cell Press on June 21 in The American Journal of Human Genetics, pinpoints ASD risk factors by comparing changes in gene expression with DNA mutation data in the same individuals. This innovative approach is likely to pave the way for future personalized medicine, not just for ASD but also for any disease with a genetic component.

ASDs are a heterogeneous group of developmental conditions characterized by social deficits, difficulty communicating, and repetitive behaviors. ASDs are thought to be highly heritable, meaning that they run in families. However, the genetics of autism are complex.

Researchers have found rare changes in the number of copies of defined genetic regions that associate with ASD. Although there are some hot-spot regions containing these alterations, very few genetic changes are exactly alike. Similarly, no two autistic people share the exact same symptoms. To discover how these genetic changes might affect gene transcription and, thus, the presentation of the disorder, Rui Luo, a graduate student in the Geschwind lab at UCLA, studied 244 families in which one child (the proband) was affected with an ASD and one was not.

In addition to identifying several potential new regions where copy-number variants (CNVs) are associated with ASDs, Geschwind’s team found genes within these regions to be significantly misregulated in ASD children compared with their unaffected siblings. “Strikingly, we observed a higher incidence of haploinsufficient genes in the rare CNVs in probands than in those of siblings, strongly indicating a functional impact of these CNVs on expression,” says Geschwind. Haploinsufficiency occurs when only one copy of a gene is functional; the result is that the body cannot produce a normal amount of protein. The researchers also found a significant enrichment of misexpressed genes in neural-related pathways in ASD children. Previous research has found that these pathways include other genetic variants associated with autism, which Geschwind explains further legitimizes the present findings.

Source: Science Daily

Jun 22, 2012 · 30 notes
#science #neuroscience #psychology #autism #genetics
Where is the Love?

June 21, 2012 By Janice Wood

Thanks to science, we know that love lives in the brain, not the heart.

Now a new international study has mapped out where love and sexual desire are in the brain.

“No one has ever put these two together to see the patterns of activation,” says Dr. Jim Pfaus, professor of psychology at Concordia University.

“We didn’t know what to expect – the two could have ended up being completely separate. It turns out that love and desire activate specific but related areas in the brain.”

Working with colleagues in the United States and Switzerland, Pfaus analyzed the results of 20 separate studies that examined brain activity while subjects engaged in tasks such as viewing erotic pictures or looking at photographs of their significant others. Pooling this data enabled the scientists to form a map of love and desire in the brain.

They found that two brain structures, the insula and the striatum, are responsible for tracking the progression from sexual desire to love.

The insula is a portion of the cerebral cortex folded deep within an area between the temporal lobe and the frontal lobe, while the striatum is located nearby, inside the forebrain.

According to the researchers, love and sexual desire activate different areas of the striatum. The area activated by sexual desire is usually turned on by things that are inherently pleasurable, such as sex or food.

The area activated by love is involved in the process of conditioning in which things paired with reward or pleasure are given inherent value. That is, as feelings of sexual desire develop into love, they are processed in a different place in the striatum, the researchers explain.

This area of the striatum is also the part of the brain associated with drug addiction. Pfaus says there is good reason for this.

“Love is actually a habit that is formed from sexual desire as desire is rewarded,” he explains. “It works the same way in the brain as when people become addicted to drugs.”

However, the habit is not a bad one, he said, noting that love activates different pathways in the brain that are involved in monogamy and pair bonding. Some areas in the brain are actually less active when a person feels love than when they feel desire, he added.

“While sexual desire has a very specific goal, love is more abstract and complex, so it’s less dependent on the physical presence of someone else,” says Pfaus.

Source: PsychCentral

Jun 22, 2012 · 136 notes
#science #neuroscience #brain #psychology
Mind games: Mental exercises are key to better brain function

June 20, 2012 By Robin Erb

Go ahead - do it: Grab a pencil. Right now. Write your name backward. And upside down. Awkward, right?

But if researchers and neurologists are correct, doing exercises like these just might buy you a bit more time with a healthy brain.

Some research suggests that certain types of mental exercises - whether they are memory games on your mobile device or jotting down letters backward - might help our gray matter maintain concentration, memory and visual and spatial skills over the years.

"There is some evidence of a use-it-or-lose-it phenomenon," says Dr. Michael Maddens, chief of medicine at Beaumont Hospital, Royal Oak, Mich.

Makers of computer brain games, in fact, are tapping into a market of consumers who have turned to home treadmills and gym memberships to maintain their bodies, and now worry that aging might take its toll on their mental muscle as well.

But tweaking everyday routines can help.

Like brushing your teeth with your non-dominant hand. Or crossing your arms the opposite way you’re used to, says Cheryl Deep, who leads “Brain Neurobics” sessions on behalf of the Wayne State Institute of Gerontology.

At a recent session in Novi, Mich., Deep encouraged several dozen senior citizens to flip the pictures in their homes upside-down. It might baffle houseguests, but the exercise crowbars the brain out of familiar grooves cut deep by years of mindless habit.

"Every time you walk past and look, your brain has to rotate that image," Deep says. "Brain neurobics is about getting us out of those ruts, those pathways, and shaking things up."

Participants were asked to call out the color of ink that flashed on a screen in front of them. The challenge: The colors spelled out names of other colors. Blue ink spelled o-r-a-n-g-e, for example.

Several in the crowd at Waltonwood Senior Living hesitated, a few scrunching up their faces in concentration. The first instinct is to say “orange.”
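The color-word exercise described above is a classic Stroop task. A minimal sketch of how incongruent trials might be generated (the color list and trial format are illustrative, not taken from the session):

```python
import random

COLORS = ["red", "blue", "green", "orange"]

def incongruent_trial(rng: random.Random) -> dict:
    """One Stroop trial in which the ink color never matches the word shown."""
    ink = rng.choice(COLORS)
    word = rng.choice([c for c in COLORS if c != ink])
    # The correct response is the ink color, not the word the letters spell.
    return {"word": word, "ink": ink, "correct_answer": ink}

rng = random.Random(42)
print(incongruent_trial(rng))
```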

In another exercise, participants had to try to name as many red foods as possible. Apple? Sure that’s an easy one. It took a while, but the crowd eventually made its way to pomegranate and pimento.

Elissa and Hal Leider chuckled with friends as they tested their recall.

Hal Leider, 82, a retired carpenter, was diagnosed with early-stage Alzheimer’s, and he tries to challenge himself mentally and physically by bowling, shooting pool and playing poker. “I think anything we can do might be helpful,” says Elissa Leider, 74.

The idea of mental workouts marks a dramatic shift in how we understand the brain these days.

"We want to stretch and flex and push" the brain, says Moriah Thomason, assistant professor in Wayne State University School of Medicine’s pediatrics department and in the Merrill Palmer Skillman Institute for Child and Family Development.

Thomason also is a scientific adviser to http://www.Lumosity.com, one of the fastest-growing brain game websites.

"We used to think that what you’re born with is what you have through life. But now we understand that the brain is a lot more plastic and flexible than we ever appreciated," she says.

Still, like the rest of your body, aging takes its toll, she says.

The protective covering of the neural cells - white matter - begins to shrink first. Neural and glial cells, often called the gray matter, begin to shrink as well, but more slowly. Neurotransmitters, or chemical messengers, decrease.

But challenging the brain stimulates neural pathways - those tentacles that look like tree branches in a cluster of brain cells. It boosts the brain’s chemistry and connectivity, refueling the entire engine.

"Certain activities will lay more neural pathways that can be more readily re-engaged," Thomason says. "The hope is that there are ways to train and strengthen these pathways."

Maddens explains it this way: Consider the neurons of your brain like electrical wires and the white matter like the insulation. When the insulation breaks down over time, things can misfire.

In lab studies, people who engage in mentally challenging games do, in fact, show improvement in cognitive functioning. They get faster at speed games and stronger in memory games, for example.

What’s less clear is whether this improvement transfers to everyday tasks, like remembering where you parked the car or the name of your child’s teacher, both Thomason and Maddens say.

But when it comes to physical exercise, researchers and clinicians agree: it is good for the brain and has been linked to lower rates of chronic disease. Good nutrition is essential too.

Oxygen, itself, is essential, Deep said: “Your brain is an oxygen hog.”

Diet, exercise and mental maneuvers all may boost brain health in ways science still doesn’t understand. In the best cases, the right mix might stave off the effects of Alzheimer’s and other age-related disease too, Maddens says.

All this is good news for an aging, stressed out, and too-busy society, he says.

Reading a book, engaging with friends or going out for a walk and paying attention to what’s around you - that’s not really about goofing off. Rather, it’s critical time that stimulates neural pathways and boosts the odds of long-time brain health.

"It’s talking to friends. It’s getting out socially. It’s engaging in life. The question is ‘How do I force myself to learn?’" Thomason says.

The same might be true when it comes to mentally challenging computer games.

Says Maddens: “Would I have patients playing computer games eight hours a day in hopes that they can delay Alzheimer’s by two months? No. But you can enjoy (playing such games) and possibly get a benefit from it, too.”

Jun 22, 2012 · 66 notes
#science #neuroscience #brain #psychology
Confusion Can Be Beneficial for Learning

ScienceDaily (June 20, 2012) — Most of us assume that confidence and certainty are preferred over uncertainty and bewilderment when it comes to learning complex information. But a new study led by Sidney D’Mello of the University of Notre Dame shows that confusion when learning can be beneficial if it is properly induced, effectively regulated and ultimately resolved.

(Image credit: © Ana Blazic Pavlovic / Fotolia)

The study will be published in a forthcoming issue of the journal Learning and Instruction.

Notre Dame psychologist and computer scientist D’Mello, whose research areas include artificial intelligence, human-computer interaction and the learning sciences, together with Art Graesser of the University of Memphis, collaborated on the study, which was funded by the National Science Foundation.

They found that by strategically inducing confusion in a learning session on difficult conceptual topics, people actually learned more effectively and were able to apply their knowledge to new problems.

In a series of experiments, subjects learned scientific reasoning concepts through interactions with computer-animated agents playing the roles of a tutor and a peer learner. The animated agents and the subject engaged in interactive conversations where they collaboratively discussed the merits of sample research studies that were flawed in one critical aspect. For example, one hypothetical case study touted the merits of a diet pill, but was flawed because it did not include an appropriate control group. Confusion was induced by manipulating the information the subjects received so that the animated agents sometimes disagreed with each other and expressed contradictory or incorrect information. The agents then asked subjects to decide which opinion had more scientific merit, thereby putting the subject in the hot seat of having to make a decision with incomplete and sometimes contradictory information.

The contradictions triggered confusion and uncertainty, and the subjects who experienced that confusion scored higher on a difficult post-test and could more successfully identify flaws in new case studies.

"We have been investigating links between emotions and learning for almost a decade, and find that confusion can be beneficial to learning if appropriately regulated because it can cause learners to process the material more deeply in order to resolve their confusion," D’Mello says.

According to D’Mello, it is not advisable to intentionally confuse students who are struggling or induce confusion during high-stakes learning activities. Confusion interventions are best for higher-level learners who want to be challenged with difficult tasks, are willing to risk failure, and who manage negative emotions when they occur.

"It is also important that the students are productively instead of hopelessly confused. By productive confusion, we mean that the source of the confusion is closely linked to the content of the learning session, the student attempts to resolve their confusion, and the learning environment provides help when the student struggles. Furthermore, any misleading information in the form of confusion-induction techniques should be corrected over the course of the learning session, as was done in the present experiments."

According to D’Mello, the next step in this body of research is to apply these methods to some of the more traditional domains such as physics, where misconceptions are common.

Source: Science Daily

Jun 21, 2012 · 218 notes
#science #neuroscience #brain #psychology #learning
Understanding of Spinal Muscular Atrophy Improved With Use of Stem Cells

ScienceDaily (June 20, 2012) — Cedars-Sinai’s Regenerative Medicine Institute has pioneered research on how motor-neuron cell-death occurs in patients with spinal muscular atrophy, offering an important clue in identifying potential medicines to treat this leading genetic cause of death in infants and toddlers.

The study, published in the June 19 online issue of PLoS ONE, extends the institute’s work to employ pluripotent stem cells to find a pharmaceutical treatment for spinal muscular atrophy or SMA, a genetic neuromuscular disease characterized by muscle atrophy and weakness.

"With this new understanding of how motor neurons die in spinal muscular atrophy patients, we are an important step closer to identifying drugs that may reverse or prevent that process," said Clive Svendsen, PhD, director of the Cedars-Sinai Regenerative Medicine Institute.

Svendsen and his team have investigated this disease for some time now. In 2009, Nature published a study by Svendsen and his colleagues detailing how skin cells taken from a patient with the disorder were used to generate neurons of the same genetic makeup and characteristics of those affected in the disorder; this created a “disease-in-a-dish” that could serve as a model for discovering new drugs.

Because the disease is unique to humans, earlier animal-based approaches had been unreliable in predicting how it occurs in patients. In the research published in PLoS ONE, the team reproduced this model with skin cells from multiple patients, taking the cells back in time to a pluripotent stem cell state (iPS cells) and then driving them forward to study the diseased, patient-specific motor neurons.

Children born with this disorder have a genetic mutation that doesn’t allow their motor neurons to manufacture a critical protein necessary for them to survive. The study found these cells die through apoptosis — the same form of cell death that occurs when the body eliminates old, unnecessary or unhealthy cells. As motor neuron cell death progresses, children with the disease experience increasing paralysis and eventually death. There is currently no effective treatment for this disease. An estimated one in 35 to one in 60 people are carriers, and about one in 100,000 newborns have the condition.
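The carrier figures can be related to birth incidence with a textbook Hardy-Weinberg calculation for an autosomal recessive condition. This is general population-genetics background, not an analysis from the study, and the simplified estimate ignores de novo mutations and non-random mating:

```python
def expected_incidence(carrier_rate: float) -> float:
    """Hardy-Weinberg estimate for an autosomal recessive disease:
    both parents must be carriers (carrier_rate squared), and each
    child of two carriers has a 1-in-4 chance of being affected."""
    return carrier_rate**2 / 4

for one_in in (35, 60):
    inc = expected_incidence(1 / one_in)
    print(f"carrier rate 1 in {one_in} -> roughly 1 in {round(1 / inc):,} births")
```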

"Now we are taking these motor neurons (from multiple children with the disease and in their pluripotent state) and screening compounds that can rescue these cells and create the protein necessary for them to survive," said Dhruv Sareen, director of Cedars-Sinai’s Induced Pluripotent Stem Cell Core Facility and a primary author on the study. "This study is an important stepping stone to guide us toward the right kinds of compounds that we hope will be effective in the model — and then be reproduced in clinical trials."

Source: Science Daily

Jun 21, 2012
#science #neuroscience #brain #psychology #neuron
What's Your Name Again? Lack of Interest, Not Brain's Ability, May Be Why We Forget

ScienceDaily (June 20, 2012) — Most of us have experienced it. You are introduced to someone, only to forget his or her name within seconds. You rack your brain trying to remember, but can’t seem to even come up with the first letter. Then you get frustrated and think, “Why is it so hard for me to remember names?”

You may think it’s just how you were born, but that’s not the case, according to Kansas State University’s Richard Harris, professor of psychology. He says it’s not necessarily your brain’s ability that determines how well you can remember names, but rather your level of interest.

"Some people, perhaps those who are more socially aware, are just more interested in people, more interested in relationships," Harris said. "They would be more motivated to remember somebody’s name."

This goes for people in professions like politics or teaching where knowing names is beneficial. But just because someone can’t remember names doesn’t mean they have a bad memory.

"Almost everybody has a very good memory for something," Harris said.

The key to a good memory is your level of interest, he said. The more interest you show in a topic, the more likely it will imprint itself on your brain. If it is a topic you enjoy, then it will not seem like you are using your memory.

For example, Harris said a few years ago some students were playing a geography game in his office. He started to join in naming countries and their capitals. Soon, the students were amazed by his knowledge, although Harris didn’t understand why. Then it dawned on him that his vast knowledge of capitals didn’t come from memorizing them from a map, but rather from his love of stamps and learning their whereabouts.

"I learned a lot of geographical knowledge without really studying," he said.

Harris said this also explains why some things, such as names, seem so hard to remember: they may be hard to understand or simply not of interest to some people.

Harris said there are strategies for training your memory, including using a mnemonic device.

"If somebody’s last name is Hefty and you notice they’re left-handed, you could remember lefty Hefty," he said.

Another strategy is to use the person’s name while you talk to them — although the best strategy is simply to show more interest in the people you meet, he said.

Source: Science Daily

Jun 21, 2012
#science #neuroscience #brain #psychology
'Brain pacemaker' effective for years against Parkinson's disease

June 20, 2012

A “brain pacemaker” called deep brain stimulation (DBS) remains an effective treatment for Parkinson’s disease for at least three years, according to a study in the June 2012 online issue of Neurology, the medical journal of the American Academy of Neurology.

But while improvements in motor function remained stable, there were gradual declines in health-related quality of life and cognitive abilities.

First author of the study is Frances M. Weaver, PhD, who has joint appointments at Edward Hines Jr. VA Hospital and Loyola University Chicago Stritch School of Medicine.

Weaver was one of the lead investigators of a 2010 paper in the New England Journal of Medicine that found that motor functions remained stable for two years in DBS patients. The new additional analysis extended the follow-up period to 36 months.

DBS is a treatment for Parkinson’s patients who no longer benefit from medication, or who experience unacceptable side effects. DBS is not a cure, and it does not stop the disease from progressing. But in the right patients, DBS can significantly improve symptoms, especially tremors. DBS also can relieve muscle rigidity that causes decreased range of motion.

In the DBS procedure, a neurosurgeon drills a dime-size hole in the skull and inserts an electrode about 4 inches into the brain. A connecting wire from the electrode runs under the skin to a battery implanted near the collarbone. The electrode delivers mild electrical signals that effectively reorganize the brain’s electrical impulses. The procedure can be done on one or both sides of the brain.

Researchers evaluated 89 patients who were stimulated in a part of the brain called the globus pallidus interna and 70 patients who were stimulated in a different part of the brain called the subthalamic nucleus. (Patients received DBS surgery at seven VA and six affiliated university medical centers.) Patients were assessed at baseline (before DBS surgery) and at 3, 6, 12, 18, 24 and 36 months. Patients were rated on a Parkinson’s disease scale that includes motor functions such as speech, facial expression, tremors, rigidity, finger taps, hand movements, posture, gait and bradykinesia (slow movement). The lower the rating, the better the function.

Improvements in motor function were similar in both groups of patients, and stable over time. Among patients stimulated in the globus pallidus interna, the score improved from 41.1 at baseline to 27.1 at 36 months. Among patients stimulated in the subthalamic nucleus, the score improved from 42.5 at baseline to 29.7 at 36 months.

By contrast, some early gains in quality of life and the ability to perform activities of daily living were gradually lost, and there was a decline in neurocognitive function. This likely reflects the progression of the disease and the emergence of symptoms that are resistant to DBS and medications.

Researchers concluded that both the globus pallidus interna and the subthalamic nucleus areas of the brain “are viable DBS targets for treatment of motor symptoms, but highlight the importance of nonmotor symptoms as determinants of quality of life in people with Parkinson’s disease.”

Source: medicalxpress.com

Jun 21, 2012
#science #neuroscience #brain #psychology #parkinson
Proposed drug may reverse Huntington's disease symptoms

June 20, 2012

With a single drug treatment, researchers at the Ludwig Institute for Cancer Research at the University of California, San Diego School of Medicine can silence the mutated gene responsible for Huntington’s disease, slowing and partially reversing progression of the fatal neurodegenerative disorder in animal models.

image

This image shows stained mouse neurons. Credit: Image courtesy of Taylor Bayouth

The findings are published in the June 21, 2012 online issue of the journal Neuron.

Researchers suggest the drug therapy, tested in mouse and non-human primate models, could produce sustained motor and neurological benefits in human adults with moderate and severe forms of the disorder. Currently, there is no effective treatment.

Huntington’s disease afflicts approximately 30,000 Americans, whose symptoms include uncontrolled movements and progressive cognitive and psychiatric problems. The disease is caused by the mutation of a single gene, which results in the production and accumulation of toxic proteins throughout the brain.

Don W. Cleveland, PhD, professor and chair of the UC San Diego Department of Cellular and Molecular Medicine and head of the Laboratory of Cell Biology at the Ludwig Institute for Cancer Research, and colleagues infused mouse and primate models of Huntington’s disease with one-time injections of a DNA drug based on antisense oligonucleotides (ASOs). These ASOs selectively bind to and destroy the mutant gene’s molecular instructions for making the toxic huntingtin protein.
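The binding principle behind antisense design can be sketched with a toy reverse-complement computation. The sequence below is hypothetical, not an actual therapeutic ASO, and the DNA alphabet is used for simplicity (real mRNA contains U rather than T):

```python
# Toy illustration: an antisense oligonucleotide is designed as the
# reverse complement of a stretch of its target message, so that it
# base-pairs with the transcript and marks it for degradation.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def antisense(target):
    """Return the reverse complement of a DNA-style sequence."""
    return "".join(COMPLEMENT[base] for base in reversed(target))

# Hypothetical fragment containing CAG repeats (the expanded repeat in
# the huntingtin gene), chosen purely for illustration.
mrna_fragment = "ATGCAGCAGCAGCAG"
aso = antisense(mrna_fragment)
print(aso)  # CTGCTGCTGCTGCAT
```

Taking the reverse complement twice returns the original sequence, which is a quick sanity check that the pairing is mutual.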

The one-time treatment produced rapid results. Treated animals began moving better within one month and achieved normal motor function within two. More remarkably, the benefits persisted for nine months, well after the drug had disappeared and production of the toxic proteins had resumed.

"For diseases like Huntington’s, where a mutant protein product is tolerated for decades prior to disease onset, these findings open up the provocative possibility that transient treatment can lead to a prolonged benefit to patients,” said Cleveland. “This finding raises the prospect of a ‘huntingtin holiday,’ which may allow for clearance of disease-causing species that might take weeks or months to re-form. If so, then a single application of a drug to reduce expression of a target gene could ‘reset the disease clock,’ providing a benefit long after huntingtin suppression has ended.”

Beyond improving motor and cognitive function, researchers said the ASO treatment also blocked brain atrophy and increased lifespan in mouse models with a severe form of the disease. The therapy was equally effective whether one or both huntingtin genes were mutated, a positive indicator for human therapy.

Cleveland noted that the approach was particularly promising because antisense therapies have already been proven safe in clinical trials and are the focus of much drug development. Moreover, the findings may have broader implications, he said, for other “age-dependent neurodegenerative diseases that develop from exposure to a mutant protein product” and perhaps for nervous system cancers, such as glioblastomas.

Provided by University of California - San Diego

Source: medicalxpress.com

Jun 21, 2012
#science #neuroscience #brain #psychology #huntington
Study shows role of cellular protein in regulation of binge eating

June 20, 2012

Researchers from Boston University School of Medicine (BUSM) have demonstrated in experimental models that blocking the Sigma-1 receptor, a cellular protein, reduced binge eating and caused binge eaters to eat more slowly. The research, which is published online in Neuropsychopharmacology, was led by Pietro Cottone, PhD, and Valentina Sabino, PhD, both assistant professors in the pharmacology and psychiatry departments at BUSM.

Binge eating disorder, which affects approximately 15 million Americans, is believed to be the eating disorder that most closely resembles substance dependence. In binge eating subjects, normal regulatory mechanisms that control hunger do not function properly. Binge eaters typically gorge on “junk” foods excessively and compulsively despite knowing the adverse consequences, which are physical, emotional and social in nature. In addition, binge eaters typically experience distress and withdrawal when they abstain from junk food.

The researchers developed an experimental model of compulsive binge eating by providing a sugary, chocolate diet only for one hour a day while the control group was given a standard laboratory diet. Within two weeks, the group exposed to the sugary diet exhibited binge eating behavior and ate four times as much as the controls. In addition, the experimental binge eaters exhibited compulsive behavior by putting themselves in a potentially risky situation in order to get to the sugary food while the control group avoided the risk.

The researchers then tested whether a drug that blocks the Sigma-1 receptor could reduce binge eating of the sugary diet. The experimental data showed the drug successfully reduced binge eating by 40 percent, caused the binge eaters to eat more slowly and blocked the risky behavior.

The abnormal, risky behavior exhibited by the binge eating experimental group suggested to the researchers that there could be something wrong with how decisions were made. Because evaluation of risks and decision making are functions executed in the prefronto-cortical regions of the brain, the researchers tested whether the abundance of Sigma-1 receptors in those regions was abnormal in the binge eaters. They found that Sigma-1 receptor expression was unusually high in those areas, which could explain why blocking its function could decrease both compulsive binge eating and risky behavior.

"These findings suggest that the Sigma-1 receptor may contribute to the neurobiological adaptations that cause compulsive-like eating, opening up a new potential therapeutic treatment target for binge eating disorder,” said Cottone, who also co-directs the Laboratory of Addictive Disorders at BUSM with Sabino.

Provided by Boston University Medical Center

Source: medicalxpress.com

Jun 21, 2012
#neuroscience #psychology #science
Scientists Identify Protein Required to Regrow Injured Nerves in Limbs

ScienceDaily (June 20, 2012) — A protein required to regrow injured peripheral nerves has been identified by researchers at Washington University School of Medicine in St. Louis.

image

These are images of axon regeneration in mice two weeks after injury to the hind leg’s sciatic nerve. On the left, axons (green) of a normal mouse have regrown to their targets (red) in the muscle. On the right, a mouse lacking DLK shows no axons have regenerated, even after two weeks. (Credit: Jung Eun Shin)

The finding, in mice, has implications for improving recovery after nerve injury in the extremities. It also opens new avenues of investigation toward triggering nerve regeneration in the central nervous system, notorious for its inability to heal.

Peripheral nerves provide the sense of touch and drive the muscles that move arms and legs, hands and feet. Unlike nerves of the central nervous system, peripheral nerves can regenerate after they are cut or crushed. But the mechanisms behind the regeneration are not well understood.

In the new study, published online June 20 in Neuron, the scientists show that a protein called dual leucine zipper kinase (DLK) regulates signals that tell the nerve cell it has been injured — often communicating over distances of several feet. The protein governs whether the neuron turns on its regeneration program.

"DLK is a key molecule linking an injury to the nerve’s response to that injury, allowing the nerve to regenerate," says Aaron DiAntonio, MD, PhD, professor of developmental biology. "How does an injured nerve know that it is injured? How does it take that information and turn on a regenerative program and regrow connections? And why does only the peripheral nervous system respond this way, while the central nervous system does not? We think DLK is part of the answer."

The nerve cell body containing the nucleus or “brain” of a peripheral nerve resides in the spinal cord. During early development, these nerves send long, thin, branching wires, called axons, out to the tips of the fingers and toes. Once the axons reach their targets (a muscle, for example), they stop extending and remain mostly unchanged for the life of the organism. Unless they’re damaged.

If an axon is severed somewhere between the cell body in the spinal cord and the muscle, the piece of axon that is no longer connected to the cell body begins to disintegrate. Earlier work showed that DLK helps regulate this axonal degeneration. And in worms and flies, DLK also is known to govern the formation of an axon’s growth cone, the structure responsible for extending the tip of a growing axon whether after injury or during development.

The formation of the growth cone is an important part of the early, local response of a nerve to injury. But a later response, traveling over greater distances, proves vital for relaying the signals that activate genes promoting regeneration. This late response can happen hours or even days after injury.

But in mice, unlike worms and flies, DiAntonio and his colleagues found that DLK is not involved in an axon’s early response to injury. Even without DLK, the growth cone forms. But a lack of DLK means the nerve cell body, nestled in the spinal cord far from the injury, doesn’t get the message that it’s injured. Without the signals relaying the injury message, the cell body doesn’t turn on its regeneration program and the growth cone’s progress in extending the axon stalls.

In addition, it was shown many years ago that axons regrow faster after a second injury than axons injured only once. In other words, injury itself increases an axon’s ability to regenerate. Furthering this work, first author Jung Eun Shin, graduate research assistant, and her colleagues found that DLK is required to promote this accelerated growth.

"A neuron that has seen a previous injury now has a different regenerative program than one that has never been damaged," Shin says. "We hope to be able to identify what is different between these two neurons — specifically what factors lead to the improved regeneration after a second injury. We have found that activated DLK is one such factor. We would like to activate DLK in a newly injured neuron to see if it has improved regeneration."

In addition to speeding peripheral nerve recovery, DiAntonio and Shin see possible implications in the central nervous system. It is known, for example, that some of the important factors regulated and ramped up by DLK are not activated in the central nervous system.

"Since this sort of signaling doesn’t appear to happen in the central nervous system, it’s possible these nerves don’t ‘know’ when they are injured," DiAntonio says. "It’s an exciting idea — but not at all proven — that activating DLK in the central nervous system could promote its regeneration."

Source: Science Daily

Jun 21, 2012
#science #neuroscience #psychology #protein
How Humans Predict Others' Decisions

ScienceDaily (June 20, 2012) — Researchers at the RIKEN Brain Science Institute (BSI) in Japan have uncovered two brain signals in the human prefrontal cortex involved in how humans predict the decisions of other people. Their results suggest that the two signals, each located in distinct prefrontal circuits, strike a balance between expected and observed rewards and choices, enabling humans to predict the actions of people with different values than their own.

image

Figure 1 shows the neural activity for the simulation of another person: reward signal (red) and action signal (green). The action signal shown in this figure (green) is in the dorsomedial prefrontal cortex. The activity of the reward signal (red) largely overlaps with that of the signal for self-valuation (blue) in the ventromedial prefrontal cortex. (Credit: RIKEN)

Every day, humans are faced with situations in which they must predict what decisions other people will make. These predictions are essential to the social interactions that make up our personal and professional lives. The neural mechanism underlying these predictions, however, by which humans learn to understand the values of others and use this information to predict their decision-making behavior, has long remained a mystery.

Researchers at the RIKEN Brain Science Institute (BSI) in Japan have now shed light on this mystery with a paper to appear in the June 21st issue of Neuron. The researchers describe for the first time the process governing how humans learn to predict the decisions of another person using mental simulation of their mind.

Learning another person’s values and mental processes is often assumed to require simulation of the other’s mind: using one’s own familiar mental processes to simulate unfamiliar processes in the mind of the other. While simple and intuitive, this explanation is hard to prove due to the difficulty in disentangling one’s own brain signals from those of the simulated other.

Research scientists Shinsuke Suzuki and Hiroyuki Nakahara, a Principal Investigator of the Laboratory for Integrated Theoretical Neuroscience at RIKEN BSI, together with their collaborators, set out to disentangle these signals using functional Magnetic Resonance Imaging (fMRI) in humans. First, they studied the behavior of subjects as they played a game in which they had to predict another player's choices from that player's past decisions and outcomes. Then they built a computer model of the simulation process to examine the brain signals underlying the prediction of the other's behavior.

The authors found that humans simulate the decisions of other people using two brain signals encoded in the prefrontal cortex, an area responsible for higher cognition. The first, called the reward signal, tracks the difference between the other's values, as simulated in one's own mind, and the reward the other person actually received. The second, called the action signal, tracks the difference between the other's action, as predicted by one's mental simulation, and what the other person actually did. The reward signal is processed in a part of the brain called the ventromedial prefrontal cortex; the action signal, by contrast, was found in a separate area called the dorsomedial prefrontal cortex.
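The two-signal idea can be sketched as a toy "simulated other" learner for a two-option task. The learning rates, softmax temperature, and exact update rule below are illustrative assumptions, not the authors' published model:

```python
import math

def softmax2(v0, v1, beta=3.0):
    """Probability that the simulated other chooses option 0."""
    e0, e1 = math.exp(beta * v0), math.exp(beta * v1)
    return e0 / (e0 + e1)

def update(values, choice, reward, lr_reward=0.2, lr_action=0.2):
    """Update the simulated other's values from one observed trial."""
    # Reward signal: actual reward vs. the value simulated in our head.
    srpe = reward - values[choice]
    values[choice] += lr_reward * srpe
    # Action signal: observed choice vs. the choice we predicted.
    p0 = softmax2(values[0], values[1])
    p_choice = p0 if choice == 0 else 1.0 - p0
    sape = 1.0 - p_choice  # large when the choice surprised us
    values[choice] += lr_action * sape
    return values

# Watch the other repeatedly pick option 0 and get rewarded for it.
vals = [0.0, 0.0]
for _ in range(20):
    update(vals, choice=0, reward=1.0)
print(vals[0] > vals[1])  # True
```

After twenty observed trials the simulated value of the rewarded option dominates, so the learner now predicts the other person will keep choosing it.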

"Every day, we interact with a variety of other individuals," Suzuki said. "Some may share similar values with us and for those interactions simulation using the reward signal alone may suffice. However, other people with different values may be quite different and then the action signal may become quite important."

Nakahara believes that their approach, using mathematical models based on human behavior with brain imaging, will be useful to answer a wide range of questions about the social functions employed by the brain. “Perhaps we may one day better understand how and why humans have the ability to predict others’ behavior, even those with different characteristics. Ultimately, this knowledge could help improve political, educational, and social systems in human societies.”

Source: Science Daily

Jun 21, 2012
#science #neuroscience #brain #psychology
All Things Big and Small: The Brain's Discerning Taste for Size

ScienceDaily (June 20, 2012) — The human brain can recognize thousands of different objects, but neuroscientists have long grappled with how the brain organizes object representation; in other words, how the brain perceives and identifies different objects. Now researchers at the MIT Computer Science and Artificial Intelligence Lab (CSAIL) and the MIT Department of Brain and Cognitive Sciences have discovered that the brain organizes objects based on their physical size, with a specific region of the brain reserved for recognizing large objects and another reserved for small objects.

image

This figure shows brain activations while participants view pictures of large and small objects. (Credit: Image courtesy of Massachusetts Institute of Technology, CSAIL)

Their findings, to be published in the June 21 issue of Neuron, could have major implications for fields like robotics, and could lead to a greater understanding of how the brain organizes and maps information.

"Prior to this study, nobody had looked at whether the size of an object was an important factor in the brain’s ability to recognize it," said Aude Oliva, an associate professor in the MIT Department of Brain and Cognitive Sciences and senior author of the study.

"It’s almost obvious that all objects in the world have a physical size, but the importance of this factor is surprisingly easy to miss when you study objects by looking at pictures of them on a computer screen," said Dr. Talia Konkle, lead author of the paper. "We pick up small things with our fingers, we use big objects to support our bodies. How we interact with objects in the world is deeply and intrinsically tied to their real-world size, and this matters for how our brain’s visual system organizes object information."

As part of their study, Konkle and Oliva took 3D scans of brain activity during experiments in which participants were asked to look at images of big and small objects or visualize items of differing size. By evaluating the scans, the researchers found that there are distinct regions of the brain that respond to big objects (for example, a chair or a table), and small objects (for example, a paperclip or a strawberry).

By looking at the arrangement of the responses, they found a systematic organization of big to small object responses across the brain’s cerebral cortex. Large objects, they learned, are processed in the parahippocampal region of the brain, an area located by the hippocampus, which is also responsible for navigating through spaces and for processing the location of different places, like the beach or a building. Small objects are handled in the inferior temporal region of the brain, near regions that are active when the brain has to manipulate tools like a hammer or a screwdriver.

The work could have major implications for the field of robotics, in particular in developing techniques for how robots deal with different objects, from grasping a pen to sitting in a chair.

"Our findings shed light on the geography of the human brain, and could provide insight into developing better machine interfaces for robots," said Oliva.

Many computer vision techniques currently focus on identifying what an object is without much guidance from its size, a cue that could be useful in recognition. “Paying attention to the physical size of objects may dramatically constrain the number of objects a robot has to consider when trying to identify what it is seeing,” said Oliva.

The study’s findings are also important for understanding how the organization of the brain may have evolved. The work of Konkle and Oliva suggests that the human visual system’s method for organizing thousands of objects may also be tied to human interactions with the world. “If experience in the world has shaped our brain organization over time, and our behavior depends on how big objects are, it makes sense that the brain may have established different processing channels for different actions, and at the center of these may be size,” said Konkle.

Oliva, a cognitive neuroscientist by training, has focused much of her research on how the brain tackles scene and object recognition, as well as visual memory. Her ultimate goal is to gain a better understanding of the brain’s visual processes, paving the way for the development of machines and interfaces that can see and understand the visual world like humans do.

"Ultimately, we want to focus on how active observers move in the natural world. We think this not only matters for large-scale brain organization of the visual system, but it also matters for making machines that can see like us," said Konkle and Oliva.

Source: Science Daily

Jun 21, 2012
#science #neuroscience #brain #psychology
Simple mathematical pattern describes shape of neuron 'jungle'

June 20, 2012

Neurons come in an astounding assortment of shapes and sizes, forming a thick inter-connected jungle of cells. Now, UCL neuroscientists have found that there is a simple pattern that describes the tree-like shape of all neurons.

Neurons look remarkably like trees, and connect to other cells with many branches that effectively act like wires in an electrical circuit, carrying impulses that represent sensation, emotion, thought and action.

Over 100 years ago, Santiago Ramon y Cajal, the father of modern neuroscience, sought to systematically describe the shapes of neurons, and was convinced that there must be a unifying principle underlying their diversity.

Cajal proposed that neurons spread out their branches so as to use as little wiring as possible to reach other cells in the network. Reducing the amount of wiring between cells provides additional space to pack more neurons into the brain, and therefore increases its processing power.

New work by UCL neuroscientists, published today in Proceedings of the National Academy of Sciences, has revisited this century-old hypothesis using modern computational methods. They show that a simple computer program that connects points with as little wiring as possible can produce tree-like shapes that are indistinguishable from real neurons, and that also happen to be very beautiful. They also show that the shape of neurons follows a simple mathematical relationship called a power law.
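The minimal-wiring idea can be illustrated with a greedy tree-building rule (Prim's construction of a minimum spanning tree). This is a sketch of the general principle, not the UCL group's actual program, which also accounts for conduction path length:

```python
import math
import random

def minimal_wiring_tree(points, root=0):
    """Greedy minimum-wiring tree: repeatedly attach the unconnected
    point that lies closest to any already-connected point (Prim's
    algorithm for a minimum spanning tree)."""
    connected = {root}
    remaining = set(range(len(points))) - connected
    edges = []
    while remaining:
        d, p, q = min(
            (math.dist(points[p], points[q]), p, q)
            for p in remaining for q in connected
        )
        edges.append((q, p))  # new "branch" from tree node q to point p
        connected.add(p)
        remaining.remove(p)
    return edges

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(30)]
tree = minimal_wiring_tree(pts)
print(len(tree))  # 29 (a tree over n points always has n - 1 edges)
```

Grown from scattered target points, the resulting branching pattern qualitatively resembles a dendritic arbor, which is the intuition behind Cajal's wiring-economy principle.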

Power laws have been shown to be common across the natural world, and often point to simple rules underlying complex structures. Dr Hermann Cuntz (UCL Wolfson Institute for Biomedical Research) and colleagues find that the power law holds true for many types of neurons gathered from across the animal kingdom, providing strong evidence for Ramon y Cajal’s general principle.
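A power law y = a·x^k becomes a straight line on log-log axes, so its exponent can be recovered with an ordinary linear fit of log(y) against log(x). The sketch below uses synthetic data, not the paper's measured quantities:

```python
import math

def fit_power_law(xs, ys):
    """Least-squares fit of log(y) = k*log(x) + log(a);
    returns the exponent k and prefactor a."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(lx, ly))
             / sum((a - mx) ** 2 for a in lx))
    intercept = my - slope * mx
    return slope, math.exp(intercept)

# Synthetic data generated from y = 3 * x^1.5.
xs = [1, 2, 4, 8, 16]
ys = [3 * x ** 1.5 for x in xs]
k, a = fit_power_law(xs, ys)
print(round(k, 3), round(a, 3))  # 1.5 3.0
```

On real, noisy measurements the same fit gives an estimate of the exponent rather than an exact recovery.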

The UCL team further tested the theory by examining neurons in the olfactory bulb, a part of the brain where new brain cells are constantly being formed. These neurons grow and form new connections even in the adult brain, and therefore provide a unique window into the rules behind the development of neural trees in a mature neural circuit.

The team analysed the change in shape of the newborn olfactory neurons over several days, and found that the growth of these neurons also follows the power law, providing further evidence to support the theory.

Dr Hermann Cuntz said: “The ultimate goal of neuroscience is to understand how the impenetrable neural jungle can give rise to the complexity of behaviour.

"Our findings confirm Cajal’s original far-reaching insight that there is a simple pattern behind the circuitry, and provides hope that neuroscientists will someday be able to see the forest for the trees."

Provided by University College London

Source: medicalxpress.com

Jun 21, 2012
#science #neuroscience #brain #psychology #neuron
Jun 20, 2012
#science #neuroscience #brain #psychology #neuron #connectome
Fishing for Answers to Autism Puzzle

ScienceDaily (June 19, 2012) — Fish cannot display symptoms of autism, schizophrenia, or other human brain disorders. However, a team of Whitehead Institute and MIT scientists has shown that zebrafish can be a useful tool for studying the genes that contribute to such disorders.

image

Zebrafish with certain genes turned off during embryonic development (center and right images) showed abnormalities of brain formation (top row) and axon wiring (bottom row). At left is a normally developing zebrafish embryo. (Credit: Sive Lab)

Led by Whitehead Member Hazel Sive, the researchers set out to explore a group of about two dozen genes known to be either missing or duplicated in about 1 percent of autistic patients. Most of the genes’ functions were unknown, but a new study by Sive and Whitehead postdocs Alicia Blaker-Lee, Sunny Gupta, and Jasmine McCammon revealed that nearly all of them produced brain abnormalities when deleted in zebrafish embryos.

The findings, published online recently in the journal Disease Models & Mechanisms, should help researchers pinpoint genes for further study in mammals, says Sive, who is also professor of biology and associate dean of MIT’s School of Science. Autism is thought to arise from a variety of genetic defects; this research is part of a broad effort to identify culprit genes and develop treatments that target them.

"That’s really the goal — to go from an animal that shares molecular pathways, but doesn’t get autistic behaviors, into humans who have the same pathways and do show these behaviors," Sive says.

Sive recalls that some of her colleagues chuckled when she first proposed studying human brain disorders in fish, but it is actually a logical starting point, she says. Brain disorders are difficult to study because most of the symptoms are behavioral, and the biological mechanisms behind those behaviors are not well understood, she says.

"We thought that since we really know so little, that a good place to start would be with the genes that confer risk in humans to various mental health disorders, and to study these various genes in a system where they can readily be studied," she says.

Those genes tend to be the same across species — conserved throughout evolution, from fish to mice to humans — though they may control somewhat different outcomes in each species.

In the latest study, Sive and her colleagues focused on a genetic region known as 16p11.2, first identified by Mark Daly, a former Whitehead Fellow who discovered a type of genetic defect known as a copy number variant. A typical genome includes two copies of every gene, one from each parent; copy number variants occur when one of those copies is deleted or duplicated, and this can be associated with pathology.

The central “core” 16p11.2 region includes 25 genes. Both deletions and duplications in this region have been associated with autism, but it was unclear which of the genes might actually produce symptoms of the disease. “At the time, there was an inkling about some of them, but very few,” Sive says.

Sive and her postdocs began by identifying zebrafish genes analogous to the human genes found in this region. (In zebrafish, these genes are not clustered in a single genetic chunk, but are scattered across many chromosomes.) The researchers studied one gene at a time, silencing each with short strands of nucleic acids that target a particular gene and prevent its protein from being produced.

For 21 of the genes, silencing led to abnormal development. Most produced brain deficits, including improper development of the brain or eyes, thinning of the brain, or inflation of the brain ventricles, cavities that contain cerebrospinal fluid. The researchers also found abnormalities in the wiring of axons, the long neural projections that carry messages to other neurons, and in simple behaviors of the fish. The results show that the 16p11.2 genes are very important during brain development, helping to explain the connection between this region and brain disorders.

Furthermore, the researchers were able to restore normal development by treating the fish with the human equivalents of the genes that had been repressed. “That allows you to deduce that what you’re learning in fish corresponds to what that gene is doing in humans. The human gene and the fish gene are very similar,” Sive says.

To figure out which of these genes might have a strong effect in autism or other disorders, the researchers set out to identify genes that produce abnormal development when their activity is reduced by 50 percent, which would happen in someone who is missing one copy of the gene. (This correlation is not seen for most genes, because there are many other checks and balances that regulate how much of a particular protein is made.)

The researchers identified two such genes in the 16p11.2 region. One, called kif22, codes for a protein involved in the separation of chromosomes during cell division; the other, aldolase a, is involved in glycolysis — the process of breaking down sugar to generate energy for the cell.

In work that has just begun, Sive’s lab is working with Stanford University researchers to explore in mice predictions made from the zebrafish study. They are also conducting molecular studies in zebrafish of the pathways affected by these genes, to get a better idea of how defects in these might bring about neurological disorders.

Source: Science Daily

Jun 20, 2012 · 25 notes
#science #neuroscience #brain #psychology #autism
Study Finds High Brain Integration in Top Performers

June 19, 2012 By Janice Wood

Why do some people excel in sports, music and managing companies? New research points to uniquely high mind-brain development in those who excel.

image

“What we have found is an astonishing integration of brain functioning in high performers compared to average-performing controls,” said Fred Travis, Ph.D., director of the Center for Brain, Consciousness, and Cognition at Maharishi University of Management in Fairfield, Iowa.

He claims this research is the “first in the world to show that there is a brain measure of effective leadership.”

In the study, published in the journal Cognitive Processing, researchers found that 20 top-level managers scored higher on three measures — the Brain Integration Scale, Gibbs’s Socio-moral Reasoning questionnaire, and an inventory of peak experiences — compared to 20 low-level managers who served as controls.

“The current understanding of high performance is fragmented,” said co-researcher Harald Harung, Ph.D., of the Oslo and Akershus University College of Applied Sciences in Norway.

“What we have done in our research is to use quantitative and neurophysiological research methods on topics that so far have been dominated by psychology.”

The researchers carried out four studies comparing world-class performers to average performers. This recent study and two others examined top performers in management, sports and classical music. A number of years ago Harung and his colleagues published a study on a variety of professions, such as public administration, management, sports, arts, and education.

The studies used electroencephalography (EEG) to measure the extent of integration and development of several brain processes.


Jun 20, 2012 · 31 notes
#science #neuroscience #psychology #brain
Infants Can't Distinguish Between Large and Small Groups

ScienceDaily (June 19, 2012) — Human brains process large and small numbers of objects using two different mechanisms, but infants have not yet developed the ability to make those two processes work together, according to new research from the University of Missouri.

"This research was the first to show the inability of infants in a single age group to discriminate large and small sets in a single task," said Kristy vanMarle, assistant professor of psychological sciences in the College of Arts and Science. "Understanding how infants develop the ability to represent and compare numbers could be used to improve early education programs."

The MU study found that infants consistently chose the larger of two groups of food items when both sets were larger or smaller than four, just as an adult would. Unlike adults, the infants showed no preference for the larger group when choosing between one large and one small set. The results suggest that at age one infants have not yet integrated the two mental functions: one being the ability to estimate numbers of items at a glance and the other being the ability to visually track small sets of objects.

In vanMarle’s study, 10- to 12-month-old infants were presented with two opaque cups. Different numbers of pieces of breakfast cereal were hidden in each cup, while the infants observed, and then the infants were allowed to choose a cup. Four comparisons were tested between different combinations of large and small sets. Infants consistently chose two food items over one and eight items over four, but chose randomly when asked to compare two versus four and two versus eight.
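The two-system account above can be sketched in a toy simulation: exact "object-file" tracking for small sets, a noisy analog estimate for large ones, and a random choice whenever a comparison crosses the two systems. The Weber fraction, the small-set limit, and the decision rule below are illustrative assumptions, not the study's actual model.

```python
import random

WEBER = 0.3          # assumed analog-number-system noise (illustrative value)
SMALL_LIMIT = 3      # object-file system tracks at most ~3 items

def perceive(n):
    """Return (system, estimate) for a set of n items."""
    if n <= SMALL_LIMIT:
        return ("object-file", n)              # exact tracking of small sets
    noise = random.gauss(0, WEBER * n)         # scalar variability grows with n
    return ("ANS", max(0.0, n + noise))

def choose_larger(a, b):
    """Pick the cup judged larger; cross-system comparisons are random."""
    sys_a, est_a = perceive(a)
    sys_b, est_b = perceive(b)
    if sys_a != sys_b:
        return random.choice((a, b))           # the two systems don't integrate
    return a if est_a > est_b else b

def success_rate(a, b, trials=10_000):
    return sum(choose_larger(a, b) == max(a, b) for _ in range(trials)) / trials

for a, b in [(1, 2), (4, 8), (2, 4), (2, 8)]:
    print(f"{a} vs {b}: chose larger set in {success_rate(a, b):.0%} of trials")
```

Under these assumptions the sketch reproduces the study's pattern: reliable choices for 1 vs. 2 and 8 vs. 4, but chance-level choices for 2 vs. 4 and 2 vs. 8.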

"Being unable to determine that eight is larger than two would put an organism at a serious disadvantage," vanMarle said. "However, ongoing studies in my lab suggest that the capacity to compare small and large sets seems to develop before age two."

The ability to make judgments about the relative number of objects in a group has old evolutionary roots. Dozens of species, including some fish, monkeys and birds have shown the ability to recognize numerical differences in laboratory studies. VanMarle speculated that being unable to compare large and small sets early in infancy may not have been problematic during human evolution because young children probably received most of their food and protection from caregivers. Infants’ survival didn’t depend on determining which bush had the most berries or how many predators they just saw, she said.

"In the modern world there are educational programs that claim to give children an advantage by teaching them arithmetic at an early age," said vanMarle. "This research suggests that such programs may be ineffective simply because infants are unable to compare some numbers with others."

Source: Science Daily

Jun 20, 2012 · 13 notes
#science #neuroscience #brain #psychology
Detector of DNA Damage: Structure of a Repair Factor Revealed

ScienceDaily (June 19, 2012) — Double-stranded breaks in cellular DNA can trigger tumorigenesis. LMU researchers have now determined the structure of a protein involved in the repair and signaling of DNA double-strand breaks. The work throws new light on the origins of neurodegenerative diseases and certain tumor types.

Agents such as radiation or environmental toxins can cause double-stranded breaks in genomic DNA, which facilitate the development of tumors or the neurodegenerative disorders ataxia telangiectasia (AT) and AT-like disease (ATLD). Hence efficient repair mechanisms are essential for cell survival and function. The so-called MRN complex is an important component of one such system, and its structure has just been elucidated by a team led by Professor Karl-Peter Hopfner of LMU’s Gene Center.

Malignant mutations

The MRN complex consists of the nuclease Mre11, the ATPase Rad50 and the protein Nbs1. Nbs1 is responsible for recruiting the protein ATM, which plays a central role in early stages of the cellular response to DNA damage, to the site of damage. “How the MRN complex actually recognizes double-stranded breaks is still not clear,” says Hopfner. He and his colleagues therefore set out to clarify the issue by analyzing the structures of mutant, functionally defective versions of the complex.

"We found that pairs of Mre11 molecules form a flexible dimer, which is stabilized by Nbs1." Mutations in different subunits of the complex are associated with distinct syndromes, marked by a predisposition to certain cancers, sensitivity to radiation or neurodegeneration. Hopfner’s results help to explain these differences. For instance, the mutation linked to ATLD lies within the zone of contact between Mre11 and Nbs1, and may inhibit activation of ATM by weakening their interaction.

Source: Science Daily

Jun 20, 2012 · 5 notes
#science #neuroscience #biology #DNA
Hulk smash? Maybe not anymore: scientists block excess aggression in mice

June 19, 2012

Pathological rage can be blocked in mice, researchers have found, suggesting potential new treatments for severe aggression, a widespread trait characterized by sudden violence, explosive outbursts and hostile overreactions to stress.

In a study appearing today in the Journal of Neuroscience, researchers from the University of Southern California and Italy identify a critical neurological factor in aggression: a brain receptor that malfunctions in overly hostile mice. When the researchers shut down the brain receptor, which also exists in humans, the excess aggression completely disappeared.

The findings are a significant breakthrough in developing drug targets for pathological aggression, a component in many common psychological disorders including Alzheimer’s disease, autism, bipolar disorder and schizophrenia.

"From a clinical and social point of view, reactive aggression is absolutely a major problem," said Marco Bortolato, lead author of the study and research assistant professor of pharmacology and pharmaceutical sciences at the USC School of Pharmacy. “We want to find the tools that might reduce impulsive violence.”

A large body of independent research, including past work by Bortolato and senior author Jean Shih, USC University Professor and Boyd & Elsie Welin Professor in Pharmacology and Pharmaceutical Sciences at USC, has identified a specific genetic predisposition to pathological aggression: low levels of the enzyme monoamine oxidase A (MAO A). Both male humans and mice with congenital deficiency of the enzyme respond violently in response to stress.

"The same type of mutation that we study in mice is associated with criminal, very violent behavior in humans. But we really didn’t understand why that is," Bortolato said.

Bortolato and Shih worked backwards to replicate elements of human pathological aggression in mice, including not just low enzyme levels but also the interaction of genetics with early stressful events such as trauma and neglect during childhood.

"Low levels of MAO A are one basis of the predisposition to aggression in humans. The other is an encounter with maltreatment, and the combination of the two factors appears to be deadly: it results consistently in violence in adults," Bortolato said.

The researchers show that in excessively aggressive rodents that lack MAO A, high levels of electrical stimulus are required to activate a specific brain receptor in the pre-frontal cortex. Even when this brain receptor does work, it stays active only for a short period of time.

"The fact that blocking this receptor moderates aggression is why this discovery has so much potential. It may have important applications in therapy," Bortolato said. "Whatever the ways environment can persistently affect behavior — and even personality over the long term — behavior is ultimately supported by biological mechanisms."

Importantly, the aggression receptor, known as NMDA, is also thought to play a key role in helping us make sense of multiple, coinciding streams of sensory information, according to Bortolato.

The researchers are now studying the potential side effects of drugs that reduce the activity of this receptor.

"Aggressive behaviors have a profound socio-economic impact, yet current strategies to reduce these staggering behaviors are extremely unsatisfactory," Bortolato said. "Our challenge now is to understand what pharmacological tools and what therapeutic regimens should be administered to stabilize the deficits of this receptor. If we can manage that, this could truly be an important finding."

Provided by University of Southern California

Source: medicalxpress.com

Jun 20, 2012 · 19 notes
#science #neuroscience #brain #psychology
Front-most part of the cortex involved in making short-term predictions about what will happen next

June 19, 2012

Researchers at the University of Iowa, together with colleagues from the California Institute of Technology and New York University, have discovered how a part of the brain helps predict future events from past experiences. The work sheds light on the function of the front-most part of the frontal lobe, known as the frontopolar cortex, an area of the cortex uniquely well developed in humans in comparison with apes and other primates.

image

The image shows the overlap of lesions for eight subjects superimposed on a template brain — red indicates maximum overlap (seven subjects) and dark blue is minimum overlap (one subject). The patient group was selected for lesions that include frontopolar cortex, but the lesions almost invariably extended outside to other parts of anterior prefrontal cortex. Credit: Christopher Kovach, University of Iowa

Making the best possible decisions in a changing and unpredictable environment is an enormous challenge. Not only does it require learning from past experience, but it also demands anticipating what might happen under previously unencountered circumstances. Past research from the UI Department of Neurology was among the first to show that damage to certain parts of the frontal lobe can cause severe deficits in decision making in rapidly changing environments. The new study from the same department on a rare group of patients with damage to the very frontal part of their brains reveals a critical aspect of how this area contributes to decision making. The findings were published June 19 in the Journal of Neuroscience.

"We gave the patients four slot machines from which to pick in order to win money. Unbeknownst to the patients, the probability of getting money from a particular slot machine gradually and unpredictably changed during the experiment. Finding the strategy that pays the most in the long run is a surprisingly difficult problem to solve, and one we hypothesized would require the frontopolar cortex,” explains Christopher Kovach, Ph.D., a UI post-doctoral fellow in neurosurgery and first author of the study.

Contrary to the authors’ initial expectation, the patients actually did quite well on the task, winning as much money, on average, as healthy control participants.

"But when we compared their behavior to that of subjects with intact frontal lobe, we found they used a different set of assumptions about how the payoffs changed over time,” Kovach says. “Both groups based their decisions on how much they had recently won from each slot machine, but healthy comparison subjects pursued a more elaborate strategy, which involved predicting the direction that payoffs were moving based on recent trends. This points towards a specific role for the frontopolar cortex in extrapolating recent trends.”
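The task and the two strategies can be caricatured as a "restless bandit" simulation: payoff probabilities drift as a bounded random walk, one learner values arms by a recency-weighted average of rewards, and a second additionally extrapolates the latest trend. The parameters and value functions below are invented for illustration; the study's own computational model is more elaborate.

```python
import random

def drift_payoffs(probs, step=0.05):
    """Payoff probabilities follow a bounded random walk (restless bandit)."""
    return [min(0.95, max(0.05, p + random.uniform(-step, step))) for p in probs]

def recency_value(history, decay=0.7):
    """Recency-weighted average of past rewards (the simpler strategy)."""
    v = 0.5
    for r in history:
        v = decay * v + (1 - decay) * r
    return v

def trend_value(history, decay=0.7):
    """Recency value plus an extrapolation of the most recent change
    (a caricature of the healthy controls' trend-following strategy)."""
    if len(history) < 2:
        return recency_value(history, decay)
    recent = recency_value(history, decay)
    older = recency_value(history[:-1], decay)
    return recent + (recent - older)     # project the last change forward

def run(value_fn, n_arms=4, n_trials=500, epsilon=0.1):
    """Play the drifting slot machines with the given valuation strategy."""
    probs = [random.uniform(0.2, 0.8) for _ in range(n_arms)]
    histories = [[] for _ in range(n_arms)]
    total = 0
    for _ in range(n_trials):
        if random.random() < epsilon:                # occasional exploration
            arm = random.randrange(n_arms)
        else:
            arm = max(range(n_arms), key=lambda a: value_fn(histories[a]))
        reward = 1 if random.random() < probs[arm] else 0
        histories[arm].append(reward)
        total += reward
        probs = drift_payoffs(probs)
    return total
```

Because the drift here is a pure random walk, recent trends carry no real signal, so trend extrapolation can end up chasing noise, which is consistent with the study's twist that the simpler strategy did slightly better.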

Kovach’s colleague and study author Ralph Adolphs, Ph.D., professor of neuroscience and psychology at the California Institute of Technology, adds that the study results “argue that the frontopolar cortex helps us to make short-term predictions about what will happen next, a strategy particularly useful in environments that change rapidly — such as the stock market or most social settings.”

Adolphs also holds an adjunct appointment in the UI Department of Neurology.

The study’s innovative approach to understanding the function of this part of the brain uses model-based analyses of behavior of patients with specific and precisely characterized areas of brain damage. These patients are members of the UI’s world-renowned Iowa Neurological Patient Registry, which was established in 1982 and has more than 500 active members with selective forms of damage, or lesions, to one or two defined regions in the brain.

"The University of Iowa is one of the few places in the world where you could carry out this kind of study, since it requires carefully assessed patients with damage to specific parts of their brain," says study author Daniel Tranel, Ph.D., UI professor of neurology and psychology and director of the UI Division of Behavioral Neurology and Cognitive Neuroscience.

In a final twist to the finding, the strategy taken by lesion patients was actually slightly better than the one used by comparison subjects. It happened that the task was designed so that the trends in the payoffs were, in fact, random and uninformative.

"The healthy comparison subjects seemed to perceive trends in what was just random noise," Kovach says.

This implies that the functions of the frontopolar cortex, which support more complex and detailed models of the environment, at times come with a downside: setting up mistaken assumptions.

"To the best of my knowledge this is the first study which links a normal tendency to see a nonexistent pattern in random noise, a type of cognitive bias, to a particular brain region," Kovach notes.

The researchers next want to investigate other parts of the frontal cortex in the brain, and have also begun to record activity directly from the brains of neurosurgical patients to see how single cells respond while making decisions. The work is also important to understand difficulties in decision making seen in disorders such as addiction.

Provided by University of Iowa

Source: medicalxpress.com

Jun 20, 2012 · 14 notes
#science #neuroscience #brain #psychology
First example of a heritable abnormality affecting semantic cognition found

June 19, 2012

Four generations of a single family have been found to possess an abnormality within a specific brain region which appears to affect their ability to recall verbal material, a new study by researchers at the University of Bristol and University College London has found.

This is the first suggestion of a heritable brain abnormality affecting cognition in otherwise healthy humans, a finding with important implications for our understanding of the genetic basis of cognition.

Dr Josie Briscoe of Bristol’s School of Experimental Psychology and colleagues at the Institute of Child Health in London studied eight members of a single family (aged 8 years) who, despite all having high levels of intelligence, have since childhood experienced profound difficulties in recalling sentences and prose, along with language difficulties in listening comprehension and in naming less common objects.

While their conversation is articulate and engaging, they can experience the inability to ‘find’ a particular word or topic – a phenomenon similar to the ‘tip-of-the-tongue’ problem experienced by many people. They also report associated problems such as struggling to follow a narrative thread while reading or watching television drama.

Dr Briscoe said: “With their consent, we conducted a number of standard memory and language tests on the affected members of the family. These showed they had difficulty repeating longer sentences correctly and learning words in lists and pairs. This suggests their difficulties lie in semantic cognition: the way people construct and generate meaning from words, objects and ideas.”

"Given the very wide variation in age, the coherence of their difficulties in semantic cognition was remarkable."

The researchers also used Magnetic Resonance Imaging (MRI) to study the brains of the affected family members and found they had reduced grey matter in the posterior inferior portion of the temporal lobe, a brain area known to be involved in semantic cognition.

Dr Briscoe said: “These brain abnormalities were surprising to find in healthy people, particularly in the same family, although similar brain regions have been implicated in research with older adults with neurological problems that are linked to semantic cognition.”

"Our findings have uncovered a potential causal link between anomalous neuroanatomy and semantic cognition in a single family. Importantly, the pattern of inheritance appears as a potentially dominant trait. This may well prove to be the first example of a heritable, highly specific abnormality affecting semantic cognition in humans.”

Provided by University of Bristol

Source: medicalxpress.com

Jun 20, 2012 · 10 notes
#science #neuroscience #brain #psychology
'Hallucinating' robots arrange objects for human use

June 18, 2012 By Bill Steele

(Phys.org) — If you hire a robot to help you move into your new apartment, you won’t have to send out for pizza. But you will have to give the robot a system for figuring out where things go. The best approach, according to Cornell researchers, is to ask “How will humans use this?”

image

A robot populates a room with imaginary human stick figures in order to decide where objects should go to suit the needs of humans.

Researchers in the Personal Robotics Lab of Ashutosh Saxena, assistant professor of computer science, have already taught robots to identify common objects, pick them up and place them stably in appropriate locations. Now they’ve added the human element by teaching robots to “hallucinate” where and how humans might stand, sit or work in a room, and place objects in their usual relationship to those imaginary people.

Their work will be reported at the International Symposium on Experimental Robotics, June 21 in Quebec, and the International Conference of Machine Learning, June 29 in Edinburgh, Scotland.

Previous work on robotic placement, the researchers note, has relied on modeling relationships between objects. A keyboard goes in front of a monitor, and a mouse goes next to the keyboard. But that doesn’t help if the robot puts the monitor, keyboard and mouse at the back of the desk, facing the wall.

image

Above left, random placing of objects in a scene puts food on the floor, shoes on the desk and a laptop teetering on the top of the fridge. Considering the relationships between objects (upper right) is better, but the laptop is facing away from a potential user and the food is placed higher than most humans would like. Adding human context (lower left) makes things more accessible. Lower right: how an actual robot carried it out. (Personal Robotics Lab)

Relating objects to humans not only avoids such mistakes but also makes computation easier, the researchers said, because each object is described in terms of its relationship to a small set of human poses, rather than to the long list of other objects in a scene. A computer learns these relationships by observing 3-D images of rooms with objects in them, in which it imagines human figures, placing them in practical relationships with objects and furniture. You don’t put a sitting person where there is no chair. You could put a sitting person on top of a bookcase, but there are no objects there for the person to use, so that placement is ignored. The computer calculates the distance of objects from various parts of the imagined human figures, and notes the orientation of the objects.

Eventually it learns commonalities: There are lots of imaginary people sitting on the sofa facing the TV, and the TV is always facing them. The remote is usually near a human’s reaching arm, seldom near a standing person’s feet. “It is more important for a robot to figure out how an object is to be used by humans, rather than what the object is. One key achievement in this work is using unlabeled data to figure out how humans use a space,” Saxena said.

In a new situation the robot places human figures in a 3-D image of a room, locating them in relation to objects and furniture already there. “It puts a sample of human poses in the environment, then figures out which ones are relevant and ignores the others,” Saxena explained. It decides where new objects should be placed in relation to the human figures, and carries out the action.

The researchers tested their method using images of living rooms, kitchens and offices from the Google 3-D Warehouse, and later, images of local offices and apartments. Finally, they programmed a robot to carry out the predicted placements in local settings. Volunteers who were not associated with the project rated the placement of each object for correctness on a scale of 1 to 5.

Comparing various algorithms, the researchers found that placements based on human context were more accurate than those based solely on relationships between objects, but the best results of all came from combining human context with object-to-object relationships, with an average score of 4.3. Some tests were done in rooms with furniture and some objects, others in rooms where only a major piece of furniture was present. The object-only method performed significantly worse in the latter case because there was no context to use. “The difference between previous works and our [human to object] method was significantly higher in the case of empty rooms,” Saxena reported.
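The idea of scoring placements against imagined people can be sketched as follows: candidate locations are rated by how close they fall to an assumed "preferred reach distance" from sampled human poses, combined with a pairwise object-to-object term, and the best-scoring candidate wins. The 2-D geometry, Gaussian scoring, distances and weights below are all invented for illustration; the actual system learns these relationships from labeled 3-D scenes.

```python
import math
import random

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def human_context_score(candidate, poses, preferred=0.5, sigma=0.3):
    """Higher when the object sits near the preferred reach distance of
    some sampled human pose (e.g. a remote near a sitting person)."""
    best = 0.0
    for pose in poses:
        d = dist(candidate, pose)
        best = max(best, math.exp(-((d - preferred) ** 2) / (2 * sigma ** 2)))
    return best

def object_score(candidate, objects, preferred=0.4, sigma=0.3):
    """Pairwise object-object term (e.g. mouse near keyboard)."""
    if not objects:
        return 0.0
    d = min(dist(candidate, o) for o in objects)
    return math.exp(-((d - preferred) ** 2) / (2 * sigma ** 2))

def place(candidates, poses, objects, w_human=0.6, w_obj=0.4):
    """Pick the candidate maximizing the combined score, mirroring the
    finding that human context plus object context beats either alone."""
    return max(candidates, key=lambda c: w_human * human_context_score(c, poses)
                                       + w_obj * object_score(c, objects))

random.seed(0)
candidates = [(random.uniform(0, 4), random.uniform(0, 4)) for _ in range(200)]
sofa_pose = (1.0, 1.0)        # one imagined sitting person
tv = (3.0, 1.0)               # one existing object in the room
spot = place(candidates, [sofa_pose], [tv])
print(f"place remote at ({spot[0]:.2f}, {spot[1]:.2f})")
```

With the object term alone the remote could end up anywhere near the TV; the human term pulls it to within arm's reach of the imagined person on the sofa.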

Provided by Cornell University

Source: phys.org

Jun 19, 2012 · 11 notes
#science #neuroscience #robotics
Robots Get a Feel for the World

June 18th, 2012

Robots equipped with tactile sensors are able to identify materials through touch, paving the way for more useful prostheses.

What does a robot feel when it touches something? Little or nothing until now. But with the right sensors, actuators and software, robots can be given the sense of feel, or at least the ability to identify different materials by touch.

Researchers at the University of Southern California’s Viterbi School of Engineering published a study today in Frontiers in Neurorobotics showing that a specially designed robot can outperform humans in identifying a wide range of natural materials according to their textures, paving the way for advancements in prostheses, personal assistive robots and consumer product testing.

The robot was equipped with a new type of tactile sensor built to mimic the human fingertip. It also used a newly designed algorithm to make decisions about how to explore the outside world by imitating human strategies. The sensor can also detect where and in which direction forces are applied to the fingertip, and even the thermal properties of an object being touched, capabilities that mirror other human tactile sensations.

Like the human finger, the group’s BioTac® sensor has a soft, flexible skin over a liquid filling. The skin even has fingerprints on its surface, greatly enhancing its sensitivity to vibration. As the finger slides over a textured surface, the skin vibrates in characteristic ways. These vibrations are detected by a hydrophone inside the bone-like core of the finger. The human finger uses similar vibrations to identify textures, but the robot finger is even more sensitive.

[Video: Robots Get a Feel for the World (USC Viterbi, via Vimeo)]

When humans try to identify an object by touch, they use a wide range of exploratory movements based on their prior experience with similar objects. A famous theorem by 18th century mathematician Thomas Bayes describes how decisions might be made from the information obtained during these movements. Until now, however, there was no way to decide which exploratory movement to make next. The article, authored by Professor of Biomedical Engineering Gerald Loeb and recently graduated doctoral student Jeremy Fishel, describes their new theorem for solving this general problem as “Bayesian Exploration.”

Built by Fishel, the specialized robot was trained on 117 common materials gathered from fabric, stationery and hardware stores. When confronted with one material at random, the robot could correctly identify the material 95% of the time, after intelligently selecting and making an average of five exploratory movements. It was only rarely confused by pairs of similar textures that human subjects making their own exploratory movements could not distinguish at all.
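The "Bayesian exploration" idea can be sketched in miniature: maintain a belief over candidate materials, pick the exploratory movement expected to shrink uncertainty the most, observe a noisy feature, and update the belief with Bayes' rule until one material dominates. The materials, movements and the simple Gaussian feature model below are invented for illustration; the real BioTac system works with rich vibration, force and thermal features.

```python
import math
import random

MATERIALS = ["denim", "felt", "sandpaper"]
MOVEMENTS = ["light_slide", "firm_slide"]
# Assumed mean feature value each movement elicits from each material.
MODEL = {
    ("light_slide", "denim"): 0.20, ("light_slide", "felt"): 0.25,
    ("light_slide", "sandpaper"): 0.80,
    ("firm_slide", "denim"): 0.30, ("firm_slide", "felt"): 0.70,
    ("firm_slide", "sandpaper"): 0.90,
}
SIGMA = 0.1  # measurement noise

def likelihood(x, mu):
    return math.exp(-((x - mu) ** 2) / (2 * SIGMA ** 2))

def update(belief, movement, x):
    """Bayes update of the belief after observing feature x."""
    post = {m: belief[m] * likelihood(x, MODEL[(movement, m)]) for m in belief}
    z = sum(post.values())
    return {m: p / z for m, p in post.items()}

def entropy(belief):
    return -sum(p * math.log(p) for p in belief.values() if p > 0)

def expected_entropy(belief, movement, samples=200):
    """Monte Carlo estimate of posterior uncertainty after one movement."""
    total = 0.0
    for _ in range(samples):
        true = random.choices(MATERIALS, weights=[belief[m] for m in MATERIALS])[0]
        x = random.gauss(MODEL[(movement, true)], SIGMA)
        total += entropy(update(belief, movement, x))
    return total / samples

def explore(true_material, max_moves=5):
    """Choose the most informative movement each step, then update."""
    belief = {m: 1 / len(MATERIALS) for m in MATERIALS}
    for _ in range(max_moves):
        move = min(MOVEMENTS, key=lambda mv: expected_entropy(belief, mv))
        x = random.gauss(MODEL[(move, true_material)], SIGMA)
        belief = update(belief, move, x)
        if max(belief.values()) > 0.99:
            break
    return max(belief, key=belief.get)
```

Note how the movement choice adapts: denim and felt feel nearly identical under a light slide in this toy model, so once those two remain in contention the expected-information criterion favors the firm slide that separates them.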

image

Tactile sensors that mimic fingertips enable robots to identify materials through touch better than humans can. Image from press release by USC Viterbi School of Engineering.

So, is touch another task that humans will outsource to robots? Fishel and Loeb point out that while their robot is very good at identifying which textures are similar to each other, it has no way to tell what textures people will prefer. Instead, they say this robot touch technology could be used in human prostheses or to assist companies who employ experts to assess the feel of consumer products and even human skin.

Source: Neuroscience News

Jun 19, 2012 · 13 notes
#science #neuroscience #robotics
Children, Brain Development and the Criminal Law

ScienceDaily (June 18, 2012) — The legal system needs to take greater account of new discoveries in neuroscience showing how a difficult childhood can affect the development of a young person’s brain, which can in turn increase the risk of adolescent crime, according to researchers.

The research will be presented as part of an Economic and Social Research Council seminar series in conjunction with the Parliamentary Office of Science and Technology.

Neuroscientists have recently shown that early adversity — such as a very chaotic and frightening home life — can result in a young child becoming hypervigilant to potential threats in their environment. This appears to influence the development of brain connectivity and functions.

Such children may come to adolescence with brain systems that are set differently, and this may increase their likelihood of taking impulsive risks. For many young offenders such early adversity is a common experience, and it may increase both their vulnerability to mental health problems and also their risk of problem behaviours.

These insights, from a team led by Dr Eamon McCrory, University College London, are part of a wave of neuroscientific research questions that have potential implications for the legal system.

Other research by Dr Seena Fazel of Oxford University has shown that while social disadvantage is a major risk factor for offending, a Traumatic Brain Injury (TBI) — from an accident or assault — significantly increases the risk of involvement in violent crime. Professor Huw Williams, at University of Exeter, has similarly shown that around 45 per cent of young offenders have TBI histories, and more injuries are associated with greater violence.

Professor Williams said: “The latest message from neuroscience is that young people who suffer troubled childhoods may experience a kind of ‘triple whammy’. A difficult social background may put them at greater risk of offending and influence their brain development early on in childhood in a way that increases risky behaviour. This can then increase their chances of experiencing an injury to their brains that would compromise their ability to stay in school or contribute to society still further.”

Professor Williams wants to see better communication between neuroscientists, clinicians and lawyers so that research findings like these lead to changes in the legal system. “There is a big gap between research conducted by neuroscientists and the realities of the day to day work of the justice system,” he said. “Although criminal behaviour results from a complex interplay of a host of factors, neuroscientists and clinicians are identifying key risk factors that — if addressed — could reduce crime. Investment in earlier, focussed interventions may offset the costs of years of custody and social violence.”

Dr Eileen Vizard, a prominent adolescent forensic psychiatrist, will speak at the event Neuroscience, Children and the Law about how the criminal justice system needs to change to provide age-appropriate sentencing for children as young as ten, while also providing for the welfare needs of these deprived children. Laura Hoyano — a leading expert on vulnerable people in criminal courts — will discuss the problems children face when testifying in criminal courts.

Source: Science Daily

Jun 19, 2012 · 11 notes
#science #neuroscience #psychology #brain
Clues to Nervous System Evolution Found in Nerve-Less Sponge

ScienceDaily (June 18, 2012) — UC Santa Barbara scientists turned to the simple sponge to find clues about the evolution of the complex nervous system and found that, but for a mechanism that coordinates the expression of genes that lead to the formation of neural synapses, sponges and the rest of the animal world may not be so distant after all. Their findings, titled “Functionalization of a protosynaptic gene expression network,” are published in the Proceedings of the National Academy of Sciences.

image

The genes of Amphimedon queenslandica, a marine sponge native to the Great Barrier Reef, Australia, have been fully sequenced, allowing the researchers to monitor gene expression for signs of neural development. (Credit: UCSB)

"If you’re interested in finding the truly ancient origins of the nervous system itself, we know where to look," said Kenneth Kosik, Harriman Professor of Neuroscience Research in the Department of Molecular, Cellular & Developmental Biology, and co-director of UCSB’s Neuroscience Research Institute.

That place, said Kosik, is the evolutionary period when virtually the rest of the animal kingdom branched off from a common ancestor it shared with sponges, the oldest known animal group with living representatives. Something must have happened to spur the evolution of the nervous system, a characteristic shared by creatures ranging from simple jellyfish and hydra to complex humans, according to Kosik.

A previous sequencing of the genome of Amphimedon queenslandica — a sponge that lives in Australia’s Great Barrier Reef — showed that it contains the same genes that lead to the formation of synapses, the highly specialized characteristic components of the nervous system that send chemical and electrical signals between cells. Synapses are like microprocessors, said Kosik, explaining that they carry out many sophisticated functions: they send and receive signals, and they also change their behavior with interaction — a property called “plasticity.”

"Specifically, we were hoping to understand why the marine sponge, despite having almost all the genes necessary to build a neuronal synapse, does not have any neurons at all," said the paper’s first author, UCSB postdoctoral researcher Cecilia Conaco, from the UCSB Department of Molecular, Cellular, and Developmental Biology (MCDB) and Neuroscience Research Institute (NRI). "In the bigger scheme of things, we were hoping to gain an understanding of the various factors that contribute to the evolution of these complex cellular machines."

This time the scientists, including Danielle Bassett from the Department of Physics and the Sage Center for the Study of the Mind, and Hongjun Zhou and Mary Luz Arcila from NRI and MCDB, examined the sponge’s RNA (ribonucleic acid), whose levels reflect gene expression. They followed the activity of the genes that encode the proteins of a synapse throughout the different stages of the sponge’s development.

"We found a lot of them turning on and off, as if they were doing something," said Kosik. However, compared to the same genes in other animals, which are expressed in unison, suggesting a coordinated effort to make a synapse, the ones in sponges were not coordinated.

"It was as if the synapse gene network was not wired together yet," said Kosik. The critical step in the evolution of the nervous system as we know it, he said, was not the invention of a gene that created the synapse, but the regulation of preexisting genes that were somehow coordinated to express simultaneously, a mechanism that took hold in the rest of the animal kingdom.

The work isn’t over, said Kosik. Plans for future research include a deeper look at some of the steps that lead to the formation of the synapse, and a study of how nervous systems changed after they began to evolve.

"Is the human brain just a lot more of the same stuff, or has it changed in a qualitative way?" he asked.

Source: Science Daily

Jun 19, 2012 · 13 notes
#science #neuroscience #evolution #psychology #nervous system
Diabetes, poor glucose control associated with greater cognitive decline in older adults

June 18, 2012

Among well-functioning older adults without dementia, diabetes mellitus (DM) and poor glucose control among those with DM are associated with worse cognitive function and greater cognitive decline, according to a report published Online First by Archives of Neurology, a JAMA Network publication.

Findings from previous studies have suggested an association between diabetes mellitus and an increased risk of cognitive impairment and dementia, including Alzheimer disease, but this association continues to be debated, and less is known about the relationship between incident DM in late life and cognitive function over time, the authors write as background to the study.

Kristine Yaffe, M.D., of the University of California, San Francisco and the San Francisco VA Medical Center, and colleagues evaluated 3,069 patients (mean age, 74.2 years; 42 percent black; 52 percent female) who completed the Modified Mini-Mental State Examination (3MS) and Digit Symbol Substitution Test (DSST) at baseline and selected intervals over 10 years.

At study baseline, 717 patients (23.4 percent) had prevalent DM and 2,352 (76.6 percent) were without DM, 159 of whom developed DM during follow-up. Patients with prevalent DM at baseline had lower 3MS and DSST scores than patients without DM, and the analysis showed a similar pattern for 9-year decline, with participants with prevalent DM showing significant decline on both the 3MS and DSST compared with those without DM.

Also, among participants with prevalent DM at baseline, higher levels of hemoglobin A1c (HbA1c) were associated with lower 3MS and DSST scores. However, after adjusting for age, sex, race and education, scores remained significantly lower for those with mid (7 percent to 8 percent) and high (greater than or equal to 8 percent) HbA1c levels on the 3MS but were no longer significant for the DSST.

"This study supports the hypothesis that older adults with DM have reduced cognitive function and that poor glycemic control may contribute to this association,” the authors conclude. “Future studies should determine if early diagnosis and treatment of DM lessen the risk of developing cognitive impairment and if maintaining optimal glucose control helps mitigate the effect of DM on cognition.”

Provided by JAMA and Archives Journals

Source: medicalxpress.com

Jun 19, 2012 · 2 notes
#science #neuroscience #brain #alzheimer
Highways of the brain: High-cost and high-capacity

June 18, 2012

A new study proposes a communication routing strategy for the brain that mimics the American highway system, with the bulk of the traffic leaving the local and feeder neural pathways to spend as much time as possible on the longer, higher-capacity passages through an influential network of hubs, the so-called rich club.

image

The study, published this week online in the Early Edition of the Proceedings of the National Academy of Sciences, involves researchers from Indiana University and the University Medical Center Utrecht in the Netherlands and advances their earlier findings that showed how select hubs in the brain not only are powerful in their own right but have numerous and strong connections between each other.

The current study characterizes the influential network within the rich club as the “backbone” for global brain communication. It is a costly network in terms of the energy and space it consumes, said Olaf Sporns, professor in the Department of Psychological and Brain Sciences at IU Bloomington, but one with a big pay-off: quick and effective communication among billions of brain cells.

"Until now, no one knew how central the brain’s rich club really was," Sporns said. "It turns out the rich club is always right in the middle when it comes to how brain regions talk to each other. It absorbs, transforms and disseminates information. This underscores its importance for brain communication.”

In earlier work, using diffusion imaging, the researchers found a group of 12 strongly interconnected bihemispheric hub regions, comprising the precuneus, superior frontal and superior parietal cortex, as well as the subcortical hippocampus, putamen and thalamus. Together, these regions form the brain’s “rich club.” Most of these areas are engaged in a wide range of complex behavioral and cognitive tasks, rather than more specialized processing such as vision and motor control.

For the current study, Martijn van den Heuvel, a professor at the Rudolf Magnus Institute of Neuroscience at University Medical Center Utrecht, used diffusion tensor imaging data for two sets of 40 healthy subjects to map the large-scale connectivity structure of the brain. The cortical sheet was divided into 1,170 regions, and then pathways between the regions were reconstructed and measured. As in the previous study, the rich club nodes were widely distributed and had up to 40 percent more connectivity compared to other areas.

The connections measured — almost 700,000 in total — were classified in one of three ways: as rich club connections if they connected nodes within the rich club; as feeder connections if they connected a non-rich club node to a rich club node; and as local connections if they connected non-rich club nodes. Rich club connections made up the majority of all long-distance neural pathways. The study also found that connections classified as rich club connections were used more heavily for communication than other feeder and local connections. A path analysis showed that when a minimally short path is traced from one area of the brain to another, it travels through the rich club network 69 percent of the time, even though the network accounts for only 10 percent of the brain.
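The three-way classification and the shortest-path analysis described above can be sketched in plain Python. Note that the toy edge list and hub membership below are invented for illustration; the actual study worked with 1,170 regions and nearly 700,000 measured connections:

```python
from collections import deque
from itertools import combinations

# Toy undirected graph: six peripheral nodes, two hub ("rich club") nodes.
edges = [("a", "h1"), ("b", "h1"), ("c", "h1"), ("d", "h2"), ("e", "h2"),
         ("f", "h2"), ("h1", "h2"), ("a", "b"), ("d", "e")]
rich_club = {"h1", "h2"}

def classify(edge, rich):
    """Label an edge rich_club, feeder, or local, per the study's scheme."""
    u, v = edge
    if u in rich and v in rich:
        return "rich_club"
    if u in rich or v in rich:
        return "feeder"
    return "local"

# Build an adjacency map for breadth-first search.
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def shortest_path(src, dst):
    """Return one minimally short path from src to dst via BFS."""
    prev = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            break
        for nxt in adj[node]:
            if nxt not in prev:
                prev[nxt] = node
                queue.append(nxt)
    path, node = [], dst
    while node is not None:
        path.append(node)
        node = prev[node]
    return path[::-1]

labels = {e: classify(e, rich_club) for e in edges}

# Fraction of shortest paths between peripheral nodes that pass
# through a rich-club node (analogous to the study's 69 percent).
periphery = [n for n in adj if n not in rich_club]
pairs = list(combinations(periphery, 2))
through_hub = sum(
    any(n in rich_club for n in shortest_path(s, t)[1:-1])
    for s, t in pairs)
total = len(pairs)
print(f"{through_hub}/{total} shortest paths pass through the rich club")
# prints: 13/15 shortest paths pass through the rich club
```

In this toy network only the two directly connected peripheral pairs avoid the hubs; every other shortest route funnels through the rich club, illustrating why a small set of hub connections can carry the majority of traffic.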

A common pattern in communication paths spanning long distances, Sporns said, was that such paths involved sequences of steps leading across local, feeder, rich club, feeder and back to local connections. In other words, he said, many communication paths first traveled toward the rich club before reaching their destinations.

"It is as if the rich club acts as an attractor for signal traffic in the brain," Sporns said. "It soaks up information which is then integrated and sent back out to the rest of the brain."

Van den Heuvel agreed.

"It’s like a big ‘neuronal magnet’ for communication and information integration in our brains," he said. "Seeking out the rich club may offer a strategy for neurons and brain regions to find short communication paths across the brain, and might provide insight into how our brain manages to be so highly efficient."

From an evolutionary standpoint, it was important for the brain to minimize energy consumption and wiring volume, but if these were the only factors, there would be no rich club because of the extra resources it requires, Sporns said. The rich club is expensive, at least in terms of wiring volume, and perhaps also in terms of metabolic cost. The trade-off for higher cost, Sporns said, is higher performance — the integration of diverse signals and the ability to select short paths across the network.

“Brain neurons don’t have maps; how do they find paths to get in touch? Perhaps the rich club helps with this, offering the brain’s neurons and regions a way to communicate efficiently based on a routing strategy that involves the rich club.”

People use related strategies to navigate social networks.

"Strangely, neurons may solve their communication problems just like the people to whom they belong," Sporns said.

Provided by Indiana University

Source: medicalxpress.com

Jun 19, 2012 · 13 notes
#science #neuroscience #brain #psychology
Coenzyme Q10 study indicates promise in Huntington's treatment

June 18, 2012

A new study shows that the compound Coenzyme Q10 (CoQ) reduces oxidative damage, a key finding that hints at its potential to slow the progression of Huntington’s disease. The discovery, which appears in the inaugural issue of the Journal of Huntington’s Disease, also points to a new biomarker that could be used to screen experimental treatments for this and other neurological disorders.

"This study supports the hypothesis that CoQ exerts antioxidant effects in patients with Huntington’s disease and therefore is a treatment that warrants further study," says University of Rochester Medical Center neurologist Kevin M. Biglan, M.D., M.P.H., lead author of the study. “As importantly, it has provided us with a new method to evaluate the efficacy of potential new treatments.”

Huntington’s disease (HD) is a genetic, progressive neurodegenerative disorder that impairs movement, behavior and cognition, and generally results in death within 20 years of the disease’s onset. While the precise causes and mechanism of the disease are not completely understood, scientists believe that one of its important triggers is a genetic “stutter” that produces abnormal protein deposits in brain cells. It is believed that these deposits — through a chain of molecular events — inhibit the cell’s ability to meet its energy demands, resulting in oxidative stress and, ultimately, cell death.

Scientists had previously identified a correlation between a specific oxidized DNA component, called 8-hydroxy-2’-deoxyguanosine (8-OHdG), and the presence of oxidative stress in brain cells. 8-OHdG can be detected in a person’s blood, meaning that it could serve as a convenient and accessible biomarker for the disease. Researchers have also been evaluating the compound Coenzyme Q10 as a possible treatment for HD because of its ability to support the function of mitochondria — the tiny power plants that provide cells with energy — and counter oxidative stress.

The study’s authors evaluated a series of blood samples from 20 individuals with HD who had previously undergone treatment with CoQ in a clinical trial titled Pre-2Care. While these studies showed that CoQ alleviated some symptoms of the disease, it was not known what impact — if any — the treatment had at the molecular level in the brain. Upon analysis, the authors found that 8-OHdG levels dropped by 20 percent in individuals who had been treated with CoQ.

CoQ is currently being evaluated in a Phase 3 clinical trial, the largest therapeutic clinical study for HD to date. The trial — called 2Care — is being run by the Huntington Study Group, an international network of investigators.

"Identifying treatments that slow the progression or delay the onset of Huntington’s disease is a major focus of the medical community," said Biglan. "This study demonstrates that 8-OHdG could be an ideal marker for identifying the presence of oxidative injury and whether or not a treatment is having an impact."

Provided by University of Rochester Medical Center

Source: medicalxpress.com

Jun 18, 2012 · 11 notes
#science #neuroscience #brain #huntington #psychology
Device implanted in brain has therapeutic potential for Huntington's disease

June 18, 2012

Studies suggest that neurotrophic factors, which play a role in the development and survival of neurons, have significant therapeutic and restorative potential for neurologic diseases such as Huntington’s disease. However, clinical applications are limited because these proteins cannot easily cross the blood brain barrier, have a short half-life, and cause serious side effects. Now, a group of scientists has successfully treated neurological symptoms in laboratory rats by implanting a device to deliver a genetically engineered neurotrophic factor directly to the brain. They report on their results in the latest issue of Restorative Neurology and Neuroscience.

image

The tip of the EC biodelivery system, a straw-like device that is implanted in the brain of patients, contains living cells which are genetically modified to produce a therapeutic factor. The membrane enclosing the cells allows the factor to flow out of the device and into the patient’s brain tissue. This way, areas deep within the brain affected by Huntington’s disease can be treated to delay or prevent the disease. Credit: Jens Tornøe, NsGene A/S, Ballerup, Denmark

Researchers used Encapsulated Cell (EC) biodelivery, a platform which can be applied using conventional minimally invasive neurosurgical procedures to target deep brain structures with therapeutic proteins. “Our study adds to the continually increasing body of preclinical and clinical data positioning EC biodelivery as a promising therapeutic delivery method for larger biomolecules. It combines the therapeutic advantages of gene therapy with the well-established safety of a retrievable implant,” says lead investigator Jens Tornøe, NsGene A/S, Ballerup, Denmark.

Investigators made a catheter-like device consisting of a hollow fiber membrane encapsulating a polymeric “scaffold,” which provides a surface area to which neurotrophic factor-producing cells can attach. When implanted in the brain, the membrane allows the neurotrophic factor to flow out of the device, as well as allowing nutrients in. Dr. Tornøe and his colleagues used the neurotrophic factor Meteorin, which plays a role in the development of striatal projection neurons, whose degeneration is a hallmark of Huntington’s disease. The scientists engineered ARPE-19 cells to produce Meteorin and used those that produced high levels of Meteorin in their experiment.

The EC biodelivery devices were implanted in the brains of rats, followed by injection of quinolinic acid (QA), a potent neurotoxin that causes excitotoxicity, a component of Huntington’s disease. The researchers tested three different implant types: devices filled with the high-producing ARPE-19 cells (EC-Meteorin), devices with unmodified ARPE-19 cells (ARPE-19), and devices without cells. Motor function was tested immediately prior to injection with QA and at two and four weeks after injection.

The research team found that the EC-Meteorin devices significantly protected against QA-induced toxicity. Rats with EC-Meteorin devices manifested near normal neurological performance and significantly reduced loss of brain cells from the QA injection compared to controls. Analysis of the Meteorin-treated brains showed a markedly reduced striatal lesion size. The EC biodelivery devices were found to produce stable or even increasing levels of Meteorin throughout the study. Meteorin diffused readily from the biodelivery device to the striatal tissue.

"Huntington’s disease can be diagnosed with high accuracy by genetic testing. Pre-symptomatic administration of a safe therapeutic treatment providing sustained delay or prevention of disease would be of great benefit to patients," says Dr. Tornøe. "With additional functional and safety data, tests in animals larger than the rat to study distribution, and more accurate disease models to evaluate the therapeutic potential of Meteorin, we anticipate that EC biodelivery can be developed as a platform technology for targeted therapy in patients with Huntington’s disease."

Provided by IOS Press

Source: medicalxpress.com

Jun 18, 2012 · 10 notes
#science #neuroscience #brain #psychology #huntington
MRI images show what the brain looks like when you lose self-control

June 18, 2012

New pictures from the University of Iowa show what it looks like when a person runs out of patience and loses self-control.

image

This image shows brain activity when people exert self-control. Credit: University of Iowa

A study by University of Iowa neuroscientist and neuro-marketing expert William Hedgcock confirms previous studies that show self-control is a finite commodity that is depleted by use. Once the pool has dried up, we’re less likely to keep our cool the next time we’re faced with a situation that requires self-control.

But Hedgcock’s study is the first to actually show it happening in the brain using fMRI images that scan people as they perform self-control tasks. The images show the anterior cingulate cortex (ACC)—the part of the brain that recognizes a situation in which self-control is needed and says, “Heads up, there are multiple responses to this situation and some might not be good”—fires with equal intensity throughout the task.

However, the dorsolateral prefrontal cortex (DLPFC)—the part of the brain that manages self-control and says, “I really want to do the dumb thing, but I should overcome that impulse and do the smart thing”—fires with less intensity after prior exertion of self-control.

image

This image shows brain activity after people have been engaged in self-control tasks long enough that self-control resources have been depleted. Credit: University of Iowa

He said that loss of activity in the DLPFC might be the person’s self-control draining away. The stable activity in the ACC suggests people have no problem recognizing a temptation. Although they keep fighting, they have a harder and harder time not giving in.

That would explain why someone who works very hard not to take seconds of lasagna at dinner winds up taking two pieces of cake at dessert. The study could also modify previous thinking that considered self-control to be like a muscle. Hedgcock says his images suggest it is more like a pool that can be drained by use and then replenished over time in a lower-conflict environment, away from temptations that require its use.

The researchers gathered their images by placing subjects in an MRI scanner and having them perform two self-control tasks — the first involved ignoring words that flashed on a computer screen, while the second involved choosing preferred options. The study found the subjects had a harder time exerting self-control on the second task, a phenomenon called “regulatory depletion.” Hedgcock says the subjects’ DLPFCs were less active during the second self-control task, suggesting it was harder for them to overcome their initial response.

Hedgcock says the study is an important step toward a clearer definition of self-control and toward figuring out why people do things they know aren’t good for them. One possible implication is crafting better programs to help people who are trying to break addictions to things like food, shopping, drugs, or alcohol. Some therapies now help people break addictions by focusing on the conflict-recognition stage and encouraging the person to avoid situations where that conflict arises. For instance, an alcoholic should stay away from places where alcohol is served.

But Hedgcock says his study suggests new therapies might be designed by focusing on the implementation stage instead. For instance, he says dieters sometimes offer to pay a friend if they fail to implement control by eating too much food, or the wrong kind of food. That penalty adds a real consequence to their failure to implement control and increases their odds of choosing a healthier alternative.

The study might also help people who suffer from a loss of self-control due to birth defect or brain injury.

"If we know why people are losing self-control, it helps us design better interventions to help them maintain control," says Hedgcock, an assistant professor in the Tippie College of Business marketing department and the UI Graduate College’s Interdisciplinary Graduate Program in Neuroscience.

Provided by University of Iowa

Source: medicalxpress.com

Jun 18, 2012 · 48 notes
#science #neuroscience #brain #psychology
The neurological basis for fear and memory

June 18, 2012

Fear conditioning with sound and taste aversion, as applied to mice, has revealed interesting information on the basis of memory allocation.

image

Credit: Thinkstock

The European project ‘Cellular mechanisms underlying formation of the fear memory trace in the mouse amygdala’ (FEAR Memory TRACE) is investigating memory allocation and the recruitment of particular neurons to encode a memory. By studying conditioned fear memory in response to an auditory stimulus, the researchers have delved into pathological emotional states and the neural mechanisms involved in memory allocation, retrieval and extinction.

Prior research has revealed that the conditioned fear response in mice is located in a specific bundle of neurons in the amygdala. Memory allocation modulation is due to expression of the transcription factor, cyclic adenosine 3’, 5’-monophosphate response element binding protein (CREB) and possibly neuronal excitability.

FEAR Memory TRACE focused on the electrophysiological properties of neurons encoding the same memory. The project also aimed to ascertain the biophysical mechanisms in the plasticity changes recorded in the specific set of neurons in the fear memory trace.

To record information on auditory fear conditioning and conditioned taste aversion, the scientists performed intra-amygdala surgery with viral vectors and ran electrophysiological experiments to measure neuronal excitability.

In neural control experiments, the researchers used viral transfection to express CREB tagged with green fluorescent protein together with the gene for channelrhodopsin-2. Combined, these two elements caused firing in specific nerve cells. Molecular techniques included western blotting for protein detection, genotyping and viral DNA preparation.

Behavioural tests of long- and short-term memory in mice, involving fear conditioning and taste aversion, showed increased memory performance at the three-hour point rather than the five-hour point. The intrinsic excitability of mice receiving both the shock and the tone was increased at three hours, but not five, compared with mice that received only the tone.

As the project continues to its close in two years, the aim is to identify biophysical mechanisms involved in recruiting neurons that compete with each other for a specific memory. FEAR Memory TRACE will also develop computational models to assess the role of these mechanisms in memory performance.

Information on biochemical processes in neural mechanisms has wide application in many clinical situations including patients suffering memory loss, such as stroke victims. Fear response manipulation can be applied in treatment of neuroses and phobias.

Provided by CORDIS

Source: medicalxpress.com

Jun 18, 2012 · 38 notes
#science #neuroscience #brain #psychology #memory #emotion
Manipulation of a specific neural circuit buried in complicated brain networks in primates

June 17, 2012

A collaborative research team led by Professor Tadashi ISA of the National Institute for Physiological Sciences (National Institutes of Natural Sciences), together with Fukushima Medical University and Kyoto University, developed a “double viral vector transfection technique” that can deliver genes to a specific neural circuit by combining two new kinds of gene transfer vectors. With this method, they found that “indirect pathways” — suspected to be evolutionary leftovers from before the direct connection from the brain to the motor neurons that control muscles was established — actually play an important role in highly developed, dexterous hand movements. This study was supported by the Strategic Research Program for Brain Sciences of the MEXT of Japan. The result will be published in Nature (June 17th, advance online publication).

It is said that the higher primates, including human beings, achieved explosive evolution by acquiring the ability to move their hands skillfully. This ability to move individual fingers has been thought to result from the evolution of a direct connection from the cerebrocortical motor area to the motor neurons of the spinal cord that control the muscles. In lower animals with clumsy hands, such as cats or rats, the cortical motor area is connected to the motor neurons only through interneurons of the spinal cord. Such an “indirect pathway” remains in us primates, without its function being fully understood. Is this phylogenetically old circuit still in operation, or is it suppressed because it would interfere? The debate had never been settled.

The team, led by Professor Tadashi ISA and Project Assistant Professor Masaharu KINOSHITA of the National Institute for Physiological Sciences (National Institutes of Natural Sciences), Fukushima Medical University and Kyoto University, developed the “double viral vector transfection technique”, which can deliver genes to a specific neural circuit by combining two new kinds of gene transfer vectors.

With this method, they succeeded in the selective and reversible suppression of the propriospinal neurons (the spinal interneurons mediating the indirect connection from the cortical motor area to the spinal motor neurons).

The results revealed that the “indirect pathways” play an important role in dexterous hand movements, finally bringing a longstanding debate to a close.

The key component of this discovery was the “double viral vector transfection technique”, in which one vector is retrogradely transported from the terminal zone back to the neuronal cell bodies and the other is transfected at the location of the cell bodies. Expression of the target gene occurs only in cells doubly transfected by the two vectors. Using this technique, the researchers suppressed the propriospinal neurons selectively and reversibly.

Such an operation had been possible in mice, where heritable genetic manipulation of germline cells is feasible, but until now it was impossible in primates.

Using this method, further development of gene therapy targeted to a specific neural circuit can be expected.

Professor Tadashi ISA says: “This newly developed double viral vector transfection technique can be applied to gene therapy of the human central nervous system, as we are likewise higher primates.

This discovery also reverses the general idea that the spinal cord is only a reflex pathway, showing that it plays a pivotal role in integrating the complex neural signals that enable dexterous movements.”

Provided by National Institute for Physiological Sciences

Source: medicalxpress.com

Jun 17, 2012 · 9 notes
#science #neuroscience #brain #psychology #neuron
Freud's Theory of Unconscious Conflict Linked to Anxiety Symptoms

ScienceDaily (June 16, 2012) — A link between unconscious conflicts and the conscious symptoms of anxiety disorders has been demonstrated, lending empirical support to psychoanalysis.

image

Data from the experiment showing that subliminal exposure to words related to a person’s unconscious conflict, followed by supraliminal exposure to words related to their anxiety symptoms, led to different alpha wave patterns compared with other scenarios. (Credit: Image courtesy of University of Michigan Health System)

An experiment that Sigmund Freud could never have imagined 100 years ago may help lend scientific support for one of his key theories, and help connect it with current neuroscience.

On June 16, at the 101st Annual Meeting of the American Psychoanalytic Association, a University of Michigan professor who has spent decades applying scientific methods to the study of psychoanalysis will present new data supporting a causal link between the psychoanalytic concept known as unconscious conflict and the conscious symptoms experienced by people with anxiety disorders such as phobias.

Howard Shevrin, Ph.D., emeritus professor of psychology in the U-M Medical School’s Department of Psychiatry, will present data from experiments performed in U-M’s Ormond and Hazel Hunt Laboratory.

The research involved 11 people with anxiety disorders who each received a series of psychoanalytically oriented diagnostic sessions conducted by a psychoanalyst.

From these interviews the psychoanalysts inferred what underlying unconscious conflict might be causing the person’s anxiety disorder. Words capturing the nature of the unconscious conflict were then selected from the interviews and used as stimuli in the laboratory. They also selected words related to each patient’s experience of anxiety disorder symptoms. Although these words differed from patient to patient, results showed that they functioned in the same way.

These verbal stimuli were presented subliminally, at one thousandth of a second, and supraliminally, at 30 milliseconds. A control category of stimuli was added that had no relationship to the unconscious conflict or the anxiety symptoms. While the stimuli were presented to the patients, scalp electrodes recorded the brain responses to them.

In a previous experiment, Shevrin had demonstrated, using time-frequency features of brain activity, that patients’ brains grouped the unconscious conflict stimuli together only when those stimuli were presented subliminally. The conscious symptom-related stimuli showed the reverse pattern: brain activity grouped them together better when patients viewed the words supraliminally.

"Only when the unconscious conflict words were presented unconsciously could the brain see them as connected," Shevrin notes. "What the analysts put together from the interview session made sense to the brain only unconsciously."

However, the experimental design in this first experiment did not allow for directly comparing the effect of the unconscious conflict stimuli on the conscious symptom stimuli.

To obtain evidence at that next level, the unconscious conflict stimuli were presented immediately before the conscious symptom stimuli, and a new measurement was made of the brain’s alpha rhythm, an oscillation at 8–13 cycles per second that had been shown to inhibit various cognitive functions.

Highly significant correlations, suggesting an inhibitory effect, were obtained when the amount of alpha generated by the unconscious conflict stimuli was correlated with the amount of alpha associated with the conscious symptom stimuli — but only when the unconscious conflict stimuli were presented subliminally. No such correlations were obtained when control stimuli replaced the symptom words. That these findings reflect inhibition suggests, from a psychoanalytic standpoint, that repression might be involved.
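The alpha-band analysis described here can be sketched in a few lines of code. The following is a minimal illustration, not the authors' actual pipeline: it generates synthetic single-channel epochs (all parameters, including the 250 Hz sampling rate, are invented), estimates 8–13 Hz alpha power from an FFT, and then correlates the two sets of values in the spirit of the study's conflict-alpha versus symptom-alpha comparison.

```python
import numpy as np

def alpha_power(signal, fs):
    """Fraction of spectral power falling in the 8-13 Hz alpha band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 8) & (freqs <= 13)
    return psd[band].sum() / psd.sum()

# Hypothetical per-trial EEG epochs (one row per trial) for the two
# stimulus types; amplitudes are made up for illustration.
rng = np.random.default_rng(0)
fs = 250  # assumed sampling rate in Hz
t = np.arange(fs) / fs  # one-second epochs
conflict_epochs = np.array([np.sin(2 * np.pi * 10 * t) * a + rng.normal(0, 0.5, fs)
                            for a in np.linspace(0.5, 2.0, 11)])
symptom_epochs = np.array([np.sin(2 * np.pi * 10 * t) * a + rng.normal(0, 0.5, fs)
                           for a in np.linspace(0.6, 2.2, 11)])

conflict_alpha = [alpha_power(e, fs) for e in conflict_epochs]
symptom_alpha = [alpha_power(e, fs) for e in symptom_epochs]

# Pearson correlation between the two alpha measures.
r = np.corrcoef(conflict_alpha, symptom_alpha)[0, 1]
print(f"alpha-alpha correlation: r = {r:.2f}")
```

A real analysis would of course use artifact-rejected, multi-channel EEG and an established time-frequency method; the point here is only the shape of the computation: band power per condition, then a correlation across trials or subjects.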

"These results create a compelling case that unconscious conflicts cause or contribute to the anxiety symptoms the patient is experiencing," says Shevrin, who also holds an emeritus position in the Department of Psychology in U-M’s College of Literature, Science and the Arts. "These findings and the interdisciplinary methods used — which draw on psychoanalysis, cognitive psychology, and neuroscience — demonstrate that it is possible to develop an interdisciplinary science drawing upon psychoanalytic theory."

He notes that a prominent critic of psychoanalysis and Freudian theory, Adolf Grunbaum, Ph.D., professor of the philosophy of science at the University of Pittsburgh, has expressed satisfaction that the new results, when added to previous evidence, show that fundamental psychoanalytic concepts can indeed be tested in empirical ways.

For more than 40 years, Shevrin has led a team that has pushed at the boundaries between the disciplines of neuroscience, cognitive psychology, and psychoanalysis, looking for evidence that Freudian concepts such as the unconscious and repression could be documented through physical measures of brain activity. His work has explored the territory where neurobiology, thoughts, emotions and behavior meet.

In 1968 he published the first report of brain responses to unconscious visual stimuli in Science, thus providing strong objective evidence for the existence of the unconscious at a time when most scientists were skeptical of Freud’s ideas. In that same study, he showed that unconscious perceptions are processed in different ways from conscious perceptions, a finding consistent with Freud’s views on how the unconscious works.

In recent years, exchanges between Grunbaum and Shevrin explored the nature of the evidence for the existence and impact of unconscious conflicts. In a 1992 publication discussing the first study referred to above, Grunbaum agreed that Shevrin had obtained objective, brain-based evidence for the existence of unconscious conflict, but noted that he had not shown that these conflicts caused psychiatric symptoms. His response to being informed of the new findings was an email stating: “I am satisfied.”

Source: Science Daily

Jun 17, 2012 · 29 notes
#science #neuroscience #psychology #anxiety #unconscious #brain
Neuroscience: The mind reader

Adrian Owen has found a way to use brain scans to communicate with people previously written off as unreachable. Now, he is fighting to take his methods to the clinic.

image

Adrian Owen still gets animated when he talks about patient 23. The patient was only 24 years old when his life was devastated by a car accident. Alive but unresponsive, he had been languishing in what neurologists refer to as a vegetative state for five years when Owen, a neuroscientist then at the University of Cambridge, UK, and his colleagues at the University of Liège in Belgium put him into a functional magnetic resonance imaging (fMRI) machine and started asking him questions.

Incredibly, he provided answers. A change in blood flow to certain parts of the man’s injured brain convinced Owen that patient 23 was conscious and able to communicate. It was the first time that anyone had exchanged information with someone in a vegetative state.

Patients in these states have emerged from a coma and seem awake. Some parts of their brains function, and they may be able to grind their teeth, grimace or make random eye movements. They also have sleep–wake cycles. But they show no awareness of their surroundings, and doctors have assumed that the parts of the brain needed for cognition, perception, memory and intention are fundamentally damaged. They are usually written off as lost.


Jun 16, 2012 · 45 notes
#science #neuroscience #brain #psychology
More to Facial Perception Than Meets the Eye

ScienceDaily (June 15, 2012) — People make complex judgements about a person from looking at their face that are based on a range of factors beyond simply their race and gender, according to findings of new research funded by the Economic and Social Research Council (ESRC).

The findings question a long-held belief that people immediately put a person they meet into a limited number of social categories such as: female or male; Asian, Black, Latino or White; and young or old.

Dr Kimberly Quinn at the University of Birmingham found that people ‘see’ faces in a multitude of ways. This could have wider importance in understanding stereotyping and discrimination because it has implications for whether and how people categorise others.

Categorisation is not done purely on the physical features of the face in front of us, but depends on other information as well, including whether the person is already known and whether the person is believed to share other important identities with us.

"How we perceive faces is not just a reflection of what’s in those faces," Dr Quinn said. "We are not objective; we bring our current goals and past knowledge to every new encounter. And this happens really quickly — within a couple of hundred milliseconds of seeing the face."

Dr Quinn and her colleagues explored social categories such as sex, race and age; physical attributes such as attractiveness; personality traits such as trustworthiness; and emotional states such as anger, sadness and happiness.

She found that although social categories are used to gather information on faces, these can be easily undermined. This research found that we reject simple stereotypes when something about the situation alerts us to the fact the stereotype does not tell the whole story. If we take, for example, a racial group and the corresponding stereotype of members of that group as unintelligent, seeing a person in that group playing an intellectual game such as chess would tell us to cancel out the stereotype.

In order to investigate the causes, mechanisms, and results of social categorisation, Dr Quinn used techniques from cognitive psychology and neuroscience to investigate how people process faces. The research was designed to provide insight into when and why people categorise others according to social group membership.

Their findings differ from previous research that adopted a ‘dual process’ approach and assumed people initially categorised faces based on factors such as gender, race or age before determining whether to stereotype them or to see them as unique individuals.

Dr Quinn’s findings were more consistent with a single process that initially focuses on ‘coarse’ information that is easy to detect, and then immediately starts to include more fine-grained processing as time elapses. This model allows for either categorisation or more individuated processing to emerge, and does not assume that categorisation always comes before recognising unique identities — thereby allowing for more diverse outcomes than previously thought.

Further information: http://www.esrc.ac.uk/my-esrc/grants/RES-061-23-0130/read

Source: Science Daily

Jun 16, 2012 · 19 notes
#science #neuroscience #brain #psychology
Genetic Markers Hope for New Brain Tumor Treatments

ScienceDaily (June 15, 2012) — Researchers at The University of Nottingham have identified three sets of genetic markers that could potentially pave the way for new diagnostic tools for a deadly type of brain tumour that mainly targets children.

The study, published in the latest edition of the journal Lancet Oncology, was led by Professor Richard Grundy at the University’s Children’s Brain Tumour Research Centre and Dr Suzanne Miller, a post doctoral research fellow in the Centre.

It focuses on a rare and aggressive cancer: central nervous system primitive neuro-ectodermal brain tumours (CNS PNETs). Patients with CNS PNET have a very poor prognosis, and current treatments, including high-dose chemotherapy and cranio-spinal radiotherapy, are relatively unsuccessful and have severe lifelong side-effects. This is particularly the case in very young children.

Despite the need for new and more effective treatments, little research has been done to examine the underlying causes of CNS PNET, partly due to their rarity. The Nottingham study aimed to identify molecular markers as a first step to improving the treatments and therapies available to fight the cancer.

The Nottingham team collaborated with researchers at the Hospital for Sick Kids in Toronto, Canada, on an international study collecting 142 CNS PNET samples from 20 institutions in nine countries.

Professor Richard Grundy said: “Following our earlier research we realised that an international effort was needed to bring sufficient numbers of cases together to make the breakthrough we needed to better understand this disease or indeed diseases identified in our study. The next step is to translate this knowledge into improving treatments.”

By studying the genetics of the tumours, they discovered that, rather than being a single cancer, the tumours fall into three sub-types featuring distinct genetic abnormalities and leading to different outcomes for patients.

They found that each group had its own genetic signature through subtle differences in the way they expressed two genetic markers, LIN28 and OLIG2.

When compared with clinical factors including age, survival and metastases (the spread of the tumours through the body), they discovered that group 1 tumours (primitive neural) were found most often in the youngest patients and had the poorest survival rates. Patients with group 3 tumours had the highest incidence of metastases at diagnosis.

Ultimately, the research has identified the two genetic markers LIN28 and OLIG2 as a promising basis for more effective tools for diagnosing and predicting outcomes for young patients with these types of brain tumours.

The research was funded by the Canadian Institute of Health Research, the Brainchild/Sick Kids Foundation and the Samantha Dickson Brain Tumour Trust.

Chief Executive of Samantha Dickson Brain Tumour Trust, Sarah Lindsell, said: “As the UK’s leading brain tumour charity, and the largest dedicated funder of brain tumour research, we are delighted that our investment has led to such significant success. It is great to see that understanding of these tumours is improving — this is desperately needed given the poor outcomes for children with this tumour. Samantha Dickson Brain Tumour Trust is proud to have been instrumental in this work.”

Source: Science Daily

Jun 16, 2012 · 1 note
#science #neuroscience #brain #psychology
Vitamin D With Calcium Shown to Reduce Mortality in Elderly

ScienceDaily (June 15, 2012) — A study recently published in the Endocrine Society’s Journal of Clinical Endocrinology and Metabolism (JCEM) suggests that vitamin D — when taken with calcium — can reduce the rate of mortality in seniors, therefore providing a possible means of increasing life expectancy.

During the last decade, there has been increasing recognition of the potential health effects of vitamin D. It is well known that calcium with vitamin D supplementation reduces the risk of fractures. The present study assessed mortality among patients randomized to either vitamin D alone or vitamin D with calcium. It found that the reduced mortality was not due to a lower number of fractures, but represents a beneficial effect beyond the reduced fracture risk.

"This is the largest study ever performed on effects of calcium and vitamin D on mortality," said Lars Rejnmark, PhD, of Aarhus University Hospital in Denmark and lead author of the study. "Our results showed reduced mortality in elderly patients using vitamin D supplements in combination with calcium, but these results were not found in patients on vitamin D alone."

In this study, researchers used pooled data from eight randomized controlled trials with more than 1,000 participants each. The patient data set was composed of nearly 90 percent women, with a median age of 70 years. During the three-year study, mortality was reduced by 9 percent in those treated with vitamin D plus calcium.

"Some studies have suggested calcium (with or without vitamin D) supplements can have adverse effects on cardiovascular health," said Rejnmark. "Although our study does not rule out such effects, we found that calcium with vitamin D supplementation to elderly participants is overall not harmful to survival, and may have beneficial effects on general health."

Source: Science Daily

Jun 16, 2012 · 7 notes
#science #neuroscience #psychology
BPA Exposure Effects May Last for Generations

ScienceDaily (June 15, 2012) — Exposure to low doses of Bisphenol A (BPA) during gestation had immediate and long-lasting, trans-generational effects on the brain and social behaviors in mice, according to a recent study accepted for publication in the journal Endocrinology, a publication of The Endocrine Society.

BPA is a human-made chemical present in a variety of products including food containers, receipt paper and dental sealants and is now widely detected in human urine and blood. Public health concerns have been fueled by findings that BPA exposure can influence brain development. In mice, prenatal exposure to BPA is associated with increased anxiety, aggression and cognitive impairments.

"We have demonstrated for the first time to our knowledge that BPA has trans-generational actions on social behavior and neural expression," said Emilie Rissman, PhD, of the University of Virginia School of Medicine and lead author of the study. "Since exposure to BPA changes social interactions in mice at a dose within the reported human levels, it is possible that this compound has trans-generational actions on human behavior. If we banned BPA tomorrow, pulled all products with BPA in them, and cleaned up all landfills tomorrow it is possible, if the mice data generalize to humans, that we will still have effects of this compound for many generations."

In this study, female mice received chow with or without BPA before mating and throughout gestation. Plasma levels of BPA in supplemented female mice were in a range similar to those measured in humans. Juveniles in the first generation exposed to BPA in utero displayed fewer social interactions as compared with control mice. The changes in genes were most dramatic in the first generation (the offspring of the mice that were exposed to BPA in utero), but some of these gene changes persisted into the fourth generation.

"BPA is a ubiquitous chemical, it is in the air, water, our food, and our bodies," said Rissman. "It is a man-made chemical, and is not naturally occurring in any plant or animal. The fact that it can change gene expression in mice, and that these changes are heritable, is cause for us to be concerned about what this may mean for human health."

Source: Science Daily

Jun 16, 2012 · 14 notes
#science #neuroscience
Musical brain patterns could help predict epileptic seizures

June 15, 2012

The research led by Newcastle University’s Dr Mark Cunningham and Professor Miles Whittington and supported by the Dr Hadwen Trust for Humane Research, indicates a novel electrical bio-marker in humans.

The brain produces electrical rhythms, and using EEG (electrodes on the scalp) the researchers were able to monitor brain patterns in patients with epilepsy. Both in patients and in brain tissue samples, the team observed an abnormal brain wave, noticeable for its rapidly increasing frequency over time.

Comparing this pattern to a musical ‘glissando’, an upward glide from one pitch to another, the team found that the brain rhythm is unique to humans, and they believe it could be related to epilepsy.

Dr Cunningham, senior lecturer in Neuronal Dynamics at Newcastle University said: “We were able to examine EEG collected from patients with drug resistant epilepsy who were continually monitored over a two week period. During that time we noticed patterns of electrical activity with rapidly increasing frequency, just like glissandi, emerging in the lead-up to an epileptic seizure.”

"We are in the early days of the work and we want to investigate this in a larger group of patients but it may offer a promising insight into when a seizure is going to start."
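A bio-marker of this kind, a rhythm whose dominant frequency climbs steadily in the lead-up to a seizure, lends itself to a simple computational sketch. The code below is hypothetical and is not the Newcastle team's published method: it tracks the peak frequency of a synthetic chirp signal across short-time FFT windows and flags an upward trend in that track. Window length, hop size and the slope threshold are all assumed values.

```python
import numpy as np

def dominant_freq_track(signal, fs, win=256, hop=128):
    """Peak frequency of each short-time window (a crude spectrogram ridge)."""
    peaks = []
    for start in range(0, len(signal) - win + 1, hop):
        seg = signal[start:start + win] * np.hanning(win)
        freqs = np.fft.rfftfreq(win, d=1.0 / fs)
        psd = np.abs(np.fft.rfft(seg)) ** 2
        psd[0] = 0.0  # ignore the DC component
        peaks.append(freqs[np.argmax(psd)])
    return np.array(peaks)

def looks_like_glissando(track, min_slope=1.0):
    """Flag a steady upward drift in dominant frequency (Hz per window)."""
    slope = np.polyfit(np.arange(len(track)), track, 1)[0]
    return slope >= min_slope

# Synthetic chirp: frequency sweeps from 5 Hz to 40 Hz over 4 s,
# mimicking a glissando-like discharge.
fs = 256
t = np.arange(0, 4, 1.0 / fs)
chirp = np.sin(2 * np.pi * (5 * t + (35 / (2 * 4)) * t ** 2))
track = dominant_freq_track(chirp, fs)
print(looks_like_glissando(track))  # a rising frequency track should be flagged
```

Real seizure-prediction work would need validated thresholds and robustness to artifacts; the sketch only illustrates why a monotonically rising spectral peak is an easy signature to detect automatically.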

Professor Whittington added: “Classical composers such as Gustav Mahler are famous for using notes of rapidly increasing pitch – called glissando - to convey intense expressions of anticipation. Similarly we identified glissando-like patterns of brain electrical activity generated in anticipation of seizures in patients with epilepsy.”

The team recorded electrical activity taken from patients in Newcastle and Glasgow with the help of collaborators Dr Roderick Duncan and Dr Aline Russell and worked in collaboration with the Epilepsy Surgery Group at Newcastle General Hospital part of the Newcastle Hospitals NHS Foundation Trust.

Having received permission from patients to use brain tissue removed during an operation to cure their seizures, the team were able to observe and study in great detail glissando discharges in slices of this human epileptic tissue maintained in the lab.

Publishing in Epilepsia online, the team discovered that glissandi are highly indicative of pathology associated with human epilepsy and, unlike other forms of epileptic activity studied previously, are extremely difficult to reproduce in normal, non-epileptic brain tissue. The team worked with Professor Roger Traub at the IBM Watson Research Centre in New York to provide predictions using highly detailed computational models.

By manipulating the chemical conditions surrounding human epileptic brain tissue according to these predictions, they discovered that glissandi did not require any of the conventional chemical connections between nerve cells thought to underlie most brain functions. Instead, glissandi were generated by a combination of large changes in the pH of the tissue, specific electrical properties of certain types of nerve cell and, most importantly, direct electrical connections between these nerve cells.

"This work also suggests that, given the lengths one has to go to reproduce this experimentally in rodents, glissandi may be a unique feature of the human epileptic brain," explains Dr Cunningham.

Dr Kailah Eglington, Chief Executive of the Dr Hadwen Trust, said: “Of all human brain disorders, epilepsy research ranks as one that currently employs substantial numbers of laboratory animals worldwide.

"Dr Cunningham’s work at Newcastle University aims to address the shortcomings of existing animal-based research by removing animals from the equation and addressing the issue directly in humans."

Provided by Newcastle University

Source: medicalxpress.com

Jun 16, 2012 · 12 notes
#science #neuroscience #brain #psychology #seizures
Active ingredient of cannabis has no effect on the progression of multiple sclerosis

June 15, 2012

The first large non-commercial study to investigate whether the main active constituent of cannabis (tetrahydrocannabinol, or THC) is effective in slowing the course of progressive multiple sclerosis (MS) shows no evidence that it is, although benefits were noted for those at the lower end of the disability scale.

The CUPID (Cannabinoid Use in Progressive Inflammatory brain Disease) study was carried out by researchers from the Peninsula College of Medicine and Dentistry (PCMD), Plymouth University. The study was funded by the Medical Research Council (MRC) and managed by the National Institute for Health Research (NIHR) on behalf of the MRC-NIHR partnership, the Multiple Sclerosis Society and the Multiple Sclerosis Trust.

The preliminary results of CUPID are to be presented by lead researcher Professor John Zajicek at the Association of British Neurologists’ Annual Meeting in Brighton on Tuesday 29th May.

CUPID enrolled nearly 500 people with MS from 27 centres around the UK, and has taken eight years to complete. People with progressive MS were randomised to receive either THC capsules or identical placebo capsules for three years, and were carefully followed to see how their MS changed over this period. The two main outcomes of the trial were a disability scale administered by neurologists (the Expanded Disability Status Scale), and a patient report scale of the impact of MS on people with the condition (the Multiple Sclerosis Impact Scale 29).

Overall, the study found no evidence to support an effect of THC on MS progression in either of the main outcomes. However, there was some evidence of a beneficial effect in participants who were at the lower end of the disability scale at enrolment; as this benefit was found only in a small group rather than in the whole population, further studies will be needed to assess the robustness of the finding. The trial also found that MS in the study population as a whole progressed more slowly than expected, which makes it harder to detect a treatment effect when the aim of the treatment is to slow progression.

As well as evaluating the potential neuroprotective effects and safety of THC over the long-term, one of the aims of the CUPID study was to improve the way that clinical trial research is done by exploring newer methods of measuring MS and using the latest statistical methods to make the most of every piece of information collected. This analysis will continue for several months. The CUPID study will therefore provide important information about conducting further large scale clinical trials in MS.

Professor John Zajicek, Professor of Clinical Neuroscience at PCMD, Plymouth University, said: “To put this study into context: current treatments for MS are limited, either being targeted at the immune system in the early stages of the disease or aimed at easing specific symptoms such as muscle spasms, fatigue or bladder problems. At present there is no treatment available to slow MS when it becomes progressive. Progression of MS is thought to be due to death of nerve cells, and researchers around the world are desperately searching for treatments that may be ‘neuroprotective’. Laboratory experiments have suggested that certain cannabis derivatives may be neuroprotective.”

He added: “Overall our research has not supported laboratory based findings and shown that, although there is a suggestion of benefit to those at the lower end of the disability scale when they joined CUPID, there is little evidence to suggest that THC has a long term impact on the slowing of progressive MS.”

Dr Doug Brown, Head of Biomedical Research at the MS Society, said: “There are currently no treatments for people with progressive MS to slow or stop the worsening of disability. The MS Society is committed to supporting research in this area and this was an important study for us to fund. While this study sadly suggests THC is ineffective at slowing the course of progressive MS, we will not stop our search for effective treatments. We are encouraged by the possibility shown by this study that THC may have potential benefits for some people with MS and we welcome further investigation in this area.”

Provided by The Peninsula College of Medicine and Dentistry

Source: medicalxpress.com

Jun 16, 2012 · 13 notes
#science #neuroscience #psychology #MS #brain
The risk of carrying a cup of coffee

June 15, 2012 By Angela Herring

Object manipulation or tool use is almost a uniquely human trait, said Dagmar Sternad, director of Northeastern’s Action Lab, a research group interested in movement coordination. “Not only does it require certain cognitive abilities but also distinct motor abilities.”

image

Professor Dagmar Sternad and postdoctoral researcher C.J. Hasson show that we subconsciously adjust our “safety margin” when we move a dynamic object like a cup of coffee based on the amount of variability in the situation. Credit: John Guillemin

Simply moving one’s own body, for instance by directing a hand toward a coffee cup, requires the organization of various physiological systems, including the central and peripheral nervous systems and the musculoskeletal system.

Once the hand grasps and picks up the cup, the questions become even more complicated. What if the cup is filled with liquid? At this point, the complexity of the control problem balloons: the presence of the liquid introduces nonlinear fluid dynamics, with the risk of a spill because of the inherent variability in one’s movement.

Sternad, a professor of biology, electrical and computer engineering, and physics, and postdoctoral researcher C.J. Hasson are interested in how we adapt our movement strategies when interacting with dynamic objects in the environment.

In a recent paper published in the Journal of Neurophysiology, Hasson and Sternad explored the question by looking at the everyday task of manipulating a cup of coffee. They show that how we adapt our movement strategies is directly related to the amount of variability and reliability in our surroundings and ourselves.

“Because we’re humans and not machines, we’re noisy and variable,” said Hasson. “We can’t expect that a movement will unfold exactly as we planned it.”

For the study, 18 healthy participants visited the Action Lab to play a video game in which they attempted to move a virtual cup filled with virtual liquid across a large video screen. Instead of a normal video-game controller, subjects moved the virtual cup by grasping a manipulandum, a large robotic arm. As in the real-life scenario, the robot simulated the forces one would feel from the weight of the object and the sloshing of the liquid in the cup.

They asked participants to move the cup across the screen within a comfortable time of two seconds, a task for which there is an infinite number of possible strategies: you could move fast for one second and slow for one second, or slow for half a second and then fast for one and a half seconds. The team hypothesized that participants would naturally adopt a safe movement strategy with practice — and they did.

But the most intriguing result, said Hasson, was that the size of each participant’s safety margin, or how close they let the liquid get to the edge of the cup, could be predicted by how variable they were in their movements. Those with more variability tended to adopt a “safer” strategy with a larger safety margin.

“If you have a large safety margin and I move with a small margin, the question is, ‘Why am I more risky than you?’” Hasson said. “Well, you may find that I am much more consistent in my movements, so I don’t need a big safety margin. If you’re more variable, you need a larger safety margin.”
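This relationship, that a participant's movement variability predicts the safety margin they adopt, can be illustrated with a toy calculation. Every number below is invented for illustration; only the 18-participant count comes from the article, and the linear relationship is assumed purely to demonstrate what "predicted by variability" means statistically.

```python
import numpy as np

# Hypothetical per-participant data: endpoint variability of repeated
# movements (SD, arbitrary units) and the safety margin each participant
# adopted (distance of the liquid from the cup rim, arbitrary units).
rng = np.random.default_rng(1)
variability = rng.uniform(0.2, 1.5, size=18)  # 18 participants, as in the study
# Assume margin scales with variability, plus a little measurement noise.
safety_margin = 1.5 * variability + rng.normal(0, 0.1, 18)

# How well does variability predict the chosen margin?
r = np.corrcoef(variability, safety_margin)[0, 1]
print(f"variability vs. safety margin: r = {r:.2f}")
```

With data of this shape, the correlation coefficient quantifies the article's claim: more variable movers sit further from the spill threshold.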

The results have implications for assessing elderly patients and patients with motor disorders such as cerebral palsy. “If variability determines the movements that you do, maybe that’s an intervention point,” said Sternad.

Provided by Northeastern University

Source: medicalxpress.com

Jun 16, 2012 · 9 notes
#science #neuroscience #brain #psychology
Improved repair of damage to the peripheral nervous system

June 15, 2012

Researchers from the Peninsula College of Medicine and Dentistry, University of Exeter, in collaboration with colleagues from Rutgers University, Newark and University College London, have furthered understanding of the mechanism by which the cells that insulate the nerve cells in the peripheral nervous system, Schwann cells, protect and repair damage caused by trauma and disease.

The findings of the study, published on-line by the Journal of Neuroscience and supported by the Wellcome Trust, are exciting in that they point to future therapies for the repair and improvement of damage to the peripheral nervous system.

The peripheral nervous system is the part of the nervous system outside the brain and the spinal cord. It regulates almost every aspect of our bodily function, carrying sensory information that allows us to feel the sun on our face and motor information that allows us to move. It also controls the functions of all the organs of the body.

Damage can occur through trauma; it also occurs in diabetic neuropathy (suffered by almost half of those with diabetes) and in patients with common inherited conditions such as Charcot-Marie-Tooth (CMT) disease. There can be a wide range of symptoms, from loss of sensation in the hands and feet to problems with digestion, blood pressure regulation, sexual function and bladder control.

Schwann cells provide the insulation, or myelin sheath, for the nerve cells that carry electrical impulses to and from the spinal cord. Because of their plasticity, Schwann cells are able to revert to an immature ‘repair’ state and mend damage to the peripheral nervous system. The level of repair is remarkably good, but incomplete repair, perhaps after the severance of a nerve, may lead to long-term loss of function and pain.

The ability of Schwann cells to demyelinate can make them susceptible to the disease process seen in conditions such as CMT. CMT affects one in 2,500 people, making it a comparatively common inherited disease of the nervous system. Mutations in any of the many genes implicated in CMT can cause repeated cycles of demyelination and re-insulation (remyelination) that lead to long-term damage and the death of both Schwann cells and nerve cells. There is currently no therapy for CMT, and patients experience increasing sensory and motor problems which may lead to permanent disability.

The research team believes that its work to understand the ability of Schwann cells to revert to an immature state and stimulate repair will lead to therapies that improve recovery from severe trauma and break the cycle of damage caused by CMT. They also believe there may be potential to improve repair in cases of diabetic neuropathy.

They have identified a DNA binding protein, cJun, as a key player in the plasticity that allows a Schwann cell to revert to the active repair state. cJun may be activated by a number of pathways that convey signals from the surface of the Schwann cell to the nucleus. One such pathway, the p38 mitogen-activated protein kinase (MAPK) pathway, appears to play a vital role: it is activated after PNS damage and may promote the process of repair; conversely, it may be abnormally activated in demyelinating diseases such as CMT.

Professor David Parkinson, Associate Professor in Neuroscience, Peninsula College of Medicine and Dentistry, University of Exeter, said: “The findings of our research are exciting because we have pinpointed and are understanding the mechanism by which our bodies can repair damage to the peripheral nervous system. With further investigation, this could well lead to therapies to repair nerve damage from trauma and mitigate the damage which relates to common illnesses, such as CMT.”

Provided by The Peninsula College of Medicine and Dentistry

Source: medicalxpress.com

Jun 16, 2012
#science #neuroscience #psychology
Control of brain waves from the brain surface

June 15, 2012

Whether or not a neuron transmits an electrical impulse is a function of many factors. European research is using a mixture of molecular, microscopy and electrophysiological techniques to identify the input necessary for nerve transmission in the cortex.

image

Credit: Thinkstock

In the central nervous system (CNS), a nerve cell or neuron has a ‘forest’ of elaborate dendritic trees arising from the cell body. These dendrites receive many thousands of synapses (junctions that allow transmission of a signal) at positions around the tree. These inputs can then generate an impulse, or ‘spike’, known as an action potential, at the initial part of the axon.

Previous research has confirmed that an activated synapse will generate an electric signal as a result of neurotransmitters released from pre-synaptic axons. Electrical recordings from the neocortex have confirmed that, in line with the prediction of cable theory, modulation of the potential at a dendrite depends strongly on its distance from the cell body, or soma.
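The distance dependence that cable theory predicts can be sketched with the steady-state solution for a passive cable, V(x) = V0·exp(−x/λ), where λ is the membrane length constant. The λ of 500 µm used below is an illustrative assumption, not a value from the project.

```python
import math

def steady_state_attenuation(distance_um, length_constant_um=500.0):
    """Steady-state voltage attenuation along an infinite passive cable:
    V(x) / V0 = exp(-x / lambda), the core prediction of cable theory."""
    return math.exp(-distance_um / length_constant_um)

# A synaptic potential measured at increasing distances from the synapse
for x in (0, 250, 500, 1000):
    print(f"{x:>5} um: {steady_state_attenuation(x):.2f} of original amplitude")
# →     0 um: 1.00 / 250 um: 0.61 / 500 um: 0.37 / 1000 um: 0.14
```

A signal arriving two length constants away has decayed to under 15% of its original amplitude, which is why inputs at distal tuft dendrites need active mechanisms, such as dendritic spikes, to influence the soma.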

The ‘Information processing in distal dendrites of neocortical layer 5 pyramidal neurons’ (Channelrhodopsin) project aimed to shed more light on how more distal sites in the ‘tree’ influence the action potential of the post-synaptic neuron. Furthermore, they investigated exactly how dendritic spikes can be generated, another issue about which there is little information so far.

Recent research has highlighted the importance of activation of N-methyl-D-aspartate (NMDA) receptors in producing a signal that proceeds to the soma and results in a spike. There is also indirect evidence that interneurons targeting dendrites can control the level of dendritic excitability.

Channelrhodopsin scientists made simultaneous pre- and post-synaptic electrical recordings from identified interneurons and pyramidal cells, a special type of neuron that serves as the primary excitatory unit of the mammalian cortex.

The project team first characterised the different types of inhibitory neuron that target the apical tuft dendrites of layer 5, deep in the cortex. The researchers then showed that a special type of inhibitory interneuron in the outer layer of the neocortex can suppress dendritic spiking in layer 5.

Project results show that a superficial inhibitory neuron can influence information processing in a specific pyramidal neuron. The research has important implications for neuroscience and will help unravel the integrative operations of CNS neurons.

Provided by CORDIS

Source: medicalxpress.com

Jun 16, 2012
#science #neuroscience #brain #psychology #neuron #brainwave