Neuroscience


'Brain pacemaker' effective for years against Parkinson's disease

June 20, 2012

A “brain pacemaker” called deep brain stimulation (DBS) remains an effective treatment for Parkinson’s disease for at least three years, according to a study in the June 2012 online issue of Neurology, the medical journal of the American Academy of Neurology.

But while improvements in motor function remained stable, there were gradual declines in health-related quality of life and cognitive abilities.

First author of the study is Frances M. Weaver, PhD, who has joint appointments at Edward Hines Jr. VA Hospital and Loyola University Chicago Stritch School of Medicine.

Weaver was one of the lead investigators of a 2010 paper in the New England Journal of Medicine that found that motor functions remained stable for two years in DBS patients. The new analysis extended the follow-up period to 36 months.

DBS is a treatment for Parkinson’s patients who no longer benefit from medication, or who experience unacceptable side effects. DBS is not a cure, and it does not stop the disease from progressing. But in the right patients, DBS can significantly improve symptoms, especially tremors. DBS also can relieve muscle rigidity that causes decreased range of motion.

In the DBS procedure, a neurosurgeon drills a dime-size hole in the skull and inserts an electrode about 4 inches into the brain. A connecting wire from the electrode runs under the skin to a battery implanted near the collarbone. The electrode delivers mild electrical signals that effectively reorganize the brain’s electrical impulses. The procedure can be done on one or both sides of the brain.

Researchers evaluated 89 patients who were stimulated in a part of the brain called the globus pallidus interna and 70 patients who were stimulated in a different part of the brain called the subthalamic nucleus. (Patients received DBS surgery at seven VA and six affiliated university medical centers.) Patients were assessed at baseline (before DBS surgery) and at 3, 6, 12, 18, 24 and 36 months. Patients were rated on a Parkinson’s disease scale covering motor functions such as speech, facial expression, tremors, rigidity, finger taps, hand movements, posture, gait and bradykinesia (slow movement). The lower the rating, the better the function.

Improvements in motor function were similar in both groups of patients, and stable over time. Among patients stimulated in the globus pallidus interna, the score improved from 41.1 at baseline to 27.1 at 36 months. Among patients stimulated in the subthalamic nucleus, the score improved from 42.5 at baseline to 29.7 at 36 months.
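The reported score changes can be expressed as percentage improvements (a lower score means better function, so a drop is an improvement). A quick check of the figures above:

```python
# Percentage improvement in motor scores for the two DBS target groups,
# using the baseline and 36-month values reported in the study.
def percent_improvement(baseline, followup):
    """Percent drop from baseline (lower scores are better)."""
    return 100.0 * (baseline - followup) / baseline

gpi = percent_improvement(41.1, 27.1)   # globus pallidus interna group
stn = percent_improvement(42.5, 29.7)   # subthalamic nucleus group
print(round(gpi, 1), round(stn, 1))     # roughly 34.1 and 30.1 percent
```

Both targets thus yielded improvements of around a third of the baseline score, consistent with the study's conclusion that the two groups fared similarly.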

By contrast, some early gains in quality of life and the ability to perform activities of daily living were gradually lost, and there was a decline in neurocognitive function. This likely reflects the progression of the disease and the emergence of symptoms that are resistant to DBS and medications.

Researchers concluded that both the globus pallidus interna and the subthalamic nucleus areas of the brain “are viable DBS targets for treatment of motor symptoms, but highlight the importance of nonmotor symptoms as determinants of quality of life in people with Parkinson’s disease.”

Source: medicalxpress.com

Jun 21, 2012 · 11 notes
#science #neuroscience #brain #psychology #parkinson
Proposed drug may reverse Huntington's disease symptoms

June 20, 2012

With a single drug treatment, researchers at the Ludwig Institute for Cancer Research at the University of California, San Diego School of Medicine can silence the mutated gene responsible for Huntington’s disease, slowing and partially reversing progression of the fatal neurodegenerative disorder in animal models.

This image shows stained mouse neurons. Credit: Image courtesy of Taylor Bayouth

The findings are published in the June 21, 2012 online issue of the journal Neuron.

Researchers suggest the drug therapy, tested in mouse and non-human primate models, could produce sustained motor and neurological benefits in human adults with moderate and severe forms of the disorder. Currently, there is no effective treatment.

Huntington’s disease afflicts approximately 30,000 Americans, whose symptoms include uncontrolled movements and progressive cognitive and psychiatric problems. The disease is caused by the mutation of a single gene, which results in the production and accumulation of toxic proteins throughout the brain.

Don W. Cleveland, PhD, professor and chair of the UC San Diego Department of Cellular and Molecular Medicine and head of the Laboratory of Cell Biology at the Ludwig Institute for Cancer Research, and colleagues infused mouse and primate models of Huntington’s disease with one-time injections of an identified DNA drug based on antisense oligonucleotides (ASOs). These ASOs selectively bind to and destroy the mutant gene’s molecular instructions for making the toxic huntingtin protein.

The single treatment produced rapid results. Treated animals began moving better within one month and achieved normal motor function within two. More remarkably, the benefits persisted for nine months, well after the drug had disappeared and production of the toxic proteins had resumed.

"For diseases like Huntington’s, where a mutant protein product is tolerated for decades prior to disease onset, these findings open up the provocative possibility that transient treatment can lead to a prolonged benefit to patients,” said Cleveland. “This finding raises the prospect of a ‘huntingtin holiday,’ which may allow for clearance of disease-causing species that might take weeks or months to re-form. If so, then a single application of a drug to reduce expression of a target gene could ‘reset the disease clock,’ providing a benefit long after huntingtin suppression has ended.”

Beyond improving motor and cognitive function, researchers said the ASO treatment also blocked brain atrophy and increased lifespan in mouse models with a severe form of the disease. The therapy was equally effective whether one or both huntingtin genes were mutated, a positive indicator for human therapy.

Cleveland noted that the approach was particularly promising because antisense therapies have already been proven safe in clinical trials and are the focus of much drug development. Moreover, the findings may have broader implications, he said, for other “age-dependent neurodegenerative diseases that develop from exposure to a mutant protein product” and perhaps for nervous system cancers, such as glioblastomas.

Provided by University of California - San Diego

Source: medicalxpress.com

Jun 21, 2012 · 31 notes
#science #neuroscience #brain #psychology #huntington
Study shows role of cellular protein in regulation of binge eating

June 20, 2012

Researchers from Boston University School of Medicine (BUSM) have demonstrated in experimental models that blocking the Sigma-1 receptor, a cellular protein, reduced binge eating and caused binge eaters to eat more slowly. The research, which is published online in Neuropsychopharmacology, was led by Pietro Cottone, PhD, and Valentina Sabino, PhD, both assistant professors in the pharmacology and psychiatry departments at BUSM.

Binge eating disorder, which affects approximately 15 million Americans, is believed to be the eating disorder that most closely resembles substance dependence. In binge eating subjects, normal regulatory mechanisms that control hunger do not function properly. Binge eaters typically gorge on “junk” foods excessively and compulsively despite knowing the adverse consequences, which are physical, emotional and social in nature. In addition, binge eaters typically experience distress and withdrawal when they abstain from junk food.

The researchers developed an experimental model of compulsive binge eating by providing a sugary, chocolate diet only for one hour a day while the control group was given a standard laboratory diet. Within two weeks, the group exposed to the sugary diet exhibited binge eating behavior and ate four times as much as the controls. In addition, the experimental binge eaters exhibited compulsive behavior by putting themselves in a potentially risky situation in order to get to the sugary food while the control group avoided the risk.

The researchers then tested whether a drug that blocks the Sigma-1 receptor could reduce binge eating of the sugary diet. The experimental data showed the drug successfully reduced binge eating by 40 percent, caused the binge eaters to eat more slowly and blocked the risky behavior.

The abnormal, risky behavior exhibited by the binge eating experimental group suggested to the researchers that there could be something wrong with how decisions were made. Because evaluation of risks and decision making are functions executed in the prefrontal cortical regions of the brain, the researchers tested whether the abundance of Sigma-1 receptors in those regions was abnormal in the binge eaters. They found that Sigma-1 receptor expression was unusually high in those areas, which could explain why blocking its function could decrease both compulsive binge eating and risky behavior.

"These findings suggest that the Sigma-1 receptor may contribute to the neurobiological adaptations that cause compulsive-like eating, opening up a new potential therapeutic treatment target for binge eating disorder,” said Cottone, who also co-directs the Laboratory of Addictive Disorders at BUSM with Sabino.

Provided by Boston University Medical Center

Source: medicalxpress.com

Jun 21, 2012 · 16 notes
#neuroscience #psychology #science
Scientists Identify Protein Required to Regrow Injured Nerves in Limbs

ScienceDaily (June 20, 2012) — A protein required to regrow injured peripheral nerves has been identified by researchers at Washington University School of Medicine in St. Louis.

These are images of axon regeneration in mice two weeks after injury to the hind leg’s sciatic nerve. On the left, axons (green) of a normal mouse have regrown to their targets (red) in the muscle. On the right, a mouse lacking DLK shows no axons have regenerated, even after two weeks. (Credit: Jung Eun Shin)

The finding, in mice, has implications for improving recovery after nerve injury in the extremities. It also opens new avenues of investigation toward triggering nerve regeneration in the central nervous system, notorious for its inability to heal.

Peripheral nerves provide the sense of touch and drive the muscles that move arms and legs, hands and feet. Unlike nerves of the central nervous system, peripheral nerves can regenerate after they are cut or crushed. But the mechanisms behind the regeneration are not well understood.

In the new study, published online June 20 in Neuron, the scientists show that a protein called dual leucine zipper kinase (DLK) regulates signals that tell the nerve cell it has been injured — often communicating over distances of several feet. The protein governs whether the neuron turns on its regeneration program.

"DLK is a key molecule linking an injury to the nerve’s response to that injury, allowing the nerve to regenerate," says Aaron DiAntonio, MD, PhD, professor of developmental biology. "How does an injured nerve know that it is injured? How does it take that information and turn on a regenerative program and regrow connections? And why does only the peripheral nervous system respond this way, while the central nervous system does not? We think DLK is part of the answer."

The nerve cell body containing the nucleus or “brain” of a peripheral nerve resides in the spinal cord. During early development, these nerves send long, thin, branching wires, called axons, out to the tips of the fingers and toes. Once the axons reach their targets (a muscle, for example), they stop extending and remain mostly unchanged for the life of the organism. Unless they’re damaged.

If an axon is severed somewhere between the cell body in the spinal cord and the muscle, the piece of axon that is no longer connected to the cell body begins to disintegrate. Earlier work showed that DLK helps regulate this axonal degeneration. And in worms and flies, DLK also is known to govern the formation of an axon’s growth cone, the structure responsible for extending the tip of a growing axon whether after injury or during development.

The formation of the growth cone is an important part of the early, local response of a nerve to injury. But a later response, traveling over greater distances, proves vital for relaying the signals that activate genes promoting regeneration. This late response can happen hours or even days after injury.

But in mice, unlike worms and flies, DiAntonio and his colleagues found that DLK is not involved in an axon’s early response to injury. Even without DLK, the growth cone forms. But a lack of DLK means the nerve cell body, nestled in the spinal cord far from the injury, doesn’t get the message that it’s injured. Without the signals relaying the injury message, the cell body doesn’t turn on its regeneration program and the growth cone’s progress in extending the axon stalls.

In addition, it was shown many years ago that axons regrow faster after a second injury than axons injured only once. In other words, injury itself increases an axon’s ability to regenerate. Furthering this work, first author Jung Eun Shin, graduate research assistant, and her colleagues found that DLK is required to promote this accelerated growth.

"A neuron that has seen a previous injury now has a different regenerative program than one that has never been damaged," Shin says. "We hope to be able to identify what is different between these two neurons — specifically what factors lead to the improved regeneration after a second injury. We have found that activated DLK is one such factor. We would like to activate DLK in a newly injured neuron to see if it has improved regeneration."

In addition to speeding peripheral nerve recovery, DiAntonio and Shin see possible implications in the central nervous system. It is known, for example, that some of the important factors regulated and ramped up by DLK are not activated in the central nervous system.

"Since this sort of signaling doesn’t appear to happen in the central nervous system, it’s possible these nerves don’t ‘know’ when they are injured," DiAntonio says. "It’s an exciting idea — but not at all proven — that activating DLK in the central nervous system could promote its regeneration."

Source: Science Daily

Jun 21, 2012 · 39 notes
#science #neuroscience #psychology #protein
How Humans Predict Others' Decisions

ScienceDaily (June 20, 2012) — Researchers at the RIKEN Brain Science Institute (BSI) in Japan have uncovered two brain signals in the human prefrontal cortex involved in how humans predict the decisions of other people. Their results suggest that the two signals, each located in distinct prefrontal circuits, strike a balance between expected and observed rewards and choices, enabling humans to predict the actions of people with different values than their own.

The figure shows neural activity for the simulation of another person: the reward signal (red) and the action signal (green). The action signal (green) is in the dorsomedial prefrontal cortex. The reward signal (red) largely overlaps with the signal for self-valuation (blue) in the ventromedial prefrontal cortex. (Credit: RIKEN)

Every day, humans are faced with situations in which they must predict what decisions other people will make. These predictions are essential to the social interactions that make up our personal and professional lives. The neural mechanism underlying these predictions, however, by which humans learn to understand the values of others and use this information to predict their decision-making behavior, has long remained a mystery.

Researchers at the RIKEN Brain Science Institute (BSI) in Japan have now shed light on this mystery with a paper to appear in the June 21st issue of Neuron. The researchers describe for the first time the process governing how humans learn to predict the decisions of another person using mental simulation of their mind.

Learning another person’s values and mental processes is often assumed to require simulation of the other’s mind: using one’s own familiar mental processes to simulate unfamiliar processes in the mind of the other. While simple and intuitive, this explanation is hard to prove due to the difficulty in disentangling one’s own brain signals from those of the simulated other.

Research scientists Shinsuke Suzuki and Hiroyuki Nakahara, a Principal Investigator of the Laboratory for Integrated Theoretical Neuroscience at RIKEN BSI, together with their collaborators, set out to disentangle these signals using functional Magnetic Resonance Imaging (fMRI) on humans. First, they studied the behavior of subjects as they played a game, predicting another player's choices from what they had learned about that player and his or her past decisions. Then they built a computer model of the simulation process to examine the brain signals underlying the prediction of the other's behavior.

The authors found that humans simulate the decisions of other people using two brain signals encoded in the prefrontal cortex, an area responsible for higher cognition. The first, called the reward signal, reflects the difference between the reward the other person was simulated to value and the reward the other actually received. The second, called the action signal, reflects the difference between the action the simulation predicted the other would take and what the other person actually did. The reward signal is processed in a part of the brain called the ventromedial prefrontal cortex; the action signal was found in a separate area called the dorsomedial prefrontal cortex.
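The two-signal idea has a natural reinforcement-learning flavor: a simulated value of the other person is nudged by a reward prediction error and an action prediction error. The sketch below is an illustrative toy, not the authors' actual model; the update rules, learning rates and function names are all assumptions.

```python
import math

# Toy "simulated other" learner, loosely following the two-signal idea
# described above. All parameter values and update rules here are
# illustrative assumptions, not the model from the paper.

def choice_prob(value, beta=3.0):
    """Simulated probability that the other person picks option A."""
    return 1.0 / (1.0 + math.exp(-beta * value))

def update(value, chose_a, reward, lr_reward=0.2, lr_action=0.2):
    """One learning step combining the two prediction errors."""
    predicted = choice_prob(value)                      # simulated action prediction
    action_pe = (1.0 if chose_a else 0.0) - predicted   # "action signal"
    reward_pe = reward - value                          # "reward signal"
    return value + lr_reward * reward_pe + lr_action * action_pe

# The simulated value tracks an observed sequence of choices and rewards.
v = 0.0
for chose_a, r in [(True, 1.0), (True, 1.0), (False, 0.0)]:
    v = update(v, chose_a, r)
print(round(v, 3))
```

Combining both error terms lets the learner stay calibrated even when the other person's values differ from its own, which is the balance the study describes.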

"Every day, we interact with a variety of other individuals," Suzuki said. "Some may share similar values with us and for those interactions simulation using the reward signal alone may suffice. However, other people with different values may be quite different and then the action signal may become quite important."

Nakahara believes that their approach, using mathematical models based on human behavior with brain imaging, will be useful to answer a wide range of questions about the social functions employed by the brain. “Perhaps we may one day better understand how and why humans have the ability to predict others’ behavior, even those with different characteristics. Ultimately, this knowledge could help improve political, educational, and social systems in human societies.”

Source: Science Daily

Jun 21, 2012 · 56 notes
#science #neuroscience #brain #psychology
All Things Big and Small: The Brain's Discerning Taste for Size

ScienceDaily (June 20, 2012) — The human brain can recognize thousands of different objects, but neuroscientists have long grappled with how the brain organizes object representation; in other words, how the brain perceives and identifies different objects. Now researchers at the MIT Computer Science and Artificial Intelligence Lab (CSAIL) and the MIT Department of Brain and Cognitive Sciences have discovered that the brain organizes objects based on their physical size, with a specific region of the brain reserved for recognizing large objects and another reserved for small objects.

This figure shows brain activations while participants view pictures of large and small objects. (Credit: Image courtesy of Massachusetts Institute of Technology, CSAIL)

Their findings, to be published in the June 21 issue of Neuron, could have major implications for fields like robotics, and could lead to a greater understanding of how the brain organizes and maps information.

"Prior to this study, nobody had looked at whether the size of an object was an important factor in the brain’s ability to recognize it," said Aude Oliva, an associate professor in the MIT Department of Brain and Cognitive Sciences and senior author of the study.

"It’s almost obvious that all objects in the world have a physical size, but the importance of this factor is surprisingly easy to miss when you study objects by looking at pictures of them on a computer screen," said Dr. Talia Konkle, lead author of the paper. "We pick up small things with our fingers, we use big objects to support our bodies. How we interact with objects in the world is deeply and intrinsically tied to their real-world size, and this matters for how our brain’s visual system organizes object information."

As part of their study, Konkle and Oliva took 3D scans of brain activity during experiments in which participants were asked to look at images of big and small objects or visualize items of differing size. By evaluating the scans, the researchers found that there are distinct regions of the brain that respond to big objects (for example, a chair or a table), and small objects (for example, a paperclip or a strawberry).

By looking at the arrangement of the responses, they found a systematic organization of big to small object responses across the brain’s cerebral cortex. Large objects, they learned, are processed in the parahippocampal region of the brain, an area located by the hippocampus, which is also responsible for navigating through spaces and for processing the location of different places, like the beach or a building. Small objects are handled in the inferior temporal region of the brain, near regions that are active when the brain has to manipulate tools like a hammer or a screwdriver.

The work could have major implications for the field of robotics, in particular in developing techniques for how robots deal with different objects, from grasping a pen to sitting in a chair.

"Our findings shed light on the geography of the human brain, and could provide insight into developing better machine interfaces for robots," said Oliva.

Many computer vision techniques currently focus on identifying what an object is without much guidance about the size of the object, which could be useful in recognition. “Paying attention to the physical size of objects may dramatically constrain the number of objects a robot has to consider when trying to identify what it is seeing,” said Oliva.

The study’s findings are also important for understanding how the organization of the brain may have evolved. The work of Konkle and Oliva suggests that the human visual system’s method for organizing thousands of objects may also be tied to human interactions with the world. “If experience in the world has shaped our brain organization over time, and our behavior depends on how big objects are, it makes sense that the brain may have established different processing channels for different actions, and at the center of these may be size,” said Konkle.

Oliva, a cognitive neuroscientist by training, has focused much of her research on how the brain tackles scene and object recognition, as well as visual memory. Her ultimate goal is to gain a better understanding of the brain’s visual processes, paving the way for the development of machines and interfaces that can see and understand the visual world like humans do.

"Ultimately, we want to focus on how active observers move in the natural world. We think this not only matters for large-scale brain organization of the visual system, but it also matters for making machines that can see like us," said Konkle and Oliva.

Source: Science Daily

Jun 21, 2012 · 14 notes
#science #neuroscience #brain #psychology
Simple mathematical pattern describes shape of neuron 'jungle'

June 20, 2012

Neurons come in an astounding assortment of shapes and sizes, forming a thick, interconnected jungle of cells. Now UCL neuroscientists have found that there is a simple pattern that describes the tree-like shape of all neurons.

Neurons look remarkably like trees, and connect to other cells with many branches that effectively act like wires in an electrical circuit, carrying impulses that represent sensation, emotion, thought and action.

Over 100 years ago, Santiago Ramon y Cajal, the father of modern neuroscience, sought to systematically describe the shapes of neurons, and was convinced that there must be a unifying principle underlying their diversity.

Cajal proposed that neurons spread out their branches so as to use as little wiring as possible to reach other cells in the network. Reducing the amount of wiring between cells provides additional space to pack more neurons into the brain, and therefore increases its processing power.

New work by UCL neuroscientists, published today in Proceedings of the National Academy of Sciences, has revisited this century-old hypothesis using modern computational methods. They show that a simple computer program which connects points with as little wiring as possible can produce tree-like shapes which are indistinguishable from real neurons - and also happen to be very beautiful. They also show that the shape of neurons follows a simple mathematical relationship called a power law.

Power laws have been shown to be common across the natural world, and often point to simple rules underlying complex structures. Dr Hermann Cuntz (UCL Wolfson Institute for Biomedical Research) and colleagues find that the power law holds true for many types of neurons gathered from across the animal kingdom, providing strong evidence for Ramon y Cajal’s general principle.
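The core of the wiring-minimization idea can be illustrated with a few lines of code: greedily connect scattered target points into a tree using as little total wire as possible (Prim's minimum-spanning-tree algorithm). This is only a sketch of the general principle; the published model is richer, also trading wiring cost against conduction-path length back to the root.

```python
import math
import random

# Toy wiring-minimization model: grow a tree over scattered target points,
# always attaching the outside point closest to the existing tree (Prim's
# minimum spanning tree). Point coordinates are random; this illustrates
# the principle, not the paper's exact algorithm.

def min_wiring_tree(points):
    """Greedily grow a tree that minimizes total wiring length."""
    in_tree = [points[0]]                 # start the tree at the "root"
    remaining = list(points[1:])
    edges, total_wire = [], 0.0
    while remaining:
        # Attach the outside point closest to any point already in the tree.
        a, b = min(((p, q) for p in in_tree for q in remaining),
                   key=lambda pq: math.dist(pq[0], pq[1]))
        edges.append((a, b))
        total_wire += math.dist(a, b)
        in_tree.append(b)
        remaining.remove(b)
    return edges, total_wire

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(30)]
edges, total_wire = min_wiring_tree(pts)
print(len(edges), round(total_wire, 2))   # a tree on n points has n - 1 edges
```

Plotting the edges of such a tree already yields branching shapes strikingly reminiscent of dendrites, which is why wiring minimization is such an appealing explanation for neuronal form.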

The UCL team further tested the theory by examining neurons in the olfactory bulb, a part of the brain where new brain cells are constantly being formed. These neurons grow and form new connections even in the adult brain, and therefore provide a unique window into the rules behind the development of neural trees in a mature neural circuit.

The team analysed the change in shape of the newborn olfactory neurons over several days, and found that the growth of these neurons also follows the power law, providing further evidence to support the theory.

Dr Hermann Cuntz said: “The ultimate goal of neuroscience is to understand how the impenetrable neural jungle can give rise to the complexity of behaviour.

"Our findings confirm Cajal’s original far-reaching insight that there is a simple pattern behind the circuitry, and provide hope that neuroscientists will someday be able to see the forest for the trees."

Provided by University College London

Source: medicalxpress.com

Jun 21, 2012 · 29 notes
#science #neuroscience #brain #psychology #neuron
Fishing for Answers to Autism Puzzle

ScienceDaily (June 19, 2012) — Fish cannot display symptoms of autism, schizophrenia, or other human brain disorders. However, a team of Whitehead Institute and MIT scientists has shown that zebrafish can be a useful tool for studying the genes that contribute to such disorders.

Zebrafish with certain genes turned off during embryonic development (center and right images) showed abnormalities of brain formation (top row) and axon wiring (bottom row). At left is a normally developing zebrafish embryo. (Credit: Sive Lab)

Led by Whitehead Member Hazel Sive, the researchers set out to explore a group of about two dozen genes known to be either missing or duplicated in about 1 percent of autistic patients. Most of the genes’ functions were unknown, but a new study by Sive and Whitehead postdocs Alicia Blaker-Lee, Sunny Gupta and Jasmine McCammon revealed that nearly all of them produced brain abnormalities when deleted in zebrafish embryos.

The findings, published online recently in the journal Disease Models & Mechanisms, should help researchers pinpoint genes for further study in mammals, says Sive, who is also professor of biology and associate dean of MIT’s School of Science. Autism is thought to arise from a variety of genetic defects; this research is part of a broad effort to identify culprit genes and develop treatments that target them.

"That’s really the goal — to go from an animal that shares molecular pathways, but doesn’t get autistic behaviors, into humans who have the same pathways and do show these behaviors," Sive says.

Sive recalls that some of her colleagues chuckled when she first proposed studying human brain disorders in fish, but it is actually a logical starting point, she says. Brain disorders are difficult to study because most of the symptoms are behavioral, and the biological mechanisms behind those behaviors are not well understood, she says.

"We thought that since we really know so little, that a good place to start would be with the genes that confer risk in humans to various mental health disorders, and to study these various genes in a system where they can readily be studied," she says.

Those genes tend to be the same across species — conserved throughout evolution, from fish to mice to humans — though they may control somewhat different outcomes in each species.

In the latest study, Sive and her colleagues focused on a genetic region known as 16p11.2, first identified by Mark Daly, a former Whitehead Fellow who discovered a type of genetic defect known as a copy number variant. A typical genome includes two copies of every gene, one from each parent; copy number variants occur when one of those copies is deleted or duplicated, and this can be associated with pathology.

The central “core” 16p11.2 region includes 25 genes. Both deletions and duplications in this region have been associated with autism, but it was unclear which of the genes might actually produce symptoms of the disease. “At the time, there was an inkling about some of them, but very few,” Sive says.

Sive and her postdocs began by identifying zebrafish genes analogous to the human genes found in this region. (In zebrafish, these genes are not clustered in a single genetic chunk, but are scattered across many chromosomes.) The researchers studied one gene at a time, silencing each with short strands of nucleic acids that target a particular gene and prevent its protein from being produced.

For 21 of the genes, silencing led to abnormal development. Most produced brain deficits, including improper development of the brain or eyes, thinning of the brain, or inflation of the brain ventricles, cavities that contain cerebrospinal fluid. The researchers also found abnormalities in the wiring of axons, the long neural projections that carry messages to other neurons, and in simple behaviors of the fish. The results show that the 16p11.2 genes are very important during brain development, helping to explain the connection between this region and brain disorders.

Furthermore, the researchers were able to restore normal development by treating the fish with the human equivalents of the genes that had been repressed. “That allows you to deduce that what you’re learning in fish corresponds to what that gene is doing in humans. The human gene and the fish gene are very similar,” Sive says.

To figure out which of these genes might have a strong effect in autism or other disorders, the researchers set out to identify genes that produce abnormal development when their activity is reduced by 50 percent, which would happen in someone who is missing one copy of the gene. (This correlation is not seen for most genes, because there are many other checks and balances that regulate how much of a particular protein is made.)

The researchers identified two such genes in the 16p11.2 region. One, called kif22, codes for a protein involved in the separation of chromosomes during cell division, and one, aldolase a, is involved in glycolysis — the process of breaking down sugar to generate energy for the cell.

In work that has just begun, Sive’s lab is collaborating with Stanford University researchers to test in mice predictions made from the zebrafish study. They are also conducting molecular studies in zebrafish of the pathways affected by these genes, to get a better idea of how defects in these pathways might bring about neurological disorders.

Source: Science Daily

Jun 20, 2012 · 25 notes
#science #neuroscience #brain #psychology #autism
Study Finds High Brain Integration in Top Performers

June 19, 2012 By Janice Wood

Why do some people excel in sports, music and managing companies? New research points to uniquely high mind-brain development in those who excel.


“What we have found is an astonishing integration of brain functioning in high performers compared to average-performing controls,” said Fred Travis, Ph.D., director of the Center for Brain, Consciousness, and Cognition at Maharishi University of Management in Fairfield, Iowa.

He claims this research is the “first in the world to show that there is a brain measure of effective leadership.”

In the study, published in the journal Cognitive Processing, researchers found that 20 top-level managers scored higher on three measures — the Brain Integration Scale, Gibbs’s Socio-moral Reasoning questionnaire, and an inventory of peak experiences — compared to 20 low-level managers who served as controls.

“The current understanding of high performance is fragmented,” said co-researcher Harald Harung, Ph.D., of the Oslo and Akershus University College of Applied Sciences in Norway.

“What we have done in our research is to use quantitative and neurophysiological research methods on topics that so far have been dominated by psychology.”

The researchers carried out four studies comparing world-class performers to average performers. This recent study and two others examined top performers in management, sports and classical music. A number of years ago Harung and his colleagues published a study on a variety of professions, such as public administration, management, sports, arts, and education.

The studies include using electroencephalography (EEG) to look at the extent of integration and development of several brain processes.

Read More →

Jun 20, 2012 · 31 notes
#science #neuroscience #psychology #brain
Infants Can't Distinguish Between Large and Small Groups

ScienceDaily (June 19, 2012) — Human brains process large and small numbers of objects using two different mechanisms, but infants have not yet developed the ability to make those two processes work together, according to new research from the University of Missouri.

"This research was the first to show the inability of infants in a single age group to discriminate large and small sets in a single task," said Kristy vanMarle, assistant professor of psychological sciences in the College of Arts and Science. "Understanding how infants develop the ability to represent and compare numbers could be used to improve early education programs."

The MU study found that infants consistently chose the larger of two groups of food items when both sets were larger or smaller than four, just as an adult would. Unlike adults, the infants showed no preference for the larger group when choosing between one large and one small set. The results suggest that at age one, infants have not yet integrated the two mental functions: one being the ability to estimate numbers of items at a glance, and the other being the ability to visually track small sets of objects.

In vanMarle’s study, 10- to 12-month-old infants were presented with two opaque cups. Different numbers of pieces of breakfast cereal were hidden in each cup, while the infants observed, and then the infants were allowed to choose a cup. Four comparisons were tested between different combinations of large and small sets. Infants consistently chose two food items over one and eight items over four, but chose randomly when asked to compare two versus four and two versus eight.
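The pattern in the four comparisons can be caricatured in a toy model. Everything below — the set-size limit of the object-tracking system, the function name — is an illustrative assumption, not the study's actual analysis:

```python
# Hypothetical sketch: infants are thought to track small sets with an
# object-tracking system of limited capacity and larger sets with an
# approximate-number system. If the two systems don't share a common
# format, comparisons that cross the small/large boundary fall to chance.

SMALL_SET_LIMIT = 3  # assumed capacity of the object-tracking system

def predicted_choice_reliable(a: int, b: int) -> bool:
    """Predict whether a 10- to 12-month-old reliably picks the larger set."""
    both_small = a <= SMALL_SET_LIMIT and b <= SMALL_SET_LIMIT
    both_large = a > SMALL_SET_LIMIT and b > SMALL_SET_LIMIT
    return both_small or both_large  # cross-boundary comparisons -> chance

# The four comparisons from the MU study:
for a, b in [(1, 2), (4, 8), (2, 4), (2, 8)]:
    outcome = "reliable" if predicted_choice_reliable(a, b) else "chance"
    print(f"{a} vs {b} -> {outcome}")
```

Under this assumed limit, 1 vs 2 and 4 vs 8 stay within a single system and come out "reliable", while 2 vs 4 and 2 vs 8 cross the boundary and come out "chance" — matching the choices the infants actually made.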

"Being unable to determine that eight is larger than two would put an organism at a serious disadvantage," vanMarle said. "However, ongoing studies in my lab suggest that the capacity to compare small and large sets seems to develop before age two."

The ability to make judgments about the relative number of objects in a group has old evolutionary roots. Dozens of species, including some fish, monkeys and birds, have shown the ability to recognize numerical differences in laboratory studies. VanMarle speculated that being unable to compare large and small sets early in infancy may not have been problematic during human evolution because young children probably received most of their food and protection from caregivers. Infants’ survival didn’t depend on determining which bush had the most berries or how many predators they just saw, she said.

"In the modern world there are educational programs that claim to give children an advantage by teaching them arithmetic at an early age," said vanMarle. "This research suggests that such programs may be ineffective simply because infants are unable to compare some numbers with others."

Source: Science Daily

Jun 20, 2012 · 13 notes
#science #neuroscience #brain #psychology
Detector of DNA Damage: Structure of a Repair Factor Revealed

ScienceDaily (June 19, 2012) — Double-stranded breaks in cellular DNA can trigger tumorigenesis. LMU researchers have now determined the structure of a protein involved in the repair and signaling of DNA double-strand breaks. The work throws new light on the origins of neurodegenerative diseases and certain tumor types.

Agents such as radiation or environmental toxins can cause double-stranded breaks in genomic DNA, which facilitate the development of tumors or the neurodegenerative disorders ataxia telangiectasia (AT) and AT-like disease (ATLD). Hence efficient repair mechanisms are essential for cell survival and function. The so-called MRN complex is an important component of one such system, and its structure has just been elucidated by a team led by Professor Karl-Peter Hopfner of LMU’s Gene Center.

Malignant mutations

The MRN complex consists of the nuclease Mre11, the ATPase Rad50 and the protein Nbs1. Nbs1 is responsible for recruiting the protein ATM, which plays a central role in early stages of the cellular response to DNA damage, to the site of damage. “How the MRN complex actually recognizes double-stranded breaks is still not clear,” says Hopfner. He and his colleagues therefore set out to clarify the issue by analyzing the structures of mutant, functionally defective versions of the complex.

"We found that pairs of Mre11 molecules form a flexible dimer, which is stabilized by Nbs1." Mutations in different subunits of the complex are associated with distinct syndromes, marked by a predisposition to certain cancers, sensitivity to radiation or neurodegeneration. Hopfner’s results help to explain these differences. For instance, the mutation linked to ATLD lies within the zone of contact between Mre11 and Nbs1, and may inhibit activation of ATM by weakening their interaction.

Source: Science Daily

Jun 20, 2012 · 5 notes
#science #neuroscience #biology #DNA
Hulk smash? Maybe not anymore: scientists block excess aggression in mice

June 19, 2012

Pathological rage can be blocked in mice, researchers have found, suggesting potential new treatments for severe aggression, a widespread trait characterized by sudden violence, explosive outbursts and hostile overreactions to stress.

In a study appearing today in the Journal of Neuroscience, researchers from the University of Southern California and Italy identify a critical neurological factor in aggression: a brain receptor that malfunctions in overly hostile mice. When the researchers shut down the brain receptor, which also exists in humans, the excess aggression completely disappeared.

The findings are a significant breakthrough in developing drug targets for pathological aggression, a component in many common psychological disorders including Alzheimer’s disease, autism, bipolar disorder and schizophrenia.

"From a clinical and social point of view, reactive aggression is absolutely a major problem," said Marco Bortolato, lead author of the study and research assistant professor of pharmacology and pharmaceutical sciences at the USC School of Pharmacy. “We want to find the tools that might reduce impulsive violence.”

A large body of independent research, including past work by Bortolato and senior author Jean Shih, USC University Professor and Boyd & Elsie Welin Professor in Pharmacology and Pharmaceutical Sciences at USC, has identified a specific genetic predisposition to pathological aggression: low levels of the enzyme monoamine oxidase A (MAO A). Both male humans and male mice with congenital deficiency of the enzyme respond violently to stress.

"The same type of mutation that we study in mice is associated with criminal, very violent behavior in humans. But we really didn’t understand why that is," Bortolato said.

Bortolato and Shih worked backwards to replicate elements of human pathological aggression in mice, including not just low enzyme levels but also the interaction of genetics with early stressful events such as trauma and neglect during childhood.

"Low levels of MAO A are one basis of the predisposition to aggression in humans. The other is an encounter with maltreatment, and the combination of the two factors appears to be deadly: it results consistently in violence in adults," Bortolato said.

The researchers show that in excessively aggressive rodents that lack MAO A, high levels of electrical stimulus are required to activate a specific brain receptor in the pre-frontal cortex. Even when this brain receptor does work, it stays active only for a short period of time.

"The fact that blocking this receptor moderates aggression is why this discovery has so much potential. It may have important applications in therapy," Bortolato said. "Whatever the ways environment can persistently affect behavior — and even personality over the long term — behavior is ultimately supported by biological mechanisms."

Importantly, the aggression receptor, known as NMDA, is also thought to play a key role in helping us make sense of multiple, coinciding streams of sensory information, according to Bortolato.

The researchers are now studying the potential side effects of drugs that reduce the activity of this receptor.

"Aggressive behaviors have a profound socio-economic impact, yet current strategies to reduce these staggering behaviors are extremely unsatisfactory," Bortolato said. "Our challenge now is to understand what pharmacological tools and what therapeutic regimens should be administered to stabilize the deficits of this receptor. If we can manage that, this could truly be an important finding."

Provided by University of Southern California

Source: medicalxpress.com

Jun 20, 2012 · 19 notes
#science #neuroscience #brain #psychology
Front-most part of the cortex involved in making short-term predictions about what will happen next

June 19, 2012

Researchers at the University of Iowa, together with colleagues from the California Institute of Technology and New York University, have discovered how a part of the brain helps predict future events from past experiences. The work sheds light on the function of the front-most part of the frontal lobe, known as the frontopolar cortex, an area of the cortex uniquely well developed in humans in comparison with apes and other primates.


The image shows the overlap of lesions for eight subjects superimposed on a template brain — red indicates maximum overlap (seven subjects) and dark blue is minimum overlap (one subject). The patient group was selected for lesions that include frontopolar cortex, but the lesions almost invariably extended outside to other parts of anterior prefrontal cortex. Credit: Christopher Kovach, University of Iowa

Making the best possible decisions in a changing and unpredictable environment is an enormous challenge. Not only does it require learning from past experience, but it also demands anticipating what might happen under previously unencountered circumstances. Past research from the UI Department of Neurology was among the first to show that damage to certain parts of the frontal lobe can cause severe deficits in decision making in rapidly changing environments. The new study from the same department on a rare group of patients with damage to the very frontal part of their brains reveals a critical aspect of how this area contributes to decision making. The findings were published June 19 in the Journal of Neuroscience.

"We gave the patients four slot machines from which to pick in order to win money. Unbeknownst to the patients, the probability of getting money from a particular slot machine gradually and unpredictably changed during the experiment. Finding the strategy that pays the most in the long run is a surprisingly difficult problem to solve, and one we hypothesized would require the frontopolar cortex,” explains Christopher Kovach, Ph.D., a UI post-doctoral fellow in neurosurgery and first author of the study.

Contrary to the authors’ initial expectation, the patients actually did quite well on the task, winning as much money, on average, as healthy control participants.

"But when we compared their behavior to that of subjects with intact frontal lobe, we found they used a different set of assumptions about how the payoffs changed over time,” Kovach says. “Both groups based their decisions on how much they had recently won from each slot machine, but healthy comparison subjects pursued a more elaborate strategy, which involved predicting the direction that payoffs were moving based on recent trends. This points towards a specific role for the frontopolar cortex in extrapolating recent trends.”
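The two strategies described here — tracking recent payoffs versus also extrapolating recent trends — can be caricatured in a short simulation. All parameters (the drift size, the learning rate, the initial payoff probabilities) are invented for illustration; this is not the study's task code:

```python
import random

# Toy "restless" four-armed bandit: each arm's payoff probability drifts
# as a random walk, and two strategies are compared — recency-only value
# tracking vs. recency plus extrapolation of the recent trend in value.

random.seed(0)

def drift(p):
    """Random-walk step for a payoff probability, clipped to [0.05, 0.95]."""
    return min(0.95, max(0.05, p + random.gauss(0, 0.02)))

def run(extrapolate_trend, n_trials=1000, alpha=0.3):
    probs = [0.25, 0.5, 0.5, 0.75]   # assumed initial payoff probabilities
    values = [0.5] * 4               # recency-weighted value estimates
    prev_values = [0.5] * 4          # value estimates before the last update
    wins = 0
    for _ in range(n_trials):
        # The trend strategy adds the recent change in value to each score.
        scores = [v + (v - pv if extrapolate_trend else 0.0)
                  for v, pv in zip(values, prev_values)]
        arm = scores.index(max(scores))
        reward = 1 if random.random() < probs[arm] else 0
        wins += reward
        prev_values[arm] = values[arm]
        values[arm] += alpha * (reward - values[arm])  # delta-rule update
        probs = [drift(p) for p in probs]              # payoffs drift on
    return wins / n_trials

print("recency only:   ", run(False))
print("recency + trend:", run(True))
```

Because the drift here is pure random walk, the "trends" carry no information — mirroring the study's design, in which extrapolating them buys the comparison subjects nothing.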

Kovach’s colleague and study author Ralph Adolphs, Ph.D., professor of neuroscience and psychology at the California Institute of Technology, adds that the study results “argue that the frontopolar cortex helps us to make short-term predictions about what will happen next, a strategy particularly useful in environments that change rapidly — such as the stock market or most social settings.”

Adolphs also holds an adjunct appointment in the UI Department of Neurology.

The study’s innovative approach to understanding the function of this part of the brain uses model-based analyses of behavior of patients with specific and precisely characterized areas of brain damage. These patients are members of the UI’s world-renowned Iowa Neurological Patient Registry, which was established in 1982 and has more than 500 active members with selective forms of damage, or lesions, to one or two defined regions in the brain.

"The University of Iowa is one of the few places in the world where you could carry out this kind of study, since it requires carefully assessed patients with damage to specific parts of their brain," says study author Daniel Tranel, Ph.D., UI professor of neurology and psychology and director of the UI Division of Behavioral Neurology and Cognitive Neuroscience.

In a final twist to the finding, the strategy taken by lesion patients was actually slightly better than the one used by comparison subjects. It happened that the task was designed so that the trends in the payoffs were, in fact, random and uninformative.

"The healthy comparison subjects seemed to perceive trends in what was just random noise," Kovach says.

This implies that the functions of the frontopolar cortex, which support more complex and detailed models of the environment, at times come with a downside: setting up mistaken assumptions.

"To the best of my knowledge this is the first study which links a normal tendency to see a nonexistent pattern in random noise, a type of cognitive bias, to a particular brain region," Kovach notes.

The researchers next want to investigate other parts of the frontal cortex in the brain, and have also begun to record activity directly from the brains of neurosurgical patients to see how single cells respond while making decisions. The work is also important to understand difficulties in decision making seen in disorders such as addiction.

Provided by University of Iowa

Source: medicalxpress.com

Jun 20, 2012 · 14 notes
#science #neuroscience #brain #psychology
First example of a heritable abnormality affecting semantic cognition found

June 19, 2012

Four generations of a single family have been found to possess an abnormality within a specific brain region which appears to affect their ability to recall verbal material, a new study by researchers at the University of Bristol and University College London has found.

This is the first suggestion of such a heritable abnormality in otherwise healthy humans, and it has important implications for our understanding of the genetic basis of cognition.

Dr Josie Briscoe of Bristol’s School of Experimental Psychology and colleagues at the Institute of Child Health in London studied eight members of a single family (aged 8 years) who, despite all having high levels of intelligence, have since childhood experienced profound difficulties in recalling sentences and prose, along with language difficulties in listening comprehension and in naming less common objects.

While their conversation is articulate and engaging, they can experience the inability to ‘find’ a particular word or topic – a phenomenon similar to the ‘tip-of-the-tongue’ problem experienced by many people. They also report associated problems such as struggling to follow a narrative thread while reading or watching television drama.

Dr Briscoe said: “With their consent, we conducted a number of standard memory and language tests on the affected members of the family. These showed they had difficulty repeating longer sentences correctly and learning words in lists and pairs. This suggests their difficulties lie in semantic cognition: the way people construct and generate meaning from words, objects and ideas.”

"Given the very wide variation in age, the coherence of their difficulties in semantic cognition was remarkable."

The researchers also used Magnetic Resonance Imaging (MRI) to study the brains of the affected family members and found they had reduced grey matter in the posterior inferior portion of the temporal lobe, a brain area known to be involved in semantic cognition.

Dr Briscoe said: “These brain abnormalities were surprising to find in healthy people, particularly in the same family, although similar brain regions have been implicated in research with older adults with neurological problems that are linked to semantic cognition.”

"Our findings have uncovered a potential causal link between anomalous neuroanatomy and semantic cognition in a single family. Importantly, the pattern of inheritance appears as a potentially dominant trait. This may well prove to be the first example of a heritable, highly specific abnormality affecting semantic cognition in humans.”

Provided by University of Bristol

Source: medicalxpress.com

Jun 20, 2012 · 10 notes
#science #neuroscience #brain #psychology
'Hallucinating' robots arrange objects for human use

June 18, 2012 By Bill Steele

(Phys.org) — If you hire a robot to help you move into your new apartment, you won’t have to send out for pizza. But you will have to give the robot a system for figuring out where things go. The best approach, according to Cornell researchers, is to ask “How will humans use this?”


A robot populates a room with imaginary human stick figures in order to decide where objects should go to suit the needs of humans.

Researchers in the Personal Robotics Lab of Ashutosh Saxena, assistant professor of computer science, have already taught robots to identify common objects, pick them up and place them stably in appropriate locations. Now they’ve added the human element by teaching robots to “hallucinate” where and how humans might stand, sit or work in a room, and place objects in their usual relationship to those imaginary people.

Their work will be reported at the International Symposium on Experimental Robotics, June 21 in Quebec, and the International Conference on Machine Learning, June 29 in Edinburgh, Scotland.

Previous work on robotic placement, the researchers note, has relied on modeling relationships between objects. A keyboard goes in front of a monitor, and a mouse goes next to the keyboard. But that doesn’t help if the robot puts the monitor, keyboard and mouse at the back of the desk, facing the wall.


Above left, random placement of objects in a scene puts food on the floor, shoes on the desk and a laptop teetering on top of the fridge. Considering the relationships between objects (upper right) is better, but the laptop faces away from a potential user and the food sits higher than most humans would like. Adding human context (lower left) makes things more accessible. Lower right: how an actual robot carried it out. (Personal Robotics Lab)

Relating objects to humans not only avoids such mistakes but also makes computation easier, the researchers said, because each object is described in terms of its relationship to a small set of human poses, rather than to the long list of other objects in a scene. A computer learns these relationships by observing 3-D images of rooms with objects in them, into which it imagines human figures, placing them in practical relationships with objects and furniture. You don’t put a sitting person where there is no chair. You could put a sitting person on top of a bookcase, but there are no objects there for the person to use, so that placement is ignored. The computer calculates the distance of objects from various parts of the imagined human figures, and notes the orientation of the objects.

Eventually it learns commonalities: There are lots of imaginary people sitting on the sofa facing the TV, and the TV is always facing them. The remote is usually near a human’s reaching arm, seldom near a standing person’s feet. “It is more important for a robot to figure out how an object is to be used by humans, rather than what the object is. One key achievement in this work is using unlabeled data to figure out how humans use a space,” Saxena said.

In a new situation, the robot places human figures in a 3-D image of a room, locating them in relation to objects and furniture already there. “It puts a sample of human poses in the environment, then figures out which ones are relevant and ignores the others,” Saxena explained. It decides where new objects should be placed in relation to the human figures, and carries out the action.
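Very loosely, the placement step amounts to scoring candidate locations against the sampled human poses. The reach distance, scoring function, and coordinates below are invented for illustration; this is not the Cornell system's actual model:

```python
import math

# Hedged sketch: score a candidate object placement by its distance to
# sampled ("hallucinated") human poses, preferring locations that sit
# near an assumed comfortable reaching distance from some pose.

PREFERRED_REACH = 0.6  # metres; an assumed comfortable reach

def placement_score(obj_xy, human_poses):
    """Higher when the object sits near some sampled pose's reach."""
    best = 0.0
    for hx, hy in human_poses:
        d = math.hypot(obj_xy[0] - hx, obj_xy[1] - hy)
        # Gaussian preference peaked at the assumed reach distance.
        best = max(best, math.exp(-((d - PREFERRED_REACH) ** 2) / 0.1))
    return best

# Imaginary sitting poses on a sofa, and two candidate spots for a TV
# remote: the coffee table in front of them vs. the far corner of the room.
poses = [(1.0, 0.5), (1.5, 0.5)]
print("coffee table:", placement_score((1.2, 1.0), poses))
print("far corner:  ", placement_score((4.0, 3.0), poses))
```

The coffee table, within reach of a sampled sitter, scores far higher than the distant corner — the same logic that keeps the remote near a reaching arm rather than at a standing person's feet.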

The researchers tested their method using images of living rooms, kitchens and offices from the Google 3-D Warehouse, and later, images of local offices and apartments. Finally, they programmed a robot to carry out the predicted placements in local settings. Volunteers who were not associated with the project rated the placement of each object for correctness on a scale of 1 to 5.

Comparing various algorithms, the researchers found that placements based on human context were more accurate than those based solely on relationships between objects, but the best results of all came from combining human context with object-to-object relationships, with an average score of 4.3. Some tests were done in rooms with furniture and some objects, others in rooms where only a major piece of furniture was present. The object-only method performed significantly worse in the latter case because there was no context to use. “The difference between previous works and our [human to object] method was significantly higher in the case of empty rooms,” Saxena reported.

Provided by Cornell University

Source: phys.org

Jun 19, 2012 · 11 notes
#science #neuroscience #robotics
Robots Get a Feel for the World

June 18th, 2012

Robots equipped with tactile sensors are able to identify materials through touch, paving the way for more useful prostheses.

What does a robot feel when it touches something? Little or nothing until now. But with the right sensors, actuators and software, robots can be given the sense of feel, or at least the ability to identify different materials by touch.

Researchers at the University of Southern California’s Viterbi School of Engineering published a study today in Frontiers in Neurorobotics showing that a specially designed robot can outperform humans in identifying a wide range of natural materials according to their textures, paving the way for advancements in prostheses, personal assistive robots and consumer product testing.

The robot was equipped with a new type of tactile sensor built to mimic the human fingertip. It also used a newly designed algorithm to decide how to explore the outside world by imitating human strategies. Beyond texture, the sensor can also tell where and in which direction forces are applied to the fingertip, and even sense the thermal properties of an object being touched.

Like the human finger, the group’s BioTac® sensor has a soft, flexible skin over a liquid filling. The skin even has fingerprints on its surface, greatly enhancing its sensitivity to vibration. As the finger slides over a textured surface, the skin vibrates in characteristic ways. These vibrations are detected by a hydrophone inside the bone-like core of the finger. The human finger uses similar vibrations to identify textures, but the robot finger is even more sensitive.

[Video: Robots Get a Feel for the World]
What does a robot feel when it touches something? Little or nothing, until now. Researchers at the USC Viterbi School of Engineering published a study in Frontiers in Neurorobotics showing that specially designed robots can be taught to feel even more than humans. Vimeo video by USC Viterbi.

When humans try to identify an object by touch, they use a wide range of exploratory movements based on their prior experience with similar objects. A famous theorem by 18th century mathematician Thomas Bayes describes how decisions might be made from the information obtained during these movements. Until now, however, there was no way to decide which exploratory movement to make next. The article, authored by Professor of Biomedical Engineering Gerald Loeb and recently graduated doctoral student Jeremy Fishel, describes their new theorem for solving this general problem as “Bayesian Exploration.”
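A minimal sketch of the Bayesian idea, with invented movements, materials and likelihoods (nothing below comes from Fishel and Loeb's actual model): maintain a posterior over candidate materials, and pick the exploratory movement whose outcome varies most across the hypotheses, since that outcome will discriminate best:

```python
# Assumed P(high-vibration reading | material, movement) for two invented
# exploratory movements and three invented materials.
LIKELIHOOD = {
    "slide": {"silk": 0.1, "denim": 0.8, "sandpaper": 0.9},
    "press": {"silk": 0.3, "denim": 0.4, "sandpaper": 0.5},
}

def update(prior, movement, observed_high):
    """Bayes' rule: posterior over materials after one movement's outcome."""
    post = {}
    for material, p in prior.items():
        like = LIKELIHOOD[movement][material]
        post[material] = p * (like if observed_high else 1 - like)
    z = sum(post.values())
    return {m: p / z for m, p in post.items()}

def discriminability(prior, movement):
    """Crude usefulness score: variance of likelihoods across materials."""
    ls = [LIKELIHOOD[movement][m] for m in prior]
    mean = sum(ls) / len(ls)
    return sum((l - mean) ** 2 for l in ls)

prior = {"silk": 1 / 3, "denim": 1 / 3, "sandpaper": 1 / 3}
move = max(LIKELIHOOD, key=lambda mv: discriminability(prior, mv))
print("chosen movement:", move)  # -> "slide": its outcomes differ most
posterior = update(prior, move, observed_high=True)
print("posterior:", posterior)
```

Iterating this loop — choose the most informative movement, observe, update — is, in spirit, how a handful of exploratory movements can narrow more than a hundred candidate materials down to one.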

Built by Fishel, the specialized robot was trained on 117 common materials gathered from fabric, stationery and hardware stores. When confronted with one material at random, the robot could correctly identify the material 95% of the time, after intelligently selecting and making an average of five exploratory movements. It was only rarely confused by pairs of similar textures that human subjects making their own exploratory movements could not distinguish at all.


Tactile sensors that mimic fingertips enable robots to identify materials through touch better than humans can. Image from a press release by the USC Viterbi School of Engineering.

So, is touch another task that humans will outsource to robots? Fishel and Loeb point out that while their robot is very good at identifying which textures are similar to each other, it has no way to tell what textures people will prefer. Instead, they say this robot touch technology could be used in human prostheses or to assist companies who employ experts to assess the feel of consumer products and even human skin.

Source: Neuroscience News

Jun 19, 2012 · 13 notes
#science #neuroscience #robotics
Children, Brain Development and the Criminal Law

ScienceDaily (June 18, 2012) — The legal system needs to take greater account of new discoveries in neuroscience showing how a difficult childhood can affect the development of a young person’s brain, which can increase the risk of adolescent crime, according to researchers.

The research will be presented as part of an Economic and Social Research Council seminar series in conjunction with the Parliamentary Office of Science and Technology.

Neuroscientists have recently shown that early adversity — such as a very chaotic and frightening home life — can result in a young child becoming hypervigilant to potential threats in their environment. This appears to influence the development of brain connectivity and functions.

Such children may come to adolescence with brain systems that are set differently, and this may increase their likelihood of taking impulsive risks. For many young offenders such early adversity is a common experience, and it may increase both their vulnerability to mental health problems and also their risk of problem behaviours.

These insights, from a team led by Dr Eamon McCrory, University College London, are part of a wave of neuroscientific research questions that have potential implications for the legal system.

Other research by Dr Seena Fazel of Oxford University has shown that while social disadvantage is a major risk factor for offending, a Traumatic Brain Injury (TBI) — from an accident or assault — significantly increases the risk of involvement in violent crime. Professor Huw Williams, at the University of Exeter, has similarly shown that around 45 per cent of young offenders have TBI histories, and more injuries are associated with greater violence.

Professor Williams said: “The latest message from neuroscience is that young people who suffer troubled childhoods may experience a kind of ‘triple whammy’. A difficult social background may put them at greater risk of offending and influence their brain development early on in childhood in a way that increases risky behaviour. This can then increase their chances of experiencing an injury to their brains that would compromise their ability to stay in school or contribute to society still further.”

Professor Williams wants to see better communication between neuroscientists, clinicians and lawyers so that research findings like these lead to changes in the legal system. “There is a big gap between research conducted by neuroscientists and the realities of the day to day work of the justice system,” he said. “Although criminal behaviour results from a complex interplay of a host of factors, neuroscientists and clinicians are identifying key risk factors that — if addressed — could reduce crime. Investment in earlier, focussed interventions may offset the costs of years of custody and social violence.”

Dr Eileen Vizard, a prominent adolescent forensic psychiatrist, will talk at the event Neuroscience, Children and the Law, about how the criminal justice system needs to be changed to age appropriate sentencing for children as young as ten years old, whilst also providing for the welfare needs of these deprived children. Laura Hoyano — a leading expert on vulnerable people in criminal courts — will discuss the problems children face when testifying in criminal courts.

Source: Science Daily

Jun 19, 2012 · 11 notes
#science #neuroscience #psychology #brain
Clues to Nervous System Evolution Found in Nerve-Less Sponge

ScienceDaily (June 18, 2012) — UC Santa Barbara scientists turned to the simple sponge to find clues about the evolution of the complex nervous system and found that, but for a mechanism that coordinates the expression of genes that lead to the formation of neural synapses, sponges and the rest of the animal world may not be so distant after all. Their findings, titled “Functionalization of a protosynaptic gene expression network,” are published in the Proceedings of the National Academy of Sciences.


The genes of Amphimedon queenslandica, a marine sponge native to the Great Barrier Reef, Australia, have been fully sequenced, allowing the researchers to monitor gene expression for signs of neural development. (Credit: UCSB)

"If you’re interested in finding the truly ancient origins of the nervous system itself, we know where to look," said Kenneth Kosik, Harriman Professor of Neuroscience Research in the Department of Molecular, Cellular & Developmental Biology, and co-director of UCSB’s Neuroscience Research Institute.

That place, said Kosik, is the evolutionary period when virtually the rest of the animal kingdom branched off from a common ancestor it shared with sponges, the oldest known animal group with living representatives. Something must have happened to spur the evolution of the nervous system, a characteristic shared by creatures ranging from simple jellyfish and hydra to complex humans, according to Kosik.

A previous sequencing of the genome of Amphimedon queenslandica — a sponge that lives in Australia’s Great Barrier Reef — showed that it contains the same genes that lead to the formation of synapses, the highly specialized characteristic components of the nervous system that send chemical and electrical signals between cells. Synapses are like microprocessors, said Kosik, explaining that they carry out many sophisticated functions: they send and receive signals, and they also change behaviors with interaction — a property called “plasticity.”

"Specifically, we were hoping to understand why the marine sponge, despite having almost all the genes necessary to build a neuronal synapse, does not have any neurons at all," said the paper’s first author, UCSB postdoctoral researcher Cecilia Conaco, from the UCSB Department of Molecular, Cellular, and Developmental Biology (MCDB) and Neuroscience Research Institute (NRI). "In the bigger scheme of things, we were hoping to gain an understanding of the various factors that contribute to the evolution of these complex cellular machines."

This time the scientists, including Danielle Bassett, from the Department of Physics and the Sage Center for the Study of the Mind, and Hongjun Zhou and Mary Luz Arcila, from NRI and MCDB, examined the sponge’s RNA (ribonucleic acid), a macromolecule that controls gene expression. They followed the activity of the genes that encode for the proteins in a synapse throughout the different stages of the sponge’s development.

"We found a lot of them turning on and off, as if they were doing something," said Kosik. However, compared to the same genes in other animals, which are expressed in unison, suggesting a coordinated effort to make a synapse, the ones in sponges were not coordinated.

"It was as if the synapse gene network was not wired together yet," said Kosik. The critical step in the evolution of the nervous system as we know it, he said, was not the invention of a gene that created the synapse, but the regulation of preexisting genes that were somehow coordinated to express simultaneously, a mechanism that took hold in the rest of the animal kingdom.
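The contrast Kosik describes, coordinated versus uncoordinated expression, can be pictured as the correlation between gene expression profiles across developmental stages. Below is a minimal sketch in Python, using hypothetical gene names and made-up expression values rather than the study’s actual data:

```python
import math

def pearson(x, y):
    """Pearson correlation between two expression profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical expression levels across four developmental stages:
coordinated = {"gene_a": [1, 4, 2, 8], "gene_b": [2, 8, 4, 16]}   # rise and fall together
uncoordinated = {"gene_a": [1, 4, 2, 8], "gene_b": [9, 1, 7, 2]}  # do not

print(pearson(*coordinated.values()))    # close to 1.0
print(pearson(*uncoordinated.values()))  # negative
```

Genes expressed “in unison,” as in other animals, would score near 1 with each other; the sponge’s synapse genes, turning on and off independently, would not.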

The work isn’t over, said Kosik. Plans for future research include a deeper look at some of the steps that lead to the formation of the synapse; and a study of the changes in nervous systems after they began to evolve.

"Is the human brain just a lot more of the same stuff, or has it changed in a qualitative way?" he asked.

Source: Science Daily

Jun 19, 2012 · 13 notes
#science #neuroscience #evolution #psychology #nervous system
Diabetes, poor glucose control associated with greater cognitive decline in older adults

June 18, 2012

Among well-functioning older adults without dementia, diabetes mellitus (DM) and poor glucose control among those with DM are associated with worse cognitive function and greater cognitive decline, according to a report published Online First by Archives of Neurology, a JAMA Network publication.

Findings from previous studies have suggested an association between diabetes mellitus and an increased risk of cognitive impairment and dementia, including Alzheimer disease, but this association continues to be debated, and less is known about incident DM in late life and cognitive function over time, the authors write as background in the study.

Kristine Yaffe, M.D., of the University of California, San Francisco and the San Francisco VA Medical Center, and colleagues evaluated 3,069 patients (mean age, 74.2 years; 42 percent black; 52 percent female) who completed the Modified Mini-Mental State Examination (3MS) and Digit Symbol Substitution Test (DSST) at baseline and selected intervals over 10 years.

At study baseline, 717 patients (23.4 percent) had prevalent DM and 2,352 (76.6 percent) were without DM, 159 of whom developed DM during follow-up. Patients with prevalent DM at baseline had lower 3MS and DSST scores than patients without DM, and analysis showed a similar pattern for 9-year decline, with participants with prevalent DM showing significantly greater decline on both the 3MS and DSST than those without DM.

Also, among participants with prevalent DM at baseline, higher levels of hemoglobin A1c (HbA1c) were associated with lower 3MS and DSST scores. However, after adjusting for age, sex, race and education, scores remained significantly lower for those with mid (7 percent to 8 percent) and high (greater than or equal to 8 percent) HbA1c levels on the 3MS but were no longer significant for the DSST.

"This study supports the hypothesis that older adults with DM have reduced cognitive function and that poor glycemic control may contribute to this association,” the authors conclude. “Future studies should determine if early diagnosis and treatment of DM lessen the risk of developing cognitive impairment and if maintaining optimal glucose control helps mitigate the effect of DM on cognition.”

Provided by JAMA and Archives Journals

Source: medicalxpress.com

Jun 19, 2012 · 2 notes
#science #neuroscience #brain #alzheimer
Highways of the brain: High-cost and high-capacity

June 18, 2012

A new study proposes a communication routing strategy for the brain that mimics the American highway system, with the bulk of the traffic leaving the local and feeder neural pathways to spend as much time as possible on the longer, higher-capacity passages through an influential network of hubs, the so-called rich club.


The study, published this week online in the Early Edition of the Proceedings of the National Academy of Sciences, involves researchers from Indiana University and the University Medical Center Utrecht in the Netherlands and advances their earlier findings that showed how select hubs in the brain not only are powerful in their own right but have numerous and strong connections between each other.

The current study characterizes the influential network within the rich club as the “backbone” for global brain communication. It is a costly network in terms of the energy and space it consumes, said Olaf Sporns, professor in the Department of Psychological and Brain Sciences at IU Bloomington, but one with a big payoff: providing quick and effective communication between billions and billions of brain cells.

"Until now, no one knew how central the brain’s rich club really was," Sporns said. "It turns out the rich club is always right in the middle when it comes to how brain regions talk to each other. It absorbs, transforms and disseminates information. This underscores its importance for brain communication.”

In earlier work, using diffusion imaging, the researchers found a group of 12 strongly interconnected bihemispheric hub regions, comprising the precuneus, superior frontal and superior parietal cortex, as well as the subcortical hippocampus, putamen and thalamus. Together, these regions form the brain’s “rich club.” Most of these areas are engaged in a wide range of complex behavioral and cognitive tasks, rather than more specialized processing such as vision and motor control.

For the current study, Martijn van den Heuvel, a professor at the Rudolf Magnus Institute of Neuroscience at University Medical Center Utrecht, used diffusion tensor imaging data for two sets of 40 healthy subjects to map the large-scale connectivity structure of the brain. The cortical sheet was divided into 1,170 regions, and then pathways between the regions were reconstructed and measured. As in the previous study, the rich club nodes were widely distributed and had up to 40 percent more connectivity compared to other areas.

The connections measured — almost 700,000 in total — were classified in one of three ways: as rich club connections if they connected nodes within the rich club; as feeder connections if they connected a non-rich club node to a rich club node; and as local connections if they connected non-rich club nodes. Rich club connections made up the majority of all long-distance neural pathways. The study also found that connections classified as rich club connections were used more heavily for communication than other feeder and local connections. A path analysis showed that when a minimally short path is traced from one area of the brain to another, it travels through the rich club network 69 percent of the time, even though the network accounts for only 10 percent of the brain.
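The three-way classification is simple to state: a connection is a rich-club connection if both endpoints are hubs, a feeder if exactly one is, and local otherwise. A minimal sketch (the region names here are hypothetical stand-ins; the study’s rich club comprised 12 hub regions):

```python
def classify_edge(u, v, rich_club):
    """Classify a connection between brain regions u and v.

    rich club: both endpoints are rich-club hubs
    feeder:    exactly one endpoint is a rich-club hub
    local:     neither endpoint is a rich-club hub
    """
    hubs = (u in rich_club) + (v in rich_club)
    if hubs == 2:
        return "rich club"
    elif hubs == 1:
        return "feeder"
    return "local"

# Hypothetical hub set and regions, for illustration only:
rich_club = {"precuneus_L", "precuneus_R", "thalamus_L"}
print(classify_edge("precuneus_L", "thalamus_L", rich_club))  # rich club
print(classify_edge("precuneus_L", "V1_L", rich_club))        # feeder
print(classify_edge("V1_L", "M1_L", rich_club))               # local
```

Applied to all of the study’s roughly 700,000 reconstructed connections, this kind of labeling is what lets the path analysis count how often shortest routes pass through the rich club.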

A common pattern in communication paths spanning long distances, Sporns said, was that such paths involved sequences of steps leading across local, feeder, rich club, feeder and back to local connections. In other words, he said, many communication paths first traveled toward the rich club before reaching their destinations.

"It is as if the rich club acts as an attractor for signal traffic in the brain," Sporns said. "It soaks up information which is then integrated and sent back out to the rest of the brain."

Van den Heuvel agreed.

"It’s like a big ‘neuronal magnet’ for communication and information integration in our brains," he said. "Seeking out the rich club may offer a strategy for neurons and brain regions to find short communication paths across the brain, and might provide insight into how our brain manages to be so highly efficient."

From an evolutionary standpoint, it was important for the brain to minimize energy consumption and wiring volume, but if these were the only factors, there would be no rich club because of the extra resources it requires, Sporns said. The rich club is expensive, at least in terms of wiring volume, and perhaps also in terms of metabolic cost. The trade-off for higher cost, Sporns said, is higher performance — the integration of diverse signals and the ability to select short paths across the network.

“Brain neurons don’t have maps; how do they find paths to get in touch? Perhaps the rich club helps with this, offering the brain’s neurons and regions a way to communicate efficiently based on a routing strategy that involves the rich club.”

People use related strategies to navigate social networks.

"Strangely, neurons may solve their communication problems just like the people to whom they belong," Sporns said.

Provided by Indiana University

Source: medicalxpress.com

Jun 19, 2012 · 13 notes
#science #neuroscience #brain #psychology
Coenzyme Q10 study indicates promise in Huntington's treatment

June 18, 2012

A new study shows that the compound Coenzyme Q10 (CoQ) reduces oxidative damage, a key finding that hints at its potential to slow the progression of Huntington’s disease. The discovery, which appears in the inaugural issue of the Journal of Huntington’s Disease, also points to a new biomarker that could be used to screen experimental treatments for this and other neurological disorders.

"This study supports the hypothesis that CoQ exerts antioxidant effects in patients with Huntington’s disease and therefore is a treatment that warrants further study," says University of Rochester Medical Center neurologist Kevin M. Biglan, M.D., M.P.H., lead author of the study. “As importantly, it has provided us with a new method to evaluate the efficacy of potential new treatments.”

Huntington’s disease (HD) is a genetic, progressive neurodegenerative disorder that impacts movement, behavior, and cognition, and generally results in death within 20 years of the disease’s onset. While the precise causes and mechanism of the disease are not completely understood, scientists believe that one of the important triggers of the disease is a genetic “stutter” which produces abnormal protein deposits in brain cells. It is believed that these deposits – through a chain of molecular events – inhibit the cell’s ability to meet its energy demands, resulting in oxidative stress and, ultimately, cellular death.

Scientists had previously identified the correlation between a specific DNA damage product, called 8-hydroxy-2’-deoxyguanosine (8-OHdG), and the presence of oxidative stress in brain cells. 8-OHdG can be detected in a person’s blood, meaning that it could serve as a convenient and accessible biomarker for the disease. Researchers have also been evaluating the compound Coenzyme Q10 as a possible treatment for HD because of its ability to support the function of mitochondria – the tiny power plants that provide cells with energy – and counter oxidative stress.

The study’s authors evaluated a series of blood samples from 20 individuals with HD who had previously undergone treatment with CoQ in a clinical trial titled Pre-2Care. While these studies showed that CoQ alleviated some symptoms of the disease, it was not known what impact – if any – the treatment had at the molecular level in the brain. Upon analysis, the authors found that 8-OHdG levels dropped by 20 percent in individuals who had been treated with CoQ.

CoQ is currently being evaluated in a Phase 3 clinical trial, the largest therapeutic clinical study to date for HD. The trial – called 2Care – is being run by the Huntington Study Group, an international network of investigators.

"Identifying treatments that slow the progression or delay the onset of Huntington’s disease is a major focus of the medical community," said Biglan. "This study demonstrates that 8-OHdG could be an ideal marker to identify the presence of oxidative injury and whether or not a treatment is having an impact."

Provided by University of Rochester Medical Center

Source: medicalxpress.com

Jun 18, 2012 · 11 notes
#science #neuroscience #brain #huntington #psychology
Device implanted in brain has therapeutic potential for Huntington's disease

June 18, 2012

Studies suggest that neurotrophic factors, which play a role in the development and survival of neurons, have significant therapeutic and restorative potential for neurologic diseases such as Huntington’s disease. However, clinical applications are limited because these proteins cannot easily cross the blood-brain barrier, have a short half-life, and cause serious side effects. Now, a group of scientists has successfully treated neurological symptoms in laboratory rats by implanting a device to deliver a genetically engineered neurotrophic factor directly to the brain. They report on their results in the latest issue of Restorative Neurology and Neuroscience.


The tip of the EC biodelivery system, a straw-like device that is implanted in the brain of patients, contains living cells which are genetically modified to produce a therapeutic factor. The membrane enclosing the cells allows the factor to flow out of the device and into the patient’s brain tissue. This way, areas deep within the brain affected by Huntington’s disease can be treated to delay or prevent the disease. Credit: Jens Tornøe, NsGene A/S, Ballerup, Denmark

Researchers used Encapsulated Cell (EC) biodelivery, a platform which can be applied using conventional minimally invasive neurosurgical procedures to target deep brain structures with therapeutic proteins. “Our study adds to the continually increasing body of preclinical and clinical data positioning EC biodelivery as a promising therapeutic delivery method for larger biomolecules. It combines the therapeutic advantages of gene therapy with the well-established safety of a retrievable implant,” says lead investigator Jens Tornøe, NsGene A/S, Ballerup, Denmark.

Investigators made a catheter-like device consisting of a hollow fiber membrane encapsulating a polymeric “scaffold,” which provides a surface area to which neurotrophic factor-producing cells can attach. When implanted in the brain, the membrane allows the neurotrophic factor to flow out of the device, as well as allowing nutrients in. Dr. Tornøe and his colleagues used the neurotrophic factor Meteorin, which plays a role in the development of striatal projection neurons, whose degeneration is a hallmark of Huntington’s disease. The scientists engineered ARPE-19 cells to produce Meteorin and used those that produced high levels of Meteorin in their experiment.

The EC biodelivery devices were implanted in the brains of rats followed by injection with quinolinic acid (QA), a potent neurotoxin that causes excitotoxicity, a component of Huntington’s disease. They tested three different implant types: devices filled with the high-producing ARPE-19 cells (EC-Meteorin), devices with unmodified ARPE-19 cells (ARPE-19), and devices without cells. Motor dysfunction was tested immediately prior to injection with QA and at two and four weeks after injection.

The research team found that the EC-Meteorin devices significantly protected against QA-induced toxicity. Rats with EC-Meteorin devices manifested near normal neurological performance and significantly reduced loss of brain cells from the QA injection compared to controls. Analysis of the Meteorin-treated brains showed a markedly reduced striatal lesion size. The EC biodelivery devices were found to produce stable or even increasing levels of Meteorin throughout the study. Meteorin diffused readily from the biodelivery device to the striatal tissue.

"Huntington’s disease can be diagnosed with high accuracy by genetic testing. Pre-symptomatic administration of a safe therapeutic treatment providing sustained delay or prevention of disease would be of great benefit to patients," says Dr. Tornøe. "With additional functional and safety data, tests in animals larger than the rat to study distribution, and more accurate disease models to evaluate the therapeutic potential of Meteorin, we anticipate that EC biodelivery can be developed as a platform technology for targeted therapy in patients with Huntington’s disease."

Provided by IOS Press

Source: medicalxpress.com

Jun 18, 2012 · 10 notes
#science #neuroscience #brain #psychology #huntington
MRI images show what the brain looks like when you lose self-control

June 18, 2012

New pictures from the University of Iowa show what it looks like when a person runs out of patience and loses self-control.


This image shows brain activity when people exert self-control. Credit: University of Iowa

A study by University of Iowa neuroscientist and neuro-marketing expert William Hedgcock confirms previous studies that show self-control is a finite commodity that is depleted by use. Once the pool has dried up, we’re less likely to keep our cool the next time we’re faced with a situation that requires self-control.

But Hedgcock’s study is the first to actually show it happening in the brain using fMRI images that scan people as they perform self-control tasks. The images show the anterior cingulate cortex (ACC)—the part of the brain that recognizes a situation in which self-control is needed and says, “Heads up, there are multiple responses to this situation and some might not be good”—fires with equal intensity throughout the task.

However, the dorsolateral prefrontal cortex (DLPFC)—the part of the brain that manages self-control and says, “I really want to do the dumb thing, but I should overcome that impulse and do the smart thing”—fires with less intensity after prior exertion of self-control.


This image shows brain activity after people have been engaged in self-control tasks long enough that self-control resources have been depleted. Credit: University of Iowa

He said that loss of activity in the DLPFC might be the person’s self-control draining away. The stable activity in the ACC suggests people have no problem recognizing a temptation. Although they keep fighting, they have a harder and harder time not giving in.

That would explain why someone who works very hard not to take seconds of lasagna at dinner winds up taking two pieces of cake at dessert. The study could also modify previous thinking that considered self-control to be like a muscle. Hedgcock says his images suggest it is more like a pool that can be drained by use and then replenished over time in a lower-conflict environment, away from temptations that require its use.

The researchers gathered their images by placing subjects in an MRI scanner and having them perform two self-control tasks — the first involved ignoring words that flashed on a computer screen, while the second involved choosing preferred options. The study found the subjects had a harder time exerting self-control on the second task, a phenomenon called “regulatory depletion.” Hedgcock says that the subjects’ DLPFCs were less active during the second self-control task, suggesting it was harder for the subjects to overcome their initial response.

Hedgcock says the study is an important step in trying to determine a clearer definition of self-control and to figure out why people do things they know aren’t good for them. One possible implication is crafting better programs to help people who are trying to break addictions to things like food, shopping, drugs, or alcohol. Some therapies now help people break addictions by focusing at the conflict recognition stage and encouraging the person to avoid situations where that conflict arises. For instance, an alcoholic should stay away from places where alcohol is served.

But Hedgcock says his study suggests new therapies might be designed by focusing on the implementation stage instead. For instance, he says dieters sometimes offer to pay a friend if they fail to implement control by eating too much food, or the wrong kind of food. That penalty adds a real consequence to their failure to implement control and increases their odds of choosing a healthier alternative.

The study might also help people who suffer a loss of self-control due to a birth defect or brain injury.

"If we know why people are losing self-control, it helps us design better interventions to help them maintain control," says Hedgcock, an assistant professor in the Tippie College of Business marketing department and the UI Graduate College’s Interdisciplinary Graduate Program in Neuroscience.

Provided by University of Iowa

Source: medicalxpress.com

Jun 18, 2012 · 48 notes
#science #neuroscience #brain #psychology
The neurological basis for fear and memory

June 18, 2012

Fear conditioning using sound, together with taste aversion, as applied to mice, has revealed interesting information about the basis of memory allocation.


The European project ‘Cellular mechanisms underlying formation of the fear memory trace in the mouse amygdala’ (FEAR Memory TRACE) is investigating memory allocation and the recruitment of certain neurons to encode a memory. By studying conditioned fear memory in response to an auditory stimulus, the researchers have delved into pathological emotional states and the neural mechanisms involved in memory allocation, retrieval and extinction.

Prior research has revealed that the conditioned fear response in mice is located in a specific bundle of neurons in the amygdala. Memory allocation modulation is due to expression of the transcription factor cyclic adenosine 3’,5’-monophosphate response element binding protein (CREB) and possibly to neuronal excitability.

FEAR Memory TRACE focused on the electrophysiological properties of neurons encoding the same memory. The project also aimed to ascertain the biophysical mechanisms in the plasticity changes recorded in the specific set of neurons in the fear memory trace.

To record information on auditory fear conditioning and conditioned taste aversion, the scientists performed intra-amygdala surgery with viral vectors and ran electrophysiological experiments to measure neuronal excitability.

For neural control experiments, neurons were virally transfected with CREB tagged with green fluorescent protein together with the gene for channelrhodopsin-2; combined, these two elements allowed firing to be triggered in specific nerve cells. Molecular techniques included Western blot for protein detection, genotyping and viral DNA preparation.

Behavioural tests on long- and short-term memory of mice involving fear conditioning and taste aversion showed increased memory performance at the three-hour point rather than the five-hour point. The intrinsic excitability of the mice receiving both shock and the tone was increased at three hours, not five, compared to mice that only received the tone.

As the project continues to its close in two years, the aim is to identify biophysical mechanisms involved in recruiting neurons that compete with each other for a specific memory. FEAR Memory TRACE will also develop computational models to assess the role of these mechanisms in memory performance.

Information on biochemical processes in neural mechanisms has wide application in many clinical situations including patients suffering memory loss, such as stroke victims. Fear response manipulation can be applied in treatment of neuroses and phobias.

Provided by CORDIS

Source: medicalxpress.com

Jun 18, 2012 · 38 notes
#science #neuroscience #brain #psychology #memory #emotion
Manipulation of a specific neural circuit buried in complicated brain networks in primates

June 17, 2012

A collaborative research team led by Professor Tadashi ISA from The National Institute for Physiological Sciences, The National Institutes of Natural Sciences, together with Fukushima Medical University and Kyoto University, developed a “double viral vector transfection technique” which can deliver genes to a specific neural circuit by combining two new kinds of gene transfer vectors. With this method, they found that “indirect pathways”, which were suspected to have been left behind when the direct connection from the brain to the motor neurons that control muscles was established in the course of evolution, actually play an important role in highly developed dexterous hand movements. This study was supported by the Strategic Research Program for Brain Sciences of the MEXT of Japan. This research result will be published in Nature (June 17th, advance online publication).

It is said that the higher primates, including human beings, achieved explosive evolution by acquiring the ability to move their hands skillfully. This ability to move individual fingers has been thought to result from the evolution of the direct connection from the cerebrocortical motor area to the motor neurons of the spinal cord, which control the muscles. On the other hand, in lower animals with clumsy hands, such as cats or rats, the cortical motor area is connected to the motor neurons only through interneurons of the spinal cord. Such an “indirect pathway” remains in us primates, without our fully understanding its functions. Is this “phylogenetically old circuit” still in operation? Or is it suppressed because it would be obstructive? This debate had remained unresolved.

The collaborative research team led by Professor Tadashi ISA and Project Assistant Professor Masaharu KINOSHITA from The National Institute for Physiological Sciences, The National Institutes of Natural Sciences, together with Fukushima Medical University and Kyoto University, developed “the double viral vector transfection technique”, which can deliver genes to a specific neural circuit by combining two new kinds of gene transfer vectors.

With this method, they succeeded in the selective and reversible suppression of the propriospinal neurons (spinal interneurons mediating the indirect connection from the cortical motor area to spinal motor neurons).

The results revealed that “indirect pathways” play an important role in dexterous hand movements and finally a longtime debate has come to a close.

The key component of this discovery was “the double viral vector transfection technique”, in which one vector is retrogradely transported from the terminal zone back to the neuronal cell bodies, and the other is transfected at the location of those cell bodies. The expression of the target gene is regulated only in cells doubly transfected by the two vectors. Using this technique, they succeeded in suppressing the propriospinal neurons selectively and reversibly.

Such an operation was possible in mice, in which heritable genetic manipulation of germline cells is possible, but until now it had been impossible in primates.

Using this method, further development of gene therapy targeted to a specific neural circuit can be expected.

Professor Tadashi ISA says, “This newly developed double viral vector transfection technique can be applied to gene therapy of the human central nervous system, as humans are also higher primates.

“And this is a discovery that reverses the general idea that the spinal cord is only a reflex pathway, showing that it also plays a pivotal role in integrating the complex neural signals which enable dexterous movements.”

Provided by National Institute for Physiological Sciences

Source: medicalxpress.com

Jun 17, 2012 · 9 notes
#science #neuroscience #brain #psychology #neuron
Freud's Theory of Unconscious Conflict Linked to Anxiety Symptoms

ScienceDaily (June 16, 2012) — A link between unconscious conflicts and conscious anxiety disorder symptoms has been shown, lending empirical support to psychoanalysis.


Data from the experiment showing that subliminal exposure to words related to a person’s unconscious conflict, followed by supraliminal exposure to words related to their anxiety symptoms, led to different alpha wave patterns compared with other scenarios. (Credit: Image courtesy of University of Michigan Health System)

An experiment that Sigmund Freud could never have imagined 100 years ago may help lend scientific support for one of his key theories, and help connect it with current neuroscience.

On June 16 at the 101st Annual Meeting of the American Psychoanalytic Association, a University of Michigan professor who has spent decades applying scientific methods to the study of psychoanalysis will present new data supporting a causal link between the psychoanalytic concept known as unconscious conflict and the conscious symptoms experienced by people with anxiety disorders such as phobias.

Howard Shevrin, Ph.D., emeritus professor of psychology in the U-M Medical School’s Department of Psychiatry, will present data from experiments performed in U-M’s Ormond and Hazel Hunt Laboratory.

The research involved 11 people with anxiety disorders who each received a series of psychoanalytically oriented diagnostic sessions conducted by a psychoanalyst.

From these interviews the psychoanalysts inferred what underlying unconscious conflict might be causing the person’s anxiety disorder. Words capturing the nature of the unconscious conflict were then selected from the interviews and used as stimuli in the laboratory. They also selected words related to each patient’s experience of anxiety disorder symptoms. Although these words differed from patient to patient, results showed that they functioned in the same way.

These verbal stimuli were presented subliminally, at one thousandth of a second, and supraliminally, at 30 milliseconds. A control category of stimuli was added that had no relationship to the unconscious conflict or the anxiety symptoms. While the stimuli were presented to the patients, scalp electrodes recorded the brain responses to them.

In a previous experiment Shevrin had demonstrated that time-frequency features, a type of brain activity, showed that patients grouped the unconscious conflict stimuli together only when they were presented subliminally. But the conscious symptom-related stimuli showed the reverse pattern — brain activity was better grouped together when patients viewed those words supraliminally.

"Only when the unconscious conflict words were presented unconsciously could the brain see them as connected," Shevrin notes. "What the analysts put together from the interview session made sense to the brain only unconsciously."

However, the design of this first experiment did not allow for directly comparing the effect of the unconscious conflict stimuli on the conscious symptom stimuli.

To obtain evidence for that next level, the unconscious conflict stimuli were presented immediately before the conscious symptom stimuli, and a new measurement was made of the brain’s alpha wave activity (8-13 cycles per second), which had previously been shown to inhibit various cognitive functions.

Highly significant correlations, suggesting an inhibitory effect, were obtained when the amount of alpha generated by the unconscious conflict stimuli was correlated with the amount of alpha associated with the conscious symptom stimuli, but only when the unconscious conflict stimuli were presented subliminally. No such correlations were obtained when control stimuli replaced the symptom words. That the findings reflect inhibition suggests, from a psychoanalytic standpoint, that repression might be involved.
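For readers curious about the mechanics, a correlation of alpha-band (8-13 Hz) power between two stimulus conditions can be sketched in a few lines. The study's actual EEG pipeline is not described in the article, so everything below (the plain DFT band-power estimate, the sampling rate, the synthetic signals, the function names) is an illustrative assumption, not the researchers' method.

```python
import math

def alpha_power(signal, fs, f_lo=8.0, f_hi=13.0):
    """Power in the 8-13 Hz alpha band via a plain discrete Fourier transform.
    Adequate for illustration; real EEG pipelines use windowed FFT estimators."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        if f_lo <= k * fs / n <= f_hi:
            re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(signal))
            im = sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(signal))
            power += (re * re + im * im) / n
    return power

def pearson(xs, ys):
    """Pearson correlation coefficient across trials or subjects."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Synthetic sanity check: a 10 Hz oscillation carries alpha power,
# a 40 Hz oscillation does not.
fs = 100.0
t = [i / fs for i in range(100)]
alpha_sig = [math.sin(2 * math.pi * 10.0 * ti) for ti in t]
gamma_sig = [math.sin(2 * math.pi * 40.0 * ti) for ti in t]
```

A real analysis would compute alpha power per epoch for the conflict-evoked and symptom-evoked responses and correlate those values across trials or subjects; the toy signals here only confirm that the band-power estimate isolates 8-13 Hz activity.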

"These results create a compelling case that unconscious conflicts cause or contribute to the anxiety symptoms the patient is experiencing," says Shevrin, who also holds an emeritus position in the Department of Psychology in U-M’s College of Literature, Science and the Arts. "These findings and the interdisciplinary methods used — which draw on psychoanalysis, cognitive psychology, and neuroscience — demonstrate that it is possible to develop an interdisciplinary science drawing upon psychoanalytic theory."

He notes that a prominent critic of psychoanalysis and Freudian theory, Adolf Grunbaum, Ph.D., professor of the philosophy of science at the University of Pittsburgh, has expressed satisfaction that the new results, when added to previous evidence, show that fundamental psychoanalytic concepts can indeed be tested in empirical ways.

For more than 40 years, Shevrin has led a team that has pushed at the boundaries between the disciplines of neuroscience, cognitive psychology, and psychoanalysis, looking for evidence that Freudian concepts such as the unconscious and repression could be documented through physical measures of brain activity. His work has explored the territory where neurobiology, thoughts, emotions and behavior meet.

In 1968 he published the first report of brain responses to unconscious visual stimuli in Science, thus providing strong objective evidence for the existence of the unconscious at a time when most scientists were skeptical of Freud’s ideas. In that same study, he showed that unconscious perceptions are processed in different ways from conscious perceptions, a finding consistent with Freud’s views on how the unconscious works.

In recent years, exchanges between Grunbaum and Shevrin explored the nature of the evidence for the existence and impact of unconscious conflicts. In a 1992 publication discussing the first study referred to above, Grunbaum agreed that Shevrin had obtained objective, brain-based evidence for the existence of unconscious conflict, but noted that Shevrin had not shown that these conflicts caused psychiatric symptoms. His response to being informed of the new findings was an email stating: “I am satisfied.”

Source: Science Daily

Jun 17, 2012 · 29 notes
#science #neuroscience #psychology #anxiety #unconscious #brain
Neuroscience: The mind reader

Adrian Owen has found a way to use brain scans to communicate with people previously written off as unreachable. Now, he is fighting to take his methods to the clinic.


Adrian Owen still gets animated when he talks about patient 23. The patient was only 24 years old when his life was devastated by a car accident. Alive but unresponsive, he had been languishing in what neurologists refer to as a vegetative state for five years, when Owen, a neuroscientist then at the University of Cambridge, UK, and his colleagues at the University of Liège in Belgium, put him into a functional magnetic resonance imaging (fMRI) machine and started asking him questions.

Incredibly, he provided answers. A change in blood flow to certain parts of the man’s injured brain convinced Owen that patient 23 was conscious and able to communicate. It was the first time that anyone had exchanged information with someone in a vegetative state.

Patients in these states have emerged from a coma and seem awake. Some parts of their brains function, and they may be able to grind their teeth, grimace or make random eye movements. They also have sleep–wake cycles. But they show no awareness of their surroundings, and doctors have assumed that the parts of the brain needed for cognition, perception, memory and intention are fundamentally damaged. They are usually written off as lost.


Jun 16, 2012 · 45 notes
#science #neuroscience #brain #psychology
More to Facial Perception Than Meets the Eye

ScienceDaily (June 15, 2012) — People looking at a face make complex judgements about its owner based on a range of factors beyond simply race and gender, according to the findings of new research funded by the Economic and Social Research Council (ESRC).

The findings question a long-held belief that people immediately put a person they meet into a limited number of social categories such as: female or male; Asian, Black, Latino or White; and young or old.

Dr Kimberly Quinn at the University of Birmingham found that people ‘see’ faces in multiple ways. This could have wider importance for understanding stereotyping and discrimination because it has implications for whether and how people categorise others.

Categorisation is not done purely on the physical features of the face in front of us, but depends on other information as well, including whether the person is already known and whether the person is believed to share other important identities with us.

"How we perceive faces is not just a reflection of what’s in those faces," Dr Quinn said. "We are not objective; we bring our current goals and past knowledge to every new encounter. And this happens really quickly — within a couple of hundred milliseconds of seeing the face."

Dr Quinn and her colleagues explored social categories such as sex, race and age; physical attributes such as attractiveness; personality traits such as trustworthiness; and emotional states such as anger, sadness and happiness.

She found that although social categories are used to gather information from faces, these categories can easily be undermined. The research found that we reject simple stereotypes when something about the situation alerts us that the stereotype does not tell the whole story. If we take, for example, a racial group stereotyped as unintelligent, seeing a member of that group playing an intellectual game such as chess would prompt us to cancel the stereotype.

In order to investigate the causes, mechanisms, and results of social categorisation, Dr Quinn used techniques from cognitive psychology and neuroscience to investigate how people process faces. The research was designed to provide insight into when and why people categorise others according to social group membership.

Their findings differ from previous research that adopted a ‘dual process’ approach and assumed people initially categorised faces based on factors such as gender, race or age before determining whether to stereotype them or to see them as unique individuals.

Dr Quinn’s findings were more consistent with a single process that initially focuses on ‘coarse’ information that is easy to detect, and then immediately starts to include more fine-grained processing as time elapses. This model allows for either categorisation or more individuated processing to emerge, and does not assume that categorisation always comes before recognising unique identities — thereby allowing for more diverse outcomes than previously thought.

Further information: http://www.esrc.ac.uk/my-esrc/grants/RES-061-23-0130/read

Source: Science Daily

Jun 16, 2012 · 19 notes
#science #neuroscience #brain #psychology
Genetic Markers Offer Hope for New Brain Tumor Treatments

ScienceDaily (June 15, 2012) — Researchers at The University of Nottingham have identified three sets of genetic markers that could potentially pave the way for new diagnostic tools for a deadly type of brain tumour that mainly targets children.

The study, published in the latest edition of the journal Lancet Oncology, was led by Professor Richard Grundy at the University’s Children’s Brain Tumour Research Centre and Dr Suzanne Miller, a postdoctoral research fellow in the Centre.

It focuses on a rare and aggressive cancer: central nervous system primitive neuro-ectodermal brain tumours (CNS PNET). Patients with CNS PNET have a very poor prognosis, and current treatments, including high-dose chemotherapy and cranio-spinal radiotherapy, are relatively unsuccessful and carry severe lifelong side-effects. This is particularly the case in very young children.

Despite the need for new and more effective treatments, little research has been done to examine the underlying causes of CNS PNET, partly due to their rarity. The Nottingham study aimed to identify molecular markers as a first step to improving the treatments and therapies available to fight the cancer.

The Nottingham team collaborated with researchers at the Hospital for Sick Kids in Toronto, Canada, to perform an international study, collecting 142 CNS PNET samples from 20 institutions in nine countries.

Professor Richard Grundy said: “Following our earlier research we realised that an international effort was needed to bring sufficient numbers of cases together to make the breakthrough needed to better understand this disease, or indeed the diseases identified in our study. The next step is to translate this knowledge into improving treatments.”

By studying the genetics of the tumours, they discovered that instead of one cancer, the tumours have three sub-types featuring distinct genetic abnormalities and leading to different outcomes for patients.

They found that each group had its own genetic signature, revealed through subtle differences in the way the tumours expressed two genetic markers, LIN28 and OLIG2.

When compared with clinical factors including age, survival and metastases (the spread of the tumours through the body), they discovered that group 1 tumours (primitive neural) were found most often in the youngest patients and had the poorest survival rates. Patients with group 3 tumours had the highest incidence of metastases at diagnosis.

Ultimately, the research has identified the two genetic markers LIN28 and OLIG2 as a promising basis for more effective tools for diagnosing and predicting outcomes for young patients with these types of brain tumours.

The research was funded by the Canadian Institute of Health Research, the Brainchild/Sick Kids Foundation and the Samantha Dickson Brain Tumour Trust.

Chief Executive of Samantha Dickson Brain Tumour Trust, Sarah Lindsell, said: “As the UK’s leading brain tumour charity, and the largest dedicated funder of brain tumour research, we are delighted that our investment has led to such significant success. It is great to see that understanding of these tumours is improving — this is desperately needed given the poor outcomes for children with this tumour. Samantha Dickson Brain Tumour Trust is proud to have been instrumental in this work.”

Source: Science Daily

Jun 16, 2012 · 1 note
#science #neuroscience #brain #psychology
Vitamin D With Calcium Shown to Reduce Mortality in Elderly

ScienceDaily (June 15, 2012) — A study recently published in the Endocrine Society’s Journal of Clinical Endocrinology and Metabolism (JCEM) suggests that vitamin D — when taken with calcium — can reduce the rate of mortality in seniors, thereby offering a possible means of increasing life expectancy.

During the last decade, there has been increasing recognition of the potential health effects of vitamin D. It is well known that calcium with vitamin D supplementation reduces the risk of fractures. The present study assessed mortality among patients randomized to either vitamin D alone or vitamin D with calcium. It found that the reduced mortality was not due to a lower number of fractures; rather, it represents a beneficial effect beyond the reduced fracture risk.

"This is the largest study ever performed on effects of calcium and vitamin D on mortality," said Lars Rejnmark, PhD, of Aarhus University Hospital in Denmark and lead author of the study. "Our results showed reduced mortality in elderly patients using vitamin D supplements in combination with calcium, but these results were not found in patients on vitamin D alone."

In this study, researchers used pooled data from eight randomized controlled trials, each with more than 1,000 participants. The patient data set was composed of nearly 90 percent women, with a median age of 70 years. During the three-year study, mortality was reduced by 9 percent in those treated with vitamin D with calcium.

"Some studies have suggested calcium (with or without vitamin D) supplements can have adverse effects on cardiovascular health," said Rejnmark. "Although our study does not rule out such effects, we found that calcium with vitamin D supplementation to elderly participants is overall not harmful to survival, and may have beneficial effects on general health."

Source: Science Daily

Jun 16, 2012 · 7 notes
#science #neuroscience #psychology
BPA Exposure Effects May Last for Generations

ScienceDaily (June 15, 2012) — Exposure to low doses of Bisphenol A (BPA) during gestation had immediate and long-lasting, trans-generational effects on the brain and social behaviors in mice, according to a recent study accepted for publication in the journal Endocrinology, a publication of The Endocrine Society.

BPA is a human-made chemical present in a variety of products including food containers, receipt paper and dental sealants and is now widely detected in human urine and blood. Public health concerns have been fueled by findings that BPA exposure can influence brain development. In mice, prenatal exposure to BPA is associated with increased anxiety, aggression and cognitive impairments.

"We have demonstrated for the first time to our knowledge that BPA has trans-generational actions on social behavior and neural expression," said Emilie Rissman, PhD, of the University of Virginia School of Medicine and lead author of the study. "Since exposure to BPA changes social interactions in mice at a dose within the reported human levels, it is possible that this compound has trans-generational actions on human behavior. If we banned BPA tomorrow, pulled all products with BPA in them, and cleaned up all landfills tomorrow it is possible, if the mice data generalize to humans, that we will still have effects of this compound for many generations."

In this study, female mice received chow with or without BPA before mating and throughout gestation. Plasma levels of BPA in supplemented female mice were in a range similar to those measured in humans. Juveniles in the first generation exposed to BPA in utero displayed fewer social interactions as compared with control mice. The changes in genes were most dramatic in the first generation (the offspring of the mice that were exposed to BPA in utero), but some of these gene changes persisted into the fourth generation.

"BPA is a ubiquitous chemical, it is in the air, water, our food, and our bodies," said Rissman. "It is a man-made chemical, and is not naturally occurring in any plant or animal. The fact that it can change gene expression in mice, and that these changes are heritable, is cause for us to be concerned about what this may mean for human health."

Source: Science Daily

Jun 16, 2012 · 14 notes
#science #neuroscience
Musical brain patterns could help predict epileptic seizures

June 15, 2012

The research, led by Newcastle University’s Dr Mark Cunningham and Professor Miles Whittington and supported by the Dr Hadwen Trust for Humane Research, indicates a novel electrical biomarker in humans.

The brain produces electrical rhythms, and using EEG (electrodes on the scalp) researchers were able to monitor the brain patterns of patients with epilepsy. In both patients and brain tissue samples, the team observed an abnormal brain wave, noticeable for its rapidly increasing frequency over time.

Comparing this activity to a musical ‘glissando’, an upward glide from one pitch to another, the team found that the brain rhythm is unique to humans, and they believe it could be related to epilepsy.

Dr Cunningham, senior lecturer in Neuronal Dynamics at Newcastle University, said: “We were able to examine EEG collected from patients with drug-resistant epilepsy who were continually monitored over a two-week period. During that time we noticed patterns of electrical activity with rapidly increasing frequency, just like glissandi, emerging in the lead-up to an epileptic seizure.”

"We are in the early days of the work and we want to investigate this in a larger group of patients but it may offer a promising insight into when a seizure is going to start."

Professor Whittington added: “Classical composers such as Gustav Mahler are famous for using notes of rapidly increasing pitch – called glissando – to convey intense expressions of anticipation. Similarly, we identified glissando-like patterns of brain electrical activity generated in anticipation of seizures in patients with epilepsy.”
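To make the "rising pitch" idea concrete, here is a rough sketch of how a glissando-like EEG segment could be flagged: estimate the dominant frequency in successive short windows and check that it climbs steadily. The window sizes, thresholds, function names, and synthetic signals are all hypothetical illustrations; the Newcastle team's actual detection method is not described in this article.

```python
import math

def dominant_freq(window, fs):
    """Frequency (Hz) of the strongest bin in a brute-force DFT of the window."""
    n = len(window)
    best_k, best_p = 1, -1.0
    for k in range(1, n // 2):
        re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(window))
        im = sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(window))
        p = re * re + im * im
        if p > best_p:
            best_k, best_p = k, p
    return best_k * fs / n

def looks_like_glissando(signal, fs, win, hop, min_rise=10.0):
    """True if the dominant frequency climbs across successive windows
    (allowing one frequency bin, fs/win, of jitter) and rises by at
    least min_rise Hz overall."""
    freqs = [dominant_freq(signal[s:s + win], fs)
             for s in range(0, len(signal) - win + 1, hop)]
    steady = all(b >= a - fs / win for a, b in zip(freqs, freqs[1:]))
    return steady and freqs[-1] - freqs[0] >= min_rise

# Synthetic check: a 2-second chirp sweeping 10 -> 50 Hz versus a steady 12 Hz tone.
fs = 200.0
t = [i / fs for i in range(400)]
chirp = [math.sin(2 * math.pi * (10.0 * ti + 10.0 * ti * ti)) for ti in t]
tone = [math.sin(2 * math.pi * 12.0 * ti) for ti in t]
```

Real EEG would of course require artifact rejection, channel selection and proper spectral estimation; the point is only the signature itself, a dominant frequency that climbs monotonically in the lead-up to a seizure.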

The team recorded electrical activity taken from patients in Newcastle and Glasgow with the help of collaborators Dr Roderick Duncan and Dr Aline Russell, and worked in collaboration with the Epilepsy Surgery Group at Newcastle General Hospital, part of the Newcastle Hospitals NHS Foundation Trust.

Having received permission from patients to use brain tissue removed during an operation to cure their seizures, the team were able to observe and study in great detail glissando discharges in slices of this human epileptic tissue maintained in the lab.

Publishing in Epilepsia online, the team discovered that glissandi are highly indicative of pathology associated with human epilepsy and, unlike other forms of epileptic activity studied previously, are extremely difficult to reproduce in normal, non-epileptic brain tissue.

The team worked with Professor Roger Traub at the IBM Watson Research Centre in New York to provide predictions using highly detailed computational models. By manipulating the chemical conditions surrounding human epileptic brain tissue according to these predictions, they discovered that glissandi did not require any of the conventional chemical connections between nerve cells thought to underlie most brain functions. Instead, glissandi were generated by a combination of large changes in the pH of the tissue, specific electrical properties of certain types of nerve cell and, most importantly, direct electrical connections between these nerve cells.

"This work also suggests, given the lengths one has to go to in order to reproduce this experimentally in rodents, that glissandi may be a unique feature of the human epileptic brain," explains Dr Cunningham.

Dr Kailah Eglington, Chief Executive of the Dr Hadwen Trust, said: “Of all human brain disorders, epilepsy research ranks as one that currently employs substantial numbers of laboratory animals worldwide.

"Dr Cunningham’s work at Newcastle University aims to address the shortcomings of existing animal-based research by removing animals from the equation and addressing the issue directly in humans."

Provided by Newcastle University

Source: medicalxpress.com

Jun 16, 2012 · 12 notes
#science #neuroscience #brain #psychology #seizures
Active ingredient of cannabis has no effect on the progression of multiple sclerosis

June 15, 2012

The first large non-commercial study to investigate whether the main active constituent of cannabis (tetrahydrocannabinol, or THC) is effective in slowing the course of progressive multiple sclerosis (MS) has found no evidence that it is, although benefits were noted for those at the lower end of the disability scale.

The CUPID (Cannabinoid Use in Progressive Inflammatory brain Disease) study was carried out by researchers from the Peninsula College of Medicine and Dentistry (PCMD), Plymouth University. The study was funded by the Medical Research Council (MRC) and managed by the National Institute for Health Research (NIHR) on behalf of the MRC-NIHR partnership, the Multiple Sclerosis Society and the Multiple Sclerosis Trust.

The preliminary results of CUPID are to be presented by lead researcher Professor John Zajicek at the Association of British Neurologists’ Annual Meeting in Brighton on Tuesday 29th May.

CUPID enrolled nearly 500 people with MS from 27 centres around the UK, and has taken eight years to complete. People with progressive MS were randomised to receive either THC capsules or identical placebo capsules for three years, and were carefully followed to see how their MS changed over this period. The two main outcomes of the trial were a disability scale administered by neurologists (the Expanded Disability Status Scale), and a patient report scale of the impact of MS on people with the condition (the Multiple Sclerosis Impact Scale 29).

Overall the study found no evidence to support an effect of THC on MS progression in either of the main outcomes. However, there was some evidence to suggest a beneficial effect in participants who were at the lower end of the disability scale at the time of enrolment; as the benefit was found only in a small subgroup rather than the whole population, further studies will be needed to assess the robustness of this finding. Another finding was that MS in the study population as a whole progressed more slowly than expected, which makes it harder to detect a treatment effect when the aim of the treatment is to slow progression.

As well as evaluating the potential neuroprotective effects and safety of THC over the long-term, one of the aims of the CUPID study was to improve the way that clinical trial research is done by exploring newer methods of measuring MS and using the latest statistical methods to make the most of every piece of information collected. This analysis will continue for several months. The CUPID study will therefore provide important information about conducting further large scale clinical trials in MS.

Professor John Zajicek, Professor of Clinical Neuroscience at PCMD, Plymouth University, said: “To put this study into context: current treatments for MS are limited, either being targeted at the immune system in the early stages of the disease or aimed at easing specific symptoms such as muscle spasms, fatigue or bladder problems. At present there is no treatment available to slow MS when it becomes progressive. Progression of MS is thought to be due to death of nerve cells, and researchers around the world are desperately searching for treatments that may be ‘neuroprotective’. Laboratory experiments have suggested that certain cannabis derivatives may be neuroprotective.”

He added: “Overall our research has not supported the laboratory-based findings. It has shown that, although there is a suggestion of benefit for those at the lower end of the disability scale when they joined CUPID, there is little evidence to suggest that THC has a long-term impact on slowing progressive MS.”

Dr Doug Brown, Head of Biomedical Research at the MS Society, said: “There are currently no treatments for people with progressive MS to slow or stop the worsening of disability. The MS Society is committed to supporting research in this area and this was an important study for us to fund. While this study sadly suggests THC is ineffective at slowing the course of progressive MS, we will not stop our search for effective treatments. We are encouraged by the possibility shown by this study that THC may have potential benefits for some people with MS and we welcome further investigation in this area.”

Provided by The Peninsula College of Medicine and Dentistry

Source: medicalxpress.com

Jun 16, 2012 · 13 notes
#science #neuroscience #psychology #MS #brain
The risk of carrying a cup of coffee

June 15, 2012 · By Angela Herring

Object manipulation or tool use is almost a uniquely human trait, said Dagmar Sternad, director of Northeastern’s Action Lab, a research group interested in movement coordination. “Not only does it require certain cognitive abilities but also distinct motor abilities.”


Professor Dagmar Sternad and postdoctoral researcher C.J. Hasson show that we subconsciously adjust our “safety margin” when we move a dynamic object like a cup of coffee based on the amount of variability in the situation. Credit: John Guillemin

Simply moving one’s own body, for instance by directing a hand toward a coffee cup, requires the organization of various physiological systems including the central and peripheral nervous systems and the musculoskeletal system.

Once the hand grasps and picks up the cup, the questions become even more complicated. What if the cup is filled with liquid? At this point, the complexity of the control problem balloons — the presence of the liquid introduces nonlinear fluid dynamics, with the risk of a spill because of the inherent variability in one’s movement.

Sternad, a professor of biology, electrical and computer engineering, and physics, and postdoctoral researcher C.J. Hasson are interested in how we adapt our movement strategies when interacting with dynamic objects in the environment.

In a recent paper published in the Journal of Neurophysiology, Hasson and Sternad explored the question by looking at the everyday task of manipulating a cup of coffee. They show that how we adapt our movement strategies is directly related to the amount of variability and reliability in our surroundings and ourselves.

“Because we’re humans and not machines, we’re noisy and variable,” said Hasson. “We can’t expect that a movement will unfold exactly as we planned it.”

For the study, 18 healthy participants visited the Action Lab to play a video game, wherein they attempted to move a virtual cup filled with virtual liquid across a large video screen. Instead of a normal video-game controller, subjects moved the virtual cup by grasping a manipulandum — a large robotic arm. Similar to the real-life scenario, the robot simulated the forces one would feel from the weight of the object and the sloshing of the liquid in the cup.

They asked participants to move the cup across the screen within a comfortable time of two seconds, a task for which there is an infinite number of possibilities: you could move fast for one second and slow for one second, or slow for a half second and then fast for one and a half seconds. The team hypothesized that participants would naturally adopt a safe movement strategy with practice — and they did.

But the most intriguing result, said Hasson, was that the size of each participant’s safety margin — how close they let the liquid get to the edge of the cup — could be predicted by how variable they were in their movements. Those with more variability tended to adopt a “safer” strategy with a larger safety margin.

“If you have a large safety margin and I move with a small margin, the question is, ‘Why am I more risky than you?’” Hasson said. “Well, you may find that I am much more consistent in my movements, so I don’t need a big safety margin. If you’re more variable, you need a larger safety margin.”
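The core statistical claim, that a participant's movement variability predicts the safety margin they settle on, amounts to a correlation across participants. The sketch below simulates that relationship with made-up numbers (18 "participants", matching the study's sample size, but with invented variability and margin values) purely to show the shape of such an analysis; it is not the authors' data or code.

```python
import math
import random

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(1)  # reproducible fake data

# Invented per-participant numbers: trial-to-trial variability of the peak
# liquid excursion, and the safety margin each participant settled on.
variability = [random.uniform(0.02, 0.20) for _ in range(18)]
# Simulate the reported pattern: noisier movers choose larger margins.
margin = [2.5 * v + random.gauss(0.0, 0.02) for v in variability]

r = pearson(variability, margin)  # strongly positive for these fake data
```

With real data the same correlation, computed from each participant's measured variability and chosen margin, would quantify how well variability predicts the margin.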

The results have implications for assessing elderly patients and patients with motor disorders such as cerebral palsy. “If variability determines the movements that you do, maybe that’s an intervention point,” said Sternad.

Provided by Northeastern University

Source: medicalxpress.com

Jun 16, 2012 · 9 notes
#science #neuroscience #brain #psychology
Improved repair of damage to the peripheral nervous system

June 15, 2012

Researchers from the Peninsula College of Medicine and Dentistry, University of Exeter, in collaboration with colleagues from Rutgers University, Newark and University College London, have furthered understanding of the mechanism by which the cells that insulate the nerve cells in the peripheral nervous system, Schwann cells, protect and repair damage caused by trauma and disease.

The findings of the study, published on-line by the Journal of Neuroscience and supported by the Wellcome Trust, are exciting in that they point to future therapies for the repair and improvement of damage to the peripheral nervous system.

The peripheral nervous system is the part of the nervous system outside the brain and the spinal cord. It regulates almost every aspect of our bodily function, carrying sensory information that allows us to feel the sun on our face and motor information that allows us to move. It also controls the functions of all the organs of the body.

Damage can occur through trauma; it can also occur in diabetic neuropathy (suffered by almost half of those with diabetes) and in patients with common inherited conditions such as Charcot-Marie-Tooth (CMT) disease. There can be a wide range of symptoms, from loss of sensation in the hands and feet to problems with digestion, blood pressure regulation, sexual function and bladder control.

Schwann cells provide the insulation, or myelin sheath, for the nerve cells that carry electrical impulses to and from the spinal cord. Because of their plasticity, Schwann cells are able to revert to an immature ‘repair’ state to mend damage to the peripheral nervous system. The level of repair is remarkably good, but incomplete repair, perhaps after the severance of a nerve, may lead to long-term loss of function and pain.

The ability of Schwann cells to demyelinate can make them susceptible to the disease process seen in conditions such as CMT. CMT affects one in 2,500 people, making it a comparatively common inherited disease of the nervous system. Mutations in the many different genes implicated in CMT can cause cycles of repair and re-insulation (re-myelination) which lead to long-term damage and the death of both Schwann cells and nerve cells. There is currently no therapy for CMT, and patients experience increasing sensory and motor problems that may lead to permanent disability.

The research team believes that its work to understand the ability of Schwann cells to revert to an immature state and stimulate repair will lead to therapies that improve recovery from severe trauma and break the cycle of damage caused by CMT. They also believe there may be potential to improve repair in cases of diabetic neuropathy.

They have identified a DNA binding protein, cJun, as a key player in the plasticity that allows a Schwann cell to revert back to the active repair state. cJun may be activated by a number of pathways that convey signals from the surface of the Schwann cell to the nucleus. One such pathway, the p38 Mitogen Activated Protein Kinase Pathway, appears to play a vital role: it is activated after PNS damage and may promote the process of repair; conversely it may be abnormally activated in demyelinating diseases such as CMT.

Professor David Parkinson, Associate Professor in Neuroscience, Peninsula College of Medicine and Dentistry, University of Exeter, said: “The findings of our research are exciting because we have pinpointed and are understanding the mechanism by which our bodies can repair damage to the peripheral nervous system. With further investigation, this could well lead to therapies to repair nerve damage from trauma and mitigate the damage which relates to common illnesses, such as CMT.”

Provided by The Peninsula College of Medicine and Dentistry

Source: medicalxpress.com

Jun 16, 2012
#science #neuroscience #psychology
Control of brain waves from the brain surface

June 15, 2012

Whether or not a neuron transmits an electrical impulse is a function of many factors. European research is using a heady mixture of techniques – molecular, microscopy and electrophysiological – to identify the necessary input for nerve transmission in the cortex.


In the central nervous system (CNS), a nerve cell, or neuron, has a ‘forest’ of elaborate dendritic trees arising from the cell body. These receive many thousands of synapses (junctions that allow transmission of a signal) at positions around the tree. These inputs can then generate an impulse, or ‘spike’, known as an action potential at the initial part of the axon.

Previous research has confirmed that an activated synapse will generate an electric signal as a result of neurotransmitters released from pre-synaptic axons. Electrical recordings from the neocortex have confirmed that, in line with the prediction of cable theory, the influence of a dendritic potential depends strongly on its distance from the cell body, or soma.

The ‘Information processing in distal dendrites of neocortical layer 5 pyramidal neurons’ (Channelrhodopsin) project aimed to shed light on how the more distal sites in the ‘tree’ influence the action potential of the post-synaptic neuron. The researchers also investigated exactly how dendritic spikes are generated, another issue about which there is little information so far.

Recent research has highlighted the importance of activation of N-methyl-D-aspartate (NMDA) receptors to bring about the production of a signal that will proceed to the soma and then result in a spike. There is also indirect evidence that interneurons targeting dendrites can control the level of dendritic excitability.

Channelrhodopsin scientists made simultaneous pre- and post-synaptic electrical recordings from identified interneurons and pyramidal cells, the primary excitatory units in the mammalian cortex.

The project team first characterised the different types of inhibitory neuron that target the apical tuft dendrites of layer 5 pyramidal cells, deep in the cortex. The researchers then showed that a special type of inhibitory interneuron in the outer layer of the neocortex can suppress dendritic spiking in layer 5.

Project results show that a superficial inhibitory neuron can impact information processing in a specific pyramidal neuron. The research will have massive implications for neuroscience and help to unravel the integrative operations of CNS neurons.

Provided by CORDIS

Source: medicalxpress.com

Jun 16, 2012
#science #neuroscience #brain #psychology #neuron #brainwave
A Toothy Grin or Angry Snarl Makes It Easy to Stand out in a Crowd: Visible Teeth Are Key

ScienceDaily (June 14, 2012) — Rockville, Md. — Scientists have found new evidence that people spot a face in a crowd more quickly when its teeth are visible — whether smiling or grimacing — than on the basis of its expression alone. The new findings, published in the Journal of Vision, counter the long-held “face-in-the-crowd” effect, which suggests that only angry-looking faces are detected more readily in a crowd.


Examples of stimuli — closed mouth and open mouth with visible teeth — presented in the experiment. (Credit: ARVO)

"The research concerned with the face-in-the-crowd effect essentially deals with the question of how we detect social signals of friendly or unfriendly intent in the human face," said author Gernot Horstmann, PhD, of the Center for Interdisciplinary Research and Department of Psychology at Bielefeld University, Germany. "Our results indicate that, contrary to previous assertions, detection of smiles or frowns is relatively slow in crowds of neutral faces, whereas toothy grins and snarls are quite easily detected."

In two studies, the researchers asked subjects to search for a happy or an angry face within a crowd of neutral faces, and measured the search speed. While the search was relatively slow when emotion was signaled with a closed mouth, the search speed doubled when emotion was signaled with an open mouth and visible teeth. This was the case for both happy and angry faces, and happy faces were found somewhat faster than angry faces.

Horstmann and his colleagues conducted these experiments as a result of discrepancies in previous studies that investigated visual search for emotional faces. According to the research team, the inconsistent results with respect to which of the two expressions is found faster — the happy face or the angry face — suggested that the emotional expression category could not be the only important factor determining the face-in-the-crowd effect.

The scientists believe this new study may explain the discrepancies. “This will probably inspire researchers to clarify whether emotion and, in particular, threat plays an additional, unique role in face detection,” said Horstmann.

Source: Science Daily

Jun 15, 2012
#science #neuroscience #brain #emotion #psychology
Environmental Factors Spread Obesity, Study Shows

ScienceDaily (June 14, 2012) — A study of the spatial patterns of the spread of obesity by an international team of researchers suggests America’s bulging waistlines may have more to do with collective behavior than with genetics or individual choices. The team, led by City College of New York physicist Hernán Makse, found correlations between the epidemic’s geography and food marketing and distribution patterns.


Supermarket. Physicists found correlations between the obesity epidemic’s geography and food marketing and distribution patterns. (Credit: © flashpics / Fotolia)

"We found there is a relationship between the prevalence of obesity and the growth of the supermarket economy," Professor Makse said. "While we can’t claim causality because we don’t know whether obesity is driven by market forces or vice versa, the obesity epidemic can’t be solved by focus on individual behavior."

The team’s findings, published online this week in Scientific Reports, come as policymakers are starting to address the role of environmental factors in obesity. In New York, for example, Mayor Michael Bloomberg wants to limit serving sizes of sugar-sweetened soda to 16 ounces as a way to combat obesity.

The World Health Organization considers obesity a global epidemic similar to cancer or diabetes: a non-communicable disease whose spread no prevention strategy has been able to contain.

Because obesity is related to increased calorie intake and physical inactivity, prevention has focused on changing individuals’ behaviors. However, prevalence of non-communicable diseases shows spatial clustering, and the spread of obesity has shown “high susceptibility to social pressure and global economic drivers.”

Professor Makse and his colleagues hypothesized that these earlier findings suggest collective behavior plays a more significant role in the spread of the epidemic than individual factors such as genetics and lifestyle choices. To study collective behavior’s role, they implemented a statistical clustering analysis based on the physics of critical phenomena.

Using county-level microdata from the U.S. Centers for Disease Control and Prevention’s Behavioral Risk Factor Surveillance System for 2004 through 2008, they investigated spatial correlations for specific years. Over that span, the spread of the epidemic, which has Greene County, Ala., as its epicenter, produced two clusters spanning distances of 1,000 kilometers: one along the Appalachian Mountains, the second in the lower Mississippi River valley.

The spatial map of obesity prevalence in the United States shows that neighboring areas tend to have similar percentages of their populations considered obese, i.e., having a body mass index greater than or equal to 30. Such areas are considered obesity clusters, and their spread can be seen in the maps from 2004 to 2008.

To assess the properties of these spatial arrangements, the researchers calculated an equal-time, two-point correlation function that measured the influence of a set of characteristics in one county on another county at a given distance. The characteristics studied were population density, prevalence of adult obesity and diabetes, cancer mortality rates and economic activity.
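
The paper's exact estimator isn't given here, but the idea of an equal-time, two-point correlation function over counties can be sketched in a few lines: compute each county's fluctuation about the mean prevalence, then average the product of fluctuations over all pairs of counties separated by a given distance. The function name, input format and distance binning below are illustrative assumptions, not the team's actual code.

```python
import math

def two_point_correlation(coords, values, bin_edges):
    """Rough sketch of an equal-time, two-point correlation function:
    the normalized average product of fluctuations for county pairs,
    grouped into bins by the distance separating them."""
    n = len(values)
    mean = sum(values) / n
    fluct = [v - mean for v in values]          # fluctuations about the mean
    norm = sum(f * f for f in fluct) / n        # variance, for normalization
    sums = [0.0] * (len(bin_edges) - 1)
    counts = [0] * (len(bin_edges) - 1)
    for i in range(n):
        for j in range(i + 1, n):               # every unordered county pair
            d = math.dist(coords[i], coords[j])
            for b in range(len(bin_edges) - 1):
                if bin_edges[b] <= d < bin_edges[b + 1]:
                    sums[b] += fluct[i] * fluct[j]
                    counts[b] += 1
                    break
    return [s / c / norm if c else float("nan")
            for s, c in zip(sums, counts)]
```

A value near zero in a distance bin means counties that far apart fluctuate independently; persistently positive values at long distances are the long-range correlations the critical-point analogy refers to.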

The researchers said the form of the correlations in obesity was reminiscent of those in physical systems at the critical point of a second-order phase transition. Away from the critical point, such systems are uncorrelated, with fluctuations that vanish over short range.

However, at critical points long-range correlations appear, and these may signal the emergence of strong critical fluctuations in the spreading of obesity and diabetes. Consequently, they concluded the clustering patterns found in obesity were the result of “collective behavior, which may not merely be the consequence of fluctuations in individual habits.”

Professor Makse and his colleagues believe the correlations of fluctuations in the prevalence of obesity may be linked to demographic and economic variables. To test this hypothesis, they compared the spatial characteristics of industries associated with food production and sales, e.g. supermarkets, food and beverage stores, restaurants and bars, to other sectors of the economy.

Their analysis of spatial fluctuations in food-related economic activity gave rise to the same anomalous values as obesity and diabetes. Areas with above-average concentrations of food-related businesses had higher-than-normal prevalence of obesity and diabetes.

In future studies, Professor Makse plans to apply physics concepts to measure the spread of cancer and diabetes. “The basic idea is that if a non-communicable disease is spreading like a virus, then environmental factors have to be at work,” he said. “If only genetics determined obesity, we wouldn’t have seen the correlations.”

Source: Science Daily

Jun 15, 2012
#science #neuroscience #obesity #psychology
Fragile X Gene's Prevalence Suggests Broader Health Risk

ScienceDaily (June 14, 2012) — The first U.S. population prevalence study of mutations in the gene that causes fragile X syndrome, the most common inherited form of intellectual disability, suggests the mutation in the gene — and its associated health risks — may be more common than previously believed.

Writing this month (June 2012) in the American Journal of Medical Genetics, a team of Wisconsin researchers reports that the cascade of trinucleotide repeats, which accumulates over generations and culminates in the mutation of the single gene causing fragile X, is occurring more frequently among Americans than previously believed. The study also shows that as the genetic basis for the condition is passed from generation to generation and amplified, risks to neurological and reproductive health emerge in many carriers.

"The premutation of this condition is much more prevalent than we previously thought and there are some clinical risks associated with that," explains Marsha Mailick Seltzer, director of the University of Wisconsin-Madison Waisman Center, who led the new study.

Fragile X is caused by the unexplained runaway expansion of a set of trinucleotide (CGG) repeats in a single X-chromosome gene known as FMR1. When fully mutated, the gene is silenced and fails to produce a protein that is required for healthy brain development. The syndrome, which is more common in boys, results in a spectrum of intellectual disability.

However, before the gene fully mutates, carriers of the faulty gene exhibit a smaller number of elevated repeats, which expand as the gene is passed from generation to generation. Normal FMR1 genes exhibit anywhere from five to 40 repeats. Carriers with a premutation may have anywhere from 55 to 200. Those with between 45 and 54 repeats are characterized as falling into a “gray zone.” Carriers of gray zone expansions often pass the mutation on to their children who themselves are at greater risk of having the premutation, and in subsequent generations the risk of a full mutation causing fragile X syndrome is high.
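
The repeat-count ranges quoted above amount to a simple threshold check. A sketch (the function name is illustrative; note that the article leaves counts of 41-44 unaddressed, and clinical cutoffs vary somewhat between sources):

```python
def classify_fmr1(repeats: int) -> str:
    """Classify an FMR1 allele by its repeat count, using the
    ranges quoted in the article."""
    if repeats > 200:
        return "full mutation"   # fragile X syndrome
    if repeats >= 55:
        return "premutation"     # 55-200 repeats
    if repeats >= 45:
        return "gray zone"       # 45-54 repeats
    if 5 <= repeats <= 40:
        return "normal"          # 5-40 repeats
    return "unclassified"        # counts the article does not cover
```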

The goal of the new study was to calculate the prevalence in a U.S. population of the premutation and the gray zone. The research was based on data from the Wisconsin Longitudinal Study (WLS), also known as the “Happy Days study,” which for more than 50 years has tracked the careers, family life, health and education of more than 10,000 graduates of Wisconsin’s high school class of 1957.

Using genetic samples from 6,747 WLS participants, the team led by Seltzer, an expert on developmental disability and family life, found that 1 in 151 females and 1 in 468 males carry the fragile X premutation, while 1 in 35 females and 1 in 42 males fall into the gray zone.

"The prevalence is high, the second highest reported in the world literature," says Seltzer, noting that the incidence of fragile X varies by population and is higher in some places, such as Israel, and lower in others, such as parts of Asia.

The expansion of the FMR1 gene is known to vary across ethnic groups. The sample in the WLS study is primarily white and of northern European descent.

People with the premutation are more likely to have a child with disability; to have neurological symptoms such as numbness, dizziness and faintness; and, for women, to experience early menopause. Although these symptoms have been recognized previously in clinical studies, the WLS data represent an unbiased sample and support those observations.

"This study confirms that there are health risks associated with the premutation," says Seltzer. "People with the premutation have a higher probability of neurological and reproductive problems. There is a significant public health burden."

Source: Science Daily

Jun 15, 2012
#science #neuroscience #genes #health
Link Between Metabolic Disorders and Alzheimer's Disease Examined

ScienceDaily (June 14, 2012) — No effective treatments are currently available for the prevention or cure of Alzheimer’s disease (AD), the most frequent form of dementia in the elderly. The most recognized risk factors, advancing age and having the apolipoprotein E ε4 gene, cannot be modified or treated. Increasingly, scientists are looking toward other risk factors to identify preventive and therapeutic strategies. Much attention recently has focused on the metabolic syndrome (MetS), with a strong and growing body of research suggesting that metabolic disorders and obesity may play a role in the development of dementia.

A new supplement to the Journal of Alzheimer’s Disease provides a state-of-the-art assessment of research into the link between metabolic syndrome and cognitive disorders. The supplement is guest edited by Vincenza Frisardi, of the Department of Neurological and Psychiatric Sciences, University of Bari, and the Geriatric Unit and Gerontology-Geriatrics Research Laboratory, IRCCS, Foggia, Italy, and Bruno P. Imbimbo, Research and Development Department, Chiesi Farmaceutici, Parma, Italy.

The prevalence of MetS and obesity has increased over the past several decades. MetS is a cluster of vascular and metabolic risk factors including obesity, hypertension, an abnormal cholesterol profile, and impaired blood glucose regulation. “Although molecular mechanisms underlying the relationship between MetS and neurological disorders are not fully understood, it is becoming increasingly clear that cellular and biochemical alterations observed in MetS may represent a pathological bridge between MetS and various neurological disorders,” explains Dr. Frisardi.

Type 2 diabetes (T2D) has been linked with cognitive impairment in a number of studies. The risk for developing both T2D and AD increases proportionately with age, and evidence shows that individuals with T2D have a nearly twofold higher risk of AD than nondiabetic individuals.

Paula I. Moreira, Faculty of Medicine and Center for Neuroscience and Cell Biology, University of Coimbra, Portugal, outlines some of the likely mechanisms. Both AD and T2D present similar abnormalities in the mitochondria, organelles that play a pivotal role in cellular energy processes, impairing their ability to regulate oxidation in the cell. Human amylin, a peptide that forms deposits in the pancreatic cells of T2D patients, shares several properties with the amyloid-β plaques in the Alzheimer’s brain. Insulin resistance is another feature shared by both disorders: impairment of insulin signalling is directly involved in the development of tau tangles and amyloid-β (Aβ) plaques. “Understanding the key mechanisms underlying this deleterious interaction may provide opportunities for the design of effective therapeutic strategies,” Dr. Moreira notes.

In another article, author José A. Luchsinger of the Division of General Medicine, Department of Medicine, Columbia University College of Physicians and Surgeons, New York, notes that while there seems to be little dispute that T2D can cause cerebrovascular disease and vascular cognitive impairment, whether T2D can cause late onset AD remains to be determined. “Although the idea is highly speculative, the association between T2D and cognitive impairment may not be causal. Several lines of evidence provide some support to the idea that late onset Alzheimer’s disease could cause T2D, or that both could share causal pathways,” he notes. He reviews epidemiological, imaging, and pathological studies and clinical trials to provide insight. “Given the epidemic of T2D in the world, it’s important to determine whether the association between T2D and cognitive impairment, particularly late onset AD, is causal and if so, what are the mechanisms underlying it.”

Dr. Frisardi notes that most efforts by the pharmaceutical industry have been directed against the production and accumulation of amyloid-β. “Unfortunately, these efforts have not produced effective therapies yet, since the exact mechanisms of AD are largely unknown. Given that the onset of AD most likely results from the interaction of genetic and environmental factors, the research agenda should consider new platforms of study, going beyond the monolithic outlook of AD, by synthesizing epidemiological, experimental, and biological data under a unique pathophysiological model as a point of reference for further advances in the field.”

Source: Science Daily

Jun 15, 2012
#science #neuroscience #brain #psychology #alzheimer
Tense film scenes trigger brain activity: New ways to predict how audiences will respond

June 14, 2012

Visual and auditory stimuli that elicit high levels of engagement and emotional response can be linked to reliable patterns of brain activity, a team of researchers from The City College of New York and Columbia University reports. Their findings could lead to new ways for producers of films, television programs and commercials to predict what kinds of scenes their audiences will respond to.

"Peak correlations of neural activity across viewings can occur in remarkable correspondence with arousing moments of the film,” the researchers said in an article published in the journal Frontiers in Human Neuroscience. “Moreover, a significant reduction in neural correlation occurs upon a second viewing of the film or when the narrative is disrupted by presenting its scenes scrambled in time.”

The researchers used EEG (electroencephalography), which measures electrical activity across the scalp, to collect data on brainwaves of 20 human subjects, who were shown scenes from three films with repeat viewings. Two films, Alfred Hitchcock’s “Bang! You’re Dead” and Sergio Leone’s “The Good, the Bad and the Ugly,” contained moments of high drama expected to trigger responses. The third, an amateur film of people walking on a college campus, was used as a control.

"We found moments of high correlation (between brainwave activity during separate viewings) and moments when this did not occur," said Dr. Lucas C. Parra, Herbert G. Kayser Professor of Biomedical Engineering in CCNY’s Grove School of Engineering, and a corresponding author. "By looking at patterns of oscillation we could tell at which moments a person was particularly engaged. Additionally, we could see whether the correlation occurred across subjects and repeated viewings."

[Video: Reading the Brain during Film Viewing]
Video of EEG readings during scenes from “Bang, You’re Dead”

Measurements of EEG alpha activity indicate a person’s degree of attentiveness, he explained. When the alpha oscillations are strong, a person is relaxed, i.e. not engaged; when a person is very attentive, alpha activity is low.
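
As a rough illustration of what measuring alpha activity involves, the power in the conventional 8-12 Hz alpha band can be estimated from a sampled trace with a discrete Fourier transform. This naive sketch is for illustration only; it is not the analysis pipeline the CCNY team used, and real EEG work would apply windowing and an FFT library.

```python
import math

def alpha_band_power(signal, fs, band=(8.0, 12.0)):
    """Estimate power in the EEG alpha band via a naive DFT.
    High alpha power suggests a relaxed, disengaged state;
    low alpha power suggests attentiveness."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n                      # frequency of DFT bin k
        if band[0] <= freq <= band[1]:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n)
                     for t in range(n))
            im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n)
                      for t in range(n))
            power += (re * re + im * im) / (n * n)
    return power
```

A pure 10 Hz sine sampled at 100 Hz lands squarely in the band, while a 30 Hz sine contributes essentially nothing.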

Peaks in engagement were correlated to three kinds of scenes, said Dr. Jacek Dmochowski, a post-doctoral fellow in the Grove School and a corresponding author. They included moments with powerful visual cues, such as a close-up on the gun in “Bang! You’re Dead,” scenes with ominous music in which the visual component was not significant, and meaningful scene changes.

The researchers found significantly less neural correlation on participants’ second viewings and when scenes were scrambled and shown out of sequence. “Following a narrative is complex and involves a lot of distributed processing. When a person doesn’t have a sense of the narrative there is much less correlation (across views of the same or another subject),” Dr. Dmochowski said.

Having demonstrated the correlations between intense stimuli and brainwave reliability, the research team now wants to locate where in the brain the response occurs, Professor Parra said. He wants to deploy a combination of EEG and magnetic resonance imaging to “get the best of both worlds”: the fine temporal resolution of EEG and the detailed imagery of MRI.

The team sees several potential applications for the ability to quantify levels of engagement, including neuro-marketing, quantitative assessment of entertainment, measuring the impact of narrative discourse and the study of attention deficit disorders. “Advertisers would love to know where and when an ad is engaging,” he noted.

"The potential to measure engagement is huge since this provides an objective way to collect data," added Dr. Dmochowski, who currently is investigating whether there is a correlation between social media usage and brain activity in young people while watching “The Walking Dead,” a drama series on the American Movie Classics cable network.

"We are mining Twitter to measure the depth of watching," he continued. "We think there will be many correlations between scenes that elicit social media responses and neural signatures, and we can look at both positive and negative responses."

Provided by City College of New York

Source: medicalxpress.com

Jun 15, 2012
#science #neuroscience #psychology #brain
Dissonant Music Brings out the Animal in Listeners

ScienceDaily (June 13, 2012) — Ever wonder why Jimi Hendrix’s rendition of “The Star-Spangled Banner” moved so many people in 1969 or why the music in the shower scene of “Psycho” still sends chills down your spine?


Jimi Hendrix (Credit: Public domain image, courtesy of UCLA)

A UCLA-based team of researchers has isolated some of the ways in which distorted and jarring music is so evocative, and they believe that the mechanisms are closely related to distress calls in animals.

They report their findings in the latest issue of the peer-reviewed scientific journal Biology Letters, which publishes online June 12.

"Music that shares aural characteristics with the vocalizations of distressed animals captures human attention and is uniquely arousing," said Daniel Blumstein, one of the study’s authors and chair of the UCLA Department of Ecology and Evolutionary Biology.


Jun 15, 2012
#science #neuroscience #brain #psychology #perception #music
Toddler Spatial Knowledge Boosts Understanding of Numbers

ScienceDaily (June 13, 2012) — Children who are skilled in understanding how shapes fit together to make recognizable objects also have an advantage when it comes to learning the number line and solving math problems, research at the University of Chicago shows.

The work is further evidence of the value of providing young children with early opportunities for spatial learning, which contributes to their ability to mentally manipulate objects and understand spatial relationships, skills that are important in a wide range of tasks, including reading maps and graphs and understanding diagrams that show how to put things together. Those skills have also been shown to be important in the Science, Technology, Engineering and Math (STEM) fields.

Scholars at UChicago have shown, for instance, that working with puzzles and learning to identify shapes are connected to improved spatial understanding and better achievement, particularly in geometry. A new paper, however, is the first to connect robust spatial learning with better comprehension of other aspects of mathematics, such as arithmetic.

"We found that children’s spatial skills at the beginning of first and second grades predicted improvements in linear number line knowledge over the course of the school year," said Elizabeth Gunderson, a UChicago postdoctoral scholar who is lead author of the paper, "The Relation Between Spatial Skill and Early Number Knowledge: The Role of the Linear Number Line," published in the current issue of the journal Developmental Psychology.

In addition to finding the importance of spatial learning to improving understanding of the number line, the team also showed that better understanding of the number line boosted mathematics performance on a calculation task.


Jun 15, 2012
#science #neuroscience #brain #psychology
Obesity, Depression Found to Be Root Causes of Daytime Sleepiness

ScienceDaily (June 13, 2012) — Wake up, America, and lose some weight — it’s keeping you tired and prone to accidents. Three studies being presented June 13 at SLEEP 2012 conclude that obesity and depression are the two main culprits making us excessively sleepy while awake.

Researchers at Penn State examined a random population sample of 1,741 adults and determined that obesity and emotional stress are the main causes of the current “epidemic” of sleepiness and fatigue plaguing the country. Insufficient sleep and obstructive sleep apnea also play a role; both have been linked to high blood pressure, heart disease, stroke, depression, diabetes, obesity and accidents.

"The ‘epidemic’ of sleepiness parallels an ‘epidemic’ of obesity and psychosocial stress," said Alexandros Vgontzas, MD, the principal investigator for the three studies. "Weight loss, depression and sleep disorders should be our priorities in terms of preventing the medical complications and public safety hazards associated with this excessive sleepiness."

In the Penn State cohort study, 222 adults reporting excessive daytime sleepiness (EDS) were followed up 7½ years later. For those whose EDS persisted, weight gain was the strongest predicting factor. “In fact, our results showed that in individuals who lost weight, excessive sleepiness improved,” Vgontzas said.

Adults from that same cohort who developed EDS within the 7½-year span also were studied. The results show for the first time that depression and obesity are the strongest risk factors for new-onset excessive sleepiness. The third study, of a group of 103 research volunteers, determined once again that depression and obesity were the best predictors for EDS.

"The primary finding connecting our three studies is that depression and obesity are the main risk factors for both new-onset and persistent excessive sleepiness," Vgontzas said.

In the Penn State cohort study, the rate of new-onset excessive sleepiness was 8 percent, and the rate of persistent daytime sleepiness was 38 percent. Like insufficient sleep and obstructive sleep apnea, EDS also is associated with significant health risks and on-the-job accidents.

Source: Science Daily

Jun 15, 2012
#science #neuroscience #brain #psychology #obesity #depression
Role of Omega-3 in Preventing Cognitive Decline in Older People Questioned

ScienceDaily (June 13, 2012) — Older people who take omega-3 fish oil supplements are probably not reducing their chances of losing cognitive function, according to a new Cochrane systematic review. Based on the available data from studies lasting up to 3.5 years, the researchers concluded that the supplements offered no benefits for cognitive health over placebo capsules or margarines, but that longer term effects are worth investigating.

Omega-3 fatty acids are fats responsible for many important jobs in the body. We get these fats through our daily diets; the three major omega-3 fats are alpha-linolenic acid (ALA), from sources such as nuts and seeds, and eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA), from sources including oily fish such as salmon and mackerel. A number of studies have hinted that omega-3 fatty acids, and DHA in particular, may be involved in keeping nerve cells in the brain healthy into old age. However, there is limited evidence for the role of these fats in preventing cognitive decline and dementia.

The researchers, led by Emma Sydenham at the London School of Hygiene & Tropical Medicine (LSHTM), London, UK, gathered together evidence from three high quality trials comparing the effects of omega-3 fatty acids taken in capsules or margarine spread to those of sunflower oil, olive oil or regular margarine. A total of 3,536 people over the age of 60 took part in the trials, which lasted between six and 40 months. None of the participants had any signs of poor cognitive health or dementia at the start of the trials.

The researchers found no benefit of taking the omega-3 capsules or spread over placebo capsules or spread. Participants given omega-3 did not score better in standard mental state examinations or in memory and verbal fluency tests than those given placebo.

"From these studies, there doesn’t appear to be any benefit for cognitive health for older people of taking omega-3 supplements," said Alan Dangour, a nutritionist at LSHTM and co-author of the report. "However, these were relatively short-term studies, so we saw very little deterioration in cognitive function in either the intervention groups or the control groups. It may take much longer to see any effect of these supplements."

The researchers conclude that the longer term effects of omega-3 fatty acids on cognitive decline and dementia need to be explored in further studies, particularly in people with low intakes of omega-3 fatty acids in their diet. In the meantime, they stress other potential health benefits. “Fish is an important part of a healthy diet and we would still support the recommendation to eat two portions a week, including one portion of oily fish,” said Dangour.

Source: Science Daily

Jun 14, 2012
#science #neuroscience #psychology #brain #cognition #dementia
Juveniles Build Up Physical -- But Not Mental -- Tolerance for Alcohol

ScienceDaily (June 13, 2012) — Research into alcohol’s effect on juvenile rats shows they have an ability to build up a physical, but not cognitive, tolerance over the short term — a finding that could have implications for adolescent humans, according to Baylor University psychologists.

The research findings are significant because they indicate that blood alcohol concentration levels alone may not fully account for impaired orientation and navigation ability, said Jim Diaz-Granados, Ph.D., professor and chair of psychology and neuroscience at Baylor. He co-authored the study, published in the journal Brain Research.  “There’s been a lot of supposition about the reaction to blood alcohol levels,” Diaz-Granados said. “We use the blood alcohol level to decide if someone is going to get arrested, because we think that a high level means impairment. But here we see a model where we can separate that out. You may have a tolerance in metabolism, but just because your blood alcohol concentration is less than the legal limit doesn’t mean your behavior isn’t impaired.”

"More research is needed to fully understand how adolescents react to alcohol, but this contributes a piece to the puzzle," said study co-author Douglas Matthews, Ph.D., a research scientist at Baylor and an associate professor in Psychology at Nanyang Technological University in Singapore.

The study was conducted in the Baylor Addiction Research Center of Baylor’s Department of Psychology and Neuroscience in Baylor’s College of Arts & Sciences.

More than half of under-age alcohol use is due to binge drinking, according to the Substance Abuse and Mental Health Services Administration, and “when initial alcohol use occurs during adolescence, it increases the chance of developing alcoholism later in life,” said lead study author Candice E. Van Skike, a doctoral candidate in psychology at Baylor. Researchers have long been interested in whether adolescents react differently to alcohol than adults and how alcohol use affects their brains when they reach adulthood, but Baylor researchers also wanted to test the short-term effect of alcohol on adolescents’ brains in terms of memory about space and dimension.

In the study, 96 rats were trained to navigate a water maze to an escape platform. Half were exposed to alcohol vapor in chambers for 16 hours a day over four days (a method that approximates binge-like alcohol intake), while the others were exposed only to air. After a 28-hour break, some rats were injected with alcohol, and both groups were tested again in the maze. Rats that had undergone the chronic intermittent ethanol exposure built up a metabolic tolerance: based on blood ethanol concentrations measured after the later injection, they eliminated alcohol from their systems faster than rats that had been exposed only to air. Yet while the alcohol-injected rats swam as hard and as fast as the others, their ability to find the escape platform was impaired.

Previous research at Baylor led by Matthews showed that adolescents are less sensitive than adults to motor impairment during alcohol intake because a particular neuron fires more slowly in adults who are drinking. The lack of sensitivity may be part of the reason adolescents do not realize they have had too much to drink.

"It’s difficult to compare metabolic and cognitive tolerance in adults with those of juveniles, because many studies that have looked at the cognitive aspect of chronic ethanol exposure didn’t measure blood alcohol concentration levels," Van Skike said. "It would be an interesting comparison to make, and it is an avenue for future research."

Other research has shown that the high levels of alcohol consumption seen during human adolescence are mirrored in animals. Adolescent rats consume two to three times more ethanol than adults relative to body weight, suggesting that adolescents who drink are predisposed to do so in binges.

Source: Science Daily

Jun 14, 2012
#science #neuroscience #psychology #alcohol #brain
Anxious Mice Make Lousy Dads

ScienceDaily (June 13, 2012) — Normally, male California mice are surprisingly doting fathers, but new research published in the journal Physiological and Biochemical Zoology suggests that high anxiety can turn these good dads bad.

Unlike most rodents, male and female California mice pair up for life, with males providing extensive parental care: helping to deliver the pups, licking them clean, and keeping them warm during their first few weeks of life. Experienced fathers are so paternal that they’ll even take care of pups that aren’t theirs. “If we place a male California mouse in a test cage and present it with an unknown pup, experienced fathers will quickly start to lick and huddle with it,” said Trynke de Jong, a post-doctoral researcher at the University of California, Riverside.

Inexperienced males, on the other hand, aren’t always so loving. “Virgin males show more variability,” de Jong explained. “They may behave paternally, or they may ignore the pup, or even attack it. We want to understand what triggers these three behavioral responses in virgin males.”

De Jong and her colleagues thought this variability might have something to do with social status. In other species — including another rodent, Mongolian gerbils — dominant virgin males are more likely than subordinate ones to kill pups. Perhaps social status influences parenting in California mice as well.

To test this, de Jong and her colleagues paired up 12 virgin males in six enclosures and performed several tests to see which was dominant. First was a food competition. “If a cornflake is dropped in the cage, the more dominant male will manage to eat most of it,” de Jong said. The researchers also observed each mouse’s urine marking. “Dominant males will make more, smaller, and more widespread marks than subordinate males,” said de Jong.

After determining the mightier mouse in each pair, the team tested parental behavior by introducing a pup. Contrary to the hypothesis, scores on the dominance tests did not predict whether a male licked or huddled up to the pup. However, the research did turn up signs that anxiety, not status, plays a role in paternal behavior.

Males that shied away from urinating in the middle of a new enclosure, a behavioral signal that a mouse is anxious, were slower to approach a pup. Further tests showed that less paternal males had higher levels of vasopressin, a hormone strongly associated with stress and anxiety, in their brains.

"Our findings support the theory that vasopressin may alter the expression of paternal behavior depending on the emotional state of the animal," de Jong said. She believes these results could shed light on the role of stress in paternal care in other mammals — including humans.

Source: Science Daily

Jun 14, 2012
#science #neuroscience #psychology #brain #anxiety