Neuroscience

February 2012

Feb 9, 2012 · 30 notes
#science #neuroscience #psychology #brain
'Explorers,' who embrace the uncertainty of choices, use specific part of cortex

February 8, 2012

"Explorers," whose decision-making style embraces the possibilities of uncertainty, use specific parts (red) of the right rostrolateral prefrontal cortex to make calculations based on relative uncertainty. Credit: Badre-Frank Lab/Brown University

Life shrouds most choices in mystery. Some people inch toward a comfortable enough spot and stick close to that rewarding status quo. Out to dinner, they order the usual. Others consider their options systematically or randomly. But many choose to grapple with the uncertainty head on. “Explorers” order the special because they aren’t sure they’ll like it. It’s a strategy of maximizing rewards by discovering whether as yet unexplored options might yield better returns. In a new study, Brown University researchers show that such explorers use a specific part of their brain to calculate the relative uncertainty of their choices, while non-explorers do not.

The study, published in the journal Neuron, newly exposes an aspect of the brain’s architecture for producing decisions and learning, said co-author David Badre, assistant professor of cognitive, linguistic, and psychological sciences at Brown. There was no consensus that a precise area of the prefrontal cortex, in this case the right rostrolateral prefrontal cortex, would be so clearly associated with a specific operation, such as performing the requisite uncertainty comparison for supporting a decision-making strategy.

"There has long been a debate about the functional organization of the frontal cortex," Badre said. "There has been a notion that the frontal lobe lacks specialization when exercising cognitive control, that it’s undifferentiated. This study provides evidence that there is a kind of organization. This is an example of how higher-order functions such as decision-making may relate to the frontal lobe’s more general functional architecture."

Stop the clock

To spot explorer behavior among their 15 participants, Badre and Michael Frank, associate professor of cognitive, linguistic, and psychological sciences, slid them into an MRI scanner and presented them with a game to play. Participants had to stop the sweeping hand of a virtual clock to win points in different rounds. They were told that they could maximize their rewards by responding quickly in some rounds, and slowly in others. The trick was that they did not know from round to round which kind of response would pay off, and the number of points they could win was highly variable. They therefore had to employ a strategy to discover how to maximize their rewards among uncertain options, keeping track of the current expected value of fast and slow responses in each round.

While the MRI scanner tracked the blood flow in the brains of the subjects — a proxy for neural activity — the game’s software tracked their response times in each round. The computer then fed the game’s data into mathematical models devised to determine whether participants adapted their response times by taking relative uncertainty into account or adapted in another manner.

Over dozens of rounds a clear pattern emerged. Regardless of which version of the model they used, the researchers found that about half the subjects were engaging in exploratory behavior based on uncertainty: Their choices of response times correlated strongly with the choices that had the greatest outcome uncertainty.
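As a rough illustration of the kind of strategy the models test for, an uncertainty-driven chooser can be sketched in a few lines. This is a hypothetical toy, not the authors' actual model: the option names, the payoff distributions, and the standard-error uncertainty measure are all invented for the example.

```python
import random
import statistics

# Toy uncertainty-driven ("explorer") choice -- an assumed sketch, NOT the
# authors' model. Two options ("fast", "slow") pay out noisily; the agent
# tracks each option's running mean and an uncertainty estimate, and an
# explorer samples whichever option is currently more uncertain.

class Option:
    def __init__(self):
        self.rewards = []

    def mean(self):
        return statistics.mean(self.rewards) if self.rewards else 0.0

    def uncertainty(self):
        n = len(self.rewards)
        if n < 2:
            return float("inf")  # untried options are maximally uncertain
        return statistics.stdev(self.rewards) / n ** 0.5  # standard error

def choose(options, explore=True):
    key = (lambda o: options[o].uncertainty()) if explore \
        else (lambda o: options[o].mean())
    return max(options, key=key)

random.seed(1)
opts = {"fast": Option(), "slow": Option()}
payoff = {"fast": lambda: random.gauss(50, 20),  # high-variance option
          "slow": lambda: random.gauss(60, 5)}   # safer option
for _ in range(100):
    pick = choose(opts, explore=True)
    opts[pick].rewards.append(payoff[pick]())
# The explorer keeps sampling the option whose outcome it is least sure of,
# rather than simply repeating whichever option currently looks best.
```

A non-explorer corresponds to `explore=False`, which always takes the option with the best running mean; comparing how well each rule predicts a subject's actual response times is, loosely, what the model comparison in the study does.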

Badre, Frank, and their team then looked at the MRI scans, reasoning that if decision-making is based on relative uncertainty, then the subjects’ brains must somehow represent this uncertainty. Sure enough, as relative uncertainty between choice options increased, so did activation in the right rostrolateral prefrontal cortex. This effect was substantially stronger in the explorers than the nonexplorers.

The result is the first to show that this region of the brain keeps track of relative uncertainty to guide exploration, but is consistent with previous studies that have shown an association between the right rostrolateral prefrontal cortex and relative comparisons. It also provides a potential explanation for Frank’s previous findings that explorers were more likely to have a variation in a gene called COMT that affects dopamine levels in the prefrontal cortex.

From cortex to choice

Frank said researchers still don’t know why some people employ the explorer strategy while others do not, but they might not be so different. According to one hypothesis, they all have an aversion to uncertainty and ambiguity.

"The difference could be that some people are averse to ambiguity in the time point where they make a single decision and other people are averse to ambiguity about their strategy over the long run," Frank said.

In other words, explorers may seek to reduce uncertainty by confronting it, rather than avoiding it.

Badre said that while the study has no direct clinical implications, the findings may still inform efforts to understand a broad set of disorders that affect frontal lobe function.

"There are a lot of diseases and disorders that affect the frontal lobes," Badre said. "They affect the ability to live independently, to carry out the day and make good decisions that get you where you want to go. The more we know about the specificity of these systems, the better that you can diagnose and suggest treatments."

Provided by Brown University

Source: medicalxpress.com

Feb 9, 2012 · 5 notes
#science #neuroscience #psychology #brain
Scientists delve into the brain roots of hunger and eating

February 8, 2012

Synaptic plasticity – the ability of the synaptic connections between the brain’s neurons to change and modify over time — has been shown to be a key to memory formation and the acquisition of new learning behaviors. Now research led by a scientific team at Beth Israel Deaconess Medical Center (BIDMC) reveals that the neural circuits controlling hunger and eating behaviors are also controlled by plasticity.

Described in the February 9, 2012 issue of the journal Neuron, the findings show that during fasting, the AgRP neurons that drive feeding behaviors actually undergo anatomical changes that cause them to become more active, which results in their “learning” to be more responsive to hunger-promoting neural stimuli.

"The role of plasticity has generally not been evaluated in neuronal circuits that control feeding behavior and with this new discovery we can start to unravel the basic mechanisms underpinning hunger and gain a greater understanding of the factors that influence weight gain and obesity," explains senior author Bradford Lowell, MD, PhD, an investigator in BIDMC’s Division of Endocrinology, Diabetes and Metabolism and Professor of Medicine at Harvard Medical School (HMS).

Adds BIDMC Chairman of Neurology Clifford Saper, MD, PhD, “For most animals, finding enough food to survive is their biggest daily challenge, and so the brain’s increase in feeding drive may be adaptive. But, for humans who are overweight, reducing this drive to the AgRP neurons may prove to be a path to future weight loss therapies.”

The roots of hunger, eating, and weight are based in the brain’s complex and rapid-fire neurocircuitry. Over the years, nerve cells containing agouti-related peptide (AgRP) protein and pro-opiomelanocortin (POMC) protein have emerged as critical players in feeding behaviors. Located in the hypothalamus, the brain area that controls automatic body functions, AgRP neurons have been shown to drive eating and weight gain while POMC neurons inhibit feeding behaviors, causing satiety and weight loss.

Previous work by the Lowell lab and others had demonstrated that when AgRP neurons in mice are artificially switched on, the animals eat voraciously, consuming four times more than control animals. “The ‘switched-on’ animals search in an unrelenting fashion for food, and when given a task to obtain pellets, will work five times harder to get them,” Lowell explains.

Given the important role played by AgRP neurons, the scientists had a great interest in understanding the factors that regulate their activity. While much focus had centered on hormones, including leptin, insulin and ghrelin, as the possible mechanisms directly affecting neuronal activity, the Lowell team hypothesized that other nerve cells might be behind the regulation.

Neurons communicate with one another via neurotransmitters, chemical messengers that traverse synapses, the specialized junctions between upstream and downstream neurons. Glutamate is one such excitatory neurotransmitter.

"Studies in other regions of the brain [for example those controlling learning and reward and addiction behaviors] have demonstrated that glutamate synapses are highly plastic, changing in their strength and sometimes even in their number," explains Lowell. Shown to exert powerful control over behavior, synaptic plasticity is brought about when glutamate binds to NMDA receptors on downstream neurons.

"NMDA receptors are unusual and really interesting," he adds. "When glutamate gets released by upstream neurons and binds to NMDA receptors, calcium enters the downstream neuron. This, in turn, engages signal transduction pathways that cause synaptic plasticity. In other parts of the brain, such as the hippocampus, NMDA receptors drive plasticity which serves to encode memories."

Led by co-first authors Tiemin Liu, PhD, Dong Kong, PhD, Bhavik P. Shah, PhD, and Chianping Ye, PhD, the investigators created and studied mice genetically engineered to lack glutamate-binding NMDA receptors on the AgRP neurons. For the sake of comparison, they also created mice genetically engineered to lack NMDA receptors on POMC neurons.

They found that while mice lacking NMDA receptors on POMC neurons showed no change in feeding behavior, the situation was dramatically different in the mice lacking NMDA receptors on AgRP neurons.

"These mice ate a lot less and were much skinnier than a group of control mice," explains Lowell. Furthermore, the scientists found that a 24-hour period of fasting – which causes intense hunger in the control mice – was associated with a 67 percent increase in the number of dendritic spines on the AgRP neurons.

"Dendritic spines are tiny structures attached to the neuron’s dendrites, the tree-like branches that receive incoming signals from upstream neurons," explains Lowell. "These structures are the physical site, the subcellular communication hub, where synaptic input from upstream glutamate-releasing neurons is received, typically one synaptic input per spine."

"I’ve been studying spines for a long time and I’ve never before seen a manipulation that triggered such rapid and robust changes in spine number," says coauthor Bernardo Sabatini, MD, PhD, a Howard Hughes Medical Institute investigator in the Department of Neurobiology at Harvard Medical School. "Clearly, feeding is plugging in to the most basic mechanisms that control synapse and spine number in these cells. This may be a great system to understand not only feeding behavior, but also to understand the cell biology behind dynamic synapse formation and retraction."

When the control mice were refed – and their hunger alleviated – the number of spines dropped back to normal. (In contrast, fasting had no effect on spine number in the mutant mice lacking NMDA receptors on AgRP neurons.) These dramatic changes in spine number, their tight association with states of hunger and satiety in control mice, and the absence of such changes in mice lacking NMDA receptors on the downstream AgRP neurons strongly suggest that structural plasticity of excitatory glutamate synapses on AgRP neurons is an important regulator of feeding behavior, says Lowell.

"Obesity is a major risk factor for type 2 diabetes, cardiovascular disease, and certain types of cancer," he adds. "By understanding the neurobiological mechanisms underlying feeding behaviors, we can work on treatments for a problem that has now become a global epidemic. These findings move us closer to a mechanistic understanding of how various factors controlling hunger might work."

Provided by Beth Israel Deaconess Medical Center

Source: medicalxpress.com

Feb 9, 2012 · 3 notes
#science #neuroscience #psychology #brain
Neuroscientists link brain-wave pattern to energy consumption

February 8, 2012 by Anne Trafton

Emery Brown, an MIT professor of brain and cognitive sciences and health sciences and technology, left, and ShiNung Ching, a postdoc in Brown’s lab. Photo: M. Scott Brauer

Different brain states produce different waves of electrical activity, with the alert brain, relaxed brain and sleeping brain producing easily distinguishable electroencephalogram (EEG) patterns. These patterns change even more dramatically when the brain goes into certain deeply quiescent states during general anesthesia or a coma. 

MIT and Harvard University researchers have now figured out how one such quiescent state, known as burst suppression, arises. The finding, reported in the online edition of the Proceedings of the National Academy of Sciences the week of Feb. 6, could help researchers better monitor other states in which burst suppression occurs. For example, it is also seen in the brains of heart attack victims who are cooled to prevent brain damage due to oxygen deprivation, and in the brains of patients deliberately placed into a medical coma to treat a traumatic brain injury or intractable seizures.

During burst suppression, the brain is quiet for up to several seconds at a time, punctuated by short bursts of activity. Emery Brown, an MIT professor of brain and cognitive sciences and health sciences and technology and an anesthesiologist at Massachusetts General Hospital, set out to study burst suppression in the anesthetized brain and other brain states in hopes of discovering a fundamental mechanism for how the pattern arises. Such knowledge could help scientists figure out how much burst suppression is needed for optimal brain protection during induced hypothermia, when this state is created deliberately. 

“You might be able to develop a much more principled way to guide therapy for using burst suppression in cases of medical coma,” says Brown, senior author of the PNAS paper. “The question is, how do you know that patients are sufficiently brain-protected? Should they have one burst every second? Or one every five seconds?”

Modeling electrical activity

ShiNung Ching, a postdoc in Brown’s lab and lead author of the PNAS paper, developed a model to describe how burst suppression arises, based on the behavior of neurons in the brain. Neuron firing is controlled by the activity of channels that allow ions such as potassium and sodium to flow in and out of the cell, altering its voltage.

For each neuron, “we’re able to mathematically model the flow of ions into and out of the cell body, through the membrane,” Ching says. In this study, the team combined many neurons to create a model of a large brain network. By showing how both cooling and certain anesthetic drugs reduce the brain’s use of ATP (the cell’s energy currency), the researchers were able to generate burst-suppression patterns consistent with those actually seen in human patients. 

This is the first time that reductions in metabolic activity at the neuron level have been linked to burst suppression. The finding suggests that the brain likely uses burst suppression to conserve vital energy during times of trauma.
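A crude way to see how reduced metabolic supply can produce alternating bursts and silences is a toy relaxation cycle in which firing burns ATP and silence restores it. This is an illustrative caricature, not the researchers' biophysical model; every constant here is invented.

```python
# Illustrative caricature of burst suppression (invented constants, not the
# researchers' biophysical model): firing consumes ATP; once stores run low,
# an ATP-gated potassium current silences the unit until ATP recovers,
# producing alternating bursts of activity and stretches of quiet.

def simulate(steps=4000, metabolic_supply=0.6):
    atp, active = 1.0, True
    trace = []
    for _ in range(steps):
        if active:
            atp -= 0.01                     # firing burns ATP
            if atp <= 0.2:                  # stores depleted: K_ATP opens
                active = False
        else:
            atp += 0.01 * metabolic_supply  # slow recovery while silent
            if atp >= 0.9:                  # stores replenished: firing resumes
                active = True
        trace.append(1 if active else 0)
    return trace

trace = simulate()
bursts = sum(1 for a, b in zip(trace, trace[1:]) if a == 0 and b == 1)
# Cooling or anesthesia maps onto a lower metabolic_supply, which stretches
# the silent periods: deeper suppression, with fewer and sparser bursts.
```

In this caricature, the ratio of burst length to silence length is set entirely by how fast ATP is consumed versus restored, which is the intuition behind using burst counts to titrate how deeply a cooled or sedated brain is suppressed.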

“What’s really exciting about this is the idea that the metabolic regulation of cell energy stores plays a role in the observed dynamics of EEG. That’s a different way to think about the determinants of EEG,” says Nicholas Schiff, a professor of neurology and neuroscience at Weill Cornell Medical College who was not involved in this research. 

The developing brain

Burst suppression is also seen in babies born prematurely. As these babies get older, their brain patterns move into the normal continuous pattern. Brown speculates that in premature infants, the brain may be protecting itself by conserving energy.

“When you’re looking at these kids develop, we can easily start to suggest ways of tracking their improvement quantitatively. So the same algorithms we use to track burst suppression in the operating room could be used to track the disappearance of burst suppression in these kids,” Brown says.

Such tracking could help doctors determine whether premature infants are moving toward normal development or have an underlying brain disorder that might otherwise go undiagnosed, Ching says. 

In future studies, the researchers plan to study premature infants as well as patients whose brains are cooled and those in induced comas. Such studies could reveal just how much burst suppression is enough to protect the brain in those vulnerable situations.

Provided by Massachusetts Institute of Technology

Source: medicalxpress.com

Feb 9, 2012 · 2 notes
#science #neuroscience #psychology #brain
Brain Proteins May Be Key to Aging

Deterioration of long-lived proteins on the surface of neuronal nuclei in the brain could lead to age-related defects in nervous function.

By Bob Grant | February 8, 2012

Scientists have found that aptly named extremely long-lived proteins (ELLPs) in the brains of rats can persist for more than one year—a result that suggests the proteins, also found in human brains, last an entire lifetime. Most proteins only last a day or two before being recycled. The researchers reported their findings last week in Science.

A team at the Salk Institute for Biological Studies made the discovery while studying ELLPs that are part of the nuclear pore complex (NPC), which is a transport channel that regulates the flow of molecules into or out of the nucleus in neurons. Because the persistent ELLPs are more likely to accumulate molecular damage, NPC function may eventually become compromised, allowing more toxins into the nucleus. This could result in alterations to DNA, subsequent changes in gene activity, and signs of cellular aging. “Most cells, but not neurons, combat functional deterioration of their protein components through the process of protein turnover, in which the potentially impaired parts of the proteins are replaced with new functional copies,” said senior author Martin Hetzer, of Salk’s Molecular and Cell Biology Laboratory, in a statement. “Our results also suggest that nuclear pore deterioration might be a general aging mechanism leading to age-related defects in nuclear function, such as the loss of youthful gene expression programs.”

In addition to aging, the results may provide key clues to the development of neurodegenerative disorders like Alzheimer’s and Parkinson’s diseases.

Source: TheScientist

Feb 9, 2012
#science #neuroscience #psychology #brain
Research links 'brain waves' to cognition, attention and diagnosing disorders

February 7, 2012

Professor Jason Mattingley, Foundation Chair in Cognitive Neuroscience at The University of Queensland, released his findings into ‘brain waves’ at the Australian Neuroscience Society’s (ANS) annual conference last week.

'Brain waves' are the oscillations produced by the brain, which are thought to contribute to its remarkable capacity to integrate information about the world.

According to Professor Mattingley’s research, brain oscillations can be linked to sleep, navigation, cognition, attention, and to diagnosing a wide range of disorders including autism, schizophrenia and epilepsy.

To understand how the brain filters information during visual attention and perception, Professor Mattingley and his fellow researchers encouraged subjects to perform tasks involving the use of flickering stimuli on a computer display. This included embedding colour-coded visual information to see how well subjects track a specific target colour from a myriad of distracting information.

“Imagine the brain as a stadium full of sports fans. Each spectator is like an individual neuron in the brain. Now imagine the spectators starting a Mexican wave that sweeps through the crowd from one side of the stadium to the other. Our research shows that neurons in the brain act in much the same way. Distinct waves of neural activity, moving at different speeds and in different directions, help coordinate neurons across widely separated areas of the brain,” Professor Mattingley said.

“We can measure these brain waves as people engage in different tasks, such as focusing their attention on just one colour in a multi-coloured display. The measurements we take from the brain are a bit like the ripples from a handful of pebbles thrown into a pond.”

“While interesting in their own right, these studies are also relevant to brain dysfunction, as defects in neural responses to flickering visual stimuli have been found in individuals with autism, schizophrenia, and epilepsy, and such oscillations have been found to be significantly altered in aging, depression, and neurodegenerative disorders. Using these tasks may help to both diagnose and understand the basis for differences in brain function in people with these conditions.”

The Australian Neuroscience Society’s (ANS) annual conference brings together researchers in search of a greater understanding of the human nervous system and its functions.

As part of the program, around 100 international speakers and delegates shared their insights into the peripheral senses (touch, sight, hearing and smell), perception, cognition, learning and memory, with a particular focus on neurological and neurodegenerative disease.

Provided by University of Queensland

Source: medicalxpress.com

Feb 8, 2012 · 2 notes
#science #neuroscience #psychology
Warning! Collision imminent! The brain's quick interceptions help you navigate the world

February 7, 2012

Researchers at The Neuro and the University of Maryland have figured out the mathematical calculations that specific neurons employ in order to inform us of our distance from an object and the 3-D velocities of moving objects and surfaces relative to ourselves.

When you are about to collide with something and manage to swerve away just in the nick of time, what exactly is happening in your brain? A new study from the Montreal Neurological Institute and Hospital – The Neuro, McGill University shows how the brain processes visual information to figure out when something is moving towards you or when you are about to head into a collision. The study, published in the Proceedings of the National Academy of Sciences (PNAS), provides vital insight into our sense of vision and a greater understanding of the brain.

Highly specialized neurons located in the brain’s visual cortex, in an area known as MST, respond selectively to motion patterns such as expansion, rotation, and deformation. However, the computations underlying such selectivity were unknown until now.

Using mathematical models and sophisticated recording techniques, researchers have discovered how individual MST neurons function. “Area MST is typical of high-level visual cortex, in that information about important aspects of vision can be seen in the firing patterns of single neurons. A classic example is a neuron that only fires when the subject is looking at the image of a particular face. This type of neuron has to gather information from other neurons that are selective to simpler features, like lines, colors, and textures, and combine these pieces of information in a fairly sophisticated way,” says Dr. Christopher Pack, neuroscientist at The Neuro and senior author. “Similarly, for motion detection, neurons have to combine input from many other neurons earlier in the visual pathway, in order to determine whether something is moving toward you or just drifting past.”

The brain’s visual pathway is made up of building blocks. Neurons in the retina respond to very simple stimuli, such as small spots of light. Further along the visual pathway, neurons respond to more complex stimuli, such as straight lines, by combining inputs from neurons earlier on. Neurons still further along respond to even more complex stimuli, such as combinations of lines (angles), ultimately leading to neurons that can respond to, or recognize, faces and objects.
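As a toy illustration of how such a detector might combine simpler motion inputs, the following sketch scores an optic-flow field by how strongly its vectors point radially outward from a center. This is an assumed construction for illustration only, not the computation reported in the paper; the grid, velocities, and scoring rule are all invented.

```python
# Assumed construction for illustration (not the computation reported in the
# paper): an "MST-like" expansion detector that scores a field of motion
# vectors by how strongly they point radially outward from a center point.

def expansion_response(flow, center=(0.0, 0.0)):
    cx, cy = center
    score = 0.0
    for (x, y), (vx, vy) in flow.items():
        rx, ry = x - cx, y - cy              # direction away from the center
        norm = (rx * rx + ry * ry) ** 0.5 or 1.0
        score += (vx * rx + vy * ry) / norm  # radial component of the motion
    return score / len(flow)

# An approaching surface produces outward ("looming") flow; a surface
# drifting sideways produces uniform translation.
points = [(x, y) for x in (-2, -1, 1, 2) for y in (-2, -1, 1, 2)]
looming = {(x, y): (0.5 * x, 0.5 * y) for x, y in points}  # expansion
drifting = {(x, y): (0.5, 0.0) for x, y in points}         # translation

r_loom = expansion_response(looming)    # clearly positive
r_drift = expansion_response(drifting)  # symmetric terms cancel to ~0
```

The detector responds strongly to expansion (a collision course) but not to plain translation (something drifting past), which is the kind of selectivity the article attributes to MST neurons.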

Source: medicalxpress.com

Feb 8, 2012 · 1 note
#science #neuroscience #psychology #brain
Study of Live Human Neurons Reveals Parkinson's Origins

ScienceDaily (Feb. 7, 2012) — Parkinson’s disease researchers at the University at Buffalo have discovered how mutations in the parkin gene cause the disease, which afflicts at least 500,000 Americans and for which there is no cure.

The results are published in the current issue of Nature Communications. The UB findings reveal potential new drug targets for the disease as well as a screening platform for discovering new treatments that might mimic the protective functions of parkin. UB has applied for patent protection on the screening platform.

"This is the first time that human dopamine neurons have ever been generated from Parkinson’s disease patients with parkin mutations," says Jian Feng, PhD, professor of physiology and biophysics in the UB School of Medicine and Biomedical Sciences and the study’s lead author.

As the first study of human neurons affected by parkin, the UB research overcomes a major roadblock in research on Parkinson’s disease and on neurological diseases in general. The problem has been that human neurons live in a complex network in the brain and thus are off-limits to invasive studies, Feng explains.

"Before this, we didn’t even think about being able to study the disease in human neurons," he says. "The brain is so fully integrated. It’s impossible to obtain live human neurons to study."

But studying human neurons is critical in Parkinson’s disease, Feng explains, because animal models that lack the parkin gene do not develop the disease; thus, human neurons are thought to have “unique vulnerabilities.”

"Our large brains may use more dopamine to support the neural computation needed for bipedal movement, compared to the quadrupedal movement of almost all other animals," he says.

In 2007, Japanese researchers announced they had converted human cells into induced pluripotent stem cells (iPSCs), which could then be converted into nearly any cell in the body, mimicking embryonic stem cells. Feng and his UB colleagues saw their enormous potential and have been working with the technique ever since.

"This new technology was a game-changer for Parkinson’s disease and for other neurological diseases," says Feng. "It finally allowed us to obtain the material we needed to study this disease."

The current paper is the fruition of the UB team’s ability to “reverse engineer” human neurons from human skin cells taken from four subjects: two with a rare type of Parkinson’s disease in which the parkin mutation is the cause of their disease and two healthy subjects who served as controls.

"Once parkin is mutated, it can no longer precisely control the action of dopamine, which supports the neural computation required for our movement," says Feng.

The UB team also found that parkin mutations prevent the protein from tightly controlling the production of monoamine oxidase (MAO), which catalyzes dopamine oxidation.

"Normally, parkin makes sure that MAO, which can be toxic, is expressed at a very low level so that dopamine oxidation is under control," Feng explains. "But we found that when parkin is mutated, that regulation is gone, so MAO is expressed at a much higher level. The nerve cells from our Parkinson’s patients had much higher levels of MAO expression than those from our controls. We suggest in our study that it might be possible to design a new class of drugs that would dial down the expression level of MAO."

He notes that one of the drugs currently used to treat Parkinson’s disease inhibits the enzymatic activity of MAO and has been shown in clinical trials to slow down the progression of the disease.

Parkinson’s disease is caused by the death of dopamine neurons. In the vast majority of cases, the reason for this is unknown, Feng explains. But in 10 percent of Parkinson’s cases, the disease is caused by mutations of genes, such as parkin: the subjects with Parkinson’s in the UB study had this rare form of the disease.

"We found that a key reason for the death of dopamine neurons is oxidative stress due to the overproduction of MAO," explains Feng. "But before the death of the neurons, the precise action of dopamine in supporting neural computation is disrupted by parkin mutations. This paper provides the first clues about what the parkin gene is doing in healthy controls and what it fails to achieve in Parkinson’s patients."

He notes that in this study, these defects were reversed by delivering the normal parkin gene into the patients’ neurons, offering hope that these neurons may be used as a screening platform for discovering new drug candidates that could mimic the protective functions of parkin and potentially even lead to a cure for Parkinson’s.

While the parkin mutations are only responsible for a small percentage of Parkinson’s cases, Feng notes that understanding how parkin works is relevant to all Parkinson’s patients. His ongoing research on sporadic Parkinson’s disease, in which the cause is unknown, also points in the same direction.

Source: ScienceDaily

Feb 7, 2012 · 1 note
#science #neuroscience #psychology #parkinson
Why the Middle Finger Has Such a Slow Connection

ScienceDaily (Feb. 7, 2012) — Each part of the body has its own nerve cell area in the brain — we therefore have a map of our bodies in our heads. The functional significance of these maps is largely unclear. What effects they can have is now shown by RUB neuroscientists through reaction time measurements combined with learning experiments and “computational modelling.” They have been able to demonstrate that inhibitory influences of neighbouring “finger nerve cells” affect the reaction time of a finger. The fingers on the outside — i.e. the thumb and little finger — therefore react faster than the middle finger, which is exposed to the “cross fire” of a neighbour on each side. Through targeted learning, this speed handicap can be compensated for.

The working group led by PD Dr. Hubert Dinse (Neural Plasticity Lab at the Institute for Neural Computation) reports its findings in the current issue of PNAS.

Thumb and little finger are the quickest

The researchers set subjects a simple task to measure the speed of decision-making: they showed them an image on a monitor that represented all ten fingers. If one of the fingers was marked, the subjects were to press a corresponding key as quickly as possible with that finger. The thumb and little finger were the fastest; the middle finger brought up the rear. “You might think that this has anatomical reasons or depends on practice,” said Dr. Dinse, “but we were able to rule that out through further tests. In principle, each finger is able to react equally quickly. Only in the selection task is the middle finger at a distinct disadvantage.”

Computer simulation depicts brain maps

To explain their observations, the researchers used computer simulations based on a so-called mean-field model, which is especially suited to modelling large neuronal networks in the brain. In these simulations, each individual finger is represented by a group of nerve cells, arranged in the form of a topographic map of the fingers based on the actual conditions in the somatosensory cortex of the brain. “Adjacent fingers are adjacent in the brain too, and thus also in the simulation,” explained Dr. Dinse. The nerve cells communicate with one another through mutual excitation and inhibition.

Inhibitory influences from both sides slow down the middle finger

The computer simulations showed that the longer reaction time of the middle finger in a multiple-choice task is a consequence of the middle finger lying within the inhibition range of both adjacent fingers. The thumb and little finger, on the other hand, receive inhibition of comparable strength from only one adjacent finger each. “In other words, the high level of inhibition received by the nerve cells of the middle finger means that it takes longer for the excitation to build up — they therefore react more slowly,” said Dr. Dinse.
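
The mechanism can be sketched with a toy firing-rate model — a drastic simplification of the mean-field approach, with all parameter values chosen for illustration rather than taken from the paper. Five units stand in for the five fingers, neighbouring units inhibit each other, and the cued unit's rise to a response threshold is timed:

```python
import numpy as np

N = 5                 # thumb, index, middle, ring, little finger
tonic = 0.2           # resting drive that keeps the map mildly active
drive = 1.0           # extra input to the cued finger
w_inh = 0.4           # inhibition from each immediately adjacent finger
tau, dt = 50.0, 0.5   # time constant and integration step (ms)
threshold = 0.6       # activity level that triggers the key press

def lateral_inhibition(r):
    """Inhibitory input each unit receives from its immediate neighbours."""
    inhib = np.zeros(N)
    inhib[1:] += w_inh * r[:-1]   # from the left neighbour
    inhib[:-1] += w_inh * r[1:]   # from the right neighbour
    return inhib

def step(r, inp):
    # simple rate dynamics: relax towards the rectified net input
    return r + dt / tau * (-r + np.maximum(inp - lateral_inhibition(r), 0.0))

def reaction_time(cued, t_max=2000.0):
    r = np.zeros(N)
    for _ in range(4000):                 # settle into the resting state
        r = step(r, np.full(N, tonic))
    inp = np.full(N, tonic)
    inp[cued] += drive                    # cue one finger
    t = 0.0
    while r[cued] < threshold and t < t_max:
        r = step(r, inp)
        t += dt
    return t

rts = [reaction_time(f) for f in range(N)]
# the middle finger, inhibited from both sides, crosses threshold later
# than the outer fingers, which are inhibited from one side only
```

In this sketch the effect falls straight out of the map geometry: no unit is intrinsically slower, yet the middle unit's threshold crossing is delayed by the extra inhibition it receives from its second neighbour.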

Targeted reduction of the inhibition through learning

From the results of the computer simulation it can be concluded that weaker inhibition from the neighbouring fingers would shorten the reaction time of the middle finger. This would require a so-called plastic change in the brain — a specialty of the Neural Plasticity Lab, which has spent years developing learning protocols that induce such changes. One such protocol is the repeated stimulation of certain nerve cell groups, which the laboratory has already used in many experiments. “If, for example, you stimulate one finger electrically or by means of vibration for two to three hours, then its representation in the brain changes,” explained Dr. Dinse. The result is an improvement in the sense of touch and a measurable reduction of the inhibitory processes in this brain area. The representation of the stimulated finger is also enlarged.

Second experiment confirms the prediction

The Bochum researchers then conducted a second experiment in which the middle finger of the right hand was subjected to such stimulation. The result was a significant shortening of that finger’s reaction time in the selection task. “This finding confirms our prediction,” Dr. Dinse summed up. For the first time, the Bochum researchers have thus established a direct link between so-called lateral inhibitory processes and decision-making processes. They have shown that learning processes that change the cortical maps can have far-reaching implications not only for simple discrimination tasks, but also for decision processes that were previously attributed to so-called “higher” cortical areas.

Source: ScienceDaily

Feb 7, 2012
#science #neuroscience #psychology
Sharp Images from the Living Mouse Brain

February 6th, 2012

This STED image of a nerve cell in the upper brain layer of a living mouse shows in previously impossible detail the very fine dendritic protrusions of a nerve cell, the so-called spines, at which the synapses are located. The inset shows the mushroom-shaped head of such a dendritic spine at which the nerve cells receive information from their peers. © Max Planck Institute for Biophysical Chemistry

Source: Neuroscience News

Feb 7, 2012
#science #neuroscience #psychology #brain
It's not solitaire: Brain activity differs when one plays against others

February 6, 2012

Rock, paper or scissors? Learning while playing a strategic game against others involves a different pattern of brain activity than learning from the consequences of one’s own actions, researchers found. Credit: L. Brian Stauffer

Researchers have found a way to study how our brains assess the behavior – and likely future actions – of others during competitive social interactions. Their study, described in a paper in the Proceedings of the National Academy of Sciences, is the first to use a computational approach to tease out differing patterns of brain activity during these interactions, the researchers report.

"When players compete against each other in a game, they try to make a mental model of the other person’s intentions, what they’re going to do and how they’re going to play, so they can play strategically against them," said University of Illinois postdoctoral researcher Kyle Mathewson, who conducted the study as a doctoral student in the Beckman Institute with graduate student Lusha Zhu and economics professor and Beckman affiliate Ming Hsu, who now is at the University of California, Berkeley. "We were interested in how this process happens in the brain."

Previous studies have tended to consider only how one learns from the consequences of one’s own actions, called reinforcement learning, Mathewson said. These studies have found heightened activity in the basal ganglia, a set of brain structures known to be involved in the control of muscle movements, goals and learning. Many of these structures signal via the neurotransmitter dopamine.

"That’s been pretty well studied and it’s been figured out that dopamine seems to carry the signal for learning about the outcome of our own actions," Mathewson said. "But how we learn from the actions of other people wasn’t very well characterized."

Researchers call this type of learning “belief learning.”

To better understand how the brain processes information in a competitive setting, the researchers used functional magnetic resonance imaging (fMRI) to track activity in the brains of participants while they played a competitive game, called a Patent Race, against other players. The goal of the game was to invest more than one’s opponent in each round to win a prize (a patent worth considerably more than the amount wagered), while minimizing one’s own losses (the amount wagered in each trial was lost). The fMRI tracked activity at the moment the player learned the outcome of the trial and how much his or her opponent had wagered.

A computational model evaluated the players’ strategies and the outcomes of the trials to map the brain regions involved in each type of learning.
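
The distinction between the two learning rules can be illustrated with a small sketch — our illustration in a rock-paper-scissors setting, not the authors' model or task. A reinforcement learner updates only the value of the action it chose, using its own payoff, while a belief learner tracks the opponent's choice frequencies and values every action by its expected payoff against that belief:

```python
import numpy as np

rng = np.random.default_rng(0)
# payoff[my_action, opp_action] for rock(0), paper(1), scissors(2):
# win = +1, tie = 0, loss = -1
PAYOFF = np.array([[ 0, -1,  1],
                   [ 1,  0, -1],
                   [-1,  1,  0]])

def softmax_choice(values, beta=3.0):
    p = np.exp(beta * (values - values.max()))
    return rng.choice(3, p=p / p.sum())

q = np.zeros(3)          # reinforcement learner: values of its OWN actions
opp_counts = np.ones(3)  # belief learner: counts of the OPPONENT's actions

for trial in range(500):
    opp = rng.choice(3, p=[0.6, 0.2, 0.2])  # opponent over-plays rock

    # reinforcement learning: update only the chosen action's value,
    # from the payoff actually received
    a_rl = softmax_choice(q)
    q[a_rl] += 0.1 * (PAYOFF[a_rl, opp] - q[a_rl])

    # belief learning: update a model of the opponent, then value
    # every action by its expected payoff against that model
    belief = opp_counts / opp_counts.sum()
    a_bl = softmax_choice(PAYOFF @ belief)
    opp_counts[opp] += 1

belief = opp_counts / opp_counts.sum()
# the belief learner recovers the opponent's bias towards rock and so
# assigns the highest expected value to paper, which beats rock
```

Both learners end up favouring paper against this biased opponent, but they get there differently — and it is exactly this difference in update signals that the study's computational model used to separate the two patterns of brain activity.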

"Both types of learning were tracked by activity in the ventral striatum, which is part of the basal ganglia," Mathewson said. "That’s traditionally known to be involved in reinforcement learning, so we were a little bit surprised to see that belief learning also was represented in that area."

Belief learning also spurred activity in the rostral anterior cingulate, a structure deep in the front of the brain. This region is known to be involved in error processing, regret and “learning with a more social and emotional flavor,” Mathewson said.

The findings offer new insight into the workings of the brain as it is engaged in strategic thinking, Hsu said, and may aid the understanding of neuropsychiatric illnesses that undermine those processes.

"There are a number of mental disorders that affect the brain circuits implicated in our study," Hsu said. "These include schizophrenia, depression and Parkinson’s disease. They all affect these dopaminergic regions in the frontal and striatal brain areas. So to the degree that we can better understand these ubiquitous social functions in strategic settings, it may help us understand how to characterize and, eventually, treat the social deficits that are symptoms of these diseases."

Provided by University of Illinois at Urbana-Champaign

Source: medicalxpress.com

Feb 7, 2012
#science #neuroscience #brain #psychology
Magnetic research for better brain health

February 6, 2012

A pioneering therapy that uses magnetic pulses to stimulate the brain to treat conditions such as Parkinson’s disease, depression, schizophrenia, epilepsy and stroke is now better understood thanks to researchers from The University of Western Australia and the Université Pierre et Marie Curie in France.

Research Associate Professor Jennifer Rodger from UWA’s School of Animal Biology said she and her team tested the therapy - known as repetitive transcranial magnetic stimulation (rTMS) - on mice to find out how it can be applied to treating human neurological disease.

The research was published recently in the prestigious journal FASEB.

"Our work demonstrated for the first time that pulsed magnetic fields promote changes in brain chemicals that correct abnormal brain connections, resulting in improved behaviour and brain function," joint lead author Dr Rodger said.

"rTMS is an exciting therapy that stimulates the brain. It has shown promising results in treating the damaged human brain. Our research helps to explain how this therapy works on the cells of the brain. Previously, evidence of its usefulness was mainly from anecdotal clinical evidence.

"Our results greatly increase our understanding of the specific cellular and molecular events that occur in the brain during rTMS therapy. We are the first to show that changes in brain circuits underpin these beneficial effects. Our results have implications for how rTMS is used in humans to treat disease and improve brain function."

Dr Rodger explained that the structural and functional changes caused by the therapy in malfunctioning circuits were not seen in the normal healthy brain, suggesting that the therapy could have minimal side effects in humans.

Provided by University of Western Australia

Source: medicalxpress.com

Feb 7, 2012
#science #neuroscience #psychology #brain
Magnetic therapy becoming more popular for treating depression

February 6, 2012

(Medical Xpress) — A new magnetic therapy that treats major depression recently received a major boost when the government announced Medicare will cover the procedure in Illinois.

The treatment, called transcranial magnetic stimulation (TMS), sends short pulses of magnetic fields to the brain. TMS “is rapidly gaining momentum,” said Dr. Murali Rao of Loyola University Medical Center, one of the first Chicago-area centers to offer TMS. There are now nearly 300 such centers in the United States.

At Loyola, about two-thirds of Rao’s TMS patients so far report that their depression has significantly lessened or gone away completely.

Before receiving TMS, Nan Miller had failed nine antidepressants and suffered increasingly severe cycles of depression over seven years. There were times when she couldn’t get out of bed or eat. “I just wanted to die,” she said. She had even tried electroconvulsive therapy (formerly known as electroshock) but did not want to consider that option anymore.

Miller said that a few weeks after beginning TMS treatments, she was eating lunch when she suddenly realized depression did not consume her anymore. “I could almost hear the chains breaking, the darkness lifting and the heaviness dissolving,” she said. “I feel about 10 years younger and 20 shades lighter.”

The Food and Drug Administration approved TMS in 2009 for patients who have major depression and have failed at least one antidepressant. The FDA has approved one TMS system, NeuroStar®, made by Neuronetics.

The patient reclines in a comfortable padded chair. A magnetic coil, placed next to the left side of the head, sends short pulses of magnetic fields to the surface of the brain. This produces currents that stimulate brain cells. The currents, in turn, affect mood-regulatory circuits deeper in the brain. The resulting changes in the brain appear to be beneficial to patients who suffer depression.

Each treatment lasts 35 to 40 minutes. Patients typically undergo three treatments per week for four to six weeks.

The treatments do not require anesthesia or sedation. Afterward, a patient can immediately resume normal activities, including driving. Studies have found that patients do not experience memory loss or seizures. Side effects include mild headache or tingling in the scalp, which can be treated with Tylenol.

Together, psychotherapy and antidepressants successfully treat only about one-third of patients who suffer major depression. TMS is a noninvasive treatment option now available for the other two-thirds of patients, who experience only partial relief from depression or no relief at all, Rao said.

Provided by Loyola University Health System

Source: medicalxpress.com

Feb 7, 2012 · 1 note
#science #neuroscience #psychology #depression
Mom’s Love Good for Child’s Brain

January 30th, 2012

The hippocampus (highlighted in fuchsia) is a key brain structure important to learning, memory and stress response. New research shows that children who were nurtured by their mothers early in life have a larger hippocampus than children who were not nurtured as much. Credit: Washington University Medical School from press release

Source: Neuroscience News

Feb 6, 2012 · 3 notes
#science #neuroscience #psychology #brain
DNA Test that Identifies Down Syndrome in Pregnancy Can Also Detect Trisomy 18 and Trisomy 13

February 2nd, 2012

A newly available DNA-based prenatal blood test that can identify a pregnancy with Down syndrome can also identify two additional chromosome abnormalities: trisomy 18 (Edwards syndrome) and trisomy 13 (Patau syndrome). The test for all three defects can be offered as early as 10 weeks of pregnancy to women who have been identified as being at high risk for these abnormalities.

These are the results of an international, multicenter study published online today in the journal Genetics in Medicine. The study, the largest and most comprehensive done to date, adds to the documented capability (study published in Genetics in Medicine in October 2011) of the tests by examining results in 62 pregnancies with trisomy 18 and 12 pregnancies with trisomy 13. Together with the Down syndrome pregnancies reported earlier, 286 trisomic pregnancies and 1,702 normal pregnancies are included in the report.

The research was led by Glenn Palomaki, PhD, and Jacob Canick, PhD, of the Division of Medical Screening and Special Testing in the Department of Pathology and Laboratory Medicine at Women & Infants Hospital of Rhode Island and The Warren Alpert Medical School of Brown University, and included scientists at Sequenom Inc. and Sequenom Center for Molecular Medicine, San Diego, CA, and an independent academic laboratory at the University of California at Los Angeles.

The test identified 100% (59/59) of the trisomy 18 and 91.7% (11/12) of the trisomy 13 pregnancies. The associated false positive rates were 0.28% and 0.97%, respectively. Overall, testing failed to provide a clinical interpretation in 17 women (0.9%); three of these women had a trisomy 18 pregnancy. By slightly raising the definition of a positive test for chromosomes 18 and 13, the detection rate remained constant, but the false positive rate could be as low as 0.1%. These findings, along with the detailed information learned from testing such a large number of samples, demonstrate that the new test will be highly effective when offered to women considering invasive testing.
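
The headline figures can be reproduced from the reported counts. This is illustrative arithmetic only — the expected false-positive counts are back-calculated from the reported rates, since per-test counts are not given in this summary:

```python
# detection rates from the reported case counts
t18_detection = 59 / 59        # trisomy 18: 100%
t13_detection = 11 / 12        # trisomy 13: ~91.7%

# reported false positive rates applied to the 1,702 normal pregnancies
normals = 1702
t18_false_pos = 0.0028 * normals   # roughly 5 expected false positives
t13_false_pos = 0.0097 * normals   # roughly 17 expected false positives

# failure-to-interpret ("no call") rate across all samples tested
no_calls, total = 17, 286 + 1702
no_call_rate = no_calls / total    # ~0.9%, matching the reported figure
```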

“Our previous work demonstrated the ability to identify Down syndrome, the most common trisomy. These new data extend the finding to the next two most common trisomies and will allow for wider use of such testing, with the ability to identify all three common trisomies,” said Dr. Palomaki. “The new DNA test can now also be offered to women identified as being at high risk for trisomy 18 or trisomy 13, as well as those at high risk for Down syndrome.”

“This highly sensitive and specific DNA test has the potential to affect couples’ decision-making,” says Dr. Canick. “A woman whose pregnancy was identified as high risk, and who earlier would have chosen not to have invasive diagnostic testing, might now consider the DNA test as a safe way to obtain further information before making a final decision.” The US Centers for Disease Control and Prevention estimated in 1995 that about one in every 200 invasive diagnostic procedures will cause a pregnancy miscarriage.

Trisomy 18, also called Edwards syndrome, is a serious disorder, with up to 70% of affected first-trimester fetuses being spontaneously lost during pregnancy. Among those born alive, half die within a week, with only 5% surviving the first year. All have serious medical and developmental problems. About 1,330 infants with trisomy 18 would be born in the US each year in the absence of prenatal diagnosis. Trisomy 13, also called Patau syndrome, is less common but equally serious. About 600 infants with trisomy 13 would be born in the US each year in the absence of prenatal diagnosis. Like Down syndrome, trisomy 18 and trisomy 13 become more common as maternal age increases. For comparison, about 7,730 Down syndrome cases would be born each year in the absence of prenatal diagnosis. Current prenatal screening tests for trisomy 18 and trisomy 13 rely on both biochemical and ultrasound markers. For more information, visit the US National Library of Medicine PubMed Health.

This industry-sponsored project, awarded to Drs. Palomaki and Canick and Women & Infants Hospital in 2008, enrolled 4,500 women at 27 prenatal diagnostic centers throughout the world. Women & Infants also served as one of the enrollment centers under the direction of maternal-fetal medicine specialist and director of Perinatal Genetics, Barbara O’Brien, MD.

“It is clinically more relevant that all three trisomies can be detected by this test,” said Dr. O’Brien. “Having access to such a comprehensive, DNA-based test that can be done early in pregnancy will give us more information so that we can better guide which patients should consider diagnostic testing.”

Women & Infants Hospital has been an international center for prenatal screening research. For more than three decades, Drs. Palomaki and Canick have collaborated with others in developing and improving screening tests for Down syndrome and other fetal abnormalities. In 1988, Drs. Palomaki and Canick were involved in the development of triple marker screening. The team was able to convert its findings into prenatal screening tests now used throughout the world. Dr. Canick’s lab in 1998 was the first in the US to offer quad marker screening and in the past decade was the laboratory center for the NIH-funded FASTER Trial, which compared first- and second-trimester screening.

Source: Neuroscience News

Feb 6, 2012 · 1 note
#science #neuroscience #psychology #genetics
Gender Specific Behavior Traced To Hormone-Controlled Genes In The Brain

Article Date: 06 Feb 2012 - 0:00 PST

Men and women may be equals, but they often behave differently when it comes to sex and parenting. Now a study of the differences between the brains of male and female mice in the Cell Press journal Cell provides insight into how our own brains might be programmed for these stereotypically different behaviors.

The new evidence shows that the sex hormones - testosterone, estrogen, and progesterone - act in a key region of the brain, switching certain genes on and others off. When the researchers tinkered with each of these genes one by one, animals showed subtle but important shifts in individual sex-specific behaviors, such as how males mate or females care for their pups.

“What this means is that complex behaviors like male mating or maternal care in mice can be deconstructed at the genetic level,” said Nirao Shah of the University of California, San Francisco. The findings present a cellular and molecular representation of gender that is remarkable in its complexity, the researchers say.

Shah’s team made these discoveries after screening mouse brains for genes that show differences in expression in males versus females. The researchers focused specifically on the hypothalamus, a region previously implicated in the control of sex-specific behaviors. Their screen produced a list of 16 genes with clear sex differences in distinct neurons in the hypothalamus. Surprisingly, Shah’s team found that many of these genes also show sex differences in the amygdala, a part of the brain important for emotions.

In further studies, the researchers examined the effects of a subset of these individual genes. Mice missing only one of these 16 genes seemed to behave normally. But upon closer observation, these mice showed significant differences in sex-specific behaviors. For instance, Shah explained, females mutant for one gene took longer to return their pups to the nest and to fight off intruders. “They still take care of their pups, but less effectively,” he said.

In other experiments, deletion of a single gene produced females that were two-fold less receptive to mating with males. Similarly, males mutant for another gene were less interested in females. Together these results mean that sex-specific behaviors can be controlled in modular fashion, such that the loss of any one gene leads to subtle but potentially important changes.

“At the superficial level, the mice appear normal, but this is pretty significant variation in behavior,” Shah said. It suggests that variation in such genes might explain not just differences between the sexes, but also differences in behaviors within one sex or the other - why some male mice are more aggressive than other males or some females more attentive to their offspring than other females.

The researchers don’t yet know exactly how these differences in gene expression lead to those differences in behavior, although Shah says some of the genes are known to be involved in sending or receiving neural messages in the brain. It also remains to be seen how the male and female gene expression programs might be influenced by the animals’ social interactions and experiences.

There is still a lot to learn about what makes males and females tick. “This gene list of sex differences in the brain is probably just a small subset of what we will eventually unearth,” Shah said.  

Source: Medical News Today

Feb 6, 2012 · 5 notes
#science #neuroscience #psychology #genetics #brain
Memory Function - Decaffeinated Coffee May Help

Article Date: 05 Feb 2012 - 0:00 PST

Drinking decaffeinated coffee may improve brain energy metabolism associated with type 2 diabetes, according to a study published in Nutritional Neuroscience and carried out by researchers at Mount Sinai School of Medicine. Impaired brain energy metabolism is a known risk factor for dementia and other neurodegenerative disorders such as Alzheimer’s disease.

Giulio Maria Pasinetti, MD, PhD, and team decided to investigate whether dietary supplementation with a standard decaffeinated coffee prior to diabetes onset could improve insulin resistance and glucose utilization in mice with diet-induced type 2 diabetes.

The mice were given the supplement for five months, after which the researchers assessed the genetic response in the animals’ brains. They discovered that the brain metabolized glucose more effectively, using it for cellular energy. People with type 2 diabetes have reduced glucose utilization in the brain, which often leads to neurocognitive problems.

Dr. Pasinetti stated:

"Impaired energy metabolism in the brain is known to be tightly correlated with cognitive decline during aging and in subjects at high risk for developing neurodegenerative disorders. This is the first evidence showing the potential benefits of decaffeinated coffee preparations for both preventing and treating cognitive decline caused by type 2 diabetes, aging, and/or neurodegenerative disorders."



Drinking coffee is not recommended for everyone because of its association with cardiovascular health risks, including elevated blood cholesterol and blood pressure, both of which raise the risk of heart disease, stroke, and premature death. However, these negative effects are mainly attributed to coffee’s high caffeine content; the study findings suggest that some components of decaffeinated coffee have beneficial health effects, at least in mice.

Dr. Pasinetti wants to investigate whether decaffeinated coffee as a dietary supplement in humans can act as a preventive measure.

He concludes:

"In light of recent evidence suggesting that cognitive impairment associated with Alzheimer’s disease and other age-related neurodegenerative disorders may be traced back to neuropathological conditions initiated several decades before disease onset, developing preventive treatments for such disorders is critical."


Petra Rattue 

Source: Medical News Today

Feb 6, 2012
#science #neuroscience #psychology #brain #memory
Hearing Metaphors Activates Brain Regions Involved in Sensory Experience

ScienceDaily (Feb. 3, 2012) — When a friend tells you she had a rough day, do you feel sandpaper under your fingers? The brain may be replaying sensory experiences to help understand common metaphors, new research suggests.

Regions of the brain activated by hearing textural metaphors are shown in green. Yellow and red show regions activated by sensory experience of textures visually and through touch. (Credit: Image courtesy of Emory University)

Linguists and psychologists have debated how much the parts of the brain that mediate direct sensory experience are involved in understanding metaphors. George Lakoff and Mark Johnson, in their landmark work ‘Metaphors We Live By’, pointed out that our daily language is full of metaphors, some of which are so familiar (like “rough day”) that they may not seem especially novel or striking. They argued that metaphor comprehension is grounded in our sensory and motor experiences.

New brain imaging research reveals that a region of the brain important for sensing texture through touch, the parietal operculum, is also activated when someone listens to a sentence with a textural metaphor. The same region is not activated when a similar sentence expressing the meaning of the metaphor is heard.

The results were published online this week in the journal Brain & Language.

"We see that metaphors are engaging the areas of the cerebral cortex involved in sensory responses even though the metaphors are quite familiar," says senior author Krish Sathian, MD, PhD, professor of neurology, rehabilitation medicine, and psychology at Emory University. "This result illustrates how we draw upon sensory experiences to achieve understanding of metaphorical language."

Sathian is also medical director of the Center for Systems Imaging at Emory University School of Medicine and director of the Rehabilitation R&D Center of Excellence at the Atlanta Veterans Affairs Medical Center.

Seven college students who volunteered for the study were asked to listen to sentences containing textural metaphors as well as sentences that were matched for meaning and structure, and to press a button as soon as they understood each sentence. Blood flow in their brains was monitored by functional magnetic resonance imaging. On average, response to a sentence containing a metaphor took slightly longer (0.84 vs 0.63 seconds).

In a previous study, the researchers had already mapped out, for each of these individuals, which parts of the students’ brains were involved in processing actual textures by touch and sight. This allowed them to establish with confidence the link within the brain between metaphors involving texture and the sensory experience of texture itself.

"Interestingly, visual cortical regions were not activated by textural metaphors, which fits with other evidence for the primacy of touch in texture perception," says research associate Simon Lacey, PhD, the first author of the paper.

The researchers did not find metaphor-specific differences in cortical regions well known to be involved in generating and processing language, such as Broca’s or Wernicke’s areas. However, this result doesn’t rule out a role for these regions in processing metaphors, Sathian says. Also, other neurologists have seen that injury to various areas of the brain can interfere with patients’ understanding of metaphors.

"I don’t think that there’s only one area responsible for metaphor processing," Sathian says. "Actually, several recent lines of research indicate that engagement with abstract concepts is distributed around the brain." "I think our research highlights the role of neural networks, rather than a single area of the brain, in these processes. What could be happening is that the brain is conducting an internal simulation as a way to understand the metaphor, and that’s why the regions associated with touch get involved. This also demonstrates how complex processes involving symbols, such as appreciating a painting or understanding a metaphor, do not depend just on evolutionarily new parts of the brain, but also on adaptations of older parts of the brain."

Sathian’s future plans include asking whether similar relationships exist for other senses, such as vision. The researchers also plan to probe whether magnetic stimulation of the brain in regions associated with sensory experience can interfere with understanding metaphors.

The research was supported by the National Institutes of Health and the National Science Foundation.

Source: ScienceDaily

Feb 6, 2012
#science #neuroscience #psychology #brain
Treating Brain Injuries With Stem Cell Transplants - Promising Results

Article Date: 04 Feb 2012 - 10:00 PST

The February edition of Neurosurgery reports that experiments in brain-injured rats have shown that stem cells injected via the carotid artery travel directly to the brain, greatly enhancing functional recovery. According to lead researcher Dr. Toshiya Osanai, of Hokkaido University Graduate School of Medicine in Sapporo, Japan, the study demonstrates that the carotid artery injection technique, together with some form of in vivo optical imaging to track the stem cells after transplantation, could become part of a new approach to stem cell transplantation for human traumatic brain injury (TBI).

Dr. Osanai and team assessed a new “intra-arterial” technique of stem cell transplantation in rats, with the aim of delivering the stem cells directly to the brain without having to go through the general circulation. They induced TBI in the animals before injecting stem cells into the carotid artery seven days later.

The stem cells were obtained from the rats’ bone marrow and were labeled with “quantum dots” prior to being injected. Quantum dots are biocompatible, fluorescent semiconductor nanocrystals that emit near-infrared light at wavelengths long enough to penetrate bone and skin, enabling a non-invasive method of monitoring the stem cells for a period of four weeks following transplantation.

This in vivo optical imaging technique enabled the scientists to observe that the injected stem cells entered the brain on the first attempt, without entering the general circulation. They observed that the stem cells started migrating from the capillaries into the injured part of the brain within three hours.

At week 4, the researchers noted that the rats in the stem cell transplant group achieved a substantial recovery of motor function, compared with the untreated animals that had no signs of recovery.

The team learnt, after examining the treated brains, that the stem cells had transformed into different brain cell types and aided in healing the injured brain area.

Over the last few years, the potential of stem cell therapy for curing and treating illnesses and conditions has been growing rapidly.


Developing stem cell therapy for brain injury in human patients

Stem cells represent a potentially important new treatment option for those who have suffered brain injuries such as TBI and stroke. But even though bone marrow stem cells, similar to the ones used in the new study, are a promising source of donor cells, many questions remain open regarding the optimal timing, dose, and route of stem cell delivery.


In the new animal study, the rats were injected with the stem cells one week after TBI, a “clinically relevant” interval, given that this is the minimum time it takes to prepare stem cells from bone marrow.

Transplanting the stem cells into the carotid artery is a fairly simple procedure that delivers the cells directly to the brain.

The experiments have also provided key evidence that stem cell treatment can promote healing after TBI with a substantial recovery of function.

Dr. Osanai and team write that by using in vivo optical imaging:

"The present study was the first to successfully track donor cells that were intra-arterially transplanted into the brain of living animals over four weeks."

A similar form of imaging technology could also prove beneficial for monitoring the effects of stem cell transplantation in humans, although the tracking will pose challenges, due to the human skull and scalp being much thicker than in rats.

The researchers conclude:

"Further studies are warranted to apply in vivo optical imaging clinically.”

Written by Petra Rattue

Source: Medical News Today

Feb 4, 2012 · 2 notes
#science #neuroscience #psychology #brain
Discovery of Extremely Long-Lived Proteins May Provide Insight Into Cell Aging and Neurodegenerative Diseases

ScienceDaily (Feb. 3, 2012) — One of the big mysteries in biology is why cells age. Now scientists at the Salk Institute for Biological Studies report that they have discovered a weakness in a component of brain cells that may explain how the aging process occurs in the brain.

This microscope image shows extremely long-lived proteins, or ELLPs, glowing green on the outside of the nucleus of a rat brain cell. DNA inside the nucleus is pictured in blue. The Salk scientists discovered that the ELLPs, which form channels through the wall of the nucleus, lasted for more than a year without being replaced. Deterioration of these proteins may allow toxins to enter the nucleus, resulting in cellular aging. (Credit: Courtesy of Brandon Toyama, Salk Institute for Biological Studies)

The scientists discovered that certain proteins, called extremely long-lived proteins (ELLPs), which are found on the surface of the nucleus of neurons, have a remarkably long lifespan.

While the lifespan of most proteins totals two days or less, the Salk Institute researchers identified ELLPs in the rat brain that were as old as the organism, a finding they reported February 3 in Science.

The Salk scientists are the first to discover an essential intracellular machine whose components include proteins of this age. Their results suggest the proteins last an entire lifetime, without being replaced.

ELLPs make up the transport channels on the surface of the nucleus: gates that control what materials enter and exit. Their long lifespan might be an advantage if not for the wear-and-tear that these proteins experience over time. Unlike other proteins in the body, ELLPs are not replaced when they incur aberrant chemical modifications and other damage.

Damage to the ELLPs weakens the ability of the three-dimensional transport channels that are composed of these proteins to safeguard the cell’s nucleus from toxins, says Martin Hetzer, a professor in Salk’s Molecular and Cell Biology Laboratory, who headed the research. These toxins may alter the cell’s DNA and thereby the activity of genes, resulting in cellular aging.

Funded by the Ellison Medical Foundation and the Glenn Foundation for Medical Research, Hetzer’s research group is the only lab in the world that is investigating the role of these transport channels, called the nuclear pore complex (NPC), in the aging process.

Previous studies have revealed that alterations in gene expression underlie the aging process. But, until the Hetzer lab’s discovery that mammals’ NPCs possess an Achilles’ heel that allows DNA-damaging toxins to enter the nucleus, the scientific community has had few solid clues about how these gene alterations occur.

"The fundamental defining feature of aging is an overall decline in the functional capacity of various organs such as the heart and the brain," says Hetzer. "This decline results from deterioration of the homeostasis, or internal stability, within the constituent cells of those organs. Recent research in several laboratories has linked breakdown of protein homeostasis to declining cell function."

The results that Hetzer and his team just report suggest that declining neuron function may originate in ELLPs that deteriorate as a result of damage over time.

"Most cells, but not neurons, combat functional deterioration of their protein components through the process of protein turnover, in which the potentially impaired parts of the proteins are replaced with new functional copies," says Hetzer.

"Our results also suggest that nuclear pore deterioration might be a general aging mechanism leading to age-related defects in nuclear function, such as the loss of youthful gene expression programs," he adds.

The findings may prove relevant to understanding the molecular origins of aging and such neurodegenerative disorders as Alzheimer’s disease and Parkinson’s disease.

In previous studies, Hetzer and his team discovered large filaments in the nuclei of neurons of old mice and rats, whose origins they traced to the cytoplasm. Such filaments have been linked to various neurological disorders including Parkinson’s disease. Whether the misplaced molecules are a cause, or a result, of the disease has not yet been determined.

Also in previous studies, Hetzer and his team documented age-dependent declines in the functioning of NPCs in the neurons of healthy aging rats, which are laboratory models of human biology.

Hetzer’s team includes his colleagues at the Salk Institute as well as John Yates III, a professor in the Department of Chemical Physiology of The Scripps Research Institute.

When Hetzer decided three years ago to investigate whether the NPC plays a role in initiating or contributing to the onset of aging and certain neurodegenerative diseases, some members of the scientific community warned him that such a study was too bold and would be difficult and expensive to conduct. But Hetzer was determined despite the warnings.

Source: ScienceDaily

Feb 4, 2012 · 6 notes
#science #neuroscience #psychology #disease
Feb 4, 2012 · 16 notes
#science #neuroscience #psychology #brain #brain wave
Human Brains Wire Up Slowly but Surely

by Jon Cohen on 1 February 2012, 6:00 PM

Synaptic division. Compared with chimpanzees, human children slowly wire their brains. Credit: Fotosearch

As the father-to-son exchange in the old Cat Stevens song advised, “take your time, think a lot, … think of everything you’ve got.” Turns out the mellow ’70s folkie had stumbled upon what may explain a key feature of our brains that sets us apart from our closest relatives: We unhurriedly make synaptic connections through much of our early childhoods, and this plasticity enables us to slowly wire our brains based on our experiences. Given that humans and chimpanzees share 98.8% of the same genes, researchers have long wondered what drives our unique cognitive and social skills. Yes, chimpanzees are smart and cooperative to a degree, but we clearly outshine them when it comes to abstract thinking, self-regulation, assimilation of cultural knowledge, and reasoning abilities. Now a study that looks at postmortem brain samples from humans, chimpanzees, and macaques collected from before birth to up to the end of the life span for each of these species has found a key difference in the expression of genes that control the development and function of synapses, the connections among neurons through which information flows.

As researchers describe in a report published online today in Genome Research, they analyzed the expression of some 12,000 genes—part of the so-called transcriptome—from each species. They found 702 genes in the prefrontal cortex (PFC) of humans that had a pattern of expression over time that differed from the two other species. (The PFC plays a central role in social behavior, working toward goals, and reasoning.) By comparison, genes in the chimpanzee PFC at various life stages had only 55 unique expression patterns—12-fold fewer than found in humans.

The genes the researchers analyzed have myriad functions. But when the researchers created five modules that lumped together genes that were co-expressed, they found that the module in humans that’s most closely tied to synapse formation and function had a “drastically” different developmental trajectory. These genes were turned on high from just after birth until about 5 years of age; the same genes in chimpanzees and macaques began to stop expressing themselves shortly after birth. “We might have discovered one of the differences that makes human brains work differently from chimpanzees and macaques,” says lead researcher Philipp Khaitovich, an evolutionary biologist who works at both the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and the Chinese Academy of Sciences (CAS) in Shanghai, China.
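
The module step the researchers describe, grouping genes whose expression rises and falls together across developmental time points, can be sketched roughly as correlation-based clustering. The data, the r > 0.8 threshold, and the greedy grouping below are illustrative assumptions, not the study's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative expression matrix: rows = genes, columns = developmental
# time points. Two planted co-expression patterns plus noise stand in
# for real transcriptome data.
ages = np.linspace(0, 40, 12)                                # hypothetical ages in years
rising = np.tile(ages / ages.max(), (5, 1))                  # genes that ramp up with age
early_peak = np.tile(np.exp(-(ages - 5) ** 2 / 50), (5, 1))  # genes peaking around age 5
expr = np.vstack([rising, early_peak]) + rng.normal(0, 0.02, (10, 12))

# Greedy module assignment: each unassigned gene seeds a module that
# absorbs every other unassigned gene whose trajectory correlates with
# the seed above the (arbitrary) threshold.
corr = np.corrcoef(expr)
modules, assigned = [], set()
for seed in range(expr.shape[0]):
    if seed in assigned:
        continue
    members = [g for g in range(expr.shape[0])
               if g not in assigned and corr[seed, g] > 0.8]
    assigned.update(members)
    modules.append(members)

print(len(modules))  # the two planted patterns come out as two modules
```

Real analyses use more sophisticated machinery (and thousands of genes), but the idea is the same: genes that move together across development get lumped into one module.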

The researchers, including Svante Pääbo of the Leipzig institute and Xiling Liu of CAS, went a step further and actually counted more than 7000 synapses visible in electron micrographs from the three species at different ages. They found that the number of synapses in macaques and chimpanzees skyrocketed shortly after birth but did not peak in humans until about 4 years of age. “Humans have much more time to form synaptic connections,” Khaitovich concludes.
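
Locating a developmental peak like the roughly four-year human synapse maximum is, in outline, a curve-fitting exercise: fit a smooth trajectory to counts versus age and read off the maximum. The numbers below are invented for illustration; only the shape (early rise, later decline) mirrors the pattern described above:

```python
import numpy as np

# Made-up synapse-density counts versus age in years, shaped like the
# human pattern described in the article: a peak in early childhood.
age = np.array([0.1, 0.5, 1, 2, 3, 4, 5, 8, 15, 30], dtype=float)
density = np.array([20, 35, 55, 75, 88, 92, 90, 80, 70, 60], dtype=float)

# Fit a smooth trajectory (cubic polynomial) and locate its maximum on
# a fine age grid -- one simple way to estimate when synapse numbers
# top out.
coeffs = np.polyfit(age, density, deg=3)
grid = np.linspace(age.min(), age.max(), 2000)
fitted = np.polyval(coeffs, grid)
peak_age = grid[np.argmax(fitted)]
print(round(peak_age, 1))
```

The same fit-and-compare logic, applied per species, is how one would quantify the claim that macaque and chimpanzee counts peak shortly after birth while human counts peak years later.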

In their analyses, the researchers factored in that humans have much longer life spans than the other species and develop and mature more slowly in general. Their findings still stood out, even when adjusting for this developmental delay.

The work builds on behavioral evidence that showed the advantages of a prolonged childhood, as well as several other studies that have found differences in chimpanzee and human genes involved with synapse formation and function. But no group has ever done such a thorough comparative, longitudinal analysis of the brain transcriptomes of these three species, says Todd Preuss, a neuroscientist at the Yerkes National Primate Research Center in Atlanta. “The whole thing is a technical tour de force,” Preuss says.

Nenad Sestan, a neurobiologist at Yale University who published a comprehensive analysis of the transcriptome of human brains from embryos to late adulthood in the 27 October 2011 issue of Nature, says the new work “is novel and provocative.” Sestan says to clarify differences between the species, the field now needs to examine more brain regions “to have a clearer idea of how specific this may be to the dorsolateral prefrontal cortex.”

The findings from Khaitovich and colleagues promise to spark future studies that address profound questions about everything from evolution to gene regulation. For example, they suggest in their report that the differences they found may also separate us from Neandertals, as evidence suggests that these extinct humans had faster cranial and dental development than modern humans.

Neurologist Eric Courchesne of the University of California, San Diego, says the new findings also mesh with his own studies of autism and brain overgrowth. Courchesne has found that the brains of autistic children grow more quickly than normal, which he theorizes prevents them from having enough experiences to properly wire neurons. “This is an absolutely fascinating study that will have great importance for advancing understanding of human disorders of early brain development as well as illuminating the evolutionary changes in neural development,” Courchesne says.

Source: ScienceNow

Feb 4, 2012 · 1 note
#science #neuroscience #psychology #brain
New procedure repairs severed nerves in minutes, restoring limb use in days or weeks

February 3rd, 2012 in Neuroscience 

American scientists believe a new procedure to repair severed nerves could result in patients recovering in days or weeks, rather than months or years. The team used a cellular mechanism similar to that used by many invertebrates to repair damage to nerve axons. Their results are published today in the Journal of Neuroscience Research.

"We have developed a procedure which can repair severed nerves within minutes so that the behavior they control can be partially restored within days and often largely restored within two to four weeks," said Professor George Bittner from the University of Texas. "If further developed in clinical trials this approach would be a great advance on current procedures that usually imperfectly restore lost function within months at best."

The team studied the mechanisms all animal cells use to repair damage to their membranes and focused on invertebrates, which have a superior ability to regenerate nerve axons compared to mammals. An axon is a long extension arising from a nerve cell body that communicates with other nerve cells or with muscles.

This research success arises from Bittner’s discovery that nerve axons of invertebrates which have been severed from their cell body do not degenerate within days, as happens with mammals, but can survive for months, or even years.

The severed proximal nerve axon in invertebrates can also reconnect with its surviving distal nerve axon to produce much quicker and much better restoration of behaviour than occurs in mammals.

"Severed invertebrate nerve axons can reconnect proximal and distal ends of severed nerve axons within seven days, allowing a rate of behavioural recovery that is far superior to mammals," said Bittner. "In mammals the severed distal axonal stump degenerates within three days and it can take nerve growths from proximal axonal stumps months or years to regenerate and restore use of muscles or sensory areas, often with less accuracy and with much less function being restored."

The team described their success in applying this process to rats in two research papers published today. The team were able to repair severed sciatic nerves in the upper thigh, with results showing the rats were able to use their limb within a week and had much function restored within 2 to 4 weeks, in some cases to almost full function.

"We used rats as an experimental model to demonstrate how severed nerve axons can be repaired. Without our procedure, the return of nearly full function rarely comes close to happening," said Bittner. "The sciatic nerve controls all muscle movement of the leg of all mammals and this new approach to repairing nerve axons could almost-certainly be just as successful in humans."

To explore the long-term implications and medical uses of this procedure, physicians and other scientist collaborators at Harvard Medical School and Vanderbilt Medical School and Hospitals are conducting studies to obtain approval to begin clinical trials.

"We believe this procedure could produce a transformational change in the way nerve injuries are repaired," concluded Bittner.

Provided by Wiley

"New procedure repairs severed nerves in minutes, restoring limb use in days or weeks." February 3rd, 2012. http://medicalxpress.com/news/2012-02-procedure-severed-nerves-minutes-limb.html

Feb 4, 2012 · 4 notes
#science #neuroscience #psychology
Renowned physicist invents microscope that can peer at living brain cells

February 3, 2012

Schematic drawing of the upright STED microscope used for the experiments. Image: Science, DOI:10.1126/science.1215369

(PhysOrg.com) — Ever since scientists began studying the brain, they’ve wanted to get a better look at what was going on. Researchers have poked and prodded and looked at dead cells under electron microscopes, but never before have they been able to get high resolution microscopic views of actual living brain cells as they function inside of a living animal. Now, thanks to work by physicist Stefan Hell and his colleagues at the Max Planck Institute in Germany, that dream is realized. In a paper published in Science, Hell and his team describe the workings of their marvelous discovery.

Hell (which in German means “bright”) and others at the Institute have been working for years on ultra high resolution microscopes that go by the name “stimulated emission depletion” or STED microscopes. Now, they’ve taken their work to a whole new level by cutting away a small portion of a mouse’s skull and replacing it with a glass window and then placing their latest STED microscope against the glass to peer inside. To make it easier to see what is what, the team first genetically altered the mouse to make certain brain cells fluorescent. Then, to allow for focusing exclusively on just those cells that are lit up, they added software to the microscope to blot out anything that was not lit up. The result is super high resolution real time imagery of the neurons that exist on the exterior part of a living mouse brain. 

(video)

STED time-lapse recording of a single spine at an interval of 10 seconds. The measurement includes 128 z-stacks consisting of 5 slices each. Most of the rapid remodeling of the spine head appears continuous and smooth at this frame rate. No damage is observed at the dendrite or the spine after recording a total of 640 slices. The movie was acquired in a different experiment than the spines in Fig.1. Scale bar = 1µm. Video: DOI:10.1126/science.1215369

The new microscope provides clear resolution down to 70 nanometers, four times better than anything previously achieved, and enough to allow scientists to see the actual movement of dendritic spines, which may help researchers understand why they move.
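
For a sense of scale, the diffraction barrier that STED beats can be estimated from the standard formulas: the Abbe limit d = λ/(2·NA), and the STED scaling d = λ/(2·NA·√(1 + I/I_sat)), where I/I_sat is the depletion-beam intensity relative to the fluorophore's saturation intensity. The wavelength, numerical aperture, and saturation factor below are illustrative assumptions, not values from the paper:

```python
import math

# Illustrative assumptions (not values from the Science paper):
wavelength_nm = 590.0   # hypothetical fluorescence wavelength
na = 1.3                # hypothetical objective numerical aperture

def abbe_limit(lam, na):
    """Classical diffraction-limited resolution: d = lambda / (2 NA)."""
    return lam / (2 * na)

def sted_limit(lam, na, saturation_factor):
    """STED resolution: d = lambda / (2 NA sqrt(1 + I/I_sat))."""
    return lam / (2 * na * math.sqrt(1 + saturation_factor))

d_conf = abbe_limit(wavelength_nm, na)       # conventional limit, ~227 nm here
d_sted = sted_limit(wavelength_nm, na, 10)   # depletion beam at 10x I_sat
print(round(d_conf), round(d_sted))
```

With these (assumed) numbers the conventional limit of about 227 nm drops below 70 nm, which is why cranking up the depletion intensity lets STED resolve structures far smaller than light microscopy normally allows.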

It is likely that researchers will find many varied uses for the new microscope. One prominent area will almost certainly involve looking into what psychiatric drugs are really doing within synapses, perhaps leading to breakthroughs in pharmaceutical drugs that are better able to target specific illnesses.

One downside to any new scientific breakthrough, however, is the natural tendency of many to move from excitement to wondering what will come next. In this case, Hell and his team have already started contemplating ways to allow researchers to study any cell in the living brain at such high resolution, not just those that lie on the surface.

More information: Nanoscopy in a Living Mouse Brain, Science 3 February 2012: Vol. 335 no. 6068 p. 551. DOI: 10.1126/science.1215369

"Renowned physicist invents microscope that can peer at living brain cells." February 3rd, 2012. http://www.physorg.com/news/2012-02-renowned-physicist-microscope-peer-brain.html

Feb 4, 2012
#brain #science #neuroscience #psychology #physics
Feb 3, 2012 · 1 note
#placebo #placebo effect #brain
Placebo Effect: New Study Shows How to Boost the Power of Pain Relief, Without Drugs

ScienceDaily (Feb. 3, 2012) — Placebos reduce pain by creating an expectation of relief. Distraction — say, doing a puzzle — relieves it by keeping the brain busy. But do they use the same brain processes? Neuroimaging suggests they do. When applying a placebo, scientists see activity in the dorsolateral prefrontal cortex. That’s the part of the brain that controls high-level cognitive functions like working memory and attention — which is what you use to do that distracting puzzle.

Now a new study challenges the theory that the placebo effect is a high-level cognitive function. The authors — Jason T. Buhle, Bradford L. Stevens, and Jonathan J. Friedman of Columbia University and Tor D. Wager of the University of Colorado Boulder — reduced participants’ pain in two ways: either by giving them a placebo, or by giving them a difficult memory task as a distraction. Each reduced pain on its own. But when they put the two together, “the level of pain reduction that people experienced added up. There was no interference between them,” says Buhle. “That suggests they rely on separate mechanisms.” The findings, published in Psychological Science, a journal of the Association for Psychological Science, could help clinicians maximize pain relief without drugs.

In the study, 33 participants came in for three separate sessions. In the first, experimenters applied heat to the skin with a little metal plate and calibrated each individual’s pain perceptions. In the second session, some of the people applied an ordinary skin cream they were told was a powerful but safe analgesic. The others put on what they were told was a regular hand cream. In the placebo-only trials, participants stared at a cross on the screen and rated the pain of numerous applications of heat — the same level, though they were told it varied. For other trials they performed a tough memory task — distraction and placebo simultaneously. For the third session, those who’d had the plain cream got the “analgesic” and vice versa. The procedure was the same.

The results: With either the memory task or the placebo alone, participants felt less pain than during the trials when they just stared at the cross. Together, the two effects added up; they didn’t interact or interfere with each other. The data suggest that the placebo effect does not require executive attention or working memory.
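
The “effects added up, no interaction” result is the purely additive case of a 2×2 factorial design. A toy check with made-up mean pain ratings shows what a zero interaction contrast looks like:

```python
# Made-up mean pain ratings (0-100) for the four conditions of a 2x2
# design: (placebo: no/yes) x (distraction: no/yes). Purely illustrative.
ratings = {
    ("no_placebo", "no_distraction"): 60.0,
    ("placebo",    "no_distraction"): 52.0,   # placebo alone: -8
    ("no_placebo", "distraction"):    50.0,   # distraction alone: -10
    ("placebo",    "distraction"):    42.0,   # both together: -18
}

baseline = ratings[("no_placebo", "no_distraction")]
placebo_effect = baseline - ratings[("placebo", "no_distraction")]
distraction_effect = baseline - ratings[("no_placebo", "distraction")]
combined_effect = baseline - ratings[("placebo", "distraction")]

# Interaction contrast: zero means the two effects are purely additive,
# i.e. neither mechanism interferes with the other.
interaction = combined_effect - (placebo_effect + distraction_effect)
print(placebo_effect, distraction_effect, combined_effect, interaction)
```

Had the placebo drawn on the same executive resources as the memory task, the combined reduction would have fallen short of the sum and the interaction term would be negative; in the study it was not.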

So what about that neuroimaging? “Neuroimaging is great,” says Buhle, “but because each brain region does many things, when you see activation in a particular area, you don’t know what cognitive process is driving it.” This study tested the theory about how placebos work with direct behavioral observation.

The findings are promising for pain relief. Clinicians use both placebos and distraction — for instance, virtual reality in burn units. But they weren’t sure if one might diminish the other’s efficacy. “This study shows you can use them together,” says Buhle, “and get the maximum bang for your buck without medications.”

Source: ScienceDaily

Feb 3, 2012 · 1 note
#science #neuroscience #psychology #placebo
Schizophrenia: When Hallucinatory Voices Suppress Real Ones, New Electronic Application May Help

ScienceDaily (Feb. 3, 2012) — When a patient afflicted with schizophrenia hears inner voices, something is taking place inside the brain that prevents the individual from perceiving real voices. A simple electronic application may help the patient learn to shift focus.

 

Image captures of the brain show how neurons are activated in healthy control subjects when hearing actual voices (top row) whereas activation fails to occur in patients who experience auditory hallucinations. (Credit: Kenneth Hugdahl)

"The patient experiences the inner voices as 100 per cent real, just as if someone was standing next to him and speaking" explains Professor Kenneth Hugdahl of the University of Bergen. "At the same time, he can’t hear voices of others actually present in the same room."

Auditory hallucinations are one of the most common symptoms associated with schizophrenia.

Neural activity ceases

Dr Hugdahl’s research group has made use of a variety of neuroimaging techniques, including functional magnetic resonance imaging (fMRI), to enable them quite literally to see what happens inside the brain when the inner voices make their presence known. The project received funding under the NevroNor national initiative on neuroscientific research, administered under the auspices of the Research Council of Norway.

Images of patients’ brains reveal a spontaneous activation of neurons in a particular area of the brain — specifically the rear, upper region of the left temporal lobe. This is the area responsible for speech perception, and when healthy people hear speech it becomes activated. So what happens when patients with schizophrenia hear a real voice and a hallucinatory one at the same time?

"It would be natural to assume that neural activity would increase somewhat — even twofold. But quite the opposite takes place; we actually observed that the activity ceased altogether," states Professor Hugdahl.

Losing contact with the outside world

In order to learn more about what was happening, Hugdahl and his colleagues Kristiina Kompus and René Westerhausen carried out a meta-analysis of 23 studies. These studies focused either on spontaneous inner-voice triggered neural activation in subjects with schizophrenia or the stimulatory reaction prompted by actual sounds in both healthy and schizophrenic subjects.

It emerged that many researchers had observed either that a spontaneous activation of neurons occurs in patients hearing inner voices or that the patients’ perception of actual voices becomes suppressed when these are heard simultaneously with inner voices. No one had seen the connection between these findings.

"Previously, we thought these were two separate phenomena. But our analyses revealed that the one causes the other: when neurons become activated by inner voices it inhibits perception of outside speech. The neurons become ‘preoccupied’ and can’t ‘process’ voices from the outside," explains Professor Hugdahl.

"This may explain why schizophrenic patients close themselves off so completely and lose touch with the outside world when experiencing hallucinations," he suggests.

Electronic app designed to improve impulse control

Hugdahl and his colleagues made yet another discovery that may well help explain how the lives of these individuals become consumed by inner voices. It turns out that the frontal lobe in the brains of schizophrenia patients does not function exactly the way it should. As a result, these patients have a lesser degree of impulse control and are unable to filter out their inner voices.

"Every one of us hears inner voices or melodies from time to time. The difference between non-afflicted individuals and schizophrenia patients is that the former manage to tune these out better," the professor points out.

If patients could learn to stifle inner noise, it could have a huge impact on our ability to treat schizophrenia, he states. To this end, Professor Hugdahl’s research group has developed an application that can be used on mobile phones and other simple electronic devices, to help patients improve their filters.

Wearing headphones, the patient is exposed to simple speech sounds with different sounds played in each ear. The task is to practice hearing the sound in one ear while blocking out sound in the other. The application has only been tested on two patients with schizophrenia so far. The response from these patients is promising, Dr Hugdahl relates.
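
The training task described, a different sound in each ear, is a dichotic listening stimulus; structurally it is just a two-channel sample buffer. The sketch below uses pure tones as stand-ins for the speech sounds in the real application:

```python
import numpy as np

SAMPLE_RATE = 44_100  # CD-quality samples per second

def dichotic_stimulus(freq_left, freq_right, seconds=1.0):
    """Return an (N, 2) float array: a different pure tone in each ear.

    Real training software would play recorded speech syllables; sine
    tones here just illustrate the two-channel structure.
    """
    t = np.arange(int(SAMPLE_RATE * seconds)) / SAMPLE_RATE
    left = 0.5 * np.sin(2 * np.pi * freq_left * t)
    right = 0.5 * np.sin(2 * np.pi * freq_right * t)
    return np.column_stack([left, right])

stim = dichotic_stimulus(440.0, 554.4)  # e.g. one tone per ear
print(stim.shape)  # one row per sample, one column per ear
```

Handing such a buffer to any stereo audio API delivers the competing streams independently to each ear, which is exactly what the attention-training exercise requires.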

"The voices are still there, but the test subjects feel that they have control over the voices instead of the other way around. The patient feels it is a breakthrough since it means he can actively shift his focus from the inner voices over to the sounds coming from the outside," the professor explains.

Source: ScienceDaily

Feb 3, 2012 · 7 notes
#science #neuroscience #psychology #brain #schizophrenia
Noise Exposure Can Cause Long-Lasting Changes To Sensory Pathways; Touch-Sensing Nerve Cells May Lead To Future Tinnitus Treatments

Article Date: 03 Feb 2012 - 0:00 PST

We all know that it can take a little while for our hearing to bounce back after listening to our iPods too loud or attending a raucous concert. But new research at the University of Michigan Health System suggests over-exposure to noise can actually cause more lasting changes to our auditory circuitry - changes that may lead to tinnitus, commonly known as ringing in the ears.

U-M researchers previously demonstrated that after hearing damage, touch-sensing “somatosensory” nerves in the face and neck can become overactive, seeming to overcompensate for the loss of auditory input in a way the brain interprets - or “hears” - as noise that isn’t really there.

The new study, which appears in The Journal of Neuroscience, found that somatosensory neurons maintain a high level of activity following exposure to loud noise, even after hearing itself returns to normal.

The findings were made in guinea pigs, but mark an important step toward potential relief for people plagued by tinnitus, says lead investigator Susan E. Shore, Ph.D., of U-M’s Kresge Hearing Research Institute and a professor of otolaryngology and molecular and integrative physiology at the U-M Medical School.

“The animals that developed tinnitus after a temporary loss in their hearing after loud noise exposure were the ones who had sustained increases in activity in these neural pathways,” Shore says. “In the future it may be possible to treat tinnitus patients by dampening the hyperactivity by reprogramming these auditory-touch circuits in the brain.”

In normal hearing, a part of the brain called the dorsal cochlear nucleus is the first stop for signals arriving from the ear via the auditory nerve. But it’s also a hub where “multitasking” neurons process other sensory signals, such as touch, together with hearing information.

During hearing loss, the other sensory signals entering the dorsal cochlear nucleus are amplified, Shore’s earlier research found. This overcompensation by the somatosensory neurons, which carry information about touch, vibration, skin temperature and pain, is believed to fuel tinnitus in many cases.

Tinnitus affects up to 50 million people in the United States and millions more worldwide, according to the American Tinnitus Association. It can range from intermittent and mildly annoying to chronic, severe and debilitating. There is no cure.

It especially affects baby boomers, who, as they reach an age at which hearing tends to diminish, increasingly find that tinnitus moves in. The condition most commonly occurs with hearing loss, but can also follow head and neck trauma, such as after an auto accident, or dental work. Tinnitus is the number one disability afflicting members of the armed forces.

The involvement of touch sensing (or “somatosensory”) nerves in the head and neck explains why many tinnitus sufferers can change the volume and pitch of the sound by clenching their jaw, or moving their head and neck, Shore explains.

While the new study builds on previous discoveries by Shore and her team, many aspects are new.

“This is the first research to show that, in the animals that developed tinnitus after hearing returned to normal, increased excitation from the somatosensory nerves in the head and neck continued. This dovetails with our previous research, which suggests this somatosensory excitation is a major component of tinnitus,” says Shore, who serves on the scientific advisory committee of the American Tinnitus Association.

“The better we understand the underlying causes of tinnitus, the better we’ll be able to develop new treatments,” she adds.

Source: Medical News Today 

Feb 3, 2012 · 30 notes
#science #neuroscience #psychology #ear #tinnitus
Feb 3, 2012 · 15 notes
Feb 3, 2012 · 13 notes
Investigating The Neural Basis Of Prosopagnosia

Article Date: 03 Feb 2012 - 0:00 PST

For Bradley Duchaine, there is definitely more than meets the eye where faces are concerned.

With colleagues at Birkbeck College in the University of London, he is investigating the process of facial recognition, seeking to understand the complexity of what is actually taking place in the brain when one person looks at another.

His studies target people who display an inability to recognize faces, a condition long known as prosopagnosia. Duchaine is trying to understand the neural basis of the condition while also making inferences about what is going wrong in terms of information processing: where, in the stages our brains go through to recognize a face, the system breaks down. A paper published in Brain details the most recent experimental results.

“We refer to prosopagnosia as a ‘selective’ deficit of face recognition, in that other cognitive processes do not seem to be affected,” explains Duchaine, an associate professor of psychological and brain sciences. “[People with the condition] might be able to recognize voices perfectly, which demonstrates that it is really a visual problem. In what we call pure cases, people can recognize cars perfectly, and they can recognize houses perfectly. It is just faces that are a problem.”

The condition may be acquired as the result of a stroke, for example. But in the recent study, Duchaine focused on developmental prosopagnosia, in which a person fails to develop facial recognition abilities.

“Other parts of the brain develop apparently normally,” Duchaine says. “These are intelligent people who have good jobs and get along fine but they can’t recognize faces.”

The primary experimental tool in this study was the electroencephalogram (EEG), which has the advantage of providing excellent temporal resolution - pinpointing the timing of the brain’s electrical response to a given stimulus.

Duchaine and his colleagues placed a series of electrodes around the scalps of prosopagnosics and showed them images of famous faces and non-famous faces, recording their responses. As expected, many of the famous faces were not recognized.

They found an electrical response about 250 milliseconds (ms) after the faces were seen. In the control group of non-prosopagnosics, a clear difference was observed between their responses to famous and non-famous faces. In half of the prosopagnosics there was no such difference. Surprisingly, however, the other half of the prosopagnosic test subjects did show one.

“On the many trials where half failed to categorize a famous face as familiar, they nevertheless showed an EEG difference around 250ms after stimulus presentation between famous and non-famous faces like normal subjects do. Normal subjects also show a difference between famous and non-famous about 600ms after presentation, but the prosopagnosics did not show this difference,” Duchaine observes.

This pattern of results suggests the prosopagnosics unconsciously recognized the famous faces at an early stage (250ms) but this information was lost by the later stage (600ms). Duchaine concludes that even though they are not consciously aware that this is a famous face, some part of their brain at this stage in the process is aware and is recognizing that face, a phenomenon termed covert face recognition.
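The logic of the comparison - averaging trial responses within a time window around 250 ms and another around 600 ms, then contrasting famous against non-famous faces - can be sketched on synthetic data. Everything below (effect sizes, trial counts, noise level) is invented for illustration and is not Duchaine's data; it simply reproduces the covert-recognition pattern of an effect at ~250 ms with none at ~600 ms.

```python
import random

random.seed(0)
FS = 1000                                   # 1 kHz sampling
times = [i / FS for i in range(-100, 800)]  # -100 ms .. 799 ms

def make_trial(early_effect):
    """One synthetic trial: unit-variance noise plus an extra deflection
    near 250 ms for famous faces. Purely illustrative numbers."""
    return [random.gauss(0, 1) + (early_effect if 0.2 <= t < 0.3 else 0.0)
            for t in times]

def window_mean(trials, t_start, t_end):
    """Average amplitude across all trials within a time window (seconds)."""
    vals = [v for trial in trials
            for t, v in zip(times, trial) if t_start <= t < t_end]
    return sum(vals) / len(vals)

# Covert-recognition pattern: famous faces differ at ~250 ms, not at ~600 ms.
famous = [make_trial(2.0) for _ in range(40)]
nonfamous = [make_trial(0.0) for _ in range(40)]

early = window_mean(famous, 0.2, 0.3) - window_mean(nonfamous, 0.2, 0.3)
late = window_mean(famous, 0.55, 0.65) - window_mean(nonfamous, 0.55, 0.65)
print(f"difference ~250 ms: {early:.2f}   difference ~600 ms: {late:.2f}")
```

A group with a difference in the early window but not the late one would show the covert pattern; a group with no difference in either window would match the second subtype described below.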

He suggests that the other half of the prosopagnosics, who showed no difference between responses at 250ms, were experiencing a malfunction in their face processing system already at this early stage, pointing to a different type of prosopagnosia.

“The temporal lobe contains a number of face processing areas, so you can imagine there are many different ways that this system can malfunction. Not only can an area not work, connections between areas might not work yielding probably dozens of these different variants of this condition,” he surmises.

Covert recognition has been demonstrated in prosopagnosia acquired through brain damage, but Duchaine’s work is the first convincing demonstration of covert recognition in developmental prosopagnosia, the much more common form. 

Source: Medical News Today

Feb 3, 2012 · 2 notes
#science #brain #psychology #neuroscience #prosopagnosia
An Explanation For Why The Brain May Become More Reluctant To Function As We Grow Older

Article Date: 03 Feb 2012 - 0:00 PST

New findings, led by neuroscientists at the University of Bristol and published this week in the journal Neurobiology of Aging, reveal a novel mechanism through which the brain may become more reluctant to function as we grow older.

It is not fully understood why the brain’s cognitive functions, such as memory and speech, decline as we age, although work published this year suggests cognitive decline can be detectable before 50 years of age. The research, led by Professor Andy Randall and Dr Jon Brown from the University’s School of Physiology and Pharmacology, identified a novel cellular mechanism underpinning changes to the activity of neurones which may underlie cognitive decline during normal healthy aging.

The brain largely uses electrical signals to encode and convey information. Modifications to this electrical activity are likely to underpin age-dependent changes to cognitive abilities.

The researchers examined the brain’s electrical activity by making recordings of electrical signals in single cells of the hippocampus, a structure with a crucial role in cognitive function. In this way they characterised what is known as “neuronal excitability” - this is a descriptor of how easy it is to produce brief, but very large, electrical signals called action potentials; these occur in practically all nerve cells and are absolutely essential for communication within all the circuits of the nervous system.

Action potentials are triggered near the neurone’s cell body and once produced travel rapidly through the massively branching structure of the nerve cell, along the way activating the synapses the nerve cell makes with the numerous other nerve cells to which it is connected.

The Bristol group identified that in the aged brain it is more difficult to make hippocampal neurones generate action potentials. Furthermore, they demonstrated that this relative reluctance to produce action potentials arises from changes to the activation properties of membrane proteins called sodium channels, which mediate the rapid upstroke of the action potential by allowing a flow of sodium ions into neurones.
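One way to picture how changed activation properties make action potentials "harder work" is the standard Boltzmann description of voltage-gated channel activation: if the half-activation voltage shifts in the depolarizing direction, fewer channels open at any given membrane potential, so more input is needed to reach threshold. The sketch below uses this textbook relation with made-up parameter values; they are illustrative assumptions, not measurements from the Bristol study.

```python
import math

def na_activation(v_mv, v_half, slope=6.0):
    """Steady-state sodium channel activation as a Boltzmann curve.
    v_half is the voltage of half-maximal activation (mV); slope sets
    the steepness. All values here are illustrative, not from the paper."""
    return 1.0 / (1.0 + math.exp((v_half - v_mv) / slope))

v = -40.0  # a fixed subthreshold depolarization, in mV
young = na_activation(v, v_half=-30.0)
aged = na_activation(v, v_half=-25.0)   # hypothetical depolarizing shift
print(f"fraction of channels activated at {v} mV: "
      f"young={young:.3f}, aged={aged:.3f}")
```

With fewer channels opening at the same depolarization, a stronger stimulus is needed to trigger the regenerative sodium influx of the action potential - the "reluctance" the recordings revealed.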

Professor Randall, Professor in Applied Neurophysiology said: “Much of our work is about understanding dysfunctional electrical signalling in the diseased brain, in particular Alzheimer’s disease. We began to question, however, why even the healthy brain can slow down once you reach my age. Previous investigations elsewhere have described age-related changes in processes that are triggered by action potentials, but our findings are significant because they show that generating the action potential in the first place is harder work in aged brain cells.

“Also, by identifying sodium channels as the likely culprit for this reluctance to produce action potentials, our work even points to ways in which we might be able to modify age-related changes to neuronal excitability, and by inference cognitive ability.”

Source: Medical News Today

Feb 3, 2012
#science #neuroscience #psychology #brain
Gene regulator in brain's executive hub tracked across lifespan

February 2nd, 2012 in Genetics


A representative gene shows how sex can influence levels of methylation across the lifespan. Each dot represents a different brain. Credit: Barbara Lipska, Ph.D., NIMH Clinical Brain Disorders Branch

For the first time, scientists have tracked the activity, across the lifespan, of an environmentally responsive regulatory mechanism that turns genes on and off in the brain’s executive hub. Among key findings of the study by National Institutes of Health scientists: genes implicated in schizophrenia and autism turn out to be members of a select club of genes in which regulatory activity peaks during an environmentally-sensitive critical period in development. The mechanism, called DNA methylation, abruptly switches from off to on within the human brain’s prefrontal cortex during this pivotal transition from fetal to postnatal life. As methylation increases, gene expression slows down after birth.

Epigenetic mechanisms like methylation leave chemical instructions that tell genes what proteins to make - what kind of tissue to produce or what functions to activate. Although not part of our DNA sequence, these instructions are inherited from our parents. But they are also influenced by environmental factors, allowing for change throughout the lifespan.

“Developmental brain disorders may be traceable to altered methylation of genes early in life,” explained Barbara Lipska, Ph.D., a scientist in the NIH’s National Institute of Mental Health (NIMH) and lead author of the study. “For example, genes that code for the enzymes that carry out methylation have been implicated in schizophrenia. In the prenatal brain, these genes help to shape developing circuitry for learning, memory and other executive functions which become disturbed in the disorders. Our study reveals that methylation in a family of these genes changes dramatically during the transition from fetal to postnatal life - and that this process is influenced by methylation itself, as well as genetic variability. Regulation of these genes may be particularly sensitive to environmental influences during this critical early life period.”

Lipska and colleagues report on the ebb and flow of the human prefrontal cortex’s (PFC) epigenome across the lifespan, February 2, 2012, online in the American Journal of Human Genetics.



Two representative genes show strikingly opposite trajectories of PFC methylation across the lifespan. Each dot represents a different brain. Usually, the more methylation, the less gene expression. Credit: Barbara Lipska, Ph.D., NIMH Clinical Brain Disorders Branch

“This new study reminds us that genetic sequence is only part of the story of development. Epigenetics links nurture and nature, showing us when and where the environment can influence how the genetic sequence is read,” said NIMH director Thomas R. Insel, M.D.

In a companion study published last October, the NIMH researchers traced expression of gene products in the PFC across the lifespan. The current study instead examined methylation at 27,000 sites within PFC genes that regulate such expression. Both studies examined post-mortem brains of non-psychiatrically impaired individuals ranging in age from two weeks after conception to 80 years old.

In most cases, when chemicals called methyl groups attach to regulatory regions of genes, they silence them. Usually, the more methylation, the less gene expression. Lipska’s team found that the overall level of PFC methylation is low prenatally when gene expression is highest and then switches direction at birth, increasing as gene expression plummets in early childhood. It then levels off as we grow older. But methylation in some genes shows an opposite trajectory. The study found that methylation is strongly influenced by gender, age and genetic variation.

For example, methylation levels differed between males and females in 85 percent of X chromosome sites examined, which may help to explain sex differences in disorders like autism and schizophrenia.

Different genes - and subsets of genes - methylate at different ages. Some of the suspect genes found to peak in methylation around birth code for enzymes, called methyltransferases, that are over-expressed in people with schizophrenia and bipolar disorder. This process is influenced, in turn, by methylation in other genes, as well as by genetic variation. So genes associated with risk for such psychiatric disorders may influence gene expression through methylation in addition to inherited DNA.

Provided by National Institutes of Health

“Gene regulator in brain’s executive hub tracked across lifespan.” February 2nd, 2012. http://medicalxpress.com/news/2012-02-gene-brain-hub-tracked-lifespan.html

Feb 3, 2012
#science #neuroscience #psychology #brain #genetics
Feb 2, 2012 · 52 notes
Untangling the Mysteries of Alzheimer's

ScienceDaily (Feb. 2, 2012) — One of the most distinctive signs of the development of Alzheimer’s disease is a change in the behavior of a protein that neuroscientists call tau. In normal brains, tau is present in individual units essential to neuron health. In the cells of Alzheimer’s brains, by contrast, tau proteins aggregate into twisted structures known as “neurofibrillary tangles.” These tangles are considered a hallmark of the disease, but their precise role in Alzheimer’s pathology has long been a point of contention among researchers.

Now, University of Texas Medical Branch at Galveston researchers have found new evidence that confirms the significance of tau to Alzheimer’s. Instead of focusing on tangles, however, their work highlights the intermediary steps between a single tau protein unit and a neurofibrillary tangle — assemblages of two, three, four, or more tau proteins known as “oligomers,” which they believe are the most toxic entities in Alzheimer’s.

"What we discovered is that there are smaller structures that form before the neurofibrillary tangles, and they are much more toxic than the big structures," said Rakez Kayed, UTMB assistant professor and senior author of a paper on the work now online in the FASEB Journal. “And we established that they were toxic in real human brains, which is important to developing an effective therapy.”

According to Kayed, a key antibody developed at UTMB called T22 enabled the team to produce a detailed portrait of tau oligomer behavior in human brain tissue. Specifically designed to bond only to tau oligomers (and not lone tau proteins or neurofibrillary tangles), the antibody made it possible for the researchers to use a variety of analytical tools to compare samples of Alzheimer’s brain with samples of age-matched healthy brain.

"One thing that’s remarkable about this research is that before we developed this antibody, people couldn’t even see tau oligomers in the brain," Kayed said. "With T22, we were able to thoroughly characterize them, and also study them in human brain cells."

Among the researchers’ most striking findings: in some of the Alzheimer’s brains they examined, tau oligomer levels were as much as four times as high as those found in age-matched control brains.

Other experiments revealed specific biochemical behavior and structures taken on by oligomers, and demonstrated their presence outside neurons — in particular, on the walls of blood vessels.

"We think this is going to make a big impact scientifically, because it opens up a lot of new areas to study," Kayed said. "It also relates to our main focus, developing a cure for Alzheimer’s. And I find that very, very exciting."

Provided by University of Texas Medical Branch at Galveston

Source: ScienceDaily

Feb 2, 2012 · 14 notes
#Alzheimer's #brain #science #psychology #neuroscience
Scientists Have Now Discovered How Different Brain Regions Cooperate During Short-Term Memory

Article Date: 02 Feb 2012 - 1:00 PST

Holding information within one’s memory for a short while is a seemingly simple and everyday task. We use our short-term memory when remembering a new telephone number if there is nothing at hand to write it down with, or to find inside the store the beautiful dress we were just admiring in the shop window. Yet, despite the apparent simplicity of these actions, short-term memory is a complex cognitive act that entails the participation of multiple brain regions. However, whether and how different brain regions cooperate during memory has remained elusive. A group of researchers from the Max Planck Institute for Biological Cybernetics in Tübingen, Germany, has now come closer to answering this question. They discovered that oscillations between different brain regions are crucial in visually remembering things over a short period of time.

It has long been known that brain regions in the frontal part of the brain are involved in short-term memory, while processing of visual information occurs primarily at the back of the brain. However, to successfully remember visual information over a short period of time, these distant regions need to coordinate and integrate information.

To better understand how this occurs, scientists from the Max Planck Institute of Biological Cybernetics in the department of Nikos Logothetis recorded electrical activity both in a visual area and in the frontal part of the brain in monkeys. The scientists showed the animals identical or different images within short intervals while recording their brain activity. The animals then had to indicate whether the second image was the same as the first one.

The scientists observed that, in each of the two brain regions, brain activity showed strong oscillations in a certain set of frequencies called the theta-band. Importantly, these oscillations did not occur independently of each other, but synchronized their activity temporarily: “It is as if you have two revolving doors in each of the two areas. During working memory, they get in sync, thereby allowing information to pass through them much more efficiently than if they were out of sync,” explains Stefanie Liebe, the first author of the study, conducted in the team of Gregor Rainer in cooperation with Gregor Hörzer from the Technical University Graz. The more synchronized the activity was, the better the animals could remember the initial image. Thus, the authors were able to establish a direct relationship between what they observed in the brain and the performance of the animal.
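The "revolving doors" intuition corresponds to a standard synchronization measure, the phase-locking value: the mean resultant length of the phase differences between two band-limited signals. A minimal sketch on synthetic phase series follows; the frequencies, jitter, and drift rate are invented for illustration and are not taken from the study.

```python
import cmath
import math
import random

random.seed(1)

def phase_locking_value(phases_a, phases_b):
    """Mean resultant length of the phase differences between two signals:
    1.0 means perfectly locked oscillations, 0.0 means no consistent
    phase relationship."""
    s = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b))
    return abs(s) / len(phases_a)

# One second of hypothetical theta-band activity sampled at 200 Hz.
t = [i / 200 for i in range(200)]
visual = [2 * math.pi * 6.0 * ti for ti in t]   # 6 Hz "visual area" phase

# "In sync": the frontal phase tracks the visual phase with small jitter.
frontal_sync = [p + random.gauss(0, 0.3) for p in visual]
# "Out of sync": the frontal phase drifts at a slightly different rate.
frontal_drift = [2 * math.pi * 6.8 * ti + random.gauss(0, 0.3) for ti in t]

plv_sync = phase_locking_value(visual, frontal_sync)
plv_drift = phase_locking_value(visual, frontal_drift)
print(f"PLV in sync: {plv_sync:.2f}   PLV drifting: {plv_drift:.2f}")
```

A high value for the locked pair and a low value for the drifting pair mirrors the study's finding that trials with stronger theta synchronization went with better memory performance.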

The study highlights how synchronized brain oscillations are important for the communication and interaction of different brain regions. Almost all multi-faceted cognitive acts, such as visual recognition, arise from a complex interplay of specialized and distributed neural networks. How relationships between such distributed sites are established and how they contribute to represent and communicate information about external and internal events in order to attain a coherent percept or memory is still poorly understood.

Source: Medical News Today

Feb 2, 2012 · 30 notes
#brain #neuroscience #science #memory #psychology
Feb 2, 2012 · 15,386 notes
Just another pretty face: Professor investigates neural basis of prosopagnosia

February 1st, 2012 in Psychology & Psychiatry 

These are examples of famous faces and non-famous faces used in Bradley Duchaine’s prosopagnosia experiment. Paired famous and non-famous faces are shown in corresponding positions. Credit: Bradley Duchaine

For Bradley Duchaine, there is definitely more than meets the eye where faces are concerned.

With colleagues at Birkbeck College in the University of London, he is investigating the process of facial recognition, seeking to understand the complexity of what is actually taking place in the brain when one person looks at another.

His studies target people who display an inability to recognize faces, a condition long known as prosopagnosia. Duchaine is trying to understand the neural basis of the condition while also making inferences about what is going wrong in terms of information processing - where, in the stages our brains go through to recognize a face, the system is breaking down. A paper published in Brain details the most recent experimental results.

"We refer to prosopagnosia as a ‘selective’ deficit of face recognition, in that other cognitive processes do not seem to be affected," explains Duchaine, an associate professor of psychological and brain sciences. "[People with the condition] might be able to recognize voices perfectly, which demonstrates that it is really a visual problem. In what we call pure cases, people can recognize cars perfectly, and they can recognize houses perfectly. It is just faces that are a problem."

The condition may be acquired as the result of a stroke, for example. But in the recent study, Duchaine focused on developmental prosopagnosia, in which a person fails to develop facial recognition abilities.

"Other parts of the brain develop apparently normally," Duchaine says. "These are intelligent people who have good jobs and get along fine but they can’t recognize faces."

The primary experimental tool in this experiment was the electroencephalogram (EEG), which has the advantage of providing excellent temporal resolution - pinpointing the timing of the brain’s electrical response to a given stimulus.

Duchaine and his colleagues placed a series of electrodes around the scalps of prosopagnosics and showed them images of famous faces and non-famous faces, recording their responses. As expected, many of the famous faces were not recognized.

They found an electrical response about 250 milliseconds (ms) after the faces were seen. In the control group of non-prosopagnosics, a clear difference was observed between their responses to famous and non-famous faces. In half of the prosopagnosics there was no such difference. Surprisingly, however, the other half of the prosopagnosic test subjects did show one.

"On the many trials where half failed to categorize a famous face as familiar, they nevertheless showed an EEG difference around 250ms after stimulus presentation between famous and non-famous faces like normal subjects do. Normal subjects also show a difference between famous and non-famous about 600ms after presentation, but the prosopagnosics did not show this difference," Duchaine observes.

This pattern of results suggests the prosopagnosics unconsciously recognized the famous faces at an early stage (250ms) but this information was lost by the later stage (600ms). Duchaine concludes that even though they are not consciously aware that this is a famous face, some part of their brain at this stage in the process is aware and is recognizing that face, a phenomenon termed covert face recognition.

He suggests that the other half of the prosopagnosics, who showed no difference between responses at 250ms, were experiencing a malfunction in their face processing system already at this early stage, pointing to a different type of prosopagnosia.

"The temporal lobe contains a number of face processing areas, so you can imagine there are many different ways that this system can malfunction. Not only can an area not work, connections between areas might not work yielding probably dozens of these different variants of this condition," he surmises.

Covert recognition has been demonstrated in prosopagnosia acquired through brain damage, but Duchaine’s work is the first convincing demonstration of covert recognition in developmental prosopagnosia, the much more common form.

Provided by Dartmouth College

"Just another pretty face: Professor investigates neural basis of prosopagnosia." February 1st, 2012. http://medicalxpress.com/news/2012-02-pretty-professor-neural-basis-prosopagnosia.html

Feb 2, 2012 · 2 notes
#science #neuroscience #psychology #prosopagnosia
Brain capacity limits exponential online data growth

February 1st, 2012 in Physics / General Physics 

Scientists have found that the capacity of the human brain to process and record information - and not economic constraints - may constitute the dominant limiting factor for the overall growth of globally stored information. These findings have just been published in an article in EPJ B by Claudius Gros and colleagues from the Institute for Theoretical Physics at Goethe University Frankfurt in Germany.

The authors first looked at the distribution of 633 public internet files by plotting the number of video, audio, and image files against file size. They gathered files that were produced by humans or intended for human use with the spider file search engine Findfiles.net, choosing to focus on files hosted on domains linked from the online encyclopaedia Wikipedia and the open web directory dmoz.

Assuming that the economic costs of data production are proportional to the amount of data produced, these costs should drive the generation of information to grow exponentially. However, the authors found that economic costs were not, in fact, the limiting factor for data production: the observed distributions lack the exponential tails that cost-driven growth would produce.

They found that underlying neurophysiological processes influence the brain’s ability to handle information. For example, when people produce an image and attribute a subjective value to it, such as a given resolution, they are influenced by their perception of the quality of that image. The perceived gain in information from increasing the resolution of a low-quality image is substantially higher than from increasing the resolution of a high-quality photo by the same degree. This relation is known as the Weber-Fechner law.

The authors observed that file-size distributions obey this Weber-Fechner law. This means that the total amount of information cannot grow faster than our ability to digest or handle it.
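The Weber-Fechner law says perceived change is proportional to the relative change ΔS/S, or equivalently grows with the logarithm of the stimulus magnitude. A toy calculation shows why the same absolute resolution increase feels much larger for a low-quality image; the megapixel figures here are made up purely for illustration.

```python
import math

def perceived_gain(s_old, s_new):
    """Weber-Fechner: perceived change is proportional to the relative
    change, i.e. to log(s_new / s_old). Proportionality constant set to 1."""
    return math.log(s_new / s_old)

# The same absolute increase of one (hypothetical) megapixel:
low_gain = perceived_gain(1.0, 2.0)     # 1 MP -> 2 MP image
high_gain = perceived_gain(10.0, 11.0)  # 10 MP -> 11 MP image
print(f"perceived gain, low-quality: {low_gain:.2f}  "
      f"high-quality: {high_gain:.2f}")
```

Because perceived gains shrink logarithmically while storage costs grow linearly with file size, there is little incentive to keep inflating files past a point - consistent with the authors' claim that perception, not economics, caps data growth.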

More information: Gros C., Kaczor G., Markovic D., (2012) Neuropsychological constraints to human data production on a global scale, European Physical Journal B (EPJ B) 85: 28, DOI 10.1140/epjb/e2011-20581-3

Provided by Springer

"Brain capacity limits exponential online data growth." February 1st, 2012. http://www.physorg.com/news/2012-02-brain-capacity-limits-exponential-online.html

Feb 2, 2012 · 1 note
#science #neuroscience #brain #physics #psychology