Neuroscience

Articles and news from the latest research reports.

Scientists reverse memory loss in animal brain cells
Neuroscientists at The University of Texas Health Science Center at Houston (UTHealth) have taken a major step in their efforts to help people with memory loss tied to brain disorders such as Alzheimer’s disease.
Using sea snail nerve cells, the scientists reversed memory loss by determining when the cells were primed for learning. The scientists were able to help the cells compensate for memory loss by retraining them through the use of optimized training schedules. Findings of this proof-of-principle study appear in the April 17 issue of The Journal of Neuroscience.
“Although much work remains to be done, we have demonstrated the feasibility of our new strategy to help overcome memory deficits,” said John “Jack” Byrne, Ph.D., the study’s senior author, director of the W.M. Keck Center for the Neurobiology of Learning and Memory, and chairman of the Department of Neurobiology and Anatomy at the UTHealth Medical School.
This latest study builds on Byrne’s 2012 investigation, which pioneered the memory enhancement strategy. That study showed a significant increase in long-term memory in healthy sea snails (Aplysia californica), an animal with a simple nervous system whose cells nonetheless share properties with those of more advanced species, including humans.
Yili Zhang, Ph.D., the study’s co-lead author and a research scientist at the UTHealth Medical School, has developed a sophisticated mathematical model that can predict when the biochemical processes in the snail’s brain are primed for learning.
Her model is based on five training sessions scheduled at varying time intervals ranging from 5 to 50 minutes. It can generate 10,000 different schedules and identify the one predicted to produce optimal learning.
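As a back-of-the-envelope check on that schedule count: five sessions leave four inter-session gaps, and if each gap can take one of ten values (say, 5-minute steps from 5 to 50 minutes — an assumption consistent with the numbers reported, not a detail stated in the article), exhaustive enumeration yields exactly 10^4 = 10,000 candidate schedules. A minimal sketch; the scoring function here is a placeholder, whereas the published model scores candidates by simulating the underlying biochemical kinetics:

```python
# Enumerate every possible training schedule and pick the best-scoring one.
from itertools import product

INTERVALS = range(5, 51, 5)  # 5 to 50 minutes in 5-minute steps (assumed)

def all_schedules():
    """Yield every 4-gap schedule as a tuple of inter-session intervals."""
    return product(INTERVALS, repeat=4)

def score(schedule):
    """Placeholder objective; the real model evaluates predicted
    biochemical dynamics for each candidate schedule."""
    return -abs(sum(schedule) - 100)  # toy rule: prefer ~100 min total

schedules = list(all_schedules())
best = max(schedules, key=score)
print(len(schedules))  # 10000 candidate schedules
```

The point of the exhaustive search is that the optimal schedule turns out to be irregular, not evenly spaced — something hard to guess without scoring all candidates.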
“The logical follow-up question was whether you could use the same strategy to overcome a deficit in memory,” Byrne said. “Memory is due to a change in the strength of the connections among neurons. In many diseases associated with memory deficits, the change is blocked.”
To test whether their strategy would help with memory loss, Rong-Yu Liu, Ph.D., co-lead author and senior research scientist at the UTHealth Medical School, simulated a brain disorder in a cell culture by taking sensory cells from the sea snails and blocking the activity of a gene that produces a memory protein. This significantly impaired the strengthening of the neurons’ connections that underlies long-term memory.
To mimic training sessions, the cells were administered a chemical at intervals prescribed by the mathematical model. After five training sessions, which, as in the earlier study, were delivered at irregular intervals, the strength of the connections returned to near normal in the impaired cells.
“This methodology may apply to humans if we can identify the same biochemical processes in humans. Our results suggest a new strategy for treatments of cognitive impairment. Mathematical models might help design therapies that optimize the combination of training protocols with traditional drug treatments,” Byrne said.
He added, “Combining these two could enhance the effectiveness of the latter while compensating at least in part for any limitations or undesirable side effects of drugs. These two approaches are likely to be more effective together than separately and may have broad generalities in treating individuals with learning and memory deficits.”
(Image courtesy: UC Berkeley)

Filed under alzheimer's disease memory loss animal model nerve cells aplysia memory neuroscience science

Detecting Autism From Brain Activity
Neuroscientists from Case Western Reserve University School of Medicine and the University of Toronto have developed an efficient and reliable method of analyzing brain activity to detect autism in children. Their findings appear today in the online journal PLOS ONE.
The researchers recorded and analyzed dynamic patterns of brain activity with magnetoencephalography (MEG) to determine the brain’s functional connectivity – that is, its communication from one region to another. MEG measures magnetic fields generated by electrical currents in neurons of the brain.
Roberto Fernández Galán, PhD, an assistant professor of neurosciences at Case Western Reserve and an electrophysiologist seasoned in theoretical physics, led the research team that detected autism spectrum disorder (ASD) with 94 percent accuracy. The new analytic method offers an efficient, quantitative way of confirming a clinical diagnosis of autism.
“We asked the question, ‘Can you distinguish an autistic brain from a non-autistic brain simply by looking at the patterns of neural activity?’ and indeed, you can,” Galán said. “This discovery opens the door to quantitative tools that complement the existing diagnostic tools for autism based on behavioral tests.”
In a study of 19 children—nine with ASD—141 sensors tracked the activity of each child’s cortex. The sensors recorded how different regions interacted with each other while at rest, and the researchers compared the interactions in the control group with those in the ASD group. They found significantly stronger connections between rear and frontal areas of the brain in the ASD group: information flowed asymmetrically toward the frontal region, but not in the reverse direction.
The new insight into the directionality of the connections may help identify anatomical abnormalities in ASD brains. Most current measures of functional connectivity do not indicate the interactions’ directionality.
“It is not just who is connected to whom, but rather who is driving whom,” Galán said.
Their approach also allows them to measure background noise, or the spontaneous input driving the brain’s activity while at rest. A spatial map of these inputs showed more complexity and structure in the control group than in the ASD group, which had less variety and intricacy. This feature discriminated between the two groups even better than functional connectivity alone, again reaching 94 percent accuracy.
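The idea of directed interaction — “who is driving whom” — can be illustrated with a toy lag-1 linear model: fit each signal from the lagged values of both, and compare the cross-coefficients. This is a minimal stand-in, not the authors’ actual algorithm (which fits a stochastic linear model to 141-channel MEG recordings); the synthetic signals and coefficients below are assumptions for demonstration only.

```python
# Two synthetic signals: x drives y with weight 0.8, y does not drive x.
# Asymmetric lagged-regression coefficients reveal the direction of flow.
import random

random.seed(0)
n = 2000
x = [random.gauss(0, 1)]
y = [random.gauss(0, 1)]
for t in range(1, n):
    x.append(0.5 * x[-1] + random.gauss(0, 1))            # x: self-driven
    y.append(0.5 * y[-1] + 0.8 * x[-2] + random.gauss(0, 1))  # y: driven by x

def lag1_coeff(src, dst):
    """OLS coefficient of src[t-1] in predicting dst[t], controlling for
    dst[t-1] (solving the 2x2 normal equations by hand)."""
    s, d_prev, d = src[:-1], dst[:-1], dst[1:]
    sxx = sum(a * a for a in s)
    sxy = sum(a * b for a, b in zip(s, d_prev))
    syy = sum(b * b for b in d_prev)
    sxd = sum(a * c for a, c in zip(s, d))
    syd = sum(b * c for b, c in zip(d_prev, d))
    det = sxx * syy - sxy * sxy
    return (sxd * syy - syd * sxy) / det

x_to_y = lag1_coeff(x, y)  # should recover a value near 0.8
y_to_x = lag1_coeff(y, x)  # should be near zero
```

A symmetric connectivity measure (plain correlation) would link x and y equally in both directions; the lagged fit recovers the asymmetry, which is the extra information directionality provides.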
Case Western Reserve’s Office of Technology Transfer has filed a provisional patent application for the algorithm behind the analysis, which investigates the brain’s activity at rest. Galán and colleagues hope to collaborate with others in the autism field, with an emphasis on translational and clinical research.
(Image: SPL)

Filed under brain activity autism ASD magnetoencephalography autistic brain neuroscience science

Why Don’t Men Understand Women? Altered Neural Networks for Reading the Language of Male and Female Eyes
Men are traditionally thought to have more problems in understanding women compared to understanding other men, though evidence supporting this assumption remains sparse. Recently, however, it has been shown that men’s problems in recognizing women’s emotions could be linked to difficulties in extracting the relevant information from the eye region, which remains one of the richest sources of social information for the attribution of mental states to others. To determine possible differences in the neural correlates underlying emotion recognition from female, as compared to male, eyes, a modified version of the Reading the Mind in the Eyes Test in combination with functional magnetic resonance imaging (fMRI) was applied to a sample of 22 participants. We found that men actually had twice as many problems in recognizing emotions from female as compared to male eyes, and that these problems were particularly associated with a lack of activation in limbic regions of the brain (including the hippocampus and the rostral anterior cingulate cortex). Moreover, men revealed heightened activation of the right amygdala to male stimuli regardless of condition (sex vs. emotion recognition). Thus, our findings highlight the function of the amygdala in the affective component of theory of mind (ToM) and in empathy, and provide further evidence that men are substantially less able to infer mental states expressed by women, which may be accompanied by sex-specific differences in amygdala activity.

Filed under emotions emotion recognition limbic system amygdala empathy men women psychology neuroscience science

The Shrinking of the Hobbit’s Brain
Where do Hobbits come from? No, not the little humanoids in the J. R. R. Tolkien books, but Homo floresiensis, the 1-meter-tall human with the chimp-sized brain that lived on the Indonesian island of Flores between 90,000 and 13,000 years ago. There are two main hypotheses. Either the creature downsized from H. erectus, a human ancestor that lived in Africa and Asia, is known to have reached Flores about 800,000 years ago, and may have shrunk once it got there—a case of the “insular dwarfism” often seen in other animals that become small when they take up residence on islands. Or it evolved from an even earlier, smaller-brained ancestor, such as the early human H. habilis or an australopithecine like Lucy, that somehow made it to Flores from Africa. The insular dwarfism hypothesis had fallen out of favor recently, however, because many researchers thought that the Hobbit’s brain, often estimated at 400 cubic centimeters in volume, was too small to have evolved from the larger H. erectus brain, which was at least twice as big. But a new study, published online today in the Proceedings of the Royal Society B, finds from CT scans of the Hobbit’s skull that its brain was actually about 426 cubic centimeters in volume. The team calculates that this is big enough to make the island dwarfism hypothesis considerably more plausible once the body size differences between the Hobbit and H. erectus—which was nearly twice as tall—are adjusted for.

Filed under brain size homo floresiensis CT scans insular dwarfism evolution neuroscience science

Drug Could Improve Working Memory of People with Autism
People with an Autism Spectrum Disorder (ASD) often have trouble communicating and interacting with others because they process language, facial expressions and social cues differently. Previously, researchers found that propranolol, a drug commonly used to treat high blood pressure, anxiety and panic, could improve the language abilities and social functioning of people with an ASD. Now, University of Missouri investigators say the prescription drug also could help improve the working memory abilities of individuals with autism.
Working memory represents individuals’ ability to hold and manipulate a small amount of information for a short period; it allows people to remember directions, complete puzzles and follow conversations. Neurologist David Beversdorf and research neuropsychologist Shawn Christ found that propranolol improves the working memory performance of people with an ASD.
“Seeing a tiger might signal a fight or flight response. Nowadays, a stressor such as taking an exam could generate the same response, which is not helpful,” said Beversdorf, an associate professor in the Departments of Radiology and Neurology in the MU School of Medicine. “Propranolol works by calming those nervous responses, which is why some people benefit from taking the drug to reduce anxiety.”
Propranolol increased working memory performance in a sample of 14 young adult patients of the MU Thompson Center for Autism and Neurodevelopmental Disorders but had little to no effect on a group of 13 study participants who do not have autism. The researchers do not recommend that doctors prescribe propranolol solely to improve working memory in individuals with an ASD, but patients who already take the prescription drug might benefit.
“People with an Autism Spectrum Disorder who are already being prescribed propranolol for a different reason, such as anxiety, might also see an improvement in working memory,” said Christ, an associate professor in the Department of Psychological Sciences in the MU College of Arts and Science.
Future research will incorporate clinical trials to assess further the relationship between cognitive and behavioral functioning and connectivity among various regions of the brain.
The study, “Noradrenergic Moderation of Working Memory Impairments in Adults with Autism Spectrum Disorder,” was published in the Journal of the International Neuropsychological Society. Kimberly Bodner, a psychological sciences doctoral student at MU, and Sanjida Saklayen from the Ohio State University College of Medicine co-authored the study.

Filed under autism ASD working memory propranolol cognitive functioning neuroscience science

Stimulating the Brain Blunts Cigarette Craving

Cigarette smoking is the leading cause of preventable death globally. Unfortunately, smoking cessation is difficult, with more than 90% of quit attempts ending in relapse.

(Image: Jupiterimages)

There are a growing number of available methods that can be tried in the effort to reduce smoking, including medications, behavioral therapies, hypnosis, and even acupuncture. All attempt to alter brain function or behavior in some way.

A new study published in Biological Psychiatry now reports that a single 15-minute session of high frequency transcranial magnetic stimulation (TMS) over the prefrontal cortex temporarily reduced cue-induced smoking craving in nicotine-dependent individuals.

Nicotine activates the dopamine system and reward-related regions in the brain. Nicotine withdrawal naturally results in decreased activity of these regions, which has been closely associated with craving, relapse, and continued nicotine consumption.

One of the critical reward-related regions is the dorsolateral prefrontal cortex, which can be targeted using a brain stimulation technology called transcranial magnetic stimulation. Transcranial magnetic stimulation is a non-invasive procedure that uses magnetic fields to stimulate nerve cells. It does not require sedation or anesthesia and so patients remain awake, reclined in a chair, while treatment is administered through coils placed near the forehead.

Dr. Xingbao Li and colleagues at Medical University of South Carolina examined cravings triggered by smoking cues in 16 nicotine-dependent volunteers who received one session each of high frequency or sham repetitive transcranial magnetic stimulation applied over the dorsolateral prefrontal cortex. This design allowed the researchers to ferret out the effects of the real versus the sham stimulation, similar to how placebo pills are used in evaluating the effectiveness and safety of new medications.

They found that craving induced by smoking cues was reduced after participants received real stimulation. They also report that the reduction in cue-induced craving was positively correlated with level of nicotine dependence; in other words, the TMS-induced craving reductions were greater in those with higher levels of nicotine use.
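The dependence-craving relationship reported above is a standard correlation. A brief sketch of computing the Pearson r between dependence scores and craving reduction; the numbers are made up for illustration and are not the study’s data.

```python
# Pearson correlation between nicotine-dependence scores and the drop in
# cue-induced craving after stimulation (hypothetical values).
from math import sqrt

dependence   = [2, 3, 4, 5, 6, 7, 8, 9]           # e.g. dependence-scale scores
craving_drop = [0.1, 0.3, 0.2, 0.5, 0.6, 0.5, 0.8, 0.9]

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = sqrt(sum((a - mx) ** 2 for a in xs))
    sy = sqrt(sum((b - my) ** 2 for b in ys))
    return cov / (sx * sy)

r = pearson_r(dependence, craving_drop)  # positive r: heavier users benefit more
```

A positive r, as in the study, means the fitted trend slopes upward: the heavier the nicotine use, the larger the TMS-induced craving reduction.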

Dr. John Krystal, Editor of Biological Psychiatry, commented, “One of the elegant aspects of this study is that it suggests that specific manipulations of particular brain circuits may help to protect smokers and possibly people with other addictions from relapsing.”

“While this was only a temporary effect, it raises the possibility that repeated TMS sessions might ultimately be used to help smokers quit smoking. TMS as used in this study is safe and is already FDA approved for treating depression. This finding opens the way for further exploration of the use of brain stimulation techniques in smoking cessation treatment,” said Li.

(Source: alphagalileo.org)

Filed under smoking tobacco smoking transcranial magnetic stimulation prefrontal cortex brain stimulation neuroscience science

Anxious about life? Tylenol may do the trick

University of British Columbia researchers have found a new potential use for the over-the-counter pain drug Tylenol. Typically known to relieve physical pain, the study suggests the drug may also reduce the psychological effects of fear and anxiety over the human condition, or existential dread.

Published in the Association for Psychological Science journal Psychological Science, the study advances our understanding of how the human brain processes different kinds of pain.

“Pain exists in many forms, including the distress that people feel when exposed to thoughts of existential uncertainty and death,” says lead author Daniel Randles, UBC Dept. of Psychology. “Our study suggests these anxieties may be processed as ‘pain’ by the brain – but Tylenol seems to inhibit the signal telling the brain that something is wrong.”

The study builds on recent American research that found acetaminophen – the generic form of Tylenol – can successfully reduce the non-physical pain of being ostracized from friends. The UBC team sought to determine whether the drug had similar effects on other unpleasant experiences – in this case, existential dread.

In the study, participants took acetaminophen or a placebo while performing tasks designed to evoke this kind of anxiety – including writing about death or watching a surreal David Lynch video – and then assigning fines to different types of crimes, including public rioting and prostitution.

The researchers found that, compared with the placebo group, people taking acetaminophen were significantly more lenient in judging the acts of the criminals and rioters – and better able to cope with troubling ideas. The results suggest that participants’ existential suffering was “treated” by the headache drug.

“That a drug used primarily to alleviate headaches may also numb people to the worry of thoughts of their deaths, or to the uneasiness of watching a surrealist film – is a surprising and very interesting finding,” says Randles, a PhD candidate who authored the study with Prof. Steve Heine and Nathan Santos.

While the findings suggest that acetaminophen can help to reduce anxiety, the researchers caution that further research – and clinical trials – must occur before acetaminophen should be considered a safe or effective treatment for anxiety.

(Source: publicaffairs.ubc.ca)

Filed under tylenol anxiety fear emotional distress psychology neuroscience science

Memory, the Adolescent Brain, and Lying: Understanding the Limits of Neuroscientific Evidence in the Law

Brain scans are increasingly able to reveal whether or not you believe you remember some person or event in your life. In a new study presented at a cognitive neuroscience meeting today, researchers used fMRI brain scans to detect whether a person recognized scenes from their own lives, as captured in some 45,000 images by digital cameras. The study is seeking to test the capabilities and limits of brain-based technology for detecting memories, a technique being considered for use in legal settings.

“The advancement and falling costs of fMRI, EEG, and other techniques will one day make it more practical for this type of evidence to show up in court,” says Francis Shen of the University of Minnesota Law School, who is chairing a session on neuroscience and the law at a meeting of the Cognitive Neuroscience Society (CNS) in San Francisco this week. “But technological advancement on its own doesn’t necessarily lead to use in the law.” Still, as the technology has advanced and as the legal system seeks more empirical evidence, neuroscience and the law are intersecting more often than in previous decades.

In U.S. courts, neuroscientific evidence has been used largely in cases involving brain injury litigation or questions of impaired ability. In some cases outside the United States, however, courts have used brain-based evidence to check whether a person has memories of legally relevant events, such as a crime. New companies also are claiming to use brain scans to detect lies – although judges have not yet admitted this evidence in U.S. courts. These developments have rallied some in the neuroscience community to take a critical look at the promise and perils of such technology in addressing legal questions – working in partnership with legal scholars through efforts such as the MacArthur Foundation Research Network on Law and Neuroscience.

Recognizing your own memories

What inspired Anthony Wagner, a cognitive neuroscientist at Stanford University, to test fMRI uses for memory detection was a case in June 2008 in Mumbai, India, in which a judge cited EEG evidence as indicating that a murder suspect held knowledge about the crime that only the killer could possess. “It appeared that the brain data held considerable sway,” says Wagner, who points out that the methods used in that case have not been subject to extensive peer review.

Since then, Wagner and colleagues have conducted a number of experiments to test whether brain scans can be used to discriminate between stimuli that people perceive as old or new, as well as, more objectively, whether they have previously encountered a particular person, place, or thing. To date, Wagner and colleagues have had success in the lab using fMRI-based analyses to determine whether someone recognizes a person or perceives them as unfamiliar, but not in determining whether they have in fact seen them before.

In a new study presented today, his team sought to take the experiments out of the lab and into the real world by outfitting participants with digital cameras around their necks that automatically took photos of the participants’ everyday experiences. Over a multi-week period, the cameras yielded 45,000 photos per participant.

Wagner’s team then took brief photo sequences of individual events from the participants’ lives and showed them to the participants in the fMRI scanner, along with photo sequences from other subjects as the control stimuli. The researchers analyzed the participants’ brain patterns to determine whether or not the participants were recognizing the sequences as their own. “We did quite well with most subjects, with a mean accuracy of 91% in discriminating between event sequences that the participant recognized as old and those that the participant perceived as unfamiliar,” Wagner says. “These findings indicate that distributed patterns of brain activity, as measured with fMRI, carry considerable information about an individual’s subjective memory experience – that is, whether or not they are remembering the event.”
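Pattern-based decoding of this sort can be sketched in miniature. The toy below trains a nearest-centroid classifier on synthetic "voxel" patterns for recognized versus unfamiliar events; all dimensions, noise levels, and data are invented for illustration, and real fMRI analyses involve preprocessing, cross-validation, and far noisier signals.

```python
import numpy as np

rng = np.random.default_rng(0)
n_voxels = 50          # toy "voxel" count, far smaller than a real fMRI pattern
n_trials = 200         # trials per condition (invented)

# Invented condition templates: "recognized" vs. "unfamiliar" activity patterns
template_old = rng.normal(0.0, 1.0, n_voxels)
template_new = rng.normal(0.0, 1.0, n_voxels)

def simulate(template, n):
    """Each trial is the condition template plus independent noise."""
    return template + rng.normal(0.0, 1.5, (n, n_voxels))

X_old, X_new = simulate(template_old, n_trials), simulate(template_new, n_trials)

# Train on the first half of trials, test on the held-out second half
train_old, test_old = X_old[:100], X_old[100:]
train_new, test_new = X_new[:100], X_new[100:]
centroid_old, centroid_new = train_old.mean(axis=0), train_new.mean(axis=0)

def classify(x):
    """Nearest-centroid rule: label by the closer training-set mean pattern."""
    d_old = np.linalg.norm(x - centroid_old)
    d_new = np.linalg.norm(x - centroid_new)
    return "old" if d_old < d_new else "new"

correct = (sum(classify(x) == "old" for x in test_old)
           + sum(classify(x) == "new" for x in test_new))
accuracy = correct / (len(test_old) + len(test_new))
print(f"decoding accuracy: {accuracy:.2f}")   # well above the 0.50 chance level
```

The point of the sketch is only that multivariate patterns, not any single "voxel," carry the class information the decoder exploits.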

In another new study, Wagner and colleagues tested whether people can “beat the technology” by using countermeasures to alter their brain patterns. Back in the lab, the researchers showed participants individual faces and later asked them whether the faces were old or new. “Halfway through the memory test, we stopped and told them ‘What we are actually trying to do is read out from your brain patterns whether or not you are recognizing the face or perceiving it as novel, and we’ve been successful with other subjects in doing this in the past. Now we want you to try to beat the system by altering your neural responses.’” The researchers instructed the participants to think about a familiar person or experience when presented with a new face, and to focus on a novel feature of the face when presented a previously encountered face.

“In the first half of the test, during which participants were just making memory decisions, we were well above chance in decoding from brain patterns whether they recognized the face or perceived it as novel. However, in the second half of the test, we were unable to classify whether or not they recognized the face, or whether the face was objectively old or new,” Wagner says. Within a forensic setting, Wagner says, it is conceivable that a suspect could use such countermeasures to try to mask the brain patterns associated with memory.

Wagner says that his work to date suggests that the technology may have some utility in reading out brain patterns in cooperative individuals but that the uses are much more uncertain with uncooperative individuals. However, Wagner stresses that the method currently does not distinguish well between whether a person’s memory reflects true or false recognition. He says that it is premature to consider such evidence in the courts because many additional factors await future testing, including the effects of stress, practice, and time between the experience and the memory test.

Overgeneralizing the adolescent brain

A general challenge to the use of neuroscientific evidence in legal settings, Wagner says, is that most studies are at the group rather than the individual level. “The law cares about a particular individual in a particular situation right in front of them,” he says, and the science often cannot speak to that specificity.

Shen cites the challenge of making individualized inferences from group-based data as one of the major obstacles to the use of neuroscience evidence in court. “This issue has come up in the context of juvenile justice, where the adolescent brain development data confirms behavioral data that on average 17-year-olds are more impulsive than adults, but does not tell us whether a particular 17-year-old, namely the one on trial, was less able to control his/her actions on the day and in the manner in question,” he says.
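The group-versus-individual problem Shen describes can be made concrete with a small simulation (all scores invented): two groups can differ reliably on average while overlapping so much that classifying any one individual from their score alone barely beats chance.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented "impulsivity" scores: adolescents higher on average, widely overlapping
adolescents = rng.normal(55.0, 15.0, 10_000)
adults      = rng.normal(50.0, 15.0, 10_000)

group_gap = adolescents.mean() - adults.mean()   # reliable at the group level

# Best single-score rule: call anyone above the midpoint "adolescent"
threshold = (adolescents.mean() + adults.mean()) / 2
hits = (adolescents > threshold).mean()            # adolescents labeled correctly
correct_rejections = (adults <= threshold).mean()  # adults labeled correctly
individual_accuracy = (hits + correct_rejections) / 2

print(f"group mean difference: {group_gap:.1f} points")
print(f"single-individual classification accuracy: {individual_accuracy:.2%}")
```

With these invented parameters the group difference is unmistakable, yet the best possible single-score classification of an individual hovers only modestly above 50%, which is exactly the gap between "on average" and "this defendant" that the legal system cares about.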

Indeed, B.J. Casey of the Weill Medical College of Cornell University says that too often we overgeneralize adolescents’ lack of self-control. Although adolescents as a group do show poorer self-control, some situations and individuals are more prone to this breakdown than others.

“It is not that teens can’t make decisions; they can, and they can do so efficiently,” Casey says. “It is when they must make decisions in the heat of the moment – in the presence of potential or perceived threats, among peers – that the court should consider diminished responsibility of teens while still holding them accountable for their behavior.” Research suggests that this diminished ability is due to the immature development of circuitry involved in processing negative or positive cues in the environment in the subcortical limbic regions and then in regulating responses to those cues in the prefrontal cortex.

The body of research to date is at the group level, however, and is not yet able to comment on the neurobiological maturity of an individual adolescent. To help provide more guidance on this issue in legal settings, Casey and colleagues are working alongside legal scholars on a developmental imaging study, funded by the MacArthur Foundation, that is examining behaviors relevant to juvenile criminal behavior, including impulsivity and peer influence.

Making real-world connections

The same type of work – connecting brain imaging to particular behaviors in the real world – is ongoing in a number of other areas, including fMRI-based lie detection and linking negligence to specific mental states. “It’s a big leap to go from a laboratory setting, in which impulse control may be measured by one’s ability to not press a button in response to a stimulus, to the real world, where the question is whether someone had the requisite self-control not to tie up an innocent person and throw them off a bridge,” Shen says. “I don’t see neuroscience solving these big problems anytime soon, and so the question for law becomes: What do we do with this uncertainty? I think this is where we’re at right now, and where we’ll be for some time.”

“With a few notable exceptions, such as death penalty cases, cases where a juvenile is facing a very stiff sentence, and brain injury litigation, ‘law and neuroscience’ is not familiar to most lawyers,” Shen says. “But this might change – and soon.” The ongoing work is vital, he says, for laying a foundation for what is to come, and he hopes that neuroscientists will increasingly collaborate with legal scholars.

Filed under brain scans neuroimaging brain activity law memory neuroscience adolescent brain science

66 notes

Scientists pinpoint brain’s area for numeral recognition
Scientists at the Stanford University School of Medicine have determined the precise anatomical coordinates of a brain “hot spot,” measuring only about one-fifth of an inch across, that is preferentially activated when people view the ordinary numerals we learn early on in elementary school, like “6” or “38.”
Activity in this spot relative to neighboring sites drops off substantially when people are presented with numbers that are spelled out (“one” instead of “1”), homophones (“won” instead of “1”) or “false fonts,” in which a numeral or letter has been altered.
“This is the first-ever study to show the existence of a cluster of nerve cells in the human brain that specializes in processing numerals,” said Josef Parvizi, MD, PhD, associate professor of neurology and neurological sciences and director of Stanford’s Human Intracranial Cognitive Electrophysiology Program. “In this small nerve-cell population, we saw a much bigger response to numerals than to very similar-looking, similar-sounding and similar-meaning symbols.
“It’s a dramatic demonstration of our brain circuitry’s capacity to change in response to education,” he added. “No one is born with the innate ability to recognize numerals.”
The finding pries open the door to further discoveries delineating the flow of math-focused information processing in the brain. It also could have direct clinical ramifications for patients with dyslexia for numbers and with dyscalculia: the inability to process numerical information.
The cluster Parvizi’s group identified consists of perhaps 1 to 2 million nerve cells in the inferior temporal gyrus, a superficial region of the brain’s outer cortex. The inferior temporal gyrus is already generally known to be involved in the processing of visual information.
The new study, published April 17 in the Journal of Neuroscience, builds on an earlier one in which volunteers had been challenged with math questions. “We had accumulated lots of data from that study about what parts of the brain become active when a person is focusing on arithmetic problems, but we were mostly looking elsewhere and hadn’t paid much attention to this area within the inferior temporal gyrus,” said Parvizi, who is senior author of the study.
Not, that is, until fourth-year medical student Jennifer Shum, who also is doing research in Parvizi’s lab, noticed that, among some subjects in the first study, a spot in the inferior temporal gyrus seemed to be substantially activated by math exercises. Charged with verifying that this observation was consistent from one patient to the next, Shum, the study’s lead author, reported that this was indeed the case. So, Parvizi’s team designed a new study to look into it further.
The new study relied on epileptic volunteers who, as a first step toward possible surgery to relieve unremitting seizures that weren’t responding to therapeutic drugs, had a small section of their skulls removed and electrodes applied directly to the brain’s surface. The procedure, which doesn’t destroy any brain tissue or disrupt the brain’s function, had been undertaken so that the patients could be monitored for several days to help attending neurologists find the exact location of their seizures’ origination points. While these patients are bedridden in the hospital for as much as a week of such monitoring, they are fully conscious, in no pain and, frankly, a bit bored.
Over time, Parvizi identified seven epilepsy patients with electrode coverage in or near the inferior temporal gyrus and got these patients’ consent to undergo about an hour’s worth of tests in which they would be shown images presented for very short intervals on a laptop computer screen, while activity in their brain regions covered by electrodes was recorded. Each electrode picked up activity from an area corresponding to about a half-million nerve cells (a drop in the bucket in comparison to the brain’s roughly 100 billion nerve cells).
To make sure that any numeral-responsive brain areas identified were really responding to numerals — and not just generic lines, angles and curves — these tests were carefully calibrated to distinguish brain responses to visual presentations of the classic numerals taught in Western schools, such as 3 or 50, as opposed to squiggly lines, letters of the alphabet, number-denoting words such as “three” or “fifty,” and symbols that in fact were also numerals but — because they were drawn from the Thai, Tibetan and Devanagari languages — were extremely unlikely to be recognized as such by this particular group of volunteers.
In the first test, subjects were shown series of single numerals and letters — along with false fonts, in which the component parts of numerals or letters had been scrambled but defining curves and angles were retained, and the foreign-number symbols just described. A second test, controlling for meaning and sound, included numerals and their spelled-out versions (for instance, “1” and “one,” or “3” and “three”) and other words with the same sound or a similar one (“won” and “tree,” respectively).
All of our brains are shaped slightly differently. But in almost the identical spot within each study subject’s brain, the investigators observed a significantly larger response to numerals than to similar-shaped stimuli, such as letters or scrambled letters and numerals, or to words that either meant the same as the numerals or sounded like them.
Interestingly, said Parvizi, that numeral-processing nerve-cell cluster is parked within a larger group of neurons that is activated by visual symbols that have lines with angles and curves. “These neuronal populations showed a preference for numerals compared with words that denote or sound like those numerals,” he said. “But in many cases, these sites actually responded strongly to scrambled letters or scrambled numerals. Still, within this larger pool of generic neurons, the ‘visual numeral area’ preferred real numerals to the false fonts and to same-meaning or similar-sounding words.”
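The preference ordering Parvizi describes (numerals over false fonts and scrambled stimuli, which in turn beat number words and homophones) is commonly summarized with a contrast, or selectivity, index between a site’s mean responses to two conditions. The sketch below uses invented response magnitudes purely to show the arithmetic; these are not data from the study.

```python
# Invented mean electrode responses per stimulus condition (arbitrary units)
responses = {
    "numerals":     1.80,
    "false_fonts":  1.10,
    "letters":      1.00,
    "number_words": 0.60,   # e.g. "three"
    "homophones":   0.55,   # e.g. "tree"
}

def selectivity(pref, other):
    """Standard contrast index: (A - B) / (A + B), bounded in [-1, 1].
    0 means no preference; positive values favor the first condition."""
    return (pref - other) / (pref + other)

numerals = responses["numerals"]
for cond in ("false_fonts", "letters", "number_words", "homophones"):
    idx = selectivity(numerals, responses[cond])
    print(f"numerals vs {cond}: {idx:+.2f}")
```

Because the index normalizes by overall response magnitude, it lets sites with very different firing levels be compared on the same scale, which is why it is a common summary in electrophysiology.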
It seems, Parvizi said, that “evolution has designed this brain region to detect visual stimuli such as lines intersecting at various angles — the kind of intersections a monkey has to make sense of quickly when swinging from branch to branch in a dense jungle.” The adaptation of one part of this region in service of numeracy is a beautiful intersection of culture and neurobiology, he said.
Having nailed down a specifically numeral-oriented spot in the brain, Parvizi’s lab is looking to use it in tracing the pathways described by the brain’s number-processing circuitry. “Neurons that fire together wire together,” said Shum. “We want to see how this particular area connects with and communicates with other parts of the brain.”

Filed under brain circuitry nerve cells inferior temporal gyrus numeral recognition information processing neuroscience science

66 notes

New model of how brain functions are organized may revolutionize stroke rehab
A new model of brain lateralization for movement could dramatically improve the future of rehabilitation for stroke patients, according to Penn State researcher Robert Sainburg, who proposed and confirmed the model through novel virtual reality and brain lesion experiments.
Since the 1860s, neuroscientists have known that the human brain is organized into two hemispheres, each of which is responsible for different functions. Known as neural lateralization, this functional division has significant implications for the control of movement and is familiar in the phenomenon of handedness.
Understanding the connections between neural lateralization and motor control is crucial to many applications, including the rehabilitation of stroke patients. While most people intuitively understand handedness, the neural foundations underlying motor asymmetry have until recently remained elusive, according to Sainburg, professor of kinesiology and neurology and participant in the neuroscience and physiology graduate programs at the University’s Huck Institutes of the Life Sciences.
Research by Sainburg and his colleagues in the Center for Motor Control and published in the journal Brain has revealed a new model of motor lateralization that accounts for the neural foundations of handedness. The discovery could fundamentally change the way post-stroke rehabilitation is designed.
"Each hemisphere of the brain is specialized for different aspects of motor control, and thus each arm is ‘dominant’ for different features of movement," said Sainburg. "The dominant arm is used for applying specific force sequences — such as when slicing a loaf of bread with a knife — and the other arm is used for impeding forces to maintain stable posture, such as holding the loaf of bread. Together these specialized control mechanisms are seamlessly integrated into everyday activities.
"Our research has shown that this integration breaks down in neural disorders such as stroke, which produces different motor deficits depending on whether the right or left hemisphere has been damaged," Sainburg continued. "Traditionally, physical rehabilitation professionals have used the same protocols to practice movements of the paretic arm, regardless of the hemisphere that has been damaged. Our research shows that each arm should be treated for different control deficits, and it also indicates that therapists should directly retrain patients in how to use the two arms together in order to recover function."
In preparing to test their model, Sainburg and his team selected study participants from the New Mexico Veterans Administration Hospital and Penn State Milton S. Hershey Medical Center based on specific criteria in order to accurately distinguish the motor control mechanisms specific to each brain hemisphere. Participants were then asked to perform a series of tasks on a virtual reality interface, programmed and designed by Sainburg, which allowed the researchers to record detailed 3D position and motion data. The data for all the participants’ hand trajectories and final positions were then aggregated to compare the effects of left versus right hemisphere damage on different aspects of control.
"Our results indicated that while both groups of patients showed similar clinical impairment in the contralesional arm, this was produced by different motor control deficits," Sainburg said. "Right hemisphere damaged patients were able to make straight movements that were directed toward the targets, but were unable to stabilize their arms in the targets at the end of motion. In contrast, left hemisphere damaged patients were unable to make straight and efficient movements, but had no difficulty stabilizing their arms at the end of motion. These results confirmed that each hemisphere contributes unique control to its contralesional arm, verifying why our arms seem different when we use them for the same tasks."
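The two deficits Sainburg contrasts map onto simple kinematic measures: a path-straightness ratio for trajectory quality (impaired after left-hemisphere damage) and the endpoint’s distance from the target for final-position stability (impaired after right-hemisphere damage). A toy computation on invented 3D reach trajectories, not the study’s actual data:

```python
import numpy as np

def straightness_ratio(traj):
    """Path length divided by the straight start-to-end distance (1.0 = perfectly straight)."""
    steps = np.diff(traj, axis=0)
    path_len = np.linalg.norm(steps, axis=1).sum()
    chord = np.linalg.norm(traj[-1] - traj[0])
    return path_len / chord

def endpoint_error(traj, target):
    """Distance between where the hand stopped and the target."""
    return np.linalg.norm(traj[-1] - target)

target = np.array([0.30, 0.0, 0.0])   # invented target 30 cm straight ahead

# Invented reaches: a straight one that stops short of the target,
# and a curved one that nevertheless lands on it
t = np.linspace(0.0, 1.0, 50)[:, None]
straight_reach = t * np.array([0.27, 0.0, 0.0])                     # undershoots by 3 cm
curved_reach = t * target + np.sin(np.pi * t) * np.array([0.0, 0.05, 0.0])

for name, traj in [("straight but unstable", straight_reach),
                   ("curved but accurate", curved_reach)]:
    print(f"{name}: straightness={straightness_ratio(traj):.2f}, "
          f"endpoint error={endpoint_error(traj, target) * 100:.1f} cm")
```

In this illustration the first reach scores perfectly on straightness but misses the target, while the second is curved yet ends on target, the same dissociation the two patient groups showed.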
These results mirror those of Sainburg’s prior studies of motor deficits in unilateral stroke patients, which focused on the ipsilesional arm and formed the basis for his model of lateralization.
"Because both arms in stroke patients show motor deficits that are specific to the hemisphere that was damaged, we have concluded that the left arm is not simply controlled with the right hemisphere and vice versa," Sainburg said. "This is a revolutionary new perspective on sensorimotor control: each hemisphere contributes different control mechanisms to the coordination of both arms, regardless of which arm is considered dominant."
Sainburg and his colleagues are currently designing follow-up studies that will aid the development of new rehabilitation protocols addressing the specific motor deficits associated with each hemisphere.

Filed under stroke rehabilitation rehabilitation brain lateralization motor control handedness hemispheres neuroscience science
