Posts tagged science

Ever since the appetite-regulation hormone called leptin was discovered in 1994, scientists have sought to understand the mechanisms that control its action. It was known that leptin was made by fat cells, reduced appetite and interacted with insulin, but the precise molecular details of its function — details that might enable the creation of a new treatment for obesity — remained elusive.
Now, University of Texas Medical Branch at Galveston researchers have revealed a significant part of one of those mechanisms, identifying a protein that can interfere with the brain’s response to leptin. They’ve also created a compound that blocks the protein’s action — a potential forerunner to an anti-obesity drug.
In experiments with mice fed a high-fat diet, scientists from UTMB and the University of California, San Diego explored the role of the protein, known as Epac1, in blocking leptin’s activity in the brain. They found that mice genetically engineered to be unable to produce Epac1 had lower body weights, lower body fat percentages, lower blood-plasma leptin levels and better glucose tolerance than normal mice.
When the researchers used a specially developed “Epac inhibitor” to treat brain-slice cultures taken from normal laboratory mice, they found elevated levels of proteins associated with greater leptin sensitivity. Similar results were seen in the genetically engineered mice that lacked the Epac1 gene. In addition, normal mice treated with the inhibitor had significantly lower levels of leptin in their blood plasma — an indication that Epac1 also affected their leptin levels.
“We found that we can increase leptin sensitivity by creating mice that lack the genes for Epac1 or through a pharmacological intervention with our Epac inhibitor,” said UTMB professor Xiaodong Cheng, lead author of a paper on the study that recently appeared on the cover of Molecular and Cellular Biology. “The knockout mice gave us a way to tease out the function of the protein, and the inhibitor served as a pharmacological probe that allowed us to manipulate these molecules in the cells.”
Cheng and his colleagues suspected a connection between Epac1 and leptin because Epac1 is activated by cyclic AMP, a signaling molecule linked to metabolism and leptin production and secretion. Cyclic AMP is tied to a multitude of other cell signaling processes, many of which are targeted by current drugs. Cheng believes that understanding how it acts through Epac1 (and another form of the protein called Epac2) will also generate new pharmaceutical possibilities — possibly including a drug therapy that will help fight obesity and diabetes.
“We refer to these Epac inhibitors as pharmacological probes, and while they are still far away from drugs, pharmaceutical intervention is always our eventual goal,” Cheng said. “We were the first to develop Epac inhibitors, and now we’re working very actively with Dr. Jia Zhou, a UTMB medicinal chemist, to modify them and improve their properties. In addition, we are collaborating with colleagues at the NIH National Center for Advancing Translational Sciences in searching for more potent and selective pharmacological probes for Epac proteins.”
Close family members of people with Alzheimer’s disease are more than twice as likely as those without a family history to develop silent buildup of brain plaques associated with Alzheimer’s disease, according to researchers at Duke Medicine.
The study, published online in the journal PLOS ONE on April 17, 2013, confirms earlier findings on a known genetic variation that increases one’s risk for Alzheimer’s, and raises new questions about other genetic factors involved in the disease that have yet to be identified.
An estimated 25 million people worldwide have Alzheimer’s disease, and the number is expected to triple by 2050. More than 95 percent of these individuals have late-onset Alzheimer’s, which usually occurs after the age of 65. Research has shown that Alzheimer’s begins years to decades before it is diagnosed, with changes to the brain measurable through a variety of tests.
Family history is a known risk factor and predictor of late-onset Alzheimer’s disease, and studies suggest a two- to four-fold greater risk for Alzheimer’s in individuals with a mother, father, brother or sister who has developed the disease. These first-degree relatives share roughly 50 percent of their genes with the affected family member. Common genetic variations, including changes to the APOE gene, account for around 50 percent of the heritability of Alzheimer’s, but the disease’s other genetic roots are still unexplained.
“In this study, we sought to understand whether simply having a positive family history, in otherwise normal or mildly forgetful people, was enough to trigger silent buildup of Alzheimer’s plaques and shrinkage of memory centers,” said senior author P. Murali Doraiswamy, professor of psychiatry and medicine at Duke.
Duke neuroscience research trainee Erika J. Lampert, Doraiswamy and colleagues analyzed data from 257 adults, ages 55 to 89, both cognitively healthy and with varying levels of impairment. The participants were part of the Alzheimer’s Disease Neuroimaging Initiative, a national study working to define the progression of Alzheimer’s through biomarkers.
The researchers looked at participants’ age, gender and family history of the disease, with a positive family history defined as having a parent or sibling with Alzheimer’s. This information was compared with cognitive assessments and other biological tests, including APOE genotyping, MRI scans measuring hippocampal volume, and studies of three different pathologic markers (Aβ42, t-tau, and t-tau/Aβ42 ratio) found in cerebrospinal fluid.
As expected, the researchers found that a variation in the APOE gene associated with a greater risk and earlier onset of Alzheimer’s was overrepresented in participants with a family history of the disease. However, other biological differences were also seen in those with a family history, suggesting that unidentified genetic factors may influence the disease’s development before the onset of dementia.
Nearly half of all healthy people with a positive family history would have met the criteria for preclinical Alzheimer’s disease based on measurements of their cerebrospinal fluid, but only about 20 percent of those without a family history would have met such criteria.
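Those two proportions are what lie behind the “more than twice as likely” figure at the top of this post; as a quick check using the approximate percentages reported (not the study’s exact values):

```python
# Approximate proportions from the study (exact values not given here)
prevalence_with_history = 0.50     # "nearly half" met preclinical criteria
prevalence_without_history = 0.20  # "about 20 percent" met the criteria

risk_ratio = prevalence_with_history / prevalence_without_history
print(f"risk ratio = {risk_ratio:.1f}")  # 2.5, i.e. more than twice as likely
```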
“We already knew that family history increases one’s risk for developing Alzheimer’s, but we now are showing that people with a positive family history may also have higher levels of Alzheimer’s pathology earlier, which could be a reason why they experience a faster cognitive decline than those without a family history,” Lampert said.
The findings may influence the design of future studies developing new diagnostic tests for Alzheimer’s: researchers may choose not to enroll people with a positive family history – a group that has historically volunteered to participate in studies to better understand the disease – as healthy controls, given that they are more likely to develop Alzheimer’s pathology.
“Our study shows the power of a simple one-minute questionnaire about family history to predict silent brain changes,” Doraiswamy said. “In the absence of full understanding of all genetic risks for late-onset Alzheimer’s, family history information can serve as a risk stratification tool for prevention research and personalizing care.” He encouraged those with a known positive family history to seek out clinical trials specific to preventing the disease.
(Source: dukehealth.org)

Scientists reverse memory loss in animal brain cells
Neuroscientists at The University of Texas Health Science Center at Houston (UTHealth) have taken a major step in their efforts to help people with memory loss tied to brain disorders such as Alzheimer’s disease.
Using sea snail nerve cells, the scientists reversed memory loss by determining when the cells were primed for learning. The scientists were able to help the cells compensate for memory loss by retraining them through the use of optimized training schedules. Findings of this proof-of-principle study appear in the April 17 issue of The Journal of Neuroscience.
“Although much work remains to be done, we have demonstrated the feasibility of our new strategy to help overcome memory deficits,” said John “Jack” Byrne, Ph.D., the study’s senior author, as well as director of the W.M. Keck Center for the Neurobiology of Learning and Memory and chairman of the Department of Neurobiology and Anatomy at the UTHealth Medical School.
This latest study builds on Byrne’s 2012 investigation that pioneered this memory enhancement strategy. The 2012 study showed a significant increase in long-term memory in healthy sea snails called Aplysia californica, an animal with a simple nervous system whose cells nonetheless have properties similar to those of more advanced species, including humans.
Yili Zhang, Ph.D., the study’s co-lead author and a research scientist at the UTHealth Medical School, has developed a sophisticated mathematical model that can predict when the biochemical processes in the snail’s brain are primed for learning.
Her model is based on five training sessions scheduled at intervals ranging from 5 to 50 minutes. It can generate 10,000 different schedules and identify the one predicted to produce optimal learning, as sketched below.
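As a rough illustration of the search space, here is a minimal brute-force sketch. It assumes the 10,000 schedules arise from the four gaps between five sessions, each gap taking one of ten 5-minute steps (10^4 = 10,000); the paper’s exact parameterization may differ, and the scoring function here is just a placeholder for the model’s biochemical simulation.

```python
from itertools import product

# Candidate gaps between consecutive sessions, in minutes: 5, 10, ..., 50.
INTERVALS = range(5, 55, 5)

def predicted_learning(schedule):
    """Placeholder score. The real model predicts learning by simulating
    the biochemical cascades triggered by each training schedule."""
    return 0.0  # replace with the simulation's output

# Five sessions are separated by four gaps; enumerating all combinations
# gives 10**4 = 10,000 candidate schedules.
schedules = list(product(INTERVALS, repeat=4))
assert len(schedules) == 10_000

best = max(schedules, key=predicted_learning)
print(f"{len(schedules)} schedules enumerated; best candidate: {best}")
```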
“The logical follow-up question was whether you could use the same strategy to overcome a deficit in memory,” Byrne said. “Memory is due to a change in the strength of the connections among neurons. In many diseases associated with memory deficits, the change is blocked.”
To test whether their strategy would help with memory loss, Rong-Yu Liu, Ph.D., co-lead author and senior research scientist at the UTHealth Medical School, simulated a brain disorder in a cell culture by taking sensory cells from the sea snails and blocking the activity of a gene that produces a memory protein. This significantly impaired the strengthening of the neurons’ connections that underlies long-term memory.
To mimic training sessions, the cells were given a chemical at intervals prescribed by the mathematical model. After five training sessions, which, as in the earlier study, were spaced at irregular intervals, the strength of the connections returned to near normal in the impaired cells.
“This methodology may apply to humans if we can identify the same biochemical processes in humans. Our results suggest a new strategy for treatments of cognitive impairment. Mathematical models might help design therapies that optimize the combination of training protocols with traditional drug treatments,” Byrne said.
He added, “Combining these two could enhance the effectiveness of the latter while compensating at least in part for any limitations or undesirable side effects of drugs. These two approaches are likely to be more effective together than separately and may have broad generalities in treating individuals with learning and memory deficits.”
(Image courtesy: UC Berkeley)
Detecting Autism From Brain Activity
Neuroscientists from Case Western Reserve University School of Medicine and the University of Toronto have developed an efficient and reliable method of analyzing brain activity to detect autism in children. Their findings appear today in the online journal PLOS ONE.
The researchers recorded and analyzed dynamic patterns of brain activity with magnetoencephalography (MEG) to determine the brain’s functional connectivity – that is, its communication from one region to another. MEG measures magnetic fields generated by electrical currents in neurons of the brain.
Roberto Fernández Galán, PhD, an assistant professor of neurosciences at Case Western Reserve and an electrophysiologist with a background in theoretical physics, led the research team, which detected autism spectrum disorder (ASD) with 94 percent accuracy. The new analytic method offers an efficient, quantitative way of confirming a clinical diagnosis of autism.
“We asked the question, ‘Can you distinguish an autistic brain from a non-autistic brain simply by looking at the patterns of neural activity?’ and indeed, you can,” Galán said. “This discovery opens the door to quantitative tools that complement the existing diagnostic tools for autism based on behavioral tests.”
In a study of 19 children—nine with ASD—141 sensors tracked the activity of each child’s cortex. The sensors recorded how different regions interacted with one another at rest, and the researchers compared the interaction patterns of the control group with those of the ASD group. They found significantly stronger connections between rear and frontal areas of the brain in the ASD group: information flowed asymmetrically toward the frontal region, but not in the reverse direction.
The new insight into the directionality of the connections may help identify anatomical abnormalities in ASD brains. Most current measures of functional connectivity do not indicate the interactions’ directionality.
“It is not just who is connected to whom, but rather who is driving whom,” Galán said.
Their approach also allows them to measure background noise, the spontaneous input driving the brain’s activity at rest. A spatial map of these inputs showed more complexity and structure in the control group than in the ASD group, which had less variety and intricacy. This feature discriminated between the two groups even better than functional connectivity alone, again with 94 percent accuracy.
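One way to make “who is driving whom” and “background noise” concrete is to fit a linear dynamical model to the multichannel recordings; this is a sketch under stated assumptions, not the team’s actual pipeline. The coupling matrix is directed (entry (i, j) describes the influence of channel j on channel i), and the residuals estimate the intrinsic input driving each region.

```python
import numpy as np

def fit_directed_connectivity(x):
    """Fit x[:, t+1] ~ A @ x[:, t] by least squares.

    x: array of shape (channels, timepoints).
    Returns the directed coupling matrix A, where A[i, j] is the estimated
    influence of channel j on channel i, plus the covariance of the
    residuals, a stand-in for the spatial structure of the background noise.
    """
    past, future = x[:, :-1], x[:, 1:]
    A = future @ np.linalg.pinv(past)
    residuals = future - A @ past
    return A, np.cov(residuals)

# Toy data shaped like the study's recordings: 141 sensors at rest.
rng = np.random.default_rng(0)
demo = rng.standard_normal((141, 1000))
A, noise_cov = fit_directed_connectivity(demo)
print(A.shape, noise_cov.shape)  # (141, 141) (141, 141)
```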
Case Western Reserve’s Office of Technology Transfer has filed a provisional patent application for the algorithm behind the analysis, which examines the brain’s activity at rest. Galán and colleagues hope to collaborate with others in the autism field, with an emphasis on translational and clinical research.
(Image: SPL)
Men are traditionally thought to have more trouble understanding women than understanding other men, though evidence supporting this assumption remains sparse. Recently, however, it has been shown that men’s problems in recognizing women’s emotions could be linked to difficulties in extracting the relevant information from the eye region, which remains one of the richest sources of social information for attributing mental states to others. To determine possible differences in the neural correlates underlying emotion recognition from female as compared to male eyes, a modified version of the Reading the Mind in the Eyes Test, in combination with functional magnetic resonance imaging (fMRI), was applied to a sample of 22 participants. We found that men actually had twice as many problems in recognizing emotions from female as compared to male eyes, and that these problems were particularly associated with a lack of activation in limbic regions of the brain (including the hippocampus and the rostral anterior cingulate cortex). Moreover, men showed heightened activation of the right amygdala to male stimuli regardless of condition (sex vs. emotion recognition). Thus, our findings highlight the role of the amygdala in the affective component of theory of mind (ToM) and in empathy, and provide further evidence that men are substantially less able to infer mental states expressed by women, which may be accompanied by sex-specific differences in amygdala activity.
The Shrinking of the Hobbit’s Brain
Where do Hobbits come from? No, not the little humanoids in the J. R. R. Tolkien books, but Homo floresiensis, the 1-meter-tall human with the chimp-sized brain that lived on the Indonesian island of Flores between 90,000 and 13,000 years ago. There are two main hypotheses. Either the creature downsized from H. erectus, a human ancestor that lived in Africa and Asia and is known to have reached Flores about 800,000 years ago, shrinking after it arrived—a case of the so-called “insular dwarfism” often seen in animals that become small when they take up residence on islands. Or it evolved from an even earlier, smaller-brained ancestor, such as the early human H. habilis or an australopithecine like Lucy, that somehow made it to Flores from Africa. The insular dwarfism hypothesis had recently fallen out of favor, however, because many researchers thought that the Hobbit’s brain, often estimated at 400 cubic centimeters in volume, was too small to have evolved from the larger H. erectus brain, which was at least twice as big. But a new study, published online today in the Proceedings of the Royal Society B, finds from CT scans of the Hobbit’s skull that its brain was actually about 426 cubic centimeters in volume. The team calculates that this is big enough to make the island dwarfism hypothesis considerably more plausible once the body size differences between the Hobbit and H. erectus—which was nearly twice as tall—are adjusted for.

Drug Could Improve Working Memory of People with Autism
People with an Autism Spectrum Disorder (ASD) often have trouble communicating and interacting with others because they process language, facial expressions and social cues differently. Previously, researchers found that propranolol, a drug commonly used to treat high blood pressure, anxiety and panic, could improve the language abilities and social functioning of people with an ASD. Now, University of Missouri investigators say the prescription drug also could help improve the working memory abilities of individuals with autism.
Working memory represents individuals’ ability to hold and manipulate a small amount of information for a short period; it allows people to remember directions, complete puzzles and follow conversations. Neurologist David Beversdorf and research neuropsychologist Shawn Christ found that propranolol improves the working memory performance of people with an ASD.
“Seeing a tiger might trigger a fight-or-flight response. Nowadays, a stressor such as taking an exam can generate the same response, which is not helpful,” said Beversdorf, an associate professor in the Departments of Radiology and Neurology in the MU School of Medicine. “Propranolol works by calming those nervous responses, which is why some people benefit from taking the drug to reduce anxiety.”
Propranolol increased working memory performance in a sample of 14 young adult patients of the MU Thompson Center for Autism and Neurodevelopmental Disorders but had little to no effect on a group of 13 study participants who do not have autism. The researchers do not recommend that doctors prescribe propranolol solely to improve working memory in individuals with an ASD, but patients who already take the prescription drug might benefit.
“People with an Autism Spectrum Disorder who are already being prescribed propranolol for a different reason, such as anxiety, might also see an improvement in working memory,” said Christ, an associate professor in the Department of Psychological Sciences in the MU College of Arts and Science.
Future research will incorporate clinical trials to assess further the relationship between cognitive and behavioral functioning and connectivity among various regions of the brain.
The study, “Noradrenergic Moderation of Working Memory Impairments in Adults with Autism Spectrum Disorder,” was published in the Journal of the International Neuropsychological Society. Kimberly Bodner, a psychological sciences doctoral student at MU, and Sanjida Saklayen from the Ohio State University College of Medicine co-authored the study.
Cigarette smoking is the leading cause of preventable death globally. Unfortunately, smoking cessation is difficult, with more than 90 percent of attempts to quit ending in relapse.

(Image: Jupiterimages)
There are a growing number of available methods that can be tried in the effort to reduce smoking, including medications, behavioral therapies, hypnosis, and even acupuncture. All attempt to alter brain function or behavior in some way.
A new study published in Biological Psychiatry now reports that a single 15-minute session of high frequency transcranial magnetic stimulation (TMS) over the prefrontal cortex temporarily reduced cue-induced smoking craving in nicotine-dependent individuals.
Nicotine activates the dopamine system and reward-related regions in the brain. Nicotine withdrawal naturally results in decreased activity of these regions, which has been closely associated with craving, relapse, and continued nicotine consumption.
One of the critical reward-related regions is the dorsolateral prefrontal cortex, which can be targeted using a brain stimulation technology called transcranial magnetic stimulation. Transcranial magnetic stimulation is a non-invasive procedure that uses magnetic fields to stimulate nerve cells. It does not require sedation or anesthesia and so patients remain awake, reclined in a chair, while treatment is administered through coils placed near the forehead.
Dr. Xingbao Li and colleagues at Medical University of South Carolina examined cravings triggered by smoking cues in 16 nicotine-dependent volunteers who received one session each of high frequency or sham repetitive transcranial magnetic stimulation applied over the dorsolateral prefrontal cortex. This design allowed the researchers to ferret out the effects of the real versus the sham stimulation, similar to how placebo pills are used in evaluating the effectiveness and safety of new medications.
They found that craving induced by smoking cues was reduced after participants received real stimulation. They also report that the reduction in cue-induced craving was positively correlated with level of nicotine dependence; in other words, the TMS-induced craving reductions were greater in those with higher levels of nicotine use.
Dr. John Krystal, Editor of Biological Psychiatry, commented, “One of the elegant aspects of this study is that it suggests that specific manipulations of particular brain circuits may help to protect smokers and possibly people with other addictions from relapsing.”
"While this was only a temporary effect, it raises the possibility that repeated TMS sessions might ultimately be used to help smokers quit smoking. TMS as used in this study is safe and is already FDA approved for treating depression. This finding opens the way for further exploration of the use of brain stimulation techniques in smoking cessation treatment," said Li.
(Source: alphagalileo.org)
University of British Columbia researchers have found a new potential use for the over-the-counter pain drug Tylenol. Typically known to relieve physical pain, the study suggests the drug may also reduce the psychological effects of fear and anxiety over the human condition, or existential dread.
Published in the Association for Psychological Science journal Psychological Science, the study advances our understanding of how the human brain processes different kinds of pain.
“Pain exists in many forms, including the distress that people feel when exposed to thoughts of existential uncertainty and death,” says lead author Daniel Randles, UBC Dept. of Psychology. “Our study suggests these anxieties may be processed as ‘pain’ by the brain – but Tylenol seems to inhibit the signal telling the brain that something is wrong.”
The study builds on recent American research that found acetaminophen – the generic form of Tylenol – can successfully reduce the non-physical pain of being ostracized from friends. The UBC team sought to determine whether the drug had similar effects on other unpleasant experiences – in this case, existential dread.

In the study, participants took acetaminophen or a placebo while performing tasks designed to evoke this kind of anxiety – including writing about death or watching a surreal David Lynch video – and then assigned fines to different types of crimes, including public rioting and prostitution.
The researchers found that, compared with the placebo group, people taking acetaminophen were significantly more lenient in judging the acts of the criminals and rioters – and better able to cope with troubling ideas. The results suggest that participants’ existential suffering was “treated” by the headache drug.
“That a drug used primarily to alleviate headaches may also numb people to the worry of thoughts of their deaths, or to the uneasiness of watching a surrealist film, is a surprising and very interesting finding,” says Randles, a PhD candidate who authored the study with Prof. Steve Heine and Nathan Santos.
While the findings suggest that acetaminophen can help to reduce anxiety, the researchers caution that further research – and clinical trials – must occur before acetaminophen should be considered a safe or effective treatment for anxiety.
(Source: publicaffairs.ubc.ca)
Brain scans are increasingly able to reveal whether or not you believe you remember some person or event in your life. In a new study presented at a cognitive neuroscience meeting today, researchers used fMRI brain scans to detect whether a person recognized scenes from their own lives, as captured in some 45,000 images by digital cameras. The study is seeking to test the capabilities and limits of brain-based technology for detecting memories, a technique being considered for use in legal settings.
“The advancement and falling costs of fMRI, EEG, and other techniques will one day make it more practical for this type of evidence to show up in court,” says Francis Shen of the University of Minnesota Law School, who is chairing a session on neuroscience and the law at a meeting of the Cognitive Neuroscience Society (CNS) in San Francisco this week. “But technological advancement on its own doesn’t necessarily lead to use in the law.” Still, as the technology has advanced and the legal system has sought more empirical evidence, neuroscience and the law are intersecting more often than in previous decades.
In U.S. courts, neuroscientific evidence has been used largely in cases involving brain injury litigation or questions of impaired ability. In some cases outside the United States, however, courts have used brain-based evidence to check whether a person has memories of legally relevant events, such as a crime. New companies also are claiming to use brain scans to detect lies – although judges have not yet admitted this evidence in U.S. courts. These developments have rallied some in the neuroscience community to take a critical look at the promise and perils of such technology in addressing legal questions – working in partnership with legal scholars through efforts such as the MacArthur Foundation Research Network on Law and Neuroscience.
Recognizing your own memories
What inspired Anthony Wagner, a cognitive neuroscientist at Stanford University, to test fMRI uses for memory detection was a case in June 2008 in Mumbai, India, in which a judge cited EEG evidence as indicating that a murder suspect held knowledge about the crime that only the killer could possess. “It appeared that the brain data held considerable sway,” says Wagner, who points out that the methods used in that case have not been subject to extensive peer review.
Since then, Wagner and colleagues have conducted a number of experiments to test whether brain scans can discriminate between stimuli that people perceive as old or new, and, more objectively, whether they have previously encountered a particular person, place, or thing. To date, the team has had success in the lab using fMRI-based analyses to determine whether someone recognizes a person or perceives them as unfamiliar, but not in determining whether they have in fact seen that person before.
In a new study presented today, his team sought to take the experiments out of the lab and into the real world by outfitting participants with digital cameras around their necks that automatically took photos of the participants’ everyday experiences. Over a multi-week period, the cameras yielded 45,000 photos per participant.
Wagner’s team then took brief photo sequences of individual events from the participants’ lives and showed them to the participants in the fMRI scanner, along with photo sequences from other subjects as control stimuli. The researchers analyzed the participants’ brain patterns to determine whether or not they recognized the sequences as their own. “We did quite well with most subjects, with a mean accuracy of 91% in discriminating between event sequences that the participant recognized as old and those that the participant perceived as unfamiliar,” Wagner says. “These findings indicate that distributed patterns of brain activity, as measured with fMRI, carry considerable information about an individual’s subjective memory experience – that is, whether or not they are remembering the event.”
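The report does not spell out the classifier here, but decoding of this kind is typically done with a cross-validated linear classifier over voxel-wise activity patterns. The sketch below uses synthetic data and logistic regression purely as an illustration of that general approach, not as the team’s actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for fMRI data: one feature vector per photo sequence,
# labeled 1 if the participant recognized it as their own, 0 otherwise.
rng = np.random.default_rng(0)
n_events, n_voxels = 200, 500
labels = rng.integers(0, 2, n_events)
patterns = rng.standard_normal((n_events, n_voxels))
patterns[labels == 1, :50] += 0.5  # inject a weak "recognition" signal

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, patterns, labels, cv=5)
print(f"cross-validated decoding accuracy: {scores.mean():.2f}")
```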
In another new study, Wagner and colleagues tested whether people can “beat the technology” by using countermeasures to alter their brain patterns. Back in the lab, the researchers showed participants individual faces and later asked them whether the faces were old or new. “Halfway through the memory test, we stopped and told them ‘What we are actually trying to do is read out from your brain patterns whether or not you are recognizing the face or perceiving it as novel, and we’ve been successful with other subjects in doing this in the past. Now we want you to try to beat the system by altering your neural responses.’” The researchers instructed the participants to think about a familiar person or experience when presented with a new face, and to focus on a novel feature of the face when presented a previously encountered face.
“In the first half of the test, during which participants were simply making memory decisions, we were well above chance in decoding from brain patterns whether they recognized a face or perceived it as novel. However, in the second half of the test, we were unable to classify whether they recognized the face or whether the face was objectively old or new,” Wagner says. Within a forensic setting, Wagner says, it is conceivable that a suspect could use such countermeasures to try to mask the brain patterns associated with memory.
Wagner says that his work to date suggests that the technology may have some utility in reading out brain patterns in cooperative individuals but that the uses are much more uncertain with uncooperative individuals. However, Wagner stresses that the method currently does not distinguish well between whether a person’s memory reflects true or false recognition. He says that it is premature to consider such evidence in the courts because many additional factors await future testing, including the effects of stress, practice, and time between the experience and the memory test.
Overgeneralizing the adolescent brain
A general challenge to the use of neuroscientific evidence in legal settings, Wagner says, is that most studies are at the group rather than the individual level. “The law cares about a particular individual in a particular situation right in front of them,” he says, and the science often cannot speak to that specificity.
Shen cites the challenge of drawing individualized inferences from group-based data as one of the major obstacles facing the use of neuroscience evidence in court. “This issue has come up in the context of juvenile justice, where the adolescent brain development data confirm behavioral data that on average 17-year-olds are more impulsive than adults, but do not tell us whether a particular 17-year-old, namely the one on trial, was less able to control his or her actions on the day and in the manner in question,” he says.
Indeed, B.J. Casey of the Weill Medical College of Cornell University says that we too often overgeneralize adolescents’ lack of self-control. Although adolescents as a group do show poor self-control, some situations and some individuals are more prone to this breakdown than others.
“It is not that teens can’t make decisions; they can, and they can do so efficiently,” Casey says. “It is when they must make decisions in the heat of the moment – in the presence of potential or perceived threats, among peers – that the court should consider the diminished responsibility of teens while still holding them accountable for their behavior.” Research suggests that this diminished ability stems from the immature development of circuitry that processes negative or positive environmental cues in subcortical limbic regions and then regulates responses to those cues in the prefrontal cortex.
The body of research to date is at the group-level, however, and is not yet able to comment on the neurobiological maturity of an individual adolescent. To help provide more guidance on this issue in legal settings, Casey and colleagues are working alongside legal scholars on a developmental imaging study, funded by the MacArthur Foundation, that is examining behaviors relevant to juvenile criminal behavior, including impulsivity and peer influence.
Making real-world connections
The same type of work – connecting brain imaging to particular behaviors in the real world – is ongoing in a number of other areas, including fMRI-based lie detection and linking negligence to specific mental states. “It’s a big leap to go from a laboratory setting, in which impulse control may be measured by one’s ability to not press a button in response to a stimulus, to the real world, where the question is whether someone had the requisite self-control not to tie up an innocent person and throw them off a bridge,” Shen says. “I don’t see neuroscience solving these big problems anytime soon, and so the question for law becomes: What do we do with this uncertainty? I think this is where we’re at right now, and where we’ll be for some time.”
“With a few notable exceptions – such as death penalty cases, cases where a juvenile is facing a very stiff sentence, and brain injury litigation – ‘law and neuroscience’ is not familiar to most lawyers,” Shen says. “But this might change – and soon.” The ongoing work is vital, he says, for laying the foundation for a future that is yet to come, and he hopes that more neuroscientists will collaborate with legal scholars.