Neuroscience

Articles and news from the latest research reports.

Posts tagged psychology

239 notes

The Search for the Best Depression Treatment

Brain scans, blood samples, and other diagnostic tests could one day direct doctors to the best treatments for depression patients and uncover the biological basis of the condition.

When someone is diagnosed with depression, patient and doctor often begin a long trial-and-error process of testing different treatments. Sometimes they work, sometimes they don’t, so patients may try several options before finding the best one. But in the future, a brain scan, blood test, or some combination could help guide doctors to the best drugs, or lead them to suggest talk therapy.

Recently, Emory University researcher Helen Mayberg reported that a PET scan, a commonly used imaging method, can reveal whether a patient will respond better to an antidepressant or cognitive behavioral therapy. And in May, Medscape reported that David Mischoulon of Massachusetts General Hospital presented findings that the amount of a particular protein in the blood of depression patients could indicate whether a patient would do better by adding a form of folic acid to his or her treatment.

A key goal of such research is to distinguish between causes of depression. “The presence of certain biomarkers might give us a clue whether [a particular patient’s] depression is truly biologically driven, or whether it is depression like sadness over an event,” says Mischoulon. “If we can identify people who have these biological bases, it might suggest these patients might do better with medications, as opposed to psychotherapies or meditation.”

According to the World Health Organization, depression is the leading cause of disability globally. Many people do not seek or do not have access to treatment, and among those who do, fewer than 40 percent improve with the first type of treatment they try. The problem is not that treatments like antidepressants and cognitive behavioral therapy don’t work; it’s that no one treatment works for every patient. Researchers from many disciplines, from neuroscience to genomics, are studying this complex disorder, which likely represents many different conditions with unique origins and treatments. Large clinical trials to predict a patient’s response to therapy or drugs based on brain or body biomarkers could improve treatment for future patients and perhaps yield a clearer understanding of depression’s origins.

“You see now a number of big studies on predictive biomarkers,” says Mayberg, who has pioneered pacemaker-like implants as a treatment for severe cases of depression. She’s also involved in a large study of patients who will be treated with antidepressants or cognitive behavioral therapy based on brain scans. “It’s going to be interesting over the next year or two to see how this plays out,” she says. One question will be whether researchers will be able to identify markers that are both unambiguous and practical to test. Brain scans may be the best place to start, she says, because they focus on the origin of the condition, but once good biomarkers are identified via brain scan, surrogates found in the blood may provide a simpler and more affordable option.

One challenge for researchers is that depression is probably a conglomeration of many diseases, says Madhukar Trivedi, a University of Texas Southwestern researcher heading a large trial that is trying to distinguish patients who respond better to one type of antidepressant compared to another. “There are a lot of subtypes in depression, so any given marker, whether genetic, protein, imaging, or EEG, ends up accounting for only a small percentage of variance for any group of patients,” says Trivedi.   

If these researchers are successful, they could dramatically change how depression is treated and perhaps diagnosed. Doctors in the United States use the Diagnostic and Statistical Manual of Mental Disorders, or DSM, to diagnose depression. The diagnoses are largely based on the collection of symptoms presented or described by patients. In May, the head of the National Institute of Mental Health, Thomas Insel, announced that his institution would focus its research in areas other than the categories presented by the DSM. “Patients with mental disorders deserve better,” he said.

Bruce Cuthbert is heading the NIMH’s project to establish new ways of studying mental illness and potentially to improve future versions of the DSM by more precisely identifying the brain abnormalities in various diseases, including depression. The idea behind the project is to map out the genetic, circuit, and cognitive aspects of mental illness and to focus on individual features of disorders instead of clinical diagnoses. It could provide the information necessary to improve the DSM so that it is based on neuroscience and not just collections of symptoms. “In the future, we might define the disorders differently, or we might not. But this project will provide a framework to look at neural systems and how they operate and how that contributes to disease,” says Cuthbert.

Perhaps more immediately, the NIMH project could help researchers tune clinical trials of drugs to the right patients by focusing on discrete symptoms. For example, anhedonia, the inability to feel or seek pleasure, is a major symptom of depression, but it is also found in other patients, such as those with schizophrenia. By recruiting patients with measurable anhedonia, drug developers may be more likely to succeed in clinical trials than if they focused only on depression patients, says Cuthbert.

The NIMH project could also help to identify biomarkers of depression. “It could give us a structure to look at the pathology through different markers of the disease,” says Trivedi. “The goal is fantastic, but the proof is going to come in doing it.”

Filed under depression biomarkers antidepressants CBT brain scans treatment psychology neuroscience science

418 notes

The Anorexic Brain

Neuroimaging improves understanding of eating disorder

In a spacious hotel room not far from the beach in La Jolla, Calif., Kelsey Heenan gripped her fiancé’s hand. Heenan, a 20-year-old anorexic woman, couldn’t believe what she was hearing. Walter Kaye, director of the eating disorders program at the University of California, San Diego, was telling a handful of rapt patients and their family members what the latest brain imaging research suggested about their disorder.

It’s not your fault, he told them.

Heenan had always assumed that she was to blame for her illness. Kaye’s data told a different story. He handed out a pile of black-and-white brain scans — some showed the brains of healthy people, others were from people with anorexia nervosa. The scans didn’t look the same. “People were shocked,” Heenan says. But above all, she remembers, the group seemed to sigh in relief, breathing out years of buried guilt about the disorder. “It’s something in the way I was wired — it’s something I didn’t choose to do,” Heenan says. “It was pretty freeing to know that there could be something else going on.”

Years of psychological and behavioral research have helped scientists better understand some signs and triggers of anorexia. But that knowledge hasn’t straightened out the disorder’s tangled roots, or pointed scientists to a therapy that works for everyone. “Anorexia has a high death rate, it’s expensive to treat and people are chronically ill,” says Kaye.

Kaye’s program uses a therapy called family-based treatment, or FBT, to teach adolescents and their families how to manage anorexia. A year after therapy, about half of the patients treated with FBT recover. In the world of eating disorders, that’s success: FBT is considered one of the very best treatments doctors have. To many scientists, that just highlights how much about anorexia remains unknown.

Kaye and others are looking to the brain for answers. Using brain imaging tools and other methods to explore what’s going on in patients’ minds, researchers have scraped together clues that suggest anorexics are wired differently than healthy people. The mental brakes people use to curb impulsive instincts, for example, might get jammed in people with anorexia. Some studies suggest that just a taste of sugar can send parts of the brain barrelling into overdrive. Other brain areas appear numb to tastes — and even sensations such as pain. For people with anorexia, a sharp pang of hunger might register instead as a dull thud.

The mishmash of different brain imaging data is just beginning to highlight the neural roots of anorexia, Kaye says. But because starvation physically changes the brain, researchers can run into trouble teasing out whether glitchy brain wiring causes anorexia, or vice versa. Still, Kaye thinks understanding what’s going on in the brain may spark new treatment ideas. It may also help the eating disorder shake off some of its noxious stereotypes.

“One of the biggest problems is that people do not take this disease seriously,” says James Lock, an eating disorders researcher at Stanford University who cowrote the book on family-based treatment. “No one gets upset at a child who has cancer,” he says. “If the treatment is hard, parents still do it because they know they need to do it to make their child well.”

Pop culture often paints anorexics as willful young women who go on diets to be beautiful, he says. But, “you can’t just choose to be anorexic,” Lock adds. “The brain data may help counteract some of the mythology.”

Beyond dieting

A society that glamorizes thinness can encourage unhealthy eating behaviors in kids, scientists have shown. A 2011 study of Minnesota high school students reported that more than half of girls had dieted within the past year. Just under a sixth had used diet pills, vomiting, laxatives or diuretics.

But a true eating disorder goes well beyond an unhealthy diet. Anorexia involves malnutrition, excessive weight loss and often faulty thinking about one of the body’s most basic drives: hunger. The disorder is also rare. Less than 1 percent of girls develop anorexia. The disease crops up in boys too, but adolescent girls — especially in wealthy countries such as the U.S., Australia and Japan — are most likely to suffer from the illness.

As the disease progresses, people with anorexia become intensely afraid of getting fat and stick to extreme diets or exercise schedules to drop pounds. They also misjudge their own weight. Beyond these diagnostic hallmarks, patients’ symptoms can vary. Some refuse to eat, others binge and purge. Some live for years with the illness, others yo-yo between weight gain and loss. Though most anorexics gain back some weight within five years of becoming ill, anorexia is the deadliest of all mental disorders.

Though anorexia tends to run in families, scientists haven’t yet hammered out the suite of genes at play. Some individuals are particularly vulnerable to developing an eating disorder. In these people, stressful life changes, such as heading off to college, can tip the mental scales toward anorexia.

For decades, scientists have known that anorexic children behave a little differently. In school and sports, anorexic kids strive for perfection. Though Heenan, a former college basketball player, didn’t notice her symptoms creeping in until the end of high school, she remembers initiating strict practice regimens as a child. Starting in second grade, Heenan spent hours perfecting her jump shot, shooting the ball again and again until she had the technique exactly right — until her form was flawless.

“It’s very rare for me to see a person with anorexia in my office who isn’t a straight-A student,” Lock says. Even at an early age, people who later develop the eating disorder tend to exert an almost superhuman ability to practice, focus or study. “They will work and work and work,” says Lock. “The problem is they don’t know when to stop.”

In fact, many scientists think anorexics’ brains might be wired for willpower, for good and ill. Using new imaging tools that let scientists watch as a person’s mental gears grind through different tasks, researchers are starting to pin down how anorexic brains work overtime.

Different wiring: Studies of the brains of people with anorexia have revealed a number of complex brain circuits that show changes in activity compared with healthy people. Medical RF, adapted by M. Atarod

Control signs

To glimpse the circuits that govern self-control, experimental neuropsychologist Samantha Brooks uses functional magnetic resonance imaging, or fMRI, a tool that measures and maps brain activity. Last year, she and colleagues scanned volunteers as they imagined eating high-calorie foods, such as chocolate cake and French fries, or using inedible objects such as clothespins piled on a plate. One result gave Brooks a jolt. A center of self-control in anorexics’ brains sprang to life when the volunteers thought about food — but only in the women who severely restricted their calories, her team reported in March 2012 in PLOS ONE.

The control center, two golf ball–sized chunks of tissue called the dorsolateral prefrontal cortex, or DLPFC, helps stamp out primitive urges. “They put a brake on your impulsive behaviors,” says Brooks, now at the University of Cape Town in South Africa.

For Brooks, discovering the DLPFC data was like finding a tiny vein of gold in a heap of granite. The control center could be the nugget that reveals how anorexics clamp down on their appetites. So she and her colleagues devised an experiment to test anorexics’ DLPFC. Using a memory task known to engage the brain region, the researchers quizzed volunteers while showing them subliminal images. The quizzes tested working memory, the mental tool that lets people hold phone numbers in their heads while hunting for a pen and paper. Compared with healthy people, anorexics tended to get more answers right, Brooks’ team wrote June 2012 in Consciousness and Cognition. “The patients were really good,” Brooks says. “They hardly made any mistakes.”

A turbocharged working memory could help anorexics hold on to rules they set for themselves about food. “It’s like saying ‘I will only eat a salad at noon, I will only eat a salad at noon,’ over and over in your mind,” says Brooks. These mantras may become so ingrained that an anorexic person can’t escape them.

But looking at subliminal images of food distracted anorexics from the memory task. “Then they did just as well as the healthy people,” Brooks says. The results suggest that anorexic people might tap into their DLPFC control circuits when faced with food.

James Lock has also seen signs of self-control circuits gone awry in people with eating disorders. In 2011, he and colleagues scanned the brains of teenagers with different eating disorders while the teens performed a button-pressing task. While volunteers lay inside the fMRI machine, researchers flashed pictures of different letters on an interior screen. For every letter but “X,” Lock’s group told the teens to push a button. During the task, anorexic teens who obsessively cut calories tended to have more active visual circuits than healthy teens or those with bulimia, a disorder that compels people to binge and purge. The result isn’t easy to explain, says Lock. “Anorexics may just be more focused in on the task.”

Bulimics’ brains told a simpler story. When teens with bulimia saw the letter “X,” broad swaths of their brains danced with activity — more so than the healthy or calorie-cutting anorexic volunteers, Lock’s team reported in the American Journal of Psychiatry. For bulimics, controlling the impulse to push the button may take more brain power than for others, Lock says.

Though the data don’t reveal differences in self-control between anorexics and healthy people, Lock thinks that anorexics’ well-documented ability to swat away urges probably does have signatures in the brain. He notes that his study was small, and that the “healthy” people he used as a control group might have shared similarities with anorexics. “The people who tend to volunteer are generally pretty high performers,” he says. “The chances are good that my controls are a little bit more like anorexics than bulimics.”

Still, Lock’s results offered another flicker of proof that people with eating disorders might have glitches in their self-control circuits. A tight rein on urges could help steer anorexics toward illness, but the parts of their brain tuned into rewards, such as sugary snacks, may also be a little off track.

Sugar low

When an anorexic woman unexpectedly gets a taste of sugar (yellow) or misses out on it (blue), her brain’s reward circuitry shows more activity than a healthy-weight or obese woman’s. Anorexics’ reward-processing systems may be out of order. Credit: G. Frank et al./Neuropsychopharmacology 2012

For many anorexics, food just doesn’t taste very good. A classic symptom of the disorder is anhedonia, or trouble experiencing pleasure. Parts of Heenan’s past reflect the symptom. When she was ill, she had trouble remembering favorite dishes from childhood, for example — a blank spot common to anorexics. “I think I enjoyed some things,” she says. Beyond frozen yogurt, she can’t really rattle off a list.

After Heenan started seriously restricting her calories in college, only one aspect of food made her feel satisfied. Skipping, rather than eating, meals felt good, she says. Some of Heenan’s symptoms may have stemmed from frays in her reward wiring, the brain circuitry connecting food to pleasure. In the past few years, researchers have found that the chemicals coursing through healthy people’s reward circuits aren’t quite the same in anorexics. And studies in rodents have linked chemical changes in reward circuitry to under- and overeating.

To find out whether under- and overweight people had altered brain chemistry, eating disorder researcher Guido Frank of the University of Colorado Denver studied anorexic, healthy-weight and obese women. He and his colleagues trained volunteers to link images, such as orange or purple shapes, with the taste of a sweet solution, slightly salty water or no liquid. Then, the researchers scanned the women’s brains while showing them the shapes and dispensing tiny squirts of flavors. But the team threw in a twist: Sometimes the flavors didn’t match up with the right images.

When anorexics got an unexpected hit of sugar, a surge of activity bloomed in their brains. Obese people had the opposite response: Their brains didn’t register the surprise. Healthy-weight women fit somewhere in the middle, Frank’s team reported in August 2012 in Neuropsychopharmacology. While obese people might not be sensitive to sweets anymore, a little sugar rush goes a long way for anorexics. “It’s just too much stimulation for them,” Frank says.

One of the lively regions in anorexics’ brains was the ventral striatum, a lump of nerve cells that’s part of a person’s reward circuitry. The lump picks up signals from dopamine, a chemical that rushes in when most people see a sugary treat.

Frank says that it’s possible cutting calories could sculpt a person’s brain chemistry, but he thinks some young people are just more likely to become sugar-sensitive than others. Frank suspects anorexics’ dopamine-sensing equipment might be out of alignment to begin with. And he may be onto something. Recently, researchers in Kaye’s lab at UCSD showed that the same chemical that makes people perk up when a coworker brings in a box of doughnuts might actually trigger anxiety in anorexics.

Mixed signals

Usually a rush of dopamine triggers euphoria or a boost of energy, says Ursula Bailer, a psychiatrist and neuroimaging researcher at UCSD. Anorexics don’t seem to pick up those good feelings. 

When Bailer and colleagues gave volunteers amphetamine, a drug known to trigger dopamine release, and then asked them to rate their feelings, healthy people stuck to a familiar script. The drug made them feel intensely happy, Bailer’s team described in March 2012 in the International Journal of Eating Disorders. Using an imaging technique to track the chemical’s levels, the researchers linked the volunteers’ happy feelings to a wave of dopamine flooding the brain.

But anorexics said something different. “People with anorexia didn’t feel euphoria — they got anxious,” Bailer says. And the more dopamine coursing through anorexics’ brains, the more anxious they felt. Anorexics’ reaction to the chemical could help explain why they steer clear of food — or at least foods that healthy people find tempting. “Anorexics don’t usually get anxious if you give them a plate of cucumbers,” Bailer says.

Beyond the anxiety finding, one other aspect of the study sticks out: Instead of examining sick patients, Bailer, Kaye and colleagues recruited women who had recovered from anorexia. By studying people whose brains are no longer starving, Kaye’s team hopes to sidestep the chicken-and-egg question of whether specific brain signatures predispose people to anorexia or whether anorexia carves those signatures in the brain.

Though Kaye says that there’s still a lot scientists don’t know about anorexia, he’s convinced it’s a disorder that starts in the brain. Compared with healthy children, anorexic children’s brains are getting different signals, he says. “Parents have to realize that it’s very hard for these kids to change.”

Kaye thinks imaging data can help families reframe their beliefs about anorexia, which might help them handle tough treatments. He thinks the data can also offer new insights into therapies tailored for anorexics’ specific traits.

Sensory underload

One trait Kaye has focused on is anorexics’ sense of awareness of their bodies. Peel back the outer lobes of the brain by the temples, and the bit that handles body awareness pops into view. This region, a little island of tissue called the insula, is among the first brain areas to register pain, taste and other sensations. When people hold their breath, for example, and feel the panicky claws of air hunger, “the insula lights up like crazy,” Kaye says.

Kaye and colleagues have shown that the insulas of people with anorexia seem to be somewhat dulled to sensations. In a recent study, his team strapped heat-delivering gadgets to volunteers’ arms and cranked the devices to painfully hot temperatures while measuring insula activity via fMRI.

When the researchers turned up the heat, bits of recovered anorexics’ insulas dimmed compared with those of healthy volunteers. But when researchers simply warned that pain was coming, other parts of the brain region flared brightly, Kaye’s team reported in January in the International Journal of Eating Disorders. For people who have had anorexia, actually feeling pain didn’t seem as bad as anticipating it. “They don’t seem to be sensing things correctly,” says Kaye.

If anorexics can’t detect sensations like pain properly, they may also have trouble picking up other signals from the body, such as hunger. Typically when people get hungry, their insulas rev up to let them know. And in healthy hungry people, a taste of sugar really gets the insula excited. For anorexics, this hunger-sensing part of the brain seems numb. Parts of the insula barely perked up when recovered anorexic volunteers tasted sugar, Kaye’s team showed this June in the American Journal of Psychiatry. The findings “may help us understand why people can starve themselves and not get hungry,” Kaye says.

Though the brain region that tells people they’re hungry might have trouble detecting sweet signals, some reward circuits seem to overreact to the same cues. Combined with a tendency to swap happiness for anxiety, and a mental vise grip on behavior, anorexics might have just enough snags in their brain wiring to tip them toward disease.

Now, Kaye’s group hopes to tap neuroimaging data for new treatment ideas. One day, he thinks doctors might be able to help anorexics “train” their insulas using biofeedback. With real-time brain scanning, patients could watch as their insulas struggle to pick up sugar signals, and then practice strengthening the response. More effective treatment options could potentially spare anorexics the relapses many patients suffer.

Heenan says she’s one of the lucky ones. Four years have passed since she first saw the anorexic brain images at UCSD. In the months following her treatment, Heenan and her family worked together to rebuild her relationship with food. At first, her fiancé picked out all her meals, but step by step, Heenan earned autonomy over her diet. Today, Heenan, a coordinator for Minneapolis’ public schools, is married and has a new puppy. “Life can be good,” she says. “Life can be fun. I want other people to know the freedom that I do.”

Searching for treatments

The bowl of pasta sitting in front of Kelsey Heenan didn’t look especially scary.

Spaghetti, chopped asparagus and chunks of chicken glistened in an olive oil sauce. Usually, such savory fare might make a person’s mouth water. But when Heenan’s fiancé served her a portion, she started sobbing. “You can’t do this to me,” she told him. “I thought you loved me!”

Heenan was confronting her “fear foods” at the Eating Disorders Center for Treatment and Research at UCSD. Therapists in her treatment program, Intensive Multi-Family Therapy, spend five days teaching anorexic patients and families about the disorder and how to encourage healthy eating. “There’s no blame,” says Christina Wierenga, a clinical neuropsychologist at UCSD. “The focus is just on having the parent refeed the child.” Therapists lay out healthy meals and portion sizes for teens, bolster parents’ self-confidence and hammer home the dangers of not eating. Heenan compares the experience to boot camp. But by the end of her time at the center, she says, “I was starting to see glimpses of what life could be like as a healthy person.”

Treatment options for anorexia include a broad mix of behavioral and medication-based therapies. Most don’t work very well, and many lack the support of evidence-based trials. Hospitalizing patients can boost short-term weight gain, “but when people go home they lose all the weight again,” says Stanford University’s James Lock, one of the architects of family-based treatment. That treatment is currently considered the most effective therapy for adolescent anorexics.

In a 2010 clinical trial, half of teens who underwent FBT maintained a normal weight a year after therapy. In contrast, only a fifth of teens treated with adolescent-focused individual therapy, which aims to help kids cope with emotions without using starvation, hit the healthy weight goal.

Few good options exist for adult anorexics, a group notorious for dropping out of therapy. New work hints that cognitive remediation therapy, or CRT, which uses cognitive exercises to change anorexics’ behaviors, has potential. After two months of CRT, only 13 percent of patients abandoned treatment, and most regained some weight, Lock and colleagues reported in the April International Journal of Eating Disorders. Researchers still need to find out, however, whether CRT helps patients keep weight on over the long term.

(Source: sciencenews.org)

Filed under anorexia nervosa neuroimaging brain scans eating disorders psychology neuroscience science

152 notes

Injuries From Teen Fighting Deal a Blow to IQ 
New study explores connection between physical fights, cognitive decline
A new Florida State University study has found that adolescent boys who are hurt in just two physical fights suffer a loss in IQ that is roughly equivalent to missing an entire year of school. Girls experience a similar loss of IQ after only a single fighting-related injury.
The findings are significant because decreases in IQ are associated with lower educational achievement and occupational performance, mental disorders, behavioral problems and even longevity, the researchers said.
“It’s no surprise that being severely physically injured results in negative repercussions, but the extent to which such injuries affect intelligence was quite surprising,” said Joseph A. Schwartz, a doctoral student who conducted the study with Professor Kevin Beaver in FSU’s College of Criminology and Criminal Justice.
Their findings are outlined in the paper, “Serious Fighting-Related Injuries Produce a Significant Reduction in Intelligence,” which was published in the Journal of Adolescent Health. The study is among the first to look at the long-term effects of fighting during adolescence, a critical period of neurological development.
About 4 percent of high school students are injured as a result of a physical fight each year, the researchers said.
Schwartz and Beaver used data from the National Longitudinal Study of Adolescent Health collected between 1994 and 2002 to examine whether serious fighting-related injuries resulted in significant decreases in IQ over a 5- to 6-year time span. The longitudinal study began with a nationally representative sample of 20,000 middle and high school students who were tracked into adulthood through subsequent waves of data collection. At each wave of data collection, respondents were asked about a wide variety of topics, including personality traits, social relationships and the frequency of specific behaviors.
Perhaps not surprisingly, boys experienced a higher number of injuries from fighting than girls; however, the consequences for girls were more severe, a fact the researchers attributed to physiological differences that give males an increased ability to withstand physical trauma.
The researchers found that each fighting-related injury resulted in a loss of 1.62 IQ points for boys, while girls lost an average of 3.02 IQ points, even after controlling for changes in socio-economic status, age and race for both genders. Previous studies have indicated that missing a single year of school is associated with a loss of 2 to 4 IQ points.
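As a back-of-the-envelope illustration of those effect sizes, the per-injury decrements can be compared directly against the cost of a missed school year. This is a hedged sketch: the linear accumulation and the function names are assumptions for illustration, not the authors’ statistical model.

```python
# Illustrative arithmetic only: applies the study's reported per-injury IQ
# decrements (1.62 points for boys, 3.02 for girls), assuming losses add
# linearly across injuries. Not the authors' actual regression model.

IQ_LOSS_PER_INJURY = {"boys": 1.62, "girls": 3.02}

def estimated_iq_loss(group: str, injuries: int) -> float:
    """Cumulative IQ-point loss under the linear-accumulation assumption."""
    return IQ_LOSS_PER_INJURY[group] * injuries

# Two injuries for boys (~3.24 points) fall in the 2-4 point range
# associated with missing a year of school; a single injury does for girls.
print(estimated_iq_loss("boys", 2))   # 3.24
print(estimated_iq_loss("girls", 1))  # 3.02
```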
The impact on IQ may be even greater when considering only head injuries, the researchers said. The data they studied took into account all fighting-related physical injuries.
The findings highlight the importance of schools and communities developing policies aimed at limiting injuries suffered during adolescence, whether through fighting, bullying or contact sports, Schwartz said.
“We tend to focus on factors that may result in increases in intelligence over time, but examining the factors that result in decreases may be just as important,” he said. “The first step in correcting a problem is understanding its underlying causes. By knowing that fighting-related injuries result in a significant decrease in intelligence, we can begin to develop programs and protocols aimed at effective intervention.”

Filed under cognitive decline brain injury fighting IQ adolescence neuroscience psychology science

700 notes

Study finds night owls more likely to be psychopaths
People who stay up late at night are more likely to display anti-social personality traits such as narcissism, Machiavellianism, and psychopathic tendencies, according to a study published by a University of Western Sydney researcher.

Dr Peter Jonason, from the UWS School of Social Sciences and Psychology, assessed over 250 people’s tendency to be a morning- or evening-type person to discover whether this was linked to the ‘Dark Triad’ of personality traits.

The results, published in Personality and Individual Differences, found students who were awake in the twilight hours displayed greater anti-social tendencies than those who went to bed earlier.

“Those who scored highly on the Dark Triad traits are, like many other predators such as lions and scorpions, creatures of the night,” he says.

“For people pursuing a fast life strategy like that embodied by the Dark Triad traits, it’s better to occupy and exploit a low-light environment where others are sleeping and have diminished cognitive functioning.”

Dr Jonason says there may be an evolutionary basis for the link between anti-social behaviour and a preference to being awake late at night.

“There is likely to be a co-evolutionary arms race between cheaters and those who wish to detect and punish them, and the Dark Triad traits may represent specialized adaptations to avoid detection,” he says.

“The features of the night - a low-light environment where others are sleeping - may facilitate the casual sex, mate-poaching, and risk-taking the Dark Triad traits are linked to.”

“Indeed, most crimes and most sexual activity peak at night, suggesting just such a link.”

Dr Jonason adds that far more work is needed, but these results represent an important advance in behavioural ecological and evolutionary psychological models of the Dark Triad, as well as ‘darker’ aspects of human nature and personality.

Filed under personality traits anti-social personality traits psychopathy narcissism mental health psychology neuroscience science

114 notes

By tracking maggots’ food choices, scientists open significant new window into human learning
The squirming larva of the humble fruit fly, which shares a surprising amount of genetic material with the human being, is helping scientists to understand the way we learn information from one another.
Fruit flies have long served as models for studying behaviour because their cognitive mechanisms are parallel to humans’, but much simpler to study.
Fruit flies exhibit many of the same basic behaviours as humans and share 87 per cent of the material that is responsible for genetically based neurological disorders, making them a potent model for study.
While adult fruit flies have been studied for decades, the new paper reveals that their larvae, which are even simpler organisms, may be more valuable models for behavioral research. A fruit fly larva has only about 3,000 neurons, for example, while a human has roughly 86 billion.
The McMaster researchers were able to show that the larvae, or maggots, are capable of social learning, which opens the door to many other experiments that could provide valuable insights into human behaviour, and even lead to treatments for human disorders, the scientists say.
“People have been studying adult flies for decades now,” explains the study’s lead author, Zachary Durisko. “The larval stage is much simpler in terms of the brain, but behaviour at the larval stage has been less well studied. Here we have a complex behaviour in this even simpler model.”
Durisko and Reuven Dukas, both of McMaster’s Department of Psychology, Neuroscience and Behaviour, have shown that fruit fly larvae are able to distinguish which food sources have been used by other larvae and utilize the information to benefit themselves by choosing to eat from those same established sources instead of available alternatives.
The maggots’ attraction to food that others have been eating is based on smell, and is roughly equivalent to a person arriving in a new city, seeing two restaurants and choosing a busy one over an empty one, the researchers explain.
“They prefer the social over the non-social like we would do, and they learn to prefer the social over the non-social,” Dukas says.
In fact, the motivations may be similar in each case, and could include accepting the judgment of others as an indication of quality and seeking the company of others for protection from harm.
Durisko, the lead author, recently completed his PhD at McMaster, and Dukas, his co-author, is a professor at the university. Their work is published in the prestigious Proceedings of the Royal Society B, one of the society’s biological journals.
The researchers used several combinations of foods, both completely fresh and previously used, and of varying degrees of nutritional value, to compare the maggots’ preferences.

Filed under fruit fly maggots learning social learning human behavior neuroscience psychology science

113 notes

Cockatoos know what is going on behind barriers
How do you know that the cookies are still there although they have been placed out of your sight into the drawer? How do you know when and where a car that has driven into a tunnel will reappear? The ability to represent and track the trajectory of objects that are temporarily out of sight is important in many contexts but is also cognitively demanding. Alice Auersperg and her team from the University of Vienna and the University of Oxford show that “object permanence” abilities in a cockatoo rival those of apes and four-year-old human toddlers. The researchers published their findings in the Journal of Comparative Psychology.
A number of setups have habitually been used to investigate spatial memory and tracking in animals and human infants. These can roughly be subdivided according to what is being moved: a desired object (a food reward), the hiding places for this object, or the test animal itself. In the original invisible displacement tasks, designed by Swiss psychologist Jean Piaget in the 1950s, the reward is moved underneath a small cup behind one or more bigger screens, and the cup’s contents are shown between visits: if the cup is empty, we know that the reward must be behind the last screen visited. Humans solve this task from about two years of age, whereas among primates only the great apes show convincing results.
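The inference at the heart of the invisible displacement task can be written out explicitly. A toy sketch, with names and structure chosen for illustration rather than taken from the experimental protocol:

```python
# Toy model of Piaget's invisible displacement logic: a small cup carrying a
# reward visits larger screens in order, and the cup's contents are shown
# after each visit. The first time the cup is shown empty, the reward must
# have been left behind the screen just visited.

def infer_reward_screen(visited_screens, shown_empty_after_visit):
    """Return the screen hiding the reward.

    visited_screens: screens visited in order, e.g. ["screen 1", "screen 2"]
    shown_empty_after_visit: 0-based index of the first visit after which
        the cup was revealed to be empty
    """
    return visited_screens[shown_empty_after_visit]

# The cup visits screens 1 and 2, and is empty when shown after screen 2:
print(infer_reward_screen(["screen 1", "screen 2"], 1))  # prints: screen 2
```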
Likely to be even more challenging in terms of attention are “Transposition” tasks: the reward is hidden underneath one of several identical cups, which are interchanged one or more times. Human children struggle with this task type more than with the previous one and do not solve it reliably before the age of three to four years, whereas adult apes solve it but have more trouble with double than with single swaps.
In “Rotation” tasks, several identical cups, one bearing a reward, are aligned in parallel on a platform, which is rotated at different angles. “Translocation” tasks are similar, except that the cups are not rotated but the test animal is carried around the arrangement and released at different angles to the cup alignment. Children find Translocation tasks easier than Rotation tasks and solve them at two to three years of age.
An international team of scientists tested eight Goffin cockatoos (Cacatua goffini), a conspicuously inquisitive and playful species, on visible as well as invisible Piagetian object displacements and derivations of spatial transposition, rotation and translocation tasks. Birgit Szabo, one of the experimenters from the University of Vienna, says: “The majority of our eight birds readily and spontaneously solved Transposition, Rotation and Translocation tasks, whereas only two out of eight immediately and reliably chose the correct location in the original Piagetian invisible displacement task, in which a smaller cup visits two of three bigger screens.” Alice Auersperg, the manager of the Goffin Lab and also one of the experimenters, explains: “Interestingly, and just opposite to human toddlers, our cockatoos had more problems solving the Piagetian invisible displacements than the transposition task, with which children struggle until the age of four. Transpositions are highly demanding in terms of attention since two occluding objects are moved simultaneously. Nevertheless, in contrast to apes, which find single swaps easier than double ones, the cockatoos performed equally in both conditions.”
Similarly, the Goffins had few problems with Rotation and Translocation tasks, and some of them solved them at four different angles. Again, in contrast to children, who find Translocations easier than Rotations, the cockatoos showed no significant differences between the two tasks. Auguste von Bayern from the University of Oxford adds: “We assume that the ability to fly and to prey upon or be preyed upon from the air is likely to require pronounced spatial rotation abilities, and this may be a candidate trait influencing the animals’ performance in rotation and translocation tasks.”
Thomas Bugnyar from the University of Vienna concludes: “Finding that Goffins solve transposition, rotation and translocation tasks, which are likely to pose a large cognitive load on working memory, was surprising and calls for more comparative data in order to better understand the relevance of such accurate tracking abilities in terms of ecology and sociality.”

Filed under spatial memory object permanence piagetian object displacement psychology neuroscience science

51 notes

Yes, You Can? A Speaker’s Potency to Act upon His Words Orchestrates Early Neural Responses to Message-Level Meaning 
Evidence is accruing that, in comprehending language, the human brain rapidly integrates a wealth of information sources–including the reader or hearer’s knowledge about the world and even his/her current mood. However, little is known to date about how language processing in the brain is affected by the hearer’s knowledge about the speaker. Here, we investigated the impact of social attributions to the speaker by measuring event-related brain potentials while participants watched videos of three speakers uttering true or false statements pertaining to politics or general knowledge: a top political decision maker (the German Federal Minister of Finance at the time of the experiment), a well-known media personality and an unidentifiable control speaker. False versus true statements engendered an N400 - late positivity response, with the N400 (150–450 ms) constituting the earliest observable response to message-level meaning. Crucially, however, the N400 was modulated by the combination of speaker and message: for false versus true political statements, an N400 effect was only observable for the politician, but not for either of the other two speakers; for false versus true general knowledge statements, an N400 was engendered by all three speakers. We interpret this result as demonstrating that the neurophysiological response to message-level meaning is immediately influenced by the social status of the speaker and whether he/she has the power to bring about the state of affairs described.
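The quantification described in the abstract amounts to averaging ERP amplitude within the reported 150–450 ms window and contrasting false versus true statements. A minimal sketch on synthetic data: only the window bounds come from the study; the sampling grid, signal values, and function names are assumptions for illustration.

```python
import numpy as np

# Sketch of how an N400 effect is typically quantified: mean ERP amplitude
# in the 150-450 ms post-onset window, contrasted between conditions.
# Synthetic single-channel data at 1 sample per ms.

times = np.arange(-100, 800)  # epoch in ms, relative to critical-word onset

def mean_window_amplitude(erp, t0=150, t1=450):
    """Mean amplitude (microvolts) of one ERP trace in the [t0, t1) ms window."""
    mask = (times >= t0) & (times < t1)
    return float(erp[mask].mean())

# Toy ERPs: the "false statement" trace carries an extra 2-microvolt
# negativity inside the N400 window.
rng = np.random.default_rng(0)
erp_true = rng.normal(0.0, 0.1, times.size)
erp_false = erp_true - 2.0 * ((times >= 150) & (times < 450))

n400_effect = mean_window_amplitude(erp_false) - mean_window_amplitude(erp_true)
print(round(n400_effect, 2))  # -2.0 (false minus true, i.e. a negativity)
```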

Filed under neural activity ERPs N400 effect language language comprehension psychology neuroscience science

134 notes

Face Identification Accuracy is in the Eye (and Brain) of the Beholder
Though humans generally have a tendency to look at a region just below the eyes and above the nose, toward the midline, when first identifying another person, a small subset of people tend to look further down – at the tip of the nose, for instance, or at the mouth. However, as UC Santa Barbara researchers Miguel Eckstein and Matthew Peterson recently discovered, “nose lookers” and “mouth lookers” can do just as well as everyone else when it comes to the split-second decision-making that goes into identifying someone. Their findings appear in a recent issue of the journal Psychological Science.

“It was a surprise to us,” said Eckstein, professor in the Department of Psychological & Brain Sciences, of the ability of that subset of “nose lookers” and “mouth lookers” to identify faces. In a previous study, he and postdoctoral researcher Peterson used a series of face images and eye-tracking software to establish that most humans tend to look just below the eyes when identifying another human being, and that when forced to look somewhere else, like the mouth, their face identification accuracy suffers.
The reason we look where we look, said the researchers, is evolutionary. With survival at stake and only a limited amount of time to assess who an individual might be, humans have developed the ability to make snap judgments by glancing at a place on the face that allows the observer’s eye to gather a massive amount of information, from the finer features around the eyes to the larger features of the mouth. In 200 milliseconds, we can tell whether another human being is friend, foe, or potential mate. The process is deceptively easy and seemingly negligible in its quickness: Identifying another individual is an activity on which we embark virtually from birth, and is crucial to everything from day-to-day social interaction to life-or-death situations. Thus, our brain devotes specialized circuitry to face recognition.
"One of, if not the most, difficult task you can do with the human face is to actually identify it," said Peterson, explaining that each time we look at someone’s face, it’s a little different –– perhaps the angle, or the lighting, or the face itself has changed –– and our brains constantly work to associate the current image with previously remembered images of that face, or faces like it, in a continuous process of recognition. Computer vision does not yet come close to that capacity for identifying faces.
So it would seem to follow that those who look at other parts of a person’s face might perform less well, and might be slower to recognize potential threats, or opportunities.
Or so the researchers thought. In a series of face identification tasks, the researchers found a small group that departed from the typical just-below-the-eyes gaze. The observers were Caucasian, had normal or corrected-to-normal vision, and had no history of neurological disorders –– qualities that controlled for cultural, physical, or neurological factors that could influence a person’s gaze.
But instead of performing less well, as would have been predicted by the theoretical analysis of the investigators, the participants were still able to identify faces with the same degree of accuracy as just-below-the-eyes lookers. Furthermore, when these nose-looking participants were forced to look at the eyes to do the identification, their accuracy degraded.
The findings both fascinate and set up a chicken-and-egg scenario for the researchers. One possibility is that people tailor their eye movement to the properties of their visual system –– everything from their eye structures to the brain functions they are born with and develop. If, for example, one is able to see well in the upper visual field (the region above where they look), they can afford to look lower on the face without losing the detail around the eyes when identifying someone. According to Eckstein, it is known that most humans tend to see better in the lower visual field.
The other possibility is the reverse –– that our visual systems adapt to our looking behavior. If at an early age a person developed the habit of looking lower on the face to identify someone else, over time the brain circuits specialized for face identification could develop and arrange themselves around that tendency.
"The main finding is that people develop distinct optimal face-looking strategies that maximize face identification accuracy," said Peterson. "In our framework, an optimized strategy or behavior is one that results in maximized performance. Thus, when we say that the observer-looking behavior was self-optimal, it refers to each individual fixating on locations that maximize their identification accuracy."
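The “self-optimal” claim reduces to a simple per-observer comparison: measure each participant’s identification accuracy at their naturally preferred fixation point, and again at a forced fixation point elsewhere on the face. A minimal sketch of that comparison, using made-up numbers rather than the study’s data, looks like this:

```python
# Illustrative sketch with hypothetical accuracies (not the study's data):
# each observer's face-identification accuracy at their preferred fixation
# point versus a forced fixation point elsewhere on the face.
observers = [
    {"id": "eye_looker",   "preferred": "below_eyes", "acc_preferred": 0.84, "acc_forced": 0.71},
    {"id": "nose_looker",  "preferred": "nose_tip",   "acc_preferred": 0.83, "acc_forced": 0.70},
    {"id": "mouth_looker", "preferred": "mouth",      "acc_preferred": 0.82, "acc_forced": 0.69},
]

def self_optimality_gap(obs):
    """Accuracy advantage of looking where the observer naturally looks."""
    return obs["acc_preferred"] - obs["acc_forced"]

gaps = [self_optimality_gap(o) for o in observers]
# "Self-optimal" means every observer does best at their own preferred spot.
assert all(g > 0 for g in gaps)
```

In this toy version, the nose and mouth lookers match the eye lookers’ overall accuracy but lose accuracy when forced away from their own preferred location, which is the pattern the researchers describe.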
Future research will delve deeper into the mechanisms involved in those who look lower on the face to determine what could drive that gaze pattern and what information is gathered.

Filed under eye movements face recognition face perception psychology neuroscience science

344 notes

Brain research shows psychopathic criminals do not lack empathy, but fail to use it automatically
Criminal psychopathy can be both repulsive and fascinating, as illustrated by the vast number of books and movies inspired by this topic. Offenders diagnosed with psychopathy pose a significant threat to society, because they are more likely to harm other individuals and to do so again after being released. A brain imaging study in the Netherlands shows individuals with psychopathy have reduced empathy while witnessing the pains of others. When asked to empathize, however, they can activate their empathy. This could explain why psychopathic individuals can be callous and socially cunning at the same time.
Why are psychopathic individuals more likely to hurt others? Individuals with psychopathy characteristically demonstrate reduced empathy with the feelings of others, which may explain why it is easier for them to hurt other people. However, what causes this lack of empathy is poorly understood. Scientific studies on psychopathic subjects are notoriously hard to conduct. “Convicted criminals with a diagnosis of psychopathy are confined to high-security forensic institutions in which state-of-the-art technology to study their brain, like magnetic resonance imaging, is usually unavailable”, explains Professor Christian Keysers, Head of the Social Brain Lab in Amsterdam, and senior author of a study on psychopathy appearing in the journal Brain this week. “Bringing them to scientific research centres, on the other hand, requires the kind of high-security transportation that most judicial systems are unwilling to finance.”
The Dutch judicial system, however, seems to be an exception. It joined forces with academia to promote a better understanding of psychopathy. As a result, criminals with psychopathy were transported to the Social Brain Lab of the University Medical Center in Groningen (The Netherlands). There, the team could use state-of-the-art high-field functional magnetic resonance imaging to peek into the brains of criminals with psychopathy while they viewed the emotions of others.
The study, which will appear on the 25th of July in the journal Brain (published by Oxford University Press) and is entitled “Reduced spontaneous but relatively normal deliberate vicarious representations in psychopathy”, included 18 individuals with psychopathy and a control group, and consisted of three parts. “All participants first watched short movie clips of two people interacting with each other, zoomed in on their hands. The movie clips showed one hand touching the other in a loving, a painful, a socially rejecting or a neutral way. At this stage, we asked them to look at these movies just as they would watch one of their favourite films”, Harma Meffert, the first author of the paper, explains. Meffert was a graduate student in the Social Brain Lab while the study was conducted, and is now a post-doctoral fellow at the National Institute of Mental Health in Bethesda.
Next, the participants watched the same clips again. This time, however, the researchers prompted them explicitly to “empathise with one of the actors in the movie”, that is, they were requested to really try to feel what the actors in the movie were feeling.
"In the third and final part, we performed similar hand interactions with the participants themselves, while they were lying in the scanner, having their brain activity measured", adds Meffert. "We wanted to know to what extent they would activate the same brain regions while they were watching the hand interactions in the movies, as they would when they were experiencing these same hand interactions themselves."
Our brains are equipped with what scientists call a “mirror system”. For example, the motor cortex of the brain normally allows you to move your own body. Your so-called somatosensory cortex, when activated, makes you feel touch on your skin. Your insula, finally, when activated, makes you feel emotions such as pain or disgust. In recent decades, brain scientists have discovered that when people watch other people move their bodies, see them being touched, or see them experience emotions, these same brain regions are activated. In other words, the actions, touch, or emotions of others become your own. This “mirror system” possibly constitutes a crucial part of our ability to empathize with other people, and it has previously been shown that the less you activate this system, the less empathy you report feeling for other people. It has been suggested that individuals with psychopathy might somehow suffer from a broken “mirror system”, resulting in a diminished ability to empathize with their victims.
As it turns out, however, the picture seems to be more complex. When asked to just watch the film clips, the individuals with psychopathy indeed did activate their mirror system less. “Regions involved in their own actions, emotions and sensations were less active than that of controls while they saw what happens in others”, summarizes Christian Keysers. “At first, this seems to suggest that psychopathic criminals might hurt others more easily than we do, because they do not feel pain, when they see the pain of their victims.”
As the second part of the study revealed, however, it’s not quite so simple. Instead of generally activating their mirror system less, individuals with psychopathy seem not to use this system spontaneously, but they can use it when asked to. “When explicitly asked to empathize, the differences between how strongly the individuals with and without psychopathy activate their own actions, sensations and emotions almost entirely disappeared in their empathic brain”, explains Valeria Gazzola, Assistant Professor at the UMCG and second author of the paper. “Psychopathy may not be so much the incapacity to empathize, but a reduced propensity to empathize, paired with a preserved capacity to empathize when required to do so”. The brain data suggest that, by default, psychopathic individuals feel less empathy than others. If they try to empathize, however, they can switch to ‘empathy mode’.
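The study’s key contrast can be pictured as a two-condition comparison of group activation levels: the group gap is large under passive viewing and shrinks when empathy is explicitly requested. The numbers below are invented purely for illustration; only the pattern, not the values, mirrors the reported result:

```python
# Toy illustration with made-up activation values (arbitrary units), showing
# the reported pattern: the control-vs-psychopathy gap in "mirror system"
# activation shrinks when participants are explicitly asked to empathize.
spontaneous = {"controls": 1.00, "psychopathy": 0.55}  # passive viewing
instructed  = {"controls": 1.05, "psychopathy": 0.98}  # asked to empathize

def group_gap(condition):
    """Difference in activation between controls and the psychopathy group."""
    return condition["controls"] - condition["psychopathy"]

# Reduced propensity, preserved capacity: a large spontaneous gap that
# nearly disappears under instruction.
assert group_gap(instructed) < group_gap(spontaneous)
```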
There might be two sides to these findings. The darker side is that reduced spontaneous empathy together with a preserved capacity for empathy might be the cocktail that makes these individuals so callous when harming their victims and, at the same time, so socially cunning when they try to seduce their victims. Whether individuals with psychopathy autonomously switch their empathy on and off depending on the demands of a social situation, however, remains to be established. The brighter side is that the preserved capacity for empathy might be harnessed in therapy. Instead of having to create a capacity for empathy, therapies may need to focus on making the existing capacity more automatic, to prevent these individuals from further harming others. How to do so remains uncertain at this stage.

Filed under psychopathy empathy brain imaging brain activity somatosensory cortex psychology neuroscience science

91 notes

Breastfeeding Could Prevent ADHD

TAU research finds that breastfed children are less likely to develop ADHD later in life

We know that breastfeeding has a positive impact on child development and health — including protection against illness. Now researchers from Tel Aviv University have shown that breastfeeding could also help protect against Attention Deficit/Hyperactivity Disorder (ADHD), the most commonly diagnosed neurobehavioral disorder in children and adolescents.

Seeking to determine if the development of ADHD was associated with lower rates of breastfeeding, Dr. Aviva Mimouni-Bloch, of Tel Aviv University’s Sackler Faculty of Medicine and Head of the Child Neurodevelopmental Center in Loewenstein Hospital, and her fellow researchers completed a retrospective study on the breastfeeding habits of parents of three groups of children: a group that had been diagnosed with ADHD; siblings of those diagnosed with ADHD; and a control group of children without ADHD and lacking any genetic ties to the disorder.

The researchers found a clear link between rates of breastfeeding and the likelihood of developing ADHD, even when typical risk factors were taken into consideration. Children who were bottle-fed at three months of age were found to be three times more likely to have ADHD than those who were breastfed during the same period. These results have been published in Breastfeeding Medicine.

Understanding genetics and environment

In their study, the researchers compared the breastfeeding histories of children six to 12 years of age at Schneider Children’s Medical Center in Israel. The ADHD group comprised children who had been diagnosed at the hospital, the second group included the siblings of the ADHD patients, and the control group included children without neurobehavioral issues who had been treated at the clinics for unrelated complaints.

In addition to describing their breastfeeding habits during the first year of their child’s life, parents answered a detailed questionnaire on medical and demographic data that might also have an impact on the development of ADHD, including marital status and education of the parents, problems during pregnancy such as hypertension or diabetes, birth weight of the child, and genetic links to ADHD.

Taking all risk factors into account, researchers found that children with ADHD were far less likely to be breastfed in their first year of life than the children in the other groups. At three months, only 43 percent of children in the ADHD group were breastfed compared to 69 percent of the sibling group and 73 percent of the control group. At six months, 29 percent of the ADHD group was breastfed, compared to 50 percent of the sibling group and 57 percent of the control group.
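As a rough check, the threefold figure can be approximated from the three-month percentages above with a simple unadjusted odds ratio. This back-of-the-envelope calculation is not the study’s own estimate, which controlled for additional risk factors:

```python
# Unadjusted odds ratio from the reported three-month breastfeeding rates:
# 43% breastfed in the ADHD group vs. 73% in the control group.
def odds(p):
    """Convert a proportion to odds."""
    return p / (1 - p)

breastfed_adhd = 0.43   # ADHD group breastfed at three months
breastfed_ctrl = 0.73   # control group breastfed at three months

# Odds ratio for ADHD, comparing bottle-fed vs. breastfed at three months.
or_bottle_fed = odds(1 - breastfed_adhd) / odds(1 - breastfed_ctrl)
print(round(or_bottle_fed, 2))  # roughly 3.6
```

The crude value of about 3.6 is consistent with the roughly threefold risk reported after adjustment for confounders.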

One of the unique elements of the study was the inclusion of the sibling group, says Dr. Mimouni-Bloch. Although a mother will often make the same breastfeeding choices for all her children, this is not always the case. Some children’s temperaments might be more difficult than their siblings’, making it hard for the mother to breastfeed, she suggests.

Added protection

While researchers do not yet know why breastfeeding has an impact on the future development of ADHD — it could be due to the breast milk itself, or the special bond formed between mother and baby during breastfeeding, for example — they believe this research shows that breastfeeding can have a protective effect against the development of the disorder, and can be counted as an additional biological advantage for breastfeeding.

Dr. Mimouni-Bloch hopes to conduct a further study on breastfeeding and ADHD, examining children who are at high risk for ADHD from birth and following up in six-month intervals until six years of age, to obtain more data on the phenomenon.

(Source: aftau.org)

Filed under ADHD breastfeeding neurobiology psychology neuroscience science
