Neuroscience

Articles and news from the latest research reports.

Posts tagged social interaction

40 notes

Professor Examines Social Capabilities of Performing Multiple-Action Sequences

The day of the big barbecue arrives and it’s time to fire up the grill. But rather than toss the hamburgers and hotdogs haphazardly onto the grate, you wait for the heat to reach an optimal temperature, and then neatly lay them out in their apportioned areas according to size and cooking times. Meanwhile, your friend is preparing the beverages. Cups are grabbed face down from the stack, turned over, and – using the other hand – filled with ice.

While these tasks – like countless everyday actions – may seem trivial at first glance, they are actually fairly complex, according to Robrecht van der Wel, an assistant professor of psychology at Rutgers–Camden. “For instance, the observation that you grab a glass differently when you are filling it with a beverage than when you are stacking glasses suggests that you are thinking about the goal that you want to achieve,” he says. “How do you manipulate the glass? How do you coordinate your actions so that the liquid goes into the cup? These kinds of actions are not only a way to accomplish our intentions; they reveal our intentions and mental states as well.”

In a study titled “Higher-order planning for individual and joint object manipulations,” published recently in Experimental Brain Research, van der Wel and his research partners, Marlene Meyer and Sabine Hunnius, turned their attention to how action planning generalizes to collaborative actions performed with others.

According to van der Wel, the researchers were especially interested in determining whether people’s actions exhibit certain social capabilities when performing multiple-action sequences in concert with a partner. “It is a pretty astonishing ability that we, as people, are able to plan and coordinate our actions with others,” says van der Wel. “If people plan ahead for themselves, what happens if they are now in a task where their action might influence another person’s comfort? Do they actually take that into account or not, even though, for their personal action, it makes no difference?”

In the research study, participants first completed a series of individual tasks requiring them to pick up a cylindrical object with one hand, pass it to their other hand, and then place it on a shelf. In the collaborative tasks, individuals picked up the object and handed it to their partner, who placed it on the shelf. The researchers varied the height of the shelf to test whether people altered their grasps to avoid uncomfortable end postures. The object could only be grasped at one of two positions, meaning that the first grasp determined the postures – and comfort – of the remaining actions.

According to the researchers, the results from both the individual and joint performances show that participants altered their grasp location relative to the height of the shelf. In both scenarios, participants were more likely to use a low grasp location when the shelf was low, and vice versa, allowing them to end the sequences in comfortable postures. The researchers conclude that, in both individual and collaborative scenarios, participants engaged in extended planning to finish the object-transport sequences in a relatively comfortable posture. That participants planned ahead for the sake of their action partners indicates an implicit social awareness that supports collaboration between individuals.
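The end-state comfort logic the study tests can be sketched as a toy model: a planner that looks ahead picks the initial grasp whose final posture will be most comfortable. The two grasp heights and the discomfort measure below are illustrative assumptions, not the study's actual parameters or analysis.

```python
# Toy model of end-state comfort planning (an illustrative sketch,
# not the study's analysis). The object affords two grasp heights;
# the grasp chosen at pickup fixes the hand posture at placement,
# so a planner that looks ahead picks the grasp whose *final*
# posture is most comfortable for the given shelf height.

def discomfort(hand_height_cm, shelf_height_cm):
    """Simple proxy: discomfort grows with the vertical distance
    the arm must cover at the moment of placement."""
    return abs(hand_height_cm - shelf_height_cm)

def plan_grasp(shelf_height_cm, grasp_heights_cm=(20, 40)):
    """Choose the grasp location that minimizes discomfort of the
    END posture, i.e. plan ahead rather than grasp greedily."""
    return min(grasp_heights_cm,
               key=lambda g: discomfort(g, shelf_height_cm))

# A low shelf favors the low grasp and a high shelf the high grasp,
# mirroring the participants' height-dependent grasp choices.
```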

van der Wel notes that, while such basic actions may seem insignificant, it is important to understand how people perform basic tasks such as manipulating objects when considering those populations that aren’t able to complete them so efficiently. “How to pick up an object seems like a really trivial problem when you look at healthy adults, but as soon as you look at children, or people suffering from a stroke, it takes some time to develop that skill properly,” says van der Wel. “When someone has a stroke, it is not that they have damage to the musculature involved in doing the task; rather, damage to action planning areas in the brain results in an inability to perform simple actions. A better understanding of the mechanisms involved in action planning may guide rehabilitation strategies in such cases.”

According to van der Wel, the researchers are currently working on modifying the task to determine the age at which children begin planning their actions with respect to other peoples’ comfort. In particular, they want to understand how the development of social action planning links with the development of other cognitive and social abilities.

(Source: news.rutgers.edu)

Filed under social interaction cognitive abilities planning psychology neuroscience science

264 notes

MACH system from MIT can coach those with social anxiety

Plenty of people out there have a serious phobia of public speaking, and there are plenty of other conditions, such as Asperger’s, that severely limit a person’s ability to handle even simple social interactions. M. Ehsan Hoque, a student at the MIT Media Lab, has made these subjects the focus of his latest project: MACH (My Automated Conversation coacH). At the heart of MACH is a complex system of facial and speech recognition algorithms that can detect subtle nuances in intonation while tracking smiles, head nods and eye movement. The latter is especially important, since the front end of MACH is a computer-generated avatar that can tell when you break eye contact and shift your attention elsewhere.

The software then provides feedback about your performance, helping to prep you for that big presentation or simply guide you out of your shell. Experimental data suggest that coaching from MACH could even help you perform better in a job interview. What’s particularly exciting is that the program requires no special hardware; it’s designed to be used with a standard webcam and microphone on a laptop. So it might not be too long before we start seeing apps designed to help users through social awkwardness.
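The post doesn't describe MACH's internals, so as a rough sketch, a feedback step of this kind might aggregate per-frame detections as below. The feature names, dictionary layout, and three-frame threshold are assumptions for illustration, not MACH's real interface.

```python
# Illustrative sketch of MACH-style session feedback. Assumes
# upstream face/speech tracking has already produced per-frame
# observations; this step only summarizes the behaviours a coach
# would report on (smile time, eye contact, long gaze breaks).

def summarize_session(frames):
    """frames: list of dicts like {"smile": bool, "eye_contact": bool}.
    Returns the fraction of time smiling / holding eye contact,
    plus a count of sustained eye-contact breaks."""
    n = len(frames)
    if n == 0:
        return {"smile_ratio": 0.0, "eye_contact_ratio": 0.0,
                "long_gaze_breaks": 0}
    smile = sum(f["smile"] for f in frames) / n
    gaze = sum(f["eye_contact"] for f in frames) / n
    # Count runs of >= 3 consecutive frames without eye contact,
    # counting each sustained break exactly once.
    breaks, run = 0, 0
    for f in frames:
        run = 0 if f["eye_contact"] else run + 1
        if run == 3:
            breaks += 1
    return {"smile_ratio": smile, "eye_contact_ratio": gaze,
            "long_gaze_breaks": breaks}
```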

Filed under MACH social interaction social anxiety public speaking technology science

207 notes

People can sense a smile before it appears on the face

But a forced or polite smile does not transmit the same signals, meaning we only detect it when it is visible, reports the journal Psychological Science.

Researchers say the study reflects the unique social value of a heartfelt smile, which involves specific movements of muscles around the eyes.

A team from Bangor University had noted that pairs of strangers getting to know one another not only exchanged smiles, they almost always matched the particular smile type, whether genuine or polite.

But they responded much more quickly to their partners’ genuine smiles than their polite smiles, suggesting that they were anticipating the genuine smiles.

In the lab, the results were repeated and data from electrical sensors on participants’ faces revealed that they engaged smile-related muscles when they expected a genuine smile to appear but showed no such activity when expecting polite smiles.

The different responses suggest that genuine smiles are more valuable social rewards, said Dr Erin Heerey.

She said: “These findings give us the first clear suggestion that the basic processes that guide responses to reward also play a role in guiding social behaviour on a moment-to-moment basis during interactions.

“No two interactions are alike, yet people still manage to smoothly coordinate their speech and nonverbal behaviors with those of another person.”

She said that polite smiles typically occur when sociocultural norms dictate that smiling is appropriate.

Genuine smiles, on the other hand, signify pleasure, occur spontaneously, and are indicated by engagement of specific muscles around the eye.

She said the study could help those who find social interactions tricky.

She explained: “As we progress in our understanding of how social interactions unfold, these findings may help to guide the development of interventions for people who find social interactions difficult, such as those with social anxiety, autism, or schizophrenia.”

Filed under smiles social interaction social anxiety psychology neuroscience science

64 notes

Early brain responses to words predict developmental outcomes in children with autism

The pattern of brain responses to words in 2-year-old children with autism spectrum disorder predicted the youngsters’ linguistic, cognitive and adaptive skills at ages 4 and 6, according to a new study.

The findings, published May 29 in PLOS ONE, are among the first to demonstrate that a brain marker can predict future abilities in children with autism.

“We’ve shown that the brain’s indicator of word learning in 2-year-olds already diagnosed with autism predicts their eventual skills on a broad set of cognitive and linguistic abilities and adaptive behaviors,” said lead author Patricia Kuhl, co-director of the University of Washington’s Institute for Learning & Brain Sciences.

“This is true four years after the initial test, and regardless of the type of autism treatment the children received,” she said.

In the study, 2-year-olds – 24 with autism and 20 without – listened to a mix of familiar and unfamiliar words while wearing an elastic cap that held sensors in place. The sensors measured brain responses to hearing words, known as event-related potentials.
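The event-related potential measure mentioned above can be illustrated with a toy computation: averaging many EEG segments time-locked to word onset makes the evoked response stand out from background activity. This is a minimal sketch of the averaging idea only, with made-up numbers, not the study's preprocessing pipeline.

```python
import random

# Sketch of the event-related potential (ERP) idea: EEG segments
# time-locked to each word onset are averaged, so activity that is
# consistently evoked by the word survives while unrelated
# background activity (modelled here as Gaussian noise) averages
# toward zero.

def erp_average(epochs):
    """epochs: list of equal-length lists of voltages (one per word
    presentation). Returns the per-sample mean across epochs."""
    n = len(epochs)
    return [sum(samples) / n for samples in zip(*epochs)]

random.seed(0)
true_response = [0.0, 1.0, 2.0, 1.0, 0.0]   # evoked waveform (uV)
epochs = [[v + random.gauss(0, 1.0) for v in true_response]
          for _ in range(500)]
erp = erp_average(epochs)
# With 500 trials the noise shrinks by roughly sqrt(500), so the
# average lies close to the underlying evoked waveform.
```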

The research team then divided the children with autism into two groups based on the severity of their social impairments and took a closer look at the brain responses. Youngsters with less severe symptoms had brain responses that were similar to the typically developing children, in that both groups exhibited a strong response to known words in a language area located in the temporal parietal region on the left side of the brain.

This suggests that the brains of children with less severe symptoms can process words in ways that are similar to children without the disorder.

In contrast, children with more severe social impairments showed brain responses more broadly over the right hemisphere, which is not seen in typically developing children of any age.

“We think this measure signals that the 2-year-old’s brain has reorganized itself to process words. This reorganization depends on the child’s ability to learn from social experiences,” Kuhl said. She cautioned that identifying a neural marker that predicts future autism diagnoses with assurance is still a ways off.

The researchers also tested the children’s language skills, cognitive abilities, and social and emotional development, beginning at age 2, then again at ages 4 and 6.

The children with autism received intensive treatment and, as a group, they improved on the behavioral tests over time. But the outcome for individual children varied widely and the more their brain responses to words at age 2 were like those of typically developing children, the more improvement in skills they showed by age 6.

In other studies, Kuhl has found that social interactions accelerate language learning in babies. Infants use social cues, such as tracking adults’ eye movements to learn the names of things, and must be interested in people to learn in this way. Paying attention to people is a way for babies to sort through all that is happening around them and serves as a gate to know what is important.

But with autism, social impairments impede children’s interest in, and ability to pick up on, social cues. They find themselves paying attention to many other things, especially objects as opposed to people.

“Social learning is what most humans are about,” Kuhl said. “If your brain can learn from other people in a social context you have the capability to learn just about anything.”

She hopes that the new findings will lead to brain measures that can be used much earlier in development – at 12 months or younger – to help identify children at risk for autism.

“This line of work may lead to new interventions applied early in development, when the brain shows its highest level of neural plasticity,” Kuhl said.

Filed under ASD autism brain responses language skills social interaction ERPs neuroscience science

83 notes

Research discovers link between epilepsy and autism

Our researchers have found a previously undiscovered link between epileptic seizures and the signs of autism in adults.

Dr SallyAnn Wakeford from the Department of Psychology found that adults with epilepsy tended to show higher levels of traits associated with autism and Asperger syndrome.


Characteristics of autism, which include impairment in social interaction and communication as well as restricted and repetitive interests, can be severe and go unnoticed for many years, having tremendous impact on the lives of those who have them.

The research found that epileptic seizures disrupt the neurological functions that support social functioning, resulting in the same traits seen in autism.

Dr Wakeford said: “The social difficulties in epilepsy have been so far under-diagnosed and research has not uncovered any underlying theory to explain them. This new research links social difficulties to a deficit in somatic markers in the brain, explaining these characteristics in adults with epilepsy.”

Dr Wakeford and her colleagues discovered that increased autistic traits were common across all epilepsy types; however, the effect was more pronounced for adults with temporal lobe epilepsy (TLE).

The researchers suggest that one explanation may be that anti-epileptic drugs are often less effective for TLE. They suspect the drugs are implicated because the drugs were strongly related to the severity of autistic characteristics.

Dr Wakeford carried out a comprehensive series of studies with volunteers with epilepsy and found that all of the adults tested showed autistic traits.

She said: “It is unknown whether these adults had a typical developmental period during childhood or whether they were predisposed to having autistic traits before the onset of their epilepsy. However what is known is that the social components of autistic characteristics in adults with epilepsy may be explained by social cognitive differences, which have largely been unrecognised until now.”

Dr Wakeford said the findings could lead to improved treatment for people with epilepsy and autism. She said: “Epilepsy has a history of cultural stigma, however the more we understand about the psychological consequences of epilepsy the more we can remove the stigma and mystique of this condition.

“These findings could mean that adults with epilepsy get access to better services, as there is a wider range of treatments available for those with autism.”

Margaret Rawnsley, research administration officer at Epilepsy Action welcomed the findings.

She said: “We welcome any research that could further our understanding of epilepsy and ultimately improve the lives of those with the condition. This research has the potential to tell us more about the links between epilepsy and other conditions, such as autism spectrum disorders.”

(Source: bath.ac.uk)

Filed under epilepsy autism social interaction brain TLE psychology neuroscience science

171 notes

Researchers Successfully Treat Autism in Infants

Most infants respond to a game of peek-a-boo with smiles at the very least, and, for those who find the activity particularly entertaining, gales of laughter. For infants with autism spectrum disorders (ASD), however, the game can be distressing rather than pleasant, and they’ll do their best to tune out all aspects of it –– and that includes the people playing with them.


That disengagement is a hallmark of ASD, and one of the characteristics that amplifies the disorder as infants develop into children and then adults.

A study conducted by researchers at the Koegel Autism Center at UC Santa Barbara has found that replacing such games with those the infant prefers can actually lessen the severity of the infants’ ASD symptoms, and, perhaps, alleviate the condition altogether. Their work is highlighted in the current issue of the Journal of Positive Behavior Interventions.

Lynn Koegel, clinical director of the center and the study’s lead author, described the game-playing protocol as a modified Pivotal Response Treatment (PRT). Developed at UCSB, PRT is based on principles of positive motivation. The researchers identified the activities that seemed to be more enjoyable to the infants and taught the respective parents to focus on those rather than on the typical games they might otherwise choose. “We had them play with their infants for short periods, and then give them some kind of social reward,” Koegel said. “Over time, we conditioned the infants to enjoy all the activities that were presented by pairing the less desired activities with the highly desired ones.” The social reward is preferable to, say, a toy, Koegel noted, because it maintains the ever-crucial personal interaction.

“The idea is to get them more interested in people,” she continued, “to focus on their socialization. If they’re avoiding people and avoiding interacting, that creates a whole host of other issues. They don’t form friendships, and then they don’t get the social feedback that comes from interacting with friends.”

According to Koegel, by the end of the relatively short one- to three-month intervention period, which included teaching the parents how to implement the procedures, all the infants in the study had normal reactions to stimuli. “Two of the three have no disabilities at all, and the third is very social,” she said. “The third does have a language delay, but that’s more manageable than some of the other issues.”

On a large scale, Koegel hopes to establish some benchmark for identifying social deficits in infants so parents and health care providers can intervene sooner rather than later. “We have a grant from the Autism Science Foundation to look at lots of babies and try to really figure out which signs are red flags, and which aren’t,” she said. “A number of the infants who show signs of autism will turn out to be perfectly fine; but we’re saying, let’s not take the risk if we can put an intervention in play that really works. Then we don’t have to worry about whether or not these kids would develop the full-blown symptoms of autism.”

Historically, ASD has been diagnosed in children 18 months or older, and treatment generally begins around age 4. “You can pretty reliably diagnose kids at 18 months, especially the more severe cases,” said Koegel. “The mild cases might be a little harder, especially if the child has some verbal communication. There are a few measures –– like the ones we used in our study –– that can diagnose kids pre-language, even as young as six months. But ours was the first that worked with children under 12 months and found an effective intervention.”

Given the increasing number of children being diagnosed with ASD, Koegel’s findings could be life altering –– literally. “When you consider that the recommended intervention for preschoolers with autism is 30 to 40 hours per week of one-on-one therapy, this is a fairly easy fix,” she said. “We did a single one-hour session per week for four to 12 weeks until the symptoms improved, and some of these infants were only a few months old. We saw a lot of positive change.”

(Source: ia.ucsb.edu)

Filed under ASD autism infants socialization social interaction psychology neuroscience science

133 notes

Autistic Children’s Love For Video Games Could Lead To New Treatment Options

Kids and teenagers suffering from autism spectrum disorder (ASD) are more likely to use television and video games and less likely to spend time on social media than their normally-developing counterparts, claims new research set for publication in a future issue of the Journal of Autism and Developmental Disorders.

Micah Mazurek, an assistant professor of health psychology and a clinical child psychologist at the University of Missouri, recruited 202 children and adolescents with ASD and 179 of their typically developing siblings for the study.

Those with ASD spent more time playing video games and watching TV than spending time on physical or pro-social activities (including spending time on websites like Facebook or Twitter). The opposite was also true: typically-developing children spent more time on non-screen-related activities than they did watching shows or playing on the PS3 or the Xbox 360, according to the soon-to-be-published study.

“Many parents and clinicians have noticed that children with ASD are fascinated with technology, and the results of our recent studies certainly support this idea,” Mazurek said in a statement. “We found that children with ASD spent much more time playing video games than typically developing children, and they are much more likely to develop problematic or addictive patterns of video game play.”

In a separate study of 169 boys with ASD, excessive video game use had been linked to oppositional behaviors, such as refusal to follow directions or getting into arguments with others. Mazurek said that the issues will need to be further examined in future, closely-controlled research.

“Because these studies were cross-sectional, it is not clear if there is a causal relationship between video game use and problem behaviors,” she said. “Children with ASD may be attracted to video games because they can be rewarding, visually engaging and do not require face-to-face communication or social interaction. Parents need to be aware that, although video games are especially reinforcing for children with ASD, children with ASD may have problems disengaging from these games.”

Despite those issues, Mazurek also believes that autistic children’s love for video games and television could be used for beneficial purposes. The professor believes that discovering what makes these screen-related pastimes so attractive to kids with ASD could help researchers and medical experts develop new treatment options.

“Using screen-based technologies, communication and social skills could be taught and reinforced right away,” Mazurek explained. “However, more research is needed to determine whether the skills children with ASD might learn in virtual reality environments would translate into actual social interactions.”

Filed under autism ASD video games gaming social interaction psychology neuroscience science

159 notes

The subtle hallmarks of psychiatric illness can reveal themselves even remotely

Most people are so attuned to the nuances of social interaction that they can detect clues to mental illness while playing a strategy game with someone they have never met.

That was the finding of a team of scientists led by Read Montague, director of the Human Neuroimaging Laboratory at the Virginia Tech Carilion Research Institute. The researchers discovered that healthy people and those with borderline personality disorder displayed different patterns of behavior while playing an online strategy game, so much so that when healthy players played people with borderline personality disorder, they gave up on trying to predict what their partners would do next.

For their large neuroimaging study, the scientists used a multiround social interaction game, the investor-trustee game, to study the level of strategic thinking in 195 pairs of subjects. In each pair, one player played the investor and the other the trustee. The investor chose how much money to send the trustee, and the trustee in turn decided how much to return to the investor. Profit required the cooperation of both players.
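The round structure lends itself to a quick sketch. Below is a minimal simulation of one round, assuming the standard trust-game convention that the sent amount is multiplied (commonly tripled) before reaching the trustee; the endowment, multiplier, and function names are illustrative, not taken from the study:

```python
# Minimal sketch of one round of the investor-trustee game.
# Assumption: the invested amount is tripled in transit, as in the
# standard trust game; the article does not state the multiplier.

ENDOWMENT = 20   # points the investor starts the round with (hypothetical)
MULTIPLIER = 3   # standard trust-game convention (assumption)

def play_round(invest_fraction, return_fraction):
    """One round: the investor sends a fraction of the endowment,
    the trustee returns a fraction of the multiplied amount."""
    sent = ENDOWMENT * invest_fraction
    received = sent * MULTIPLIER
    returned = received * return_fraction
    investor_profit = ENDOWMENT - sent + returned
    trustee_profit = received - returned
    return investor_profit, trustee_profit

# Cooperation pays both sides more than withholding:
coop = play_round(1.0, 0.5)    # invest everything, split the gains evenly
defect = play_round(0.0, 0.0)  # invest nothing, return nothing
```

Full cooperation here yields 30 points to each player, while investing nothing leaves the investor with only the 20-point endowment and the trustee with nothing, which is why profit requires the cooperation of both players.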

“This classic tit-for-tat game allows us to probe people’s responses to the social gestures of others,” said Montague, who also directs the Computational Psychiatry Unit, an academic center that uses computational models to understand mental disease. “It further allows us to see how people form models of one another. These insights are important for understanding a range of mental illnesses, as the ability to infer other people’s intentions is an essential component of healthy cognition.”

The scientists classified the investors according to varying levels of strategic depth of thought. The healthy subjects fell into three categories: about half simply responded to the amount the other player sent; about one-quarter built a model of their partner’s behavior; and the remaining quarter considered not just their model of their partner, but also their partner’s models of them. 
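These three categories map naturally onto what game theorists call level-k reasoning. The sketch below is one illustrative formalization of the three depths, not the authors' actual model; the update rules are invented for demonstration:

```python
# Illustrative level-k sketch of the three investor categories.
# Not the authors' model - just one common way to formalize depth of thought.

def level0(last_amount_received, history):
    # Depth 0: simply react to what the partner just sent.
    return last_amount_received

def level1(last_amount_received, history):
    # Depth 1: build a model of the partner - here, their average past behavior.
    if not history:
        return last_amount_received
    return sum(history) / len(history)

def level2(last_amount_received, history):
    # Depth 2: also consider the partner's model of me - anticipate that the
    # partner reacts to my trend, so weight their recent moves more heavily.
    if not history:
        return last_amount_received
    weights = range(1, len(history) + 1)  # recency weighting
    return sum(w * h for w, h in zip(weights, history)) / sum(weights)
```

A deeper player's choice depends on progressively more of the interaction's history, which is one way to see why depth of thought can be read off from behavior alone.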

Not surprisingly, the depth-of-thought style of play correlated with success, with the players who looked deeper into interactions making considerably more money than those who played at a shallow level.

When healthy subjects played people with borderline personality disorder, though, they were far less likely to exhibit depth of thought.

“People with borderline personality disorder are characterized by their unstable relationships, and when they play this game, they tend to break cooperation,” said Montague. “The healthy subjects picked up on the erratic behavior, likely without even realizing it, and far fewer played strategically.”

Notably, the functional magnetic resonance imaging of the subjects’ brains revealed that each category of player showed distinct neural correlates of learning signals associated with differing depths of thought. The scientists used hyperscanning, a technique Montague invented that enables subjects in different brain scanners to interact in real time, regardless of geography. Hyperscanning allows scientists to eavesdrop on brain activity during social exchanges in scanners, whether across the hallway or across the world.

“We’re always modeling other people, and our brains have a substantial amount of neural tissue devoted to pondering our interactions with other people,” Montague said. “This study is a start to turning neural signals into numbers – not just theory-of-mind arguments, but actual numbers. And when we can do that across thousands of people, we should start to gain insights into psychopathologies – what circuits are involved, what brain regions are engaged, and how injuries, congenital disorders, and genetic defects might play into psychiatric illness.”

Montague believes the study represents a significant contribution to the field of computational psychiatry, which seeks to bring computational clout to efforts to understand mental dysfunction. “Traditional psychiatric categories are useful yet incomplete,” said Montague, who delivered a TEDGlobal talk on the growing field of computational psychiatry last year. “Computational psychiatry enables us to redefine with a new lexicon – a mathematical one – the standard ways we think about mental illness.”

Computationally based insights may one day help psychiatry achieve better precision in diagnosis and treatment, Montague said. But until scientists have the right instruments, they cannot even begin to make those connections.

“The exquisite sensitivity that most people have to social gestures gives us a valuable opening,” Montague said. “We’re hoping to invent a tool – almost a human inkblot test – for identifying and characterizing mental disorders in which social interactions go awry.”

(Source: vtnews.vt.edu)

Filed under mental illness social interaction borderline personality disorder strategic thinking neuroimaging psychology neuroscience science

67 notes

Machine Perception Lab Shows Robotic One-Year-Old on Video

The world is getting a long-awaited first glimpse at a new humanoid robot in action mimicking the expressions of a one-year-old child. The robot will be used in studies on sensory-motor and social development – how babies “learn” to control their bodies and to interact with other people.

Diego-san’s hardware was developed by leading robot manufacturers: the head by Hanson Robotics, and the body by Japan’s Kokoro Co. The project is led by University of California, San Diego full research scientist Javier Movellan.

Movellan directs the Institute for Neural Computation’s Machine Perception Laboratory, based in the UCSD division of the California Institute for Telecommunications and Information Technology (Calit2). The Diego-san project is also a joint collaboration with the Early Play and Development Laboratory of professor Dan Messinger at the University of Miami, and with professor Emo Todorov’s Movement Control Laboratory at the University of Washington.

Movellan and his colleagues are developing the software that allows Diego-san to learn to control his body and to interact with people.

"We’ve made good progress developing new algorithms for motor control, and they have been presented at robotics conferences, but generally on the motor-control side, we really appreciate the difficulties faced by the human brain when controlling the human body," said Movellan, reporting even more progress on the social-interaction side. "We developed machine-learning methods to analyze face-to-face interaction between mothers and infants, to extract the underlying social controller used by infants, and to port it to Diego-san. We then analyzed the resulting interaction between Diego-san and adults." Full details and results of that research are being submitted for publication in a top scientific journal.

While photos and videos of the robot have been presented at scientific conferences in robotics and in infant development, the general public is getting a first peek at Diego-san’s expressive face in action. On January 6, David Hanson of Hanson Robotics posted a new video on YouTube.

“This robotic baby boy was built with funding from the National Science Foundation and serves cognitive A.I. and human-robot interaction research,” wrote Hanson. “With high definition cameras in the eyes, Diego San sees people, gestures, expressions, and uses A.I. modeled on human babies, to learn from people, the way that a baby hypothetically would. The facial expressions are important to establish a relationship, and communicate intuitively to people.”

Diego-san is the next step in the development of “emotionally relevant” robotics, building on Hanson’s previous work with the Machine Perception Lab, such as the emotionally responsive Albert Einstein head.


Filed under robots robotics AI Diego-san social interaction robotic baby facial expressions neuroscience science

54 notes

Networking Ability a Family Trait in Monkeys

Two years of painstaking observation of the social interactions of a troop of free-ranging monkeys, combined with an analysis of their family trees, have found signs of natural selection affecting the behavior of the descendants.

Rhesus macaques who had large, strong networks tended to be descendants of similarly social macaques, according to a Duke University team of researchers. And their ability to recognize relationships and play nice with others also won them more reproductive success.

"If you are a more social monkey, then you’re going to have greater reproductive success, meaning your babies are more likely to survive their first year," said post-doctoral research fellow Lauren Brent, who led the study. "Natural selection appears to be favoring pro-social behavior."

The analysis, which appears in Nature Scientific Reports, combined sophisticated social network maps with 75 years of pedigree data and some genetic analysis.

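Behind a social-network map like this is, at bottom, a tally of who interacts with whom. The toy sketch below computes each individual's network "strength" from an observed interaction log; all names and counts are invented for illustration, and the researchers' actual measures were far richer:

```python
from collections import Counter

# Toy grooming-interaction log: (monkey_a, monkey_b) pairs observed.
# All names and counts here are invented for illustration.
interactions = [
    ("ava", "ben"), ("ava", "ben"), ("ava", "cleo"),
    ("ben", "cleo"), ("dot", "cleo"),
]

def network_strength(log):
    """Undirected 'strength': total interactions each individual takes part in."""
    strength = Counter()
    for a, b in log:
        strength[a] += 1
        strength[b] += 1
    return strength

strength = network_strength(interactions)
# Individuals with many partners and repeated interactions score highest -
# the kind of per-monkey measure a 75-year pedigree can then be tested against.
```

Scores like these, computed per individual, are what let the team ask whether sociality runs in families and predicts infant survival.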

Filed under primates animal behavior natural selection social behavior social interaction neuroscience science
