Posts tagged psychology

Dog watch - How attention changes in the course of a dog’s life
Dogs are known as Man’s best friend. No other pet has adjusted to Man’s lifestyle as well as this four-legged animal has. Scientists at the Messerli Research Institute at the Vetmeduni Vienna have been the first to investigate how dogs’ attentiveness evolves over the course of their lives, and to what extent dogs resemble humans in this regard. The outcome: the developmental trajectories of dogs’ attentional and sensorimotor control are very similar to those found in humans. The results were published in the journal Frontiers in Psychology.
Dogs are individual personalities, possess awareness, and are particularly known for their learning capabilities, or trainability. To learn successfully, they must display a sufficient degree of attention and concentration. However, the attentiveness of dogs changes over the course of their lives, as it does in humans. Lead author Lisa Wallis and her colleagues investigated 145 Border Collies aged 6 months to 14 years in the Clever Dog Lab at the Vetmeduni Vienna and determined, for the first time, how attentiveness changes over the entire course of a dog’s life, using a cross-sectional study design.
Humans are more interesting for dogs than objects
To determine how rapidly dogs of various age groups pay attention to objects or humans, the scientists performed two tests. In the first situation the dogs were confronted with a child’s toy suspended suddenly from the ceiling. The scientists measured how rapidly each dog reacted to this occurrence and how quickly the dogs became accustomed to it. Initially all dogs reacted with similar speed to the stimulus, but older dogs lost interest in the toy more rapidly than younger ones did.
In the second test situation, a person known to the dog entered the room and pretended to paint the wall. All dogs reacted by watching the person and the paint roller in the person’s hands for a longer duration than the toy hanging from the ceiling.
Wallis’ conclusion: “So-called social attentiveness was more pronounced in all dogs than ‘non-social’ attentiveness. The dogs generally tended to watch a person with an object for longer than an object on its own. We found that older dogs – like older human beings – demonstrated a certain calmness. They were less affected by new items in the environment and thus showed less interest than younger dogs.”
Selective attention is highest in mid-adulthood
In a further test the scientists investigated so-called selective attention. The dogs participated in an alternating attention task, where they had to perform two tasks consecutively. First, they needed to find a food reward thrown onto the floor by the experimenter, then after eating the food, the experimenter waited for the dog to establish eye contact with her. These tasks were repeated for a further twenty trials. The establishment of eye contact was marked by a clicking sound produced by a “clicker” and small pieces of hot dog were used as a reward. The time spans to find the food and look up into the face were measured. With respect to both time spans, middle-aged dogs (3 to 6 years) reacted most rapidly.
“Under these test conditions, sensorimotor abilities were highest among dogs of middle age. Younger dogs fared more poorly, probably because of their general lack of experience. Motor abilities in dogs, as in humans, deteriorate with age. Humans between the ages of 20 and 39 years experience a similar peak in sensorimotor abilities,” says Wallis.
Adolescent dogs have the steepest learning curve
Dogs also go through a difficult phase during adolescence (1-2 years) which affects their ability to pay attention. This phase of hormonal change may be compared to puberty in Man. Accordingly, young dogs occasionally reacted with some delay in the clicker test. However, Wallis found that adolescent dogs improved their performance more rapidly than other age groups after several repetitions of the clicker test. In other words, the learning curve was found to be steepest in puberty. “Thus, dogs in puberty have great potential for learning and therefore trainability,” says Wallis.
Dogs as a model for ADHD and Alzheimer’s disease
As the development of attentiveness in the course of a dog’s life is similar to human development in many respects, dogs make appropriate animal models for various human psychological diseases. For instance, the course of diseases like ADHD (attention deficit/hyperactivity disorder) or Alzheimer’s can be studied by observing the behavior of dogs. In her current project Wallis is investigating the effects of diet on cognition in older dogs together with her colleague Durga Chapagain. The scientists are still looking for dog owners who would like to participate in a long-term study.
Some innate preferences shape the sound of words from birth
Languages are learned, it’s true, but are there also innate bases in the structure of language that precede experience? Linguists have noticed that, despite the huge variability of human languages, there are some preferences in the sound of words that can be found across languages. This has led them to wonder whether such preferences reflect a universal, innate biological basis of language. A SISSA study provides evidence to support this hypothesis, demonstrating that certain preferences in the sound of words are already active in newborn infants.
Take the sound “bl”: how many words starting with that sound can you think of? Blouse, blue, bland… Now try with “lb”: how many can you find? None in English and Italian, and even in other languages such words either don’t exist or are extremely rare. Human languages offer several examples of this kind, and this indicates that in forming words we tend to prefer certain sound combinations to others, irrespective of which language we speak. The fact that this occurs across languages has prompted linguists to hypothesize the existence of biological bases of language (inborn and universal) which precede language learning in humans. Finding evidence to support this hypothesis is, however, far from easy, and the debate between the proponents of this view and those who believe that language is merely the result of learning is still open. But proof supporting the “universalist” hypothesis has now been provided by a new study conducted by a research team at the International School for Advanced Studies (SISSA) in Trieste, just published in the journal PNAS.
David Gomez, a SISSA research scientist working under the supervision of Jacques Mehler and first author of the paper, and his coworkers decided to observe the brain activity of newborns. “In fact, if it is possible to demonstrate that these preferences are already present within days from birth, when the newborn baby is still unable to speak and presumably has very limited language knowledge, then we can infer that there is an inborn bias that prefers certain words to others”, comments Gomez.
“To monitor the newborns’ brain activity we used a non-invasive technique, i.e., functional near-infrared spectroscopy”, explains Marina Nespor, a SISSA neuroscientist who participated in the study. During the experiments the newborns would listen to words starting with normally “preferred” sounds (like “bl”) and others with uncommon sounds (“lb”). “What we found was that the newborns’ brains reacted in a significantly different manner to the two types of sound” continues Nespor.
“The brain regions that are activated while the newborns are listening react differently in the two cases”, comments Gomez, “and reflect the preferences observed across languages, as well as the behavioural responses recorded in similar experiments carried out in adults”. “It’s difficult to imagine what languages would sound like if humans didn’t share a common knowledge base”, concludes Gomez. “We are lucky that this common base exists. This way, our children are born with the ability to distinguish words from ‘non-words’, regardless of which language they will then go on to learn”.
Researchers at The Ohio State University have found a way for computers to recognize 21 distinct facial expressions—even expressions for complex or seemingly contradictory emotions such as “happily disgusted” or “sadly angry.”

(Image caption: Researchers at the Ohio State University have found a way for computers to recognize 21 distinct facial expressions — even expressions for complex or seemingly contradictory emotions. The study gives cognitive scientists more tools to study the origins of emotion in the brain. Here, a study participant makes three faces: happy (left), disgusted (center), and happily disgusted (right). Credit: Image courtesy of The Ohio State University.)
In the current issue of the Proceedings of the National Academy of Sciences, they report that they were able to more than triple the number of documented facial expressions that researchers can now use for cognitive analysis.
“We’ve gone beyond facial expressions for simple emotions like ‘happy’ or ‘sad.’ We found a strong consistency in how people move their facial muscles to express 21 categories of emotions,” said Aleix Martinez, a cognitive scientist and associate professor of electrical and computer engineering at Ohio State. “That is simply stunning. That tells us that these 21 emotions are expressed in the same way by nearly everyone, at least in our culture.”
The resulting computational model will help map emotion in the brain with greater precision than ever before, and perhaps even aid the diagnosis and treatment of mental conditions such as autism and post-traumatic stress disorder (PTSD).
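The underlying idea – that compound expressions combine the facial muscle movements (action units) of the basic emotions they are built from – can be sketched as a simple lookup. The action-unit sets below are simplified illustrations for this sketch, not the study’s exact coding:

```python
# Illustrative sketch: labelling compound facial expressions from action
# units (AUs). The AU sets are simplified examples, not the study's coding.

BASIC_EMOTIONS = {
    "happy": {6, 12},        # cheek raiser, lip-corner puller
    "disgusted": {9, 10},    # nose wrinkler, upper-lip raiser
    "sad": {1, 4, 15},       # inner-brow raiser, brow lowerer, lip-corner depressor
    "angry": {4, 5, 7, 23},  # brow lowerer, upper-lid raiser, lid tightener, lip tightener
}

def label_expression(observed_aus):
    """Name the expression: a basic emotion, a compound one, or neutral."""
    matched = sorted(e for e, aus in BASIC_EMOTIONS.items() if aus <= observed_aus)
    if not matched:
        return "neutral"
    # Two fully matched basic emotions yield a compound category,
    # e.g. happy + disgusted corresponds to "happily disgusted".
    return "+".join(matched)

print(label_expression({6, 9, 10, 12}))  # disgusted+happy
```

A real model works the other way round as well, learning which muscle movements reliably co-occur for each of the 21 categories across many faces.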
Congenitally blind visualise numbers opposite way to sighted
For the first time, scientists have uncovered that people blind from birth visualise numbers the opposite way around to sighted people.
Through a recent study, the researchers in our Department of Psychology were surprised to find that the ‘mental number line’ for congenitally blind people ran in the opposite direction to sighted people, with larger numbers to the left and smaller numbers to the right.
Whereas a sighted person would count 1, 2, 3, 4, 5, the researchers have found that someone blind from birth mentally visualises their number line from right to left, effectively 5, 4, 3, 2, 1.
Senior Lecturer from the Department, Dr Michael Proulx explained: “Our unexpected results relate to the fact that people who were born visually impaired like to map the position of objects in relation to themselves.
“It is likely that this style of spatial representation extends to numbers too, and the right-handed participants mapped the number line from their dominant right hand.”
The study used a novel ‘random number generation’ procedure where volunteers were asked to say numbers while turning their head to the left or the right. This task is linked to how the brain visualises a mental number line.
As part of the study, an international team from Bath, Sabanci University (Turkey) and Taisho University (Japan) compared the responses of congenitally blind people with those of the adventitiously blind – people who were born with vision but later lost it – and sighted, but blindfolded, volunteers.
Previous studies have shown that people in Western cultures, where writing runs from left to right, possess a similar mental number line, with small numbers on the left and larger numbers on the right. But in cultures where writing flows from right to left, for example in Arabic, people’s mental number lines likewise run from right to left. This is the first time scientists have found that blind individuals in a Western culture also have a right-to-left number line.
Dr Proulx added: “Remembering and representing numbers is an important skill, and the foundation of mental maths. Visually impaired people are just as good at mathematics as sighted people, if not better – the Georgian-era maths professor and Royal Society Fellow Nicholas Saunderson being one famous example.
“What makes this work exciting is that Saunderson may have been able to advance mathematics with an entirely different mental representation of numbers than that of sighted contemporaries like Isaac Newton.”
Detecting Unidentified Changes
Does becoming aware of a change to a purely visual stimulus necessarily enable the observer to identify or localise the change, or can change detection occur in the absence of identification or localisation? Several theories of visual awareness stress that we are aware of more than just the few objects to which we attend. In particular, it is clear that to some extent we are also aware of the global properties of the scene, such as the mean luminance or the distribution of spatial frequencies. It follows that we may be able to detect a change to a visual scene by detecting a change to one or more of these global properties. However, detecting a change to a global property may not supply us with enough information to accurately identify or localise which object in the scene has changed. Thus, it may be possible to reliably detect the occurrence of changes without being able to identify or localise what has changed. Previous attempts to show that this can occur with natural images have produced mixed results. Here we use a novel analysis technique to provide additional evidence that changes can be detected in natural images without also being identified or localised. It is likely that this occurs through observers monitoring the global properties of the scene.
The circadian clock is like an orchestra with many conductors
You’ve switched to the night shift and your weight skyrockets, or you wake at 7 a.m. on weekdays but sleep until noon on weekends—a social jet lag that can fog your Saturday and Sunday.
Life runs on rhythms driven by circadian clocks, and disruption of these cycles is associated with serious physical and emotional problems, says Orie Shafer, a University of Michigan assistant professor of molecular, cellular and developmental biology.
Now, new findings from Shafer and U-M doctoral student Zepeng Yao challenge the prevailing wisdom about how our body clocks are organized, and suggest that interactions among neurons that govern circadian rhythms are more complex than originally thought.
Yao and Shafer looked at the circadian clock neuron network in fruit flies, which is functionally similar to that of mammals but, at only 150 clock neurons, much simpler. Previously, scientists thought that a master group of eight clock neurons acted as a pacemaker for the remaining 142 clock neurons – think of a conductor leading an orchestra – thus imposing the rhythm for the fruit fly circadian clock. The same principle was thought to apply to mammals.
Interactions among clock neurons determine the strength and speed of circadian rhythms, Yao says. So, when the researchers genetically changed the clock speeds of only the eight master pacemakers, they could examine how well the conductor alone governed the orchestra. They found that, without environmental cues, the orchestra didn’t follow the conductor as closely as previously thought.
Some of the fruit flies completely lost their sense of time, and others simultaneously demonstrated two different sleep cycles: one following the group of eight neurons and the other following some other set of neurons.
"The finding shows that instead of the entire orchestra following a single conductor, part of the orchestra is following a different conductor or not listening at all," Shafer said.
The findings suggest that instead of a group of master pacemaker neurons, the clock network consists of many independent clocks, each of which drives rhythms in activity. Shafer and Yao suspect that a similar organization will be found in mammals as well.
"A better understanding of the circadian clock mechanisms will be critical for attempts to alleviate the adverse effects associated with circadian disorders," Yao said.
Disrupting the circadian clock through shift work is associated with diabetes, obesity, stress, heart disease, mood disorders and cancer, among other disorders, Yao says. The International Agency for Research on Cancer has classified shift work that disrupts circadian rhythms as a probable human carcinogen.
Artificial intelligence lie detector
Wrongly accused and imprisoned for a crime you didn’t commit. It sounds like the plot of a generic crime thriller. However, this scenario does happen from time to time in the UK. From the Birmingham Six, falsely imprisoned for sixteen years, to the more recent case of Barri White, who was wrongly jailed for the murder of his girlfriend Rachel Manning, these cases can strike the public as tragic miscarriages of justice.
However, what if you could stop these miscarriages of justice from happening? Imperial alumnus Dr James O’Shea, who graduated with a Bachelor of Science in Chemistry in 1976, has built a lie detector device called the ‘Silent Talker’ that he believes could help to improve criminal investigations.
While lie detector tests of any sort are not currently admissible evidence in British courts, Dr O’Shea believes Silent Talker could be an invaluable tool in helping law enforcement to focus their investigations.
Dr O’Shea says: “An original member of my team who helped to develop the Silent Talker was very close to the area where one of the attacks by the Yorkshire Ripper took place. She took an interest in the case and found that the Ripper had been interviewed and passed over several times by the police. If the police had Silent Talker back then, it may have helped them to determine that they needed to spend a little more time on this guy, and investigate his background more closely.”
Artificially intelligent
The Silent Talker consists of a digital video camera that is hooked up to a computer. It runs a series of programs called artificial neural networks. These are computational models that take their design from animals’ central nervous systems, acting like an autonomous ‘brain’ for the device.
The computer programming in the artificial brain is a type of artificial intelligence called machine learning. It enables Silent Talker to learn and recognise patterns in data so that it can constantly adapt and reprogram itself during an interview. This enables Silent Talker to build up an overall profile of the subject to identify when someone is lying or telling the truth.
But how does it know when someone is lying? The inventors of the device claim it’s written all over your face. The camera records the subject in an interview and the artificial brain identifies non-verbal ‘micro-gestures’ on people’s faces. These are unconscious responses that Silent Talker picks up on to determine if the interviewee is lying.
Examples of micro-gestures include signs of stress, mental strain and what psychologists call ‘duping delight’. This refers to the unconscious flash of a smile at the pleasure and thrill of getting away with telling a lie. Dr O’Shea says these ‘tells’ are extremely fine-grained and exceedingly difficult for the interviewee to have any control over.
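In outline, this is a standard pattern-classification pipeline: numeric features describing micro-gestures are extracted from the video and fed to a network that outputs a deception score. The sketch below is purely illustrative – a single logistic unit, the simplest building block of such a network – and the feature names and weights are invented for the example, not taken from the real Silent Talker system:

```python
import math

# Purely illustrative: a single logistic unit scoring hand-crafted
# micro-gesture features. Feature names and weights are invented for the
# example and bear no relation to the real Silent Talker system.

FEATURES = ["blink_rate", "gaze_aversion", "micro_smile", "lip_press"]
WEIGHTS = [0.8, 1.1, 1.5, 0.6]   # hypothetical learned weights
BIAS = -1.2

def deception_score(values):
    """Map one feature vector (values in [0, 1]) to a score in (0, 1)."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, values))
    return 1.0 / (1.0 + math.exp(-z))  # logistic activation

def classify(values, threshold=0.5):
    return "deceptive" if deception_score(values) >= threshold else "truthful"

print(classify([0.1, 0.2, 0.9, 0.1]))  # deceptive
```

A real system would learn its weights from labelled interviews and combine many such units over sliding time windows, rather than scoring a single feature vector.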
Coming to an interview near you
Dr O’Shea says the uses for such a device are numerous.
“One can imagine a near-future scenario in which your prospective employers are wearing Google Glasses, where every micro-gesture that ‘leaks’ from your face is a response that flashes by their eyes as ‘true’ or ‘false’ in real-time.”
While it does use the latest in computational techniques, Dr O’Shea says Silent Talker is not infallible. In tests to classify the micro-gestures as deceptive or non-deceptive, the Silent Talker has achieved an accuracy rate of 87 per cent.
However, this has not stopped prospective clients from clamouring for the device. Dr O’Shea and his colleagues have already been approached by security services about whether Silent Talker could be used to determine if people approaching a military checkpoint could be suicide bombers so that they can be eliminated before blowing up their target. The team’s answer has been a loud and emphatic ‘no’.
“In an ethical sense, such decisions should not be taken by a machine,” says Dr O’Shea.
A good trip: Researchers are giving psychedelics to cancer patients to help alleviate their despair — and it’s working
On a bone-chilling morning in February last year, Nick Fernandez bundled up and took the subway from his Manhattan apartment to the Bluestone Center for Clinical Research, which is located in an art deco-style building on the Upper East Side. A 27-year-old graduate student in psychology with dark, wavy hair and delicate, bird-like features, Fernandez was excited and nervous. He had eaten a light breakfast consisting of a bagel and industrial-strength coffee in preparation for another journey he was about to take. Fernandez had signed up to be a subject in a New York University study into the use of psilocybin, the psychoactive ingredient in hallucinogenic mushrooms, to relieve mental anguish in people with terminal or recurrent cancer.
Fernandez hoped that the drug would lift the shroud of melancholy and free-floating anxiety that had enveloped him ever since he was diagnosed with leukemia in 2004 during his senior year in high school. Two and a half years of almost continuous chemotherapy vanquished the disease, but left him drained and traumatised. The former soccer star dropped more than 50 lbs from an already lean frame. ‘It was pretty brutal and forces you to grow up fast,’ said Fernandez, who became intensely interested in spiritual philosophy during this period, and went on to dabble in psychedelics in college. For years afterward, every sneeze and sniffle, every day that he felt tired or out of sorts, filled him with an unshakeable dread that the cancer had returned. When he heard the study mentioned on a radio show, he immediately signed up.
Jeffrey Guss and Erin Zerbo, the two NYU psychiatrists who would quietly monitor Fernandez’s progress throughout the day, greeted him when he arrived. After they took his vital signs, Fernandez changed into sweat pants and a shirt, and settled into a converted dental exam room that had been transformed into a hippie-style sanctum: tricked out with fresh flowers and fruits, a comfy sofa littered with plush pillows, Buddhist and shamanistic totems, and a high-tech sound system. Stephen Ross, an associate professor of psychiatry at NYU and the lead investigator for the study, made a brief appearance in the trip room. He was holding a glass vial that had been retrieved earlier that morning from a massive safe located inside a high-security storage room. It contained a single white capsule, and no one could be sure if it was a placebo – a dummy pill – or a 30 milligram dose of synthesised psilocybin.
Physics-minded crows bring Aesop’s fable to life
Eureka! Like Archimedes in his bath, crows know how to displace water, showing that Aesop’s fable The Crow and the Pitcher isn’t purely fictional.
To see if New Caledonian crows could handle some of the basic principles of volume displacement, Sarah Jelbert at the University of Auckland in New Zealand and her colleagues placed scraps of meat just out of a crow’s reach, floating in a series of tubes that were part-filled with water. Objects potentially useful for bringing up the water level, like stones or heavy rubber erasers, were left nearby.
The crows successfully figured out that heavy and solid objects would help them get a treat faster. They also preferred to drop objects in tubes where they could access a reward more easily, picking out tubes with higher water levels and choosing tubes of water over sand-filled ones.
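The physics the crows exploit is straightforward displacement: a fully submerged object raises the water level by its own volume divided by the tube’s cross-sectional area. A quick back-of-the-envelope sketch, with made-up dimensions:

```python
import math

def water_rise(object_volume_cm3, tube_diameter_cm):
    """Rise in water level (cm) when an object is fully submerged in a tube."""
    cross_section_cm2 = math.pi * (tube_diameter_cm / 2) ** 2
    return object_volume_cm3 / cross_section_cm2

# Hypothetical numbers: a 10 cm^3 stone dropped into a 5 cm wide tube
# raises the water level by roughly half a centimetre, so a handful of
# stones is enough to bring a floating treat within reach.
print(round(water_rise(10, 5), 2))  # 0.51
```

This also shows why the crows’ preferences pay off: solid, sinking objects displace their full volume, while hollow or floating ones displace less, and sand does not rise at all.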
Common psychiatric disorders, such as anxiety and addiction, likely result from changes in brain circuitry. Understanding structural and functional brain connections – and how they change in psychiatric disorders – could lead to novel preventive and therapeutic strategies.

The bed nucleus of the stria terminalis (BNST) has been linked to both anxiety and addiction, but its circuitry in humans has not been described. Jennifer Blackford, Ph.D., assistant professor of Psychiatry, and colleagues used two neuroimaging methods – diffusion tensor imaging and functional MRI – to identify patterns of connectivity between the BNST and other brain regions in healthy individuals. The BNST showed connections to multiple subcortical brain regions, including limbic, thalamic and basal ganglia structures, which matched reported connections in rodents. The researchers also identified two novel BNST connections: to the temporal pole and to the paracingulate gyrus.
The findings, reported in NeuroImage, provide a map of BNST neurocircuitry and lay the foundation for future studies of the circuits that mediate anxiety and addiction.
(Source: news.vanderbilt.edu)