Neuroscience

Articles and news from the latest research reports.

Posts tagged psychology

91 notes


The Knowing Nose: Chemosignals Communicate Human Emotions

Many animal species transmit information via chemical signals, but the extent to which these chemosignals play a role in human communication is unclear. In a new study published in Psychological Science, a journal of the Association for Psychological Science, researcher Gün Semin and colleagues from Utrecht University in the Netherlands investigate whether we humans might actually be able to communicate our emotional states to each other through chemical signals.

Existing research suggests that emotional expressions are multi-taskers, serving more than one function. Fear signals, for example, not only help to warn others of environmental danger but are also associated with behaviors that confer a survival advantage through sensory acquisition. Research has shown that taking on a fearful expression (e.g., widening the eyes) leads us to breathe in more through our noses, enhances our perception, and accelerates our eye movements so that we can spot potentially dangerous targets more quickly. Disgust signals, on the other hand, warn others to avoid potentially noxious chemicals and are associated with sensory rejection, causing us to lower our eyebrows and wrinkle our noses.

Semin and colleagues wanted to build on this research to examine the role of chemosignals in social communication. They hypothesized that chemicals in bodily secretions, such as sweat, would activate similar processes in both the sender and receiver, establishing an emotional synchrony of sorts. Specifically, people who inhaled chemosignals associated with fear would themselves make a fear expression and show signs of sensory acquisition, while people who inhaled chemosignals associated with disgust would make an expression of disgust and show signs of sensory rejection.

Filed under emotion emotional states chemical signals olfactory system social communication neuroscience psychology science

143 notes


Inside the unconscious brain

A new study from MIT and Massachusetts General Hospital (MGH) reveals, for the first time, what happens inside the brain as patients lose consciousness during anesthesia.

By monitoring brain activity as patients were given a common anesthetic, the researchers were able to identify a distinctive brain activity pattern that marked the loss of consciousness. This pattern, characterized by very slow oscillation, corresponds to a breakdown of communication between different brain regions, each of which experiences short bursts of activity interrupted by longer silences.

“Within a small area, things can look pretty normal, but because of this periodic silencing, everything gets interrupted every few hundred milliseconds, and that prevents any communication,” says Laura Lewis, a graduate student in MIT’s Department of Brain and Cognitive Sciences (BCS) and one of the lead authors of a paper describing the findings in the Proceedings of the National Academy of Sciences this week.

This pattern may help anesthesiologists to better monitor patients as they receive anesthesia, preventing rare cases where patients awaken during surgery or stop breathing after excessive doses of anesthesia drugs.

Filed under brain brain activity anesthesia consciousness oscillations neuroscience psychology science

138 notes


Noam Chomsky on Where Artificial Intelligence Went Wrong

If one were to rank a list of civilization’s greatest and most elusive intellectual challenges, the problem of “decoding” ourselves — understanding the inner workings of our minds and our brains, and how the architecture of these elements is encoded in our genome — would surely be at the top. Yet the diverse fields that took on this challenge, from philosophy and psychology to computer science and neuroscience, have been fraught with disagreement about the right approach.

In 1956, the computer scientist John McCarthy coined the term “Artificial Intelligence” (AI) to describe the study of intelligence by implementing its essential features on a computer. Instantiating an intelligent system using man-made hardware, rather than our own “biological hardware” of cells and tissues, would show ultimate understanding, and have obvious practical applications in the creation of intelligent devices or even robots.

Some of McCarthy’s colleagues in neighboring departments, however, were more interested in how intelligence is implemented in humans (and other animals) first. Noam Chomsky and others worked on what became cognitive science, a field aimed at uncovering the mental representations and rules that underlie our perceptual and cognitive abilities. Chomsky and his colleagues had to overthrow the then-dominant paradigm of behaviorism, championed by Harvard psychologist B.F. Skinner, in which animal behavior was reduced to a simple set of associations between an action and its subsequent reward or punishment. The undoing of Skinner’s grip on psychology is commonly marked by Chomsky’s 1959 critical review of Skinner’s Verbal Behavior, a book in which Skinner attempted to explain linguistic ability using behaviorist principles.

Filed under Noam Chomsky AI intelligence cognition behaviorism statistical models neuroscience psychology science

224 notes

Why Children Think They Are Invisible when Covering Their Eyes

Dr. James Russell and a research team at the University of Cambridge recently published work on young children’s conception of personal visibility, which furthers the understanding of cognitive development and of our emerging sense of self.

The research involved children three to four years of age. Researchers placed an eye mask on each of the children and asked them if they could be seen when wearing it. They then asked each child if an adult wearing a similar mask could be seen. The majority of the children believed they were not visible when wearing the mask, and most believed that the masked adult was hidden as well.

Additional tests revealed a further layer of complexity, demonstrating that although the children thought they were invisible when their eyes were covered, they still believed that their head and body could be seen.

The research team concluded by process of elimination that the factor that makes children believe they are visible is eye contact with another person.

“… it would seem that children apply the principle of joint attention to the self and assume that for somebody to be perceived, experience must be shared and mutually known to be shared, as it is when two pairs of eyes meet,” the researchers reported. “Young children’s natural tendency to acquire knowledge intersubjectively, by joint attention, leads them to undergo a developmental period in which they believe the self is something that must be mutually experienced for it to be perceived.”

Evidently, children only believe they can be seen when making eye contact with another person. The implications point to a simple but necessary way to make children feel present and involved. Cultures worldwide seem to have some version of “peek-a-boo,” as a quick Google image search reveals. A lack of eye contact in children has been identified as an early sign of autism, while the presence of eye contact is associated with empathy. Dr. Russell’s team seems to have discovered a key facet of cognitive development.

The results of Dr. Russell’s study were published in the Journal of Cognition and Development.

(Source: united-academics.org)

Filed under children personal visibility eye contact perception neuroscience psychology science

54 notes

Multivitamin lifts brain activity

A daily multivitamin supplement may improve brain efficiency in older women, according to new research from Swinburne University of Technology.

A four-month study of the commercial product Swisse Women’s Ultivite 50+, led by Dr Helen Macpherson of Swinburne’s Centre for Human Psychopharmacology, found some evidence that multivitamin supplements may influence cognitive function by altering electrical activity in the brain.

"The main finding of the study was that 16 weeks supplementation with the Swisse Women’s 50+ multivitamin modulated brain activity," Dr Macpherson said.

"This is an important result as it shows there are direct effects of multivitamins on the brain.

"Previous research has used measures of behaviour to determine whether multivitamins can affect brain function, but this is the first trial to directly measure brain activity."

The study was conducted over 16 weeks with 56 women aged between 64 and 79 who were concerned about their memory or experiencing memory difficulties. They were randomly assigned to take the multivitamin supplement or a placebo daily.

Volunteers underwent a recording of their brain electrical activity whilst performing a spatial working memory task.

The research was published in Physiology and Behavior.

A previous paper published in Psychopharmacology reported that multivitamin supplementation improved behavioural performance on a similar task, in the same group of participants.

The study concluded that 16 weeks of supplementation with a combined multivitamin, mineral and herbal formula may benefit memory by enabling the brain to work more efficiently.

"When considered with our other findings of benefits to memory performance, there is increasing evidence that multivitamins may be useful to combat cognitive decline in the elderly," Dr Macpherson said.

(Source: swinburne.edu.au)

Filed under brain cognitive decline memory brain function multivitamin neuroscience psychology science

44 notes

Brain imaging alone cannot diagnose autism

In a column appearing in the current issue of the journal Nature, McLean Hospital biostatistician Nicholas Lange, ScD, cautions against heralding the use of brain imaging scans to diagnose autism and urges greater focus on conducting large, long-term multicenter studies to identify the biological basis of the disorder.

"Several studies in the past two years have claimed that brain scans can diagnose autism, but this assertion is deeply flawed," said Lange, an associate professor of Psychiatry and Biostatistics at Harvard Medical School. "To diagnose autism reliably, we need to better understand what goes awry in people with the disorder. Until its solid biological basis is found, any attempt to use brain imaging to diagnose autism will be futile."

While cautioning against the current use of brain imaging as a diagnostic tool, he is a strong proponent of using the technology to help scientists better understand autism. Lange points out that through various brain imaging techniques, including functional magnetic resonance imaging (fMRI), positron emission tomography (PET), and volumetric MRI, researchers have made important discoveries about early brain enlargement in the disorder, how people with autism direct attention during social interaction, and the role of serotonin in the condition.

"Brain scans have led to these extremely valuable advances, and, with each discovery, we are getting closer to solving the autism pathology puzzle," said Lange. "What individuals with autism and their parents urgently need is for us to carry out large-scale studies that lead us to find reliable, sensitive and specific biological markers of autism with high predictive value that allow clinicians to identify interventions that will improve the lives of people with the disorder."

Autism and autism spectrum disorder (ASD) are terms regularly used to describe a group of complex disorders of brain development. These disorders are characterized, in varying degrees, by difficulties in social interaction, verbal and nonverbal communication, and repetitive behaviors; their diagnostic criteria have been revised in the newly proposed Diagnostic and Statistical Manual of Mental Disorders (DSM-5). The prevalence of ASD in the United States has increased 78 percent in the last decade, with the Centers for Disease Control estimating that one in 88 children has ASD.

(Source: eurekalert.org)

Filed under brain brain scans neuroimaging autism ASD neuroscience psychology science

43 notes

Making a Game Out of Improving the ‘Sticky’ Brain

UCSF neuroscientists have found that by training on attention tests, people young and old can improve brain performance and multitasking skills.

Anyone who tries to perform two tasks at once is likely to do worse on both. Why that is so at the neurological level has largely been terra incognita. But research now is starting to reveal the impact of multitasking on short-term memory and attention.

Adam Gazzaley, MD, PhD, associate professor of neurology, physiology and psychiatry, and researchers at the UCSF Neuroscience Imaging Center use EEG, MRI and other non-invasive tools to study cognitive processes while people try their best on drills that test short-term memory.

Filed under brain cognitive processes memory STM research neuroscience psychology science

60 notes

UCSB Scientists Report ‘New Beginning’ in Split-Brain Research, Using New Analytical Tools

UC Santa Barbara has reported an important discovery in the interdisciplinary study of split-brain research. The findings uncover dynamic changes in brain coordination patterns between left and right hemispheres.

Split-brain research has been conducted for decades, and scientists long ago showed that language processing is largely located in the left side of the brain. When words appear only in the left visual field –– an area processed by the right side of the brain –– the right brain must transfer that information to the left brain in order to interpret it. The new study at UCSB shows that healthy test subjects respond less accurately when information is shown only to the right brain.

While hemispheric specialization is well established, the new study sheds light on a highly complex interplay, with neurons firing back and forth between distinct areas in each half of the brain. The findings rely on extremely sensitive neuroscience equipment and on analysis techniques from network science, a fast-growing field that draws on insights from sociology, mathematics, and physics to understand complex systems composed of many interacting parts. These tools can be applied to systems as diverse as earthquakes and brains.

Fifty years ago, UC Santa Barbara neuroscientist Michael S. Gazzaniga moved the field forward when he was a graduate student at the California Institute of Technology and first author of a groundbreaking report on split-brain patients. The study, which became world-renowned, was published in the Proceedings of the National Academy of Sciences (PNAS) in August 1962. This week, in the very same journal, Gazzaniga and his team announced major new findings in split-brain research. The report is an example of the interdisciplinary science for which UCSB is well known.

"The occasion of this paper is on the 50th anniversary of the first report on human split-brain research reported in PNAS," said Gazzaniga. "That study showed how surgically dividing the two hemispheres of the human brain –– in an attempt to control epilepsy –– allowed for studying how each isolated half-brain was specialized for cognitive function.

"In the present study, new techniques –– not present 50 years ago –– begin to allow for an understanding of how the normal, undivided brain integrates the special functions of each half brain. It is a new beginning and very exciting," said Gazzaniga, professor of psychology in UCSB’s Department of Psychological and Brain Sciences, and director of UCSB’s SAGE Center for the Study of Mind.

(Source: ia.ucsb.edu)

Filed under brain brain research split-brain neural oscillations neuroscience psychology science
