Neuroscience

Articles and news from the latest research reports.

Posts tagged psychology

199 notes

Caffeine Consumption Within Six Hours Of Bedtime May Disrupt Sleep

Consumption of caffeine, even six hours before bedtime, can have significant, disruptive effects on sleep. The study, from the American Academy of Sleep Medicine, was published in the Journal of Clinical Sleep Medicine.

“Sleep specialists have always suspected that caffeine can disrupt sleep long after it is consumed,” said American Academy of Sleep Medicine President M. Safwan Badr, MD. “This study provides objective evidence supporting the general recommendation that avoiding caffeine in the late afternoon and at night is beneficial for sleep.”

The researchers found that 400 mg of caffeine (about 2-3 cups of coffee) taken at bedtime, or three to six hours before bedtime, significantly impacts sleep. Objectively measured total sleep time was reduced by more than an hour even when the caffeine was consumed six hours before going to bed. Subjective reports, however, suggest that the study participants were unaware of this sleep disturbance.

“Drinking a big cup of coffee on the way home from work can lead to negative effects on sleep just as if someone were to consume caffeine closer to bedtime,” said Christopher Drake, PhD, investigator at the Henry Ford Sleep Disorders and Research Center and associate professor of psychiatry and behavioral neurosciences at Wayne State University.

“People tend to be less likely to detect the disruptive effects of caffeine on sleep when taken in the afternoon,” noted Drake, who is also on the board of directors of the Sleep Research Society.

The researchers recruited 12 healthy normal sleepers, as determined by a physical examination and clinical interview. Subjects were instructed to maintain their normal sleep schedule, but were given three pills a day for four days to be taken at six, three and zero hours before scheduled bedtime. Two of the pills were placebos, and one was 400 mg of caffeine. On one of the four days, all three of the participants’ pills were a placebo. The researchers measured sleep disturbance subjectively using a standard sleep diary and objectively using an in-home sleep monitor.
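The dosing scheme can be sketched as a small table; the condition labels and layout below are illustrative, not taken from the paper:

```python
# Sketch of the double-blind dosing scheme described above.
# Condition names and dict layout are illustrative, not from the paper.
conditions = {
    "caffeine 6 h before bed": {"6h": "caffeine", "3h": "placebo",  "0h": "placebo"},
    "caffeine 3 h before bed": {"6h": "placebo",  "3h": "caffeine", "0h": "placebo"},
    "caffeine at bedtime":     {"6h": "placebo",  "3h": "placebo",  "0h": "caffeine"},
    "all placebo":             {"6h": "placebo",  "3h": "placebo",  "0h": "placebo"},
}

# Each day a participant takes three pills; at most one is 400 mg of caffeine.
for pills in conditions.values():
    assert len(pills) == 3
    assert sum(p == "caffeine" for p in pills.values()) <= 1
```

Because every participant swallows three identical-looking pills each day, neither the timing nor the presence of the caffeine dose is apparent to them, which is what makes the subjective reports of (un)disturbed sleep meaningful.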

This is the first study to investigate the effects of a given dose of caffeine taken at different times before sleep. The findings suggest that, in order to allow healthy sleep, individuals should avoid caffeine after 5pm.

Filed under caffeine caffeine consumption sleep circadian rhythms psychology neuroscience science

130 notes

Looking for a needle in a haystack: new research shows how brain prepares to start searching

Many of us have steeled ourselves for those ‘needle in a haystack’ tasks of finding our vehicle in an airport car park, or scouring the supermarket shelves for a favourite brand.


A new study suggests that our understanding of how the human brain prepares to perform visual search tasks of varying difficulty may need to be revised.

When people search for a specific object, they tend to hold in mind a visual representation of it, based on key attributes like shape, size or colour. Scientists call this ‘advanced specification’. For example, we might search for a friend at a busy railway station by scanning the platform for someone who is very tall or who is wearing a green coat, or a combination of these characteristics.
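As a toy illustration of such an attribute-based template, one can think of advanced specification as matching candidates against a held set of attributes (all names and values below are invented for the example):

```python
# Toy illustration of "advanced specification": hold a template of
# target attributes in mind and scan candidates against it.
# All attribute names and values here are invented for the example.
target = {"height": "tall", "coat": "green"}

crowd = [
    {"height": "short", "coat": "red"},
    {"height": "tall",  "coat": "green"},
    {"height": "tall",  "coat": "blue"},
]

# A candidate "matches" only if every attribute of the template agrees.
matches = [person for person in crowd
           if all(person.get(k) == v for k, v in target.items())]
```

Only the candidate matching every attribute of the template survives the scan; partial matches (tall but in a blue coat) are rejected.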

Researchers from the School of Psychology at the University of Lincoln, UK, set out to better explain how these abstract visual representations are formed. They used fMRI scanners to record neural activity when volunteers prepared to search for a target object: a coloured letter amid a screen of other coloured letters.

Their findings, published in the journal ‘Brain Research’, are the first to fully isolate the different areas of the human brain involved in this ‘prepare to search’ function. Surprisingly, they show that the frontal areas of the brain, usually key to advanced cognitive tasks, appear to take a back seat. Instead, it is the simpler posterior areas of the brain and the sub-cortical areas that do the work.

Dr Patrick Bourke from the University of Lincoln’s School of Psychology, who led the study, said: “Up until now, when researchers have studied visual search tasks they have also found that frontal areas of the brain were active. This has been assumed to indicate a control system: an ‘executive’ that largely resides in the advanced front of the brain which sends signals to the simpler back of the brain, activating visual memories. Here, when we isolated the ‘prepare’ part of the task from the actual search and response phase we found that this activation in the front was no longer present.”

This finding has important implications for understanding the fundamental brain processes involved. It was previously thought that the intra-parietal region of the brain, which is linked to visual attention, was the central component of the supposed ‘front-back’ control network, relaying useful information (such as a shape or colour bias) from frontal areas of the brain to the back, where simple visual representations of the object are held. If the frontal areas are not activated in the preparation phase, this cannot be the case.

The study also showed that the pattern of brain activation varied depending on the anticipated difficulty of the search task, even when the target object was the same. This indicates that rather than holding in mind a single representation of an object, a new target is constructed each time, depending on the nature of the task.

Dr Bourke added: “While consistent with previous brain imaging work on visual search, these results change the interpretations and assumptions that have been applied previously. Notably, they highlight a difference between studies of animals’ brains and those of humans. Studies with monkeys convincingly show the front-back control system and we thought we understood how this worked. At the same time our findings are consistent with a growing body of brain imaging work in humans that also shows no frontal brain activation when short term memories are held.”

(Source: lincoln.ac.uk)

Filed under visual search visual representations brain activity fMRI brain imaging psychology neuroscience science

361 notes

Literacy depends on nurture, not nature

A University at Buffalo education professor has sided with the environment in the timeless “nurture vs. nature” debate after his research found that a child’s ability to read depends mostly on where that child is born, rather than on his or her individual qualities.

“Individual characteristics explain only 9 percent of the differences in children who can read versus those who cannot,” says Ming Ming Chiu, lead author of an international study that explains this connection and a professor in the Department of Learning and Instruction in UB’s Graduate School of Education. 

“In contrast, country differences account for 61 percent and school differences account for 30 percent,” Chiu says.
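These shares behave like variance components in a multilevel model: each level's share is its variance divided by the total. A minimal sketch follows; the component values are invented to reproduce the reported percentages, not estimates from the study itself:

```python
# Hypothetical variance components at the child, school and country
# levels; the values are chosen to reproduce the reported shares,
# not taken from the study itself.
var_child, var_school, var_country = 0.09, 0.30, 0.61

total = var_child + var_school + var_country
shares = {
    "child":   round(100 * var_child / total),
    "school":  round(100 * var_school / total),
    "country": round(100 * var_country / total),
}
# Each share is an intraclass-correlation-style ratio: the variance
# attributable to one level divided by the total variance.
```

Framed this way, "country explains 61 percent" means that knowing only a child's country removes well over half of the uncertainty about whether that child reads at a basic level.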

Therefore, he concludes, the country in which a child is born largely determines whether he or she will have at least basic reading skills. It’s clearly a case where “nurture” — the environment and surroundings of the child — is more important than “nature” — the child’s inherited, individual qualities, according to Chiu.

More than 99 percent of fourth-graders in the Netherlands can read, but only 19 percent of fourth-graders in South Africa can read, Chiu notes.

“Although the richest countries typically have high literacy rates exceeding 97 percent,” he says, “some rich countries, such as Qatar and Kuwait, have low literacy rates — 33 percent and 28 percent, respectively.”

The study, “Ecological, Psychological and Cognitive Components of Reading Difficulties: Testing the Component Model of Reading in Fourth-graders Across 38 Countries,” analyzed reading test scores of 186,725 fourth-graders from 38 countries, including more than 4,000 children from the United States. Chiu and co-authors Catherine McBride-Chang of the Chinese University of Hong Kong and Dan Lin of the Hong Kong Institute of Education published the study in the winter 2013 issue of the Journal of Learning Disabilities.

The educators used data from the Organization for Economic Cooperation and Development’s Program for International Student Assessment.

Besides showing that the country of origin was a better predictor of reading skills than individual traits, the study also showed that other attributes at the child, school and country levels were all related to reading.

First, girls were more likely than boys to have basic reading skills, Chiu says. Children with greater early-literacy skills, better attitudes about reading or greater self-confidence in their reading ability also were more likely to have strong basic reading skills.

“Children were more likely to have basic reading skills if they were from privileged families, as measured through socioeconomic status, number of books at home and parent attitudes about reading,” says Chiu. “Also, children attending schools with better school climate and more resources were more likely to have basic reading skills.

“Our U.S. culture values ‘can-do’ individualism, but we forget how much depends on being lucky enough to be born in the right place,” he says.

Filed under literacy learning reading education nature vs nurture psychology neuroscience science

445 notes

Your Brain Sees Things You Don’t

University of Arizona doctoral degree candidate Jay Sanguinetti has authored a new study, published online in the journal Psychological Science, that indicates that the brain processes and understands visual input that we may never consciously perceive.

The finding challenges currently accepted models about how the brain processes visual information.

A doctoral candidate in the UA’s Department of Psychology in the College of Science, Sanguinetti showed study participants a series of black silhouettes, some of which contained meaningful, real-world objects hidden in the white spaces on the outsides.

Sanguinetti worked with his adviser Mary Peterson, a professor of psychology and director of the UA’s Cognitive Science Program, and with John Allen, a UA Distinguished Professor of psychology, cognitive science and neuroscience, to monitor subjects’ brainwaves with an electroencephalogram, or EEG, while they viewed the objects.

"We were asking the question of whether the brain was processing the meaning of the objects that are on the outside of these silhouettes," Sanguinetti said. "The specific question was, ‘Does the brain process those hidden shapes to the level of meaning, even when the subject doesn’t consciously see them?’"

The answer, Sanguinetti’s data indicates, is yes.

Study participants’ brainwaves indicated that even if a person never consciously recognized the shapes on the outside of the image, their brains still processed those shapes to the level of understanding their meaning.

"There’s a brain signature for meaningful processing," Sanguinetti said. A peak in the averaged brainwaves called N400 indicates that the brain has recognized an object and associated it with a particular meaning.

"It happens about 400 milliseconds after the image is shown, less than a half a second," said Peterson. "As one looks at brainwaves, they’re undulating above a baseline axis and below that axis. The negative ones below the axis are called N and positive ones above the axis are called P, so N400 means it’s a negative waveform that happens approximately 400 milliseconds after the image is shown."
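A toy sketch of how such a negative deflection might be located in an averaged waveform follows; the data are synthetic and this is not the study's actual analysis pipeline:

```python
import numpy as np

# Toy sketch of locating an N400-like component in an averaged ERP
# waveform. The waveform is synthetic; this is not the study's pipeline.
fs = 1000                      # sampling rate in Hz
t = np.arange(0, 0.8, 1 / fs)  # 0-800 ms after image onset

# Simulate an averaged waveform with a negative deflection near 400 ms.
erp = -2.0 * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))

# Look for the most negative point inside a 300-500 ms search window.
window = (t >= 0.3) & (t <= 0.5)
peak_idx = np.argmin(np.where(window, erp, np.inf))
peak_time_ms = t[peak_idx] * 1000

# A negative (N) peak around 400 ms is the "N400" signature Peterson
# describes: negative-going, roughly 400 ms after stimulus onset.
```

Averaging over many trials is what makes the component visible at all: single-trial EEG is dominated by noise, and only deflections time-locked to the stimulus survive the averaging.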

The presence of the N400 peak indicates that subjects’ brains recognize the meaning of the shapes on the outside of the figure.

"The participants in our experiments don’t see those shapes on the outside; nonetheless, the brain signature tells us that they have processed the meaning of those shapes," said Peterson. "But the brain rejects them as interpretations, and if it rejects the shapes from conscious perception, then you won’t have any awareness of them."

"We also have novel silhouettes as experimental controls," Sanguinetti said. "These are novel black shapes in the middle and nothing meaningful on the outside."

The N400 waveform does not appear on the EEG of subjects when they are seeing truly novel silhouettes, without images of any real-world objects, indicating that the brain does not recognize a meaningful object in the image.

"This is huge," Peterson said. "We have neural evidence that the brain is processing the shape and its meaning of the hidden images in the silhouettes we showed to participants in our study."

The finding leads to the question of why the brain would process the meaning of a shape when a person is ultimately not going to perceive it, Sanguinetti said.

"The traditional opinion in vision research is that this would be wasteful in terms of resources," he explained. "If you’re not going to ultimately see the object on the outside why would the brain waste all these processing resources and process that image up to the level of meaning?"

"Many, many theorists assume that because it takes a lot of energy for brain processing, that the brain is only going to spend time processing what you’re ultimately going to perceive," added Peterson. "But in fact the brain is deciding what you’re going to perceive, and it’s processing all of the information and then it’s determining what’s the best interpretation."

"This is a window into what the brain is doing all the time," Peterson said. "It’s always sifting through a variety of possibilities and finding the best interpretation for what’s out there. And the best interpretation may vary with the situation."

Our brains may have evolved to sift through the barrage of visual input in our eyes and identify those things that are most important for us to consciously perceive, such as a threat or resources such as food, Peterson suggested.

In the future, Peterson and Sanguinetti plan to look for the specific regions in the brain where the processing of meaning occurs.

"We’re trying to look at exactly what brain regions are involved," said Peterson. "The EEG tells us this processing is happening and it tells us when it’s happening, but it doesn’t tell us where it’s occurring in the brain."

"We want to look inside the brain to understand where and how this meaning is processed," said Peterson.

Images were shown to Sanguinetti’s study participants for only 170 milliseconds, yet their brains were able to complete the complex processes necessary to interpret the meaning of the hidden objects.

"There are a lot of processes that happen in the brain to help us interpret all the complexity that hits our eyeballs," Sanguinetti said. "The brain is able to process and interpret this information very quickly."

Sanguinetti’s study indicates that in our everyday life, as we walk down the street, for example, our brains may recognize many meaningful objects in the visual scene, but ultimately we are aware of only a handful of those objects.

The brain is working to provide us with the best, most useful possible interpretation of the visual world, Sanguinetti said, an interpretation that does not necessarily include all the information in the visual input.

Filed under visual perception brain mapping neuroimaging object recognition psychology neuroscience science

330 notes

Reduced cognitive control in passionate lovers

People who are in love are less able to focus and to perform tasks that require attention. This is the conclusion of researcher Henk van Steenbergen and colleagues from Leiden University and the University of Maryland. The article appeared in the journal Motivation and Emotion.


The more in love, the less focused you are

Forty-three participants who had been in a relationship for less than half a year performed a number of tasks during which they had to discriminate irrelevant from relevant information as quickly as possible. It appeared that the more in love they were, the less able they were to ignore the irrelevant information. Love intensity was thus related to how well someone was able to focus. There was no difference between men and women.
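A flanker-style task is one common way to quantify this ability to ignore irrelevant information; the sketch below computes an interference score from invented reaction times (task details and numbers are illustrative, not taken from the study):

```python
# Interference score from a flanker-style task: respond to a central
# target while ignoring flanking distractors. All reaction times (ms)
# below are invented for the example.
congruent_rts   = [412, 398, 430, 405]   # e.g. "HHHHH": flankers match target
incongruent_rts = [465, 471, 452, 480]   # e.g. "HHSHH": flankers conflict

interference = (sum(incongruent_rts) / len(incongruent_rts)
                - sum(congruent_rts) / len(congruent_rts))
# A larger interference score means the person is slowed more by the
# irrelevant flankers, i.e. is less able to filter them out.
```

Scores like this, computed per participant, are what can then be related to a self-reported measure of love intensity.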

Cognitive control

The participants listened to music that elicited romantic feelings and thought of a romantic event to intensify their feelings of love. Participants also completed a questionnaire that was used to assess the intensity of those feelings. The results of Van Steenbergen’s study differed from those of previous studies, which showed that the ability to ignore distracting information is required to maintain a long-term romantic relationship. Being able to control oneself (also called “cognitive control”) and to resist temptations that could threaten the relationship is essential in long-term love.

Thinking of your beloved

In the study by Van Steenbergen, in contrast, the participants had become involved in a romantic relationship only a few months earlier. “When you have just become involved in a romantic relationship you’ll probably find it harder to focus on other things because you spend a large part of your cognitive resources on thinking of your beloved”, Van Steenbergen says. “For long-lasting love in a long-term relationship, on the other hand, it seems crucial to have proper cognitive control.” Over time, a balance between less and more cognitive control may be critical for a successful relationship.

Why is romantic love associated with cognitive control?

Van Steenbergen emphasizes that the link between romantic love and cognitive control is a new area of research. “The reason why romantic love is associated with cognitive control is still unknown. It could be that lovers use all their cognitive resources to think about their beloved, which leaves them no resources to perform a boring task. It could also be that the association goes in the opposite direction: people who have reduced cognitive control may experience more intense love feelings than people who have higher levels of cognitive control.” Future research will have to clarify this.

(Source: news.leiden.edu)

Filed under passionate love cognitive control performance psychology neuroscience science

228 notes

A longitudinal study of grapheme-color synaesthesia in childhood

What color is H? Is 4 brighter than 9? For most people these questions might seem baffling, but not for people with grapheme-color synesthesia.

In the first long-term childhood study on grapheme-color synesthesia, researchers followed 80 children to determine when and how associations between graphemes and colors develop. The latest results are published in the open-access journal Frontiers in Human Neuroscience.

Grapheme-color synesthesia is a harmless, alternative form of perception caused by subtle differences in the brain – possibly, stronger connections between centers for language and color – that give letters and numbers their phantom colors. It is passed down from parent to child in around 1 to 2% of the population.

In the present study, a group of synesthete children was tested three times between 6 and 10 years old. Each child was presented with 36 graphemes – the letters A to Z and digits 0 to 9 – and asked to choose the ‘best’ of 13 colors for each.

Children with grapheme-color synesthesia had already developed strong associations for around 30% of graphemes at 6 years old. At 7 years old, the same children had associations for around 50% of graphemes, and this increased to 70% of graphemes at 10 years old. The synesthete children were consistent in their choices over this 4-year period. Three children who were synesthetes at ages 6 to 7 were no longer so at 10 years old, indicating that the condition spontaneously disappears in some children as they grow older.

"This repeated testing of child synesthetes in real time allowed us to see for the first time that synesthetic colours emerge slowly during childhood, building up an incremental inventory of colorful letters and numbers," says Dr. Simner, a cognitive neuropsychologist who specializes in synesthesia, from the University of Edinburgh, UK.

The researchers’ next challenge is to determine how changes in the intensity of synesthesia – such as its strengthening or loss with increasing age – can be explained by changes in the organization of the brain.

(Image: Shutterstock)

Filed under synaesthesia grapheme-color synaesthesia childhood psychology neuroscience science

138 notes

Scientists discover that ants, like humans, can change their priorities

All animals have to make decisions every day. Where will they live and what will they eat? How will they protect themselves? They often have to make these decisions as a group, too, turning what may seem like a simple choice into a far more nuanced process. So, how do animals know what’s best for their survival?

For the first time, Arizona State University researchers have discovered that at least in ants, animals can change their decision-making strategies based on experience. They can also use that experience to weigh different options.

The findings are featured today in the early online edition of the scientific journal Biology Letters, as well as in its Dec. 23 edition.

Co-authors Taka Sasaki and Stephen Pratt, both with ASU’s School of Life Sciences, have studied insect collectives, such as ants, for years. Sasaki, a postdoctoral research associate, specializes in adapting psychological theories and experiments designed for humans so they can be applied to ants, hoping to understand how collective decision-making arises from individually ignorant ants.

“The interesting thing is we can make decisions and ants can make decisions – but ants do it collectively,” said Sasaki. “So how different are we from ant colonies?”

To answer this question, Sasaki and Pratt gave a number of Temnothorax rugatulus ant colonies a series of choices between two nests with differing qualities. In one treatment, the entrances of the nests had varied sizes, and in the other, the exposure to light was manipulated. Since these ants prefer both a smaller entrance size and a lower level of light exposure, they had to prioritize.

“It’s kind of like humans buying a house,” said Pratt, an associate professor with the school. “There’s so many options to consider – the size, the number of rooms, the neighborhood, the price, if there’s a pool. The list goes on and on. And for the ants it’s similar, since they live in cavities that can be dark or light, big or small. With all of these things, just like with a human house, it’s very unlikely to find a home that has everything you want.”

Pratt continued to explain that because it is impossible to find the perfect habitat, ants make various tradeoffs for certain qualities, ranking them in order of importance. But when faced with a decision between two different homes, the ants displayed a previously unseen level of intelligence.

According to their data, the series of choices the ants faced caused them to reprioritize their preferences based on the type of decision they faced. Ants that had to choose a nest based on light level prioritized light level over entrance size in the final choice. On the other hand, ants that had to choose a nest based on entrance size ranked light level lower in the later experiment.

This means that, like people, ants take the past into account when weighing options while making a choice. The difference is that ants somehow manage to do this as a colony without any dissent. While this research builds on groundwork previously laid down by Sasaki and Pratt, the newest experiments have already raised more questions.

“You have hundreds of these ants, and somehow they have to reach a consensus,” Pratt said. “How do they do it without anyone in charge to tell them what to do?”

Pratt likened individual ants to individual neurons in the human brain. Both play a key role in the decision-making process, but no one understands how every neuron influences a decision.

Sasaki and Pratt hope to delve deeper into the realm of ant behavior so that one day, they can understand how individual ants influence the colony. Their greater goal is to apply what they discover to help society better understand how humanity can make collective decisions with the same ease ants display.

“This helps us learn how collective decision-making works and how it’s different from individual decision-making,” said Pratt. “And ants aren’t the only animals that make collective decisions – humans do, too. So maybe we can gain some general insight.”

(Source: asunews.asu.edu)

Filed under ants learning decision making collective decision making neuroscience psychology science

175 notes

Personal reflection triggers increased brain activity during depressive episodes

Research by the University of Liverpool has found that people experiencing depressive episodes display increased brain activity when they think about themselves.

Using functional magnetic resonance imaging (fMRI) brain imaging technologies, scientists found that people experiencing a depressive episode process information about themselves in the brain differently to people who are not depressed.

British Queen

Researchers scanned the brains of people in major depressive episodes and of those who weren’t, whilst they chose positive, negative and neutral adjectives to describe either themselves or the British Queen – a figure significantly removed from their daily lives but one with whom all participants were familiar.

Professor Peter Kinderman, Head of the University’s Institute of Psychology, Health and Society, said: “We found that participants who were experiencing depressed mood chose significantly fewer positive words and more negative and neutral words to describe themselves, in comparison to participants who were not depressed.

“That’s not too surprising, but the brain scans also revealed significantly greater blood oxygen levels in the medial superior frontal cortex – the area associated with processing self-related information – when the depressed participants were making judgments about themselves.

“This research leads the way for further studies into the psychological and neural processes that accompany depressed mood. Understanding more about how people evaluate themselves when they are depressed, and how neural processes are involved could lead to improved understanding and care.”

Dr May Sarsam, from the Mersey Care NHS Trust, said: “This study explored the difference between medical and psychological theories of depression. It showed that brain activity only differed when depressed people thought about themselves, not when they thought about the Queen or when they made other types of judgements, which fits very well with the current psychological theory.

Equally important

“Thought and neurochemistry should be considered as equally important in our understanding of mental health difficulties such as depression.”

Depression is associated with extensive negative feelings and thoughts. Nearly one-fifth of adults experience anxiety or depression, with the conditions affecting a higher proportion of women than men.

The research, in collaboration with the Mersey Care NHS Trust and the Universities of Manchester, Edinburgh and Lancaster, is published in PLOS One.

Filed under anxiety depression neuroimaging brain activity frontal cortex psychology neuroscience science

280 notes

Repetition in Music Pulls Us In and Pulls Us Together

In On Repeat: How Music Plays the Mind, Elizabeth Hellmuth Margulis of the University of Arkansas explores the psychology of repetition in music, across time, style and cultures. Hers is the first in-depth study of repetitiveness in music, which she calls “at once entirely ordinary and entirely mysterious” and “so common as to seem almost invisible.”

Repetition in music can be a motif repeated throughout a composition or a favorite song played again and again. It can be the annoying earworm burrowed into the brain that just won’t go away.

Music, she writes, “is a fundamentally human capacity, present in all known cultures, and important to intellectual, emotional and social experience.” And repetition is a key element in music, one that both pulls us into the experience and pulls us together as people.

In her research, Margulis drew on a range of disciplines, including music theory, psycholinguistics, neuroscience and cognitive psychology, to examine how listeners perceive and respond to repetition. She worked with ethnomusicologists to understand the place of music and its repetitive features in cultures around the world.

On Repeat is published by Oxford University Press. The Kindle version is already available, and the hardback edition will ship on Nov. 11, 2013.

A repeated musical motif can build pleasurable expectations in the listener, pulling them into the experience of the piece of music.

“Repetition makes it possible for us to experience a sense of expanded present, characterized not by the explicit knowledge that x will occur at time point y, but rather a déjà-vu-like sense of orientation and involvement,” Margulis writes.

Through repeated playing, a work of music develops an important social and biological role in creating cohesion between individuals and groups. Margulis points to children in nursery school singing a cleanup song each day or adults singing Auld Lang Syne at midnight on New Year’s Eve.

“Repeatability is how songs come to be the property of a group or a community instead of an individual,” she writes, “how they come to belong to a tradition, rather than to a moment.”

On Repeat offers new insights into the relationship between music and language, the nature of musical pleasure and the cognitive science of repetition in music. While the book will be useful to scholars and students, it is written for specialist and non-specialist alike.

Filed under music repetition earworm psychology neuroscience science

247 notes

Torture Permanently Damages Normal Perception of Pain

TAU researchers study the long-term effects of torture on the human pain system

Israeli soldiers captured during the 1973 Yom Kippur War were subjected to brutal torture in Egypt and Syria. Held alone in tiny, filthy spaces for weeks or months, sometimes handcuffed and blindfolded, they suffered severe beatings, burns, electric shocks, starvation, and worse. And rather than receiving treatment, they had additional torture inflicted on their existing wounds.

Forty years later, research by Prof. Ruth Defrin of the Department of Physical Therapy in the Sackler Faculty of Medicine at Tel Aviv University shows that the ex-prisoners of war (POWs) continue to suffer from dysfunctional pain perception and regulation, likely as a result of their torture. The study — conducted in collaboration with Prof. Zahava Solomon and Prof. Karni Ginzburg of TAU’s Bob Shapell School of Social Work and Prof. Mario Mikulincer of the School of Psychology at the Interdisciplinary Center, Herzliya — was published in the European Journal of Pain.

"The human body’s pain system can either inhibit or excite pain. It’s two sides of the same coin," says Prof. Defrin. "Usually, when it does more of one, it does less of the other. But in Israeli ex-POWs, torture appears to have caused dysfunction in both directions. Our findings emphasize that tissue damage can have long-term systemic effects and needs to be treated immediately."

A painful legacy

The study focused on 104 combat veterans of the Yom Kippur War. Sixty of the men were taken prisoner during the war; the other 44 were not. In the study, all were put through a battery of psychophysical pain tests — applying a heating device to one arm, submerging the other arm in a hot water bath, and pressing a nylon fiber into a middle finger. They also filled out psychological questionnaires.

The ex-POWs exhibited diminished pain inhibition (the degree to which the body eases one pain in response to another) and heightened pain excitation (the degree to which repeated exposure to the same sensation heightens the resulting pain). Based on these novel findings, the researchers conclude that the torture survivors’ bodies now regulate pain in a dysfunctional way.

It is not entirely clear whether the dysfunction is the result of years of chronic pain or of the original torture itself. But the ex-POWs exhibited worse pain regulation than the non-POW chronic pain sufferers in the study. And a statistical analysis of the test data also suggested that being tortured had a direct effect on their ability to regulate pain.

Head games

The researchers say non-physical torture may have also contributed to the ex-POWs’ chronic pain. Among other forms of oppression and humiliation, the ex-POWs were not allowed to use the toilet, cursed at and threatened, told demoralizing misinformation about their loved ones, and exposed to mock executions. In the later stages of captivity, most of the POWs were transferred to a group cell, where social isolation was replaced by intense friction, crowding, and loss of privacy.

"We think psychological torture also affects the physiological pain system," says Prof. Defrin. "We still have to fully analyze the data, but preliminary analysis suggests there is a connection."

(Source: aftau.org)

Filed under torture chronic pain pain psychology neuroscience science
