Neuroscience

Articles and news from the latest research reports.




Swimming kids are smarter

Children who learn how to swim at a young age are reaching many developmental milestones earlier than the norm.

Researchers from the Griffith Institute for Educational Research surveyed parents of 7000 under-fives from Australia, New Zealand and the US over three years.

A further 180 children aged 3, 4 and 5 years were involved in intensive testing, making it the world’s most comprehensive study of early-years swimming.

Lead researcher Professor Robyn Jorgensen says the study shows young children who participate in early-years swimming achieve a wide range of skills earlier than the normal population.

“Many of these skills are those that help young children in the transition into formal learning contexts such as pre-school or school.

“The research also found significant differences between the swimming cohort and non-swimmers regardless of socio-economic background.

“While the two higher socio-economic groups performed better than the lower two in testing, the four SES groups all performed better than the normal population.

The researchers also found there were no gender differences between the research cohort and the normal population.

As well as achieving physical milestones faster, children also scored significantly better in visual-motor skills such as cutting paper, colouring in and drawing lines and shapes, and many mathematically-related tasks. Their oral expression was also better as well as in the general areas of literacy and numeracy.

“Many of these skills are highly valuable in other learning environments and will be of considerable benefit for young children as they transition into pre-schools and school.”

Filed under children swimming learning cognitive skills psychology neuroscience science


Reconsidering cancer’s bad guy

Researchers at the University of Copenhagen have found that a protein, known for causing cancer cells to spread around the body, is also one of the molecules that trigger repair processes in the brain. These findings are the subject of a paper, published this week in Nature Communications. They point the way to new avenues of research into degenerative brain diseases like Alzheimer’s.

How to repair brain injuries is a fundamental question facing brain researchers. Scientists have been familiar with the protein S100A4 for some time as a factor in metastasis, or how cancer spreads. However, it’s the first time the protein has been shown to play a role in brain protection and repair.

“This protein is not normally in the brain, only when there’s trauma or degeneration. When we deleted the protein in mice we discovered that their brains were less protected and less able to resist injury. We also discovered that S100A4 works by activating signalling pathways inside neurons,” says Postdoc Oksana Dmytriyeva, who worked on the research in a team at the Protein Laboratory in the Department of Neuroscience and Pharmacology at the University of Copenhagen.

The villain turns out to be the hero

This research stands on the shoulders of many years of work on S100A4 in its deadlier role in cancer progression. The discovery represents a significant development for the new Neuro-Oncology Group that moved to the University of Copenhagen’s Protein Laboratory Group from the Danish Cancer Society in October.

“We were surprised to find this protein in this role, as we thought it was purely a cancer protein. We are very excited about it and we’re looking forward to continuing our research in a practical direction. We hope that the findings will eventually benefit people who need treatment for neurodegenerative disorders like Alzheimer’s disease, although obviously we have a long way to go before we get to that point,” says Oksana Dmytriyeva.

(Source: news.ku.dk)

Filed under brain brain injury neurodegenerative diseases protein cancer neuroscience science




Animals Are Moral Creatures, Scientist Argues

Until recently, scientists would have said your cat was snuggling up to you only as a means to get tasty treats. But many animals have a moral compass, and feel emotions such as love, grief, outrage and empathy, a new book argues.

The book, “Can Animals Be Moral?” (Oxford University Press, October 2012), suggests social mammals such as rats, dogs and chimpanzees can choose to be good or bad. And because they have morality, we have moral obligations to them, said author Mark Rowlands, a University of Miami philosopher.

“Animals are owed a certain kind of respect that they wouldn’t be owed if they couldn’t act morally,” Rowlands told LiveScience. But while some animals have complex emotions, they don’t necessarily have true morality, other researchers argue.

Moral behavior?

Some research suggests animals have a sense of outrage when social codes are violated. Chimpanzees may punish other chimps for violating certain rules of the social order, said Marc Bekoff, an evolutionary biologist at the University of Colorado, Boulder, and co-author of “Wild Justice: The Moral Lives of Animals” (University of Chicago Press, 2012).

Male bluebirds that catch their female partners stepping out may beat the female, said Hal Herzog, a psychologist at Western Carolina University who studies how humans think about animals.

And there are many examples of animals demonstrating ostensibly compassionate or empathetic behaviors toward other animals, including humans. In one experiment, hungry rhesus monkeys refused to electrically shock their fellow monkeys, even when it meant getting food for themselves. In another study, a female gorilla named Binti Jua rescued an unconscious 3-year-old (human) boy who had fallen into her enclosure at the Brookfield Zoo in Illinois, protecting the child from other gorillas and even calling for human help. And when a car hit and injured a dog on a busy Chilean freeway several years ago, its canine compatriot dodged traffic, risking its life to drag the unconscious dog to safety.

All those examples suggest that animals have some sense of right and wrong, Rowlands said. “I think what’s at the heart of following morality is the emotions,” Rowlands said. “Evidence suggests that animals can act on those sorts of emotions.”

Instinct, not morals?

Not everyone agrees these behaviors equal morality, however. One of the most obvious examples — the guilty look of a dog that has just eaten a forbidden food — may not be true remorse, but simply the dog responding appropriately to its owner’s disappointment, according to a study published in the journal Behavioural Processes in 2009.

And animals don’t seem to develop or follow rules that serve no purpose for them or their species, suggesting they don’t reason about morality. Humans, in contrast, have a grab bag of moral taboos, such as prohibitions on eating certain foods, committing blasphemy, or marrying distant cousins.

“What I think is interesting about human morality is that often times there’s this wacky, arbitrary feature of it,” Herzog said. Instead, animal emotions may be rooted in instinct and hard-wiring, rather than conscious choice, Herzog said. “They look to us like moral behaviors, but they’re not rooted in the same mire of intellect and culture and language that human morality is,” he said.

Hard-wired morality

But Rowlands argues that such hair-splitting is overthinking things. In the case of the child-rescuing gorilla Binti Jua, for instance, “what sort of instinct is involved there? Do gorillas have an instinct to help unconscious boys in enclosures?” he said.

And even if instinct is involved, human parents have an instinctive desire to help their children, but that makes the desire no less moral, he said. Being able to reason about morality isn’t required to have a moral compass, he added. A 3-year-old child, for instance, may not consciously articulate a system of right and wrong, but will (hopefully) still feel guilty for stealing his playmate’s toy. (Scientists continue to debate whether or not babies have moral compasses.)

If one accepts that animals have moral compasses, we have the responsibility to treat them with respect, Rowlands argues. “If the animal is capable of acting morally, I don’t think it’s problematic to be friends with your pets,” he said. “If you have a cat or a dog and you make it do tricks, I am not sure that’s respect. If you insist on dressing them up, I’m not sure I’m onboard with that either.”

(Source: scinerds)




Socrates’ Method Of Memory Works Just As Well Using Virtual Reality

In the episode of NOVA that aired October 24 of this year, host David Pogue posed the question, “How Smart Can We Get?” At one point in the episode, he met with Chester Santos, who was the 2008 US Memory Champion, to pick his brain on how he manages to learn long strings of numbers and words. Santos taught him a technique that involved visualization of objects that were in Pogue’s own house and associating them with the string of non-related words. It turns out this technique is nothing new. Its roots stem all the way back to the time of Socrates, in fact.

A new research study conducted by a team from the University of Alberta has revisited this age-old technique, giving it a modern-day twist.

The memory technique, called loci, or location, by the ancient Greeks, was used by Socrates, according to classical scholars, to memorize his oratories. To do this, Socrates would wander around his home and assign each word or fact that he needed to memorize to some familiar object or structure in his home.

When Socrates needed to recall this information in front of an audience, he would simply picture his home, and the words he had linked to things like his window or table would come back instantly.

“Nowadays many contestants in memory competitions use this same technique,” said lead researcher Eric Legge. “They use the location method to instantly recall everything from words to a long list of random numbers.”

Legge, along with his U of A research colleague Christopher Madan, developed a virtual living-space environment. This virtual living room would allow their test subjects to use the ancient Greek technique to increase their memory ability.
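The technique itself is simple enough to sketch in a few lines. The toy Python below pairs items to memorise with locations on a familiar mental route and recalls them by "walking" the route again; the route, items and function names are illustrative only, not taken from the study or its virtual environment.

```python
# A minimal sketch of the method of loci: pair each item to memorise
# with a familiar location, then recall by revisiting the route in order.
# Locations and items here are invented for illustration.

loci = ["front door", "hallway mirror", "living-room sofa",
        "kitchen table", "window", "bookshelf"]

def place_items(items, loci):
    """Associate each item with the next location along the mental route."""
    if len(items) > len(loci):
        raise ValueError("need at least as many loci as items")
    return dict(zip(loci, items))  # dicts preserve insertion (route) order

def recall(route):
    """Recall items in order by revisiting the loci along the route."""
    return [route[location] for location in route]

route = place_items(["duck", "horse", "shark", "bull"], loci)
print(recall(route))  # items come back in the order the loci were visited
```

The point of the technique is that spatial routes are easy to traverse mentally, so the otherwise unrelated items inherit the route's reliable ordering.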

Filed under memory memory technique method of loci virtual reality neuroscience psychology science




How we “hear” with our eyes

In everyday life we rarely consciously try to lip-read. However, in a noisy environment it is often very helpful to be able to see the mouth of the person you are speaking to. Researcher Helen Blank at the Max Planck Institute (MPI) in Leipzig explains why this is so: “When our brain is able to combine information from different sensory sources, for example during lip-reading, speech comprehension is improved.” In a recent study, the researchers of the Max Planck Research Group “Neural Mechanisms of Human Communication” investigated this phenomenon in more detail to uncover how visual and auditory brain areas work together during lip-reading.

In the experiment, brain activity was measured using functional magnetic resonance imaging (fMRI) while participants heard short sentences. The participants then watched a short silent video of a person speaking. Using a button press, participants indicated whether the sentence they had heard matched the mouth movements in the video. If the sentence did not match the video, a part of the brain network that combines visual and auditory information showed greater activity and there were increased connections between the auditory speech region and the superior temporal sulcus (STS).

“It is possible that advanced auditory information generates an expectation about the lip movements that will be seen”, says Blank. “Any contradiction between the prediction of what will be seen and what is actually observed generates an error signal in the STS.”
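Blank's proposal can be caricatured as a simple prediction-and-mismatch computation. The Python sketch below uses a made-up miniature phoneme-to-viseme (lip-shape) mapping, not one from the study, and treats the "error signal" as the fraction of time steps where the lip shapes predicted from the heard speech disagree with those actually seen.

```python
# Toy illustration of the proposed mechanism: heard speech predicts the
# upcoming lip movements, and the "error signal" grows with the mismatch
# between predicted and observed movements. The mapping is invented.

VISEME = {"b": "lips closed", "p": "lips closed", "m": "lips closed",
          "f": "lip-teeth", "v": "lip-teeth", "o": "rounded", "a": "open"}

def predicted_visemes(heard_phonemes):
    """Predict a lip shape for each heard phoneme."""
    return [VISEME.get(p, "neutral") for p in heard_phonemes]

def error_signal(heard_phonemes, seen_visemes):
    """Fraction of time steps where prediction and observation disagree."""
    pred = predicted_visemes(heard_phonemes)
    mismatches = sum(p != s for p, s in zip(pred, seen_visemes))
    return mismatches / len(pred)

heard = ["b", "a", "m", "o"]
matching_video = ["lips closed", "open", "lips closed", "rounded"]
mismatching_video = ["lip-teeth", "open", "rounded", "rounded"]
print(error_signal(heard, matching_video))     # prediction confirmed
print(error_signal(heard, mismatching_video))  # mismatch drives the signal up
```

In the study's terms, a matching sentence and video would yield a low error signal in the STS, while a mismatching pair would yield a high one.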

How strong the activation is depends on the lip-reading skill of participants: the stronger the activation, the more correct the responses. “People that were the best lip-readers showed an especially strong error signal in the STS”, Blank explains. This effect seems to be specific to the content of speech: it did not occur when the subjects had to decide if the identity of the voice and face matched.

The results of this study are very important to basic research in this area. A better understanding of how the brain combines auditory and visual information during speech processing could also be applied in clinical settings. “People with hearing impairment are often strongly dependent on lip-reading”, says Blank. The researchers suggest that further studies could examine what happens in the brain after lip-reading training or during a combined use of sign language and lip-reading.

Filed under brain superior temporal sulcus lip reading brain areas brain activity neuroscience psychology science


Neuroscientists develop word concept mind-reading tool

A team of cognitive neuroscientists has identified the areas of the brain responsible for processing specific word meanings, bringing us one step closer to developing multilingual mind-reading machines.

Presenting the findings at the Society for the Neurobiology of Language Conference in San Sebastián, Spain, Joao Correia of Maastricht University explained that his team decided to answer one central question: “how do we represent the meaning of words independent of the language we are listening to?”

Past studies have focused on identifying areas of the brain that generate and hear general terms or feelings. However, if we can locate where the actual concept of a word — which transcends language — is processed, we would be able to read the mind of any individual. The recent case of 39-year-old Scott Routley letting doctors know he is not in pain, just by thinking, is a prime example of where this could be extremely effective in the future. After not responding to any stimulation for more than a decade, Routley was thought to be in a persistent vegetative state. However, by studying fMRI scans in real time neurologists could identify that Routley was in fact responding to their questions — they asked him to think about playing tennis or walking around at home to indicate yes or no. These two actions are processed in different areas of the brain, so answers could be extracted by reading scans. With Correia’s approach, we would need no signifier for yes or no — we could go straight to the source where the processing of the meaning of positive and negative takes place; the “hub”, as he puts it.

"This fMRI study investigates the neural network of speech processing responsible for transforming sound to meaning, by exploring the semantic similarities between bilingual wordpairs," explains an abstract of the study. To achieve this, they needed bilingual volunteers, so worked with eight Dutch candidates all fluent in English. First off, the team monitored the volunteers’ neural activity while saying the words "bull", "horse", "shark" and "duck" in English. All the words chosen had one syllable, were from a similar group and were probably learnt round the same period — this ensured that any differences would specifically relate to meaning. Different brain activity patterns appeared in the left anterior temporal cortex, and each of these were then fed into an algorithm so it would be able to flag up when one of the words was uttered again.

The hypothesis was that, if the algorithm could still correctly identify the words when they were spoken in Dutch, these patterns would hold the key to where word concepts are represented. The algorithm did exactly that, demonstrating that words are encoded in the same way in the brain, regardless of language.

There is one pretty major drawback to the process, which quashes any visions of a full-on real-time mind translation machine hitting stores anytime soon — the neural activity patterns differed slightly from person to person. Our neurons learn and identify in unique ways, and understanding these pathway patterns through machine learning would be a long process. “You would have to scan a person as they thought their way through a dictionary,” said Matt Davis of the MRC Cognition and Brain Sciences Unit in Cambridge. It would be difficult to translate a mind now without this concept map. However, we are only at the beginning of this line of study, and an algorithm could potentially be devised to aggregate hundreds of neural activity patterns to help indicate what the brain activity of an individual unable to communicate represents.
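The train-on-English, test-on-Dutch logic of the study can be sketched with toy data. In the minimal Python below, each word has a "concept" pattern shared across languages; a nearest-centroid classifier (a stand-in for illustration, not the team's actual algorithm) is trained on noisy English trials and tested on noisy Dutch ones. All patterns, dimensions and noise levels are invented.

```python
# Cross-language decoding sketch: if word concepts are encoded independently
# of language, a classifier trained on English trials should still label the
# patterns evoked by the Dutch translations. Toy data, not real fMRI.
import random

WORDS = ["bull", "horse", "shark", "duck"]
noise_rng = random.Random(42)

def prototype(word):
    """Word-specific 'concept' pattern, shared across languages in this toy."""
    rng = random.Random(word)          # deterministic per word
    return [rng.random() for _ in range(20)]

def trial(word):
    """One simulated trial: the concept pattern plus measurement noise."""
    return [x + noise_rng.gauss(0, 0.1) for x in prototype(word)]

# "Training" on English trials: average five trials into a centroid per word.
centroids = {w: [sum(col) / 5 for col in zip(*[trial(w) for _ in range(5)])]
             for w in WORDS}

def classify(pattern):
    """Nearest-centroid decoding: the word whose training centroid is closest."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda w: dist(centroids[w], pattern))

# "Testing" on Dutch trials: the same concepts evoke fresh noisy patterns
# (e.g. hearing "stier" for "bull"); high accuracy mirrors the finding.
accuracy = sum(classify(trial(w)) == w for w in WORDS) / len(WORDS)
print(accuracy)
```

The per-person drawback described above corresponds, in this sketch, to each individual having their own set of prototypes: centroids learned from one person's brain would not transfer to another's without retraining.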

Filed under brain language semantics word meaning bilinguals neuroscience psychology science
