Neuroscience

Articles and news from the latest research reports.

Posts tagged psychology

30 notes


New research from the Hebrew University of Jerusalem shows that a carefully scheduled high-fat diet can lead to a reduction in body weight and a unique metabolism in which ingested fats are not stored, but rather used for energy at times when no food is available.

The results were published in FASEB Journal under the title ‘Timed high-fat diet resets circadian metabolism and prevents obesity.’

Previous research has established that disrupting mammals’ daily rhythms, or feeding them a high-fat diet, disrupts metabolism and leads to obesity. The researchers wanted to determine the effect of combining a high-fat diet with long-term feeding on a fixed schedule. They hypothesized that careful scheduling of meals would regulate the biological clock and reduce the effects of a high-fat diet that, under normal circumstances, would lead to obesity.

Filed under circadian rhythms obesity weight loss nutrition neuroscience psychology brain science

76 notes


Stem Cells Turn Hearing Back On

Scientists have enabled deaf gerbils to hear again—with the help of transplanted cells that develop into nerves that can transmit auditory information from the ears to the brain. The advance, reported in Nature, could be the basis for a therapy to treat various kinds of hearing loss.

In humans, deafness is most often caused by damage to inner ear hair cells—so named because they sport hairlike cilia that bend when they encounter vibrations from sound waves—or by damage to the neurons that transmit that information to the brain. When the hair cells are damaged, those associated spiral ganglion neurons often begin to degenerate from lack of use. Implants can work in place of the hair cells, but if the sensory neurons are damaged, hearing is still limited.

"Obviously the ultimate aim is to replace both cell types," says Marcelo Rivolta of the University of Sheffield in the United Kingdom, who led the new work. "But we already have cochlear implants to replace hair cells, so we decided the first priority was to start by targeting the neurons."

In the past, scientists have tried to isolate so-called auditory stem cells from embryoid bodies—aggregates of stem cells that have begun to differentiate into different types. But such stem cells can only divide about 25 times, making it impossible to produce them in the quantity needed for a neuron transplant.

Rivolta and his colleagues knew that during embryonic development, a handful of proteins, including fibroblast growth factor (FGF) 3 and 10, are required for ears to form. So they exposed human embryonic stem cells to FGF3 and FGF10. Multiple types of cells formed, including precursor inner-ear hair cells, but they were also able to identify and isolate the cells beginning to differentiate into the desired spiral ganglion neurons. Then, they implanted the neuron precursor cells into the ears of gerbils with damaged ear neurons and followed the animals for 10 weeks. The function of the neurons was restored.

"We’ve only followed the animals for a very limited time," Rivolta says. "We want to follow them long-term now"—both to assess the possibility of increased cancer risk and to observe the long-term function of the new neurons, he adds.

"It’s very exciting," says neuroscientist Mark Maconochie of Sussex University in the United Kingdom, who was not involved in the new work. "In the past, there has been work where someone makes a single hair cell or something that looks like one neuron [from stem cells], and even that gets the field excited. This is a real step change."

The question now, he says, is whether the procedure can be fine-tuned to allow more efficient production of the relay neurons—currently, fewer than 20% of the stem cells treated develop into those ear neurons. By combining growth factors other than FGF3 and FGF10 with the stem cell mix, researchers could harvest even more ear progenitor cells, he hypothesizes.

"The next big challenge will be to do something as effective as this for the hair cells," Maconochie adds.
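The sub-20% conversion rate Maconochie mentions has a practical consequence for scaling up: the lower the rate, the more starting cells any therapy would need. A back-of-the-envelope sketch of that trade-off (all numbers here are illustrative, not from the paper):

```python
import math

def cells_needed(target_neurons, conversion_rate=0.20):
    """Minimum number of treated stem cells required to yield the
    target number of neurons at a given conversion rate."""
    return math.ceil(target_neurons / conversion_rate)

# At a 20% conversion rate, 10,000 neurons would require 50,000 treated
# cells; halving the rate to 10% doubles the requirement.
print(cells_needed(10_000))        # 50000
print(cells_needed(10_000, 0.10))  # 100000
```

This is why improving differentiation efficiency, for instance with additional growth factors as Maconochie suggests, matters as much as demonstrating that the transplant works at all.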

Filed under hearing hearing loss auditory cortex deafness implants stem cells neuron neuroscience brain psychology science

24 notes

A team of Australian researchers, led by the University of Melbourne, has developed a genetic test that can predict the risk of developing Autism Spectrum Disorder (ASD).

Lead researcher Professor Stan Skafidas, Director of the Centre for Neural Engineering at the University of Melbourne, said the test could be used to assess the risk of developing the disorder.
 
“This test could assist in the early detection of the condition in babies and children and help in the early management of those who become diagnosed,” he said.
 
“It would be particularly relevant for families who have a history of Autism or related conditions such as Asperger’s Syndrome,” he said. 
 

Autism affects around one in 150 births and is characterized by abnormal social interaction, impaired communication and repetitive behaviours.

The test correctly predicted ASD with more than 70 per cent accuracy in people of central European descent. Validation is ongoing, including the development of accurate testing for other ethnic groups.
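"More than 70 per cent accuracy" means the fraction of all test cases, ASD and non-ASD alike, that the test classifies correctly. A small sketch with an entirely hypothetical confusion matrix (the paper's actual counts are not reproduced here):

```python
# Hypothetical test-set counts: true/false positives and negatives.
tp, tn, fp, fn = 36, 36, 14, 14

accuracy    = (tp + tn) / (tp + tn + fp + fn)  # all correct / all cases
sensitivity = tp / (tp + fn)   # fraction of ASD cases detected
specificity = tn / (tn + fp)   # fraction of non-ASD correctly cleared

print(accuracy, sensitivity, specificity)  # 0.72 0.72 0.72
```

Note that overall accuracy alone can hide an imbalance between sensitivity and specificity, which is one reason screening tests are usually reported with all three figures.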

Filed under ASD autism brain neuroscience psychology genetic test science

73 notes

Scientists discover how the brain ages

Researchers at Newcastle University have revealed the mechanism by which neurons, the nerve cells in the brain and other parts of the body, age.

The research, published in Aging Cell, opens up new avenues of understanding for conditions where the ageing of neurons is known to be responsible, such as dementia and Parkinson’s disease.

The ageing process has its roots deep within the cells and molecules that make up our bodies. Experts have previously identified the molecular pathway that reacts to cell damage and stems the cell’s ability to divide, a process known as cell senescence.

However, in cells that do not have this ability to divide, such as neurons in the brain and elsewhere, little was understood of the ageing process. Now a team of scientists at Newcastle University, led by Professor Thomas von Zglinicki, has shown that these cells follow the same pathway.

This challenges previous assumptions on cell senescence and opens new areas to explore in terms of treatments for conditions such as dementia, motor neuron disease or age-related hearing loss.

Newcastle University’s Professor Thomas von Zglinicki who led the research said: “We want to continue our work looking at the pathways in human brains as this study provides us with a new concept as to how damage can spread from the first affected area to the whole brain.”

Working with the University’s special colony of aged mice, the scientists have discovered that ageing in neurons follows exactly the same rules as in senescing fibroblasts, the cells which divide in the skin to repair wounds.

DNA damage responses essentially re-program senescent fibroblasts to produce and secrete a host of dangerous substances including oxygen free radicals or reactive oxygen species (ROS) and pro-inflammatory signalling molecules. This makes senescent cells the ‘rotten apple in a basket’ that can damage and spoil the intact cells in their neighbourhood.  However, so far it was always thought that ageing in cells that can’t divide - post-mitotic, non-proliferating cells - like neurons would follow a completely different pathway.

Now, this research explains that in fact ageing in neurons follows exactly the same rules as in senescing fibroblasts.

Professor von Zglinicki, professor of Cellular Gerontology at Newcastle University said: “We will now need to find out whether the same mechanisms we detected in mouse brains are also associated with brain ageing and cognitive loss in humans. We might have opened up a short-cut towards understanding brain ageing, should that be the case.”

Dr Diana Jurk, who did most of this work during her PhD in the von Zglinicki group, said: “It was absolutely fascinating to see how ageing processes that we always thought of as completely separate turned out to be identical.  Suddenly so much disparate knowledge came together and made sense.”

(Source: ncl.ac.uk)

Filed under brain neuron neuroscience psychology aging neurodegenerative diseases science

53 notes


The brain is a notoriously difficult organ to treat, but Johns Hopkins researchers report they are one step closer to having a drug-delivery system flexible enough to overcome some key challenges posed by brain cancer and perhaps other maladies affecting that organ.

In a report published online on August 29 in Science Translational Medicine, the Johns Hopkins team says its bioengineers have designed nanoparticles that can safely and predictably infiltrate deep into the brain when tested in rodent and human tissue.

“We are pleased to have found a way to prevent drug-embedded particles from sticking to their surroundings so that they can spread once they are in the brain,” says Justin Hanes, Ph.D., Lewis J. Ort Professor of Ophthalmology, with secondary appointments in chemical and biomolecular engineering, biomedical engineering, oncology, neurological surgery and environmental health sciences, and director of the Johns Hopkins Center for Nanomedicine.

Filed under brain neuroscience drug-delivery system nanoparticles psychology science

47 notes



“Doctor” or “Darling”: The Subtle Differences of Speech

Human speech comes in countless varieties: When people talk to close friends or partners, they talk differently than when they address a physician. These differences in speech are quite subtle and hard to pinpoint. In a recent special issue of the journal Frontiers in Human Neuroscience, Johanna Derix, Dr. Tonio Ball, and their colleagues from the Bernstein Center and the University Medical Center in Freiburg report that they were able to tell from brain signals who a person was talking to. This discovery could contribute to the further development of speech synthesizers for patients with severe paralysis.

In contrast to the experimental research common in human neuroscience, the scientists studied natural, non-experimental behavior. Patients who, for medical reasons, had electrodes implanted underneath their skull allowed their brain activity to be recorded during daily life in the hospital. The Freiburg researchers compared data recorded during natural conversations that the patients had with their physicians and their life partners. They found pronounced differences in the anterior temporal lobe, a brain area well known for its significance in social interaction. Several components of neural signals that are detectable on the brain surface can convey such information.

“This study is only the first step towards elucidating the neural basis of human everyday behavior,” explains the neuroscientist and physician Tonio Ball. “Such investigations will become especially important in developing new neurotechnological treatment options for patients with impaired motor and language functions that work in real life situations.” The restoration of speech production becomes necessary in some forms of neurological diseases and chronic paralysis. A computer could synthesize speech for patients suffering from such conditions by using their brain signals. Information on who the patient is addressing could help the device to select the degree of formality – and to prevent it from calling the doctor “darling.”
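The Freiburg pipeline itself is not published in this piece, but the general decoding idea, telling conversation partners apart from features of neural signals, can be illustrated with a toy classifier. Everything below is synthetic and assumed: fake "band power" features, a simple nearest-centroid rule, and invented class means.

```python
import random

random.seed(0)

def synth_trial(mean):
    # One fake trial: 4 feature values (e.g., band power in 4 channels).
    return [random.gauss(mean, 0.5) for _ in range(4)]

# Synthetic training trials: conversations with the doctor vs. the partner.
train = ([(synth_trial(1.0), "doctor") for _ in range(50)] +
         [(synth_trial(-1.0), "partner") for _ in range(50)])

def centroid(rows):
    # Column-wise mean of a list of feature vectors.
    return [sum(col) / len(rows) for col in zip(*rows)]

centroids = {
    label: centroid([x for x, y in train if y == label])
    for label in ("doctor", "partner")
}

def classify(x):
    # Assign the trial to the class with the nearest centroid.
    def sq_dist(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda label: sq_dist(centroids[label]))

print(classify([1.0, 1.0, 1.0, 1.0]))      # doctor
print(classify([-1.0, -1.0, -1.0, -1.0]))  # partner
```

Real ECoG decoding involves far more careful feature extraction and validation, but the core step, mapping signal features to a discrete label such as "doctor" or "partner", has this shape.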

Filed under brain neuroscience speech brain signals psychology behavior science

73 notes


How non-verbal cues can predict a person’s (and a robot’s) trustworthiness

People face this predicament all the time—can you determine a person’s character in a single interaction? Can you judge whether someone you just met can be trusted when you have only a few minutes together? And if you can, how do you do it? Using a robot named Nexi, Northeastern University psychology professor David DeSteno and collaborators Cynthia Breazeal from MIT’s Media Lab and Robert Frank and David Pizarro from Cornell University have figured out the answer. The findings were recently published in the journal Psychological Science, a journal of the Association for Psychological Science.

It’s What You’re Not Saying…

In the absence of reliable information about a person’s reputation, nonverbal cues can offer a look into a person’s likely actions. This concept has been known for years, but the cues that convey trustworthiness or untrustworthiness have remained a mystery. Collecting data from face-to-face conversations with research participants where money was on the line, DeSteno and his team realized that it’s not one single non-verbal movement or cue that determines a person’s trustworthiness, but rather sets of cues. When participants expressed these cues, they cheated their partners more, and, at a gut level, their partners expected it. “Scientists haven’t been able to unlock the cues to trust because they’ve been going about it the wrong way,” DeSteno said. “There’s no one golden-cue. Context and coordination of movements is what matters.”

Robots Have Feelings, Too

People are fidgety – they’re moving all the time. So how could the team truly zero-in on the cues that mattered? This is where Nexi comes in. Nexi is a humanoid social robot that afforded the team an important benefit – they could control all its movements perfectly. In a second experiment, the team had research participants converse with Nexi for 10 minutes, much like they did with another person in the first experiment. While conversing with the participants, Nexi — operated remotely by researchers — either expressed cues that were considered less than trustworthy or expressed similar, but non-trust-related cues. Confirming their theory, the team found that participants exposed to Nexi’s untrustworthy cues intuited that Nexi was likely to cheat them and adjusted their financial decisions accordingly. “Certain nonverbal gestures trigger emotional reactions we’re not consciously aware of, and these reactions are enormously important for understanding how interpersonal relationships develop,” said Frank. “The fact that a robot can trigger the same reactions confirms the mechanistic nature of many of the forces that influence human interaction.”

Real-Life Application

This discovery has led the research team not only to answer enduring questions about if and how people are able to assess the trustworthiness of an unknown person, but also to show the human mind’s willingness to ascribe trust-related intentions to technological entities based on the same movements. “This is a very exciting result that showcases how social robots can be used to gain important insights about human behavior,” said Cynthia Breazeal of MIT’s Media Lab. “This also has fascinating implications for the design of future robots that interact and work alongside people as partners.” Accordingly, these findings hold important insights not only for security and financial endeavors but also for the evolving design of robots and computer-based agents. The subconscious mind is ready to see these entities as social beings.

Filed under AI Nexi neuroscience non-verbal cues psychology robotics robots science trustworthiness humanoids

16 notes


The world continues to be a noisy place, and Purdue University researchers have found that all that background chatter causes the ears of those with hearing impairments to work differently.

"When immersed in the noise, the neurons of the inner ear must work harder because they are spread too thin," said Kenneth S. Henry, a postdoctoral researcher in Purdue’s Department of Speech, Language and Hearing Sciences. "It’s comparable to turning on a dozen television screens and asking someone to focus on one program. The result can be fuzzy because these neurons get distracted by other information."

The findings, by Henry and Michael G. Heinz, an associate professor of speech, language and hearing sciences, are published as a Brief Communication in Nature Neuroscience. The work was funded by the National Institutes of Health and the National Institute on Deafness and Other Communication Disorders.

Filed under hearing auditory cortex brain neuroscience psychology science

60 notes

Second-hand smoking damages memory

Non-smokers who live with or spend time with smokers are damaging their memory, according to new research from Northumbria University. 

The study, published in the latest online edition of the journal Addiction, is the first to explore the relationship between exposure to other people’s smoke and everyday memory problems.

Dr Tom Heffernan and Dr Terence O’Neil, both researchers at the Collaboration for Drug and Alcohol Research Group at Northumbria University, compared a group of current smokers with two groups of non-smokers – those who were regularly exposed to second-hand smoke and those who were not.

Those exposed to second-hand smoke either lived with smokers or spent time with smokers, for example in a designated “smoking area,” and reported being exposed to second-hand smoke for an average of 25 hours a week for an average of four and a half years.

The three groups were tested on time-based prospective memory (remembering to carry out an activity after a set period of time) and event-based prospective memory (remembering to carry out an intended activity when a particular event occurs).

Researchers found that the non-smokers who had been exposed to second-hand smoke forgot almost 20% more in the memory tests than the non-smokers who had not been exposed. However, both groups outperformed the current smokers, who forgot 30% more than the non-exposed non-smokers.

Dr Heffernan said: “According to recent reports by the World Health Organisation, exposure to second-hand smoke can have serious consequences on the health of people who have never smoked themselves, but who are exposed to other people’s tobacco smoke.

“Our findings suggest that the deficits associated with second-hand smoke exposure extend to everyday cognitive function. We hope our work will stimulate further research in the field in order to gain a better understanding of the links between exposure to second-hand smoke, health problems and everyday cognitive function.”
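The percentages reported above are easiest to read as relative increases in forgetting over the non-exposed baseline. A small sketch under that assumption, with an invented baseline miss count purely for illustration:

```python
# Assumed reading: "forgot 20% / 30% more" are relative increases over
# the non-exposed non-smoker baseline. The baseline count is invented.
baseline_misses = 10.0   # items missed per 100 by non-exposed non-smokers

exposed_misses = baseline_misses * 1.20  # second-hand smoke exposure
smoker_misses  = baseline_misses * 1.30  # current smokers

print(exposed_misses, smoker_misses)  # 12.0 13.0
```

Under this reading the absolute gap between groups depends entirely on the baseline rate, which is why relative percentages on their own say little about how large the everyday impact is.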

(Source: northumbria.ac.uk)

Filed under brain memory second-hand smoking smoking neuroscience psychology science

39 notes


Have you ever wondered why some people find it so much easier to stop smoking than others?

New research shows that vulnerability to smoking addiction is shaped by our genes. A study from the Montreal Neurological Institute and Hospital - The Neuro, McGill University shows that people with genetically fast nicotine metabolism have a significantly greater brain response to smoking cues than those with slow nicotine metabolism. Previous research shows that greater reactivity to smoking cues predicts decreased success at smoking cessation and that environmental cues promote increased nicotine intake in animals and humans. This new finding that nicotine metabolism rates affect the brain’s response to smoking may lead the way for tailoring smoking cessation programs based on individual genetics.

Filed under addiction brain genetics neuroscience psychology smoking smoking cessation science
