Neuroscience

Articles and news from the latest research reports.

56 notes

Study gives clues to causes of Motor Neurone Disease

Scientists at the University of Bath are one step closer to understanding the role of one of the proteins that causes the neurodegenerative disorder Amyotrophic Lateral Sclerosis (ALS), also known as Motor Neurone Disease (MND).

The scientists studied angiogenin, a protein present in the spinal cord and brain that protects neurones from cell death. Mutations in this protein have been found in sufferers of MND and are thought to play a key role in the progression of the condition.

MND triggers progressive weakness, muscle atrophy and muscle twitches and spasms. The disease affects around 5000 people in the UK.

The team of cell biologists and structural biologists have, for the first time, produced images of the 3D structures of 11 mutant versions of angiogenin to see how the mutations changed the structure of the active part of the molecule, damaging its function.

The study, published in the prestigious journal Nature Communications, provides insights into the causes of this disease and related conditions such as Parkinson’s Disease.

The team also looked at the effects of the malfunctioning proteins on neurones grown from embryonic stem cells in the laboratory.

They found that some of the mutations stopped the protein being transported to the cell nucleus, a process that is critical for the protein to function correctly.

The mutations also prevented the cells from producing stress granules, the neurone’s natural defence against stress caused by low oxygen levels.

Dr Vasanta Subramanian, Reader in Biology & Biochemistry at the University, said:

“This study is exciting because it’s the first time we’ve directly linked the structure of these faulty proteins with their effects in the cell.

“We’ve worked alongside Professor Ravi Acharya’s group to combine structural knowledge with cell biology to gain new insights into the causes of this devastating disease.

“We hope that the scientific community can use this new knowledge to help design new drugs that will bind selectively to the defective protein to protect the body from its damaging effects.”

The findings were welcomed by medical research charity, the Motor Neurone Disease (MND) Association, the only national charity in England, Wales and Northern Ireland dedicated to supporting people living with MND while funding and promoting cutting-edge global research to bring about a world free of the disease.

Dr Brian Dickie, Director of Research Development at the charity, said: “The researchers at the University of Bath have skilfully combined aspects of biology, chemistry and physics to answer some fundamental questions on how angiogenin can damage motor neurones. It not only advances our understanding of the disease, but may also give rise to new ideas on treatment development.”

(Source: bath.ac.uk)

Filed under brain neuron MND ALS neurodegenerative diseases neuroscience psychology science

89 notes


New Study Reveals How Humans Became Right-Handed

According to a new study led by Dr Gillian Forrester of the University of Sussex, a predisposition to be right-handed is not a uniquely human trait but one shared with the great apes.

The study, published in the journal Behavioural Brain Research, analyzed hand actions directed towards either objects or individuals in chimpanzees, gorillas and children, and found that all three species are right-handed for actions to objects, but not for actions directed to individuals.

The results support a theory that human right-handedness is a trait developed through tool use that was inherited from an ancestor common to both humans and great apes. The findings challenge a widely held view that right-handed dominance in humans was a species-unique trait linked to the emergence of language.

“Humans have been tool users for 2.5 million years, while the current view is that language only emerged one hundred thousand years ago,” Dr Forrester said. “Our findings provide the first non-invasive results from naturalistic behavior, suggesting that language emerged as a consequence of left hemisphere brain regions that were already evolved to process regular sequences of actions. The structure found in language may have developed from pre-existing brain processes adapted from experience with tool-use.”

Filed under brain handedness using tools language neuroscience psychology primates science

45 notes

Let there be sight: Burst of neural activity necessary for vision

A sudden and mysterious burst of activity originating in the retina of a developing fetus spurs brain connections that are essential to development of finely-tuned sight, Yale researchers report in the journal Nature. Interference with this spontaneous wave of activity could play a role in neurodevelopmental disorders such as autism, the scientists speculate.

The study in mice is the first to demonstrate in a living animal that this wave of activity spreads throughout large regions of the brain and is crucial to wiring of the visual system. Without the wiring, infants would not be able to distinguish details in their environment.

“If you interfere with this activity, the circuits are all messed up, the wiring details are all wrong,” said Michael Crair, the William Ziegler III Professor of Neurobiology and Professor of Ophthalmology and Visual Science and senior author of the study.

For instance, this activity might allow a newborn human baby to perceive such details as the five fingers attached to her hand or her mother’s face. This wave wires up the visual system so that infants are poised to learn from their environment soon after birth.

The development of animals from a fertilized egg into trillions of intricately connected and specialized cells is the result of a precisely timed expression of genes. However, the Nature paper introduces another necessary factor — a mysterious wave of activity arising in the retina itself that propagates through several regions of the brain. Crair terms this wave an emergent property, or a trait possessed by a complex system that cannot be directly traced to its individual parts. This experiment in living, neonatal mice shows that this wave is crucial to the proper wiring not only of the visual system but other brain areas as well.

Crair said his lab plans to explore whether interruptions of this activity might play a role in neurodevelopmental disorders such as autism or schizophrenia.

(Source: news.yale.edu)

Filed under brain vision neuron neural activity retina developmental disorders neuroscience psychology science

159 notes


Study links eating chocolate to winning Nobels

Take this with a grain of salt, or perhaps some almonds or hazelnuts: A study ties chocolate consumption to the number of Nobel Prize winners a country has and suggests it’s a sign that the sweet treat can boost brain power.

No, this does not appear in the satirical Onion newspaper. It’s in the prestigious New England Journal of Medicine, which published it online Wednesday as a “note” rather than a rigorous, peer-reviewed study.

The author — Dr. Franz Messerli, of St. Luke’s-Roosevelt Hospital and Columbia University in New York — writes that there is evidence that flavanols in green tea, red wine and chocolate can help “in slowing down or even reversing” age-related mental decline — a contention some medical experts may dispute.

Nevertheless, he examined whether a country’s per-capita chocolate consumption was related to the number of Nobels it had won — a possible sign of a nation’s “cognitive function.” Using data from some major chocolate producers on sales in 23 countries, he found “a surprisingly powerful correlation.”
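The "surprisingly powerful correlation" here is an ordinary Pearson correlation between two per-country numbers. A minimal sketch of that calculation, using invented illustrative figures (these are not Messerli's actual chocolate or Nobel data):

```python
# Toy per-country Pearson correlation, in the spirit of Messerli's note.
# The figures below are invented for demonstration only -- they are NOT
# the real chocolate-consumption or Nobel-laureate numbers.
import math

# country -> (chocolate kg per capita per year, laureates per 10 million people)
countries = {
    "A": (1.8, 3.1),
    "B": (4.5, 11.0),
    "C": (6.3, 25.5),
    "D": (8.8, 31.9),
    "E": (11.9, 31.5),
}

def pearson_r(pairs):
    """Pearson correlation coefficient of a list of (x, y) pairs."""
    xs, ys = zip(*pairs)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(list(countries.values()))
print(f"r = {r:.2f}")  # strongly positive for these invented figures
```

As the article's tongue-in-cheek framing implies, a strong correlation between country-level aggregates says nothing about causation: richer countries tend to have both more chocolate consumption and more research funding.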

Filed under chocolate chocolate consumption flavanols Nobel Prize brain neuroscience psychology science

88 notes

Is the afterlife full of fluffy clouds and angels?

What does the neuroscientist Colin Blakemore make of an American neurosurgeon’s account of the afterlife?

Have you ever noticed that more people come back from Heaven than from Hell? We have all read those astonishing reports of near-death experiences (NDEs, as the aficionados call them) – the things that people say have happened to them when they almost, but don’t quite, shuffle off the coil.

They are nearly always pleasant and deeply reassuring in a saccharin-soaked way. Lots of spinning down warm, dark tunnels to the sound of celestial music; lots of trips along country lanes lined with hedges, towards the light of a welcoming cottage at the end of the road; lots of tumbling down Alice-in-Wonderland rabbit holes, but without the damaging effects of gravity.

True, Dr Maurice S Rawlings Jr, MD, heart surgeon in Chattanooga, Tennessee, and author of To Hell and Back, did have patients who reported very nasty NDEs after they came back on his operating table. Booming noises; licking flames and all that Mephistophelian stuff. But perhaps that tells us more about the challenges of living in Chattanooga, Tennessee, than about the metaphysics of life after death.

Predictably, the amazingly consistent, remarkably heaven-like experiences recounted by the majority of NDE-ers (yes, that really is what the experts call them) have been summarily dismissed by materialist sceptics – like me. Of course the brain does funny things when it’s running out of oxygen. The odd perceptions are just the consequences of confused activity in the temporal lobes.

But NDEs have taken on a new cloak of respectability with a book by a Harvard doctor. Proof of Heaven, by Eben Alexander, will make your toes wiggle or curl, depending on your prejudices. What’s special about his account of being dead is that he’s a neurosurgeon. At least that’s what the publicity is telling us. It’s a cover story in Newsweek magazine, with a screaming headline: “Heaven is Real: a doctor’s account of the afterlife”.

Read more …

Filed under near-death experiences metaphysics life death neuroscience brain perception afterlife science

43 notes


Researchers Find Regenerated Lizard Tails Are Different From Originals

Just because a lizard can grow back its tail, doesn’t mean it will be exactly the same. A multidisciplinary team of scientists from the University of Arizona and Arizona State University examined the anatomical and microscopic make-up of regenerated lizard tails and discovered that the new tails are quite different from the original ones. The findings are published in a pair of articles featured in a special October edition of the journal, The Anatomical Record.

“The regenerated lizard tail is not a perfect replica,” said Rebecca Fisher, an associate professor at the UA College of Medicine-Phoenix. “There are key anatomical differences including the presence of a cartilaginous rod and elongated muscle fibers spanning the length of the regenerated tail.”

Researchers studied the regenerated tails of the green anole lizard (Anolis carolinensis), which can lose its tail when caught by a predator and then grow it back. The new tail had a single, long tube of cartilage rather than vertebrae, as in the original. Also, long muscles span the length of the regenerated tail compared to shorter muscle fibers found in the original.

"These differences suggest that the regenerated tail is less flexible, as neither the cartilage tube nor the long muscle fibers would be capable of the fine movements of the original tail, with its interlocking vertebrae and short muscle fibers," said Fisher, who also is an associate professor in the School of Life Sciences at ASU. "The regrown tail is not simply a copy of the original, but instead is a replacement that restores some function."

Filed under lizards anatomy regeneration regenerated tail genetics neuroscience science

52 notes

Rare genetic disorder points to molecules that may play role in schizophrenia

Scientists studying a rare genetic disorder have identified a molecular pathway that may play a role in schizophrenia, according to new research in the Oct. 10 issue of The Journal of Neuroscience. The findings may one day guide researchers to new treatment options for people with schizophrenia — a devastating disease that affects approximately 1 percent of the world’s population.

Schizophrenia is characterized by a multitude of symptoms, including hallucinations, social withdrawal, and learning and memory deficits, which usually appear during late adolescence or early adulthood. Efforts to identify disease causes have been complicated by the fact that no single genetic mutation is strongly associated with the disease. By studying a rare genetic disorder that increases the risk of schizophrenia, Laurie Earls, PhD, and colleagues in the laboratory of Stanislav Zakharenko, MD, PhD, at St. Jude Children’s Research Hospital identified molecular changes that affect memory and are also present in people with schizophrenia.

Approximately 30 percent of people with a genetic disorder known as 22q11 deletion syndrome develop schizophrenia, making it one of the strongest risk factors for the disease. In previous studies of mice with the 22q11 deletion, Zakharenko’s group identified changes in nerve cells leading to deficits in the hippocampus — the brain’s learning and memory center — that appear with age. In the current study, the group confirmed similar molecular changes occur in people with schizophrenia. They also zeroed in on the gene contributing to the nerve cell changes.

"This study makes some very important discoveries about the precise mechanisms underlying the learning and memory deficits seen in the genetic mouse model — problems that are a central part of the human disease," said Carrie Bearden, PhD, an expert on 22q11 deletion syndrome at the University of California, Los Angeles, who was not involved in the study. "Pinpointing the specific gene involved is the first step toward developing targeted therapies that could reverse the cognitive deficits associated with schizophrenia, both in the context of this genetic mutation and the broader population," she added.

In previous studies, Zakharenko’s group found that abnormal nerve cell communication and cognitive dysfunction were associated with elevated levels of Serca2, a protein that regulates calcium in certain nerve cells. These abnormalities are only detectable with age in mice with the 22q11 deletion.

In the current study, the researchers identified the gene Dgcr8 as the source of the changes. It produces molecules called microRNAs that normally keep Serca2 in check. Without them, the protein becomes elevated. By adding these molecules back into the hippocampus of animals with the 22q11 deletion, the researchers were able to reduce elevated Serca2 levels and reduce the cellular deficits associated with this genetic defect.

To assess whether the findings from these genetic mouse studies might translate to schizophrenia, the authors analyzed post-mortem brain tissue from people with schizophrenia. The researchers discovered that Serca2 was elevated even in patients with schizophrenia who did not have the 22q11 deletion.

"These data suggest a link between the nerve cell changes in patients with the 22q11 deletion syndrome and those that occur in patients with schizophrenia," Zakharenko said. "Serca2 regulation represents a novel therapeutic target for schizophrenia."

(Source: sott.net)

Filed under genetic disorders mental illness schizophrenia 22q11 22q11 deletion syndrome nerve cells neuroscience science

15 notes

Cognitive reorganization during pregnancy and the postpartum period: An evolutionary perspective

Whereas non-human animal research investigating reproduction-induced cognitive reorganization has focused on neural plasticity and adaptive advantage in response to the demands of pregnancy and parenting, human studies have primarily concentrated on pregnancy-induced memory decline. The current review updates Henry and Rendell’s 2007 meta-analysis and examines cognitive reorganization resulting from reproductive experience from an adaptationist perspective. Investigations of pregnancy-induced cognitive change in human females may benefit by focusing on areas, such as social cognition, where a cognitive advantage would serve a protective function, and by extending the study duration beyond pregnancy into the postpartum period.

(Source: epjournal.net)

Filed under brain cognition pregnancy evolution neuroscience psychology science

60 notes

Explaining the origins of word order using information theory

The majority of languages — roughly 85 percent of them — can be sorted into two categories: those, like English, in which the basic sentence form is subject-verb-object (“the girl kicks the ball”), and those, like Japanese, in which the basic sentence form is subject-object-verb (“the girl the ball kicks”).

The reason for the difference has remained somewhat mysterious, but researchers from MIT’s Department of Brain and Cognitive Sciences now believe that they can account for it using concepts borrowed from information theory, the discipline, invented almost singlehandedly by longtime MIT professor Claude Shannon, that led to the digital revolution in communications. The researchers will present their hypothesis in an upcoming issue of the journal Psychological Science.

Shannon was largely concerned with faithful communication in the presence of “noise” — any external influence that can corrupt a message on its way from sender to receiver. Ted Gibson, a professor of cognitive sciences at MIT and corresponding author on the new paper, argues that human speech is an example of what Shannon called a “noisy channel.”

“If I’m getting an idea across to you, there’s noise in what I’m saying,” Gibson says. “I may not say what I mean — I pick up the wrong word, or whatever. Even if I say something right, you may hear the wrong thing. And then there’s ambient stuff in between on the signal, which can screw us up. It’s a real problem.” In their paper, the MIT researchers argue that languages develop the word order rules they do in order to minimize the risk of miscommunication across a noisy channel.
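The noisy-channel idea can be made concrete with a toy simulation (an illustration of the general concept, not the model in Gibson's paper): if noise occasionally deletes a word in transmission, a listener who knows the language's fixed word order can still assign who did what to whom from the positions of the words that survive.

```python
# Toy noisy-channel sketch (not the authors' actual model): a three-word
# sentence passes through a channel that deletes each word with some
# probability, and the listener reconstructs roles from rigid SVO positions.
import random

random.seed(0)

def transmit(sentence, p_delete=0.2):
    """Each word independently survives the channel with probability 1 - p_delete."""
    return [w if random.random() > p_delete else None for w in sentence]

def interpret_svo(received):
    """Assign roles purely by position, as a rigid SVO word order allows."""
    roles = ("subject", "verb", "object")
    return {role: w for role, w in zip(roles, received) if w is not None}

sent = ["girl", "kicks", "ball"]
received = transmit(sent)
print(interpret_svo(received))
```

Even when a word is lost, the surviving words keep their roles, so the message degrades gracefully; a word order convention acts as redundancy against channel noise, which is the intuition behind the paper's account of why languages settle on the orders they do.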

[E. Gibson, S.T. Piantadosi, K. Brink, L. Bergen, E. Lim, and R. Saxe. A noisy-channel account of crosslinguistic word order variation. Psychological Science, accepted, 2012]

Filed under language information theory miscommunication communication word order neuroscience science
