Neuroscience

Articles and news from the latest research reports.



How a new approach to funding Alzheimer’s research could pay off

More than 5 million Americans suffer from Alzheimer’s disease, the affliction that erodes memory and other mental capacities, but no drugs targeting the disease have been approved by the U.S. Food and Drug Administration since 2003. Now a paper by an MIT professor suggests that a revamped way of financing Alzheimer’s research could spur the development of useful new drugs for the illness.

“We are spending tremendous amounts of resources dealing with this disease, but we don’t have any effective therapies for it,” says Andrew Lo, the Charles E. and Susan T. Harris Professor of Finance and director of the Laboratory for Financial Engineering at the MIT Sloan School of Management. “It really imposes a tremendous burden on society, not just for the afflicted, but also for those who care for them.”

Lo and three co-authors propose creating a public-private partnership that would fund research for a diverse array of drug-discovery projects simultaneously. Such an approach would increase the chances of a therapeutic breakthrough, they say, and the inclusion of public funding would help mitigate the risks and costs of Alzheimer’s research for the private sector.

There would be a long-term public-sector payoff, according to the researchers: Government funding for Alzheimer’s research would pale in comparison to the cost of caring for Alzheimer’s sufferers in public health-care programs. The paper’s model of the new funding approach calls for an outlay of $38.4 billion over 13 years for research; by comparison, the cost of Medicare and Medicaid support for Alzheimer’s patients in 2014 alone is estimated at $150 billion.

“Having parallel development would obviously decrease the waiting time, but it increases the short-run need for funding,” Lo says. “Given how much of an urgent need there is for Alzheimer’s therapies, it has to be the case that if you develop a cure, you’re going to be able to recoup your costs and then some.” In fact, the paper’s model estimates a double-digit return on public investment over the long run.

Lo adds: “Can we afford it? I think a more pressing question is, ‘Can we afford not to do something about this now?’”

Modeling the odds of success

The paper, “Parallel Discovery of Alzheimer’s Therapeutics,” was published today in Science Translational Medicine. Along with Lo, the co-authors of the piece are Carole Ho of the biotechnology firm Genentech, Jayna Cummings of MIT Sloan, and Kenneth Kosik of the University of California at Santa Barbara.

The main hypothesis on the cause of Alzheimer’s involves amyloid deposition, the buildup of plaques in the brain that impair neurological function; most biomedical efforts to tackle the disease have focused on this issue. For the study, Ho and Kosik, leading experts in Alzheimer’s research, compiled a list of 64 conceivable approaches to drug discovery, addressing a range of biological mechanisms that may be involved in the disease.

A fund backing that group of research projects might expand the chances of developing a drug that could, at a minimum, slow the progression of the disease. Even so, it might not raise the odds of success enough to entice pharmaceutical firms and biomedical investment funds to plow money into the problem on their own.

“Sixty-four projects are a lot more than what’s being investigated today, but it’s still way shy of the 150 or 200 that are needed to mitigate the financial risks of an Alzheimer’s-focused fund,” Lo says.

The model assumes 13 years for the development of an individual drug, including clinical trials, and estimates the success rates for drug development. Given 150 trials, the odds of at least two successful trials are 99.59 percent. Two successful trials, Lo says, is what it would take to make the investment — a series of bonds issued by the fund — profitable and attractive to a broad range of investors.
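The arithmetic behind that figure is a standard binomial calculation: with n independent trials, each succeeding with probability p, the chance of at least two successes is 1 minus the zero-success and one-success terms. The quoted 99.59 percent is consistent with an assumed per-trial success probability of about 5 percent, though that rate is an inference here, not a figure taken from the paper:

```python
def prob_at_least_two(n, p):
    """P(at least 2 successes) among n independent trials with success probability p."""
    p_zero = (1 - p) ** n                # no trial succeeds
    p_one = n * p * (1 - p) ** (n - 1)   # exactly one trial succeeds
    return 1 - p_zero - p_one

# 150 parallel drug-development programs, each assumed to have a 5% chance of success
print(f"{prob_at_least_two(150, 0.05):.2%}")  # ≈ 99.59%
```

With only 64 projects at the same assumed rate, the probability of two or more successes drops noticeably, which matches Lo's point that 64 projects fall short of what a diversified fund would need.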

“With a sufficiently high likelihood of success, you can issue debt to attract a large group of bondholders who would be willing to put their money to work,” Lo says. “The enormous size of bond markets translates into enormous potential funding opportunities for developing these therapeutics.”

Stakeholders everywhere

To be clear, Lo says, Alzheimer’s drug development is a very difficult task, since researchers often have to identify a pool of potential patients well before symptoms occur, in order to see how well therapies might work on delaying the onset of the disease.

Compared with the development of new drugs to treat other diseases, “Alzheimer’s drug development is more expensive, takes longer, and needs a larger sample of potential patients,” Lo acknowledges.

However, since the number of Americans suffering from Alzheimer’s is projected to double by 2050, according to the Alzheimer’s Association, an advocacy group, Lo stresses the urgency of the task at hand.

Filed under alzheimer's disease drug development health medicine neuroscience science


Portable brain-mapping device allows researchers to ‘see’ where memory fails student veterans

UT Arlington researchers have successfully used a portable brain-mapping device to show limited prefrontal cortex activity among student veterans with Post Traumatic Stress Disorder when they were asked to recall information from simple memorization tasks.

The study, by bioengineering professor Hanli Liu, Alexa Smith-Osborne, an associate professor of social work, and two other collaborators, was published in the May 2014 edition of NeuroImage: Clinical. The team used functional near infrared spectroscopy to map brain activity responses during cognitive activities related to digit learning and memory retrieval.

Smith-Osborne has used the findings to guide treatment recommendations for some veterans through her work as principal investigator for UT Arlington’s Student Veteran Project, which offers free services to veterans who are undergraduates or who are considering returning to college.

“When we retest those student veterans after we’ve provided therapy and interventions, they’ve shown marked improvement,” Smith-Osborne said. “The fNIRS data have shown improvement in brain functions and responses after the student veterans have undergone treatment.”

Liu said this type of brain imaging allows researchers to “see” which brain region or regions fail to memorize or recall learned knowledge in student veterans with PTSD.

“It also shows how PTSD can affect the way we learn and our ability to recall information, so this new way of brain imaging advances our understanding of PTSD,” Liu said.

The study is multidisciplinary, linking objective brain imaging with the study of neurological disorders and with social work.

Fenghua Tian, a UT Arlington bioengineering faculty associate, is the paper’s primary author, assisted by bioengineering graduate research assistant Amarnath Yennu. Other collaborators include UT Austin psychology professor Francisco Gonzalez-Lima and psychology professor Carol North of UT Southwestern Medical Center and the Veterans Administration North Texas Health Care System.

Khosrow Behbehani, dean of the UT Arlington College of Engineering, said this collaborative research is “allowing the researchers to objectively measure the changes in the level of oxygen in the brain and relate them to some of the brain functions that may have been adversely affected by trauma or stress.”  

Numerous neuropsychological studies have linked learning dysfunctions – such as memory loss, attention deficits and learning disabilities – with PTSD.

The new study involved 16 combat veterans previously diagnosed with PTSD who were experiencing distress and functional impairment affecting cognitive and related academic performance.  The veterans were directed to perform a series of number-ordering tasks on a computer while researchers monitored their brain activity through near infrared spectroscopy, a noninvasive neuroimaging technology.
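Concretely, fNIRS instruments infer changes in oxygenated and deoxygenated hemoglobin (HbO, HbR) from changes in optical density at two near-infrared wavelengths via the modified Beer-Lambert law. The sketch below shows the standard two-wavelength inversion; the extinction coefficients, geometry, and measurements are illustrative placeholders, not values from this study:

```python
import numpy as np

# Modified Beer-Lambert law: dOD(wavelength) = (eps_HbO*dHbO + eps_HbR*dHbR) * L * DPF.
# Two wavelengths give a 2x2 linear system for the two concentration changes.
# Extinction coefficients are illustrative, in 1/(mM*cm): at the shorter
# wavelength HbR absorbs more strongly; at the longer one HbO dominates.
eps = np.array([[0.35, 2.10],   # ~690 nm: [HbO, HbR]
                [1.00, 0.78]])  # ~830 nm: [HbO, HbR]
L = 3.0    # source-detector separation, cm
dpf = 6.0  # differential pathlength factor (dimensionless)

d_od = np.array([0.010, 0.025])          # measured optical-density changes (made up)
A = eps * L * dpf                        # system matrix with pathlength scaling
d_hbo, d_hbr = np.linalg.solve(A, d_od)  # concentration changes, mM
```

A rise in HbO accompanied by a smaller fall in HbR is the usual signature of regional activation; the PTSD finding here is, in effect, a weaker version of that signature over prefrontal cortex during the memory task.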

The research found that participants with PTSD experienced significant difficulty recalling the given digits compared with a control group. This deficiency is closely associated with dysfunction of a region of the right frontal cortex. The team also determined that near infrared spectroscopy was an effective tool for measuring cognitive dysfunction associated with PTSD.

With that information, Smith-Osborne said mental healthcare providers could customize a treatment plan best suited for that individual.

“It’s not a one-size-fits-all treatment plan but a concentrated effort to tailor the treatment based on where that person is on the learning scale,” Smith-Osborne said.

Smith-Osborne and Liu hope that their research results will lead to better, more comprehensive care for veterans and to a better college education for student veterans.

(Source: uta.edu)

Filed under PTSD prefrontal cortex brain activity working memory neuroscience science



(Image caption: Astrocyte activity is shown in green in this slice of tissue from the brain region that controls movement in mice. Internal, structural elements of the astrocytes are shown in magenta; cell bodies are in red. Credit: Amit Agarwal and Dwight Bergles, courtesy of Cell Press)

Fight-Or-Flight Chemical Prepares Cells to Shift the Brain From Subdued to Alert State

A new study from The Johns Hopkins University shows that the brain cells surrounding a mouse’s neurons do much more than fill space. According to the researchers, the cells, called astrocytes because of their star-shaped appearance, can monitor and respond to nearby neural activity, but only after being activated by the fight-or-flight chemical norepinephrine. Because astrocytes can alter the activity of neurons, the findings suggest that astrocytes may help control the brain’s ability to focus.

The study involved observing the cells in the brains of living, active mice over long periods of time. A combination of genetically engineered mice and advanced microscopy allowed the researchers to visualize the activity of astrocyte networks in different regions of the brain to learn how these abundant supporting cells are controlled.

The scientists monitored astrocytes in the area of the brain responsible for controlling movement and saw that the cells often increased their activity as the mice walked on treadmills — but not always, and sometimes astrocytes became active when the animals were not moving. This lack of consistency suggested to the researchers that the astrocytes were not responding to nearby neurons, as had been thought.

Similarly, astrocytes in the vision processing area of the brain did not necessarily become active when the mice were stimulated with light, but they were sometimes active, even in the dark. The team solved both mysteries when they tested the idea that the astrocytes needed a signal to “wake them up” before they could respond to nearby neurons. That is how they found that norepinephrine, the brain’s broadly distributed fight-or-flight signal, primes the astrocytes in both locations to “listen in” on nearby neuronal activity.

“Astrocytes are among the most abundant cells in the brain, but we know very little about how they are controlled and how they contribute to brain function,” says Dwight Bergles, Ph.D., professor of neuroscience, who led the study. “Since memory formation and other important functions of the brain require a state of attention, we’re interested in learning more about how astrocytes help create that state.”

For example, Bergles says, “We know that astrocytes can regulate local blood flow, provide energy to neurons and release signaling molecules that alter neuronal activity. They could be doing any or all of those things in response to being activated. It is also possible that they act as a sort of megaphone to broadcast local norepinephrine signals to every neuron in the brain.” Whatever the case may be, researchers now know that astrocytes are not idle loiterers. This ability to study astrocyte network activity in animals as they do different things will help to reveal how these cells contribute to brain function.

This research was published in the journal Neuron on June 18.

Filed under astrocytes neural activity norepinephrine visual cortex neuroscience science



Modelling how neurons work together

A newly-developed, highly accurate representation of the way in which neurons behave when performing movements such as reaching could not only enhance understanding of the complex dynamics at work in the brain, but aid in the development of robotic limbs which are capable of more complex and natural movements.

Researchers from the University of Cambridge, working in collaboration with the University of Oxford and the Ecole Polytechnique Fédérale de Lausanne (EPFL), have developed a new model of a neural network, offering a novel theory of how neurons work together when performing complex movements. The results are published in the 18 June edition of the journal Neuron.

While an action such as reaching for a cup of coffee may seem straightforward, the millions of neurons in the brain’s motor cortex must work together to prepare and execute the movement before the coffee ever reaches our lips. When we reach for the much-needed cup of coffee, the neurons spring into action, sending a series of signals from the brain to the hand. These signals are transmitted across synapses – the junctions between neurons.

Determining exactly how the neurons work together to execute these movements is difficult, however. The new theory was inspired by recent experiments carried out at Stanford University, which had uncovered some key aspects of the signals that neurons emit before, during and after the movement. “There is a remarkable synergy in the activity recorded simultaneously in hundreds of neurons,” said Dr Guillaume Hennequin of the University’s Department of Engineering, who led the research. “In contrast, previous models of cortical circuit dynamics predict a lot of redundancy, and therefore poorly explain what happens in the motor cortex during movements.”

Better models of how neurons behave will not only aid in our understanding of the brain, but could also be used to design prosthetic limbs controlled via electrodes implanted in the brain. “Our theory could provide a more accurate guess of how neurons would want to signal both movement intention and execution to the robotic limb,” said Dr Hennequin.

The behaviour of neurons in the motor cortex can be likened to a mousetrap or a spring-loaded box, in which the springs are waiting to be released and are let go once the lid is opened or the mouse takes the bait. As we plan a movement, the ‘neural springs’ are progressively flexed and compressed. When released, they orchestrate a series of neural activity bursts, all of which takes place in the blink of an eye.

The signals transmitted by the synapses in the motor cortex during complex movements can be either excitatory or inhibitory, in essence mirror images of each other. For the most part these signals cancel each other out, leaving only occasional bursts of activity.

Using control theory, a branch of mathematics well-suited to the study of complex interacting systems such as the brain, the researchers devised a model of neural behaviour which achieves a balance between the excitatory and inhibitory synaptic signals. The model can accurately reproduce a range of multidimensional movement patterns.
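The "compressed spring" behaviour such a balanced network can produce has a minimal linear caricature: a system that is stable (all its modes decay) yet non-normal, so activity stored in one mode is transiently amplified through a strong feedforward connection before dying away. This two-unit toy only illustrates the mechanism; it is not the paper's model:

```python
import numpy as np

# Stable but non-normal dynamics: both eigenvalues of W are -1, yet the strong
# feedforward weight k lets the hidden mode x2 drive a transient burst in x1.
k = 10.0
W = np.array([[-1.0, k],
              [0.0, -1.0]])

dt, steps = 0.01, 1000
x = np.array([0.0, 1.0])  # the 'compressed spring': activity prepared in x2
norms = []
for _ in range(steps):
    x = x + dt * (W @ x)  # Euler step of dx/dt = W x
    norms.append(float(np.linalg.norm(x)))

# Activity first grows well above its initial size, then relaxes back to rest,
# mirroring the burst of motor-cortex activity released at movement onset.
peak, final = max(norms), norms[-1]
```

In the paper's setting the same idea operates at much higher dimension, with inhibitory weights tuned (via control theory) so that the richly connected excitatory network stays stable while still supporting these transients.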

The researchers found that neurons in the motor cortex might not be wired together with nearly as much randomness as had been previously thought. “Our model shows that the inhibitory synapses might be tuned to stabilise the dynamics of these brain networks,” said Dr Hennequin. “We think that accurate models like these can really aid in the understanding of the incredibly complex dynamics at work in the human brain.”

Future directions for the research include building a more realistic, ‘closed-loop’ model of movement generation in which feedback from the limbs is actively used by the brain to correct for small errors in movement execution. This will expose the new theory to the more thorough scrutiny of physiological and behavioural validation, potentially leading to a more complete mechanistic understanding of complex movements.

Filed under neurons neural networks motor cortex motor movements prosthetic limbs robotics neuroscience science



Hearing protein required to convert sound into brain signals

A specific protein found in the bridge-like structures that make up part of the auditory machinery of the inner ear is essential for hearing. The absence of this protein or impairment of the gene that codes for this protein leads to profound deafness in mice and humans, respectively, reports a team of researchers in the journal EMBO Molecular Medicine.

“The goal of our study was to identify which isoform of protocadherin-15 forms the tip-links, the essential connections of the auditory mechanotransduction machinery within mature hair cells that are needed to convert sound into electrical signals,” remarks Christine Petit, the lead author of the study and Professor at the Institut Pasteur in Paris and at Collège de France.

Three types of protocadherin-15 are known to exist in the auditory sensory cells of the inner ear, but it was not clear which of these protein isoforms was essential for hearing. “Our work pinpoints the CD2 isoform of protocadherin-15 as an essential component of the tip-link and reveals that the absence of protocadherin-15 CD2 in mouse hair cells results in profound deafness,” she adds.

Within the hair bundle, the sensory antenna of the auditory sensory cell, the tip-link is a bridge-like structure that, when stretched, can activate the ion channel responsible for generating electrical signals from sound. Tension created in the tip-link by sound stimulation opens this channel, whose molecular composition is still unknown, generating electrical signals and, ultimately, the perception of sound.

The researchers engineered mice that lack the CD2 isoform of protocadherin-15 only during adulthood. While the absence of this isoform led to profound deafness, the lack of the other protocadherin-15 isoforms did not affect the animals’ hearing.

Patients who carry a mutation in the gene encoding protocadherin-15 are affected by a rare, devastating disorder, Usher syndrome, which is characterized by profound deafness, balance problems, and gradual loss of vision due to retinitis pigmentosa. In a separate approach, the scientists also sequenced the genes of 60 patients who had profound deafness without balance or visual impairment. Three of these patients were found to have mutations specifically affecting protocadherin-15 CD2. “The demonstration of a requirement for protocadherin-15 CD2 for hearing not only in mice but also in humans constitutes a major step in the objective of deciphering the components of the auditory mechanotransduction machinery. This isoform can be used as a starting point to identify the other components of the auditory machinery. By focusing our attention on the CD2 isoform of protocadherin-15, we can now consider developing gene therapy strategies for deafness caused by defects in this gene,” says EMBO Member Christine Petit.

Filed under hair cells inner ear usher syndrome hearing protocadherin-15 medicine science

75 notes

Boost for dopamine packaging protects brain in Parkinson’s model

Researchers from Emory’s Rollins School of Public Health discovered that an increase in the protein that helps store dopamine, a critical brain chemical, led to enhanced dopamine neurotransmission and protection from a Parkinson’s disease-related neurotoxin in mice.

Dopamine and related neurotransmitters are packaged into small storage vesicles by the vesicular monoamine transporter (VMAT2). When released from these vesicles, dopamine helps regulate movement, pleasure, and emotional response. Low dopamine levels are associated with neurodegenerative diseases such as Parkinson’s disease, and recent research has shown that VMAT2 function is impaired in people with the disease.

Lead researcher Gary W. Miller, PhD, professor and associate dean for research at the Rollins School of Public Health, and his team generated transgenic mice with increased levels of VMAT2 and found that this led to an increase in dopamine release. The group also found improved outcomes on anxiety- and depression-related behaviors, increased movement, and protection from MPTP, a chemical that causes Parkinson’s disease-related damage in the brain.

The complete study is available in the June 17, 2014 edition of Proceedings of the National Academy of Sciences (PNAS).

According to Miller, “This work suggests that enhanced vesicular filling can be sustained over time and may be a viable therapeutic approach for a variety of central nervous system disorders that involve the storage and release of dopamine, serotonin or norepinephrine.”

(Source: news.emory.edu)

Filed under parkinson's disease dopamine VMAT2 neurotransmitters neuroscience science

382 notes

Stress hormone linked to short-term memory loss as we age

A new study at the University of Iowa reports a potential link between stress hormones and short-term memory loss in older adults.

The study, published in the Journal of Neuroscience, reveals that having high levels of cortisol—a natural hormone in our body whose levels surge when we are stressed—can lead to memory lapses as we age.

Short-term increases in cortisol are critical for survival: they promote coping and help us respond to life’s challenges by making us more alert and able to think on our feet. But abnormally high or prolonged spikes in cortisol, like those that occur under long-term stress, can lead to negative consequences that research has repeatedly linked to digestive problems, anxiety, weight gain, and high blood pressure.

In this study, the UI researchers linked elevated amounts of cortisol to the gradual loss of synapses in the prefrontal cortex, the region of the brain that houses short-term memory. Synapses are the connections that help us process, store, and recall information. And when we get older, repeated and long-term exposure to cortisol can cause them to shrink and disappear.

“Stress hormones are one mechanism that we believe leads to weathering of the brain,” says Jason Radley, assistant professor in psychology at the UI and corresponding author on the paper. “Like a rock on the shoreline, after years and years it will eventually break down and disappear.”

While previous studies have shown cortisol to produce similar effects in other regions of the aging brain, this was the first study to examine its impact on the prefrontal cortex.

And although preliminary, the findings raise the possibility that short-term memory decline in aging adults may be slowed or prevented by treatments that decrease cortisol levels in susceptible individuals, says Radley. That could mean treating people who have naturally high levels of cortisol, such as those who are depressed, or those who experience repeated, long-term stress due to traumatic life events like the death of a loved one.

According to Radley and Rachel Anderson, the paper’s lead author and a second-year graduate student in psychology at the UI, short-term memory lapses related to cortisol start around age 65. That is roughly the equivalent of the 21-month-old rats the pair studied to make their discovery.

The UI scientists compared the elderly rats to four-month-old rats, which are roughly equivalent in age to a 20-year-old person. The young and elderly groups were then divided further according to whether the rats had naturally high or naturally low levels of corticosterone—the hormone in rats comparable to cortisol in humans.

The researchers subsequently placed the rats in a T-shaped maze that required them to use their short-term memory. In order to receive a treat, they needed to recall which direction they had turned at the top of the T just 30, 60, or 120 seconds ago and then turn the opposite way each time they ran the maze.
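The trial logic of that delayed-alternation task can be sketched in a few lines of Python (a hypothetical illustration; the trial counts and memory parameters below are assumptions, not values from the paper):

```python
import random

random.seed(42)

def delayed_alternation(p_remember, trials=60):
    """Score one session of a T-maze delayed-alternation task.

    A trial counts as correct when the rat turns opposite to the arm it
    chose last time; p_remember is the chance it still recalls that arm
    after the delay. When memory fails, it guesses (50/50).
    """
    correct = 0
    for _ in range(trials):
        if random.random() < p_remember:
            correct += 1          # remembered the last arm and alternated
        elif random.random() < 0.5:
            correct += 1          # forgot, but guessed the right arm
    return correct / trials

# Hypothetical memory strengths for the three delays used in the study.
for delay_s, p in [(30, 0.8), (60, 0.6), (120, 0.3)]:
    print(f"{delay_s:>3} s delay: {delayed_alternation(p):.0%} correct")
```

Under this sketch the expected accuracy is (1 + p_remember) / 2, so chance performance (50 percent) corresponds to no memory of the previous turn at all, which is why scores not far above it indicate a substantial memory deficit.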

Though memory declined across all groups as the time rats waited before running the maze again increased, older rats with high corticosterone levels consistently performed the worst. They chose the correct direction only 58 percent of the time, compared to their older peers with low corticosterone levels who chose it 80 percent of the time.

When researchers took tissue samples from the rats’ prefrontal cortexes and examined them under a microscope, they found the poor performers had smaller and 20 percent fewer synapses than all other groups, indicating memory loss.

In contrast, older rats with low corticosterone levels showed little memory loss and ran the maze nearly as well as the younger rats, who were not affected by any level of corticosterone—low or high.

Still, researchers say it’s important to remember that stress hormones are only one of a host of factors when it comes to mental decline and memory loss as we age.

Filed under stress memory cortisol STM prefrontal cortex synapses aging neuroscience science

354 notes

Researchers identify new compound to treat depression

There is new hope for people suffering from depression. Researchers have identified a compound, hydroxynorketamine (HNK), that may treat symptoms of depression just as effectively and rapidly as ketamine, without the unwanted side effects associated with the psychoactive drug, according to a study in the July issue of Anesthesiology, the official medical journal of the American Society of Anesthesiologists® (ASA®).  Interestingly, use of HNK may also serve as a future therapeutic approach for treating neurodegenerative disorders such as Alzheimer’s and Parkinson’s diseases, the authors note.

“The clinical use of ketamine therapy for depression is limited because the drug is administered intravenously and may produce adverse effects such as hallucinations and sedation to the point of anesthesia,” said Irving Wainer, Ph.D., senior investigator with the Intramural Research Program at the National Institute on Aging, Baltimore. “We found that the HNK compound significantly contributes to the anti-depressive effects of ketamine in animals, but doesn’t produce the sedation or anesthesia, which makes HNK an attractive alternative as an antidepressant in humans.”

HNK is one of several different compounds produced when ketamine, an anesthesia medicine-turned-antidepressant, is broken down (metabolized) in the body. Using a rat model, researchers tested HNK to see if the compound alone could produce the same beneficial effects attributed to ketamine without ketamine’s unwanted side effects. 

In the study, rats were given intravenous doses of ketamine, HNK and another compound produced by ketamine metabolism known as norketamine. The effect each had on stimulating certain cellular pathways of the rats’ brains was examined after 20, 30 and 60 minutes.  Brain tissue from drug-free rats was used as a control.

Researchers found the compound HNK, like ketamine, not only produced potent and rapid antidepressant effects, but also stimulated neuro-regenerative pathways and initiated the regrowth of neurons in rats’ brains. HNK also appears to have several advantages over ketamine in that it is 1,000 times more potent, does not act as an anesthetic agent, and can be taken by mouth, the authors report. 

Surprisingly, HNK was also found to reduce the production of D-serine, a chemical found in the body, overproduction of which is associated with neurodegenerative disorders such as Alzheimer’s and Parkinson’s diseases. HNK’s ability to reduce the production of D-serine, while stimulating the regeneration of neuron connections in the brain, may present a potential new therapeutic approach to the treatment of these disorders. 

“HNK’s unique properties increase the possibility of the development of a self-administered, daily treatment that works quickly and can be taken at home for a variety of central nervous system diseases,” said Dr. Wainer.  “This is a very exciting discovery and we hope that the results of this study will enable future investigations into this potentially therapeutic and important compound.”

Dr. Wainer and several of the study’s authors are listed as co-inventors on a patent application for the use of ketamine compounds in the treatment of bipolar disorder and major depression. 

Filed under hydroxynorketamine ketamine depression neurodegenerative diseases norketamine medicine science

180 notes

Does the moon affect our sleep?

Beliefs about the influence of the moon on humans are widespread, and many people report sleeplessness around the time of the full moon. In contrast to earlier studies, scientists from the Max Planck Institute of Psychiatry in Munich did not observe any correlation between human sleep and the lunar phases. The researchers analyzed pre-existing data from a large cohort of volunteers and their recorded nights of sleep. Their identification of further, mostly unpublished null findings suggests that the conflicting results of previous studies might be due to publication bias.

For centuries, people have believed that the moon cycle influences human health, behavior and physiology. Folklore mainly links the full moon with sleeplessness. But what about the scientific background?

Several studies have searched re-analyses of pre-existing datasets on human sleep for a lunar effect, but the results varied widely, and the effects on sleep have rarely been assessed with objective measures such as sleep EEG. In some studies women appeared more affected by the moon phase; in others, men. Two analyses of datasets from 2013 and 2014, each including between 30 and 50 volunteers, agreed on a shorter total sleep duration in the nights around the full moon. However, the two studies reached conflicting results on other variables. For example, in one analysis the onset of REM sleep, the phase in which we mainly dream, was most delayed around the new moon, whereas the other study observed the longest delay around the full moon.

To overcome the problem of possible chance findings in small study samples, the scientists analyzed the sleep data of 1,265 volunteers across 2,097 nights. “Investigating this large cohort of test persons and sleep nights, we were unable to replicate previous findings,” states Martin Dresler, neuroscientist at the Max Planck Institute of Psychiatry in Munich, Germany, and the Donders Institute for Brain, Cognition and Behaviour in Nijmegen, Netherlands. “We could not observe a statistically relevant correlation between human sleep and the lunar phases.” In addition, his team identified several unpublished null findings, including cumulative analyses of more than 20,000 sleep nights, which suggest that the conflicting results might be an example of publication bias (i.e., the file drawer problem).

The file drawer problem describes the phenomenon that many studies may be conducted but never reported: they remain in the file drawer. One much-discussed publication bias in science, medicine and pharmacy is the tendency to report experimental results that are positive or show a significant finding and to omit results that are negative or inconclusive.
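A toy simulation makes the file-drawer effect concrete (a hypothetical sketch, not the Max Planck team's analysis): when many small studies of a truly null lunar effect are run but only the striking results are written up, the "published" record shows an effect that the full set of studies does not.

```python
import random
import statistics

random.seed(1)

def small_study(n=40):
    """One small sleep study in which the true lunar effect is zero.
    Returns the observed full-moon minus new-moon difference in sleep minutes."""
    full_moon = [random.gauss(0, 30) for _ in range(n)]
    new_moon = [random.gauss(0, 30) for _ in range(n)]
    return statistics.mean(full_moon) - statistics.mean(new_moon)

effects = [small_study() for _ in range(200)]

# File drawer: only differences that look large ever leave the lab.
published = [d for d in effects if abs(d) > 10]

print(f"mean effect across all 200 studies: {statistics.mean(effects):+.1f} min")
print(f"mean |effect| among the {len(published)} 'published' studies: "
      f"{statistics.mean(abs(d) for d in published):.1f} min")
```

Averaged over all simulated studies the effect is near zero, but the filtered subset suggests a sizable lunar influence on sleep, which is exactly why the unpublished null findings the team collected matter.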

Until now, the influence of the lunar cycle on human sleep has been investigated only in re-analyses of earlier studies that were originally designed for other purposes. “To overcome the obvious limitations of retrospective data analysis, carefully controlled studies specifically designed for the test of lunar cycle effects on sleep in large samples are required for a definite answer,” comments Dresler.

Filed under sleep lunar phases EEG moon cycle psychology neuroscience science

119 notes

New Study Shows Limited Motor Skills In Early Infancy May Be Trait of Autism

Researchers from Kennedy Krieger Institute in Baltimore, Md., announced findings that provide evidence for reduced grasping and fine motor activity among six-month-old infants with an increased familial risk for autism spectrum disorders (ASD). The research, published in Child Development, has important implications for our overall understanding of ASDs. Furthermore, the results suggest that subtle lags in object exploration-related motor skills in early infancy may represent an ASD endophenotype (a heritable characteristic that may be genetically related to ASD without predicting a full diagnosis) and further our understanding of the genes involved in the disorder.

“Among the infants with familial history of ASD, many were shown to have reduced fine motor skills regardless of eventual ASD diagnosis,” says Dr. Rebecca Landa, lead author and director of Kennedy Krieger’s Center for Autism and Related Disorders. “This means that reduced fine motor skills could be an ASD endophenotype without predicting full diagnosis. Identifying potential endophenotypes has important implications for future research and may improve our understanding of the neurobiology and genetics of ASDs.”

Researchers conducted two experiments examining the relationship between early motor development and object exploration in children at low risk (LR) or high risk (HR) of developing an ASD. They measured key early learning skills, such as object manipulation and grasping activity, in infants at six months of age and again at 10 months. While all infants scored within the expected range and showed no difference in their object manipulation, there were subtle signs of reduced grasping activity in HR infants compared to their LR age-peers. These findings suggest that regardless of developmental outcomes, early motor skill differences in HR infants may represent an endophenotype linked to ASD.

About Experiment 1

In Experiment 1, participants included 129 infants, largely consisting of infant siblings of children with confirmed ASD diagnoses. Most participants were six months old during the testing period and were then followed longitudinally to the age of 36 months. Infants completed an assessment using the Mullen Scales of Early Learning (MSEL), a standardized assessment tool providing scores in five categories: Gross Motor (GM), Fine Motor (FM), Visual Reception (VR), Receptive Language (RL), and Expressive Language (EL). Based on the results of this assessment, infants were divided into four groups: low-risk (LR) infants without ASD; high-risk (HR) infants without ASD, language, or social delays; HR infants showing language or social delays but not ASD; and HR infants with an autism or ASD diagnosis. All children in the HR ASD group met DSM-IV diagnostic criteria for the disorder.

All four groups in Experiment 1 scored within the typical range on the MSEL subtests, meaning that none exhibited a clinical delay in overall fine motor development at age six months. Yet subtle differences between HR and LR infants emerged even in HR infants who did not receive a diagnosis of ASD or other delays by age 36 months, suggesting that lower fine motor scores on the MSEL are characteristic of infants at high familial risk for ASD. To examine whether the HR infants would catch up to the LR infants over time, the researchers conducted a second experiment with new participants.

About Experiment 2

Experiment 2 focused on a new group of six-month-old infants in both LR and HR categories and examined only their grasping behaviors in a naturalistic, free-play context, an important factor that emerged in Experiment 1. Participants included 42 infants who were siblings of children with ASD. The infants were observed in an unstructured play session.

The results of Experiment 2 showed reduced grasping and object exploration activity in six-month-old infants at HR for ASD. Overall, the MSEL FM T-score results observed in Experiment 2 show a similar pattern as in Experiment 1, though the statistical results are somewhat weakened by an effect of gender in the LR sample. Unique to Experiment 2 was the sole focus on object manipulation-related items of the MSEL, which offered a consistent measure for identifying differences between HR and LR infants. Reduced grasping activity in HR infants at six months of age was also observed during the unstructured free-play task, providing additional evidence for the findings of Experiment 1. However, the HR infants caught up to the LR group in grasping, as measured in this study, by 10 months of age.

Future studies are needed to examine these preliminary findings more closely and to specifically assess grasping ability in infants who receive an ASD diagnosis later in life.

New Study Shows Limited Motor Skills In Early Infancy May Be Trait of Autism

Researchers from Kennedy Krieger Institute in Baltimore, Md., announced findings that provide evidence for reduced grasping and fine motor activity among six-month-old infants with an increased familial risk for autism spectrum disorders (ASD). The research, which was published in Child Development, has important implications for our overall understanding of ASDs. Furthermore, the results suggest that subtle lags in object exploration-related motor skills in early infancy may present an ASD endophenotype - a heritable characteristic that may have genetic relation to ASD without predicting a full diagnosis- and further our understanding of the genes involved in the disorder.

“Among the infants with familial history of ASD, many were shown to have reduced fine motor skills regardless of eventual ASD diagnosis,” says Dr. Rebecca Landa, lead author and director of Kennedy Krieger’s Center for Autism and Related Disorders. “This means that reduced fine motor skills could be an ASD endophenotype without predicting full diagnosis. Identifying potential endophenotypes has important implications for future research and may improve our understanding of the neurobiology and genetics of ASDs.”

Researchers conducted two experiments examining the correlation of early motor development and object exploration in children with low risk (LR) or high risk (HR) of developing an ASD. Researchers measured key early learning skills, such as object manipulation and grasping activity, in infants at six months of age and again at 10 months. While all infants scored within the expected range and showed no difference in terms of their object manipulation, there were subtle signs that showed reduced grasping activity in HR infants as compared to their LR age-peers. These findings demonstrate that regardless of developmental outcomes, early motor skill differences in HR infants may represent an endophenotype that can be linked to ASD.

About Experiment 1

In Experiment 1, participants included 129 infants, largely infant siblings of children with confirmed ASD diagnoses. Most participants were six months old at initial testing and were followed longitudinally to the age of 36 months. Infants completed the Mullen Scales of Early Learning (MSEL), a standardized assessment tool providing scores in five categories: Gross Motor (GM); Fine Motor (FM); Visual Reception (VR); Receptive Language (RL); and Expressive Language (EL). Infants were then divided into four outcome groups: low-risk (LR) infants without ASD; high-risk (HR) infants without ASD, language, or social delays; HR infants showing language or social delays but not ASD; and HR infants with an autism or ASD diagnosis. All children in the HR ASD group met DSM-IV diagnostic criteria for the disorder.

All four groups in Experiment 1 scored within the typical range on the MSEL subtests, meaning that none exhibited a clinical delay in overall fine motor development at age six months. Nevertheless, subtle differences between HR and LR infants emerged, even among HR infants who did not receive a diagnosis of ASD or other delays by age 36 months, suggesting that lower fine motor scores on the MSEL are characteristic of infants at high familial risk for ASD. To examine whether the HR infants would catch up to the LR infants over time, researchers conducted a second experiment with new participants.

About Experiment 2

Experiment 2 focused on a new group of six-month-old infants in both LR and HR categories and examined only their grasping behaviors in a naturalistic, free-play context, an important factor that emerged in Experiment 1. Participants included 42 infants who were siblings of children with ASD, each observed in an unstructured play session.

The results of Experiment 2 showed reduced grasping and object exploration activity in six-month-old infants at HR for ASD. Overall, the MSEL FM T-score results in Experiment 2 followed a pattern similar to Experiment 1, though the statistical results were somewhat weakened by an effect of gender in the LR sample. Unique to Experiment 2 was its sole focus on the object manipulation-related items of the MSEL, which offered a consistent measure for identifying differences between HR and LR infants. Reduced grasping activity in HR infants at age six months was also observed during the unstructured free-play task, providing additional evidence for the findings of Experiment 1. However, the HR infants caught up to the LR group in grasping, as measured in this study, by 10 months of age.

Future studies are needed to examine these preliminary findings more closely and to specifically assess grasping ability in infants who later receive an ASD diagnosis.

(Image: Bigstock)

