Neuroscience

Articles and news from the latest research reports.



Oxytocin gene partly responsible for how adolescents feel in company

Loneliness: could there be a genetic explanation for it? Yes, to some extent! At least in the case of young female adolescents who, it appears, are more likely to feel lonely in everyday life if they have a specific variant of the gene that regulates how oxytocin – also known as the ‘bonding hormone’ – is received in the brain. Boys who carry this variant are not lonelier but, like girls, respond more strongly to a negative social environment. These findings were published this week in the academic journal PLOS ONE.

Oxytocin is a hormone with an important role in social behaviour. In the period following birth, it is an important factor in the bonding process between mother and baby, but it also influences other relationships. The gene that regulates oxytocin sensitivity in the brain varies from one person to another, and various indicators have already suggested that people who are less sensitive to oxytocin are more likely to feel lonely. This prompted a group of behavioural researchers in Nijmegen to carry out a fresh and in-depth study of oxytocin effects in a group in which ‘belonging’ is of paramount importance: young adolescents.

A large group, frequently surveyed

The study involved 278 adolescents, 58 per cent of whom were girls. They were contacted via their smartphones nine times a day over a six-day period and asked to report how they felt and who they were with. The presence of the variant of the oxytocin receptor gene OXTR was also determined. ‘This is a new approach to researching the interaction between gene variation and the environment,’ explains Eeske van Roekel, the lead author of the article published online in PLOS ONE on Monday 4 November. ‘By asking the subjects nine times a day “How do you feel? Who are you with? What do you think of the people you are with?”, we managed to put together a clear picture of how adolescents feel in everyday life. These real-time reports are more reliable than responses after the event.’
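For readers curious what a nine-prompts-a-day sampling design looks like in practice, here is a minimal Python sketch of such a schedule. The waking window, minimum spacing and seed are assumptions for illustration, not the study's actual protocol.

```python
import random

def daily_prompts(rng, n=9, start_hour=8, end_hour=22, min_gap_minutes=30):
    """Draw n prompt times (minutes after midnight), at least min_gap apart."""
    window = range(start_hour * 60, end_hour * 60)
    while True:
        times = sorted(rng.sample(window, n))
        gaps = [b - a for a, b in zip(times, times[1:])]
        if all(g >= min_gap_minutes for g in gaps):
            return times

# Six days of nine prompts each, as in the study's design.
rng = random.Random(42)
schedule = {day: daily_prompts(rng) for day in range(1, 7)}
assert len(schedule) == 6
assert all(len(times) == 9 for times in schedule.values())
```

The rejection loop simply redraws until all prompts are spaced out, a common way to keep signals from bunching together within a day.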

Lonelier with specific OXTR variant

‘Our most important finding was that girls who carried a certain variant of the oxytocin gene in their DNA felt lonelier than girls who did not. Boys with this variant were also adversely affected by negative company at the weekend: their feelings of loneliness increased the longer they were in such company, while boys without this variant were unaffected. This weekend effect applied to both boys and girls.’ The measured effects are small but still relevant, says Van Roekel. ‘These methods reveal more about actual everyday experiences than methods that ask people once at a later date to describe how they felt.’ Heightened sensitivity to negative company in carriers of this specific variant was only visible at weekends. How can that be explained? ‘We surmise that it’s because you have more freedom at the weekend to choose the people you mix with than during the week,’ says Van Roekel. ‘Then it makes a deeper impression if they treat you in a negative manner.’

New trend

No-one knows yet exactly how the receptor gene works. ‘We still don’t know how it translates into, for example, oxytocin levels in the brain,’ says Van Roekel. ‘So more research is needed on that front.’ Research on connections between genes and behaviour is developing gradually. ‘We think that our approach, which takes multiple measurements in the daily life of adolescents, has a lot to offer when it comes to discovering connections.’ Van Roekel conducted her research in the group of Professor Rutger Engels at the Behavioural Science Institute of Radboud University Nijmegen.

Filed under oxytocin oxytocin receptor gene loneliness adolescence neuroscience genetics science


Scientists discover that ants, like humans, can change their priorities

All animals have to make decisions every day. Where will they live and what will they eat? How will they protect themselves? They often have to make these decisions as a group, too, turning what may seem like a simple choice into a far more nuanced process. So, how do animals know what’s best for their survival?


For the first time, Arizona State University researchers have discovered that at least in ants, animals can change their decision-making strategies based on experience. They can also use that experience to weigh different options.

The findings are featured today in the early online edition of the scientific journal Biology Letters, as well as in its Dec. 23 edition.

Co-authors Taka Sasaki and Stephen Pratt, both with ASU’s School of Life Sciences, have studied insect collectives, such as ants, for years. Sasaki, a postdoctoral research associate, specializes in adapting to ants psychological theories and experiments originally designed for humans, hoping to understand how a collective decision-making process arises out of individually ignorant ants.

“The interesting thing is we can make decisions and ants can make decisions – but ants do it collectively,” said Sasaki. “So how different are we from ant colonies?”

To answer this question, Sasaki and Pratt gave a number of Temnothorax rugatulus ant colonies a series of choices between two nests with differing qualities. In one treatment, the entrances of the nests had varied sizes, and in the other, the exposure to light was manipulated. Since these ants prefer both a smaller entrance size and a lower level of light exposure, they had to prioritize.

“It’s kind of like humans buying a house,” said Pratt, an associate professor with the school. “There are so many options to consider – the size, the number of rooms, the neighborhood, the price, if there’s a pool. The list goes on and on. And for the ants it’s similar, since they live in cavities that can be dark or light, big or small. With all of these things, just like with a human house, it’s very unlikely to find a home that has everything you want.”

Pratt went on to explain that because it is impossible to find the perfect habitat, ants make tradeoffs among the various qualities, ranking them in order of importance. But, when faced with a decision between two different homes, the ants displayed a previously unseen level of intelligence.

According to their data, the series of choices the ants faced caused them to reprioritize their preferences based on the type of decision they faced. Ants that had to choose a nest based on light level prioritized light level over entrance size in the final choice. On the other hand, ants that had to choose a nest based on entrance size ranked light level lower in the later experiment.
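One simple way to picture this reprioritization is a weighted-sum choice rule whose attribute weights shift with experience. This is a hypothetical illustration, not the authors' model; the attribute names, qualities and weight-update rule are all invented.

```python
def score(nest, weights):
    """Weighted-sum utility of a candidate nest under the current weights."""
    return sum(weights[attr] * quality for attr, quality in nest.items())

def reinforce(weights, attr, amount=0.5):
    """Boost one attribute's weight after experience with it, renormalizing."""
    boosted = dict(weights)
    boosted[attr] += amount
    total = sum(boosted.values())
    return {k: v / total for k, v in boosted.items()}

# Qualities scaled to [0, 1]; higher is better (darker cavity, smaller entrance).
nest_dark_big = {"darkness": 0.9, "entrance": 0.2}
nest_light_small = {"darkness": 0.2, "entrance": 0.9}

weights = {"darkness": 0.5, "entrance": 0.5}   # initially indifferent: a tie
trained = reinforce(weights, "darkness")       # after light-level experience

# The experienced colony now resolves the tie in favour of the darker nest.
assert score(nest_dark_big, trained) > score(nest_light_small, trained)
```

With equal weights the two nests tie exactly; raising the weight of the experienced attribute is enough to break the tie, mirroring how prior choices reordered the ants' priorities.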

This means that, like people, ants take the past into account when weighing options while making a choice. The difference is that ants somehow manage to do this as a colony without any dissent. While this research builds on groundwork previously laid down by Sasaki and Pratt, the newest experiments have already raised more questions.

“You have hundreds of these ants, and somehow they have to reach a consensus,” Pratt said. “How do they do it without anyone in charge to tell them what to do?”

Pratt likened individual ants to individual neurons in the human brain. Both play a key role in the decision-making process, but no one understands how every neuron influences a decision.

Sasaki and Pratt hope to delve deeper into the realm of ant behavior so that one day, they can understand how individual ants influence the colony. Their greater goal is to apply what they discover to help society better understand how humanity can make collective decisions with the same ease ants display.

“This helps us learn how collective decision-making works and how it’s different from individual decision-making,” said Pratt. “And ants aren’t the only animals that make collective decisions – humans do, too. So maybe we can gain some general insight.”

(Source: asunews.asu.edu)

Filed under ants learning decision making collective decision making neuroscience psychology science


Simple Dot Test May Help Gauge the Progression of Dopamine Loss in Parkinson’s Disease

A pilot study by a multi-disciplinary team of investigators at Georgetown University suggests that a simple dot test could help doctors gauge the extent of dopamine loss in individuals with Parkinson’s disease (PD). Their study is being presented at Neuroscience 2013, the annual meeting of the Society for Neuroscience.

“It is very difficult now to assess the extent of dopamine loss — a hallmark of Parkinson’s disease — in people with the disease,” says lead author Katherine R. Gamble, a psychology PhD student working with two Georgetown psychologists, a psychiatrist and a neurologist. “Use of this test, called the Triplets Learning Task (TLT), may provide some help for physicians who treat people with Parkinson’s disease, but we still have much work to do to better understand its utility,” she adds.

Gamble works in the Cognitive Aging Laboratory, led by the study’s senior investigator, Darlene Howard, PhD, Davis Family Distinguished Professor in the department of psychology and member of the Georgetown Center for Brain Plasticity and Recovery.

The TLT tests implicit learning, a type of learning that occurs without awareness or intent, which relies on the caudate nucleus, an area of the brain affected by loss of dopamine.

The test is a sequential learning task that does not require complex motor skills, which tend to decline in people with PD. In the TLT, participants see four open circles, see two red dots appear, and are asked to respond when they see a green dot appear. Unbeknownst to them, the location of the first red dot predicts the location of the green target. Participants learn implicitly where the green target will appear, and they become faster and more accurate in their responses.
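The task's hidden regularity can be sketched roughly as follows. The locations, cue-to-target mapping and probability here are invented for illustration, since the press release does not give the actual parameters.

```python
import random

LOCATIONS = [0, 1, 2, 3]
# Hypothetical mapping: each first-cue location predicts one target location.
PREDICTS = {0: 2, 1: 3, 2: 0, 3: 1}
P_PREDICTED = 0.8  # assumed probability that the regularity holds on a trial

def make_trial(rng):
    """One trial: two red cue locations, then the green target location."""
    cue1 = rng.choice(LOCATIONS)
    cue2 = rng.choice(LOCATIONS)  # second cue carries no information
    if rng.random() < P_PREDICTED:
        target = PREDICTS[cue1]   # regular trial: first cue predicts target
    else:
        target = rng.choice([loc for loc in LOCATIONS if loc != PREDICTS[cue1]])
    return cue1, cue2, target

rng = random.Random(0)
trials = [make_trial(rng) for _ in range(10_000)]
hit_rate = sum(target == PREDICTS[cue1]
               for cue1, _, target in trials) / len(trials)
# The regularity holds on roughly P_PREDICTED of trials; it is this statistical
# structure that participants pick up implicitly as faster, more accurate responses.
assert 0.75 < hit_rate < 0.85
```

Because the regularity is probabilistic rather than absolute, participants cannot simply memorize a rule; the gradual speed-up reflects implicit statistical learning of the cue-target contingency.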

Previous studies have shown that the caudate region in the brain underlies implicit learning. In the study, participants with PD implicitly learned the dot pattern with training, but loss of dopamine appeared to impair that learning relative to healthy older adults.

“Their performance began to decline toward the end of training, suggesting that people with Parkinson’s disease lack the neural resources in the caudate, such as dopamine, to complete the learning task,” says Gamble.

In this study of 27 people with PD, the research team is now testing how implicit learning may differ across disease stages and drug doses.

“This work is important in that it may be a non-invasive way to evaluate the level of dopamine deficiency in PD patients, one that may lead to future ways to improve clinical treatment of PD patients,” explains Steven E. Lo, MD, associate professor of neurology at Georgetown University Medical Center, and a co-author of the study.

They hope the TLT may one day be a tool to help determine levels of dopamine loss in PD.

(Source: explore.georgetown.edu)

Filed under parkinson's disease dopamine caudate nucleus Neuroscience 2013 neuroscience science


Research gives new insight into how antidepressants work in the brain

Research from Oregon Health & Science University’s Vollum Institute, published in the current issue of Nature (1, 2), is giving scientists a never-before-seen view of how nerve cells communicate with each other. That new view can give scientists a better understanding of how antidepressants work in the human brain — and could lead to the development of better antidepressants with few or no side effects.

The article in today’s edition of Nature came from the lab of Eric Gouaux, Ph.D., a senior scientist at OHSU’s Vollum Institute and a Howard Hughes Medical Institute Investigator. The article describes research that gives a better view of the structural biology of a protein that controls communication between nerve cells. The view is obtained through special structural and biochemical methods Gouaux uses to investigate these neural proteins.

The Nature article focuses on the structure of the dopamine transporter, which helps regulate dopamine levels in the brain. Dopamine is an essential neurotransmitter for the human body’s central nervous system; abnormal levels of dopamine are present in a range of neurological disorders, including Parkinson’s disease, drug addiction, depression and schizophrenia. Along with dopamine, the neurotransmitters noradrenaline and serotonin are transported by related transporters, which can be studied with greater accuracy based on the dopamine transporter structure.

The Gouaux lab’s more detailed view of the dopamine transporter structure better reveals how antidepressants act on the transporters and thus do their work.

The more detailed view could help scientists and pharmaceutical companies develop drugs that do a much better job of targeting what they’re trying to target — and not create side effects caused by a broader blast at the brain proteins.

"By learning as much as possible about the structure of the transporter and its complexes with antidepressants, we have laid the foundation for the design of new molecules with better therapeutic profiles and, hopefully, with fewer deleterious side effects," said Gouaux.

Gouaux’s latest dopamine transporter research is also important because it used the fruit fly version of the molecule, a dopamine transporter that is much more similar to the human one than the bacterial models used in previous studies.

The dopamine transporter article was one of two articles Gouaux had published in today’s edition of Nature. The other dealt with a modified amino acid transporter that mimics the mammalian neurotransmitter transporter proteins targeted by antidepressants. It gives new insights into the pharmacology of four different classes of widely used antidepressants that act on certain transporter proteins, including those for dopamine, serotonin and noradrenaline. The second paper was in part validated by the first paper’s findings on how an antidepressant binds to a specific transporter.

"What we ended up finding with this research was complementary and mutually reinforcing with the other work — so that was really important," Gouaux said. "And it told us a great deal about how these transporters work and how they interact with the antidepressant molecules."

(Source: ohsu.edu)

Filed under antidepressants nerve cells dopamine neurotransmission neuroscience science


Stress makes snails forgetful

New research on pond snails has revealed that high levels of stress can block memory processes. Researchers from the University of Exeter and the University of Calgary trained snails and found that when the animals were exposed to multiple stressful events they were unable to remember what they had learned.


Previous research has shown that stress also affects human ability to remember. This study, published in the journal PLOS ONE, found that experiencing multiple stressful events simultaneously has a cumulative detrimental effect on memory.

Dr Sarah Dalesman, a Leverhulme Trust Early Career Fellow in Biosciences at the University of Exeter, formerly at the University of Calgary, said: “It’s really important to study how different forms of stress interact as this is what animals, including people, frequently experience in real life. By training snails, and then observing their behaviour and brain activity following exposure to stressful situations, we found that a single stressful event resulted in some impairment of memory but multiple stressful events prevented any memories from being formed.”

The pond snail, Lymnaea stagnalis, has easily observable behaviours linked to memory and large neurons in its brain, both useful when studying memory processes. It also responds to stressful events in a similar way to mammals, making it a useful model species for studying learning and memory.

In the study, the pond snails were trained to reduce how often they breathed outside water. Usually pond snails breathe underwater and absorb oxygen through their skin. In water with low oxygen levels the snails emerge and inhale air using a basic lung opened to the air via a breathing hole.

To train the snails not to breathe air they were placed in poorly oxygenated water and their breathing holes were gently poked every time they emerged to breathe. Snail memory was tested by observing how many times the snails attempted to breathe air after they had received their training. Memory was considered to be present if there was a reduction in the number of times they opened their breathing holes. The researchers also assessed memory by monitoring neural activity in the brain. 

Immediately before training, the snails were exposed to two different stressful experiences: low calcium, which is stressful because calcium is necessary for healthy shells, and overcrowding by other pond snails.

When faced with the stressors individually, the pond snails had reduced ability to form long term memory, but were still able to learn and form short and intermediate term memory lasting from a few minutes to hours. However, when both stressors were experienced at the same time, results showed that they had additive effects on the snails’ ability to form memory and all learning and memory processes were blocked. 

Future work will focus on the effects of stress on different populations of pond snail.

(Source: exeter.ac.uk)

Filed under snail lymnaea stagnalis memory neural activity stress neuroscience science


Researchers Develop At-home 3D Video Game for Stroke Patients

Researchers at The Ohio State University Wexner Medical Center have developed a therapeutic at-home gaming program for stroke patients with motor weakness, a condition that affects 80 percent of survivors.

Hemiparesis affects 325,000 individuals each year, according to the National Stroke Association. It is defined as weakness or the inability to move one side of the body, and can be debilitating as it impacts everyday functions such as eating, dressing or grabbing objects.

Constraint-induced movement therapy (CI therapy) is an intense treatment recommended for stroke survivors; it improves motor function as well as the use of impaired upper extremities. However, less than 1 percent of those affected by hemiparesis receive this beneficial therapy.

“Lack of access, transportation and cost are contributing barriers to receiving CI therapy. To address this disparity, our team developed a 3D gaming system to deliver CI therapy to patients in their homes,” said Lynne Gauthier, assistant professor of physical medicine and rehabilitation in Ohio State’s College of Medicine.

Gauthier, also principal investigator of the study and a neuroscientist, is collaborating with a multi-disciplinary team of clinicians, computer scientists, an electrical engineer and a biomechanist to design an innovative video game incorporating the effective ingredients of CI therapy.

For a combined 30 hours over the course of two weeks, the patient-gamer is immersed in a river canyon environment, where he or she receives engaging high repetition motor practice targeting the affected hand and arm. Various game scenarios promote movements that challenge the stroke survivor and are beneficial to recovery. Some examples include: rowing and paddling down a river, swatting away bats inside a cave, grabbing bottles from the water, fishing, avoiding rocks in the rapids, catching parachutes containing supplies and steering to capture treasure chests. Throughout the intensive training schedule, the participant wears a padded mitt on the less affected hand for 10 hours daily, to promote the use of the more affected hand.

To ensure that motor gains made through the game carry over to daily life, the game encourages participants to reflect on their daily use of the weaker arm and engages the gamer in additional problem-solving ways of using the weaker arm for daily activities.

“This novel model of therapy has shown positive results for individuals who have played the game. Gains in motor speed, as measured by the Wolf Motor Function Test, rival those made through traditional CI therapy,” said Gauthier. “It provides intense high quality motor practice for patients, in their own homes. Patients have reported they have more motivation, time goes by quicker and the challenges are exciting and not so tedious.”

Gauthier said that, if this initial trial demonstrates sufficient evidence of efficacy in stroke survivors, future expansion of gaming CI therapy is possible for other patients with traumatic brain injury, cerebral palsy and multiple sclerosis.

Filed under stroke constraint-induced therapy hemiparesis rehabilitation video games neuroscience science


New Study Decodes Brain’s Process for Decision Making

When faced with a choice, the brain retrieves specific traces of memories, rather than a generalized overview of past experiences, from its mental Rolodex, according to new brain-imaging research from The University of Texas at Austin.


Led by Michael Mack, a postdoctoral researcher in the departments of psychology and neuroscience, the study is the first to combine computer simulations with brain-imaging data to compare two different types of decision-making models.

In one model — exemplar — a decision is framed around concrete traces of memories, while in the other model — prototype — the decision is based on a generalized overview of all memories lumped into a specific category.

Whether one model drives decisions more than the other has remained a matter of debate among scientists for more than three decades. But according to the findings, the exemplar model is more consistent with decision-making behavior.
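The two model classes can be illustrated with a small sketch in the spirit of classic category-learning models. The similarity function, stimuli and parameters here are assumptions for illustration, not the study's fitted models. The exemplar model sums similarity to every stored instance, while the prototype model compares only to the category average, so the two can disagree about a probe that matches a stored outlier.

```python
import math

def similarity(x, y, c=2.0):
    """Exponential-decay similarity between two feature vectors."""
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))
    return math.exp(-c * dist)

def exemplar_evidence(item, category_members):
    """Exemplar model: summed similarity to every stored instance."""
    return sum(similarity(item, m) for m in category_members)

def prototype_evidence(item, category_members):
    """Prototype model: similarity to the category's average member only."""
    n = len(category_members)
    proto = [sum(m[i] for m in category_members) / n
             for i in range(len(category_members[0]))]
    return similarity(item, proto)

# Two hypothetical shape categories in a 2-D feature space.
cat_a = [(0.0, 0.0), (0.1, 0.1), (0.9, 0.9)]    # contains an outlying member
cat_b = [(0.5, 0.5), (0.6, 0.4)]

probe = (0.9, 0.9)  # identical to category A's outlier
# The exemplar model credits the exact stored trace and assigns the probe to A;
# the prototype model, which averages A's members, favors B instead.
assert exemplar_evidence(probe, cat_a) > exemplar_evidence(probe, cat_b)
assert prototype_evidence(probe, cat_a) < prototype_evidence(probe, cat_b)
```

Cases like this probe, where the two models make opposite predictions, are exactly what lets behavioral and imaging data discriminate between them.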

The study was published this month in Current Biology. The authors include Alison Preston, associate professor in the Department of Psychology and the Center for Learning and Memory; and Bradley Love, a professor at University College London.

In the study, 20 respondents were asked to sort various shapes into two categories. During the task their brain activity was observed using functional magnetic resonance imaging (fMRI), allowing researchers to see how the respondents associate shapes with past memories.

According to the findings, behavioral research alone cannot determine whether a subject uses the exemplar or prototype model to make decisions. With brain-imaging analysis, researchers found that the exemplar model accounted for the majority of participants’ decisions. The results show three different regions associated with the exemplar model were activated during the learning task: occipital (visual perception), parietal (sensory) and frontal cortex (attention).

While processing new information, the brain stores concrete traces of experiences, allowing it to make different kinds of decisions, such as categorization (is that a dog?), identification (is that John’s dog?) and recall (when did I last see John’s dog?).

To illustrate, Mack says: Imagine having a conversation with a friend about buying a new car. When you think of the category “car,” you’re likely to think of an abstract concept of a car, but not specific details. However, abstract categories are composed of memories from individual experiences. So when you imagine “car,” the abstract mental picture is actually derived from experiences, such as your friend’s white sedan or the red sports car you saw on the morning commute.

“We flexibly memorize our experiences, and this allows us to use these memories for different kinds of decisions,” Mack says. “By storing concrete traces of our experiences, we can make decisions about different types of cars and even specific past experiences in our life with the same memories.”

Mack says this new approach to model-based cognitive neuroscience could lead to discoveries in cognitive research.

“The field has struggled with linking theories of how we behave and act to the activation measures we see in the brain,” Mack says. “Our work offers a method to move beyond simply looking at blobs of brain activation. Instead, we use patterns of brain activation to decode the algorithms underlying cognitive behaviors like decision making.”

(Source: utexas.edu)

Filed under decision making memory brain activity brain imaging neuroscience science


In Animal Study, “Cold Turkey” Withdrawal from Drugs Triggers Mental Decline

Can quitting drugs without treatment trigger a decline in mental health? That appears to be the case in an animal model of morphine addiction. Georgetown University Medical Center researchers say their observations suggest that managing morphine withdrawal could promote a healthier mental state in people.

“Over time, drug-abusing individuals often develop mental disorders,” says Italo Mocchetti, PhD, a professor of neuroscience. “It’s been thought that drug abuse itself contributes to mental decline, but our findings suggest that ‘quitting cold turkey’ can also lead to damage.”

In the study published in the November issue of Brain, Behavior and Immunity and presented at Neuroscience 2013, Mocchetti and his research colleagues treated the animals with morphine, or allowed them to undergo withdrawal by stopping the treatment. Then, they measured pro-inflammatory cytokines, which can promote damage and cell death, and the protein CCL5, which has various protective effects in the brain.

“Interestingly, we found that treating the addicted animals with morphine increased the protective CCL5 protein while decreasing pro-inflammatory cytokines, suggesting a beneficial effect,” Mocchetti explains. The animals that weren’t treated during withdrawal had the opposite results: decreased CCL5 and increased levels of the damaging cytokines.

“From these findings, it appears that morphine withdrawal may be a causative factor that leads to mental decline, presenting an important avenue for research in how we can better help people who are trying to quit using drugs,” concludes Mocchetti.

(Source: explore.georgetown.edu)

Filed under morphine addiction cytokines morphine withdrawal CCL5 mental health neuroscience science


Robotic advances promise artificial legs that emulate healthy limbs
Recent advances in robotics technology make it possible to create prosthetics that can duplicate the natural movement of human legs. This capability promises to dramatically improve the mobility of lower-limb amputees, allowing them to negotiate stairs and slopes and uneven ground, significantly reducing their risk of falling as well as reducing stress on the rest of their bodies.
That is the view of Michael Goldfarb, the H. Fort Flowers Professor of Mechanical Engineering, and his colleagues at Vanderbilt University’s Center for Intelligent Mechatronics, expressed in a Perspectives article in the Nov. 6 issue of the journal Science Translational Medicine.
For the last decade, Goldfarb’s team has been doing pioneering research in lower-limb prosthetics. It developed the first robotic prosthesis with both powered knee and ankle joints. And the design became the first artificial leg controlled by thought when researchers at the Rehabilitation Institute of Chicago created a neural interface for it.
In the article, Goldfarb and graduate students Brian Lawson and Amanda Shultz describe the technological advances that have made robotic prostheses viable. These include lithium-ion batteries that can store more electricity, powerful brushless electric motors with rare-earth magnets, miniaturized sensors built into semiconductor chips, particularly accelerometers and gyroscopes, and low-power computer chips.
The size and weight of these components are small enough that they can be combined into a package comparable to a biological leg, duplicating all of its basic functions. The electric motors play the role of muscles. The batteries store enough power for the robotic legs to operate for a full day on a single charge. The sensors serve the function of the nerves in the peripheral nervous system, providing vital information such as the angle between the thigh and lower leg and the force being exerted on the bottom of the foot. The microprocessor provides the coordination normally handled by the central nervous system. And, in the most advanced systems, a neural interface enhances integration with the brain.
Unlike passive artificial legs, robotic legs have the capability of moving independently and out of sync with its user’s movements. So the development of a system that integrates the movement of the prosthesis with the movement of the user is “substantially more important with a robotic leg,” according to the authors.
Not only must this control system coordinate the actions of the prosthesis within an activity, such as walking, but it must also recognize a user’s intent to change from one activity to another, such as moving from walking to stair climbing.
Identifying the user’s intent requires some connection with the central nervous system. Currently, there are several different approaches to establishing this connection that vary greatly in invasiveness. The least invasive method uses physical sensors that divine the user’s intent from his or her body language. Another method – the electromyography interface – uses electrodes implanted into the user’s leg muscles. The most invasive techniques involve implanting electrodes directly into a patient’s peripheral nerves or directly into his or her brain. The jury is still out on which of these approaches will prove to be best. “Approaches that entail a greater degree of invasiveness must obviously justify the invasiveness with substantial functional advantage,” the article states.
There are a number of potential advantages of bionic legs, the authors point out.
Studies have shown that users equipped with the lower-limb prostheses with powered knee and heel joints naturally walk faster with decreased hip effort while expending less energy than when they are using passive prostheses.
In addition, amputees using conventional artificial legs experience falls that lead to hospitalization at a higher rate than elderly living in institutions. The rate is actually highest among younger amputees, presumably because they are less likely to limit their activities and terrain. There are several reasons why a robotic prosthesis should decrease the rate of falls: Users don’t have to compensate for deficiencies in its movement like they do for passive legs because it moves like a natural leg. Both walking and standing, it can compensate better for uneven ground. Active responses can be programmed into the robotic leg that helps users recover from stumbles.
Before individuals in the U.S. can begin realizing these benefits, however, the new devices must be approved by the U.S. Food and Drug Administration (FDA).
Single-joint devices are currently considered to be Class I medical devices, so they are subject to the least amount of regulatory control. Currently, transfemoral prostheses are generally constructed by combining two, single-joint prostheses. As a result, they have also been considered Class I devices.
In robotic legs the knee and ankle joints are electronically linked. According to the FDA that makes them multi-joint devices, which are considered Class II medical devices. This means that they must meet a number of additional regulatory requirements, including the development of performance standards, post-market surveillance, establishing patient registries and special labeling requirements.
Another translational issue that must be resolved before robotic prostheses can become viable products is the need to provide additional training for the clinicians who prescribe prostheses. Because the new devices are substantially more complex than standard prostheses, the clinicians will need additional training in robotics, the authors point out.
In addition to the robotics leg, Goldfarb’s Center for Intelligent Mechatronics has developed an advanced exoskeleton that allows paraplegics to stand up and walk, which led Popular Mechanics magazine to name him as one of the 10 innovators who changed the world in 2013, and a robotic hand with a dexterity that approaches that of the human hand.

Robotic advances promise artificial legs that emulate healthy limbs

Recent advances in robotics make it possible to create prosthetics that duplicate the natural movement of human legs. This capability promises to dramatically improve the mobility of lower-limb amputees, allowing them to negotiate stairs, slopes, and uneven ground while significantly reducing both their risk of falling and the stress on the rest of their bodies.

That is the view that Michael Goldfarb, the H. Fort Flowers Professor of Mechanical Engineering, and his colleagues at Vanderbilt University’s Center for Intelligent Mechatronics express in a Perspective article in the Nov. 6 issue of the journal Science Translational Medicine.

For the last decade, Goldfarb’s team has conducted pioneering research in lower-limb prosthetics. It developed the first robotic prosthesis with both powered knee and ankle joints, a design that became the first thought-controlled artificial leg when researchers at the Rehabilitation Institute of Chicago created a neural interface for it.

In the article, Goldfarb and graduate students Brian Lawson and Amanda Shultz describe the technological advances that have made robotic prostheses viable. These include lithium-ion batteries that can store more electricity, powerful brushless electric motors with rare-earth magnets, miniaturized sensors built into semiconductor chips (particularly accelerometers and gyroscopes), and low-power computer chips.

The size and weight of these components are small enough that they can be combined into a package comparable to a biological leg and can duplicate all of its basic functions. The electric motors play the role of muscles. The batteries store enough power for the robotic legs to operate for a full day on a single charge. The sensors serve the function of the nerves in the peripheral nervous system, providing vital information such as the angle between the thigh and lower leg and the force being exerted on the bottom of the foot. The microprocessor provides the coordination normally provided by the central nervous system. And, in the most advanced systems, a neural interface enhances integration with the brain.
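The division of labor described above can be sketched as a simple sensor-to-motor loop. The function below is purely illustrative: the gains, sensor names, and the `knee_torque` interface are made-up assumptions for the sketch, not the Vanderbilt team’s actual control software.

```python
def knee_torque(knee_angle_deg, foot_force_n):
    """Toy joint controller: the sensors ('nerves') report joint angle and
    ground-contact force, and the microprocessor ('CNS') computes a motor
    torque (the 'muscle' command). Gains are illustrative only."""
    # More weight on the foot -> stiffer knee; more flexion -> more support torque.
    stance_gain = 0.05 * foot_force_n       # N*m per degree of flexion
    flexion = max(0.0, knee_angle_deg)      # degrees past full extension
    return stance_gain * flexion

# One pass of the loop with example sensor readings:
# a knee flexed 20 degrees while the foot bears 400 N of load.
torque = knee_torque(knee_angle_deg=20.0, foot_force_n=400.0)
print(round(torque, 1))  # 400.0
```

A real controller runs a loop like this hundreds of times per second, which is why low-power microprocessors and efficient brushless motors matter for all-day battery life.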

Unlike passive artificial legs, robotic legs are capable of moving independently, out of sync with their user’s movements. So the development of a system that integrates the movement of the prosthesis with the movement of the user is “substantially more important with a robotic leg,” according to the authors.

Not only must this control system coordinate the actions of the prosthesis within an activity, such as walking, but it must also recognize a user’s intent to change from one activity to another, such as moving from walking to stair climbing.
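One common way to organize such a controller is as a finite-state machine: within each activity mode the joints are coordinated by that mode’s rules, and a separate recognizer decides when the user intends to switch. The mode names and intent signals below are hypothetical illustrations, not the authors’ design.

```python
class ActivityController:
    """Minimal sketch of activity-mode switching in a robotic prosthesis.
    The transition table and intent labels are made-up examples."""

    TRANSITIONS = {
        ("standing", "starting"): "walking",
        ("walking", "stopping"): "standing",
        ("walking", "stairs_ahead"): "stair_climbing",
        ("stair_climbing", "level_ground"): "walking",
    }

    def __init__(self):
        self.mode = "standing"

    def update(self, intent):
        # Stay in the current mode unless a recognized intent triggers a switch.
        self.mode = self.TRANSITIONS.get((self.mode, intent), self.mode)
        return self.mode

ctrl = ActivityController()
print(ctrl.update("starting"))      # walking
print(ctrl.update("stairs_ahead"))  # stair_climbing
```

The hard part, as the next paragraph explains, is not the table itself but inferring the intent signal reliably from the user.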

Identifying the user’s intent requires some connection with the central nervous system. Currently, there are several different approaches to establishing this connection that vary greatly in invasiveness. The least invasive method uses physical sensors that divine the user’s intent from his or her body language. Another method – the electromyography interface – uses electrodes implanted into the user’s leg muscles. The most invasive techniques involve implanting electrodes directly into a patient’s peripheral nerves or directly into his or her brain. The jury is still out on which of these approaches will prove to be best. “Approaches that entail a greater degree of invasiveness must obviously justify the invasiveness with substantial functional advantage,” the article states.

There are a number of potential advantages of bionic legs, the authors point out.

Studies have shown that users equipped with lower-limb prostheses with powered knee and ankle joints naturally walk faster, with decreased hip effort, while expending less energy than when they are using passive prostheses.

In addition, amputees using conventional artificial legs experience falls that lead to hospitalization at a higher rate than the institutionalized elderly. The rate is actually highest among younger amputees, presumably because they are less likely to limit their activities and terrain. There are several reasons why a robotic prosthesis should decrease the rate of falls: because it moves like a natural leg, users don’t have to compensate for deficiencies in its movement as they do with passive legs; whether walking or standing, it can compensate better for uneven ground; and active responses that help users recover from stumbles can be programmed into the leg.
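The programmed stumble response mentioned above can be pictured as a detector plus a pre-planned recovery action. The sensor signal, threshold, and recovery label below are invented for illustration; the article does not specify the actual algorithm.

```python
def detect_stumble(shank_gyro_dps, swing_phase):
    """Toy stumble detector: an abrupt angular-velocity spike during the
    swing phase (e.g., the foot catching an obstacle) flags a stumble.
    The 300 deg/s threshold is a made-up illustration."""
    return swing_phase and abs(shank_gyro_dps) > 300.0

def respond(stumbled):
    # A real leg might rapidly flex the knee and dorsiflex the ankle to
    # clear the obstacle; here we just name the programmed response.
    return "lift-and-extend recovery" if stumbled else "normal gait"

print(respond(detect_stumble(shank_gyro_dps=-450.0, swing_phase=True)))
print(respond(detect_stumble(shank_gyro_dps=80.0, swing_phase=True)))
```

This kind of active response is exactly what a passive prosthesis cannot provide, since it has no actuators to drive a corrective motion.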

Before individuals in the U.S. can begin realizing these benefits, however, the new devices must be approved by the U.S. Food and Drug Administration (FDA).

Single-joint devices are currently considered Class I medical devices, so they are subject to the least amount of regulatory control. Transfemoral prostheses are generally constructed by combining two single-joint prostheses, so they have also been considered Class I devices.

In robotic legs, the knee and ankle joints are electronically linked. According to the FDA, that makes them multi-joint devices, which are considered Class II medical devices. This means they must meet a number of additional regulatory requirements, including the development of performance standards, post-market surveillance, patient registries, and special labeling.

Another translational issue that must be resolved before robotic prostheses can become viable products is training: because the new devices are substantially more complex than standard prostheses, the clinicians who prescribe them will need additional training in robotics, the authors point out.

In addition to the robotic leg, Goldfarb’s Center for Intelligent Mechatronics has developed an advanced exoskeleton that allows paraplegics to stand up and walk, work that led Popular Mechanics magazine to name him one of the 10 innovators who changed the world in 2013, as well as a robotic hand with dexterity approaching that of the human hand.

Filed under robotics robotic leg artificial limbs prosthetics CNS technology neuroscience science

62 notes

New Method Predicts Time from Alzheimer’s Onset to Nursing Home, Death

A Columbia University Medical Center-led research team has clinically validated a new method for predicting time to full-time care, nursing home residence, or death for patients with Alzheimer’s disease. The method, which uses data gathered from a single patient visit, is based on a complex model of Alzheimer’s disease progression that the researchers developed by consecutively following two sets of Alzheimer’s patients for 10 years each. The results were published online ahead of print in the Journal of Alzheimer’s Disease.

“Predicting Alzheimer’s progression has been a challenge because the disease varies significantly from one person to another—two Alzheimer’s patients may both appear to have mild forms of the disease, yet one may progress rapidly, while the other progresses much more slowly,” said senior author Yaakov Stern, PhD, professor of neuropsychology (in neurology, psychiatry, and psychology and in the Taub Institute for Research on Alzheimer’s Disease and the Aging Brain and the Gertrude H. Sergievsky Center) at CUMC. “Our method enables clinicians to predict the disease path with great specificity.”

(Source: newsroom.cumc.columbia.edu)

Filed under alzheimer's disease dementia neurodegeneration neuroscience science
