Posts tagged science

Mind-controlled artificial limb gives patients sense of touch again
Prosthetic limbs have come a long way since the CO2 gas-powered artificial arms of 1963, exhibited at the Wellcome Trust in 2012.
In the 21st century, the Pentagon’s research division, Darpa, has been at the cutting edge of prosthetics development, in no small part due to the wars in Iraq and Afghanistan.
Darpa’s touch-sensitive prosthetic, described in a statement on 30 May, interfaces directly with the wearer’s nervous system and shows just how far we’ve come.
Unlike direct brain neural interfaces, the prosthetic connects with nerves in the patient’s limb, and therefore requires less invasive, less risky surgery.
It doesn’t require any visual information to operate, allowing the wearer to control it without maintaining visual contact. This makes “blind” tasks, like rummaging through a bag, much easier.
A flat interface nerve electrode (Fine) provides direct sensory feedback to the patient. Fine is a way of hacking into the body’s nervous system by flattening a nerve. This exposes more of the nerve to electrical contact, making it easier to interface with it. Researchers at Case Western Reserve University, involved with the touch-sensitive prosthetic, previously used Fine to reactivate paralysed limbs.
In a video accompanying the statement, the wearer of the prosthetic hand is able to identify, without looking, which finger researchers at Case Western Reserve University are touching.
Groups across the world are engaged in similar research, including a team at the École Polytechnique Fédérale de Lausanne in Switzerland which announced in February that it would be trialling a touch-sensitive prosthetic this year.
Startlingly natural prosthetic movement, including bouncing and catching a tennis ball with a fully artificial arm and hand, is also described in Darpa’s 30 May statement.
Using a type of neural connection called targeted muscle re-innervation (TMR), researchers at the Rehabilitation Institute of Chicago (RIC) were able to achieve simultaneous control of the shoulder, elbow and wrist.
TMR involves re-wiring nerves from amputated limbs so that existing muscles, like those in the shoulder, for example, can be used to control the prosthetic arm.
Last year, Zac Vawter climbed the 442m Willis Tower in Chicago with an artificial leg that used TMR. He was fundraising for the RIC.
This video shows former Army Staff Sgt Glen Lehman, injured in Iraq, demonstrating the full range of fluid motions enabled by the TMR prosthetic arm.
How Birds and Babies Learn to Talk
Few things are harder to study than human language. The brains of living humans can only be studied indirectly, and language, unlike vision, has no analogue in the animal world. Vision scientists can study sight in monkeys using techniques like single-neuron recording. But monkeys don’t talk.
However, in an article published in Nature, a group of researchers, including myself, detail a discovery in birdsong that may help lead to a revised understanding of an important aspect of human language development. Almost five years ago, I sent a piece of fan mail to Ofer Tchernichovski, who had just published an article showing that, in just three or four generations, songbirds raised in isolation often developed songs typical of their species. He invited me to visit his lab, a cramped space stuffed with several hundred birds residing in souped-up climate-controlled refrigerators. Dina Lipkind, at the time Tchernichovski’s post-doctoral student, explained a method she had developed for teaching zebra finches two songs. (Ordinarily, a zebra finch learns only one song in its lifetime.) She had discovered that by switching the song of a tutor bird at precisely the right moment, a juvenile bird could learn a second, new song after it had mastered the first one.
Thinking about bilingualism and some puzzles I had encountered in my own lab, I suggested that Lipkind’s method could be useful in casting light on the question of how a creature—any creature—learns to put linguistic elements together. We mapped out an experiment that day: birds would learn one “grammar” in which every phrase followed the form of ABCABC, and then we would switch things up, giving them a new target, ACBACB (the As, Bs, and Cs were certain stereotyped chirps and peeps).
The results were thrilling: most of the birds could accomplish the task. But it was clearly difficult—it took several weeks for them to learn the new grammar—and it was challenging in a particular way. While the birds showed no sign of needing to relearn individual sounds, the connections between individual syllables, known as “transitions,” proved incredibly difficult. The birds proceeded slowly and systematically, incrementally working out each transition (e.g., from C to B, and B to A). They could not freely move syllables around, and did not engage in trial and error, either. Instead, they undertook a systematic struggle to learn particular connections between specific, individual syllables. The moment they mastered the third transition of the sequence, they were able to produce the entire grammar. Never, to my knowledge, had the process of learning any sort of grammar been so precisely articulated.
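A quick way to see why the switch was so hard (a sketch of the logic, not the paper’s actual analysis): treat each cyclic grammar as a set of syllable-to-syllable transitions. The two grammars share all three syllables but not a single transition, which matches the birds’ need to master each of the three new connections before producing the whole song:

```python
def transitions(phrase):
    """Set of syllable-to-syllable transitions when a phrase is sung on repeat."""
    return {(phrase[i], phrase[(i + 1) % len(phrase)]) for i in range(len(phrase))}

old = transitions("ABC")  # the learned grammar, sung as ABCABC...
new = transitions("ACB")  # the new target, ACBACB...

# Same three syllables, but no transition carries over: all three
# connections (A->C, C->B, B->A) must be worked out from scratch.
print(sorted(new - old))  # [('A', 'C'), ('B', 'A'), ('C', 'B')]
```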
We wrote up the results, but Nature declined to publish them. Then Dina and Ofer speculated that our findings might be more convincing if they were true for not only zebra finches (hardly the Einsteins of the bird world) but for other species as well. Ofer contacted a Japanese researcher, Kazuo Okanoya, who he thought might be able to gather data for Bengalese finches, which have a more complex grammar than zebra finches. Amazingly, the Bengalese finches followed almost exactly the same learning pattern as the zebra finches.
Then we decided to test our ideas about the incrementality of vocal learning in human infants, enlisting the help of a graduate student I had been working with at N.Y.U., Doug Bemis. Bemis and Lipkind analyzed an old, publicly available set of human-babbling data, drawn from the CHILDES database, in a new way. The literature said that in the later part of the first year of life, babies undergo a change from “reduplicated” babbling—repeating a syllable, like bababa—to “variegated” babbling—often switching between syllables, like babadaga. Our birdsong results led us to wonder whether such a change might be more piecemeal than is commonly presumed, and our examination of the data proved that, in fact, the change did not happen all at once. It was gradual, with new transitions worked out one by one; human babies were stymied in the same ways that the birds were. Nobody had ever really explained why babbling took so many months; our birdsong data has finally yielded a first clue.
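The CHILDES analysis itself isn’t reproduced in the article, but the pattern it found — new variegated transitions emerging one or two at a time rather than all at once — can be illustrated on toy data (the syllable strings below are hypothetical, not real babbling transcripts):

```python
def new_transitions_by_session(sessions):
    """For each babbling sample, list the syllable transitions appearing for
    the first time (ignoring reduplicated self-transitions like ba->ba)."""
    seen, firsts = set(), []
    for babble in sessions:
        syl = babble.split("-")
        novel = {t for t in zip(syl, syl[1:]) if t[0] != t[1] and t not in seen}
        seen |= novel
        firsts.append(sorted(novel))
    return firsts

# Hypothetical longitudinal samples from one infant, oldest first:
# pure reduplication, then one new transition, then two more.
sessions = ["ba-ba-ba", "ba-ba-da-da", "ba-da-ga-ba"]
print(new_transitions_by_session(sessions))
# [[], [('ba', 'da')], [('da', 'ga'), ('ga', 'ba')]]
```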
Today, almost five years after Lipkind and Tchernichovski began developing the methods that are at the paper’s core, the work is finally being published by Nature.
What we don’t yet know is whether the similarity between birds and babies stems from a fundamental similarity between species at the biological level. When two species do something in similar ways, it can be a matter of “homology,” a genuine lineage at the genetic level, or “analogy,” which is independent reinvention. It will likely be years before we know for sure, but there is reason to believe that our results are not purely an accident of independent invention. Some of the important genes in human vocal learning (including FOXP2, the gene thus far most decisively tied to human language) are also involved in avian vocal learning, as a new book, “Birdsong, Speech, and Language,” discusses at length.
Language will never be as easy to dissect as birdsong, but knowledge about one can inform knowledge about the other. Our brains didn’t evolve to be easily understood, but the fact that humans share so many genes with so many other species gives scientists a fighting chance.
The Quantified Brain of a Self-Tracking Neuroscientist
A neuroscientist is getting a brain scan twice every week for a year to try to see how neural networks behave over time
Russell Poldrack, a neuroscientist at the University of Texas at Austin, is undertaking some intense introspection. Every day, he tracks his mood and mental state, what he ate, and how much time he spent outdoors. Twice a week, he gets his brain scanned in an MRI machine. And once a week, he has his blood drawn so that it can be analyzed for hormones and gene activity levels. Poldrack plans to gather a year’s worth of brain and body data to answer an unexplored question in the neuroscience community: how do brain networks behave and change over a year?
Microbleeding in Brain May Be Behind Senior Moments
People may grow wiser with age, but they don’t grow smarter. Many of our mental abilities decline after midlife, and now researchers say that they’ve fingered a culprit. A study presented here last week at the annual meeting of the Association for Psychological Science points to microbleeding in the brain caused by stiffening arteries. The finding may lead to new therapies to combat senior moments.
This isn’t the first time that microbleeds have been suspected as a cause of cognitive decline. “We have known [about them] for some time thanks to neuroimaging studies,” says Matthew Pase, a psychology Ph.D. student at Swinburne University of Technology in Melbourne, Australia. The brains of older people are sometimes peppered with dark splotches where blood vessels have burst and created tiny dead zones of tissue. How important these microbleeds are to cognitive decline, and what causes them, have remained open questions, however.
Pase wondered if high blood pressure might be behind the microbleeds. The brain is a very blood-hungry organ, he notes. “It accounts for only 2% of the body weight yet receives 15% of the cardiac output and consumes 20% of the body’s oxygen expenditure.” Rather than getting the oxygen in pulses, the brain needs a smooth, continuous supply. So the aorta, the largest blood vessel branching off the heart, smooths out blood pressure before it reaches the brain by absorbing the pressure with its flexible walls. But as people age, the aorta stiffens. That translates to higher pressure on the brain, especially during stress. The pulse of blood can be strong enough to burst vessels in the brain, resulting in microbleeds.
A stumbling block has been accurately measuring the blood pressure that the brain actually experiences. The hand-pumped armband devices commonly used in doctors’ offices measure only the local pressure of blood in the arm, known as the brachial pressure. Calculating aorta stiffness requires the “central blood pressure” in the aorta. A technique for measuring it, called applanation tonometry (AT), was developed in the late 1990s. It works by comparing the pressure wave of blood leaving the heart with the wave reflected back from the vessels farthest from the heart; aorta stiffness is calculated from the difference between the two. Fast, painless AT devices have since appeared on the market.
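The article doesn’t give the arithmetic, but one standard tonometry-derived summary of wave reflection (and hence stiffness) is the augmentation index: the reflected wave’s contribution as a share of pulse pressure. A minimal sketch, with hypothetical pressures in mmHg:

```python
def augmentation_index(p1, p2, diastolic):
    """Augmentation index (AIx, %) from a central pressure waveform.

    p1: first systolic shoulder (forward wave from the heart)
    p2: later systolic peak produced by the reflected wave
    A stiff aorta reflects the wave back sooner and harder, so p2
    overtakes p1 and AIx rises.
    """
    pulse_pressure = max(p1, p2) - diastolic  # systolic peak minus diastolic
    return (p2 - p1) / pulse_pressure * 100.0

# Compliant aorta: reflection arrives late and small, AIx is negative.
print(augmentation_index(p1=120, p2=110, diastolic=80))  # -25.0
# Stiff aorta: early, large reflection boosts the systolic peak.
print(round(augmentation_index(p1=115, p2=130, diastolic=75), 1))  # 27.3
```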
To see if central blood pressure and aorta stiffening are related to cognitive abilities, Pase and colleagues recruited 493 people in Melbourne, 20 to 82 years old. They made traditional blood pressure measurements and also used AT to measure central blood pressure and estimate aorta stiffness. They also measured their subjects’ cognitive abilities with a standard battery of computer tests.
Central blood pressure and aorta stiffness alone were sensitive predictors of cognitive abilities, Pase reported at the meeting. The higher the central pressure and aorta stiffness, the worse people tended to perform on tests of visual processing and memory. The traditional measures of blood pressure in the arm correlated with scores on only one test of visual processing.
To prove that aorta stiffening causes microbleeds, the researchers will need to repeat the experiment on the same people over the course of several years, using neuroimaging as well to establish that aorta stiffening leads to both microbleeding and cognitive decline. Pase notes that other causes of microbleeding have been proposed, such as weakening of blood vessels in the brain.
"This work is so important because the problem is so pervasive," says Earl Hunt, a veteran intelligence researcher at the University of Washington, Seattle, who was not involved in the work. The individual effects of these microbleeds are probably too small to measure. "But even a trifling difference multiplied a million times is big," he says. Pase’s collaborator at Swinburne, Con Stough, is now leading a study of how to prevent microbleeding through dietary supplements. He proposes that the elasticity of the aorta could be preserved by providing fatty acids or antioxidants that help maintain its structure. The results are expected in 2015.
Genetically engineered immune cells seem to promote healing in mice infected with a neurological disease similar to multiple sclerosis (MS), cleaning up lesions and allowing the mice to regain use of their legs and tails.
The new finding, by a team of University of Wisconsin School of Medicine and Public Health researchers, suggests that immune cells could be engineered to create a new type of treatment for people with MS. Currently, there are few good medications for MS, an autoimmune inflammatory disease that affects some 400,000 people in the United States, and none that reverse the progress of the disease.
Dr. Michael Carrithers, assistant professor of neurology, led a team that created a specially designed macrophage – an immune cell whose name means “big eater.” Macrophages rush to the site of an injury or infection to destroy bacteria and viruses and clear away damaged tissue. The research team added a human gene to the mouse immune cell, creating a macrophage that expressed a sodium channel called NaV1.5, which seems to enhance the cell’s immune response.
But because macrophages can also be part of the autoimmune response that damages the protective covering (myelin) of the nerves in people with MS, scientists weren’t sure whether the NaV1.5 macrophages would help or make the disease worse.
When mice developed experimental autoimmune encephalomyelitis – the mouse version of MS – the researchers found that the NaV1.5 macrophages sought out the lesions caused by the disease and promoted recovery.
“This finding was unexpected because we weren’t sure how much damage they would do, versus how much cleaning up they would do,” Carrithers says. “Some people thought the mice would get more ill, but we found that it protected them and they either had no disease or a very mild case.”
In follow-up experiments, regular mice that do not express the human gene were treated with the NaV1.5 macrophages after the onset of symptoms, which include weakness of the back and front limbs. The majority of these mice went on to develop complete paralysis of their hindlimbs, but almost all of those treated with the NaV1.5 macrophages regained the ability to walk. Mice treated with a placebo solution or with regular mouse macrophages lacking NaV1.5 showed no recovery or became more ill. The research team also found the NaV1.5 macrophages at the sites of the lesions in treated mice, along with smaller lesions and less damaged tissue.
Because the NaV1.5 variation is present in human immune cells, Carrithers says, “The questions are, ‘Why are these repair mechanisms deficient in patients with MS, and what can we do to enhance them?’” He says the long-range goal is to develop the NaV1.5-enhanced macrophages as a treatment for people with MS.
The study is being published in the June issue of the Journal of Neuropathology and Experimental Neurology.
(Source: med.wisc.edu)
Ritalin activates specific areas of the brain in children with attention-deficit/hyperactivity disorder (ADHD), mimicking the brain activity of children without the condition, a new review says.

"This suggests that Ritalin does bring the brain [of a child with ADHD] back to the brain the typically developing kid has," said study author Constance Moore, associate director of the translational center for comparative neuroimaging at the University of Massachusetts Medical School.
Analyzing data from earlier studies that looked at how children’s brains were affected by doing certain tasks that are sometimes challenging for kids with ADHD, the researchers found that Ritalin (methylphenidate) was having a visible impact on three areas of the brain known to be associated with ADHD: the cortex, the cerebellum and the basal ganglia.
The study could be helpful in diagnosing and treating children with ADHD, Moore said. “It may be helpful to know that in certain children, Ritalin is having a physiological effect in the areas of the brain involved with attention and impulse control,” she said.
The research was published recently in the Harvard Review of Psychiatry.
Nine studies analyzed by the researchers used functional MRI to evaluate brain changes after children had taken a single dose of Ritalin. The children were involved in different types of tasks that tested their ability to focus and inhibit an impulse to act.
For example, to observe the brain’s reaction during a test of what is called “inhibitory control,” a child was told that every time he saw a zero show up on a screen, he should push the button on the right; every time he saw an X appear, he should push the left button. The children would then be asked to flip their responses, pushing the left button when they saw a zero.
"That’s hard to do," Moore said, "because you’ve developed the habit [of pushing the other button], so you have to suppress your impulse. If you do 20 zeros and keep pressing and then you see an X, most kids with ADHD will hit the wrong button."
In three out of five of the inhibitory control studies, Ritalin at least partially normalized brain activation in ADHD children.
To see how the brain reacted to a selective attention test, Moore said, children would first be asked what word they were seeing: the word would be “red,” and the color of the type would also be red. Then they would be shown the word “red,” but with the type colored green. In several studies, Ritalin affected activation in the frontal lobes during such tasks.
Most of the studies included in the review were performed in the United States or the United Kingdom. The majority of participants were adolescent boys, and all studies compared their results to healthy children of the same approximate age.
Because none of the studies looked at the correlation between ADHD symptoms and whether the child was taking Ritalin, there is no way to link the changes in brain activation with clinical improvement, Moore said. “It’s possible that kids who are not responsive to Ritalin may have brain changes too,” she said.
ADHD affects between 3 percent and 7 percent of school-aged children in the United States, according to the American Psychiatric Association. Boys are more likely to have ADHD than girls.
One expert was not surprised by the results.
"The review article shows there is a consensus of well-designed imaging studies showing that [Ritalin] has an impact on the frontal cortex of the brain, where we have long believed these patients have issues," said Dr. Andrew Adesman, chief of developmental and behavioral pediatrics at the Steven & Alexandra Cohen Children’s Medical Center of New York, in New Hyde Park. Adesman wondered if Ritalin may play a role in helping the brain mature.
"Their data provides partial support for that," he said. "But if anything, the medicine seems to help the brain look more normal and doesn’t seem to do anything bad to it."
(Source: consumer.healthday.com)

Circadian rhythms control body’s response to intestinal infections
Circadian rhythms can boost the body’s ability to fight intestinal bacterial infections, UC Irvine researchers have found.
This suggests that targeted treatments may be particularly effective for pathogens such as salmonella that prompt a strong immune system response governed by circadian genes. It also helps explain why disruptions in the regular day-night pattern – as experienced by, say, night-shift workers or frequent fliers – may raise susceptibility to infectious diseases.
UC Irvine’s Paolo Sassone-Corsi, one of the world’s leading researchers on circadian rhythm genetics, and microbiologist Manuela Raffatellu led the study, which appears this week in the early online edition of Proceedings of the National Academy of Sciences. Marina Bellet, a postdoctoral researcher from Italy’s University of Perugia, also played a key role in the experiments.
“Although many immune responses are known to follow daily oscillations, the role of the circadian clock in the immune response to acute infections has not been understood,” said Sassone-Corsi, the Donald Bren Professor of Biological Chemistry. “What we’re learning is that the intrinsic power of the body clock can help fight infections.”
Circadian rhythms of 24 hours govern fundamental physiological functions in almost all organisms. The circadian clock is an intrinsic time-tracking system in the human body that anticipates environmental changes and adapts to the appropriate time of day. Disruption of these normal rhythms can profoundly influence people’s health.
Up to 15 percent of human genes are regulated by the day-night pattern of circadian rhythms, including those that respond to intestinal infections.
In tests on mice infected with salmonella, the researchers noted that circadian-controlled genes govern the immune response to the invading pathogen, leading to day-night differences in infection potential and in the immune system’s ability to deal with pathogens.
Mice are nocturnal, with circadian rhythms opposite those of humans. While important differences exist in the immune response of mice and humans, Sassone-Corsi said, these test results could provide clues to how circadian-controlled intestinal genes regulate daily changes in the effectiveness of the human immune system.
“Salmonella is a good pathogen to study what happens during infection,” said Raffatellu, assistant professor of microbiology & molecular genetics. “We think these findings may be broadly applicable to other infectious diseases in the gut, and possibly in other organs controlled by circadian patterns.”
Sassone-Corsi added that it’s important to understand the circadian genetics regulating immunity. “This gives us the ability to target treatments that supplement the power of the body clock to boost immune response,” he said.
(Image: Stephen Sedam / Los Angeles Times)
Exposure to general anaesthesia increases the risk of dementia in the elderly by 35%, says new research presented at Euroanaesthesia, the annual congress of the European Society of Anaesthesiology (ESA). The research is by Dr Francois Sztark, INSERM and University of Bordeaux, France, and colleagues.
Postoperative cognitive dysfunction, or POCD, could be associated with dementia several years later. POCD is a common complication in elderly patients after major surgery. It has been proposed that there is an association between POCD and the development of dementia due to a common pathological mechanism through the amyloid β peptide. Several experimental studies suggest that some anaesthetics could promote inflammation of neural tissues leading to POCD and/or Alzheimer’s disease (AD) precursors including β-amyloid plaques and neurofibrillary tangles. But it remains uncertain whether POCD can be a precursor of dementia.
In this new study, the researchers analysed the risk of dementia associated with anaesthesia within a prospective population-based cohort of elderly patients (aged 65 years and over). The team used data from the Three-City study, designed to assess the risk of dementia and cognitive decline due to vascular risk factors. Between 1999 and 2001, the 3C study included 9294 community-dwelling French people aged 65 years and over in three French cities (Bordeaux, Dijon and Montpellier).
Participants were interviewed at baseline and again 2, 4, 7 and 10 years later. Each examination included a complete cognitive evaluation with systematic screening for dementia. From the 2-year follow-up onwards, 7008 non-demented participants were asked at each visit whether they had had anaesthesia (general anaesthesia (GA) or local/locoregional anaesthesia (LRA)) since the last follow-up. The data were adjusted for potential confounders such as socioeconomic status and comorbidities.
The mean age of participants was 75 years, and 62% were women. At the 2-year follow-up, 33% of the participants (n=2309) reported having had anaesthesia over the previous two years, with 19% (n=1333) reporting a GA and 14% (n=948) an LRA. A total of 632 participants (9%) developed dementia over the 8 subsequent years of follow-up: 284 probable AD, 228 possible AD, and the remaining 120 non-Alzheimer’s dementias. The researchers found that demented patients were more likely to have received anaesthesia (37%) than non-demented patients (32%). The difference was driven by general anaesthetics, with 22% of demented patients reporting a GA compared with 19% of non-demented patients. After adjustment, participants with at least one GA over the follow-up had a 35% increased risk of developing dementia compared with participants without anaesthesia.
Dr Sztark concludes: “These results are in favour of an increased risk for dementia several years after general anaesthesia. Recognition of POCD is essential in the perioperative management of elderly patients. A long-term follow-up of these patients should be planned.”
(Source: eurekalert.org)
Scientists from the University of Sussex have revealed that men are significantly better than women at using speech ‘formants’ to judge the apparent body size of the source of a sound. Formants are important phonetic elements of human speech that mammals also use to assess the body size of potential mates and rivals. This research is the first to indicate that formant perception may have evolved through sexual selection.

Dr. Benjamin D. Charlton and his team tested 18 males and 37 females, aged between 17 and 20 years. Participants heard 60 unique stimulus pairs with different formants, representing two different animals, and their task was to decide which one sounded ‘larger’. Researchers tested the ability of listeners to detect small differences in apparent size across a wide range of formants which encompassed the range of the human speaking voice.
Speech formants, which give us our particular vowel sounds, are based on the length of the vocal tract, and thus relate directly to body size. But whereas men appear to use formants to judge the physical dominance of potential rivals, formants are not consistently found to predict how women rate the attractiveness of men’s voices. Women have been found to be more reliant on voice pitch rather than formants when rating how attractive they find a male voice.
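The link between formants and size follows from simple acoustics. Modelling the vocal tract as a uniform tube closed at one end (a textbook idealisation, not the study’s method), the resonance frequencies scale inversely with tract length, so a longer tract yields lower, more closely spaced formants:

```python
SPEED_OF_SOUND = 35000  # cm/s in warm, humid air (approximate)

def formant_frequencies(tract_length_cm, n=3):
    """Resonances of a uniform tube closed at one end, a crude vocal-tract
    model: F_k = (2k - 1) * c / (4L). Longer tracts give lower formants,
    which is why formants can cue body size."""
    return [(2 * k - 1) * SPEED_OF_SOUND / (4 * tract_length_cm)
            for k in range(1, n + 1)]

print(formant_frequencies(17.5))  # ~adult male tract: [500.0, 1500.0, 2500.0]
print(formant_frequencies(14.5))  # shorter tract: every formant shifts upward
```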
The researchers conclude that the sex differences they report could be either innate or acquired or both. Hence, while they are compatible with the hypothesis that males rely on size assessment more than females, they do not conclusively demonstrate that these abilities arose through sexual selection. For example, it is possible that males learn this skill because this information is more important to them during their everyday social interactions. There may also be key differences across cultures, particularly in societies where gender roles differ markedly. Thus, they look forward to future studies examining the effects of training and personality as well as social and cultural factors.
(Source: royalsociety.org)
Researchers at the MRC Laboratory of Molecular Biology in the United Kingdom have determined the crystal structure of Parkin, a protein found in cells that when mutated can lead to a hereditary form of Parkinson’s disease. The results, which are published in The EMBO Journal, define the position of many of the mutations linked to hereditary Parkinson’s disease and explain how these alterations may affect the stability and function of the protein. The findings may in time reveal how the activity of Parkin is affected in patients with this rare but debilitating type of Parkinson’s disease.
Parkinson’s disease is a progressive neurodegenerative disease that affects more than seven million people worldwide. Most cases of the disease occur in older individuals and are sporadic (non-familial), but around 15% of patients develop symptoms early in life because of inherited mutations in a limited number of disease genes. Why Parkin mutations are especially detrimental in nerve cells is not fully understood, but previous research indicates that Parkin regulates the function of mitochondria, the organelles that generate energy in the cell. Some disease mutations in the PARKIN gene can be easily explained since they lead to loss or instability of the Parkin protein, but many others are more difficult to understand.
Around 50% of cases of familial recessive Parkinson’s disease are caused by mutations in the PARKIN gene, which encodes a protein that belongs to the RBR ubiquitin ligase enzyme family. Enzymes in this family couple other proteins in the cell to a molecule called ubiquitin, a step that can alter the function or stability of these target proteins. To understand how Parkin and other RBR ubiquitin ligase enzymes achieve this, EMBO Young Investigator David Komander and his coworker Tobias Wauer crystallized a form of human Parkin and used X-ray diffraction patterns to determine how the Parkin protein chain folds into a three-dimensional structure. Their experiments revealed an in-built control mechanism for Parkin activity, which is lost in the presence of some of the mutations responsible for Parkinson’s disease. Wauer and Komander pinpointed amino acids of Parkin with key functions in ubiquitin ligase activity that are sensitive to blocking by reagents previously characterized in their laboratory. “This sensitivity to inhibitors that were developed for a very different class of enzymes is particularly exciting,” Komander remarked. “We could also show that these inhibitors affect related RBR ubiquitin ligases such as HOIP, which is important for inflammatory immune responses.”
The crystal structure of Parkin is already revealing some of the secrets of this molecule, which under the right conditions can protect cells from the damage that arises during Parkinson’s disease. “In time the structure may also allow development of other compounds that alter Parkin activity, which could serve as ways to limit the progression and impact of Parkinson’s disease,” concluded Komander.
(Source: embo.org)