Neuroscience

Articles and news from the latest research reports.

Posts tagged neuroscience


Sheep Help Scientists Fight Huntington’s Disease

When University of Cambridge neurobiologist Jenny Morton began working with sheep five years ago, she anticipated docile, dull creatures. Instead she discovered that sheep are complex and curious. Morton, who studies neurodegenerative diseases such as Huntington’s, is helping evaluate sheep as new large animal models for human brain diseases.

Huntington’s is a fatal, hereditary illness that causes a cascade of cell death in the brain’s basal ganglia. The idea of using sheep to study the disease arose in 1993 in New Zealand, a country where sheep outnumber humans seven to one. Researchers had already identified disorders shared by humans and sheep, but University of Auckland neuroscientist Richard Faull and geneticist Russell Snell had a more ambitious notion. They decided to develop a line of sheep carrying Huntington’s, which is brought on by an expansion of CAG repeats within the gene IT15, in hopes of studying the condition’s progression and developing a treatment. They accomplished their goal in 2006 after extensive efforts.
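To make the "repeats" concrete: the mutation is an abnormally long run of the CAG triplet in IT15. A minimal sketch of how one might count that run in a sequencing read (the length thresholds below are the commonly cited approximate cutoffs, assumed here for illustration only):

```python
import re

def longest_cag_run(seq: str) -> int:
    """Length, in repeats, of the longest uninterrupted CAG run."""
    runs = re.findall(r"(?:CAG)+", seq.upper())
    return max((len(r) // 3 for r in runs), default=0)

def classify(seq: str) -> str:
    # Roughly: ~35 or fewer repeats is typical, 36-39 shows reduced
    # penetrance, and 40 or more is fully penetrant Huntington's.
    n = longest_cag_run(seq)
    if n >= 40:
        return f"{n} repeats: fully penetrant"
    if n >= 36:
        return f"{n} repeats: reduced penetrance"
    return f"{n} repeats: typical"
```

A real assay would measure repeat length directly rather than string-match a read, but the classification logic is the same idea.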

Why sheep? For one, they have big brains with developed cortical folding like our own, comparable to those of macaques, the only other large animals currently used to study this disease. Also, sheep can be kept in large paddocks with their fellows and monitored remotely via data-logger backpacks, allowing scientists to study them in a natural setting with fewer ethical concerns than caged primates raise. What is more, these long-lived, social animals are active and expressive, recognize faces, and have long memories. They also learn quickly and engage readily in experiments, which has allowed Morton to develop cognitive tests similar to those given to humans. The researchers can study the full progression of Huntington’s, which in humans involves gradual mental and motor decline, and compare the changes with the normal functioning of healthy individuals.

This spring Faull, Snell, Morton and their colleagues will begin monitoring two flocks of Huntington’s sheep in Australia. One flock will be inoculated with one of the most promising therapies yet devised—a virus that silences IT15’s mutations—and the other will serve as the control. Currently no cure exists for any human brain disease. The researchers believe these studies could be a milestone. “The tragedy of this disease is enormous. It’s a curse on the family,” Faull says. “Maybe we can lift that curse.”

Filed under huntington's disease animals sheep mutations genetics neuroscience science


Rice opens new window on Parkinson’s disease

Rice University scientists have discovered a new way to look inside living cells and see the insoluble fibrillar deposits associated with Parkinson’s disease.

The combined talents of two Rice laboratories – one that studies the misfolded proteins that cause neurodegenerative diseases and another that specializes in photoluminescent probes – led to the spectroscopic technique that could become a valuable tool for scientists and pharmaceutical companies.

The research by the Rice labs of Angel Martí and Laura Segatori appeared online today in the Journal of the American Chemical Society.

The researchers designed a molecular probe based on the metallic element ruthenium. In tests inside live neuroglioma cells, they found that the probe binds to the misfolded alpha-synuclein proteins that clump together to form fibrils and disrupt the cell’s functions. The ruthenium complex lit up when triggered by a laser – but only when attached to a fibril – which allowed aggregation to be tracked using photoluminescence spectroscopy.
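Because the probe emits only when bound to fibrils, the photoluminescence trace over time approximates an aggregation curve. A toy sketch of one standard readout from such a curve (the function name and interpolation scheme are illustrative, not from the published method): the half-completion time, i.e. when the signal first crosses half of its plateau.

```python
def t50(times, signal):
    """Time at which the signal first crosses half of its final plateau."""
    half = (min(signal) + max(signal)) / 2
    for (t0, s0), (t1, s1) in zip(zip(times, signal),
                                  zip(times[1:], signal[1:])):
        if s0 < half <= s1:
            # linearly interpolate between the two bracketing samples
            return t0 + (half - s0) * (t1 - t0) / (s1 - s0)
    return None  # signal never reached half-maximum
```

Comparing t50 values across conditions is a common way aggregation kinetics are summarized; a drug that slows fibril formation would push t50 later.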

Filed under brain parkinson's disease alpha-synuclein proteins photoluminescence spectroscopy neuroscience science


Video-based Test to Study Language Development in Toddlers and Children with Autism

Parents often wonder how much of the world their young children really understand. Though typically developing children are not able to speak or point to objects on command until they are between eighteen months and two years old, they do provide clues that they understand language as early as the age of one. These clues provide a point of measurement for psychologists interested in language comprehension of toddlers and young children with autism, as demonstrated in a new video-article published in JoVE (Journal of Visualized Experiments).

In the assessment, psychologists track a child’s eye movements while the child watches two side-by-side videos. Children who understand language are more likely to look at the video that matches the audio, so language comprehension is tested through attention rather than by asking the child to respond or point something out. Furthermore, all assessments can be conducted in the child’s home, using mobile, commercially available equipment. The technique, developed in the laboratory of Dr. Letitia Naigles, is known as the portable intermodal preferential looking (IPL) assessment.
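The core measure is simple: of the time the child spends looking at either screen, what fraction goes to the audio-matching video? A minimal sketch, with gaze samples coded per frame as left, right, or away (the coding scheme here is illustrative, not the published protocol):

```python
def match_proportion(gaze_samples, matching_side):
    """Fraction of on-screen gaze samples on the audio-matching video.

    gaze_samples: per-frame codes, each "L", "R", or "away".
    matching_side: "L" or "R", whichever video matches the audio.
    """
    on_screen = [g for g in gaze_samples if g in ("L", "R")]
    if not on_screen:
        return 0.0  # child never looked at either screen
    return sum(g == matching_side for g in on_screen) / len(on_screen)
```

A proportion reliably above 0.5 across trials is the behavioral signature of comprehension, with no verbal response or pointing required.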

"When I started working with children with autism, I realized that they have similar issues with strangers that very young typical children do," Dr. Naigles tells us. "Children with autism may understand more than they can show because they are not socially inclined and find social interaction aversive and challenging." Dr. Naigles’ approach helps make this assessment more valuable. By testing the child in the home, where they are comfortable, Dr. Naigles removes much of the anxiety associated with a new environment that may skew results.

While this technique identifies some similarities between typically developing toddlers and children with autism spectrum disorder, such as understanding some types of sentences before they produce them, this does not mean that these children are the same. “Some strategies of word learning that typical children have acquired are not demonstrated in children with autism,” Dr. Naigles says. By illuminating both strengths and weaknesses, the test is valuable for assessing language development. “JoVE is useful because in the past, I have gone to visit various labs to coach them in putting together an IPL. JoVE will enable other labs to set up the procedure more efficiently.” JoVE associate editor Allison Diamond stated, “Showing this work in a video format will allow other scientists in the field to quickly adapt Dr. Naigles’ technique, and use it to address the question of language development in autism, an extremely important field of research.”

Filed under autism language language development eye movements language comprehension psychology neuroscience science


Resistance to cocaine addiction may be passed down from father to son

Research from the Perelman School of Medicine at the University of Pennsylvania and Massachusetts General Hospital (MGH) reveals that sons of male rats exposed to cocaine are resistant to the rewarding effects of the drug, suggesting that cocaine-induced changes in physiology are passed down from father to son. The findings are published in the latest edition of Nature Neuroscience.

"We know that genetic factors contribute significantly to the risk of cocaine abuse, but the potential role of epigenetic influences – how the expression of certain genes related to addiction is controlled – is still relatively unknown," said senior author R. Christopher Pierce, PhD, associate professor of Neuroscience in Psychiatry at Penn. "This study is the first to show that the chemical effects of cocaine use can be passed down to future generations to cause a resistance to addictive behavior, indicating that paternal exposure to toxins such as cocaine can have profound effects on gene expression and behavior in their offspring."

In the current study, the team used an animal model to study inherited effects of cocaine abuse. Male rats self-administered cocaine for 60 days, while controls were administered saline. The males were then mated with females that had never been exposed to the drug. To eliminate any influence the males’ behavior might have on the pregnant females, the pairs were separated immediately after mating.

The rats’ offspring were monitored to see whether they would begin to self-administer cocaine when it was offered to them. The researchers discovered that male offspring of rats exposed to the drug, but not the female offspring, acquired cocaine self-administration more slowly and had decreased levels of cocaine intake relative to controls. Moreover, control animals were willing to work significantly harder for a single cocaine dose than the offspring of cocaine-addicted rats, suggesting that the rewarding effect of cocaine was decreased.
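"Working harder for a single dose" is typically quantified with a progressive-ratio schedule: the number of lever presses required for each dose escalates, and the breakpoint is the last requirement the animal completes before giving up. A hedged sketch of that readout (the schedule values are illustrative, not the study's actual parameters):

```python
def breakpoint(requirements, responses_per_trial):
    """Highest response requirement completed before the animal quit.

    requirements: escalating presses needed per dose, e.g. [1, 2, 4, 8, ...].
    responses_per_trial: presses the animal actually made on each trial.
    """
    bp = 0
    for required, made in zip(requirements, responses_per_trial):
        if made >= required:
            bp = required  # requirement met; dose earned
        else:
            break  # animal gave up at this requirement
    return bp
```

A higher breakpoint indicates a stronger rewarding effect; here, the sons of cocaine-exposed sires would show lower breakpoints than controls.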

In collaboration with Ghazaleh Sadri-Vakili, MS, PhD, from MGH, the researchers subsequently examined the animals’ brains and found that male offspring of the cocaine-addicted rats had increased levels of a protein in the prefrontal cortex called brain-derived neurotrophic factor (BDNF), which is known to blunt the behavioral effects of cocaine.

"We were quite surprised that the male offspring of sires that used cocaine didn’t like cocaine as much," said Pierce. "While we identified one change in the brain that appears to underlie this cocaine resistance effect, there are undoubtedly other physiological changes as well and we are currently performing more broad experiments to identify them. We also are eager to perform similar studies with more widely used drugs of abuse such as nicotine and alcohol."

The findings suggest that cocaine use causes epigenetic changes in sperm, thereby reprogramming the information transmitted between generations. The researchers don’t know exactly why only the male offspring received the cocaine-resistant trait from their fathers, but speculate that sex hormones such as testosterone, estrogen and/or progesterone may play a role.

(Source: eurekalert.org)

Filed under animal model cocaine cocaine addiction genetics epigenetics neuroscience science


Neuroscience offers a glimpse into the mind - and our future

Hassan Rasouli recently accomplished a remarkable feat: He lifted his thumb in a way that suggests he was making a thumbs-up gesture.

The feat was a remarkable one since doctors at Sunnybrook Health Sciences Centre in Toronto had diagnosed him as being in a persistent vegetative state (PVS), a mysterious condition in which patients appear to be awake but show no clinical signs of conscious awareness.

The condition first came to prominence in 1998 when family members, and then courts and politicians, engaged in a protracted battle over the care of Floridian Terri Schiavo. The matter was finally settled in 2005 when Schiavo, who was in a persistent vegetative state, was removed from life support and died.

Doctors at Sunnybrook similarly wanted to transfer Rasouli to palliative care, but Rasouli’s family refused. The doctors therefore sought a court order, and the Supreme Court of Canada heard arguments in the case on Monday.

The court’s decision might not affect Rasouli since, given his ability to make a thumbs-up gesture, he is no longer considered to be in a persistent vegetative state. But the case could have a profound impact on the many other patients who have been diagnosed as being in a PVS, as it could answer pressing legal questions about when someone can be removed from life support, and who has the authority to order that such support be discontinued.

The Rasouli case also raises further troubling questions of fact: Was Rasouli’s ability to give a thumbs-up gesture an indication that his condition had improved, or was he never in a persistent vegetative state? Was he, and other people similarly diagnosed, always consciously aware, but, thanks to being trapped in a paralyzed body, unable to express his thoughts?

(Illustration by Bert Dodson)


Filed under brain brain scans vegetative state neuroimaging neuroscience science


Controversial Surgery for Addiction Burns Away Brain’s Pleasure Center

How far should doctors go in attempting to cure addiction? In China, some physicians are taking the most extreme measures. By destroying parts of the brain’s “pleasure centers” in heroin addicts and alcoholics, these neurosurgeons hope to stop drug cravings. But damaging the brain region involved in addictive desires risks permanently ending the entire spectrum of natural longings and emotions, including the ability to feel joy.

In 2004, the Ministry of Health in China banned this procedure due to a lack of data on long-term outcomes and growing outrage in Western media over whether the patients were fully aware of the risks.

However, some doctors were allowed to continue performing it for research purposes – and recently, a Western medical journal even published a new study of the results. In 2007, The Wall Street Journal detailed the practice of a physician who claimed that, after the 2004 ban, he had performed 1,000 such procedures to treat conditions such as depression, schizophrenia and epilepsy; the surgery for addiction has also since been performed on at least as many people.


Filed under brain addiction pleasure center neurosurgery nucleus accumbens neuroscience science


The ethical minefield of using neuroscience to prevent crime

On the evening of March 10, 2007, Abdelmalek Bayout, an Algerian citizen living in Italy, brutally stabbed to death Walter Perez, a fellow immigrant from Colombia. Bayout admitted to the crime, saying he was provoked by Perez, who ridiculed him for wearing eye makeup.

According to Nature magazine, Bayout’s defence argued that he was mentally ill at the time of the offence. The court accepted that argument and, although it found Bayout guilty of the crime, imposed on him a reduced prison sentence of nine years and two months.

Bayout nevertheless appealed the judgment, and the Court of Appeal ordered a new psychiatric report. That report showed, among other things, that Bayout had low levels of monoamine oxidase A (MAO-A), an enzyme that breaks down neurotransmitters such as serotonin and dopamine. This was an important finding: previous research had shown that men with both low MAO-A activity and a history of childhood abuse were more likely to be convicted of violent crimes as adults.

Ultimately, the Court of Appeal further reduced Bayout’s sentence by a year, with Judge Pier Valerio Reinotti describing the MAO-A evidence as “particularly compelling.”

Upon a brief review of the scientific evidence, certain glaring problems with the court’s judgment quickly become apparent. Most obviously, the research showing an association between low MAO-A levels and violence tells us nothing about Bayout’s — or any specific individual’s — propensity for violence. Indeed, while a significant percentage of men with low MAO-A levels commit violent offences, the majority do not.
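The base-rate point deserves a quick worked example. The numbers below are invented purely to illustrate the arithmetic, not taken from the MAO-A literature: even if a marker doubles the group-level rate of violent offending, the overwhelming majority of carriers still never offend, so the marker predicts almost nothing about any one defendant.

```python
def offender_fraction(base_rate, risk_ratio):
    """Fraction of marker carriers who offend, given a relative risk."""
    return base_rate * risk_ratio

# Hypothetical: 5% of the general population offends violently, and the
# marker doubles that risk. Then 10% of carriers offend...
low_maoa_rate = offender_fraction(base_rate=0.05, risk_ratio=2.0)

# ...which means 90% of carriers never do.
non_offenders = 1 - low_maoa_rate
```

This is exactly the gap between epidemiological association and individual propensity that the court's reasoning glossed over.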

Yet the fact that the court allowed such evidence to influence its verdict suggests that neuroscience, while not eliminating criminal responsibility, might lead courts to conclude that defendants with certain neurological deficits are less responsible than those with “normal” brains.

There is, in fact, a precedent for this, and it’s one that few people question. Adolescents in virtually every country are subject to differential sentencing, and in many cases to an entirely separate system of justice, because their neurobiology renders them less blameworthy, less responsible than adults.

Indeed, while the limbic system, the emotional centre of the brain, is typically mature by the age of 16, the prefrontal cortex (PFC), which is associated with one’s capacity to control emotions, is not fully developed in most people until the early 20s. Hence, according to what’s sometimes called the “two systems” theory, the imbalance in development between the limbic system and the PFC explains the risk-taking and emotional behaviour characteristic of adolescence. And it justifies our treating adolescents as less responsible than adults.

There are, of course, substantial differences between adolescents and adults with neurological deficits, the most obvious being that most adolescents will outgrow the developmental imbalance. But the basic principle — that people who suffer from neurological aberrations that render them less capable of controlling their behaviour should be held less blameworthy — seems to have swayed the Italian Court of Appeal.

But not just the Italian Court of Appeal. While the “MAO-A defence” has been tried and failed in many courts around the world, recent research led by University of Utah psychologist Lisa Aspinwall suggests that many judges, when presented with neurobiological evidence, are inclined to reduce defendants’ sentences.


Filed under brain neurotransmitters MAO-A neurological deficits crime prefrontal cortex neuroscience science


Hacking the Human Brain: The Next Domain of Warfare
It’s been fashionable in military circles to talk about cyberspace as a “fifth domain” for warfare, along with land, space, air and sea. But there’s a sixth and arguably more important warfighting domain emerging: the human brain.
This new battlespace is not just about influencing hearts and minds with people seeking information. It’s about involuntarily penetrating, shaping, and coercing the mind in the ultimate realization of Clausewitz’s definition of war: compelling an adversary to submit to one’s will. And the most powerful tool in this war is brain-computer interface (BCI) technologies, which connect the human brain to devices.
Current BCI work ranges from researchers compiling and interfacing neural data, as in the Human Connectome Project, to scientists hardening the human brain against rubber-hose cryptanalysis and technologists connecting the brain to robotic systems. While these groups are developing BCIs for security or humanitarian purposes, the reality is that misapplication of such research and technology has significant implications for the future of warfare.
Where BCIs can provide opportunities for injured or disabled soldiers to remain on active duty post-injury, enable paralyzed individuals to use their brain to type, or allow amputees to feel using bionic limbs, they can also be exploited if hacked. BCIs can be used to manipulate … or kill.
Recently, security expert Barnaby Jack demonstrated the vulnerability of biotechnological systems by highlighting how easily pacemakers and implantable cardioverter-defibrillators (ICDs) could be hacked, raising fears about the susceptibility of even life-saving biotechnological implants. This vulnerability could easily be extended to biotechnologies that connect directly to the brain, such as vagus nerve stimulation or deep-brain stimulation.
Outside the body, recent experiments have shown that the brain can control and maneuver quadcopter drones and metal exoskeletons. How long before we harness the power of mind-controlled weaponized drones – or use BCIs to enhance the power, efficiency, and sheer lethality of our soldiers?
Given that military research arms such as the United States’ DARPA are investing in understanding complex neural processes and enhanced threat detection through BCI scan for P300 responses, it seems the marriage between neuroscience and military systems will fundamentally alter the future of conflict.
And it is here that military researchers need to harden the systems that enable military application of BCIs. We need to prevent BCIs from being disrupted or manipulated, and safeguard against the ability of the enemy to hack an individual’s brain.
The possibilities for damage, destruction, and chaos are very real. This could include manipulating a soldier’s BCI during conflict so that s/he were forced to pull the gun trigger on friendlies, install malicious code in his own secure computer system, call in inaccurate coordinates for an air strike, or divulge state secrets to the enemy seemingly voluntarily. Whether an insider has fallen victim to BCI hacking and exploits a system from within, or an external threat is compelled to initiate a physical attack on hard and soft targets, the results would present major complications: in attribution, effectiveness of kinetic operations, and stability of geopolitical relations.
Like every other domain of warfare, the mind as the sixth domain is neither isolated nor removed from other domains; coordinated attacks across all domains will continue to be the norm. It’s just that military and defense thinkers now need to account for the subtleties of the human mind … and our increasing reliance upon the brain-computer interface.
Regardless of how it will look, though, the threat is real and not as far away as we would like – especially now that researchers just discovered a zero-day vulnerability in the brain.

Hacking the Human Brain: The Next Domain of Warfare

It’s been fashionable in military circles to talk about cyberspace as a “fifth domain” for warfare, along with land, space, air and sea. But there’s a sixth and arguably more important warfighting domain emerging: the human brain.

This new battlespace is not just about influencing the hearts and minds of people seeking information. It’s about involuntarily penetrating, shaping, and coercing the mind in the ultimate realization of Clausewitz’s definition of war: compelling an adversary to submit to one’s will. And the most powerful tools in this war are brain-computer interface (BCI) technologies, which connect the human brain to devices.

Current BCI work ranges from researchers compiling and interfacing neural data, as in the Human Connectome Project, to scientists hardening the human brain against rubber-hose cryptanalysis, to technologists connecting the brain to robotic systems. While these groups are developing BCIs for security or humanitarian purposes, the reality is that misapplication of such research and technology has significant implications for the future of warfare.

While BCIs can provide opportunities for injured or disabled soldiers to remain on active duty, enable paralyzed individuals to type with their thoughts, or allow amputees to feel through bionic limbs, they can also be exploited if hacked. BCIs can be used to manipulate … or kill.

Recently, security expert Barnaby Jack demonstrated the vulnerability of biotechnological systems by highlighting how easily pacemakers and implantable cardioverter-defibrillators (ICDs) could be hacked, raising fears about the susceptibility of even life-saving biotechnological implants. This vulnerability could easily be extended to biotechnologies that connect directly to the brain, such as vagus nerve stimulation or deep-brain stimulation.

Outside the body, recent experiments have shown that the brain can control and maneuver quadcopter drones and metal exoskeletons. How long before we harness the power of mind-controlled weaponized drones – or use BCIs to enhance the power, efficiency, and sheer lethality of our soldiers?

Given that military research arms such as the United States’ DARPA are investing in understanding complex neural processes and in enhanced threat detection through BCIs that scan for P300 responses, it seems the marriage between neuroscience and military systems will fundamentally alter the future of conflict.
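The P300 is a positive deflection in the EEG roughly 300 ms after a person sees a stimulus they recognize as significant, which is why it is attractive for threat-detection systems. As a rough, hypothetical sketch of how such detection works (the function name, sampling rate, and threshold here are illustrative assumptions, not any specific DARPA system): average many stimulus-locked EEG epochs so random noise cancels out, then test whether the averaged signal shows a peak in the 250–500 ms window that stands well above the pre-stimulus baseline.

```python
import numpy as np

def detect_p300(epochs, fs=250, window=(0.25, 0.5), threshold=4.0):
    """Hypothetical P300 detector: average stimulus-locked EEG epochs,
    then test whether the peak 250-500 ms after the stimulus exceeds
    `threshold` standard deviations of the early baseline."""
    avg = epochs.mean(axis=0)                    # grand average across trials
    baseline = avg[: int(0.1 * fs)]              # first 100 ms as baseline
    lo, hi = int(window[0] * fs), int(window[1] * fs)
    peak = avg[lo:hi].max()
    z = (peak - baseline.mean()) / (baseline.std() + 1e-9)
    return bool(z > threshold)

# Synthetic demo: 20 noisy epochs, "target" epochs carry a bump at ~300 ms
fs, n = 250, 20
rng = np.random.default_rng(0)
t = np.arange(int(0.8 * fs)) / fs                        # 800 ms epochs
bump = 5.0 * np.exp(-((t - 0.3) ** 2) / 0.002)           # P300-like deflection
target = rng.normal(0, 1, (n, t.size)) + bump            # recognized stimulus
nontarget = rng.normal(0, 1, (n, t.size))                # ignored stimulus
```

Averaging is the key step: a single epoch is too noisy to classify, but the deflection survives averaging while the noise shrinks by a factor of roughly the square root of the number of trials.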

And it is here that military researchers need to harden the systems that enable military application of BCIs. We need to prevent BCIs from being disrupted or manipulated, and safeguard against the ability of the enemy to hack an individual’s brain.
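One concrete form such hardening could take is cryptographic authentication of the command channel, so that an attacker who can inject radio traffic still cannot forge a valid instruction to the device. The sketch below is purely illustrative (the command strings, framing, and function names are assumptions, not a real device protocol): each command is tagged with an HMAC keyed by a secret shared between the implant and its controller, and a nonce blocks replay of captured commands.

```python
import hmac, hashlib, os

# Provisioned shared secret, one per device pairing (illustrative)
KEY = os.urandom(32)

def sign_command(command: bytes, nonce: bytes) -> bytes:
    # Tag binds the command to a fresh nonce, preventing replay attacks
    return hmac.new(KEY, nonce + command, hashlib.sha256).digest()

def verify_command(command: bytes, nonce: bytes, tag: bytes) -> bool:
    expected = hmac.new(KEY, nonce + command, hashlib.sha256).digest()
    # Constant-time comparison avoids leaking the tag via timing
    return hmac.compare_digest(expected, tag)

nonce = os.urandom(16)
tag = sign_command(b"stimulate:off", nonce)
ok = verify_command(b"stimulate:off", nonce, tag)        # genuine command
forged = verify_command(b"stimulate:max", nonce, tag)    # attacker's command
```

This addresses only message forgery; a full defense would also need encryption, key management, and protection of the device firmware itself.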

The possibilities for damage, destruction, and chaos are very real. They could include manipulating a soldier’s BCI during conflict so that he or she is forced to fire on friendly forces, install malicious code in his or her own secure computer system, call in inaccurate coordinates for an air strike, or divulge state secrets to the enemy, seemingly voluntarily. Whether an insider has fallen victim to BCI hacking and exploits a system from within, or an outsider is compelled to launch a physical attack on hard and soft targets, the results would present major complications in attribution, in the effectiveness of kinetic operations, and in the stability of geopolitical relations.

Like every other domain of warfare, the mind as the sixth domain is neither isolated nor removed from other domains; coordinated attacks across all domains will continue to be the norm. It’s just that military and defense thinkers now need to account for the subtleties of the human mind … and our increasing reliance upon the brain-computer interface.

Regardless of how it will look, though, the threat is real and not as far away as we would like – especially now that researchers just discovered a zero-day vulnerability in the brain.

Filed under brain brain-computer interface bionic limbs robotics neuroscience science

114 notes

Want Your Baby to Learn? Research Shows Sitting Up Helps
From the Mozart effect to educational videos, many parents want to aid their infants in learning. New research out of North Dakota State University, Fargo, and Texas A&M shows that something as simple as the body position of babies while they learn plays a critical role in their cognitive development.
The study shows that sitting up, either on their own or with assistance, plays a significant role in how infants learn. The research, titled “Posture Support Improves Object Individuation in Infants,” co-authored by Rebecca J. Woods, assistant professor of human development and family science and doctoral psychology lecturer at North Dakota State University, and by psychology professor Teresa Wilcox of Texas A&M, is published in the journal Developmental Psychology®.
The study’s results show that babies’ ability to sit up unsupported has a profound effect on their ability to learn about objects. The research also shows that when babies who cannot sit up alone are given posture support from infant seats that help them sit up, they learn as well as babies who can already sit alone.
“An important part of human cognitive development is the ability to understand whether an object in view is the same or different from an object seen earlier,” said Dr. Woods. Through two experiments, she confirmed that 5-and-a-half- and 6-and-a-half-month-olds don’t use patterns to differentiate objects on their own. However, 6-and-a-half-month-olds can be primed to use patterns, if they have the opportunity to look at, touch and mouth the objects before being tested.
“An advantage the 6-and-a-half-month-olds may have is the ability to sit unsupported, which makes it easier for babies to reach for, grasp and manipulate objects. If babies don’t have to focus on balancing, their attention can be on exploring the object,” said Woods.
In a third experiment, 5-and-a-half-month-olds were given full postural support while they explored objects. When they had posture support, they were able to use patterns to differentiate objects. The research study also suggests that delayed sitting may cause babies to miss learning experiences that affect other areas of development.
“Helping a baby sit up in a secure, well-supported manner during learning sessions may help them in a wide variety of learning situations, not just during object-feature learning,” Woods said. “This knowledge can be advantageous, particularly to infants who have cognitive delays who truly need an optimal learning environment.”

Filed under cognitive development babies learning object individuation psychology neuroscience science posture support

26 notes

Faulty gene linked to condition in infants

Researchers at King’s College London have for the first time identified a defective gene at the root of Vici syndrome, a rare inherited disorder which affects infants from birth, leading to impaired development of the brain, eyes and skin, and progressive failure of the heart, skeletal muscles and the immune system.

Published in the journal Nature Genetics, the study identified a defect in the EPG-5 gene, indicating a genetic cause of the condition which was previously unknown. Researchers at King’s and Guy’s & St Thomas’ NHS Foundation Trust, part of King’s Health Partners, analysed the DNA of 18 infants with Vici syndrome and identified the inactivity of EPG-5 as a major cause of the condition.

Infants born with Vici syndrome inherit two copies of the defective gene, one from each parent. Although only around 50 cases of the disorder have been reported worldwide, researchers believe its true incidence is unknown, and likely higher, because of a lack of awareness of the condition. Dr Heinz Jungbluth, from the Children’s Neuroscience Centre at St Thomas’ Hospital, who led the study along with Professor Mathias Gautel from the Cardiovascular Division at King’s, said: ‘Vici syndrome is likely to be under-diagnosed as there is potential for misdiagnosis, particularly when you consider the many different organ systems affected by Vici and the significant overlap with other, more common disorders.’
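Because an affected child must inherit a defective copy from each parent, Vici syndrome follows an autosomal recessive pattern: two unaffected carrier parents face a one-in-four chance per pregnancy of an affected child. A minimal sketch of that arithmetic (allele labels 'E' and 'e' are illustrative, not official nomenclature):

```python
from itertools import product
from collections import Counter

# Each carrier parent has one working allele ('E') and one defective
# allele ('e'); a child is affected only with two defective copies ('ee').
mother = ("E", "e")
father = ("E", "e")

# Enumerate the four equally likely allele combinations (a Punnett square)
offspring = Counter("".join(sorted(a + b)) for a, b in product(mother, father))
probs = {genotype: count / 4 for genotype, count in offspring.items()}
# probs -> {'EE': 0.25, 'Ee': 0.5, 'ee': 0.25}
```

So on average a quarter of such couples’ children would be affected and half would be unaffected carriers, which is the risk profile that the preimplantation screening discussed below is designed to manage.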

The study also highlighted the ‘autophagy’ process and the role of EPG-5 in causing this mechanism to fail. Autophagy is a highly regulated cellular process that removes damaged or unwanted components, which is crucial for the health of all cell types, including those involved in muscles, the immune system and brain development. Abnormalities in this process have been implicated previously in neurodegenerative conditions, but defects causing disorders of normal development such as Vici syndrome have rarely been reported. The researchers suggest that autophagy could play a key role in causing a range of disorders, offering the potential for treatment of other conditions. Dr Jungbluth said: ‘Although the condition is very rare, it is likely that insights provided by research into Vici syndrome will also be transferable to the diagnosis and therapy of neurodegenerative and neurodevelopmental disorders, and a wider range of primary muscle conditions.’

Professor Gautel added: ‘Having identified where this genetic defect occurs we are now able to explore potential interventions. For instance, there is the possibility of enhancing other pathways unaffected by the EPG-5 gene, or of preventing use of the defective pathway in the first place.’

As the defective gene is inherited from both the mother and father, there is also the possibility of screening families with a known history of Vici syndrome. Professor Gautel said: ‘Mothers could be offered preimplantation diagnosis, which involves removing a cell from an embryo when it is around three days old and testing it for genetic disorders, so that an unaffected embryo can be implanted into the mother’s womb, if necessary.’

(Source: kcl.ac.uk)

Filed under infants vici syndrome EPG-5 gene genetics defective gene immune system neuroscience science
