Neuroscience

Articles and news from the latest research reports.



Eye movements reveal impaired reading in schizophrenia

A study of eye movements in schizophrenia patients provides new evidence of impaired reading fluency in individuals with the mental illness.

The findings, by researchers at McGill University in Montreal, could open avenues to earlier detection and intervention for people with the illness.
While schizophrenia patients are known to have abnormalities in language and in eye movements, until recently reading ability was believed to be unaffected. That is because most previous studies examined reading in schizophrenia using single-word reading tests, the McGill researchers conclude. Such tests aren’t sensitive to problems in reading fluency, which is affected by the context in which words appear and by eye movements that shift attention from one word to the next.
The McGill study, led by Ph.D. candidate Veronica Whitford and psychology professors Debra Titone and Gillian A. O’Driscoll, monitored how people move their eyes as they read simple sentences. The results, which were first published online last year, appear in the February issue of the Journal of Experimental Psychology: General.
Eye movement measures provide clear and objective indicators of how hard people are working as they read. For example, when struggling with a difficult sentence, people generally make smaller eye movements, spend more time looking at each word, and spend more time re-reading words. They also have more difficulty attending to upcoming words, so they plan their eye movements less efficiently.
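The measures described here can be made concrete with a small sketch. This is not the McGill team's analysis code; the fixation data and the helper function are invented for illustration. Each fixation is recorded as a (word index, duration in ms) pair, and a leftward jump between fixations counts as re-reading:

```python
# Illustrative only: compute simple reading-fluency measures of the kind
# described above from a sequence of (word_index, duration_ms) fixations.

def reading_metrics(fixations):
    durations = [d for _, d in fixations]
    jumps = [fixations[i + 1][0] - fixations[i][0]
             for i in range(len(fixations) - 1)]
    regressions = sum(1 for j in jumps if j < 0)  # leftward jump = re-reading
    return {
        "mean_fixation_ms": sum(durations) / len(durations),
        "mean_saccade_words": sum(abs(j) for j in jumps) / len(jumps),
        "regression_rate": regressions / len(jumps),
    }

# A fluent reader: large forward jumps, short fixations, no regressions.
fluent = reading_metrics([(0, 200), (2, 210), (4, 190), (6, 205)])
# A struggling reader: small jumps, long fixations, frequent re-reading.
struggling = reading_metrics([(0, 320), (1, 350), (0, 300), (1, 330), (2, 340)])
print(fluent)
print(struggling)
```

On these invented data, the struggling reader shows longer fixations, smaller saccades, and a higher regression rate, exactly the pattern the study reports for schizophrenia patients.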
The McGill study, which involved 20 schizophrenia outpatients and 16 non-psychiatric participants, showed that reading patterns in people with schizophrenia differed in several important ways from healthy participants matched for gender, age, and family social status. People with schizophrenia read more slowly, generated smaller eye movements, spent more time processing individual words, and spent more time re-reading. In addition, people with schizophrenia were less efficient at processing upcoming words to facilitate reading.
The researchers evaluated factors that could contribute to the problems in reading fluency among the schizophrenia outpatients – specifically, their ability to parse words into sound components and their ability to skillfully control eye movements in non-reading contexts. Both factors were found to contribute to the reading deficits.

Filed under: eye movements, visual attention, schizophrenia, neuroscience, medicine, science


Queen’s University study aims to use stem cells to help save sight of diabetes sufferers
Scientists at Queen’s University Belfast are hoping to develop a novel approach that could save the sight of millions of diabetes sufferers using adult stem cells.
Currently millions of diabetics worldwide are at risk of sight loss due to a condition called diabetic retinopathy, in which high blood sugar causes the blood vessels in the eye to become blocked or to leak. The disrupted blood flow harms the retina, leading to vision impairment and, if left untreated, blindness.
The novel REDDSTAR study (Repair of Diabetic Damage by Stromal Cell Administration), involving researchers from Queen’s Centre for Vision and Vascular Science in the School of Medicine, Dentistry and Biomedical Sciences, will see them isolating stem cells from donors, expanding them in a laboratory setting and re-delivering them to a patient, where they help to repair the blood vessels in the eye. This is especially relevant to patients with diabetes, where the vessels of the retina become damaged.
At present there are very few treatments available to control the progression of diabetic complications, and none that improve glucose levels while simultaneously treating the complications themselves.
The €6 million EU-funded research is being carried out with NUI Galway and brings together experts from Northern Ireland, Ireland, Germany, the Netherlands, Denmark, Portugal and the US.
Professor Alan Stitt, Director of the Centre for Vision and Vascular Science in Queen’s and lead scientist for the project said: “The Queen’s component of the REDDSTAR study involves investigating the potential of a unique stem cell population to promote repair of damaged blood vessels in the retina during diabetes. The impact could be profound for patients, because regeneration of damaged retina could prevent progression of diabetic retinopathy and reduce the risk of vision loss.
“Currently available treatments for diabetic retinopathy are not always satisfactory. They focus on end-stages of the disease, carry many side effects and fail to address the root causes of the condition. A novel, alternative therapeutic approach is to harness adult stem cells to promote regeneration of the damaged retinal blood vessels and thereby prevent and/or reverse retinopathy.”
“This new research project is one of several regenerative medicine approaches ongoing in the centre. The approach is quite simple: we plan to isolate a very defined population of stem cells and then deliver them to sites in the body that have been damaged by diabetes. In the case of some patients with diabetes, they may gain enormous benefit from stem cell-mediated repair of damaged blood vessels in their retina. This is the first step towards an exciting new therapy in an area where it is desperately needed.”
The research focuses on specific adult stem cells derived from bone marrow, which are being provided by Orbsen Therapeutics, a spin-out from the Science Foundation Ireland-funded Regenerative Medicine Institute (REMEDI) at NUI Galway.
The project will develop ways to grow the bone-marrow-derived stem cells. They will be tested in several preclinical models of diabetic complications at centres in Belfast, Galway, Munich, Berlin and Porto before human trials take place in Denmark.
Further information on the Centre for Vision and Vascular Science at Queen’s is available online at http://www.qub.ac.uk/research-centres/CentreforVisionandVascularScience/

Filed under: stem cells, diabetes, vision, sight loss, diabetic retinopathy, REDDSTAR study, medicine, science


How Neuroscience Will Fight Five Age-Old Afflictions
SEIZURES
A device delivers targeted drugs to calm overactive neurons
For years, large clinical trials have treated people with epilepsy using so-called deep-brain stimulation: surgically implanted electrodes that can detect a seizure and stop it with an electrical jolt. The technology leads to a 69 percent reduction in seizures after five years, according to the latest results.
Tracy Cui, a biomedical engineer at the University of Pittsburgh, hopes to improve upon that statistic. Her group has designed an electrode that would deliver both an electrical pulse and antiseizure medication. “We know where we want to apply the drug,” Cui says, “so you would not need a lot of it.”
To build the device, Cui’s team immersed a metal electrode in a solution containing two key ingredients: a molecule called a monomer and the drug CNQX. Zapping the solution with electricity causes the monomers to link together and form a long chain called a polymer. Because the polymer is positively charged, it attracts the negatively charged CNQX, leaving the engineers with their target product: an electrode coated in a film that’s infused with the drug.
The researchers then placed the electrodes in a petri dish with rat neurons. Another zap of electricity disrupted the electrostatic attraction in the film, causing the polymer to release its pharmacological payload—and nearby cells to quiet their erratic firing patterns. Cui says her team has successfully repeated the experiment in living rats. Next, she’d like to test the electrodes in epileptic rats and then begin the long process of regulatory approval for human use.
The blood-brain barrier protects the brain from everything but the smallest molecules, rendering most drugs ineffective. That makes this drug-delivery mechanism attractive for other brain disorders as well, Cui says. The electrodes can be loaded with any kind of small drug—like dopamine or painkillers—making it useful for treating Parkinson’s disease, chronic pain, or even drug addiction.
DEMENTIA
Electrode arrays stimulate mental processing
Dementia is one of the most well-known and frustrating brain afflictions. It damages many of the fundamental cognitive functions that make us human: working memory, decision-making, language, and logical reasoning. Alzheimer’s, Huntington’s, and Parkinson’s diseases all lead to dementia, and it’s also sometimes associated with multiple sclerosis, AIDS, and the normal process of aging.
Theodore Berger, a biomedical engineer at the University of Southern California, hopes to help people stave off the symptoms of dementia with a device implanted in the brain’s prefrontal cortex, a region crucial for sophisticated cognition. He and colleagues at Wake Forest Baptist Medical Center tested the device in a study involving five monkeys and a memory game.
First the team implanted an electrode array so that it could record from layers 2/3 and 5 of the prefrontal cortex and stimulate layer 5. The neural signals that jet back and forth between these areas relate to attention and decision-making. The team then trained the monkeys to play a computer game in which they saw a cartoon picture—such as a truck, lion, or paint palette—and had to select the same image from a panel of pictures 90 seconds later.
The scientists initially analyzed the electrical signals sent between the two cortical layers when the monkeys made a correct match. In later experiments, the team caused the array to emit the same signal just before the monkey made its decision. The animals’ accuracy improved by about 10 percent. That effect may be even more profound in an impaired brain. When the monkeys played the same game after receiving a hit of cocaine, their performance dropped by about 20 percent. But electrical stimulation restored their accuracy to normal levels.
Dementia involves far more complicated circuitry than these two layers of the brain. But once scientists better understand exactly how dementia works, it may be possible to combine several implants, each targeting a specific region.
BLINDNESS
Gene therapy converts cells into photoreceptors, restoring eyesight
Millions of people lose their eyesight when disease damages the photoreceptor cells in their retinas. These cells, called rods and cones, play a pivotal role in vision: They convert incoming light into electrical impulses that the brain interprets as an image.
In recent years, a handful of companies have developed electrode-array implants that bypass the damaged cells. A microprocessor translates information from a video camera into electric pulses that stimulate the retina; as a result, blind subjects in clinical trials have been able to distinguish objects and even read very large type. But the implanted arrays have one big drawback: They stimulate only a small number of retinal cells—about 60 out of 100,000—which ultimately limits a person’s visual resolution.
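The resolution limit quoted above follows from simple arithmetic, which a two-line sketch makes explicit (the 60 and 100,000 figures come from the text; nothing else is assumed):

```python
# Back-of-the-envelope check: an implant driving ~60 electrodes against
# ~100,000 retinal cells stimulates only a tiny fraction of the retina,
# which is why the resolution of these arrays stays low.
stimulated_cells = 60
total_cells = 100_000
coverage = stimulated_cells / total_cells
print(f"{coverage:.4%} of retinal cells stimulated")  # 0.0600% of retinal cells stimulated
```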
A gene therapy being developed by Michigan-based RetroSense could replace thousands of damaged retinal cells. The company’s technology targets the layer of the retina containing ganglion cells. Normally, ganglion cells transmit the electric signal from the rods and cones to the brain. But RetroSense inserts a gene that makes the ganglion cells sensitive to light; they take over the job of the photoreceptors. So far, scientists have successfully tested the technology on rodents and monkeys. In rat studies, the gene therapy allowed the animals to see well enough to detect the edge of a platform as they neared it.
The company plans to launch the first clinical trial of the technology next year, with nine subjects blinded by a disease called retinitis pigmentosa. Unlike the surgeries to implant electrode arrays, the procedure to inject gene therapy will take just minutes and requires only local anesthesia. “The visual signal that comes from the ganglion cells may not be encoded in exactly the fashion that they’re used to,” says Peter Francis, chief medical officer of RetroSense. “But what is likely to happen is that their brain is going to adapt.”
PARALYSIS
A brain-machine interface controls limbs while sensing what they touch
Last year, clinical trials involving brain implants gave great hope to people with severe spinal cord injuries. Two paralyzed subjects imagined picking up a cup of coffee. Electrode arrays decoded those neural instructions in real time and sent them to a robotic arm, which brought the coffee to their lips.
But to move limbs with any real precision, the brain also requires tactile feedback. Miguel Nicolelis, a biomedical engineer at Duke University, has now demonstrated that brain-machine interfaces can simultaneously control motion and relay a sense of touch—at least in virtual reality.
For the experiment, Nicolelis’s team inserted electrodes in two brain areas in monkeys: the motor cortex, which controls movement, and the nearby somatosensory cortex, which interprets touch signals from the outside world. Then the monkeys played a computer game in which they controlled a virtual arm—first by using a joystick and eventually by simply imagining the movement. The arm could touch three identical-looking gray circles. But each circle had a different virtual “texture” that sent a distinct electrical pattern to the monkeys’ somatosensory cortex. The monkeys learned to select the texture that produced a treat, proving that the implant was both sending and receiving neural messages.
This year, a study in Brazil will test the ability of 10 to 20 patients with spinal cord injuries to control an exoskeleton using the implant. Nicolelis, an ardent fan of Brazilian soccer, has set a strict timetable for his team: A nonprofit consortium he created, the Walk Again Project, plans to outfit a paraplegic man with a robotic exoskeleton and take him to the 2014 World Cup in São Paulo, where he will deliver the opening kick.
DEAFNESS
Stem cells repair a damaged auditory nerve, improving hearing
Over the past 25 years, more than 30,000 people with hearing loss have received an electronic implant that replaces the cochlea, the snail-shaped organ in the inner ear whose cells transform sound waves into electrical signals. The device acts as a microphone, picking up sounds from the environment and transmitting them to the auditory nerve, which carries them on to the brain.
But a cochlear implant won’t help the 10 percent of people whose profound hearing loss is caused by damage to the auditory nerve. Fortunately for this group, a team of British scientists has found a way to restore that nerve using stem cells.
The researchers exposed human embryonic stem cells to growth factors, substances that cause them to differentiate into the precursors of auditory neurons. Then they injected some 50,000 of these cells into the cochleas of gerbils whose auditory nerves had been damaged. (Gerbils are often used as models of deafness because their range of hearing is similar to that of people.) Three months after the transplant, about one third of the original number of auditory neurons had been restored; some appeared to form projections that connected to the brain stem. The animals’ hearing improved, on average, by 46 percent.
It will be years before the technique is tested in humans. Once it is, researchers say, it has the potential to help not only those with nerve damage but also people with more widespread impairment whose auditory nerve must be repaired in order to receive a cochlear implant.

Filed under: seizures, dementia, blindness, paralysis, deafness, neuroscience, medicine, science


Sticky Cells: Cyclic Mechanical Reinforcement Extends Longevity of Bonds Between Cells
Research carried out by scientists at the Georgia Institute of Technology and The University of Manchester has revealed new insights into how cells stick to each other and to other bodily structures, an essential function in the formation of tissue structures and organs. It’s thought that abnormalities in their ability to do so play an important role in a broad range of disorders, including cardiovascular disease and cancer.
The study’s findings are outlined in the journal Molecular Cell and describe a surprising new aspect of cell adhesion involving the family of cell adhesion molecules known as integrins, which are found on the surfaces of most cells. The research uncovered a phenomenon termed “cyclic mechanical reinforcement,” in which the length of time during which bonds exist is extended with repeated pulling and release between the integrins and ligands that are part of the extracellular matrix to which the cells attach.
Professor Martin Humphries, dean of the faculty of life sciences at the University of Manchester and one of the paper’s co-authors, says the study suggests some new capabilities for cells: “This paper identifies a new kind of bond that is strengthened by cyclical applications of force, and which appears to be mediated by complex shape changes in integrin receptors. The findings also shed light on a possible mechanism used by cells to sense extracellular topography and to aggregate information through ‘remembering’ multiple interaction events.”
The cyclic mechanical reinforcement allows force to prolong the lifetimes of bonds, demonstrating a mechanical regulation of receptor-ligand interactions and identifying a molecular mechanism for strengthening cell adhesion through cyclical forces.
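One way to picture “cyclic mechanical reinforcement” is as a bond whose mean lifetime grows with each pull-and-release cycle rather than wearing down. The sketch below is purely illustrative: the saturating growth rule and its parameters are invented for this post and are not taken from the paper.

```python
import math

# Toy model: each pull/release cycle lengthens the bond's mean lifetime,
# saturating at some maximum. This rule is invented, only to illustrate the
# idea that repeated force application can strengthen, not weaken, adhesion.
def bond_lifetime(cycles, base=1.0, max_gain=5.0, k=0.5):
    """Mean bond lifetime (arbitrary units) after `cycles` pull/release cycles."""
    return base + max_gain * (1 - math.exp(-k * cycles))

for n in (0, 1, 3, 10):
    print(n, round(bond_lifetime(n), 2))
```

In this toy picture an unloaded bond keeps its baseline lifetime, while cyclic loading drives the lifetime up toward a plateau, which is the qualitative behavior the study describes.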
“Many cell functions such as differentiation, growth and the expression of particular genes depend on cell interaction with the ligands of the extracellular matrix,” said Cheng Zhu, a professor in the Coulter Department of Biomedical Engineering at Georgia Tech and Emory University and the study’s corresponding author. “The cells respond to their environment, which includes many mechanical aspects. This study has extended our understanding of how connections are made and how mechanical forces regulate interactions.”
The research was published online by the journal on February 14th. The work was supported by the National Institutes of Health (NIH) and the Wellcome Trust.

Filed under cells cell interaction integrins cyclic mechanical reinforcement medicine science

21 notes

Research update: Imaging fish in 3-D
Zebrafish larvae — tiny, transparent and fast-growing vertebrates — are widely used to study development and disease. However, visually examining the larvae for variations caused by drugs or genetic mutations is an imprecise, painstaking and time-consuming process.
Engineers at MIT have now built an automated system that can rapidly produce 3-D, micron-resolution images of thousands of zebrafish larvae and precisely analyze their physical traits. The system, described in the Feb. 12 edition of Nature Communications, offers a comprehensive view of how potential drugs affect vertebrates, says Mehmet Fatih Yanik, senior author of the paper.
“Complex processes involving organs cannot be accurately recapitulated in cell culture today. Existing 3-D tissue models are still far too simple to model live animals,” says Yanik, an MIT associate professor of electrical engineering and computer science and biological engineering. “In whole animals, the biology is far more complicated.”
Lead authors of the paper are MIT graduate student Carlos Pardo-Martin and Amin Allalou, a visiting student at MIT. Other authors are MIT senior research scientist Peter Eimon, MIT intern Jaime Medina, and Carolina Wahlby of the Broad Institute.
Zebrafish are genetically similar to humans and have many of the same developmental pathways, so scientists often use them to model human diseases including cancer, diabetes, Parkinson’s disease and autism.
Using the new technology, researchers can grow larvae in tiny wells and flow them through a channel to an imaging platform. Once there, the embryos are rotated and 320 images are taken from different angles, allowing 3-D reconstructions to be made using optical projection tomography (OPT). Getting larvae to the platform takes about 15 seconds, and the imaging takes only 2.5 seconds. This allows hundreds or thousands of larvae to be imaged within hours.
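As a rough back-of-the-envelope check of those throughput figures (the per-larva timings come from the article; the run lengths are illustrative):

```python
# Rough throughput estimate for the zebrafish imaging pipeline described above.
# Per-larva cycle: ~15 s to deliver a larva to the platform + ~2.5 s to image it.
transport_s = 15.0
imaging_s = 2.5
cycle_s = transport_s + imaging_s  # ~17.5 s per larva

larvae_per_hour = 3600 / cycle_s
print(f"~{larvae_per_hour:.0f} larvae per hour")
print(f"~{8 * larvae_per_hour:.0f} larvae in an 8-hour run")
```

At roughly 200 larvae per hour, an overnight run does indeed put thousands of imaged animals within reach, consistent with the claim above.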
In a 2010 paper, Yanik’s team described the system that transports the embryos to the imaging platform, which they combined with high-resolution two-dimensional imaging. In the latest version, they developed a high-speed OPT imaging technique, which takes hundreds of two-dimensional images and subsequently generates a 3-D image, similar to a CT scan.
They also created a computer algorithm that can measure hundreds of traits and use that information to create a comprehensive phenotype map — the overall description of an organism’s characteristics — for each larva. This enables rapid and detailed studies of how different drugs affect those phenotypes.
“You could probably look at almost any organ or tissue that you’re interested in,” Eimon says. “It gives researchers a way to rapidly measure and quantify and put numbers on the kinds of phenotypes and gene-expression patterns that they’ve been looking at for years and years.”
In this study, the researchers focused on the craniofacial skeleton, which is analogous to the human skull. They measured the length and volume of each of the bones that make up this structure, as well as the angles between the bones.
Each embryo was imaged five days after being treated with one of nine different teratogens — drugs that cause developmental abnormalities. The researchers compared their results with the drugs’ known effects and found that they were very consistent. They also obtained high-resolution, 3-D images of the craniofacial skeletons, which are less than a millimeter long.
“Now that we’re able to load the animals, and we can image them really quickly, and we have a way to start looking at the information, the sky’s the limit,” Pardo-Martin says. “What we have to do now is ask the big questions, because the technology has advanced.”
This kind of analysis could be very valuable for drug developers who need to efficiently screen thousands of drug candidates. It could also be used to study hard-to-detect changes in phenotype caused by genetic mutations, says Joseph Fetcho, a professor of neurobiology and behavior at Cornell University.
“A really high-throughput way to assess phenotype is very important for measuring small effects on the development of an organism,” says Fetcho, who was not part of the research team. “You can see what the phenotype looks like in a large population and quantify it in a very rigorous way.”

Filed under zebrafish vertebrates genetic mutations genetics cell cultures medicine science

34 notes

In Some Dystonia Cases, Deep Brain Therapy Benefits May Linger After Device Turned Off
Two patients freed from severe to disabling effects of dystonia through deep brain stimulation therapy continued to have symptom relief for months after their devices accidentally were fully or partly turned off, according to a report published online Feb. 11 in the journal Movement Disorders.
“Current thought is that symptoms will worsen within hours or days of device shut-off, but these two young men continued to have clinical benefit despite interruption of DBS therapy for several months. To our knowledge, these two cases represent the longest duration of retained benefit in primary generalized dystonia. Moreover, when these patients’ symptoms did return, severity was far milder than it was before DBS,” said senior author Michele Tagliati, MD, director of the Movement Disorders Program at Cedars-Sinai’s Department of Neurology.
Dystonia causes muscles to contract, with the affected body part twisting involuntarily and symptoms ranging from mild to crippling. If drugs – which often have undesirable side effects, especially at higher doses – fail to give relief, neurosurgeons and neurologists may work together to supplement medications with deep brain stimulation, aimed at modulating abnormal nerve signals. Electrical leads are implanted in the brain – one on each side – and an electrical pulse generator is placed near the collarbone. The device is then programmed with a remote, hand-held controller. Tagliati is an expert in device programming, which fine-tunes stimulation for individual patients.
Few studies have looked at the consequences of interrupted DBS therapy, although one found “fairly rapid worsening of dystonia in 14 patients after interruption of stimulation for 48 hours, with symptom severity at times becoming worse than the pre-operative baseline.” In another study of 10 patients with generalized dystonia, however, symptoms did not return in four patients when stimulation was discontinued for 48 hours.
Findings from the 10-patient study correlate well with these two cases, Tagliati said.
“It appears that several factors – age, duration of disease, length of time the patient has received DBS treatment and stimulation parameters – determine which patients may retain symptom relief after prolonged DBS interruption. Our two patients were young, 20 years old. They both began DBS therapy a relatively short time after disease onset; one at four years and the other at seven years. One had received continuous stimulation for five years and the other for 18 months before stimulation was interrupted,” Tagliati said.
“We can’t say for certain why these factors make the difference,” he added, “But we theorize that a younger brain with shorter exposure to the negative effects of dystonia may be more responsive to therapy and have greater ‘plasticity’ to adapt back to normal. Both of our patients received DBS therapy at a lower energy than most patients experience, suggesting the possibility that low-frequency stimulation over an extended time may help retrain the brain’s low-frequency electrical activity.”
Both instances of device shut-off were accidental and were discovered during doctor visits after mild symptoms returned. The patient who had undergone five years of DBS therapy had only one stimulator turned off for about three months; the one stimulating the left side of his brain remained active. In the other patient, the left device had been off for about seven months and the right one for two months, Tagliati said.
Tagliati was senior author of a 2011 Journal of Neurology article on a study showing that for patients suffering from dystonia, deep brain therapy tends to get better, quicker results when started earlier rather than later.
“We knew from earlier work that younger patients with shorter disease duration had better clinical outcomes in the short term. In our 2011 article, we reported that they fare best in the long term, as well. That study uniquely showed that age and disease duration play complementary roles in predicting long-term clinical outcomes. The good news for older patients is that while they may not see the rapid gains of younger patients, their symptoms may gradually improve over several years,” Tagliati said.

Filed under deep brain stimulation dystonia nerve signals neuroscience medicine science

74 notes

‘Robot’ cells answer call to arms
By thinking of cells as programmable robots, researchers at Rice University hope to someday direct how they grow into the tiny blood vessels that feed the brain and help people regain functions lost to stroke and disease.
Rice bioengineer Amina Qutub and her colleagues simulate patterns of microvasculature cell growth and compare the results with real networks grown in their lab. Eventually, they want to develop the ability to control the way these networks develop.
The results of the long-running study are the focus of a new paper in the Journal of Theoretical Biology.
“We want to be able to design particular capillary structures,” said Qutub, an assistant professor of bioengineering based at Rice’s BioScience Research Collaborative. “In our computer model, the cells are miniature adaptive robots that respond to each other, respond to their environment and pattern into unique structures that parallel what we see in the lab.”
When brain cells are deprived of oxygen – a condition called hypoxia that can lead to strokes – they pump out growth factor proteins that signal endothelial cells. Those cells, which line the interior of blood vessels, are prompted to branch off as capillaries in a process called angiogenesis to bring oxygen to starved neurons.
How these new vessels form networks and the shapes they take are of great interest to bioengineers who want to improve blood flow to parts of the brain by regenerating the microvasculature.
“The problem, especially as we age, is that we become less able to grow these blood vessels,” Qutub said. “At the same time, we’re at higher risk for strokes and neurodegenerative diseases. If we can understand how to guide the vessel structures and help them self-repair, we are a step closer to aiding treatment.”
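The “cells as miniature adaptive robots” framing above is essentially an agent-based model. Here is a minimal, hypothetical sketch of that modeling style; the grid, gradient, and update rule are invented for illustration and are not Qutub’s actual model. An endothelial-cell agent simply steps toward the neighboring site with the higher growth-factor concentration, mimicking how capillary tips grow toward hypoxic tissue.

```python
# Minimal agent-based sketch: an endothelial "robot" cell on a 1-D gradient of
# growth factor (higher values = closer to hypoxic, oxygen-starved tissue).
# The cell moves toward the neighboring site with more growth factor -- an
# invented rule, only to illustrate the modeling style described above.
gradient = [0.0, 0.1, 0.3, 0.6, 1.0]  # growth-factor concentration per site

def step(position):
    """Move one site toward higher growth factor, staying on the grid."""
    neighbors = [p for p in (position - 1, position + 1) if 0 <= p < len(gradient)]
    return max(neighbors, key=lambda p: gradient[p])

cell = 0
for _ in range(4):
    cell = step(cell)
print(cell)  # the cell climbs the gradient to the hypoxic end (site 4)
```

Real models of this kind use many interacting agents in 2-D or 3-D, with cells also responding to one another, which is how branching capillary patterns emerge.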

Filed under brain cells blood vessels hypoxia neurodegenerative diseases stroke medicine science

69 notes

Old drug may point the way to new treatments for diabetes and obesity
Researchers at the University of Michigan’s Life Sciences Institute have found that amlexanox, an off-patent drug currently prescribed for the treatment of asthma and other uses, also reverses obesity, diabetes and fatty liver in mice.
The findings from the lab of Alan Saltiel, the Mary Sue Coleman director of the Life Sciences Institute, were published online Feb. 10 in the journal Nature Medicine.
"One of the reasons that diets are so ineffective in producing weight loss for some people is that their bodies adjust to the reduced calories by also reducing their metabolism, so that they are ‘defending’ their body weight," Saltiel said. "Amlexanox seems to tweak the metabolic response to excessive calorie storage in mice."
Different formulations of amlexanox are currently prescribed to treat asthma in Japan and canker sores in the United States. Saltiel is teaming up with clinical-trial specialists at U-M to test whether amlexanox will be useful for treating obesity and diabetes in humans. He is also working with medicinal chemists at U-M to develop a new compound based on the drug that optimizes its formula.
The study appears to confirm and extend the notion that the genes IKKE and TBK1 play a crucial role in maintaining metabolic balance, a discovery published by the Saltiel lab in 2009 in the journal Cell.
"Amlexanox appears to work in mice by inhibiting two genes—IKKE and TBK1—that we think together act as a sort of brake on metabolism," Saltiel said. "By releasing the brake, amlexanox seems to free the metabolic system to burn more, and possibly store less, energy."

Filed under obesity diabetes animal model metabolism calories medicine science

47 notes

Experimental gene therapy treatment for Duchenne muscular dystrophy offers hope for youngster
Jacob Rutt is a bright 11-year-old who likes to draw detailed maps in his spare time. But the budding geographer has a hard time with physical skills most children take for granted ― running and climbing trees are beyond him, and even walking can be difficult. He was diagnosed with a form of muscular dystrophy known as Duchenne when he was two years old.
The disease affects about 1 in 3,500 newborns ― mostly boys ― worldwide. It usually becomes apparent in early childhood, as weakened skeletal muscles cause delays in milestones such as sitting and walking. Children usually become wheelchair-dependent during their teens. As heart muscle is increasingly affected, the disease becomes life threatening and many patients die from heart failure in their 20s.
Today, Jacob is one of 51 children participating in a nationwide clinical trial of a new type of treatment that could offer help to those suffering from this devastating neuromuscular disease. Clinical researchers at UC Davis Medical Center and a handful of other research centers around the nation are testing a high-tech drug designed to fix the underlying genetic defect that causes the progressive muscular decline seen in children with Duchenne.
“This type of genetic therapy is the most exciting treatment approach I have witnessed in my career for Duchenne muscular dystrophy,” said Craig McDonald, professor and chair of the Department of Physical Medicine and Rehabilitation at UC Davis, as well as principal investigator of the national clinical trial that Jacob is participating in. “We are hopeful that it will delay many of the disease’s manifestations and ultimately improve life expectancy for patients.”

Filed under duchenne muscular dystrophy muscular dystrophy dystrophin oligonucleotide medicine science

86 notes

The Cost of War Includes at Least 253,330 Brain Injuries and 1,700 Amputations

Here are indications of the lingering costs of 11 years of warfare. Nearly 130,000 U.S. troops have been diagnosed with post-traumatic stress disorder, and vastly more have experienced brain injuries. Over 1,700 have undergone life-changing limb amputations. Over 50,000 have been wounded in action. As of Wednesday, 6,656 U.S. troops and Defense Department civilians have died.

That updated data (.pdf) comes from a new Congressional Research Service report into military casualty statistics that can sometimes be difficult to find — and even more difficult for American society to fully appreciate. It almost certainly understates the extent of the costs of war.

Start with post-traumatic stress disorder, or PTSD. Counting across the U.S. military services, 129,731 U.S. troops have been diagnosed with the disorder since 2001. The vast majority of those, nearly 104,000, have come from deployed personnel.

But that’s the tip of the PTSD iceberg, since not all — and perhaps not even most — PTSD cases are diagnosed. The former vice chief of staff of the Army, retired Gen. Peter Chiarelli, has proposed dropping the “D” from PTSD so as not to stigmatize those who suffer from it — and, perhaps, encourage more veterans to seek diagnosis and treatment for it. (Not all veterans advocates agree with Chiarelli.)

The congressional study also brings to light the extent of one of the signature injuries of the post-9/11 wars, Traumatic Brain Injury (TBI), often suffered by survivors of explosions from homemade insurgent bombs. From 2000 (a pre-9/11 year probably chosen for inclusion for control purposes) to the end of 2012, some 253,330 troops have experienced TBI in some form. About 77 percent of those cases are classified by the Defense Department as “mild,” meaning a “confused or disoriented state lasting less than 24 hours; loss of consciousness for up to thirty minutes; memory loss lasting less than 24 hours; and structural brain imaging that yields normal results.”

More-severe TBI is measured along those metrics, lasting longer than a day. Nearly 6,500 of those cases are “severe or penetrating TBI,” which include the effects of open head injuries, skull fractures, or projectiles lodged in the brain.

As with PTSD, the TBI diagnoses only scratch the surface. The military’s screening for TBI is notoriously bad: one former Army chief of staff described it as “basically a coin flip.” Worse, poor military medical technology, particularly in bandwidth-deprived areas like Iraq and Afghanistan, has made it uncertain that battlefield diagnoses of TBI actually transmit back to troops’ permanent medical files.

Amputations are a feature of any prolonged war. Almost 800 Iraq veterans have undergone “major limb” amputations, such as a leg, and another 194 have experienced partial foot, finger or other so-called “minor limb” losses. For Afghanistan veterans, those numbers are 696 and 28, respectively.
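The per-war figures above can be summed to cross-check the headline’s “at least 1,700 amputations” claim; a minimal sketch, noting that the “almost 800” Iraq major-limb figure is approximate, so the total is an upper-bound estimate:

```python
# Per-war amputation counts as reported in the article.
# iraq_major is approximate ("almost 800"); the others are exact figures.
iraq_major, iraq_minor = 800, 194
afghanistan_major, afghanistan_minor = 696, 28

total_amputations = iraq_major + iraq_minor + afghanistan_major + afghanistan_minor
print(total_amputations)  # 1718, consistent with "at least 1,700"
```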

The Iraq war is over for all but a handful of U.S. troops and thousands of contractors. The Afghanistan war is in the process of a troop drawdown through 2014 of unknown speed and will feature a residual troop presence of unknown size. Even if U.S. deaths and injuries in those wars are nearly over, the aftereffects on a huge number of veterans will not end.

Filed under PTSD TBI amputations statistics U.S. Military Casualty Statistics brain injury medicine science