Neuroscience

Articles and news from the latest research reports.

Posts tagged neuroscience

159 notes

Part of the brain stays “youthful” into older age

At least one part of the human brain may be able to process information the same way in older age as it does in the prime of life, according to new research conducted at the University of Adelaide.

A study compared the ability of 60 older and younger people to respond to visual and non-visual stimuli in order to measure their “spatial attention” skills.

Spatial attention is critical for many aspects of life, from driving, to walking, to picking up and using objects.

"Our studies have found that older and younger adults perform in a similar way on a range of visual and non-visual tasks that measure spatial attention," says Dr Joanna Brooks, who conducted the study as a Visiting Research Fellow with the University of Adelaide’s School of Psychology and the School of Medicine.

"Both younger (aged 18-38 years) and older (55-95 years) adults had the same responses for spatial attention tasks involving touch, sight or sound.

"In one task, participants were asked to feel wooden objects whilst blindfolded and decide where the middle of the object was - participants’ judgements were significantly biased towards the left-hand side of the true object centre. This bias is subtle but highly consistent," Dr Brooks says.

"When we think of ageing, we think not just of the physical aspects but also the cognitive side of it, especially when it comes to issues such as reaction time, which is typically slower among older adults. However, our research suggests that certain types of cognitive systems in the right cerebral hemisphere - like spatial attention - are ‘encapsulated’ and may be protected from ageing," she says.

Dr Brooks, who is now a Research Fellow in Healthy Ageing based at the Australian National University, recently presented her results at the 12th International Cognitive Neuroscience Conference in Brisbane. Her project is part of an international collaboration with scientists at the University of Edinburgh and Queen Margaret University in Scotland to better understand spatial attention in the human brain.

"Our results challenge current models of cognitive ageing because they show that the right side of the brain remains dominant for spatial processing throughout the entire adult lifespan," Dr Brooks says. "We now need to better understand how and why some areas of the brain seem to be more affected by ageing than others."

Dr Brooks’s research could also be helpful in better understanding how diseases such as Alzheimer’s affect the brain.

(Source: adelaide.edu.au)

Filed under spatial attention aging psychology neuroscience science

157 notes


(Image caption: Vertebral artery as it passes through the neck vertebrae of the spine and enters the skull base. Arrows indicate head movement during lateral rotation and lateral flexion, motions that may be performed as part of a neck manipulation. Credit: American Heart Association)

Neck manipulation may be associated with stroke

Treatments involving neck manipulation may be associated with stroke, though it cannot be said with certainty that neck manipulation causes strokes, according to a new scientific statement published in the American Heart Association’s journal Stroke.

Cervical artery dissection (CD) is a small tear in the layers of artery walls in the neck. It can result in ischemic stroke if a blood clot forms after a trivial or major trauma in the neck and later causes blockage of a blood vessel in the brain. Cervical artery dissection is an important cause of stroke in young and middle-aged adults.

“Most dissections involve some trauma, stretch or mechanical stress,” said José Biller, M.D., lead statement author and professor and chair of neurology at the Loyola University Chicago Stritch School of Medicine. “Sudden movements that can hyperextend or rotate the neck — such as whiplash, certain sports movements, or even violent coughing or vomiting — can result in CD, even if they are deemed inconsequential by the patient.”

Although techniques for cervical manipulative therapy vary, some maneuvers used as therapy by health practitioners also extend and rotate the neck and sometimes involve a forceful thrust.

There are four arteries that supply blood to the brain: the two carotid arteries on each side of the neck, and the two vertebral arteries on the back of the neck. The influence of neck manipulation seems more important in vertebral artery dissection than in internal carotid artery dissection.

“Although a cause-and-effect relationship between these therapies and CD has not been established and the risk is probably low, CD can result in serious neurological injury,” Biller said. “Patients should be informed of this association before undergoing neck manipulation.”

The association between cervical artery dissection and cervical manipulative therapies was identified in case control studies, which aren’t designed to prove cause and effect. An association means that there appears to be a relationship between two things, i.e., manipulative therapy of the neck and a greater incidence of cervical dissection/stroke. However, it’s not clear whether other factors could account for the apparent relationship.
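Case-control studies of this kind typically summarize the relationship as an odds ratio. The sketch below uses entirely hypothetical counts (the AHA statement reports no such table here) to show what an association measure looks like, and why it cannot by itself establish cause and effect:

```python
# Hypothetical 2x2 case-control table (NOT from the AHA statement):
# rows: CD cases vs. controls; columns: exposed to neck manipulation vs. not.
cases_exposed, cases_unexposed = 20, 80
controls_exposed, controls_unexposed = 10, 90

# Odds ratio: odds of exposure among cases divided by odds among controls.
odds_ratio = (cases_exposed / cases_unexposed) / (controls_exposed / controls_unexposed)
print(f"odds ratio: {odds_ratio:.2f}")
```

An odds ratio above 1 indicates an association, but confounding (for example, patients seeking manipulation because an existing dissection is already causing neck pain) can produce the same elevation without causation.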

The relationship between neck manipulation and cervical artery dissection is difficult to evaluate because patients who already are beginning to have a cervical artery dissection may seek treatment to relieve neck pain, a common symptom of cervical artery dissection that can precede symptoms of stroke by several days.

You should seek emergency medical evaluation if you develop neurological symptoms after neck manipulation or trauma, such as:

  • Pain in the back of your neck or in your head;
  • Dizziness/vertigo;
  • Double vision;
  • Unsteadiness when walking;
  • Slurred speech;
  • Nausea and vomiting;
  • Jerky eye movements.

“Tell the physician if you have recently had a neck trauma or neck manipulation,” Biller said. “Some symptoms, such as dizziness or vertigo, are very common and can be due to minor conditions rather than stroke, but giving the information about recent neck manipulation can raise a red flag that you may have a CD rather than a less serious problem, particularly in the presence of neck pain.”

Filed under neck manipulation stroke cervical artery dissection neuroscience science

78 notes

Notch Developmental Pathway Regulates Fear Memory Formation

Nature is thrifty. The same signals that embryonic cells use to decide whether to become nerves, skin or bone come into play again when adult animals are learning whether to become afraid.

Researchers at Yerkes National Primate Research Center, Emory University, have learned that the molecule Notch, critical in many processes during embryonic development, is also involved in fear memory formation. Understanding fear memory formation is critical to developing more effective treatments and preventions for anxiety disorders such as post-traumatic stress disorder (PTSD). The results are scheduled for publication online this week by the journal Neuron.

"We are finding that developmental pathways that appear to be quiescent during adulthood are transiently reactivated to allow new memory formation to occur," says Kerry Ressler, MD, PhD, professor of psychiatry and behavioral sciences at Emory University School of Medicine and Yerkes National Primate Research Center, and senior author of the paper.

The first author of the paper is postdoctoral fellow Brian Dias, PhD, and co-authors include undergraduates Jared Goodman, Ranbir Ahluwalia and Audrey Easton, and post-doctoral researcher Raul Andero, PhD.

The Notch signaling pathway, present in insects, worms and vertebrates, is involved in embryonic patterning as well as nervous system and cardiovascular development. It’s a way for cells to communicate and coordinate which cells are going to become what types of tissues.

Dias and Ressler probed the Notch pathway because they were examining many genes that are activated in the brains of mice after they learn to become afraid of a sound paired with a mild foot-shock. They were looking for changes in the amygdala, a region of the brain known to regulate fear learning.

The researchers were particularly interested in micro RNAs. MicroRNAs do not encode proteins but can inhibit other genes, often several at once in a coordinated way. Dias and Ressler found that levels of miRNA-34a are increased in the amygdala after fear learning occurs. A day after fear training, animals whose brains were injected with a virus engineered to carry a “sponge” against miRNA-34a froze less often than control animals.

The researchers found that miRNA-34a regulated several genes that encode components of the Notch pathway. They believe their study is the first to link miRNA-34a and Notch signaling to a role in memory consolidation.

Notch is under investigation as a target in the treatment of various cancers and some drugs that target Notch have been well-tolerated by humans.

"From a therapeutic perspective, our data suggest that relevant drugs that regulate Notch signaling could potentially be a starting point for preventing or treating PTSD," Dias says.

(Source: yerkes.emory.edu)

Filed under PTSD memory formation memory consolidation fear amygdala miRNA-34a neuroscience science

541 notes


New prosthetic arm controlled by neural messages

The design aims to identify the memory of movement in the amputee’s brain and translate it into commands for manipulating the device.

Controlling a prosthetic arm by just imagining a motion may be possible through the work of Mexican scientists at the Centre for Research and Advanced Studies (CINVESTAV), who are developing an arm replacement that identifies movement patterns from brain signals.

“First, it is necessary to know whether a memory pattern of how the arm moved remains in the amputee’s brain, so that it can be translated into instructions for the prosthesis,” says Roberto Muñoz Guerrero, researcher at the Department of Electrical Engineering and project leader at Cinvestav.

He explains that the electric signal won’t come from the muscles that form the stump, but from the movement patterns of the brain. “If this phase is successful, the patient would be able to move the prosthesis by imagining different movements.”

However, Muñoz Guerrero acknowledges this is not an easy task, because the brain registers a wide range of activities occurring throughout the body, and the movement pattern must be extracted from among them. “Therefore, the first step is to identify the patterns in the EEG and define the memory that can be electrically recorded. Then we need to evaluate how sensitive the signal is to other external stimuli, such as light or blinking.”
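One common first step toward the kind of EEG pattern extraction Muñoz Guerrero describes is comparing power in the sensorimotor mu band (roughly 8-12 Hz), which is suppressed during real or imagined movement. The sketch below is illustrative only and is not the CINVESTAV team's pipeline; it uses a synthetic single-channel signal:

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean spectral power of `signal` between lo and hi Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

fs = 256  # sampling rate in Hz
t = np.arange(fs * 2) / fs  # two seconds of samples

rng = np.random.default_rng(0)
# Synthetic "rest" signal: strong 10 Hz mu rhythm plus noise.
rest = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)
# Synthetic "imagined movement": mu rhythm suppressed.
imagery = 0.2 * np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)

mu_rest = band_power(rest, fs, 8, 12)
mu_imagery = band_power(imagery, fs, 8, 12)
print(mu_rest > mu_imagery)  # mu suppression distinguishes the two states
```

A practical system would add artifact rejection (for the sensitivity to light and blinking mentioned above), spatial filtering across many channels, and a trained classifier.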

It should be noted that the prosthesis could only be used by individuals who once had an entire arm that was later amputated because of an accident or illness. Such patients were once able to move the arm naturally, and have stored in memory the movement processes that the prosthesis would draw upon.

According to the researcher, the prosthesis must combine a mechanical and electronic system, the elements necessary to actuate it, and a module to interpret the brain signals. “The material it will be built from has not yet been fully defined, because the prosthesis must weigh between two and three kilograms, similar to the weight of the missing arm.”

The prosthesis belongs to an emerging area of bioelectronics known as the Brain Computer Interface (BCI): a direct communication pathway between the brain and an external device, intended to assist or restore sensory and motor function. “An additional benefit is the ability to create motion paths for the prosthesis, which is not possible with commercial products,” says Muñoz Guerrero.

Filed under BCI prosthetics prosthetic arm motor movement EEG neuroscience science

120 notes


NIH and Italian Scientists Develop Nasal Test for Human Prion Disease

A nasal brush test can rapidly and accurately diagnose Creutzfeldt-Jakob disease (CJD), an incurable and ultimately fatal neurodegenerative disorder, according to a study by National Institutes of Health scientists and their Italian colleagues.

Until now, a definitive CJD diagnosis has required testing brain tissue obtained after death or by biopsy in living patients. The study describing the less invasive nasal test appears in the Aug. 7 issue of the New England Journal of Medicine.

CJD is a prion disease. These diseases originate when, for reasons not fully understood, normally harmless prion protein molecules become abnormal and gather in clusters. Prion diseases affect animals and people. Human prion diseases include variant, familial and sporadic CJD. The most common form, sporadic CJD, affects an estimated 1 in one million people annually worldwide. Other prion diseases include scrapie in sheep; chronic wasting disease in deer, elk and moose; and bovine spongiform encephalopathy (BSE), or mad cow disease, in cattle. Scientists have associated the accumulation of these clusters with tissue damage that leaves sponge-like holes in the brain.

“This exciting advance, the culmination of decades of studies on prion diseases, markedly improves on available diagnostic tests for CJD that are less reliable, more difficult for patients to tolerate, and require more time to obtain results,” said Anthony S. Fauci, M.D., director of the National Institute of Allergy and Infectious Diseases (NIAID), a component of NIH. “With additional validation, this test has potential for use in clinical and agricultural settings.”

An easy-to-use diagnostic test would let doctors clearly differentiate prion diseases from other brain diseases, according to Byron Caughey, Ph.D., the lead NIAID scientist involved in the study. Although specific CJD treatments are not available, prospects for their development and effectiveness could be enhanced by early and accurate diagnoses. Further, a test that identifies people with various forms of prion diseases could help to prevent the spread of prion diseases among and between species. For instance, it is known that human prion diseases can be transmitted via medical procedures such as blood transfusions, transplants and the contamination of surgical instruments. People also have contracted variant CJD after exposure to BSE-infected cattle.

The NIAID study involved 31 nasal samples from patients with CJD and 43 nasal samples from patients who had other neurologic diseases or no neurologic disease at all. These samples were collected primarily by Gianluigi Zanusso, M.D., Ph.D., and colleagues at the University of Verona in Italy, who developed the technique of brushing the inside of the nose to collect olfactory neurons connected to the brain. Testing in Dr. Caughey’s lab in Montana then correctly identified 30 of the 31 CJD patients (97 percent sensitivity) and correctly showed negative results for all 43 of the non-CJD patients (100 percent specificity). By comparison, tests using cerebral spinal fluid—currently used to detect sporadic CJD—were 77 percent sensitive and 100 percent specific, and the results took twice as long to obtain.
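The sensitivity and specificity figures follow directly from the counts reported above, and can be checked in a few lines:

```python
# Counts reported for the nasal brush test.
true_positives, cjd_patients = 30, 31
true_negatives, non_cjd_patients = 43, 43

sensitivity = true_positives / cjd_patients      # fraction of CJD cases detected
specificity = true_negatives / non_cjd_patients  # fraction of non-CJD correctly negative

print(f"sensitivity: {sensitivity:.0%}")  # → 97%
print(f"specificity: {specificity:.0%}")  # → 100%
```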

Jason Wilham, Ph.D., Christina Orrú, Ph.D., Dr. Caughey, and other members of his research group had previously developed the cerebral spinal fluid test method with Ryuichiro Atarashi, M.D., Ph.D., a former NIAID postdoctoral fellow who is now at Nagasaki University in Japan.

While continuing to validate the test method in CJD patients, Dr. Caughey’s group is looking to expand the study to diagnose forms of prion diseases in sheep, cattle and wildlife. The team continues to collaborate with Dr. Zanusso’s group, which is looking to replace the nasal brush with an even simpler swabbing approach.

Filed under creutzfeldt-jakob disease prion disease cerebrospinal fluid olfaction neuroscience science

140 notes

Older adults have morning brains! Study shows noticeable differences in brain function across the day

Older adults who are tested at their optimal time of day (the morning), not only perform better on demanding cognitive tasks but also activate the same brain networks responsible for paying attention and suppressing distraction as younger adults, according to Canadian researchers.

The study, published online July 7th in the journal Psychology and Aging (ahead of print publication), has yielded some of the strongest evidence yet that there are noticeable differences in brain function across the day for older adults.

“Time of day really does matter when testing older adults. This age group is more focused and better able to ignore distraction in the morning than in the afternoon,” said lead author John Anderson, a PhD candidate with the Rotman Research Institute at Baycrest Health Sciences and University of Toronto, Department of Psychology.

“Their improved cognitive performance in the morning correlated with greater activation of the brain’s attentional control regions – the rostral prefrontal and superior parietal cortex – similar to that of younger adults.” 

Asked how his team’s findings may be useful to older adults in their daily activities, Anderson recommended that older adults try to schedule their most mentally-challenging tasks for the morning time. Those tasks could include doing taxes, taking a test (such as a driver’s license renewal), seeing a doctor about a new condition, or cooking an unfamiliar recipe.

In the study, 16 younger adults (aged 19 – 30) and 16 older adults (aged 60-82) participated in a series of memory tests during the afternoon, from 1 – 5 p.m. The tests involved studying and recalling a series of picture and word combinations flashed on a computer screen. Irrelevant words linked to certain pictures and irrelevant pictures linked to certain words also flashed on the screen as a distraction. During the testing, participants’ brains were scanned with fMRI, which allows researchers to detect with great precision which areas of the brain are activated.

Older adults were 10 percent more likely to pay attention to the distracting information than younger adults, who were able to successfully focus and block this information. The fMRI data confirmed that older adults showed substantially less engagement of the attentional control areas of the brain compared to younger adults. Indeed, older adults tested in the afternoon were “idling” – showing activations in the default mode network (a set of regions that come online primarily when a person is resting or thinking about nothing in particular), indicating that perhaps they were having great difficulty focusing. When a person is fully engaged with focusing, resting-state activations are suppressed.

When 18 older adults were tested in the morning (8:30 a.m. – 10:30 a.m.), they performed noticeably better, according to two separate behavioural measures of inhibitory control. They attended to fewer distracting items than their peers tested at off-peak times of day, closing the age-difference gap in performance with younger adults. Importantly, older adults tested in the morning activated the same brain areas young adults did to successfully ignore the distracting information. This suggests that when older adults are tested matters both for how they perform and for what brain activity one should expect to see.

“Our research is consistent with previous science reports showing that at a time of day that matches circadian arousal patterns, older adults are able to resist distraction,” said Dr. Lynn Hasher, senior author on the paper and a leading authority in attention and inhibitory functioning in younger and older adults.

The Baycrest findings offer a cautionary flag to those who study cognitive function in older adults. “Since older adults tend to be morning-type people, ignoring time of day when testing them on some tasks may create an inaccurate picture of age differences in brain function,” said Dr. Hasher, senior scientist at Baycrest’s Rotman Research Institute and Professor of Psychology at University of Toronto.

(Source: baycrest.org)

Filed under aging cognitive performance prefrontal cortex parietal cortex brain activity brain function psychology neuroscience science

112 notes

(Image caption: In mice whose brain tumor cells (in green) couldn’t make galectin-1, the body’s immune system was able to recognize and attack the cells, causing them to die. In this microscope image, the orange areas show where tumor cells had died in just the first three days after the tumor was implanted in the brain. Six days later, the tumor had been eradicated. Credit: University of Michigan Medical School)

Brain tumors fly under the body’s radar like stealth jets

Brain tumors fly under the radar of the body’s defense forces by coating their cells with extra amounts of a specific protein, new research shows.

Like a stealth fighter jet, the coating means the cells evade detection by the early-warning immune system that should detect and kill them. The stealth approach lets the tumors hide until it’s too late for the body to defeat them.

The findings, made in mice and rats, show the key role of a protein called galectin-1 in some of the most dangerous brain tumors, called high-grade malignant gliomas. A research team from the University of Michigan Medical School made the discovery and has published it online in the journal Cancer Research.

In a stunning example of scientific serendipity, the team uncovered galectin-1’s role by pursuing a chance finding. They had actually been trying to study how the extra production of galectin-1 by tumor cells affects cancer’s ability to grow and spread in the brain.

Instead, they found that when they blocked cancer cells from making galectin-1, the tumors were eradicated; they did not grow at all. That’s because the “first responders” of the body’s immune system – called natural killer or NK cells – spotted the tumor cells almost immediately and killed them.

But when the tumor cells made their usual amounts of galectin-1, the immune cells couldn’t recognize the cancerous cells as dangerous. That meant that the immune system couldn’t trigger the body’s “second line of defense”, called T cells – until the tumors had grown too large for the body to beat.

Team leader Pedro Lowenstein, M.D., Ph.D., of the U-M Department of Neurosurgery, says the findings open the door to research on the effect of blocking galectin-1 in patients with gliomas.

"This is an incredibly novel and exciting development, and shows that in science we must always be open-minded and go where the science takes us; no matter where we thought we wanted to go," says Lowenstein, whose graduate student Gregory J. Baker is the first author of the paper.

"In this case, we found that over-expression of galectin-1 inhibits the innate immune system, and this allows the tumor to grow enough to evade any possible effective T cell response," he explains. "By the time it’s detected, the battle is already lost."

The NK-evading “stealth” function of the extra-thick coating of galectin-1 came as a surprise, because glioma researchers everywhere had assumed the extra protein had more to do with the insidious ability of gliomas to invade the brain, and to evade the attacks of T cells.

Gliomas, which make up about 80 percent of all malignant brain tumors, include anaplastic oligodendrogliomas, anaplastic astrocytomas, and glioblastoma multiforme. More than 24,000 people in the U.S. are diagnosed with a primary malignant brain tumor each year.

The tiny tendrils of tumor that extend into brain tissue from a glioma are what make them so dangerous. Even when a neurosurgeon removes the bulk of the tumor, small invasive areas escape detection and keep growing, unchecked by the body.

Helping the innate immune system to recognize the early stages of cancer growth, and to sound the alarm so the body’s defenses act while the remaining cancer is still small enough to kill, could potentially help patients.

While the new discovery opens the door to that kind of approach, much work needs to be done before the mouse-based research could help human patients, says Lowenstein, who is the Richard Schneider Collegiate Professor in Neurosurgery and also holds an appointment in the U-M Department of Cell and Developmental Biology.

Galectin-1 may help other types of tumor evade the innate NK cells, too.

The new research suggests that in the brain’s unique environment, galectin-1 creates an immunosuppressive effect immediately around tumor cells. The brain cancer cells seem to have evolved the ability to express their galectin-1 genes far more than normal, to allow the tumor to keep growing.

Lowenstein and co-team leader Maria Castro, Ph.D., have long studied the immune system’s interactions with brain cancer, using funding from the National Institutes of Health, and are co-leading a new clinical trial for malignant glioma (NCT01811992) that aims to translate prior research achievements into new trials for patients with brain tumors.

Most brain tumor immune research has focused on triggering the action of the adaptive immune system – whose cells control the process that allows the body to kill invaders from outside or within.

But that system takes days or even weeks to reach full force – enough time for incipient tumors to grow too large for immune cells to eliminate. The new research suggests the importance of enhancing the ability of the innate immune system’s “early warning” sentinels to spot glioma cells as early as possible.

Filed under galectin-1 brain tumours glioma cancer cells t cells immune system neuroscience science

173 notes

Could your brain be reprogrammed to work better?

Researchers from The University of Western Australia have shown that electromagnetic stimulation can alter brain organisation, which may make your brain work better.

image

In results from a study published today in the prestigious Journal of Neuroscience, researchers from The University of Western Australia and the Université Pierre et Marie Curie in France demonstrated that weak sequential electromagnetic pulses (repetitive transcranial magnetic stimulation - or rTMS) on mice can shift abnormal neural connections to more normal locations.

The discovery has important implications for treatment of many nervous system disorders related to abnormal brain organisation such as depression, epilepsy and tinnitus.

To better understand what magnetic stimulation does to the brain, Research Associate Professor Jennifer Rodger from UWA’s School of Animal Biology and her colleagues tested a low-intensity version of the therapy - known as low-intensity repetitive transcranial magnetic stimulation (LI-rTMS) - on mice born with abnormal brain organisation.

Lead author, PhD candidate Kalina Makowiecki, said the research demonstrated that even at low intensities, pulsed magnetic stimulation could reduce abnormally located neural connections, shifting them towards their correct locations in the brain.

"This reorganisation is associated with changes in a specific brain chemical, and occurred in several brain regions, across a whole network. Importantly, this structural reorganisation was not seen in the healthy brain or the appropriate connections in the abnormal mice, suggesting that the therapy could have minimal side effects in humans.

"Our findings greatly increase our understanding of the specific cellular and molecular events that occur in the brain during this therapy and have implications for how best to use it in humans to treat disease and improve brain function," Ms Makowiecki said.

(Source: news.uwa.edu.au)

Filed under brain function transcranial magnetic stimulation depression epilepsy brain stimulation neuroscience science

51 notes

Dementia Risk Quadrupled in People with Mild Cognitive Impairment

In a long-term, large-scale, population-based study of individuals aged 55 years or older, researchers found that those diagnosed with mild cognitive impairment (MCI) had a four-fold increased risk of developing dementia or Alzheimer’s disease (AD) compared with cognitively healthy individuals. Several risk factors, including older age, positive APOE-ɛ4 status, low total cholesterol levels, and stroke, as well as specific MRI findings, were associated with an increased risk of developing MCI. The results are published in a supplement to the Journal of Alzheimer’s Disease.
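
A “four-fold increased risk” is a relative risk: the proportion of the MCI group who developed dementia divided by the proportion of the cognitively healthy group who did. A minimal sketch with hypothetical counts (illustrative only, not the Rotterdam study’s data):

```python
# Relative risk from a 2x2 table.
# All counts below are hypothetical, chosen only to illustrate the arithmetic.
mci_total, mci_dementia = 400, 80                # MCI group / developed dementia
healthy_total, healthy_dementia = 3800, 190      # healthy group / developed dementia

risk_mci = mci_dementia / mci_total              # 0.20
risk_healthy = healthy_dementia / healthy_total  # 0.05
relative_risk = risk_mci / risk_healthy
print(relative_risk)  # → 4.0, i.e. a four-fold risk
```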

“Mild cognitive impairment has been identified as the transitional stage between normal aging and dementia,” comments M. Arfan Ikram, MD, PhD, a neuroepidemiologist at Erasmus MC University Medical Center (Rotterdam). “Identifying persons at a higher risk of dementia could postpone or even prevent dementia by timely targeting modifiable risk factors.”

Unlike a clinical trial, the Rotterdam study is an observational cohort study focusing on the general population, instead of persons referred to a memory clinic. The Rotterdam study began in 1990, when almost 8,000 inhabitants of Rotterdam aged 55 years or older agreed to participate in the study. Ten years later, another 3,000 individuals were added. Participants undergo home interviews and examinations every four years.

“This important prospective study adds to the accumulating evidence that strokes, presumably related to so-called ‘vascular’ risk factors, also contribute to the appearance of dementia in Alzheimer’s disease. This leads to the conclusion that, starting at midlife, people should minimize those risk factors. The recent results of the Finnish FINGER study corroborate this idea. It should be remembered that delaying the onset of dementia by five years will reduce the prevalence of the disease by half. And of course, since there is no cure for AD, prevention is the best approach at present,” explains Professor Emeritus Amos D Korczyn, Tel Aviv University, Ramat Aviv, Israel, and Guest Editor of the Supplement.
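
The “five-year delay halves prevalence” point follows from the epidemiological observation that age-specific dementia risk roughly doubles every five years in later life. A toy model of that rule of thumb (the five-year doubling time is an assumption, not a figure from this article):

```python
# Toy model: if age-specific dementia risk doubles every DOUBLING_YEARS,
# delaying onset by d years scales prevalence by 2 ** (-d / DOUBLING_YEARS).
DOUBLING_YEARS = 5.0  # assumed doubling time for age-specific risk

def relative_prevalence(delay_years):
    """Prevalence after delaying onset, relative to no delay."""
    return 2 ** (-delay_years / DOUBLING_YEARS)

print(relative_prevalence(5))   # → 0.5: a five-year delay halves prevalence
print(relative_prevalence(10))  # → 0.25
```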

To be diagnosed with MCI in the study, individuals were required to meet three criteria: a self-reported awareness of having problems with memory or everyday functioning; deficits detected on a battery of cognitive tests; and no evidence of dementia. They were categorized into those with memory problems (amnestic MCI) and those with normal memory (non-amnestic MCI).

Of 4,198 persons found to be eligible for the study, almost 10% were diagnosed with MCI. Of these, 163 had amnestic MCI and 254 had non-amnestic MCI.

The risk of dementia was especially high for people with amnestic MCI. Similar results were observed regarding the risk for Alzheimer’s disease. Those with MCI also faced a somewhat higher risk of death. 

The research team investigated possible determinants of MCI, considering factors such as age, APOE-ɛ4 status, waist circumference, hypertension, diabetes mellitus, total and HDL-cholesterol levels, smoking, and stroke. Only older age, being an APOE-ɛ4 carrier, low total cholesterol levels, and stroke at baseline were associated with developing MCI. Having the APOE-ɛ4 genotype and smoking were related only to amnestic MCI.

When the investigators analysed MRI studies of the brain, they found that participants with MCI, particularly those with non-amnestic MCI, had larger white matter lesion volumes and worse microstructural integrity of normal-appearing white matter compared to controls. They were also three times more likely than controls to have lacunes (3 to 15 mm cerebrospinal fluid (CSF)-filled cavities in the basal ganglia or white matter, frequently observed when imaging older people). MCI was not associated with total brain volume, hippocampal volume, or cerebral microbleeds.

“Our results suggest that accumulating vascular damage plays a role in both amnestic and non-amnestic MCI,” says Dr. Ikram. “We propose that timely targeting of modifiable vascular risk factors might contribute to the prevention of MCI and dementia.”

Reference:

Determinants, MRI Correlates, and Prognosis of Mild Cognitive Impairment: The Rotterdam Study. Renée F.A.G. de Bruijn, Saloua Akoudad, Lotte G.M. Cremers, Albert Hofman, Wiro J. Niessen, Aad van der Lugt, Peter J. Koudstaal, Meike W. Vernooij, M. Arfan Ikram. Journal of Alzheimer’s Disease, Volume 42/Supplement 3 (August 2014): 2013 International Congress on Vascular Dementia (Guest Editor: Amos D. Korczyn)

(Source: iospress.nl)

Filed under cognitive impairment dementia alzheimer's disease memory brain structure neuroscience science

280 notes

Link between vitamin D and dementia risk confirmed

Vitamin D deficiency is associated with a substantially increased risk of dementia and Alzheimer’s disease in older people, according to the most robust study of its kind ever conducted.

image

An international team, led by Dr David Llewellyn at the University of Exeter Medical School, found that study participants who were severely Vitamin D deficient were more than twice as likely to develop dementia and Alzheimer’s disease.

The team studied elderly Americans who took part in the Cardiovascular Health Study. They discovered that adults in the study who were moderately deficient in vitamin D had a 53 per cent increased risk of developing dementia of any kind, and the risk increased to 125 per cent in those who were severely deficient.

Similar results were recorded for Alzheimer’s disease, with the moderately deficient group 69 per cent more likely to develop this type of dementia, jumping to a 122 per cent increased risk for those severely deficient.
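
These percent figures convert to risk ratios by dividing by 100 and adding 1, which is how a “125 per cent increased risk” becomes “more than twice as likely”:

```python
def percent_increase_to_risk_ratio(pct_increase):
    """Convert an 'x per cent increased risk' figure to a risk ratio."""
    return 1 + pct_increase / 100

# Figures quoted above for any-cause dementia:
print(percent_increase_to_risk_ratio(53))   # → 1.53 (moderately deficient)
print(percent_increase_to_risk_ratio(125))  # → 2.25 (severely deficient)
```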

The study was part-funded by the Alzheimer’s Association, and is published in the August 6, 2014, online issue of Neurology, the medical journal of the American Academy of Neurology. It looked at 1,658 adults aged 65 and over, who were able to walk unaided and were free from dementia, cardiovascular disease and stroke at the start of the study. The participants were then followed for six years to investigate who went on to develop Alzheimer’s disease and other forms of dementia.

Dr Llewellyn said: “We expected to find an association between low Vitamin D levels and the risk of dementia and Alzheimer’s disease, but the results were surprising – we actually found that the association was twice as strong as we anticipated.

“Clinical trials are now needed to establish whether eating foods such as oily fish or taking vitamin D supplements can delay or even prevent the onset of Alzheimer’s disease and dementia. We need to be cautious at this early stage and our latest results do not demonstrate that low vitamin D levels cause dementia. That said, our findings are very encouraging, and even if a small number of people could benefit, this would have enormous public health implications given the devastating and costly nature of dementia.”

Research collaborators included experts from Angers University Hospital, Florida International University, Columbia University, the University of Washington, the University of Pittsburgh and the University of Michigan. The study was supported by the Alzheimer’s Association, the Mary Kinross Charitable Trust, the James Tudor Foundation, the Halpin Trust, the Age Related Diseases and Health Trust, the Norman Family Charitable Trust, and the National Institute for Health Research Collaboration for Leadership in Applied Research and Care South West Peninsula (NIHR PenCLAHRC).

Dementia is one of the greatest challenges of our time, with 44 million cases worldwide – a number expected to triple by 2050 as a result of rapid population ageing. A billion people worldwide are thought to have low vitamin D levels and many older adults may experience poorer health as a result.

The research is the first large study to investigate the relationship between vitamin D and dementia risk where the diagnosis was made by an expert multidisciplinary team, using a wide range of information including neuroimaging. Previous research established that people with low vitamin D levels are more likely to go on to experience cognitive problems, but this study confirms that this translates into a substantial increase in the risk of Alzheimer’s disease and dementia.

Vitamin D comes from three main sources – exposure of skin to sunlight, foods such as oily fish, and supplements. Older people’s skin can be less efficient at converting sunlight into Vitamin D, making them more likely to be deficient and reliant on other sources. In many countries the amount of UVB radiation in winter is too low to allow vitamin D production.

The study also found evidence that there is a threshold level of Vitamin D circulating in the bloodstream below which the risk of developing dementia and Alzheimer’s disease increases.  The team had previously hypothesized that this might lie in the region of 25-50 nmol/L, and their new findings confirm that vitamin D levels above 50 nmol/L are most strongly associated with good brain health.
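
Taken together, these thresholds suggest three serum 25-hydroxyvitamin D bands. A sketch, assuming cutoffs of 25 nmol/L (severe deficiency) and 50 nmol/L (sufficiency); the exact boundaries are an assumption inferred from the figures above, not a clinical reference:

```python
def vitamin_d_category(level_nmol_per_l):
    """Classify a serum 25(OH)D level into the bands discussed above.
    The 25 and 50 nmol/L cutoffs are assumptions inferred from the article,
    not clinical guidance."""
    if level_nmol_per_l < 25:
        return "severely deficient"
    if level_nmol_per_l < 50:
        return "moderately deficient"
    return "sufficient"

print(vitamin_d_category(20))  # → severely deficient
print(vitamin_d_category(60))  # → sufficient
```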

Commenting on the study, Dr Doug Brown, Director of Research and Development at Alzheimer’s Society said: “Shedding light on risk factors for dementia is one of the most important tasks facing today’s health researchers. While earlier studies have suggested that a lack of the sunshine vitamin is linked to an increased risk of Alzheimer’s disease, this study found that people with very low vitamin D levels were more than twice as likely to develop any kind of dementia.

“During this hottest of summers, hitting the beach for just 15 minutes of sunshine is enough to boost your vitamin D levels. However, we’re not quite ready to say that sunlight or vitamin D supplements will reduce your risk of dementia. Large scale clinical trials are needed to determine whether increasing vitamin D levels in those with deficiencies can help prevent the dementia from developing.”

(Source: exeter.ac.uk)

Filed under alzheimer's disease dementia vitamin deficiency vitamin d neuroscience science
