Neuroscience

Articles and news from the latest research reports.

111 notes

Study creates new memories by directly changing the brain

Findings could prove helpful in understanding and resolving learning and memory disorders

By studying how memories are made, UC Irvine neurobiologists created new, specific memories by direct manipulation of the brain, which could prove key to understanding and potentially resolving learning and memory disorders.

Research led by senior author Norman M. Weinberger, a research professor of neurobiology & behavior at UC Irvine, has shown that directly altering brain cells in the cerebral cortex can create a specific, predicted memory. The researchers say this is the first evidence that memories can be created by direct cortical manipulation. Study results appeared in the August 29 issue of Neuroscience.

During the research, Weinberger and colleagues played a specific tone to test rodents, then stimulated the nucleus basalis deep within their brains, releasing acetylcholine (ACh), a chemical involved in memory formation. This procedure increased the number of brain cells responding to the specific tone. The following day, the scientists played many sounds to the animals and found that their respiration spiked when they recognized the particular tone, showing that specific memory content had been created by the brain changes induced during the experiment. The created memories shared the features of natural memories, including long-term retention.

"Disorders of learning and memory are a major issue facing many people and since we’ve found not only a way that the brain makes memories, but how to create new memories with specific content, our hope is that our research will pave the way to prevent or resolve this global issue," said Weinberger, who is also a fellow with the Center for the Neurobiology of Learning & Memory and the Center for Hearing Research at UC Irvine.

The creation of new memories by directly changing the cortex is the culmination of several years of research in Weinberger’s lab implicating the nucleus basalis and ACh in brain plasticity and specific memory formation. Previously, the authors had also shown that the strength of memory is controlled by the number of cells in the auditory cortex that process a sound.

Filed under memory formation acetylcholine nucleus basalis neurons plasticity neuroscience science

115 notes

Western scientists discover a novel opiate addiction switch in the brain

Neuroscientists at Western University (London, Canada) have made a remarkable new discovery revealing the underlying molecular process by which opiate addiction develops in the brain. Opiate addiction is largely controlled by the formation of powerful reward memories that link the pleasurable effects of opiate-class drugs to environmental triggers that induce drug craving in individuals addicted to opiates. The research is published in the September 11th issue of The Journal of Neuroscience.

The Addiction Research Group led by Steven Laviolette of the Schulich School of Medicine & Dentistry identified how exposure to heroin induces a specific switch in a memory molecule in a region of the brain called the basolateral amygdala, which plays an important role in controlling memories related to opiate addiction, withdrawal, and relapse. Using a rodent model of opiate addiction, Laviolette’s team found that the process of opiate addiction and withdrawal triggered a switch between two molecular pathways in the amygdala controlling how opiate addiction memories were formed. In the non-dependent state, they found that a molecule called extracellular signal-regulated kinase, or “ERK”, was recruited for early-stage addiction memories. However, once opiate addiction had developed, the scientists observed a functional switch to a separate molecular memory pathway, controlled by a molecule called calcium/calmodulin-dependent kinase II, or “CaMKII”.

“These findings will shed important new light on how the brain is altered by opiate drugs and provide exciting new targets for the development of novel pharmacotherapeutic treatments for individuals suffering from chronic opiate addiction,” says Laviolette, an associate professor in the Departments of Anatomy & Cell Biology, Psychiatry, and Psychology.

(Source: communications.uwo.ca)

Filed under addiction opiate addiction basolateral amygdala extracellular signal-regulated kinase memory neuroscience science

62 notes

Early-onset Parkinson’s disease linked to genetic deletion

Scientists at the Centre for Addiction and Mental Health (CAMH) and University Health Network (UHN) have found a new link between early-onset Parkinson’s disease and a piece of DNA missing from chromosome 22. The findings help shed new light on the molecular changes that lead to Parkinson’s disease.

The study appears online today in JAMA Neurology.

Among people aged 35 to 64 who were missing DNA from a specific part of chromosome 22, the research team found a marked increase in cases of Parkinson’s disease compared with expected rates in the general population of the same age group.

The deletion, in which a person is born missing about 50 genes from one copy of chromosome 22, is associated with 22q11.2 deletion syndrome. People with this condition may have heart or other birth defects and learning or speech difficulties, and some develop schizophrenia. The syndrome occurs in an estimated 1 in 2,000 to 4,000 births but is believed to be under-diagnosed.

“22q11.2 deletion syndrome has been fairly well studied in childhood and adolescence, but less is known about its effects as people age,” said Dr. Anne Bassett, Director of CAMH’s Clinical Genetics Research Program and Director of the Dalglish Family Hearts and Minds Clinic at UHN, the world’s first clinic dedicated to adults with 22q11.2 deletion syndrome. A few cases of patients with the syndrome who had Parkinson’s disease symptoms had been previously reported, which suggested that the two conditions might be linked.

Parkinson’s disease is one of the most common neurodegenerative disorders worldwide, typically affecting people over the age of 65. Earlier onset of Parkinson’s disease, before age 50, is rare and has been associated with several other genetic changes that are not on chromosome 22.

The researchers studied 159 adults with 22q11.2 deletion syndrome to discover how many had been clinically diagnosed with Parkinson’s disease. For three individuals with the deletion and Parkinson’s disease who were deceased, brain tissue was also examined.

“Through a post-mortem examination, we were able to show that all three patients had a loss of neurons that was typical of that seen in Parkinson’s disease. The examination also helped to show that the symptoms of Parkinson’s disease were not related to side effects of the medications commonly used to treat schizophrenia,” added Dr. Rasmus Kiehl, a neuropathologist in UHN’s Laboratory Medicine Program, who co-authored the report with CAMH graduate student Nancy Butcher. The team also found that Parkinson’s disease in 22q11.2 deletion syndrome is associated with abnormal protein accumulations called Lewy bodies in the brain in some, but not all, cases, just as in another genetic form of Parkinson’s disease.

The findings highlight the complexity of clinical care when both Parkinson’s disease and 22q11.2 deletion syndrome are present. “Our results may inform best practices in the clinic in these cases,” said Dr. Bassett, Senior Scientist in CAMH’s Campbell Family Mental Health Research Institute.

Because patients with 22q11.2DS who have schizophrenia are often prescribed anti-psychotic medications, they may experience side-effects such as tremors and muscle stiffness, similar to symptoms of Parkinson’s disease.

As a result, the researchers found that anti-psychotic use delayed the diagnosis of Parkinson’s disease – and the opportunity for treatment – by up to 10 years.

For people with early-onset Parkinson’s disease, who also have other features that could indicate 22q11.2 deletion syndrome, clinical genetic testing for the deletion on chromosome 22 should be considered, the researchers suggest.

“Our discovery that the 22q11.2 deletion syndrome is associated with Parkinson’s disease is very exciting,” said Dr. Anthony Lang, Director of the Movement Disorders Program at the Krembil Neuroscience Centre of Toronto Western Hospital. “The varying pathology that we found is reminiscent of certain other genetic causes of Parkinson’s disease, and opens new directions to search for novel genes that could cause its more common form. Studies of patients with 22q11.2 deletion syndrome before they ever develop clinical features of Parkinson’s disease may not only provide important information on the effectiveness of screening methods for early detection of the disease, but also allow for future ‘neuroprotective treatments’ to be introduced at the ultimate time when they can have a chance to make an important impact on preventing the disease or slowing its course.” 

“Most people with 22q11.2 deletion syndrome will not develop Parkinson’s disease,” emphasizes Dr. Bassett. “But it does occur at a rate higher than in the general population. We will now be on the look-out for this so we can provide the best care for patients.”

(Source: camh.ca)

Filed under parkinson's disease chromosome 22 22q11.2 deletion syndrome genetics neuroscience science

51 notes

Therapy Slows Onset and Progression of Lou Gehrig’s Disease

Studies of a therapy designed to treat amyotrophic lateral sclerosis (ALS) suggest that the treatment dramatically slows onset and progression of the deadly disease, one of the most common neuromuscular disorders in the world. The researchers, led by teams from The Research Institute at Nationwide Children’s Hospital and the Ludwig Institute at the University of California, San Diego, found a survival increase of up to 39 percent in animal models with a one-time treatment, a crucial step toward moving the therapy into human clinical trials.

The therapy reduces expression of a gene called SOD1, which in some cases of familial ALS has a mutation that weakens and kills nerve cells called motor neurons that control muscle movement. While many drug studies involve only one type of animal model, this effort included analysis in two different models treated before and after disease onset. The in-depth study could vault the drug into human clinical trials, said Brian Kaspar, PhD, a principal investigator in the Center for Gene Therapy at Nationwide Children’s and a senior author on the research, which was published online Sept. 6 in Molecular Therapy.

“We designed these rigorous studies using two different models of the disease with the experimenters blinded to the treatment and in two separate laboratories,” said Dr. Kaspar, who collaborated on the study with a team led by Don Cleveland, PhD, at the University of California, San Diego. “We were very pleased with the results, and found that the delivery approach was successful in a larger species, enabling us to initiate a clinical translational plan for this horrible disease.”

There currently is no cure for ALS, also called Lou Gehrig’s disease. The Centers for Disease Control and Prevention estimates there are about 5,000 new cases in the U.S. each year, mostly in people age 50 to 60. Although the exact cause of ALS is unknown, more than 170 mutations in the SOD1 gene have been found in many patients with familial ALS, which accounts for about 2 percent of all cases.

SOD1 provides instructions for making an enzyme called superoxide dismutase, which is found throughout the body and breaks down toxic molecules that can be damaging to cells. When mutated, the SOD1 gene yields a faulty version of the enzyme that is especially harmful to motor neurons. One of the mutations, which is found in about half of all familial ALS patients, is particularly devastating, with death usually coming within 18 months of diagnosis. SOD1 has also been implicated in other types of ALS, called sporadic ALS, which means the therapy could prove beneficial for larger numbers of patients suffering with this disease.

Earlier work by Dr. Kaspar and others found that they could halt production of the mutated enzyme by blocking SOD1 expression, which in turn, they suspected, would slow ALS progression. To test this hypothesis, the researchers would not only need to come up with an approach that would block the gene, but also figure out how to specifically target cells implicated in the disease, which include motor neurons and glial cells. What’s more, the therapy would preferably be administered noninvasively rather than delivered directly through burr holes drilled into the skull.

Dr. Kaspar’s team accomplished the second part of this challenge in 2009, when they discovered that adeno-associated virus serotype 9 (AAV9) could cross the blood-brain barrier, making it an ideal transport system for delivering genes and RNA interference strategies designed to treat disease.

In this new work, funded by the National Institutes of Health, the researchers blocked human SOD1, using a technology known as short hairpin RNA, or shRNA. These single strands of RNA are designed in the lab to seek out specific sequences found in the human SOD1 gene, latch onto them and block gene expression.

In one of the mouse models used in the study, ALS develops earlier and advances more quickly. In the other, the disease develops later and progresses more slowly. All of the mice received a single injection of AAV9-SOD1-shRNA before or after disease onset.

Results showed that in the rapid-disease-progressing model, mice treated before disease onset saw a 39 percent increase in survival compared to control-treated mice. Strikingly, in mice treated at 21 days of age, disease progression was slowed by 66 percent. Perhaps more surprising was the finding that even after symptoms surfaced in these models, treatment still resulted in a 23 percent increase in survival and a 36 percent reduction in disease progression. In the slower-disease-onset model, treatment extended survival by 22 percent and delayed disease progression by 38 percent.

“The extension of survival is fantastic, and the fact that we delayed disease progression in both models when treated at disease onset is what drives our excitement to advance this work to human clinical trials,” said Kevin Foust, PhD, co-first author on the manuscript and an assistant professor in neurosciences at The Ohio State University College of Medicine.
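As a minimal sketch of what the relative figures above mean in absolute terms, the percentages can be applied to a baseline survival time. The 130-day baseline below is hypothetical; the article reports only percentage gains.

```python
# Hypothetical illustration: turn the reported relative survival gains
# into absolute figures. The 130-day baseline is made up; the study
# reports only percentage increases.

def extended_survival(baseline_days, percent_increase):
    """Survival time after applying a relative percentage increase."""
    return baseline_days * (1 + percent_increase / 100)

baseline = 130  # hypothetical untreated median survival, in days

for label, pct in [("treated before onset", 39), ("treated after onset", 23)]:
    print(f"{label}: {extended_survival(baseline, pct):.0f} days")
```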

In addition to the potential therapeutic benefit, the study also offers some interesting insights into the biological underpinnings of ALS. The role of motor neurons in ALS has been well documented, but this study also highlighted another key player—astrocytes, the most abundant cell type in the human brain and supporters of neuronal function.

“Recent work from our collaborator Dr. Cleveland has demonstrated that astrocytes and other types of glia are as important if not more important in ALS, as they really drive disease progression,” said Dr. Kaspar. “Indeed, in looking at data from mice, more than 50 percent of astrocytes were targeted throughout the spinal cord by this gene-delivery approach.”

Ideally, a therapy would hit motor neurons and astrocytes equally hard. The best way to do that is to deliver the drug directly into the cerebrospinal fluid (CSF), which would reduce the amount of SOD1 suppression in cells outside the brain and reduce immune system exposure to AAV9—elements that would add weight to an argument for studying the drug in humans.

Injections directly into CSF cannot be done easily in mice, so the team took the study a crucial step further by injecting AAV9-SOD1-shRNA into the CSF of healthy nonhuman primates. The results were just as the team hoped: gene expression dropped by as much as 90 percent in motor neurons and by nearly 70 percent in astrocytes, with no side effects reported, laying the groundwork for moving to human clinical trials.

“We have a vast amount of work to do to move this toward a clinical trial, but we’re encouraged by the results to date and our team at Nationwide Children’s and our outstanding collaborators are fully committed to making a difference in this disease,” Dr. Kaspar said.

The findings could impact other studies underway in Dr. Kaspar’s lab, including research on Spinal Muscular Atrophy, an often fatal genetic disease in infants and children that can cause profoundly weakened muscles in the arms and legs and respiratory failure.

“This research provides further proof of targeting motor neurons and glial cells throughout the entire spinal cord for treatment of Spinal Muscular Atrophy and other degenerative diseases of the brain and spinal cord, through a less invasive manner than direct injections,” said Dr. Kaspar, who also is an associate professor of pediatrics and neurosciences at The Ohio State University College of Medicine.

(Source: nationwidechildrens.org)

Filed under ALS neurodegeneration neurodegenerative diseases motor neurons SOD1 gene neuroscience science

41 notes

Brain circuitry loss may be a very early sign of cognitive decline in healthy elderly people

The degeneration of a small, wishbone-shaped structure deep inside the brain may provide the earliest clues to future cognitive decline, long before healthy older people exhibit clinical symptoms of memory loss or dementia, a study by researchers with the UC Davis Alzheimer’s Disease Center has found.


The longitudinal study found that the only discernible brain differences between normal people who later developed cognitive impairment and those who did not were changes in the fornix, a white matter tract that carries messages to and from the hippocampus and has long been known to play a role in memory.

“This could be a very early and useful marker for future incipient decline,” said Evan Fletcher, the study’s lead author and a project scientist with the UC Davis Alzheimer’s Disease Center.

“Our results suggest that fornix variables are measurable brain factors that precede the earliest clinically relevant deterioration of cognitive function among cognitively normal elderly individuals,” Fletcher said.

The research is published online today in JAMA Neurology.

Hippocampal atrophy occurs in the later stages of cognitive decline and is one of the most studied changes associated with the Alzheimer’s disease process. However, changes to the fornix and other regions of the brain structurally connected to the hippocampus have not been as closely examined. The study found that degeneration of the fornix in relation to cognition was detectable even earlier than changes in the hippocampus.

“Although hippocampal measures have been studied much more deeply in relation to cognitive decline, our direct comparison between fornix and hippocampus measures suggests that fornix properties have a superior ability to identify incipient cognitive decline among healthy individuals,” Fletcher said.

The study was conducted over five years in a group of 102 diverse, cognitively normal people with an average age of 73 who were recruited through community outreach at the Alzheimer’s Disease Center. The researchers conducted magnetic resonance imaging (MRI) studies of the participants’ brains to measure brain volumes and tissue integrity. A different type of MRI was used to assess the integrity of the myelin, the fatty coating that sheathes and protects the axons. The axons are analogous to the copper wiring of the brain’s circuitry, and the myelin is like the wiring’s plastic insulation.

Either one of those things being lost will “degrade the signal transmission” in the brain, Fletcher said.

The researchers also conducted psychological tests and cognitive evaluations of the study participants to gauge their level of cognitive functioning. The participants returned for updated MRIs and cognitive testing at approximately one-year intervals. At the outset, none of the study participants exhibited symptoms of cognitive decline. Over time, about 20 percent began to show symptoms that led to diagnoses of either mild cognitive impairment (MCI) or, in a minority of cases, Alzheimer’s disease.

“We found that if you looked at various brain factors there was one — and only one — that seemed to be predictive of whether a person would have cognitive decline, and that was the degradation of the fornix,” Fletcher said.

The study measured two relevant fornix characteristics predicting future cognitive impairment — low fornix white matter volume and reduced axonal integrity. Each of these was stronger than any other brain factor in models predicting cognitive loss, Fletcher said. 

He said that routine MRI examination of the fornix could conceivably be used clinically in the future as a predictor of abnormal cognitive decline.

“Our findings suggest that if your fornix volume or integrity is within a certain range you’re at an increased risk of cognitive impairment down the road. But developing the use of the fornix as a predictor in a clinical setting will take some time, in the same way that it took time for evaluation of cholesterol levels to be used to predict future heart disease,” he said.

Fletcher also said that the finding may mark a paradigm shift toward evaluation of the brain’s white matter, rather than its gray matter, as among the very earliest indicators of developing cognitive loss. There is currently a strong research focus on understanding brain processes that lead eventually to Alzheimer’s disease. He said the current finding could fill in one piece of the picture and motivate new directions in research to understand why and how fornix and other white matter change is such an important harbinger of cognitive impairment. 

“The key importance of this finding is that it suggests that white matter tract measures may prove to be promising candidate biomarkers for predicting incipient cognitive decline among cognitively normal individuals in a clinical setting, possibly more so than gray matter measures,” he said.

(Source: ucdmc.ucdavis.edu)

Filed under alzheimer's disease dementia cognitive decline fornix hippocampus neuroscience science

58 notes

A New Method Will Enable the Early Detection of Parkinson’s Disease Through Handwriting

Today’s primary tool for diagnosing Parkinson’s disease is the diagnostic ability of the physician, who can generally identify the clinical symptoms only when the disease is at a relatively advanced stage. A new joint study by researchers at the University of Haifa and Rambam Hospital, which compared the handwriting of 40 subjects, half of them Parkinson’s patients and half healthy, suggests an innovative and noninvasive method of diagnosing Parkinson’s at a fairly early stage.

“Identifying the changes in handwriting could lead to an early diagnosis of the illness and neurological intervention at a critical moment,” explains Prof. Sara Rosenblum, of the University of Haifa’s Department of Occupational Therapy, who initiated the study.

The methods for diagnosing Parkinson’s today are physician evaluation or a test called SPECT, which uses radioactive material to image the brain. The latter, however, is no more effective in diagnosing the illness than an expert physician, and it exposes the patient to unnecessary radiation.

Studies from recent years show unique and distinctive differences between the handwriting of patients with Parkinson’s disease and that of healthy people. However, most handwriting studies to date have focused on motor skills (such as drawing spirals) rather than on writing that involves cognitive abilities, such as signing a check or copying addresses.

According to Prof. Rosenblum, Parkinson’s patients report feeling a change in their cognitive abilities before detecting a change in their motor abilities and therefore a test of cognitive impairment like the one performed in this study could attest to the presence of the disease and offer a way to diagnose it earlier.

This research was conducted in cooperation with Dr. Ilana Schlesinger, head of the Center for Movement Disorders and Parkinson’s Disease at Haifa’s Rambam Medical Center and occupational therapists working in the hospital. In the study, the researchers asked the subjects to write their names and gave them addresses to copy, two everyday tasks that require cognitive abilities. Participants were 40 adults with at least 12 years of schooling, half healthy and half known to be in the early stages of Parkinson’s disease (before obvious motor signs are visible).

The writing was done on a regular piece of paper placed on an electronic tablet, using a special pen with pressure-sensitive sensors that registered contact with the writing surface. A computerized analysis of the results compared a number of parameters: writing form (length, width and height of the letters), the time required, and the pressure exerted on the surface while performing the task.

Analysis of the results showed significant differences between the patients and the healthy group, and all subjects except one were correctly classified (97.5% accuracy). The Parkinson’s disease patients wrote smaller letters (micrographia), exerted less pressure on the writing surface, and took more time to complete the task. According to Prof. Rosenblum, a particularly noticeable difference was the length of time the pen was held in the air between the writing of each letter and each word.

“This finding is particularly important because while the patient holds the pen in the air, his mind is planning his next action in the writing process, and the need for more time reflects the subject’s reduced cognitive ability. Changes in handwriting can occur years before a clinical diagnosis and therefore can be an early signal of the approaching disease,” Prof. Rosenblum said.
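As a quick sanity check, the reported 97.5% accuracy follows directly from the sample size: 40 subjects with a single misclassification.

```python
# Consistency check for the reported classification accuracy:
# 40 subjects, all but one correctly classified.
subjects = 40
misclassified = 1

accuracy = (subjects - misclassified) / subjects
print(f"{accuracy:.1%}")  # 97.5%
```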

According to Dr. Schlesinger, validating these findings in a broader study would allow this method to be used for a preliminary diagnosis of the disease in a safe and non-invasive fashion. “This study is a breakthrough toward an objective diagnosis of the disease,” said Dr. Schlesinger, adding, “Publication of the study in the journal of the European Neurological Society aroused great interest at the International Congress of Parkinson’s Disease and Movement held last week in Sydney, Australia.”

The researchers note that this diagnostic method has the added benefit of reducing the load on the health system, because the test can be performed by a professional other than a doctor. After the results are in, patients can be referred to a doctor for further treatment and testing if necessary. The researchers are currently using the method in a new experiment, in which they use handwriting analysis to evaluate the degree of Parkinson’s patients’ improved functioning after they have brain pacemakers implanted.

(Source: newswise.com)

Filed under parkinson's disease handwriting SPECT biomarker neuroscience science

120 notes

Cell transplants may be a novel treatment for schizophrenia

Rodent research suggests feasibility of restoring neuron function

Research from the School of Medicine at The University of Texas Health Science Center at San Antonio suggests the exciting possibility of using cell transplants to treat schizophrenia.

Cells called “interneurons” inhibit activity within brain regions, but this braking or governing function is impaired in schizophrenia. Consequently, a group of nerve cells called the dopamine system goes into overdrive. Different branches of the dopamine system are involved in cognition, movement and emotions.

“Since these cells are not functioning properly, our idea is to replace them,” said study senior author Daniel Lodge, Ph.D., assistant professor of pharmacology in the School of Medicine.

Transplant restored normal function

Dr. Lodge and lead author Stephanie Perez, a graduate student in his laboratory, biopsied tissue from rat fetuses, isolated cells from the tissue and injected the cells into a brain center called the hippocampus. This center regulates the dopamine system and plays a role in learning, memory and executive functions such as decision making. Rats treated with the transplanted cells showed restored hippocampal and dopamine function.

Stem cells are able to become different types of cells, and in this case interneurons were selected. “We put in a lot of cells and not all survived, but a significant portion did and restored hippocampal and dopamine function back to normal,” Dr. Lodge said.

‘You can essentially fix the problem’

Unlike traditional approaches to treating schizophrenia, such as medications and deep-brain stimulation, transplantation of interneurons potentially can produce a permanent solution. “You can essentially fix the problem,” Dr. Lodge said. “Ultimately, if this is translated to humans, we want to reprogram a patient’s own cells and use them.”

After meeting with other students, Perez brought the research idea to Dr. Lodge. “The students have journal club, and somebody had done a similar experiment to restore motor deficits and had good results,” Perez said. “We thought, why can’t we use it for schizophrenia and have good results, and so far we have.”

The study is in Molecular Psychiatry.

(Source: uthscsa.edu)

Filed under schizophrenia stem cells interneurons dopamine hippocampus neuroscience science

49 notes

Hypertensive smoking women have an exceptionally high risk of fatal brain bleeding

Subarachnoid haemorrhage (SAH) is one of the most devastating cerebrovascular catastrophes, causing death in 40 to 50% of cases. The most common cause of SAH is the rupture of an intracranial aneurysm. If an aneurysm is found, it can be treated before a possible rupture. However, some intracranial aneurysms will never rupture, and the problem is that doctors don’t know which aneurysms will and which will not. As a result, they don’t know which patients should be treated and which can safely be left untreated.

(Image: A middle cerebral artery bifurcation aneurysm. Credit: Miikka Korja)

A long-term, population-based Finnish study on SAH, based on the FINRISK health examination surveys and published in PLOS ONE on 9th September, shows that the risk of SAH depends strongly on the combination of certain risk factors. The incidence of SAH was shown to vary from 8 to 171 per 100 000 person-years, depending on whether or not people had multiple risk factors for SAH – such as smoking, hypertension and female sex.

“Such an extreme risk-factor-dependent variation in the incidence of any cardiovascular disease is exceptional and may have significant clinical implications,” says one of the main authors, Associate Professor Miikka Korja from the Helsinki University Central Hospital and the Australian School of Advanced Medicine.

If smoking women with high systolic blood pressure have a 20 times higher rate of these brain bleeds than never-smoking men with low blood pressure, it may very well be that these women, when diagnosed with unruptured intracranial aneurysms, should be treated. On the other hand, never-smoking men with low blood pressure and intracranial aneurysms may not need to be treated at all.

In this study, the largest on SAH risk factors to date, the group also identified three new risk factors for SAH: previous myocardial infarction, a history of stroke in the mother, and elevated cholesterol levels in men. The results revise the understanding of the epidemiology of SAH and indicate that the risk factors for SAH appear to be similar to those for other cardiovascular diseases.

“We have previously shown that lifestyle risk factors significantly affect the life expectancy of SAH survivors, and now we have shown that the same risk factors also dramatically affect the risk of SAH itself. It therefore appears quite clear that smoking cessation and hypertension treatment in particular are important in preventing SAH and in increasing life expectancy after SAH,” clarifies study group member Academy Professor Jaakko Kaprio from the University of Helsinki and the National Institute for Health and Welfare, referring to their previous publication on cause-specific mortality in SAH survivors (Korja et al., Neurology, 2013).

The study group has also previously published the largest twin study to date, confirming that the heritability of SAH is very low (Korja et al., Stroke, 2010), and the first study on the incidence of SAH in type 1 diabetes, showing that the rate of non-aneurysmal SAH in type 1 diabetes is unusually high (Korja et al., Diabetes Care, 2013).

“Many of the previous studies on the epidemiology of SAH have relied on retrospective, single-center databases, which are unfortunately not very reliable data sources. Thanks to the unique health care system and shared academic interest among doctors in the Nordic countries, it has been possible to conduct high-quality, unbiased studies on SAH. We hope that our studies truly help doctors and patients, and are not only of interest at coffee tables on university campuses,” says neurosurgeon Korja before rushing back to the operating room at Macquarie University Hospital in Sydney, one of his current appointments.

(Source: eurekalert.org)

Filed under aneurysm subarachnoid haemorrhage cardiovascular disease smoking hypertension neuroscience science

63 notes

Robots with Display Screens: A Robot with a More Humanlike Face Display Is Perceived To Have More Mind and a Better Personality 
It is important for robot designers to know how to make robots that interact effectively with humans. One key dimension is robot appearance and in particular how humanlike the robot should be. Uncanny Valley theory suggests that robots look uncanny when their appearance approaches, but is not absolutely, human. An underlying mechanism may be that appearance affects users’ perceptions of the robot’s personality and mind. This study aimed to investigate how robot facial appearance affected perceptions of the robot’s mind, personality and eeriness. A repeated measures experiment was conducted. 30 participants (14 females and 16 males, mean age 22.5 years) interacted with a Peoplebot healthcare robot under three conditions in a randomized order: the robot had either a humanlike face, silver face, or no-face on its display screen. Each time, the robot assisted the participant to take his/her blood pressure. Participants rated the robot’s mind, personality, and eeriness in each condition. The robot with the humanlike face display was most preferred, rated as having most mind, being most humanlike, alive, sociable and amiable. The robot with the silver face display was least preferred, rated most eerie, moderate in mind, humanlikeness and amiability. The robot with the no-face display was rated least sociable and amiable. There was no difference in blood pressure readings between the robots with different face displays. Higher ratings of eeriness were related to impressions of the robot with the humanlike face display being less amiable, less sociable and less trustworthy. These results suggest that the more humanlike a healthcare robot’s face display is, the more people attribute mind and positive personality characteristics to it. Eeriness was related to negative impressions of the robot’s personality. Designers should be aware that the face on a robot’s display screen can affect both the perceived mind and personality of the robot.

Filed under robots robotics perception technology neuroscience science

184 notes

Neural and Behavioral Evidence for an Intrinsic Cost of Self-Control
The capacity for self-control is critical to adaptive functioning, yet our knowledge of the underlying processes and mechanisms is presently only inchoate. Theoretical work in economics has suggested a model of self-control centering on two key assumptions: (1) a division within the decision-maker between two ‘selves’ with differing preferences; (2) the idea that self-control is intrinsically costly. Neuroscience has recently generated findings supporting the ‘dual-self’ assumption. The idea of self-control costs, in contrast, has remained speculative. We report the first independent evidence for self-control costs. Through a neuroimaging meta-analysis, we establish an anatomical link between self-control and the registration of cognitive effort costs. This link predicts that individuals who strongly avoid cognitive demand should also display poor self-control. To test this, we conducted a behavioral experiment leveraging a measure of demand avoidance along with two measures of self-control. The results obtained provide clear support for the idea of self-control costs.

Filed under self-control neuroimaging brain activity decision making neuroscience science
