Posts tagged science

With their whiskers, rats can detect the texture of objects in the same way humans do with their fingertips. A study in which scientists from SISSA took part shows that it is possible to tell which object a rat has touched by observing the activation of its brain neurons – a further step towards understanding how the brain, in humans as well, represents the outside world.
We know the world through the sensory representations within our brain. This “reconstruction” is performed through the electrical activation of neural cells – the code that carries the information the brain constantly processes. If we wish to understand the rules by which the brain represents the world, we have to understand how electrical activation is linked to sensory experience. For this reason, a team of researchers including Mathew Diamond, Houman Safaai and Moritz von Heimendahl of the International School for Advanced Studies (SISSA) in Trieste analyzed the behavior and the activation of neural networks in rats while the animals carried out tactile object-recognition tests.
During the experiments researchers observed the performance of rats – the animals were discriminating one texture from another – along with the activation of a group of sensory neurons. “For the first time the study has monitored the activity of multiple neurons, while until now, due to technical limitations, researchers had examined only individual neurons,” explains Diamond, who heads up the Tactile Perception and Learning Lab at SISSA. “The activity of such groups of neurons is represented in our model as multi-dimensional clouds, comprising as many dimensions as the number of cells under examination (up to ten). We have observed a different cloud for the contact with each different texture.”
By analyzing the “clouds”, Diamond and his colleagues were able to successfully decode the object contacted by the rodent. “Our method is so accurate that when the rat would mistake one object for another, the decoding would also indicate a different object from the one actually touched. And this happened because the representation made by the brain – and, as a consequence, our decoding – appeared like that of a different object. Hence the error.”
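The decoding approach Diamond describes – treating each texture’s population response as a cloud of points and asking which cloud a new response falls into – can be illustrated with a toy nearest-centroid classifier. The neuron count, firing rates and noise level below are invented for illustration; they are not the study’s actual data or decoding method.

```python
import math
import random

random.seed(0)

# Hypothetical mean firing rates (spikes/s) for a 4-neuron population;
# each texture evokes a different "cloud" centre in 4-D space.
TEXTURES = {
    "coarse": [20.0, 5.0, 12.0, 30.0],
    "smooth": [8.0, 15.0, 25.0, 10.0],
}

def simulate_trial(texture, noise=2.0):
    """One trial: the texture's mean response plus Gaussian noise."""
    return [m + random.gauss(0, noise) for m in TEXTURES[texture]]

def centroid(trials):
    """Centre of a cloud of population responses."""
    n = len(trials)
    return [sum(t[i] for t in trials) / n for i in range(len(trials[0]))]

def decode(trial, centroids):
    """Nearest-centroid decoding: report the texture whose cloud centre
    is closest (Euclidean distance) to the observed response."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda name: dist(trial, centroids[name]))

# Estimate each cloud's centre from 50 training trials per texture.
centroids = {name: centroid([simulate_trial(name) for _ in range(50)])
             for name in TEXTURES}

# Decode 200 fresh test trials and measure accuracy.
correct = sum(decode(simulate_trial(name), centroids) == name
              for name in TEXTURES for _ in range(100))
accuracy = correct / 200
```

In this sketch the two clouds are well separated, so decoding is nearly perfect; in real recordings the clouds overlap, and trials that land closer to the wrong cloud are exactly the trials on which the decoder – and, per the study, the animal – makes an error.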
Diamond’s team has no intention of stopping here. “In real life, we generally recognize objects using more senses all together, in an integrated manner. We use touch and sight at the same time, for instance,” explains Diamond. “For this reason we are now working on new experiments employing more neurons, with more complicated stimuli, and more senses, to build ‘multimodal’ representations of objects.”
In more detail…
This kind of “mind reading” carried out on rats’ brains by Diamond and his colleagues is important for understanding how the brain forms a representation of the world. “Each one of us perceives a physical world outside ourselves, yet all we actually have at our disposal to create an experience of the world is the representation that our brain makes of it through the input of the sensory organs,” says Diamond.
To see that such a representation is at the very least partial, it is enough to think of all the information about the world that escapes us all the time: we are blind to infrared and ultraviolet light, for instance, and unable to hear certain sound frequencies or smell certain chemical substances. Some details of the physical world are completely invisible – or, to put it better, imperceptible – while others are interpreted incorrectly, as in visual illusions.
This is a further demonstration that what we perceive is not the physical world in itself, but the neuronal activation the world evokes inside our brain.
Deep brain stimulation (DBS) in a precise region of the brain appears to reduce caloric intake and prompt weight loss in obese animal models, according to a new study led by researchers at the University of Pennsylvania. The study, reported in the Journal of Neuroscience, reinforces the involvement of dopamine deficits in increasing obesity-related behaviors such as binge eating, and demonstrates that DBS can reverse this response via activation of the dopamine type-2 receptor.
"Based on this research, DBS may provide therapeutic relief to binge eating, a behavior commonly seen in obese humans, and frequently unresponsive to other approaches," said senior author Tracy L. Bale, PhD, associate professor of neuroscience in Penn’s School of Veterinary Medicine’s Department of Animal Biology and in the Perelman School of Medicine’s Department of Psychiatry. DBS is currently used to reduce tremors in Parkinson’s disease and is under investigation as a therapy for major depression and obsessive-compulsive disorder.
Nearly 50 percent of obese people binge eat, uncontrollably consuming palatable, highly caloric food within a short period of time. In this study, researchers targeted the nucleus accumbens, a small structure in the brain reward center known to be involved in addictive behaviors. Mice receiving the stimulation ate significantly less of the high-fat food compared to mice not receiving DBS. Following stimulation, mice did not compensate for the loss of calories by eating more. However, on days when the device was turned off, binge eating resumed.
Researchers also tested the long-term effects of DBS on obese mice that had been given unlimited access to high-fat food. During four days of continuous stimulation, the obese mice consumed fewer calories and, importantly, their body weight dropped. These mice also showed improvement in their glucose sensitivity, suggestive of a reversal of type 2 diabetes.
“These results are our best evidence yet that targeting the nucleus accumbens with DBS may be able to modify specific feeding behaviors linked to body weight changes and obesity,” Bale added.
“Once replicated in human clinical trials, DBS could rapidly become a treatment for people with obesity due to the extensive groundwork already established in other disease areas,” said lead author Casey Halpern, MD, resident in the Department of Neurosurgery of the Perelman School of Medicine at the University of Pennsylvania.
(Source: uphs.upenn.edu)

ALS trial shows novel therapy is safe
An investigational treatment for an inherited form of Lou Gehrig’s disease has passed an early phase clinical trial for safety, researchers at Washington University School of Medicine in St. Louis and Massachusetts General Hospital report.
The researchers have shown that the therapy produced no serious side effects in patients with the disease, also known as amyotrophic lateral sclerosis (ALS). The phase 1 trial’s results, available online in Lancet Neurology, also demonstrate that the drug was successfully introduced into the central nervous system.
The treatment uses a technique that shuts off the mutated gene that causes the disease. This approach had never been tested against a condition that damages nerve cells in the brain and spinal cord.
“These results let us move forward in the development of this treatment and also suggest that it’s time to think about applying this same approach to other mutated genes that cause central nervous system disorders,” says lead author Timothy Miller, MD, PhD, assistant professor of neurology at Washington University. “These could include some forms of Alzheimer’s disease, Parkinson’s disease, Huntington’s disease and other conditions.”
ALS destroys nerves that control muscles, gradually leading to paralysis and death. For treatment of the disease, the sole FDA-approved medication, Riluzole, has only a marginal effect.
Most cases of ALS are sporadic, but about 10 percent are linked to inherited mutations. Scientists have identified changes in 10 genes that can cause ALS and are still looking for others.
The study focused on a form of ALS caused by mutations in a gene called SOD1, which account for 2 percent of all ALS cases. Researchers have found more than 100 mutations in the SOD1 gene that cause ALS.
“At the molecular level, these mutations affect the properties of the SOD1 protein in a variety of ways, but they all lead to ALS,” says Miller, who is director of the Christopher Wells Hobler Lab for ALS Research at the Hope Center for Neurological Disorders at Washington University.
Rather than try to understand how each mutation causes ALS, Miller and his colleagues focused on blocking production of the SOD1 protein using a technique called antisense therapy.
To make a protein, cells have to copy the protein-building instructions from the gene. Antisense therapy blocks the cell from using these copies, allowing researchers to selectively silence individual genes.
“Antisense therapy has been considered and tested for a variety of disorders over the past several decades,” Miller says. “For example, the FDA recently approved an antisense therapy called Kynamro for familial hypercholesterolemia, an inherited condition that increases cholesterol levels in the blood.”
Miller and colleagues at the University of California-San Diego devised an antisense drug for SOD1 and successfully tested it in an animal model of the disease.
Merit Cudkowicz, MD, chief of neurology at Massachusetts General Hospital, was co-PI of the phase 1 clinical safety trial described in the new paper. Clinicians at Barnes-Jewish Hospital, Massachusetts General Hospital, Johns Hopkins Hospital and the Methodist Neurological Institute in Houston gave antisense therapy or a placebo to 21 patients with SOD1-related ALS. Treatment consisted of spinal infusions that lasted 11 hours.
The scientists found no significant difference between side effects in the control and treatment groups. Headache and back pain, both of which are often associated with spinal infusion, were among the most common side effects.
Immediately after the injections, the researchers took spinal fluid samples. This let them confirm the antisense drug was circulating in the spinal fluid of patients who received the treatment.
To treat SOD1-related ALS in the upcoming phase II trial, researchers will need to increase the dosage of the antisense drug. As the dose rises, they will watch to ensure that the therapy does not cause harmful inflammation or other side effects as it lowers SOD1 protein levels.
“All the information that we have so far suggests lowering SOD1 will be safe,” Miller says. “In fact, completely disabling SOD1 in mice seems to have little to no effect. We think it will be OK in patients, but we won’t know for sure until we’ve conducted further trials.”
The therapy may one day be helpful in the more common, noninherited forms of ALS, some of which may be linked to problems with the SOD1 protein.
“Before we can consider using this same therapy for sporadic ALS, we need more evidence that SOD1 is a major contributor to these forms of the disorder,” Miller says.
The trial was conducted with support from ISIS Pharmaceuticals, which co-owns a patent on the SOD1 antisense drug.
Scientists from King’s College London have identified patterns of epigenetic changes involved in autism spectrum disorder (ASD) by studying genetically identical twins who differ in autism traits.

The study, published in Molecular Psychiatry, is the largest of its kind and may shed light on the biological mechanism by which environmental influences regulate the activity of certain genes and in turn contribute to the development of ASD and related behaviour traits.
ASD affects approximately 1 in 100 people in the UK and involves a spectrum of disorders which manifest themselves differently in different people. People with ASD have varying levels of impairment across three common areas: deficits in social interactions and understanding, repetitive behaviour and interests, and impairments in language and communication development.
Evidence from twin studies shows there is a strong genetic component to ASD and previous studies suggest that genes that direct brain development may be involved in the disorder. In approximately 70% of cases, when one identical twin has ASD, so does the other. However, in 30% of cases, identical twins differ for ASD. Because identical twins share the same genetic code, this suggests non-genetic, or epigenetic, factors may be involved.
Epigenetic changes affect the expression or activity of genes without changing the underlying DNA sequence – they are believed to be one mechanism by which the environment can interact with the genome. Importantly, epigenetic changes are potentially reversible and may therefore provide targets for the development of new therapies.
The researchers studied an epigenetic mechanism called DNA methylation. DNA methylation acts to block the genetic sequences that drive gene expression, silencing gene activity. They examined DNA methylation at over 27,000 sites across the genome using samples taken from 50 identical twin pairs (100 individuals) from the UK Medical Research Council (MRC) funded Twins Early Development Study (TEDS): 34 pairs who differed for ASD or autism related behaviour traits, 5 pairs where both twins have ASD, and 11 healthy twin pairs.
Dr Chloe Wong, first author of the study from King’s College London’s Institute of Psychiatry, says: “We’ve identified distinctive patterns of DNA methylation associated with both autism diagnosis and related behaviour traits, and increasing severity of symptoms. Our findings give us an insight into the biological mechanism mediating the interaction between gene and environment in autism spectrum disorder.”
DNA methylation at some genetic sites was consistently altered for all individuals with ASD, and differences at other sites were specific to certain symptom groups. The number of affected DNA methylation sites across the genome was also linked to the severity of autism symptoms, suggesting a quantitative relationship between the two. Additionally, some of the differences in DNA methylation markers were located in genetic regions that previous research has associated with early brain development and ASD.
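The discordant-twin design can be illustrated with a toy within-pair comparison. Because identical twins share a genome, the methylation difference between the affected and unaffected twin at each site isolates non-genetic variation, and sites that differ consistently across many pairs are candidate ASD-associated marks. The site count, effect size and noise level below are invented for illustration.

```python
import random
import statistics

random.seed(1)

N_PAIRS = 34      # twin pairs discordant for ASD traits, as in the study
N_SITES = 100     # toy subset of the ~27,000 methylation sites assayed

# At a few invented "signal" sites, the affected twin is systematically
# hypermethylated; elsewhere the twins differ only by measurement noise.
SIGNAL_SITES = {3, 42, 77}

def simulate_pair():
    """Beta values (fraction methylated, 0..1) for one discordant pair."""
    unaffected = [random.betavariate(5, 5) for _ in range(N_SITES)]
    affected = [min(1.0, b + (0.15 if s in SIGNAL_SITES else 0.0)
                    + random.gauss(0, 0.02))
                for s, b in enumerate(unaffected)]
    return affected, unaffected

pairs = [simulate_pair() for _ in range(N_PAIRS)]

# Within-pair difference (affected minus unaffected) at each site,
# averaged across pairs; genetics cancels out within a pair.
def mean_diff(site):
    return statistics.mean(a[site] - u[site] for a, u in pairs)

# Rank sites by the magnitude of the mean within-pair difference.
ranked = sorted(range(N_SITES), key=lambda s: abs(mean_diff(s)),
                reverse=True)
top3 = set(ranked[:3])
```

Ranking sites by the mean within-pair difference recovers the three simulated “signal” sites; the actual study assayed ~27,000 sites and used formal statistics with multiple-testing correction rather than a simple ranking.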
Professor Jonathan Mill, lead author of the paper from King’s College London’s Institute of Psychiatry and the University of Exeter, says: “Research into the intersection between genetic and environmental influences is crucial because risky environmental conditions can sometimes be avoided or changed. Epigenetic changes are potentially reversible, so our next step is to embark on larger studies to see whether we can identify key epigenetic changes common to the majority of people with autism to help us develop possible therapeutic interventions.”
Dr Alycia Halladay, Senior Director of Environmental and Clinical Sciences from Autism Speaks who funded the research, says: “This is the first large-scale study to take a whole genome approach to studying epigenetic influences in twins who are genetically identical but have different symptoms. These findings open the door to future discoveries in the role of epigenetics – in addition to genetics – in the development of autism symptoms.”
(Source: kcl.ac.uk)
Brain biology tied to social reorientation during entry to adolescence
A specific region of the brain is in play when children consider their identity and social status as they transition into adolescence — that often-turbulent time of reaching puberty and entering middle school, says a University of Oregon psychologist.
In a study of 27 neurologically typical children who underwent functional magnetic resonance imaging (fMRI) at ages 10 and 13, activity in the brain’s ventromedial prefrontal cortex increased dramatically when the subjects responded to questions about how they view themselves.
The findings, published in the April 24 issue of the Journal of Neuroscience, confirm previous findings that specific brain networks support self-evaluations in the growing brain, but, more importantly, provide evidence that basic biology may well drive some of these changes, says Jennifer H. Pfeifer, professor of psychology and director of the psychology department’s Developmental Social Neuroscience Lab.
"This is a longitudinal fMRI study, which is still relatively uncommon," Pfeifer said. "It suggests a link between neural responses during self-evaluative processing in the social domain, and pubertal development. This provides a rare piece of empirical evidence in humans, rather than animal models, that supports the common theory that adolescents are biologically driven to go through a social reorientation."
Participants were scanned for about seven minutes at each visit. They responded to a series of attributes tied to social or academic domains — social ones such as “I am popular” or “I wish I had more friends” and academic ones such as “I like to read just for fun” or “Writing is so boring.” Social and academic evaluations were made about both the self and a familiar fictional character, Harry Potter.
In previous research, Pfeifer had found that a more dorsal region of the medial prefrontal cortex was more responsive in 10-year-old children during self-evaluations, when they were compared to adults. The new study, she said, provides a more detailed picture of how the brain supports self-development by looking at change within individuals.
The fMRI analyses found it was primarily the social self-evaluations that triggered significant increases over time in blood-oxygen levels, which fMRI detects, in the ventral medial prefrontal cortex. Additionally, these increases were strongest in children who experienced the most pubertal development over the three-year study period, for both girls and boys. Increases during academic self-evaluations were at best marginal. Whole-brain analyses found no other areas of the brain had significant increases or decreases in activity related to pubertal development.
"Neural changes in the social domain were more robust," Pfeifer said. "Increased responses in this one region of the brain from age 10 to 13 were very evident in social self-evaluations, but not academic ones. This pattern is consistent with the enormous importance that most children entering adolescence place on their peer relationships and social status, compared to the relatively diminished value often associated with academics during this transition."
In youth with autism spectrum disorders, this specialized response in ventral medial prefrontal cortex is missing, she added, citing a paper she co-authored in the February 2013 issue of the Journal of Autism and Developmental Disorders and a complementary study led by Michael V. Lombardo, University of Cambridge, in the February 2010 issue of the journal Brain. The absence of this typical effect, Pfeifer said, might be related to the challenges these individuals often face in both self-understanding and social relations.
"Dr. Pfeifer’s research examining self-evaluations during adolescence adds significantly to the intricate puzzle of this turbulent age period," said Kimberly Andrews Espy, vice president for research and innovation and dean of the graduate school. "Researchers at the University of Oregon are piecing together how both biology and the environment dynamically and interactively support healthy social development."
New IBN Peptides May Help Researchers Combat Alzheimer’s, Diabetes and Cancer
Amyloids, or fibrous aggregates of abnormally folded proteins, are a common feature in degenerative diseases such as Alzheimer’s, diabetes and cancer. Amyloids occur naturally in the body, but despite decades of research, their mechanism of formation remains unknown, hampering drug development efforts. Now, a new class of ultrasmall peptides developed by the Institute of Bioengineering and Nanotechnology (IBN) offers scientists a platform for understanding this phenomenon, providing them with the insights required to design more effective treatments for these diseases.
IBN Executive Director Professor Jackie Y. Ying said, “Our researchers have been focusing on creating biomimetic materials for nanomedicine and cell and tissue engineering applications. The novel ultrasmall peptides developed by IBN are not only highly effective as synthetic cell culture substrates, but also as a model for studying the mystery of amyloid formation. Such fundamental understanding could contribute towards advancing medical treatment of amyloid-related disorders.”
First discovered in 2011 by IBN Team Leader and Principal Research Scientist Dr Charlotte Hauser, the peptides were formed from only 3-7 amino acids, making them the smallest ever reported class of self-assembling aliphatic compounds. Peptides perform a wide range of functions in the body, and are distinguished from proteins based on size. Building on this earlier research, IBN researchers have found a striking similarity between the structure of their synthetic peptides and the protein structure of naturally occurring amyloids in the latest study published in Proceedings of the National Academy of Sciences.
Dr Hauser elaborated, “This is the first proof-of-concept that our peptides self-assemble in the same way as naturally occurring amyloid sequences. Knowing that the process of amyloid formation is common across various chronic degenerative diseases, our goal is to identify the specific trigger so that we can design the appropriate drugs to inhibit and control the aggregate formation.”
The IBN team collaborated with researchers from the Institute of High Performance Computing and the European Synchrotron Radiation Facility to validate their peptides with the core protein sequences of three diseases: Alzheimer’s, diabetes and thyroid cancer.

The results revealed that the mechanism behind the self-assembly of amyloids from smaller intermediate structures into larger amyloid structures was similar to how the IBN peptides were formed. In addition, this study supports the growing evidence that early intermediates are more toxic than the final amyloid fibers, and may even be the driving force behind amyloid formation.
Patent applications have been filed on this research, and the next step of this project is pre-clinical evaluation of ultrasmall peptide therapeutics. IBN will also investigate other amyloid disorders such as corneal dystrophy, which can result in blindness.
(Source: a-star.edu.sg)
Scientific progress in Huntington’s disease (HD) relies upon the availability of appropriate animal models that enable insights into the disease’s genetics and/or pathophysiology. Large animal models, such as domesticated farm animals, offer some distinct advantages over rodent models, including a larger brain that is amenable to imaging and intracerebral therapy, longer lifespan, and a more human-like neuro-architecture. Three articles in the latest issue of the Journal of Huntington’s Disease discuss the potential benefits of using large animal models in HD research and the implications for the development of gene therapy.
A review by Morton and Howland explores the advantages and drawbacks of small and large animal models of HD. In the same issue, Baxa et al. highlight the development of a transgenic minipig HD model that expresses a human mutant huntingtin (HTT) fragment throughout the central nervous system (CNS) and peripheral tissues and manifests neurochemical and reproductive changes with age. In another report, Van der Bom et al. describe a technique employing CT and MRI that allows precise intracerebral application of therapeutics to transgenic HD sheep.
Huntington’s disease (HD) is an inherited progressive neurological disorder for which there is presently no effective treatment. It is caused by a single dominant gene mutation – an expanded CAG repeat in the HTT gene – leading to expression of mutant HTT protein. Expression of mutant HTT causes changes in cellular functions, which ultimately result in uncontrollable movements, progressive psychiatric difficulties, and loss of mental abilities.
The search for new large animal models of HD arises from the recognition that there are some practical limitations of rodent and other small animal models. Because neurodegenerative diseases like HD progress over a lifetime, a rodent’s short life span excludes the possibility of studying long-term changes. There are also important anatomic differences between the brains of humans and rodents that become especially relevant when studying HD, including the lack of a gyrencephalic (convoluted) cortex and differences in the structure and cellular characteristics of the basal ganglia compared to humans. Not only does a rodent’s small brain often preclude the use of advanced neuroimaging techniques, it is also not clear how intracerebral application of trophic factors, transplant therapies, and gene therapies in small animals might translate to the much larger human brain.
"Importantly, the brains of large animals can be studied using sensitive measures that should be highly translatable to the human condition, including MRI and PET imaging, EEG, and electrophysiology, as well as behavioral tests looking at motor and cognitive function," says Professor Jenny Morton, PhD, of the Department of Physiology, Development and Neuroscience at the University of Cambridge. "Moving to larger-brained animal models after promising results are obtained in rodents is a logical, and possibly necessary, step to optimize delivery and biodistribution, validate on-target mechanism of action, and assess safety profiles," says Professor Morton.
"Strategies directed against the huntingtin gene in the brain are an important part of CHDI’s therapeutic portfolio", says David Howland, PhD, Director of Model Systems at CHDI. "Translating preclinical results for gene-based therapies from rodent models to larger-brained models of HD is an important step along the path toward clinical testing."
Significant advances have been made in the creation and characterization of HD models in nonhuman primates (NHP). “The relevance to human biology of NHP models of Huntington’s disease holds great potential value for preclinical research and development, but we need to fully consider the substantial issues of cost, long-term housing of affected animals, access of the models to HD investigators, and ethical concerns with modeling in these species,” says Dr Howland. “CHDI has invested in efforts to expand modeling in large animals to include sheep and minipigs to work around some of these concerns about NHP models.”
Large domesticated farm animals offer some distinct advantages as models of HD. Sheep, for example, are domesticated, docile, live outdoors, are easy to care for, and relatively economical to maintain. A sheep’s brain is about the same size as a large primate’s, is gyrencephalic, and the basal ganglia that degenerate in HD are anatomically similar to those in humans. Sheep live long enough that the time available for studying progressive neurological diseases such as HD is much greater than is possible in rodents. HD transgenic sheep express HTT protein in the brain and abnormal HD-associated neurochemical changes. These HD sheep have been subject to advanced genomic techniques and, because they carry a human transgene that is expressed at both an mRNA and protein level, they are seen as suitable for testing gene therapy-based reagents directed against human HTT. A further advantage, says Professor Morton, is that “although sheep have a reputation for being stupid, this is probably undeserved – they have very good memories and are capable of learning and remembering new tasks.”
In order to advance the use of the HD sheep model, I.M.J. van der Bom, PhD, from the Department of Radiology at the University of Massachusetts, and colleagues developed a multi-modal technique using skull markings seen with CT imaging and brain anatomy from MR imaging to allow more precise placement of intracerebral cannulae into sheep brain. The technique offers the ability to directly image micro-cannula placement to ensure accurate targeting of the therapeutic injection in the brain. With this technique, the authors hope to study the extent of optimal safety, spread and neuronal uptake of adeno-associated virus (AAV) based therapeutics.
"Pigs, and mainly minipigs, represent a viable model for preclinical drug trials and long-term safety studies," says Jan Motlik, DVM, PhD, DSc, from the Laboratory of Cell Regeneration and Plasticity of the Institute of Animal Physiology and Genetics in Libechov, Czech Republic. Advantages include the minipig’s large brain size and long lifespan. Genetic advances have been made, including defining the porcine genome, with a 96% similarity between the porcine and human huntingtin genes. In addition to well-established methods for pig husbandry, minipigs are economical to house and have body systems very similar to those of humans.
In the report by Baxa et al., a new HD minipig model using lentiviral infection of porcine embryos is described. The authors report that they successfully developed a heterozygote transgenic HD minipig that expresses a human mutant HTT fragment throughout the CNS and peripheral tissues through 4 successive generations. The model produces viable offspring, with a total neonatal mortality rate of 17%. The authors reported that one affected HD minipig showed a decline beginning at 16 months of a neuronal phosphoprotein, DARPP32, in the neostriatum, the brain region most affected by HD. A loss of fertility, possibly HD related, was also found.
(Source: news.bio-medicine.org)
'Clean' your memory to pick a winner
Predicting the winner of a sporting event with accuracy close to that of a statistical computer programme could be possible with proper training, according to researchers.
In a study published today, experiment participants who had been trained on statistically idealised data vastly improved their ability to predict the outcome of a baseball game.
In normal situations, the brain selects a limited number of memories to use as evidence to guide decisions. As real-world events do not always have the most likely outcome, retrieved memories can provide misleading information at the time of a decision.
Now, researchers at UCL and the University of Montreal have found a way to train the brain to accurately predict the outcome of an event, for example a baseball game, by giving subjects idealised scenarios that always conform to statistical probability.
Dr Bradley Love (UCL Department of Cognition, Perception and Brain Sciences), lead author of study, said: “Providing people with idealized situations, as opposed to actual outcomes, ‘cleans’ their memory and provides a stock of good quality evidence for the brain to use.”
In the study, published in Proceedings of the National Academy of Sciences, researchers programmed computers to use all available statistics to form a decision – making them more likely to predict the correct outcome. By using all data from previous sports leagues, the computer’s predictions always reflected the most likely outcome.
Next, researchers ‘trained’ the brains of participants by giving them a scenario which they had to predict the outcome of. Two groups of subjects, those given actual outcomes to situations and those given ideal outcomes were trained and then tested to compare their progress.
The scenarios consisted of games between two Major League baseball teams. Participants had to predict which team would win and were told if their prediction was correct. Those in the ‘actual’ group were told the true outcome of the game and those in the ‘ideal’ group were given fictional results.
Prior to participants’ predictions, the teams had been ranked in order based on their number of wins. For the ideal group, researchers changed the results of the match so the highest ranking team won regardless of the true outcome. This created ideal outcomes for the subjects as the best team always won, which of course does not happen in reality.
Participants in the experiment were tested by being asked to predict the outcomes for the rest of the matches played in the league, but they were not given feedback on their performance. Even though the ‘ideal’ group had been given incorrect data during training, they were significantly better at predicting the winner.
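The design can be illustrated with a toy simulation: a memory-based learner stores past game outcomes and predicts by majority vote over retrieved memories, and in the ‘idealized’ condition its memories are rewritten so the stronger team always won. The team strengths, game counts and the learner itself are invented for illustration; the study’s actual experimental and modelling details differ.

```python
import random

random.seed(2)

N_TEAMS = 6
# Hypothetical team strengths; lower index = higher-ranked, stronger team.
STRENGTH = [0.9, 0.75, 0.6, 0.45, 0.3, 0.15]

def play(i, j):
    """True if team i beats team j; stronger teams win more often,
    but upsets happen, as in real leagues."""
    p = STRENGTH[i] / (STRENGTH[i] + STRENGTH[j])
    return random.random() < p

def train(idealized, n_games=300):
    """Store memories of (matchup -> list of winners). In the idealized
    condition the higher-ranked team is always recorded as the winner,
    regardless of the actual result - the 'clean memory' manipulation."""
    memory = {}
    for _ in range(n_games):
        i, j = random.sample(range(N_TEAMS), 2)
        winner = min(i, j) if idealized else (i if play(i, j) else j)
        memory.setdefault((min(i, j), max(i, j)), []).append(winner)
    return memory

def predict(memory, i, j):
    """Majority vote over retrieved memories of this matchup;
    fall back to a coin flip if the matchup was never seen."""
    past = memory.get((min(i, j), max(i, j)), [])
    if not past:
        return random.choice([i, j])
    return max(set(past), key=past.count)

def accuracy(memory, n_tests=2000):
    """Fraction of fresh (noisy) games predicted correctly."""
    correct = 0
    for _ in range(n_tests):
        i, j = random.sample(range(N_TEAMS), 2)
        winner = i if play(i, j) else j
        correct += predict(memory, i, j) == winner
    return correct / n_tests

acc_actual = accuracy(train(idealized=False))
acc_ideal = accuracy(train(idealized=True))
```

Because the idealized memories are perfectly consistent, the ‘ideal’ learner always backs the stronger team – the best possible single prediction for each matchup – while the learner trained on actual, noisy outcomes sometimes retrieves misleading memories and flips its prediction on close matchups.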
Dr Love explained: “Unlike machine systems, people’s decisions are messy because they rely on whatever memories are retrieved by chance. One consequence is that people perform better when the training situation is idealised – a useful fiction that fits our cognitive limitations.”
Participants’ prediction abilities were compared to computer models that were either optimised for prediction or modelled on human brains. After ideal outcome training, the study showed that ‘ideal’ subjects had greatly enhanced their skills and were comparable with the optimised model when predicting baseball game outcomes.
The authors suggest that idealised real-world situations could be used to train professionals who rely on the ability to analyse and classify information. Doctors making diagnoses from X-rays, financial analysts and even those wanting to predict the weather could all benefit from the research.
The disrupted metabolism of sugar, fat and calcium is part of the process that causes the death of neurons in Alzheimer’s disease. Researchers from Karolinska Institutet in Sweden have now shown, for the first time, how important parts of the nerve cell that are involved in the cell’s energy metabolism operate in the early stages of the disease. These somewhat surprising results shed new light on how neuronal metabolism relates to the development of the disease.
In the Alzheimer’s disease brain, plaques consisting of so-called amyloid-beta peptide (Aβ) accumulate. It is also a well-known fact that the nerve cells of patients with Alzheimer’s disease have problems metabolising substances such as glucose and calcium, and that these disorders are associated with cell death. The metabolism of these substances is the job of the cell’s mitochondria, which serve as the cell’s power plant and supply it with energy.
However, for the mitochondria to do this, they need good contact with another part of the cell called the endoplasmic reticulum (ER). The specialised region of ER that is in contact with mitochondria is called the MAM region. Earlier studies on yeast and other types of cells have shown that the deactivation of certain proteins in the MAM region disrupts the contact points between the mitochondria and the ER, preventing the delivery of energy to the cell and causing cell death.
Now for the first time, researchers at Karolinska Institutet have studied the MAM region in nerve cells, and examined the interaction between the mitochondria and the ER in early stage Alzheimer’s disease. Although at this point in the development of the disease Aβ has not formed large, lumpy plaques, symptoms still appear, implying that Aβ that has not yet formed plaque is toxic to neurons.
The team’s results are slightly surprising. When nerve cells are exposed to low doses of Aβ, it leads to an increase in the number of contact points between the mitochondria and the ER, causing more calcium to be transferred from the ER to the mitochondria. The resulting over-accumulation of calcium is toxic to the mitochondria and affects their ability to supply energy to the nerve cell.
“It’s urgent that we find out what causes neuronal death if we’re to develop molecules that check the disease,” says Maria Ankarcrona, docent and researcher at the Department of Neurobiology, Care Sciences and Society, and the Alzheimer’s Disease Research Centre of Karolinska Institutet. “In the long run we might be able to produce a drug that can arrest the progress of the disease at a stage when the patient is still able to manage their daily lives. If we can extend that period by a number of years, we’d have made great gains. Today there are no drugs that affect the actual disease process.”
The researchers conducted their studies on mice bred to develop symptoms of Alzheimer’s disease. They also studied nerve cells from deceased Alzheimer’s patients and neurons cultivated in the laboratory.
(Source: alphagalileo.org)
Red Light Increases Alertness During “Post-Lunch Dip”
Acute or chronic sleep deprivation resulting in increased feelings of fatigue is one of the leading causes of workplace incidents and related injuries. More incidents and performance failures, such as automobile accidents, occur in the mid-afternoon hours known as the “post-lunch dip.” The post-lunch dip typically occurs from 2-4 p.m., or about 16-18 hours after an individual’s bedtime from the previous night.
A new study from the Lighting Research Center (LRC) at Rensselaer Polytechnic Institute shows that exposure to certain wavelengths and levels of light has the potential to increase alertness during the post-lunch dip. The research was a collaboration between Mariana Figueiro, LRC Light and Health Program director and associate professor at Rensselaer, and LRC doctoral student Levent Sahin. Results of the study, titled “Alerting effects of short-wavelength (blue) and long-wavelength (red) lights in the afternoon,” were recently published in the journal Physiology & Behavior.
The collaboration between Figueiro and Sahin lays the groundwork for the possible use of tailored light exposures as a non-pharmacological intervention to increase alertness during the daytime. Figueiro has previously conducted studies that show that light has the potential to increase alertness at night. Exposure to more than 2500 lux of white light at night increases performance, elevates core body temperature, and increases heart rate.
In most studies to date, the alerting effects of light have been linked to its ability to suppress melatonin. However, results from another study led by Figueiro demonstrate that acute melatonin suppression is not needed for light to affect alertness during the nighttime. They showed that both short-wavelength (blue) and long-wavelength (red) lights increased measures of alertness but only short-wavelength light suppressed melatonin. Melatonin levels are typically lower during the daytime, and higher at night.
Figueiro and Sahin hypothesized that if light can impact alertness via pathways other than melatonin suppression, then certain wavelengths and levels of light might also increase alertness during the middle of the afternoon, close to the post-lunch dip hours.
During the study conducted at the LRC, participants experienced two experimental lighting conditions in addition to darkness. Long-wavelength “red” light (λmax = 630 nanometers) and short-wavelength “blue” light (λmax = 470 nanometers) were delivered to the corneas of each participant by arrays of light emitting diodes (LEDs) placed in 60 × 60 × 60 cm light boxes. Participant alertness was measured by electroencephalogram (EEG) and subjective sleepiness (KSS scale).
The team found that, compared to remaining in darkness, exposure to red light in the middle of the afternoon significantly reduced power in the alpha, alpha-theta, and theta ranges. Because high power in these frequency ranges has been associated with sleepiness, these results suggest that red light positively affects measures of alertness not only at night, but also during the day. Red light also seemed to be a more potent stimulus for modulating brain activities associated with daytime alertness than blue light, although the researchers did not find any significant differences in measures of alertness after exposure to red and blue lights. This suggests that blue light, especially at higher levels, could still increase alertness in the afternoon. It appears that melatonin suppression is not needed for light to have an impact on objective measures of alertness.
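The alpha (roughly 8–12 Hz) and theta (roughly 4–8 Hz) "power" figures reported from EEG come from spectral analysis of the recorded voltage trace. The sketch below is a simplified illustration, not the LRC's analysis pipeline: it builds a synthetic two-component signal (the sampling rate, frequencies, and amplitudes are assumed for demonstration) and sums DFT power over each band's frequency bins; real EEG work would use averaged spectra (e.g. Welch's method) on recorded data.

```python
import math

FS = 128   # assumed sampling rate in Hz (illustrative)
N = 256    # 2 seconds of signal

# Synthetic "EEG": a 10 Hz alpha component plus a weaker 6 Hz theta component
signal = [math.sin(2 * math.pi * 10 * n / FS)
          + 0.5 * math.sin(2 * math.pi * 6 * n / FS)
          for n in range(N)]

def band_power(x, lo_hz, hi_hz, fs=FS):
    """Summed DFT power over the frequency bins in [lo_hz, hi_hz)."""
    n = len(x)
    power = 0.0
    for k in range(int(lo_hz * n / fs), int(hi_hz * n / fs)):
        # Direct DFT of bin k (only the bins in the band are needed)
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power += (re * re + im * im) / n
    return power

alpha = band_power(signal, 8, 12)   # alpha band: 8-12 Hz
theta = band_power(signal, 4, 8)    # theta band: 4-8 Hz
print(f"alpha power: {alpha:.1f}, theta power: {theta:.1f}")
```

In the study's framing, a lighting condition that lowers these band powers relative to darkness is read as increased alertness; here the 10 Hz component dominates, so the alpha band carries more power than the theta band.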
“Our study suggests that photoreceptors other than the intrinsically photosensitive retinal ganglion cells respond to light for the arousal system,” said Figueiro. “Future research should look into the spectral sensitivity of alertness and how it changes over the course of 24 hours.”
Sahin, who has more than 10 years of experience in railway engineering, was interested in this study from a transportation safety perspective, and in what the results could mean for the transportation industry. “Safety is a prerequisite and one of the most important quality indicators in the transportation industry,” said Sahin. “Our recent findings provided the scientifically valid underpinnings for approaching fatigue-related safety problems in 24-hour transportation operations.”
From the present results, it is not possible to determine the underlying mechanisms contributing to light-induced changes in alertness because the optical radiation incident on the retina has multiple effects on brain activity through parallel neural pathways. According to Figueiro, that is an area that she would like to explore in future research.