Posts tagged memory
RMIT University researchers have brought ultra-fast, nano-scale data storage within striking reach, using technology that mimics the human brain.
The researchers have built a novel nano-structure that offers a new platform for the development of highly stable and reliable nanoscale memory devices.
The pioneering work will feature on a forthcoming cover of prestigious materials science journal Advanced Functional Materials (11 November).
Project leader Dr Sharath Sriram, co-leader of the RMIT Functional Materials and Microsystems Research Group, said the nanometer-thin stacked structure was created using a functional oxide thin film more than 10,000 times thinner than a human hair.
“The thin film is specifically designed to have defects in its chemistry to demonstrate a ‘memristive’ effect – where the memory element’s behaviour is dependent on its past experiences,” Dr Sriram said.
“With flash memory rapidly approaching fundamental scaling limits, we need novel materials and architectures for creating the next generation of non-volatile memory.
“The structure we developed could be used for a range of electronic applications – from ultrafast memory devices that can be shrunk down to a few nanometers, to computer logic architectures that replicate the versatility and response time of a biological neural network.
“While more investigation needs to be done, our work advances the search for next-generation memory technology that can replicate the complex functions of the human neural system – bringing us one step closer to the bionic brain.”
The research relies on memristors, touted as a transformational replacement for current storage and memory technologies such as flash, solid-state drives and DRAM. Memristors have the potential to be fashioned into non-volatile solid-state memory and offer building blocks for computing that could be trained to mimic synaptic interfaces in the human brain.
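The defining property described above – a memory element whose behaviour depends on its past experiences – can be illustrated with a toy model. The sketch below is purely illustrative and is not the RMIT device or its physics; it simply shows a resistance that depends on the history of voltage applied to the element, so the state persists without power:

```python
# Toy memristive element: its resistance depends on the drive it has
# experienced in the past. Illustrative only -- not the RMIT device model.

class ToyMemristor:
    def __init__(self, r_on=100.0, r_off=16000.0):
        self.r_on = r_on      # low-resistance ("on") state, ohms
        self.r_off = r_off    # high-resistance ("off") state, ohms
        self.x = 0.0          # internal state in [0, 1], set by history

    def apply_voltage(self, v, dt=1e-3, k=10.0):
        """Each pulse shifts the internal state; the sign of the
        voltage pushes the element toward the on or off state."""
        self.x = min(1.0, max(0.0, self.x + k * v * dt))

    def resistance(self):
        # The resistance interpolates between the two states, so it
        # encodes the element's voltage history (non-volatile memory).
        return self.r_off + (self.r_on - self.r_off) * self.x

m = ToyMemristor()
before = m.resistance()
for _ in range(50):          # repeated positive pulses "write" the element
    m.apply_voltage(1.0)
after = m.resistance()
assert after < before        # the element remembers its past drive
```

Reading the resistance back is non-destructive, which is why such elements are candidates for non-volatile memory cells and for synapse-like weights in brain-inspired circuits.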
(Image caption: In the two brain regions IPF (lateral prefrontal cortex) and V4, a region of the visual system, the brain activity oscillates in a specific frequency range. Credit: © Stefanie Liebe, MPI for biological Cybernetics)
School children and university students are often big fans of short-term memory – not least when they have to cram large volumes of information on the eve of an exam. Although its duration is brief, short-term memory relies on a complex network of neurons spanning different brain regions. To store information, these regions must work together. Researchers from the Max Planck Institute for Biological Cybernetics in Tübingen have now discovered that the participating regions must be active at the same time to enable us to form short-term memories of things that happen.
When we see something, signals from the eyes are processed in areas of the cerebral cortex located at the back of the head. For short-term memory, in contrast, regions in the front part of the cerebral cortex must be active. In order for us to remember something we have seen briefly, these far-apart regions of the brain must collate their information.
How this works can currently only be examined in nonhuman primates. Scientists from Nikos Logothetis’s Department at the Max Planck Institute for Biological Cybernetics in Tübingen measured the electrical activity in a visual region and in the front area of the brain while the animals had to remember different images.
In the process, the scientists observed rhythmic electrical activity, known as theta-band oscillations, in the two brain regions. Surprisingly, these oscillations did not arise independently but were synchronous. The more synchronously active the regions were, the better the animals were able to remember an image.
Accordingly, the functioning of short-term memory can be envisaged as two revolving doors: While the memory is at work, the two doors move in time with each other and, in this way, facilitate the more effective exchange of information.
The study shows how important synchronised brain oscillations are for the communication between the different regions of the brain. Almost all higher intellectual capacities result from the complex interplay of specialised neuronal networks in different parts of the brain.
New research by scientists at the University of Kentucky’s Sanders-Brown Center on Aging suggests that people who notice their memory is slipping may be on to something.
The research, led by Richard Kryscio, Ph.D., chair of the Department of Biostatistics and associate director of the Alzheimer’s Disease Center at UK, appears to confirm that self-reported memory complaints are strong predictors of clinical memory impairment later in life.
Kryscio and his group asked 531 people with an average age of 73 and free of dementia if they had noticed any changes in their memory in the prior year. The participants were also given annual memory and thinking tests for an average of 10 years. After death, participants’ brains were examined for evidence of Alzheimer’s disease.
During the study, 56 percent of the participants reported changes in their memory, at an average age of 82. The study found that participants who reported changes in their memory were nearly three times more likely to develop memory and thinking problems. About one in six participants developed dementia during the study, and 80 percent of those had first reported memory changes.
"What’s notable about our study is the time it took for the transition from self-reported memory complaint to dementia or clinical impairment — about 12 years for dementia and nine years for clinical impairment — after the memory complaints began," Kryscio said. "That suggests that there may be a significant window of opportunity for intervention before a diagnosable problem shows up."
Kryscio points out that while these findings add to a growing body of evidence that self-reported memory complaints can be predictive of cognitive impairment later in life, there isn’t cause for immediate alarm if you can’t remember where you left your keys.
"Certainly, someone with memory issues should report it to their doctor so they can be followed. Unfortunately, however, we do not yet have preventative therapies for Alzheimer’s disease or other illnesses that cause memory problems."
The research, which was supported by grants from the National Institutes of Health, the National Institute on Aging, and the National Center for Advancing Translational Sciences, was published in the Sept. 24, 2014, online issue of Neurology.
Xanthohumol, a type of flavonoid found in hops and beer, has been shown in a new study to improve cognitive function in young mice, but not in older animals.
The research was just published in Behavioural Brain Research by scientists from the Linus Pauling Institute and College of Veterinary Medicine at Oregon State University. It’s another step toward understanding, and ultimately reducing, the degradation of memory that happens with age in many mammalian species, including humans.
Flavonoids are compounds found in plants that often give them their color. The study of them – whether in blueberries, dark chocolate or red wine – has increased in recent years because of their apparent nutritional benefits for conditions ranging from cancer and inflammation to cardiovascular disease. Several have also been shown to be important in cognition.
Xanthohumol has been of particular interest because of possible value in treating metabolic syndrome, a condition associated with obesity, high blood pressure and other concerns, including age-related deficits in memory. The compound has been used successfully to lower body weight and blood sugar in a rat model of obesity.
The new research studied use of xanthohumol in high dosages, far beyond what could be obtained just by diet. At least in young animals, it appeared to enhance their ability to adapt to changes in the environment. This cognitive flexibility was tested with a special type of maze designed for that purpose.
“Our goal was to determine whether xanthohumol could affect a process we call palmitoylation, which is a normal biological process but in older animals may become harmful,” said Daniel Zamzow, a former OSU doctoral student and now a lecturer at the University of Wisconsin/Rock County.
“Xanthohumol can speed the metabolism, reduce fatty acids in the liver and, at least with young mice, appeared to improve their cognitive flexibility, or higher level thinking,” Zamzow said. “Unfortunately it did not reduce palmitoylation in older mice, or improve their learning or cognitive performance, at least in the amounts of the compound we gave them.”
Kathy Magnusson, a professor in the OSU Department of Biomedical Sciences, principal investigator with the Linus Pauling Institute and corresponding author on this study, said that xanthohumol continues to be of significant interest for its biological properties, as are many other flavonoids.
“This flavonoid and others may have a function in the optimal ability to form memories,” Magnusson said. “Part of what this study seems to be suggesting is that it’s important to begin early in life to gain the full benefits of healthy nutrition.”
It’s also important to note, Magnusson said, that the levels of xanthohumol used in this study were only possible with supplements. As a fairly rare micronutrient, the only normal dietary source of it would be through the hops used in making beer, and “a human would have to drink 2000 liters of beer a day to reach the xanthohumol levels we used in this research.”
In this and other work, Magnusson’s research has primarily focused on two subunits of the NMDA receptor, called GluN1 and GluN2B. Their decline with age appears to be related to the decreased ability to form and quickly recall memories.
In humans, many adults start to experience deficits in memory around the age of 50, and some aspects of cognition begin to decline around age 40, the researchers noted in their report.
Brain activity can be used to tell whether someone recognizes details they encountered in normal, daily life, which may have implications for criminal investigations and use in courtrooms, new research shows.
The findings, published in Psychological Science, a journal of the Association for Psychological Science, suggest that a particular brain wave, known as P300, could serve as a marker that identifies places, objects, or other details that a person has seen and recognizes from everyday life.
Research using EEG recordings of brain activity has shown that the P300 brain wave tends to be large when a person recognizes a meaningful item among a list of nonmeaningful items. Using P300, researchers can give a subject a test called the Concealed Information Test (CIT) to try to determine whether they recognize information that is related to a crime or other event.
Most studies investigating P300 and recognition have been conducted in lab settings that are far removed from the kinds of information a real witness or suspect might be exposed to. This new study marks an important advance, says lead researcher John B. Meixner of Northwestern University, because it draws on details from activities in participants’ normal, daily lives.
“Much like a real crime, our participants made their own decisions and were exposed to all of the distracting information in the world,” he explains.
“Perhaps the most surprising finding was the extent to which we could detect very trivial details from a subject’s day, such as the color of the umbrella that the participant had used,” says Meixner. “This precision is exciting for the future because it indicates that relatively peripheral crime details, such as physical features of the crime scene, might be usable in a real-world CIT — though we still need to do much more work to learn about this.”
To achieve a more realistic CIT, Meixner and co-author J. Peter Rosenfeld outfitted 24 college student participants with small cameras that recorded both video and sound — the students wore the cameras clipped to their clothes for 4 hours as they went about their day.
For half of the students, the researchers used the recordings to identify details specific to each person’s day, which became “probe” items for that person. The researchers also came up with corresponding, “irrelevant” items that the student had not encountered — if the probe item was a specific grocery store, for example, the irrelevant items might include other grocery stores.
For the other half of the students, the “probe” items related to details or items they had not encountered, but which were instead drawn from the recordings of other participants. The researchers wanted to simulate a real investigation, in which a suspect with knowledge of a crime would be shown the same crime-related details as a suspect who may have no crime-related knowledge.
The next day, all of the students returned to the lab and were shown a series of words that described different details or items (i.e., the probe and irrelevant items), while their brain activity was recorded via EEG.
The results showed that the P300 was larger for probe items than for irrelevant items, but only for the students who had actually seen or encountered the probe.
Further analyses revealed that P300 responses effectively distinguished probe items from irrelevant items on the level of each individual participant, suggesting that it is a robust and reliable marker of recognition.
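The per-participant logic can be sketched in miniature. The toy below uses synthetic numbers, not the study’s data or analysis code (real CIT analyses use bootstrap statistics on measured EEG amplitudes): if the average P300-like response to probe items reliably exceeds the response to irrelevant items, the participant is classified as recognizing the probes.

```python
# Illustrative sketch of a per-participant P300-based CIT decision.
# Synthetic amplitudes only -- not data or methods from the study.
import random

random.seed(1)

def simulate_amplitude(recognized):
    # Recognized (meaningful) items evoke a larger P300-like response.
    base = 8.0 if recognized else 3.0
    return base + random.gauss(0.0, 1.5)

probe = [simulate_amplitude(True) for _ in range(30)]
irrelevant = [simulate_amplitude(False) for _ in range(30)]

def mean(xs):
    return sum(xs) / len(xs)

# Simple decision rule: flag recognition when the probe mean clearly
# exceeds the irrelevant mean (real analyses test this statistically).
recognizes = mean(probe) - mean(irrelevant) > 1.0
```

A participant who never encountered the probe items would show probe responses drawn from the same distribution as the irrelevant items, so the difference of means would hover near zero and the rule would not flag recognition.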
These findings have implications for memory research, but they may also have real-world application in the domain of criminal law given that some countries, like Japan and Israel, use the CIT in criminal investigations.
“One reason that the CIT has not been used in the US is that the test may not meet the criteria to be admissible in a courtroom,” says Meixner. “Our work may help move the P300-based CIT one step closer to admissibility by demonstrating the test’s validity and reliability in a more realistic context.”
Meixner, Rosenfeld, and colleagues plan on investigating additional factors that may impact detection, including whether images from the recordings may be even more effective at eliciting recognition than descriptive words – preliminary data suggest this may be the case.
Brain inflammation can rapidly disrupt our ability to retrieve complex memories of similar but distinct experiences, according to UC Irvine neuroscientists Jennifer Czerniawski and John Guzowski.
Their study – which appears today in The Journal of Neuroscience – specifically identifies how immune system signaling molecules, called cytokines, impair communication among neurons in the hippocampus, an area of the brain critical for discrimination memory. The findings offer insight into why cognitive deficits occur in people undergoing chemotherapy and those with autoimmune or neurodegenerative diseases.
Moreover, since cytokines are elevated in the brain in each of these conditions, the work suggests potential therapeutic targets to alleviate memory problems in these patients.
“Our research provides the first link among immune system activation, altered neural circuit function and impaired discrimination memory,” said Guzowski, the James L. McGaugh Chair in the Neurobiology of Learning & Memory. “The implications may be beneficial for those who have chronic diseases, such as multiple sclerosis, in which memory loss occurs and even for cancer patients.”
What Guzowski found interesting is that increased cytokine levels in the hippocampus only affected complex discrimination memory, the type that lets us differentiate among generally similar experiences – what we did at work or ate at dinner, for example. A simpler form of memory processed by the hippocampus – which would be akin to remembering where you work – was not altered by brain inflammation.
In the study, Czerniawski, a UCI postdoctoral scholar, exposed rats to two similar but discernable environments over several days. They received a mild foot shock daily in one, making them apprehensive about entering that specific site. Once the rodents showed that they had learned the difference between the two environments, some were given a low dose of a bacterial agent to induce a neuroinflammatory response, leading to cytokine release in the brain. Those animals were then no longer able to distinguish between the two environments.
Afterward, the researchers explored the activity patterns of neurons – the primary cell type for information processing – in the rats’ hippocampi using a gene-based cellular imaging method developed in the Guzowski lab. In the rodents that received the bacterial agent (and exhibited memory deterioration), the networks of neurons activated in the two environments were very similar, unlike those in the animals not given the agent (whose memories remained strong). This finding suggests that cytokines impaired recall by disrupting the function of these specific neuron circuits in the hippocampus.
“The cytokines caused the neural network to react as if no learning had taken place,” said Guzowski, associate professor of neurobiology & behavior. “The neural circuit activity was back to the pattern seen before learning.”
The work may also shed light on a chemotherapy-related mental phenomenon known as “chemo brain,” in which cancer patients find it difficult to efficiently process information. UCI neuro-oncologists have found that chemotherapeutic agents destroy stem cells in the brain that would have become neurons for creating and storing memories.
Dr. Daniela Bota, who co-authored that study, is currently collaborating with Guzowski’s research group to see if brain inflammation may be another of the underlying causes of “chemo brain” symptoms.
She said they’re looking for a simple intervention, such as an anti-inflammatory or steroid drug, that could lessen post-chemo inflammation. Bota will test this approach on patients, pending the outcome of animal studies.
“It will be interesting to see if limiting neuroinflammation will give cancer patients fewer or no problems,” she said. “It’s a wonderful idea, and it presents a new method to limit brain cell damage, improving quality of life. This is a great example of basic science and clinical ideas coming together to benefit patients.”
Using a zebrafish model of a human genetic disease called neurofibromatosis (NF1), a team from the Perelman School of Medicine at the University of Pennsylvania has found that the learning and memory components of the disorder are distinct features that will likely need different treatment approaches. They published their results this month in Cell Reports.
NF1 is one of the most common inherited neurological disorders, affecting about one in 3,000 people. It is characterized by tumors, attention deficits, and learning problems. Most people with NF1 have symptoms before the age of 10. Therapies target Ras, a protein family that guides cell proliferation. The NF1 gene encodes neurofibromin, a very large protein with a small domain involved in Ras regulation.
Unexpectedly, the Penn team showed that some of the behavioral defects in mutant fish are not related to abnormal Ras, but can be corrected by drugs that affect another signaling pathway controlled by the small molecule cAMP. They used the zebrafish model of NF1 to show that memory defects – such as the recall of a learned task — can be corrected by drugs that target Ras, while learning deficits are corrected by modulation of the cAMP pathway. Overall, the team’s results have implications for potential therapies in people with NF1.
“We now know that learning and memory defects in NF1 are distinct and potentially amenable to drug therapy,” says co-senior author Jon Epstein, MD, chair of the department of Cell and Developmental Biology. “Our data convincingly show that memory defects in mutant fish are due to abnormal Ras activity, but learning defects are completely unaffected by modulation of these pathways. Rather these deficits are corrected with medicines that modulate cAMP.”
Over the last 20 years, zebrafish have become great models for studying development and disease. Like humans, zebrafish are vertebrates, and most of the genes required for normal embryonic development in zebrafish are also present in humans. When incorrectly regulated, these same genes often cause tumor formation and metastatic cancers.
Zebrafish have also become an ideal model for studying vertebrate neuroscience and behavior. In fact, co-senior author Michael Granato, PhD, professor of Cell and Developmental Biology, has developed the first high-throughput behavioral assays that measure learning and memory in fish. For example, Granato explains, “normal fish startle at changes in noise and light level by bending and swimming away from the annoying stimuli and do eventually habituate, that is, get used to the alterations in their environment. But NF1 fish mutants fail to habituate. However, after adding cAMP to their water, they do learn, and then behave like the non-mutant fish.”
This clearly indicates that learning deficits in the NF1 mutant fish are corrected by adding various substances that boost cAMP signaling. “Our data also indicate that learning and memory defects are reversible with acute pharmacologic treatments and are therefore not hard-wired, as might be expected for a defect in the development of nerves,” says Epstein. “This offers great hope for therapeutic intervention for NF1 patients.”
People with blood type AB may be more likely to develop memory loss in later years than people with other blood types, according to a study published in the September 10, 2014, online issue of Neurology®, the medical journal of the American Academy of Neurology.
AB is the least common blood type, found in about 4 percent of the U.S. population. The study found that people with AB blood were 82 percent more likely to develop the thinking and memory problems that can lead to dementia than people with other blood types. Previous studies have shown that people with type O blood have a lower risk of heart disease and stroke, factors that can increase the risk of memory loss and dementia.
The study was part of a larger study (the REasons for Geographic And Racial Differences in Stroke, or REGARDS Study) of more than 30,000 people followed for an average of 3.4 years. In those who had no memory or thinking problems at the beginning, the study identified 495 participants who developed thinking and memory problems, or cognitive impairment, during the study. They were compared to 587 people with no cognitive problems.
People with AB blood type made up 6 percent of the group who developed cognitive impairment, which is higher than the 4 percent found in the population.
“Our study looks at blood type and risk of cognitive impairment, but several studies have shown that factors such as high blood pressure, high cholesterol and diabetes increase the risk of cognitive impairment and dementia,” said study author Mary Cushman, MD, MSc, of the University of Vermont College of Medicine in Burlington. “Blood type is also related to other vascular conditions like stroke, so the findings highlight the connections between vascular issues and brain health. More research is needed to confirm these results.”
Researchers also looked at blood levels of factor VIII, a protein that helps blood to clot. High levels of factor VIII are related to higher risk of cognitive impairment and dementia. People in this study with higher levels of factor VIII were 24 percent more likely to develop thinking and memory problems than people with lower levels of the protein. People with AB blood had a higher average level of factor VIII than people with other blood types.
When we learn, we associate a sensory experience either with other stimuli or with a certain type of behaviour. The neurons in the cerebral cortex that transmit the information modify the synaptic connections that they have with other neurons. According to a generally accepted model of synaptic plasticity, a neuron that communicates with other neurons emits an electrical impulse as it transiently activates its synapses. This electrical pulse, combined with the signal received from other neurons, acts to strengthen the synapses. But how is it that some neurons are caught up in this communication interplay even when they are barely connected? This is the crucial chicken-or-egg puzzle of synaptic plasticity that a team led by Anthony Holtmaat, professor in the Department of Basic Neurosciences in the Faculty of Medicine at UNIGE, is aiming to solve. The results of their research into memory in silent neurons can be found in the latest edition of Nature.
Learning and memory are governed by a mechanism of sustainable synaptic strengthening. When we embark on a learning experience, our brain associates a sensory experience either with other stimuli or with a certain form of behaviour. The neurons in the cerebral cortex responsible for ensuring the transmission of the relevant information, then modify the synaptic connections that they have with other neurons. This is the very arrangement that subsequently enables the brain to optimise the way information is processed when it is met again, as well as predicting its consequences.
Neuroscientists typically induce electrical pulses in the neurons artificially in order to perform research on synaptic mechanisms.
The neuroscientists from UNIGE, however, chose a different approach in their attempt to discover what happens naturally in the neurons when they receive sensory stimuli. They observed the cerebral cortices of mice whose whiskers were repeatedly stimulated mechanically without an artificially-induced electrical pulse. The rodents use their whiskers as a sensor for navigating and interacting; they are, therefore, a key element for perception in mice.
An extremely low signal is enough
By observing these natural stimuli, professor Holtmaat’s team was able to demonstrate that a sensory stimulus alone can generate long-term synaptic strengthening without the neuron discharging either an induced or natural electrical pulse. As a result – and contrary to what was previously believed – the synapses are strengthened even when the neurons involved in a stimulus remain silent. In addition, if the sensory stimulation lasts over time, the synapses become so strong that the neuron in turn is activated and becomes fully engaged in the neural network. Once activated, the neuron can then further strengthen its synapses in a back-and-forth process. These findings could solve the brain’s “What came first?” mystery, as they make it possible to examine all the synaptic pathways that contribute to memory, rather than focusing on whether it is the synapse or the neuron that activates the other.
The entire brain is mobilised
A second discovery lay in store for the researchers. During the same experiment, they were also able to establish that the stimuli that were most effective in strengthening the synapses came from secondary, non-cortical brain regions rather than major cortical pathways (which convey actual sensory information). Accordingly, storing information would simply require the co-activation of several synaptic pathways in the neuron, even if the latter remains silent. These findings may also have important implications both for the way we understand learning mechanisms and for therapeutic possibilities, in particular for rehabilitation following a stroke or in neurodegenerative disorders. As professor Holtmaat explains: “It is possible that sensory stimulation, when combined with another activity (motor activity, for example), works better for strengthening synaptic connections”. The professor concludes: “In the context of therapy, you could combine two different stimuli as a way of enhancing the effectiveness.”
Using functional near infrared spectroscopy (fNIRS), Kessler Foundation researchers have shown differential brain activation patterns between people with multiple sclerosis (MS) and healthy controls. This is the first MS study in which brain activation was studied using fNIRS while participants performed a cognitive task. The article, “Neuroimaging and cognition using functional near infrared spectroscopy (fNIRS) in multiple sclerosis,” was published online on June 11 by Brain Imaging and Behavior. Authors are Jelena Stojanovic-Radic, PhD, Glenn Wylie, DPhil, Gerald Voelbel, PhD, Nancy Chiaravalloti, PhD, and John DeLuca, PhD.
Researchers compared 13 individuals with MS with 12 controls for their performance on a working memory task with four levels of difficulty. Most such studies have employed functional magnetic resonance imaging (fMRI); fNIRS has been used infrequently in clinical populations, and has not been applied previously to neuroimaging research in MS. Studies comparing fMRI findings with those of fNIRS, however, show broad agreement in terms of activation patterns.
Results showed differences in activation between the groups that depended on task load. The MS group showed greater activation at low task difficulty and reduced activation at high task difficulty; the control group showed the opposite pattern, with less activation at low task difficulty and more at high task difficulty. Performance accuracy was lower in the MS group at the low task load; there were no differences between the groups at the higher task loads.
“The data we obtained via fNIRS are consistent with fMRI data for clinical populations. We demonstrated that fNIRS is capable of detecting neuronal activation with a reasonable degree of detail,” noted Glenn Wylie, DPhil, associate director of Neuroscience and the Neuroimaging Center at Kessler Foundation. “We attribute the differences in brain activation patterns to the effort expended during the working memory task rather than to differences in speed of processing,” he added. “Because fNIRS is more portable and easier to use than fMRI, it may offer advantages in monitoring cognitive interventions that require frequent scans.”
In addition to working memory, future research in clinical populations should focus on processing speed and episodic memory, cognitive functions that are also affected in MS.