Posts tagged neuroscience

Findings Point to New Potential Drug Target—GABA Neurons—to Treat Patients with Depression and Other Mood Disorders
A new drug target to treat depression and other mood disorders may lie in a group of GABA neurons (GABA, or gamma-aminobutyric acid, is a neurotransmitter that inhibits other cells) shown to contribute to symptoms such as social withdrawal and increased anxiety, Penn Medicine researchers report in a new study in the Journal of Neuroscience.
Experts know that people suffering from depression and other mood disorders often react to rejection or bullying by withdrawing socially more than the average person, who takes such experiences in stride, yet the biological processes behind these responses have remained unclear.
Now, a preclinical study from the labs of Olivier Berton, PhD, an assistant professor in the department of Psychiatry, with Collin Challis of the Neuroscience Graduate Group, and Sheryl Beck, PhD, a professor in the department of Anesthesiology at Children’s Hospital of Philadelphia, found that bullying and other social stresses trigger symptoms of depression in mice by activating GABA neurons, the first direct relationship observed between social stimuli and this neural circuitry. Activation of those neurons, they found, directly suppressed serotonin, long known to play a vital role in behavioral responses; without it, a depressed person is more likely to withdraw socially.
Conversely, when the researchers successfully put the brake on the GABA neurons, mice became more resilient to bullying and didn’t avoid once-perceived threats.
“This is the first time that GABA neuron activity—found deep in the brainstem—has been shown to play a key role in the cognitive processes associated with social approach or avoidance behavior in mammals,” said Dr. Berton. “The results help us to understand why current antidepressants may not work for everyone and how to make them work better—by targeting GABA neurons that put the brake on serotonin cells.”
Less serotonin elicits socially defensive responses such as avoidance or submission, whereas enhanced serotonin, the main goal of antidepressants, induces a positive shift in the perception of socio-affective stimuli, promoting affiliation and dominance. However, current antidepressants that target serotonin, such as SSRIs, are effective in only about 50 percent of patients.
These new findings point to GABA neurons as a new neural drug target that could help treat the patients who don’t respond to today’s treatments.
For the study, “avoidant” mice were exposed to brief bouts of aggression from trained “bully” mice. By comparing gene expression in the brains of resilient and avoidant mice, Berton and colleagues discovered that bullying put the GABA neurons of avoidant mice into a more excitable state, and those mice exhibited signs of social defeat. Resilient mice, however, showed no change in neuronal excitability or behavior.
To better understand the link between GABA and the development of stress resilience, Berton, Beck, and colleagues also devised an approach to directly manipulate GABA signaling: lifting GABA inhibition of serotonin neurons reduced social avoidance and anxiety symptoms in mice exposed to bullies and also fully prevented the neurobiological changes caused by stress.
“Our paper provides a novel cellular understanding of how social defensiveness and social withdrawal develop in mice and gives us a stepping stone to better understand the basis of similar social symptoms in humans,” said Berton. “This has important implications for the understanding and treatment of mood disorders.”
(Source: uphs.upenn.edu)

Scientists use latest stem cell and gene-editing techniques to generate neurons in a dish, and reveal new clues behind deadly diseases of the brain
There is no easy way to study diseases of the brain. Extracting neurons from a living patient is both difficult and risky, while examining a patient’s brain post-mortem usually only reveals the disease’s final stages. And animal models, while incredibly informative, have frequently fallen short during the crucial drug-development stage of research. The result: we are woefully unprepared to fight—and win—the war against this class of diseases.
But scientists at the Gladstone Institutes and the University of California, San Francisco (UCSF) are taking a potentially more powerful approach: an advanced stem-cell technique that creates a human model of degenerative disease in a dish.
Using this model, the team uncovered a molecular process that causes neurons to degenerate, a hallmark sign of conditions such as Alzheimer’s disease and frontotemporal dementia (FTD). The results, published in the latest issue of Stem Cell Reports, offer fresh ammunition in the continued battle against these and other deadly neurodegenerative disorders.
The research team, led by Gladstone Investigator Yadong Huang, MD, PhD, identified an important mechanism behind tauopathies. A group of disorders that includes both Alzheimer’s and FTD, tauopathies are characterized by the abnormal accumulation of the protein Tau in neurons. This buildup is thought to contribute to the degeneration of these neurons over time, leading to debilitating symptoms such as dementia and memory loss. But while this notion has been around for a long time, the underlying processes have largely remained unclear.
“So much about the mechanisms that cause tauopathies is a mystery, in part because traditional approaches—such as post-mortem brain analysis and animal models—give an incomplete picture,” explained Dr. Huang. “But by using the latest stem-cell technology, we generated human neurons in a dish that exhibited the same pattern of cell degeneration and death that occurs inside a patient’s brain. Studying these models allowed us to see for the first time how a specific genetic mutation may kick start the tauopathy process.”
Other scientists recently discovered that the Tau mutation in question could increase a person’s risk of developing different tauopathies, including Alzheimer’s or FTD. So the research team, in collaboration with Bruce Miller, MD, who directs the UCSF Memory and Aging Center and who provided skin cells from a patient with this mutation, transformed these cells into induced pluripotent stem cells, or iPS cells. This technique, pioneered by Gladstone Investigator and 2012 Nobel Laureate Shinya Yamanaka, MD, PhD, allows scientists to reprogram adult skin cells into cells that are virtually identical to stem cells. These stem cells can then develop into almost any cell in the body.
The team combined this method with a cutting-edge gene-editing technique that essentially eliminated the Tau mutation in some of the iPS cells. The result was a system that allowed the team to compare neurons that had the mutation to those that did not.
“Our approach allowed us to grow human neurons in a dish that contained the exact same mutation as the neurons in the brain of the patient,” explained first author Helen Fong, PhD, who is also a California Institute for Regenerative Medicine postdoctoral scholar. “By comparing these diseased neurons with the ‘genetically corrected’ healthy neurons, we could see—cell by cell—how the Tau mutation leads to the abnormal build up of Tau and, over time, neuronal degeneration and death.”
“Tau’s main functions include keeping the skeletal structure of individual neurons intact and regulating neuronal activity,” said Dr. Huang. “But our research showed that the Tau produced by neurons from people with the Tau mutation is different; so it is red-flagged by the cell and targeted for destruction. However, instead of being flushed out, Tau gets chopped into pieces. These potentially toxic fragments accumulate over time and may in fact cause the neuron to degenerate and die.”
But by correcting the Tau mutation, the team effectively removed Tau’s red flag. The protein remained in one piece, the abnormal buildup ceased and the neurons remained healthy. Ongoing studies aim to determine whether the abnormal fragmentation and buildup of mutant tau is really the main cause of the neuronal death and, if so, how to block it.
Finding a way to block this toxic buildup of tau fragments has been a key focus of drug development but has thus far been unsuccessful. Dr. Huang and his colleagues are optimistic that their approach could be exactly what researchers need to fight back against deadly tauopathies.
“These findings not only offer a glimpse into how these powerful new models can shed light on mechanisms of disease,” said Dr. Miller. “They may also prove invaluable for screening potential drugs that could be developed into better treatments for Alzheimer’s disease, FTD and related conditions.”
Investigators at The Feinstein Institute for Medical Research have discovered a new way to measure the progression of Huntington’s disease, using positron emission tomography (PET) to scan the brains of carriers of the gene. The findings are published in the September issue of The Journal of Clinical Investigation.
Huntington’s disease causes the progressive breakdown of nerve cells in the brain, which leads to impairments in movement, thinking and emotions. Most people with Huntington’s disease develop signs and symptoms in their 40s or 50s, but the onset of disease may be earlier or later in life. Medications are available to help manage the symptoms of Huntington’s disease, but treatments do not prevent the physical, mental and behavioral decline associated with the condition.
Huntington’s disease is an inherited disease, passed from parent to child through a mutation in the normal gene. Each child of a parent with Huntington’s disease has a 50/50 chance of inheriting the Huntington’s disease gene, and a child who inherits the gene will eventually develop the disease. Genetic testing for Huntington’s disease can be performed to determine whether a person carries the gene and is developing the disease even before symptoms appear. Having this ability provides an opportunity for scientists to study how the disease first develops and how it progresses in its early, presymptomatic stages. Even though a carrier of the Huntington’s disease gene may not have experienced symptoms, changes in the brain have already taken place, which ultimately lead to severe disability. Brain imaging is one tool that could be used to track how quickly Huntington’s disease progresses in gene carriers. Having a better way to track the disease at its earliest stages will make it easier to test drugs designed to delay or even prevent the onset of symptoms.
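The 50/50 odds follow directly from the autosomal dominant inheritance pattern: each child inherits one allele at random from each parent, and a single copy of the expanded gene is enough to cause disease. A small simulation (purely illustrative; not part of the study) recovers that probability:

```python
import random

def child_inherits(affected_parent=("H", "h"), healthy_parent=("h", "h")):
    """One allele is drawn at random from each parent; a single copy of
    the dominant 'H' allele is enough to cause Huntington's disease."""
    alleles = (random.choice(affected_parent), random.choice(healthy_parent))
    return "H" in alleles

random.seed(0)
trials = 100_000
rate = sum(child_inherits() for _ in range(trials)) / trials
print(round(rate, 3))  # close to 0.5
```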
Researchers at the Feinstein Institute used PET scanning to map changes in brain metabolism in 12 people with the Huntington’s disease gene who had not developed clinical signs of the illness. The researchers scanned the subjects repeatedly over a seven-year period and found a characteristic set (network) of abnormalities in their brains. The network was used to measure the rate of disease progression in the study participants. The Feinstein Institute investigators then confirmed the progression rate through independent measurements in scans from a separate group of Huntington’s disease gene carriers who were studied in the Netherlands. The investigators believe that progression networks similar to the one identified in Huntington’s disease carriers will have an important role in evaluating new drugs for degenerative brain disorders.
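Network scores of this kind are typically obtained by projecting each scan onto the network's spatial pattern, with higher projections indicating stronger network expression. A generic sketch with simulated data (the pattern, voxel count, and effect size below are assumptions for illustration, not the study's values):

```python
import numpy as np

def network_score(scan, pattern):
    """Expression of a disease network in one scan: project the
    z-normalized voxel values onto the network's voxel weights."""
    scan = (scan - scan.mean()) / scan.std()
    return float(scan @ pattern) / pattern.size

rng = np.random.default_rng(2)
n_voxels = 5000
pattern = rng.standard_normal(n_voxels)  # stand-in network topography

# Simulated metabolic scans at three visits over seven years:
# network expression grows with disease progression.
scores = [network_score(rng.standard_normal(n_voxels) + 0.05 * years * pattern,
                        pattern)
          for years in (0, 3.5, 7)]
print([round(s, 3) for s in scores])  # monotonically increasing
```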
“Huntington’s disease is an extremely debilitating disease. The findings make it possible to evaluate the effects of new drugs on disease progression before symptoms actually appear. This is a major advance in the field,” said David Eidelberg, MD, Susan and Leonard Feinstein Professor and head of the Center for Neurosciences at the Feinstein Institute.
(Source: northshorelij.com)
The brain doesn’t require simultaneous visual and audio stimulation to locate the source of a sound

As ventriloquists have long known, your eyes can sometimes tell your brain where a sound is coming from more convincingly than your ears can.
A series of experiments in humans and monkeys by Duke University researchers has found that the brain does not require simultaneous visual and audio stimulation to locate the source of a sound. Rather, visual feedback obtained from trying to find a sound with the eyes had a stronger effect than visual stimuli presented at the same time as the audio, according to the Duke study.
The findings could help those with mild hearing loss learn to localize voices better, improving their ability to communicate in noisy environments, said Jennifer Groh, a professor of psychology and neuroscience at Duke.
Locating where a sound is coming from is partially learned with the aid of vision. Researchers sought to learn more about how the brain locates the source of a sound when the source is unclear and there are a number of possible visual matches.
"Our study is related to ventriloquism, in which the visual image of a puppet’s mouth ‘captures’ the sound of the puppeteer’s voice," Groh said. "It is thought that one reason this illusion occurs is because vision normally teaches the brain how to tell where sounds are coming from. We investigated how the brain knows which visual stimulus should capture the location of a sound, such as why it is the puppet’s mouth and not some other visual stimulus."
The study, which appears Thursday (Aug. 29) in the journal PLOS ONE, tested two competing hypotheses. In one, the brain determines the location of a sound based on the simultaneous occurrence of audio and its visual source. In the other, the brain uses a “guess and check” method. In this scenario, visual feedback sent to the brain after the eye focuses on a sound affects how the eye searches for that sound in the future, possibly through the brain’s reward-related circuitry.
In both paradigms, the visual stimulus — an LED — was displaced from the sound. Groh’s team then looked for evidence that the LED caused a persistent mislocation of the sound.
"Surprisingly, we found that visual feedback exerts the more powerful effect on altering localization of sounds," Groh said. "This suggests that the active behavior of looking at the puppet during a ventriloquism performance plays a role in causing the shift in where you hear the voice."
Participants in the study — 11 humans and two rhesus monkeys — shifted their sight to a sound under different visual and audio scenarios.
In one scenario, called the “synchrony-only” task, a visual stimulus appeared at the same time as a sound but too briefly to provide feedback after an eye movement to that sound.
In another, the “feedback-only” task, the visual stimulus appeared during the execution of an eye movement to a sound, but was never on at the same time as the sound.
The study found that the “feedback-only task” exerted a much more powerful effect on the estimation of sound location, as measured with eye tracking, than did the other scenario. This suggests that those who have difficulty localizing sounds may benefit from practice involving eye movements.
On average, participants altered their eye movements in the direction of the light’s location to a greater degree, about a quarter of the way, when the visual stimulus was presented as feedback than when it was presented at the same time as the sound, the study found.
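That "quarter of the way" figure is simply the ratio of the induced localization shift to the audio-visual displacement. A minimal sketch with hypothetical angles (the numbers below are illustrative, not the study's data):

```python
def capture_fraction(baseline_deg, shifted_deg, displacement_deg):
    """Fraction of the audio-visual displacement by which sound
    localization responses moved toward the visual stimulus."""
    return (shifted_deg - baseline_deg) / displacement_deg

# Hypothetical numbers: sound at 0 degrees, LED displaced 8 degrees,
# post-feedback localizations centered at 2 degrees.
print(capture_fraction(0.0, 2.0, 8.0))  # 0.25, i.e. a quarter of the way
```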
"This is about the brain’s self-improvement skills," said co-author Daniel Pages, a graduate student in Psychology & Neuroscience at Duke. "What we’re getting at is how the brain uses different types of information to improve how it does its job. In this case, it uses vision coupled with eye movements to improve hearing."
"We were surprised at how important the eye movements were," Groh said. "But finding sounds is really hard. Feedback about your performance is important for anything that is difficult, whether it is the B- you get on your homework or the error your eyes detect in localizing a sound."
(Source: today.duke.edu)
The white arrow highlights the primary neuronal cilium, a hair-like structure on nerve cells. The neuron on the right has no cilium because of the loss of a protein linked to intellectual disability in humans. Credit: YOSHIHO IKEUCHI
Intellectual disability linked to nerve cells that lose their ‘antennae’
An odd and little-known feature of nerve cells may be linked to several forms of inherited intellectual disability, researchers at Washington University School of Medicine in St. Louis have learned.
The scientists report that a genetic mutation that causes intellectual disability also blocks formation of the neuronal primary cilium, a hair-like structure that protrudes from the bodies of nerve cells.
"The primary cilium acts as a kind of antenna for nerve cells,” said first author Yoshiho Ikeuchi, PhD, a staff scientist. “It’s covered in receptors that monitor environmental conditions outside the cell and may influence the cell’s functions.”
Learning more about how the mutation sabotages production of the nerve cell cilium eventually will help scientists develop drugs to treat intellectual disability, according to senior author Azad Bonni, MD, PhD, the Edison Professor and chairman of the Department of Anatomy and Neurobiology.
"Intellectual disability—sometimes known as mental retardation—affects 1 to 2 percent of the general population, and researchers have identified more than 100 genes on the X chromosome that can cause these conditions,” Bonni said. “But we don’t know what most of these genes do, and that information is essential for new treatments.”
The research appears online Aug. 29 in Cell Reports.
Nearly every cell in the mammalian body has a primary cilium—a structure that acts as an environmental sensor. Some cells have many cilia that move together in waves. Problems with cilia are associated with disorders throughout the body, including illnesses of the kidneys, eyes and reproductive organs.
"Some of the X-linked intellectual disorders are syndromes that not only hamper brain development but also cause problems elsewhere in the body,” Bonni said. “That makes sense in the context of this new connection we’ve identified between intellectual disability and the primary cilium.”
Scientists have only recently recognized the potential of a primary cilium malfunction to impair nerve cell development and function. Studies have suggested that the primary cilium may be where nerve cells receive the growth signals that allow them to extend branches to each other and form circuits. Other research has shown that blocking signal receptors on the primary cilium leads to memory problems in mice.
Bonni’s path to the primary cilium led through the nucleus, the command center that contains a cell’s DNA. Proteins found inside a cell’s nucleus often regulate the turning on or off of other genes, making them influential in orchestrating the responses and functions of cells.
Bonni and his colleagues scanned the literature on X chromosome genes linked to intellectual disability to learn which genes produce proteins found in the nucleus. When they disabled 15 such genes in individual nerve cells, they found that the loss of the gene for polyglutamine-binding protein 1 (PQBP1) produced the most dramatic effect, leaving nerve cells with shortened primary cilia or no cilia at all.
In other cell types outside the brain, PQBP1 is typically found only in the nucleus. But the new results show that in neurons the protein is present both in the nucleus and, surprisingly, at the base of the primary cilium.
The scientists learned PQBP1 binds to another protein outside the nucleus that suppresses growth of the primary cilium. By binding to the suppressor, PQBP1 gets that suppressor out of the way, allowing cilium formation to proceed normally.
Scientists may one day try to imitate this effect with drugs, potentially allowing the brain to develop more normally when PQBP1 is mutated. For now, the researchers want to learn more about the suppressor protein and also are investigating the possibility that PQBP1 may continue to influence the functions of the primary cilium after it is formed.
Schizophrenia is one of the most devastating neurological conditions, with only 30 percent of sufferers ever experiencing full recovery. While current medications can control most psychotic symptoms, their side effects can leave individuals so severely impaired that the disease ranks among the top ten causes of disability in developed countries.
Now, in this week’s issue of the Proceedings of the National Academy of Sciences, Thomas Albright and Ricardo Gil-da-Costa of the Salk Institute for Biological Studies describe a model system that completes the bridge between cellular and human studies of schizophrenia, an advance that should help speed the development of therapeutics for schizophrenia and other neurological disorders.
"Part of the terror of schizophrenia is that the brain can’t properly integrate sensory information, so the world is a disorientating series of unrelated bits of input," says Albright, the Conrad T. Prebys Chair in Vision Research. "We’ve created a model that tests the ability to do sensory integration, which should be extremely useful for pharmaceutical research."
Currently, over 1.1 percent of the world’s population has schizophrenia, with an estimated three million individuals in the United States alone. The economic cost is high: In 2002, Americans spent nearly $63 billion on treatment and managing disability. The emotional cost is higher still: Ten percent of those with schizophrenia are driven to commit suicide by the burden of coping with the disease.
Initially, it was thought that excessive amounts of the neurotransmitter dopamine caused psychotic symptoms, and indeed, current anti-psychotic drugs work by blocking dopamine from entering brain cells. But nearly all of these drugs have severe cognitive side effects, which led researchers to speculate that some other mechanism must also be involved.
A major clue to understanding schizophrenia came with the development of phencyclidine (PCP) in 1956. It was intended to keep patients safely asleep during surgeries, but many woke up with symptoms similar to those experienced by people with schizophrenia, including hallucinations and the disorientation of feeling “dissociated” from their limbs, resulting in PCP being abandoned for clinical purposes. A decade later, it was replaced by a derivative called ketamine. At doses high enough to put patients to sleep, ketamine is an effective anesthetic. At lower doses, it temporarily produces the same schizophrenia-like effects as PCP.
The two drugs are part of a class called N-methyl-D-aspartate (NMDA) receptor antagonists. Essentially, they work by gumming up the receptors through which glutamate, the brain’s main excitatory neurotransmitter, excites brain cells. Thus, while dopamine dysfunction accounts for some of the symptoms of psychosis, it is probably not the full story.
"While dopamine has limited reach in the brain, any dysfunction in glutamate would be expected to have the sort of widespread effects we see in the perceptual disorders associated with schizophrenia," says Albright. "Nevertheless, which neurotransmitter was primary to these disorders—glutamate or dopamine—has been argued about for years."
Standing in the way of a definitive answer was a researcher’s Catch-22: many experiments designed to understand cognitive disorders such as schizophrenia or Alzheimer’s require a participant’s conscious attention, yet these disorders interfere with attention.
To get around this, scientists turned to electroencephalograms (EEGs), which can be used to detect changes in cases where a subject is not consciously paying attention to a stimulus, by recording the brain’s electrical signals through electrodes placed in a scalp cap. In one test, a series of tones is played, but an “oddball” tone breaks the pattern in the sequence. A healthy brain can still easily spot the differences, even if a participant is concentrating on another task, such as reading a magazine.
"The test works because the brain is a prediction machine; it’s built to anticipate what should come next," says Albright. "If you have healthy working memory, you should be able to perceive a pattern and notice when something violates it, but patients suffering from some mental health disorders lack that basic ability."
In their latest research, Albright’s team detected the difference through two signals, event-related brain potentials called mismatch negativity (MMN) and P3. The MMN reflects differential brain activity to the detected oddball tone, below the level of conscious awareness. P3 picks up the next phase: a subject’s attention orientation to the oddball tone.
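The MMN is conventionally computed as a difference wave: the averaged response to the deviant tones minus the averaged response to the standard tones. A minimal sketch on synthetic epochs (the array shapes and the injected deflection are illustrative assumptions, not the study's data):

```python
import numpy as np

def mismatch_negativity(standard_epochs, deviant_epochs):
    """ERP difference wave: mean deviant response minus mean standard
    response at each time sample. A negative deflection roughly
    100-250 ms after tone onset is the classic MMN signature."""
    return deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)

rng = np.random.default_rng(0)
n_samples = 100                       # ~400 ms of post-stimulus EEG
standard = rng.normal(0.0, 1.0, size=(200, n_samples))  # frequent tones
deviant = rng.normal(0.0, 1.0, size=(40, n_samples))    # rare oddballs
deviant[:, 30:60] -= 2.0              # inject a deflection ~120-240 ms

mmn = mismatch_negativity(standard, deviant)
print(round(float(mmn[30:60].mean()), 2))  # clearly negative
```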
Still, a gap in understanding remained. While scientists could do cellular work in animal models on the role of dopamine versus glutamate, and they could do EEGs in human beings, a bridge between the two remained elusive. Such a bridge could advance scientists’ understanding of how healthy and disordered brains work, from the cellular level all the way up to the interactions between brain areas. Moreover, it could enable preclinical and clinical trials that link the cellular and systems levels in the search for successful therapeutic avenues.
Gil-da-Costa has at last built that bridge by crafting the first non-invasive scalp EEG setup that records accurately from the brains of non-human primates, with the same proportional density of electrodes as a human cap and no distortions in signal caused by an incorrect fit. This setup allows him to obtain accurate measurements of MMN and P3 using the same protocols that are followed in humans. As a result, the lab has come closer than ever before to untangling the roles of dopamine and glutamate.
"While rodents are essential for understanding mechanisms at a cellular or molecular level, at a higher cognitive level, the best you could do was a sort of rough analogy. Now, finally, we can have a one-to-one correspondence," says Gil-da-Costa. "For sensory integration, our findings with this model support the glutamate hypothesis."
Pharmaceutical companies are interested in the model because of the potential for more precise testing and the universality of the MMN/P3 assays. “These brain markers are the same across dozens of neurological diseases, as well as brain trauma, so you can test potential therapies not just for schizophrenia, but for conditions such as Parkinson’s, Alzheimer’s, bipolar disorder, and traumatic brain injuries,” says Gil-da-Costa. “We hope this will help begin a new era in neurological therapeutics.”
(Source: salk.edu)
Study is the first to find functional MRI differences in working memory in people with primary insomnia

A new brain imaging study may help explain why people with insomnia often complain that they struggle to concentrate during the day even when objective evidence of a cognitive problem is lacking.
"We found that insomnia subjects did not properly turn on brain regions critical to a working memory task and did not turn off ‘mind-wandering’ brain regions irrelevant to the task," said lead author Sean P.A. Drummond, PhD, associate professor in the department of psychiatry at the University of California, San Diego, and the VA San Diego Healthcare System, and Secretary/Treasurer of the Sleep Research Society. "Based on these results, it is not surprising that someone with insomnia would feel like they are working harder to do the same job as a healthy sleeper."
The research team led by Drummond and co-principal investigator Matthew Walker, PhD, studied 25 people with primary insomnia and 25 good sleepers. Participants had an average age of 32 years. The study subjects underwent a functional magnetic resonance imaging scan while performing a working memory task.
Results published in the September issue of the journal Sleep show that participants with insomnia did not differ from good sleepers in objective cognitive performance on the working memory task. However, the MRI scans revealed that people with insomnia could not modulate activity in brain regions typically used to perform the task.
As the task got harder, good sleepers used more resources within the working memory network of the brain, especially the dorsolateral prefrontal cortex. Insomnia subjects, however, were unable to recruit more resources in these brain regions. Furthermore, as the task got harder, participants with insomnia did not dial down the “default mode” regions of the brain that are normally only active when our minds are wandering.
"The data help us understand that people with insomnia not only have trouble sleeping at night, but their brains are not functioning as efficiently during the day," said Drummond. "Some aspects of insomnia are as much of a daytime problem as a nighttime problem. These daytime problems are associated with organic, measurable abnormalities of brain activity, giving us a biological marker for treatment success."
According to the authors, the study is the largest to examine cerebral activation with functional MRI during cognitive performance in people with primary insomnia, relative to well-matched good sleepers. It also is the first to characterize functional MRI differences in working memory in people with primary insomnia.
The American Academy of Sleep Medicine reports that about 10 to 15 percent of adults have an insomnia disorder with distress or daytime impairment. Most often insomnia is a comorbid disorder occurring with another problem such as depression or chronic pain, or caused by a medication or substance. Fewer people suffering from insomnia are considered to have primary insomnia, which is defined as a difficulty falling asleep or maintaining sleep in the absence of a coexisting condition.
(Source: eurekalert.org)
Toward an early diagnostic tool for Alzheimer’s disease
Despite all the research done on Alzheimer’s, there is still no early diagnostic tool for the disease. By looking at the brain wave components of individuals with the disease, Professor Tiago H. Falk of INRS’s Centre Énergie Matériaux Télécommunications has identified a promising avenue of research that may not only help diagnose the disease, but also assess its severity. This non-invasive, objective method is the subject of an article in the journal PLOS ONE.
Patients with Alzheimer’s disease currently undergo neuropsychological testing to detect signs of the disease. The test results are difficult to interpret and are insufficient for making a definitive diagnosis. But as scientists have already discovered, activity in certain areas of the cerebral cortex is affected even in the early stages of the disease. Professor Falk, who specialises in biological signal acquisition, examined this phenomenon and compared the electroencephalograms (EEGs) of healthy individuals (27), individuals with mild Alzheimer’s (27), and individuals with moderate cases of the disease (22). He found statistically significant differences across the three groups.
In collaboration with neurologists and Francisco J. Fraga, an INRS visiting professor specializing in biological signals, Professor Falk used an algorithm that dissects brain waves of varying frequencies. “What makes this algorithm innovative is that it characterizes the changes in temporal dynamics of the patients’ brain waves,” explains Professor Falk. “The findings show that healthy individuals have different patterns than those with mild Alzheimer’s disease. We also found a difference between patients with mild levels of the disease and those with moderate Alzheimer’s.”
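One generic way to characterize the temporal dynamics of band-limited brain waves (a sketch of the general approach, not Professor Falk's exact algorithm) is to band-pass the EEG into the classic frequency bands and summarize how each band's amplitude envelope fluctuates over time:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 12), "beta": (12, 30)}

def band_envelope_stats(eeg, fs):
    """Split the signal into classic EEG bands and summarize each band's
    temporal dynamics via the variability of its amplitude envelope."""
    feats = {}
    for name, (lo, hi) in BANDS.items():
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        band = filtfilt(b, a, eeg)
        env = np.abs(hilbert(band))           # instantaneous amplitude
        feats[name] = env.std() / env.mean()  # coefficient of variation
    return feats

# Synthetic 10 s EEG-like trace at 250 Hz: a slowly modulated alpha
# rhythm plus broadband noise.
fs = 250
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
eeg = np.sin(2 * np.pi * 10 * t) * (1 + 0.5 * np.sin(2 * np.pi * 0.5 * t))
eeg += 0.2 * rng.standard_normal(t.size)
feats = band_envelope_stats(eeg, fs)
print({k: round(v, 2) for k, v in feats.items()})
```

In a diagnostic setting, features like these would be extracted per patient and compared across the healthy, mild, and moderate groups.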
To validate the model in order to eventually develop an early diagnostic tool for Alzheimer’s disease, Professor Falk’s team is sharing their algorithm on the NeuroAccelerator.org online data analysis portal. It is the first open source algorithm posted on the portal and may be used by researchers around the world to produce additional research findings.
Alzheimer’s disease accounts for 60% to 80% of all dementia cases in North America, and its prevalence is rising rapidly. This step toward an early diagnostic tool that is non-invasive, objective, and relatively inexpensive is therefore welcome news for the research community.

Study Shows that Intensity of Facebook Use Can Be Predicted by Reward-related Activity in the Brain
Neuroscientists at Freie Universität Berlin show a link between reward activity in the brain due to discovering one has a good reputation and social media use
A person’s intensity of Facebook use can be predicted by activity in the nucleus accumbens, a reward-related area of the brain, according to a new study published by neuroscientists in the Languages of Emotion Cluster of Excellence at Freie Universität Berlin. Dr. Dar Meshi and his colleagues conducted this first-ever study relating brain activity (measured with functional MRI) to social media use. The study was published in the latest issue of the open-access journal Frontiers in Human Neuroscience.
The researchers focused on the nucleus accumbens, a small but critical structure located deep in the center of the brain, because previous research has shown that rewards such as food, money, sex, and gains in reputation are processed in this region.
“As human beings, we evolved to care about our reputation. In today’s world, one way we’re able to manage our reputation is by using social media websites like Facebook,” says Dar Meshi, lead author of the paper. Facebook, the world’s largest social media site with 1.2 billion monthly active users, was used in the study because interactions on the website are carried out in view of the user’s friends or the public and can affect that user’s reputation. For example, Facebook users “like” posted information; this approval is positive social feedback and can be considered related to the poster’s reputation.
All 31 participants completed the Facebook Intensity Scale, which records how many friends each participant has, how many minutes they spend on Facebook, and their general thoughts about the site. The participants were selected to vary widely in their Facebook Intensity Scale scores.
First, the subjects participated in a video interview. Next, their brain activity was recorded with functional magnetic resonance imaging in different situations. In the scanner, subjects were told whether people who had supposedly viewed the video interview thought highly of them, and they also learned whether people thought highly of another person. They also performed a card task to win money.
Results showed that positive feedback about oneself produced stronger activation of the nucleus accumbens than seeing the positive feedback another person received. The strength of this difference corresponded to participants’ reported intensity of Facebook use. By contrast, the nucleus accumbens response to monetary reward did not predict Facebook use.
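The individual-differences analysis described above boils down to correlating a neural measure with a questionnaire score. A minimal sketch with made-up numbers (the data values and the use of a plain Pearson correlation are illustrative assumptions, not the study’s actual values or full statistical pipeline):

```python
import numpy as np

# Hypothetical per-participant data: difference in nucleus accumbens
# response (self-feedback minus other-feedback) and Facebook Intensity score.
accumbens_diff = np.array([0.2, 0.8, 0.5, 1.1, 0.1, 0.9, 0.4, 1.3])
fb_intensity   = np.array([1.0, 3.2, 2.1, 4.0, 0.8, 3.5, 1.9, 4.4])

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

r = pearson_r(accumbens_diff, fb_intensity)
```

A strong positive `r` in such data would mirror the reported finding that a larger self-versus-other accumbens response goes with heavier Facebook use; as the authors note, correlation alone cannot say which way any causation runs.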
“Our study reveals that the processing of social gains in reputation in the left nucleus accumbens predicts the intensity of Facebook use across individuals,” says Meshi. “These findings expand upon our present knowledge of nucleus accumbens function as it relates to complex human behavior.”
These results may motivate clinical research into social media addiction as well as further research on how social media affects learning. As Meshi says, “Our findings relating individual social media use to the individual response of the brain’s reward system may also be relevant for both educational and clinical research in the future.” The authors point out, however, that their results do not determine whether positive social feedback drives people to interact on social media, or whether sustained use of social media changes the way positive social feedback is processed by the brain.
The age at which children learn a second language can have a significant bearing on the structure of their adult brain, according to a new joint study by the Montreal Neurological Institute and Hospital - The Neuro at McGill University and Oxford University. The majority of people in the world learn to speak more than one language during their lifetime. Many do so with great proficiency, particularly if the languages are learned simultaneously or from early in development.

The study concludes that the pattern of brain development is similar whether you learn one or two languages from birth. However, learning a second language later in childhood, after gaining proficiency in the first (native) language, does modify the brain’s structure, specifically the inferior frontal cortex: the left inferior frontal cortex became thicker and the right inferior frontal cortex became thinner. The cortex is a multi-layered mass of neurons that plays a major role in cognitive functions such as thought, language, consciousness, and memory.
The study suggests that the task of acquiring a second language after infancy stimulates new neural growth and connections among neurons in ways seen in acquiring complex motor skills such as juggling. The study’s authors speculate that the difficulty that some people have in learning a second language later in life could be explained at the structural level.
“The later in childhood that the second language is acquired, the greater are the changes in the inferior frontal cortex,” said Dr. Denise Klein, researcher in The Neuro’s Cognitive Neuroscience Unit and a lead author on the paper published in the journal Brain and Language. “Our results provide structural evidence that age of acquisition is crucial in laying down the structure for language learning.”
Using a software program developed at The Neuro, the study examined MRI scans of 66 bilingual and 22 monolingual men and women living in Montreal. The work was supported by a grant from the Natural Sciences and Engineering Research Council of Canada and from an Oxford-McGill Neuroscience Collaboration Pilot project.
(Source: mcgill.ca)