Posts tagged brain activity

In resting brains, Yale researchers see signs of schizophrenia
In an advance that increases hopes of finding biological markers for schizophrenia, Yale researchers have discovered widespread disruption of signals while the brain is at rest in those suffering from the disabling neuropsychiatric disease.
The Yale team used fMRI scans and created a mathematical model that simulates brain activity to discover the disruptions in global signaling — or patterns of neurological activity while the brain is not involved in any particular task. Previously, many researchers had thought that the overall brain activity at rest was mostly “background noise” and not clinically important, said Alan Anticevic, assistant professor in psychiatry at the Yale School of Medicine and senior author of the study, reported online May 5 in the Proceedings of the National Academy of Sciences. “To our knowledge these results provide the first evidence that global whole-brain signals are altered in schizophrenia, calling into question the standard removal of this signal in clinical neuroimaging studies,” Anticevic said.
These novel results have vital and broad implications for neuroimaging, as the search for neuropsychiatric biomarkers that could lead to early intervention and improved patient outcomes remains a prominent focus outlined by the National Institute of Mental Health.
How Does Stress Increase Your Risk for Stroke and Heart Attack?
Scientists have shown that anger, anxiety, and depression not only affect the functioning of the heart, but also increase the risk for heart disease.
Stroke and heart attacks are the end products of progressive damage to blood vessels supplying the heart and brain, a process called atherosclerosis. Atherosclerosis progresses when there are high levels of chemicals in the body called pro-inflammatory cytokines.
It is thought that persistent stress increases the risk for atherosclerosis and cardiovascular disease by evoking negative emotions that, in turn, raise the levels of pro-inflammatory chemicals in the body.
Researchers have now investigated the underlying neural circuitry of this process, and report their findings in the current issue of Biological Psychiatry.
“Drawing upon the observation that many of the same brain areas involved in emotion are also involved in sensing and regulating levels of inflammation in the body, we hypothesized that brain activity linked to negative emotions – specifically efforts to regulate negative emotions – would relate to physical signs of risk for heart disease,” explained Dr. Peter Gianaros, Associate Professor at the University of Pittsburgh and first author on the study.
To conduct the study, Gianaros and his colleagues recruited 157 healthy adult volunteers who were asked to regulate their emotional reactions to unpleasant pictures while their brain activity was measured with functional imaging. The researchers also scanned their arteries for signs of atherosclerosis to assess heart disease risk and measured levels of inflammation in the bloodstream, a major physiological risk factor for atherosclerosis and premature death by heart disease.
They found that individuals who show greater brain activation when regulating their negative emotions also exhibit elevated blood levels of interleukin-6, one of the body’s pro-inflammatory cytokines, and increased thickness of the carotid artery wall, a marker of atherosclerosis.
The inflammation levels accounted for the link between signs of atherosclerosis and brain activity patterns seen during emotion regulation. Importantly, the findings were significant even after controlling for a number of different factors, like age, gender, smoking, and other conventional heart disease risk factors.
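"Controlling for" such factors typically means computing a partial correlation: regress the covariates out of both measures and correlate the residuals. A toy illustration on synthetic data (NumPy only; the variable names and effect sizes are invented, and this is not the study's actual model):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 157  # sample size matching the study

# Toy variables: age acts as a shared confound of both measures.
age = rng.normal(45, 10, n)
brain_activity = 0.5 * age + rng.normal(0, 5, n)
artery_thickness = 0.3 * age + 0.4 * brain_activity + rng.normal(0, 5, n)

def residualize(y, covariates):
    """Remove the least-squares fit of the covariates (plus intercept) from y."""
    X = np.column_stack([np.ones(len(y)), covariates])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# Partial correlation of activity and thickness, adjusting for age.
ra = residualize(brain_activity, age)
rt = residualize(artery_thickness, age)
partial_r = np.corrcoef(ra, rt)[0, 1]
raw_r = np.corrcoef(brain_activity, artery_thickness)[0, 1]
print(raw_r, partial_r)  # partial_r stays positive: the link survives adjustment
```

The adjusted correlation shrinks relative to the raw one (the confound's contribution is removed) but remains positive, which is the pattern a "significant after controlling for covariates" result describes.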
“These new findings agree with the popular belief that emotions are connected to heart health,” said Gianaros. “We think that the mechanistic basis for this connection may lie in the functioning of brain regions important for regulating both emotion and inflammation.”
These findings may have implications for brain-based prevention and intervention efforts to improve heart health and protect against heart disease.
“It is remarkable to see the links develop between negative emotional states, brain circuits, inflammation, and markers of poor physical health,” said Dr. John Krystal, Editor of Biological Psychiatry. “As we identify the key mechanisms linking brain and body, we may be able to also break the cycle through which stress and depression impair physical health.”
Motor cortex shown to play active role in learning movement patterns
Skilled motor movements of the sort tennis players employ while serving a tennis ball, or pianists use in playing a concerto, require precise interactions between the motor cortex and the rest of the brain. Neuroscientists had long assumed that the motor cortex functioned something like a piano keyboard.
"Every time you wanted to hear a specific note, there was a specific key to press," says Andrew Peters, a neurobiologist at UC San Diego’s Center for Neural Circuits and Behavior. "In other words, every specific movement of a muscle required the activation of specific cells in the motor cortex because the main job of the motor cortex was thought to be to listen to the rest of the cortex and press the keys it’s directed to press."
But in a study published in this week’s advance online publication of the journal Nature, Peters, the first author of the paper, and his colleagues found that the motor cortex itself plays an active role in learning new motor movements. In a series of experiments using mice, the researchers showed in detail how those movements are learned over time.
"Our finding that the relationship between body movements and the activity of the part of the cortex closest to the muscles is profoundly plastic and shaped by learning provides a better picture of this process," says Takaki Komiyama, an assistant professor of biology at UC San Diego who headed the research team. "That’s important, because elucidating brain plasticity during learning could lead to new avenues for treating learning and movement disorders, including Parkinson’s disease."
With Simon Chen, another UC San Diego neurobiologist, the researchers monitored the activity of neurons in the motor cortex over a period of two weeks while mice learned to press a lever in a specific way with their front limbs to receive a reward.
"What we saw was that during learning, different patterns of activity—which cells are active, when they’re active—were evident in the motor cortex," says Peters. "This ends up translating to different patterns of activity even for similar movements. Once the animal has learned the movement, similar movements are then accompanied by consistent activity. This consistent activity moreover is totally new to the animal: it wasn’t used early in learning even with movements that were similar to the later movement."
"Early on," Peters says, "the animals will occasionally make movements that look like the expert movements they make after learning. The patterns of brain activity that accompany those similar early and late movements are actually completely different though. Over the course of learning, the animal generates a whole new set of activity in the motor cortex to make that movement. In the piano keyboard analogy, that’s like using one key to make a note early on, but a different key to make the same note later."
Activity in areas of the brain related to reward and self-control may offer neural markers that predict whether people are likely to resist or give in to temptations, like food, in daily life, according to research in Psychological Science, a journal of the Association for Psychological Science.

“Most people have difficulty resisting temptation at least occasionally, even if what tempts them differs,” say psychological scientists Rich Lopez and Todd Heatherton of Dartmouth College, authors on the study. “The overarching motivation of our work is to understand why some people are more likely to experience this self-regulation failure than others.”
The research findings reveal that activity in reward areas of the brain in response to pictures of appetizing food predicts whether people tend to give in to food cravings and desires in real life, whereas activity in prefrontal areas during taxing self-control tasks predicts their ability to resist tempting food.
Lopez and colleagues used functional MRI (fMRI) to explore the interplay between activity in prefrontal brain regions associated with self-control (e.g., inferior frontal gyrus) and subcortical areas involved in affect and reward (e.g., nucleus accumbens), and to see whether the interplay between these regions predicts how successful (or unsuccessful) people are in controlling their desires to eat on a daily basis.
The researchers recruited 31 female participants to take part in an initial fMRI scanning session that included two important tasks.
For the first task, the participants were presented with various images, including some of high-calorie foods, like dessert items, fast-food items, and snacks. The participants were simply asked to indicate whether each image was set indoors or outdoors — the researchers were specifically interested in measuring activity in the nucleus accumbens in response to the food-related images.
For the second task, the participants were asked to press or not press a button based on the specific cues provided with each image, a task designed to gauge self-control ability. During this task, the researchers measured activity in the inferior frontal gyrus (IFG).
The fMRI scanning session was followed by 1 week of so-called “experience sampling,” in which participants were signaled several times a day on a smartphone and asked to report their food desires and eating behaviors. Any time participants reported a food desire, they were then asked about the strength of the desire and their resistance to it. If they ultimately gave in to the craving, they were asked to say how much they had eaten.
As expected, participants who had relatively higher activity in the nucleus accumbens in response to the food images tended to experience more intense food desires. More importantly, they were also more likely to give in to their food cravings and eat the desired food.
The researchers were surprised by how robust this association was:
“Reward-related brain activity, which can be considered an implicit measure, predicted who gave in to temptations to eat, as well as who ate more, above and beyond the desire strength reported by participants in the moment,” say Lopez and Heatherton. “This could help to explain a previous finding from our lab that people who show this kind of brain activity the most are also the most likely to gain weight over six months.”
But brain activity also predicted who was more likely to be able to resist temptation: Participants who showed relatively higher IFG activity on the self-control task acted on their cravings less often.
When the researchers grouped the participants according to their IFG activity, the data revealed that participants who had high IFG activity were more successful at controlling how much they ate in particularly tempting situations than those who had low IFG activity. In fact, participants with low IFG activity were about 8.2 times more likely to give in to a food desire than those who had high IFG activity.
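A "≈8.2 times more likely" figure of this kind is an odds ratio from a two-by-two table. The counts below are invented for illustration (chosen only to land near the reported value; the paper's raw counts are not given here):

```python
# Odds ratio from a 2x2 table: gave in vs. resisted, split by IFG activity.
# These counts are hypothetical, picked to yield an odds ratio near 8.2.
low_ifg_gave_in, low_ifg_resisted = 70, 30
high_ifg_gave_in, high_ifg_resisted = 22, 78

odds_low = low_ifg_gave_in / low_ifg_resisted    # odds of giving in, low-IFG group
odds_high = high_ifg_gave_in / high_ifg_resisted  # odds of giving in, high-IFG group
odds_ratio = odds_low / odds_high
print(round(odds_ratio, 2))  # 8.27
```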
“Taken together, the results from the present study provide initial evidence for neural markers of everyday eating behaviors that can identify individuals who are more likely than others to give in to temptations to eat,” the researchers write.
Lopez, Heatherton, and colleagues are currently conducting studies focused on groups of people who are especially prone to self-regulation failure: chronic dieters.
They’re investigating, for example, how dieters’ brains respond to food cues after they’ve exhausted their self-control resources. The researchers hypothesize that depleting self-control may heighten reward-related brain activity, effectively “turning up the volume on temptations,” and predicting behaviors like overeating in daily life.
“Failures of self-control contribute to nearly half of all deaths in the United States each year,” the researchers note. “Our findings and future research may ultimately help people learn ways to resist their temptations.”
In recognizing speech sounds, the brain does not work the way a computer does
How does the brain decide whether or not something is correct? When it comes to the processing of spoken language – particularly whether or not certain sound combinations are allowed in a language – the common theory has been that the brain applies a set of rules to determine whether combinations are permissible. Now the work of a Massachusetts General Hospital (MGH) investigator and his team supports a different explanation – that the brain decides whether or not a combination is allowable based on words that are already known. The findings may lead to better understanding of how brain processes are disrupted in stroke patients with aphasia and also address theories about the overall operation of the brain.
"Our findings have implications for the idea that the brain acts as a computer, which would mean that it uses rules – the equivalent of software commands – to manipulate information. Instead it looks like at least some of the processes that cognitive psychologists and linguists have historically attributed to the application of rules may instead emerge from the association of speech sounds with words we already know," says David Gow, PhD, of the MGH Department of Neurology.
"Recognizing words is tricky – we have different accents and different, individual vocal tracts; so the way individuals pronounce particular words always sounds a little different," he explains. "The fact that listeners almost always get those words right is really bizarre, and figuring out why that happens is an engineering problem. To address that, we borrowed a lot of ideas from other fields and people to create powerful new tools to investigate, not which parts of the brain are activated when we interpret spoken sounds, but how those areas interact."
Human beings speak more than 6,000 distinct languages, and each language allows some ways to combine speech sounds into sequences but prohibits others. Although individuals are not usually conscious of these restrictions, native speakers have a strong sense of whether or not a combination is acceptable.
“Most English speakers could accept ‘doke’ as a reasonable English word, but not ‘lgef,’” Gow explains. “When we hear a word that does not sound reasonable, we often mishear or repeat it in a way that makes it sound more acceptable. For example, the English language does not permit words that begin with the sounds ‘sr-,’ but that combination is allowed in several languages, including Russian. As a result, most English speakers pronounce the Sanskrit word ‘sri’ – as in the name of the island nation Sri Lanka – as ‘shri,’ a combination of sounds found in English words like shriek and shred.”
Gow’s method of investigating how the human brain perceives and distinguishes among elements of spoken language combines electroencephalography (EEG), which records electrical brain activity; magnetoencephalography (MEG), which measures the subtle magnetic fields produced by brain activity; and magnetic resonance imaging (MRI), which reveals brain structure. Data gathered with those technologies are then analyzed using Granger causality, a method developed to determine cause-and-effect relationships among economic events, along with a Kalman filter, a procedure used to navigate missiles and spacecraft by predicting where something will be in the future. The results are “movies” of brain activity showing not only where and when activity occurs but also how signals move across the brain on a millisecond-by-millisecond level, information no other research team has produced.
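In its simplest bivariate form, Granger causality asks whether the past of one signal improves prediction of another beyond that signal's own past. A minimal sketch on synthetic data (NumPy only; the study itself used a far more elaborate multivariate, Kalman-filtered variant over many brain regions):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic example: x "Granger-causes" y with a one-step lag.
n = 1000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.3 * y[t - 1] + 0.8 * x[t - 1] + 0.5 * rng.standard_normal()

def rss(target, predictors):
    """Residual sum of squares of an ordinary least-squares fit (with intercept)."""
    X = np.column_stack([np.ones(len(target))] + predictors)
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return float(((target - X @ beta) ** 2).sum())

lag = 1
yt, y1, x1 = y[lag:], y[:-lag], x[:-lag]
rss_restricted = rss(yt, [y1])      # y predicted from its own past only
rss_full = rss(yt, [y1, x1])        # ...plus x's past

# F-statistic: does adding x's history improve the prediction of y?
df_num, df_den = 1, len(yt) - 3
f_stat = ((rss_restricted - rss_full) / df_num) / (rss_full / df_den)
print(f_stat > 10)  # strongly significant here, since x truly drives y
```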
In a paper published earlier this year in the online journal PLOS ONE, Gow and his co-author Conrad Nied, now a PhD candidate at the University of Washington, described their investigation of how the neural processes involved in the interpretation of sound combinations differ depending on whether or not a combination would be permitted in the English language. Their goal was to determine which of three potential mechanisms is actually involved in the way humans “repair” nonpermissible sound combinations – the application of rules regarding sound combinations, the frequency with which particular combinations have been encountered, or the occurrence of sound combinations in known words.
The study enrolled 10 adult American English speakers who listened to a series of recordings of spoken nonsense syllables that began with sounds ranging from “s” to “shl” – a combination not found at the beginning of English words – and indicated by means of a button press whether they heard an initial “s” or “sh.” EEG and MEG readings were taken during the task, and the results were projected onto MR images taken separately. Analysis focused on 22 regions of interest where brain activation increased during the task, with particular attention to those regions’ interactions with an area previously shown to play a role in identifying speech sounds.
While the results revealed complex patterns of interaction between the measured regions, the areas that had the greatest effect on regions that identify speech sounds were regions involved in the representation of words, not those responsible for rules. “We found that it’s the areas of the brain involved in representing the sound of words, not sounds in isolation or abstract rules, that send back the important information. And the interesting thing is that the words you know give you the rules to follow. You want to put sounds together in a way that’s easy for you to hear and to figure out what the other person is saying,” explains Gow, who is a clinical instructor in Neurology at Harvard Medical School and a professor of Psychology at Salem State University.
You Took the Words Right Out of My Brain
Our brain activity is more similar to that of speakers we are listening to when we can predict what they are going to say, a team of neuroscientists has found. The study, which appears in the Journal of Neuroscience, provides fresh evidence on the brain’s role in communication.
“Our findings show that the brains of both speakers and listeners take language predictability into account, resulting in more similar brain activity patterns between the two,” says Suzanne Dikker, the study’s lead author and a post-doctoral researcher in New York University’s Department of Psychology and Utrecht University. “Crucially, this happens even before a sentence is spoken and heard.”
“A lot of what we’ve learned about language and the brain has been from controlled laboratory tests that tend to look at language in the abstract—you get a string of words or you hear one word at a time,” adds Jason Zevin, an associate professor of psychology and linguistics at the University of Southern California and one of the study’s co-authors. “They’re not so much about communication, but about the structure of language. The current experiment is really about how we use language to express common ground or share our understanding of an event with someone else.”
The study’s other authors were Lauren Silbert, a recent PhD graduate from Princeton University, and Uri Hasson, an assistant professor in Princeton’s Department of Psychology.
Traditionally, it was thought that our brains always process the world around us from the “bottom up”—when we hear someone speak, our auditory cortex first processes the sounds, and then other areas in the brain put those sounds together into words and then sentences and larger discourse units. From here, we derive meaning and an understanding of the content of what is said to us.
However, in recent years, many neuroscientists have shifted to a “top-down” view of the brain, which they now see as a “prediction machine”: We are constantly anticipating events in the world around us so that we can respond to them quickly and accurately. For example, we can predict words and sounds based on context—and our brain takes advantage of this. For instance, when we hear “Grass is…” we can easily predict “green.”
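The “Grass is… green” kind of predictability can be illustrated with a toy bigram model: count which word follows which in past experience, and predict the most frequent continuation. This is only a cartoon of statistical prediction, not a model of cortex:

```python
from collections import Counter, defaultdict

# A tiny made-up corpus standing in for past linguistic experience.
corpus = ("grass is green . grass is green . grass is green . "
          "the sky is blue . grass is soft .").split()

# Bigram counts: how often each word follows a given word.
following = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    following[w1][w2] += 1

def predict(word):
    """Most expected next word, with its conditional probability."""
    counts = following[word]
    total = sum(counts.values())
    best, c = counts.most_common(1)[0]
    return best, c / total

print(predict("is"))  # ('green', 0.6)
```

The brain's "prediction machine" is presumed to exploit exactly this kind of context statistics, only over vastly richer representations than word pairs.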
What’s less understood is how this predictability might affect the speaker’s brain, or even the interaction between speakers and listeners.
In the Journal of Neuroscience study, the researchers collected brain responses from a speaker while she described images that she had viewed. These images varied in terms of likely predictability for a specific description. For instance, one image showed a penguin hugging a star (a relatively easy image in which to predict a speaker’s description). However, another image depicted a guitar stirring a bicycle tire submerged in a boiling pot of water—a picture that is much less likely to yield a predictable description: Is it “a guitar cooking a tire,” “a guitar boiling a wheel,” or “a guitar stirring a bike”?
Then, another group of subjects listened to those descriptions while viewing the same images. During this period, the researchers monitored the subjects’ brain activity.
When comparing the speaker’s brain responses directly to the listeners’ brain responses, they found that activity patterns in brain areas where spoken words are processed were more similar between the listeners and the speaker when the listeners could predict what the speaker was going to say.
When listeners can predict what a speaker is going to say, the authors suggest, their brains take advantage of this by sending a signal to their auditory cortex that it can expect sound patterns corresponding to predicted words (e.g., “green” while hearing “grass is…”). Interestingly, they add, the speaker’s brain is showing a similar effect as she is planning what she will say: brain activity in her auditory language areas is affected by how predictable her utterance will be for her listeners.
“In addition to facilitating rapid and accurate processing of the world around us, the predictive power of our brains might play an important role in human communication,” notes Dikker, who conducted some of the research as a post-doctoral fellow at Weill Cornell Medical College’s Sackler Institute for Developmental Psychobiology. “During conversation, we adapt our speech rate and word choices to each other—for example, when explaining science to a child as opposed to a fellow scientist—and these processes are governed by our brains, which correspondingly align to each other.”
A brain area activated by group decisions can distinguish people more likely to conform to the will of a group, say researchers from UCL.
The team, led by Dr Tali Sharot, UCL Affective Brain Lab, monitored the brain activity of individuals in groups of five people choosing food or drink they’d like to consume before and after being told the most popular choice in their group.

The results showed that people were likely to conform to the most popular choice in their group if their original preference was different.
Caroline Charpentier (UCL Institute of Cognitive Neuroscience) said: “Most people don’t think their everyday decisions, such as having eggs on toast for breakfast or a pint of lager at the pub, are influenced by other people’s preferences.”
She added: “But our results suggest that when other people make different choices than you, for example your friends order beer while you order wine, your brain records this information and this signal is mirrored in your brain later on, for example when you order another drink, making you more likely to choose beer, even if you initially preferred wine”.
The team used functional magnetic resonance imaging (fMRI) to monitor the brain responses of 20 volunteers during a decision-making task, while 78 more volunteers completed the task simultaneously on computers located outside the MRI room. All came to the lab in small groups of five.
In one session, volunteers were shown 60 pairs of food and drink items and asked to select which item of each pair they would prefer to consume at the end of the experiment. Straight after making this choice, the participants were told which item most people in their group selected. This part of the experiment provided the volunteers with social feedback.
Volunteers then took part in a following session a few minutes later, when they opted again for which item they would prefer to consume from the same series of pairs, but this time made the choice for themselves and did not receive any social feedback.
After the experiment, the participants completed a personality questionnaire that assessed trait conformity, which measures their general tendency to follow other people’s ideas and behaviours. Comparison of results from the choice experiment and conformity questionnaire indeed showed that people who scored high on trait conformity were about twice as likely to change their food choices to agree with the group decision as people who scored low for conformity.
What differed between the brains of people who were more likely to conform and people who held on to their own opinion?
The imaging study showed that the orbito-frontal cortex (OFC) – a region at the front of the brain that has been associated with emotional and social behaviour – was active during the two choice sessions and distinguished between these two groups of people.
Miss Charpentier said: “The orbito-frontal cortex was the only region specifically activated, and the first area to react to group disagreement. This region was activated both at the time of the initial social conflict (when your friends all choose beer while you prefer wine) and also later when you make an individual choice (when you order another drink for yourself).”
The OFC has previously been associated with emotions and social behaviour. Some clinical studies have suggested that people with brain damage in the OFC may behave inappropriately in groups.
Miss Charpentier concluded: “When OFC activity during the initial social conflict is mirrored in your brain at a later time when you make an individual choice, you are more likely to change your choice and follow the group. This is what happens in ‘high conformers.’ In other words, it is the temporal dynamics of the OFC that distinguishes ‘conformers’ from people who hold on to their own initial opinion.”
(Source: ucl.ac.uk)

Controlling Brain Waves to Improve Vision
Have you ever accidentally missed a red light or a stop sign? Or have you heard someone mention a visible event that you passed by but totally missed seeing?
“When we have different things competing for our attention, we can only be aware of so much of what we see,” said Kyle Mathewson, Beckman Institute Postdoctoral Fellow. “For example, when you’re driving, you might really be concentrating on obeying traffic signals.”
But say there’s an unexpected event: an emergency vehicle, a pedestrian, or an animal running into the road—will you actually see the unexpected, or will you be so focused on your initial task that you don’t notice?
“In the car, we may see something so brief or so faint, while we’re paying attention to something else, that the event won’t come into our awareness,” says Mathewson. “If you present this scenario hundreds of times to someone, sometimes they will see the unexpected event, and sometimes they won’t because their brain is in a different preparation state.”
By using a novel technique to test brain waves, Mathewson and colleagues are discovering how the brain processes external stimuli that do and don’t reach our awareness. A paper about their results, “Dynamics of Alpha Control: Preparatory Suppression of Posterior Alpha Oscillations by Frontal Modulators Revealed with Combined EEG and Event-related Optical Signal,” published this month in the Journal of Cognitive Neuroscience, reveals how alpha waves, typically thought of as your brain’s electrical activity while it’s at rest, can actually influence what we see or don’t see.
The researchers used both electroencephalography (EEG) and the event-related optical signal (EROS), developed in the Cognitive Neuroimaging Laboratory of Gabriele Gratton and Monica Fabiani, professors of psychology and members of the Beckman Institute’s Cognitive Neuroscience Group, and authors of the study.
While EEG records the electrical activity along the scalp, EROS uses infrared light passed through optical fibers to measure changes in the optical properties of active areas of the cerebral cortex. Because the skull lies between the EEG sensors and the brain, it can be difficult to determine exactly where EEG signals are produced. EROS, which examines how light is scattered, can noninvasively pinpoint activity within the brain.
“EROS is based on near-infrared light,” explained Fabiani and Gratton via email. “It exploits the fact that when neurons are active, they swell a little, becoming slightly more transparent to light: this allows us to determine when a particular part of the cortex is processing information, as well as where the activity occurs.”
This allowed the researchers to not only measure activity in the brain, but also allowed them to map where the alpha oscillations were originating. Their discovery: the alpha waves are produced in the cuneus, located in the part of the brain that processes visual information.
The alpha can inhibit what is processed visually, making it hard for you to see something unexpected.
By focusing your attention and concentrating more fully on what you are experiencing, however, the executive function of the brain can come into play and provide “top-down” control—putting a brake on the alpha waves, thus allowing you to see things that you might have missed in a more relaxed state.
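"Alpha" here refers to the roughly 8–12 Hz band of the EEG, and a common way to quantify it is band power from the Fourier spectrum. A sketch on a synthetic signal (NumPy only; real EEG pipelines typically use multitaper or wavelet estimates, and the numbers here are illustrative):

```python
import numpy as np

fs = 250                      # sampling rate in Hz, typical for EEG
t = np.arange(0, 4, 1 / fs)   # 4 seconds of data
rng = np.random.default_rng(3)

# Synthetic EEG: a 10 Hz alpha rhythm buried in broadband noise.
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)

# Band power via the FFT: average spectral power inside a frequency band.
freqs = np.fft.rfftfreq(eeg.size, 1 / fs)
power = np.abs(np.fft.rfft(eeg)) ** 2 / eeg.size

def band_power(lo, hi):
    mask = (freqs >= lo) & (freqs <= hi)
    return power[mask].mean()

alpha = band_power(8, 12)
broadband = band_power(30, 100)
print(alpha > broadband)  # the alpha band dominates this synthetic signal
```

Suppressing alpha, in these terms, means driving the 8–12 Hz band power down relative to baseline in visual cortex before a target appears.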
“We found that the same brain regions known to control our attention are involved in suppressing the alpha waves and improving our ability to detect hard-to-see targets,” said Diane Beck, a member of the Beckman’s Cognitive Neuroscience Group, and one of the study’s authors.
“Knowing where the waves originate means we can target that area specifically with electrical stimulation,” said Mathewson. “Or we can also give people moment-to-moment feedback, which could be used to alert drivers that they are not paying attention and should increase their focus on the road ahead, or in other situations alert students in a classroom that they need to focus more, or athletes, or pilots and equipment operators.”
The study examined 16 subjects and mapped the electrical and optical data onto individual MRI brain images.
A recently FDA-approved device has been shown to reduce seizures in patients with medication-resistant epilepsy by as much as 50 percent. When coupled with an innovative electrode placement planning system developed by physicians at Rush, the device facilitated the complete elimination of seizures in nearly half of the implanted Rush patients enrolled in the decade-long clinical trials.

That’s good news for a large portion of the nearly 400,000 people in the U.S. living with epilepsy whose seizures can’t be controlled with medications and who are not candidates for brain surgery.
Epilepsy is a chronic neurological condition characterized by recurrent seizures that disrupt the senses, or can involve short periods of unconsciousness or convulsions. “Many people with epilepsy have scores of unpredictable seizures every day that make it impossible for them to drive, work or even get a good night’s sleep,” said Dr. Marvin Rossi, co-principal investigator of the NeuroPace Pivotal Clinical Trial and assistant professor of neurology at the Rush Epilepsy Center.
The NeuroPace RNS System uses responsive, or ‘on-demand’ direct stimulation to detect abnormal electrical activity in the brain and deliver small amounts of electrical stimulation to suppress seizures before they begin.
The device is surgically placed underneath the scalp within the skull and connected to electrodes that are strategically placed within the brain where the seizures originate (called the seizure focus). A programmed computer chip in the skull communicates with the system to record data and to help regulate responsive stimulation.
The unique electrode placement planning model developed at Rush uses computationally intensive mapping to guide surgical placement of electrodes at the precise location in the brain’s temporal lobe circuitry; when stimulated, these extensive epileptic circuits are calmed. The model predicts where in the brain the activity begins and spreads, so that the device can better influence the maximal extent of the epileptic pathway.
The device also acts as an implanted EEG for recording brain activity. This function was first shown at Rush to help determine whether the patient will further benefit from a surgical resection, in which surgeons remove a portion of the temporal lobe network. Dr. Richard Byrne, chairman of Neurosurgery at Rush, implants the electrodes in the temporal lobes.
As a result, physicians at Rush can offer patients the new implantable neurostimulator device, a surgical resection or both with the possibility of completely eliminating seizures. “This device is also being used at Rush as a foundation and inspiration for building cutting-edge hybrid stimulation therapy-drug molecule delivery systems,” said Rossi.
“Devices that treat epilepsy may offer new hope to patients when medication is ineffective and resection is not an option,” said Rossi. “Not long ago, it was highly unlikely that these patients would ever be free of their seizures. Now, several of our Rush patients with this device are actually able to drive, lower or even eliminate their medications and aren’t as limited as they once were. There is no doubt that quality of life of the majority of our implanted patients is significantly improved.”
According to the Centers for Disease Control and Prevention, in 2010, epilepsy affected approximately 2.3 million adults in the U.S. and 467,711 children under the age of 17.
(Source: rush.edu)
First brain images of African infants enable research into cognitive effects of nutrition
Brain activity of babies in developing countries could be monitored from birth to reveal the first signs of cognitive dysfunction, using a new technique piloted by a London-based university collaboration.
The cognitive function of infants can be visualised and tracked more quickly, more accurately and more cheaply using the method, called functional near infra-red spectroscopy (fNIRS), than with the behavioural assessments Western researchers have relied upon for decades.
Professor Clare Elwell, Professor of Medical Physics at University College London (UCL), said: “Brain activity soon after birth has barely been studied in low-income countries, because of the lack of transportable brain imaging facilities needed to do this at any reasonable scale. We have high hopes of building on these promising findings to develop functional near infra-red spectroscopy into an assessment tool for investigating cognitive function of infants who may be at risk of malnutrition or childhood diseases associated with low income settings.”
The pioneering study, published this week in the Nature journal Scientific Reports, was performed by a collaboration of researchers from UCL; the London School of Hygiene and Tropical Medicine; the Babylab at Birkbeck, University of London; and the Medical Research Council unit in the Gambia. It aimed to investigate the impact of nutrition in resource-poor regions on infant brain development, and was funded by the Bill and Melinda Gates Foundation.
Professor Elwell added: “This is the first use of brain imaging methods to investigate localised brain activity in African infants.
“Until now, much of our understanding of brain development in low-income countries has relied upon behavioural assessments, which need careful cultural and linguistic translations to ensure they are accurate. Our technology, functional near infrared spectroscopy, can provide a more objective marker of brain activity.”
For the studies in the Gambia, babies aged 4–8 months were played sounds and shown videos of adults performing specific movements, such as playing ‘peek-a-boo’. The fNIRS system monitored changes in blood flow to the baby’s brain and showed that distinct brain regions responded to visual-social prompts, while others responded to auditory-social stimuli. Comparison of the results with those obtained from babies in the UK showed that the responses were similar in both groups.
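Under the hood, fNIRS infers those blood-flow changes by measuring light attenuation at two near-infrared wavelengths and converting it, via the modified Beer-Lambert law, into concentration changes of oxy- and deoxy-haemoglobin. A minimal sketch of that conversion; the extinction coefficients, separation and pathlength factor are illustrative placeholder values, not calibrated constants:

```python
import numpy as np

# Modified Beer-Lambert law per wavelength:
#   dOD = (e_HbO * dHbO + e_HbR * dHbR) * separation * DPF
# Placeholder coefficients preserving the key property that deoxy-Hb
# absorbs more near 760 nm and oxy-Hb more near 850 nm.
EXT = np.array([[1.5, 3.8],    # ~760 nm: [HbO, HbR] (1/(mM*cm))
                [2.5, 1.8]])   # ~850 nm: [HbO, HbR]
SEP = 3.0   # source-detector separation (cm), assumed
DPF = 6.0   # differential pathlength factor, assumed

def hb_changes(delta_od):
    """Solve the two-wavelength linear system for (dHbO, dHbR) in mM."""
    return np.linalg.solve(EXT * SEP * DPF, np.asarray(delta_od, dtype=float))

# Less absorption at 760 nm together with more at 850 nm is the classic
# signature of functional activation: oxy-haemoglobin up, deoxy down.
d_hbo, d_hbr = hb_changes([-0.01, 0.02])
```

Solving this small linear system per measurement channel is what lets a portable fNIRS cap report localised activation without the fixed scanner infrastructure that fMRI requires.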
fNIRS has previously been used to study brain development in UK infants and most recently to investigate early markers of autism during the first few months of life.
Professor Andrew Prentice (Medical Research Council International Nutrition Group, London School of Hygiene and Tropical Medicine) said: “Humans have evolved to survive and succeed on the basis of their large brain and intelligence, but nutritional deficits in early life can limit this success. In order to plan the best interventions to maximise brain function we need tools that can give us an early read out. fNIRS is showing great promise in this respect.”