Posts tagged brain
Consider a failed murder attempt. Or a simple mistake that causes another to die. Is one of these more acceptable than the other?
Neuroscientists don’t pretend to hold the answers as to how people know what is right and what is wrong. But studies show individual biology may influence the ways people process the actions of others.
It turns out we judge others not only for what they do, but also for what we perceive they are thinking while they do it.
Consider the following scenario: Grace and Sally are touring a chemical factory when Grace decides to grab a cup of coffee. Sally asks Grace to pour her a cup as well. Grace spots a container of white powder next to the coffee maker and, knowing that her friend takes sugar in her coffee, she pours some into Sally’s cup. As it turns out, the powder is poison, and Sally dies after a few sips.
Most of us would understand and maybe forgive Grace for accidentally poisoning — or even killing — her friend. But what would you think of Grace if you were to learn that she had a hunch that the powder was toxic, yet decided to add it to her friend’s cup anyway?
“Often, what determines moral blame is not what the outcome is, but what [we think] is going on in the mind of the person performing the act,” says Rebecca Saxe, a neuroscientist at the Massachusetts Institute of Technology who studies how the brain casts judgment.
Scientists are learning the ways the brain responds when we attempt to determine right from wrong. Ultimately, they hope such information will help show how the brain processes difficult situations.
What was she thinking?
One way scientists study how we make right-or-wrong judgments is to look at brain regions that are most active when people attempt to interpret the thoughts of others.
In some studies, participants read stories about characters who either accidentally or intentionally cause harm to others while scientists use functional magnetic resonance imaging (fMRI) to track how brain activity changes. Such studies show that thinking about another’s thoughts increases the activity of nerve cells in a brain region known as the right temporo-parietal junction, located behind the right ear.
As it turns out, some of these cells respond differently when presented with an intentional harm versus an accident. By zeroing in on the distinct patterns of activity in these cells, Saxe’s group discovered that they could accurately predict how forgiving the participants would be.
“People who say accidents are forgivable have really different [activity] patterns” than those less willing to overlook the unintentional harm, Saxe says.
Thinking about harm
Neuroscientists also study how people respond when asked how they themselves would act in morally challenging scenarios.
In one popular moral dilemma scenario, scientists ask participants to imagine the following: A runaway train is barreling down on five people. The only way to save these people is to hit a switch that would redirect the train onto tracks where it will kill only one person. Would you hit the switch?
What if, instead, you had to push a man off of a bridge to stop the train, knowing that doing so will kill him but save the lives of the others?
Researchers have run these scenarios by people with damage to the ventromedial prefrontal cortex — a region believed to be involved in the processing of emotions — and by people without such damage. Both groups equally support the decision to hit the switch to redirect the train to save more lives.
However, those with damage to the ventromedial prefrontal cortex are much more likely to endorse pushing the man in front of the train, a more direct and personal harm. These studies, led by neuroscientist Antonio Damasio of the University of Southern California, suggest the important role of emotion in the generation of such judgments.
To test how important the ventromedial prefrontal cortex is when we judge the actions of others, Damasio, along with neuroscientist Liane Young of Boston College, asked a small group of people with damage to this region to evaluate variations of the Grace and Sally story.
When told that Grace deliberately puts powder she believes is toxic into Sally’s cup, only to later learn the powder was sugar, healthy adults regularly condemn Grace’s failed attempt to harm her friend. However, people with ventromedial prefrontal cortex damage shrug off Grace’s action. As they see it, as long as Sally survives, Grace’s actions are no big deal.
Damasio says these results, along with others, reveal the role of the ventromedial prefrontal cortex and emotion in evaluating harmful intent.
That’s not fair
There is also evidence that changes in the chemistry of the brain influence how we behave when others treat us unfairly.
To measure how changes in brain chemistry affect people’s reactions to unfairness, University College London neuroscientist Molly Crockett and others gave study participants a drink to drive down levels of the neurotransmitter serotonin in the brain before asking them to play the ultimatum game.
In the ultimatum game, participants are paired with strangers they are told have been given a lump sum of money to share with them. The stranger determines how to divvy up the money, and proposes a split to the participant. The participant decides whether or not to accept the stranger’s offer. If the participant accepts, both players walk away with some money. However, a participant may reject the offer, believing it to be unfair, leaving both players empty-handed. Crockett found that people with lower levels of serotonin were more likely than others to reject offers they deemed to be unfair.
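The game's payoff logic is simple enough to sketch in a few lines. The thresholds and offer amounts below are invented for illustration, not figures from Crockett's study; lowered serotonin is modeled, loosely, as a stricter fairness threshold.

```python
import random

def ultimatum_round(total, offer, rejection_threshold):
    """One round: the responder accepts if the offer meets their fairness bar.

    Returns (proposer_payoff, responder_payoff)."""
    if offer >= rejection_threshold * total:
        return total - offer, offer   # accepted: both players get money
    return 0, 0                       # rejected: both leave empty-handed

# Hypothetical thresholds: serotonin depletion modeled as a stricter bar
normal_threshold = 0.25     # accepts offers of at least 25% of the pot
depleted_threshold = 0.40   # rejects anything under 40%

random.seed(0)
pot = 10
offers = [random.randint(1, 5) for _ in range(1000)]  # mostly lowball offers

normal_rejections = sum(
    ultimatum_round(pot, o, normal_threshold) == (0, 0) for o in offers)
depleted_rejections = sum(
    ultimatum_round(pot, o, depleted_threshold) == (0, 0) for o in offers)
print(normal_rejections < depleted_rejections)  # True
```

A stricter threshold rejects a strict superset of offers, so the "depleted" player always punishes unfairness at least as often, which is the qualitative pattern Crockett reported.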
When the scientists examined the brain activity of participants with depleted serotonin levels as they accepted or rejected the offers, they found that rejecting offers led to increased activity in the dorsal striatum — a region involved in processing reward. Crockett says the findings suggest that dips in serotonin can shift people’s motivations to punish unfairness. For instance, when you deplete serotonin, people who are normally more forgiving may become happier with revenge, she says.
Crockett notes that serotonin levels may fluctuate when people are hungry or stressed. The findings illustrate how individual differences in biology might influence the way people view, and respond to, the actions of others.
When art meets neuroscience, strange things happen.
Consider the Museum of Scientifically Accurate Fabric Brain Art in Oregon, which features rugs and knitting based on a brain scan motif. Or the neuroscientist at the University of Nevada-Reno who scanned the brain of a portrait artist while he drew a picture of a face.
And then there’s the ongoing war of words between scientists who think it’s possible to use analysis of brain activity to define beauty–or even art–and their critics who argue that it’s absurd to try to make sense of something so interpretive and contextual by tying it to biology and the behavior of neurons.
Beauty and the brain
On one side you have the likes of Semir Zeki, who heads a research center called the Institute of Neuroesthetics at University College London. A few years ago he started studying what happens in a person’s brain when they look at a painting or listen to a piece of music they find beautiful. He looked at the flip side, too–what goes on in there when something strikes us as ugly.
What he found is that when his study’s subjects experienced a piece of art or music they described as beautiful, their medial orbito-frontal cortex–the part of the brain just behind the eyes–“lit up” in brain scans. Art they found ugly stimulated their motor cortex instead. Zeki also discovered that whether the beauty came through their ears, in music, or their eyes, in art, the brain’s response was the same–it had increased blood flow to what’s known as its pleasure center. Beauty gave the brains a dopamine reward.
Zeki doesn’t go so far as to suggest that the essence of art can be captured in a brain scan. He insists his research really isn’t about explaining what art is, but rather what our neurons’ response to it can tell us about how brains work. But if, in the process, we learn about common characteristics in things our brains find beautiful, his thinking goes, what harm is there in that?
Beware of brain rules?
Plenty, potentially, responds the critics’ chorus. Writing recently in the journal Nature, Philip Ball makes the point that this line of research ultimately could lead to rule-making about beauty, to “creating criteria of right or wrong, either in the art itself or in individual reactions to it.” It conceivably could devolve to “scientific” formulas for beauty, guidelines for what, in music or art or literature, gets the dopamine flowing.
Although it is worth knowing that musical ‘chills’ are neurologically akin to the responses invoked by sex or drugs, an approach that cannot distinguish Bach from barbiturates is surely limited.
Others, such as University of California, Berkeley philosophy professor Alva Noë, suggest that to this point at least, brain science is too limiting in what it can reveal, that it focuses more on beauty as shaped by people’s preferences, as opposed to addressing the big questions, such as “Why does art move us?” and “Why does art matter?”
And he wonders if a science built around analyzing events in an individual’s brain can ever answer them. As he wrote in the New York Times:
…there can be nothing like a settled, once-and-for-all account of what art is, just as there can be no all-purpose account of what happens when people communicate or when they laugh together. Art, even for those who make it and love it, is always a question, a problem for itself. What is art? The question must arise, but it allows no definitive answer.
Fad or fortune?
So what of neuroaesthetics? Is it just another part of the “neuro” wave, where brain scans are billed as neurological Rosetta Stones that proponents claim can explain or even predict behavior–from who’s likely to commit crimes to why people make financial decisions to who’s going to gain weight in the next six months?
More jaded souls have suggested that neuroaesthetics and its bulky cousin, neurohumanities, are attempts to capture enough scientific sheen to attract research money back to liberal arts. Alissa Quart, writing in The Nation earlier this month, cut to the chase:
Neurohumanities offers a way to tap the popular enthusiasm for science and, in part, gin up more funding for humanities. It may also be a bid to give more authority to disciplines that are more qualitative and thus are construed, in today’s scientized and digitalized world, as less desirable or powerful.
Semir Zeki, of course, believes this is about much more than research grants. He really isn’t sure where neuroaesthetics will lead, but he’s convinced that only by “understanding the neural laws,” as he puts it, can we begin to make sense of morality, religion and yes, art.
Our researchers have found a previously unknown link between epileptic seizures and the signs of autism in adults.
Dr SallyAnn Wakeford from the Department of Psychology revealed that adults with epilepsy were more likely to show heightened traits of autism and Asperger syndrome.
Characteristics of autism, which include impairment in social interaction and communication as well as restricted and repetitive interests, can be severe and go unnoticed for many years, having tremendous impact on the lives of those who have them.
The research found that epileptic seizures disrupt neurological functions that support social behaviour in the brain, resulting in the same traits seen in autism.
Dr Wakeford said: “The social difficulties in epilepsy have been so far under-diagnosed and research has not uncovered any underlying theory to explain them. This new research links social difficulties to a deficit in somatic markers in the brain, explaining these characteristics in adults with epilepsy.”
Dr Wakeford and her colleagues discovered that having increased autistic traits was common to all epilepsy types; however, this was more pronounced in adults with Temporal Lobe Epilepsy (TLE).
The researchers suggest one explanation may be that anti-epileptic drugs are often less effective for TLE. They suspect these drugs are implicated because drug use was strongly related to the severity of autistic characteristics.
Dr Wakeford carried out a comprehensive range of studies with volunteers with epilepsy and discovered that all of the adults with epilepsy showed autism traits.
She said: “It is unknown whether these adults had a typical developmental period during childhood or whether they were predisposed to having autistic traits before the onset of their epilepsy. However what is known is that the social components of autistic characteristics in adults with epilepsy may be explained by social cognitive differences, which have largely been unrecognised until now.”
Dr Wakeford said the findings could lead to improved treatment for people with epilepsy and autism. She said: “Epilepsy has a history of cultural stigma, however the more we understand about the psychological consequences of epilepsy the more we can remove the stigma and mystique of this condition.
“These findings could mean that adults with epilepsy get access to better services, as there is a wider range of treatments available for those with autism.”
Margaret Rawnsley, research administration officer at Epilepsy Action welcomed the findings.
She said: “We welcome any research that could further our understanding of epilepsy and ultimately improve the lives of those with the condition. This research has the potential to tell us more about the links between epilepsy and other conditions, such as autism spectrum disorders.”
With obesity reaching epidemic levels in some parts of the world, scientists have only begun to understand why it is such a persistent condition. A study in the Journal of Biological Chemistry adds substantially to the story by reporting the discovery of a molecular chain of events in the brains of obese rats that undermined their ability to suppress appetite and to increase calorie burning.
It’s a vicious cycle, involving a breakdown in how brain cells process key proteins, that allows obesity to beget further obesity. But in a finding that might prove encouraging in the long term, the researchers at Brown University and Lifespan also found that they could intervene to break that cycle by fixing the core protein-processing problem.
Before the study, scientists knew that one mechanism by which obesity perpetuates itself is by causing resistance to leptin, a hormone that signals the brain about the status of fat in the body. But years ago senior author Eduardo A. Nillni, professor of medicine at Brown University and a researcher at Rhode Island Hospital, observed that after meals obese rats had a dearth of another key hormone — alpha-MSH — compared to rats of normal weight.
Alpha-MSH has two jobs in parts of the hypothalamus region of the brain. One is to suppress the activity of food-seeking brain cells. The second is to signal other brain cells to produce the hormone TRH, which prompts the thyroid gland to spur calorie burning activity in the body.
In the obese rats alpha-MSH was low, despite an abundance of leptin and despite normal levels of gene expression both for its biochemical precursor protein called pro-opiomelanocortin (POMC) and for a key enzyme called PC2 that processes POMC in brain cells. There had to be more to the story than just leptin, and it wasn’t a problem with expressing the needed genes.
Nillni and his co-authors, including lead authors Isin Cakir and Nicole Cyr, conducted the new study to find out where the alpha-MSH deficit was coming from. Nillni said he suspected that the problem might lie in the brain cells’ mechanism for processing the POMC protein to make alpha-MSH.
Protein processing problems
To do their work, the team fed some rats a high-calorie diet and fed others a normal diet for 12 weeks. The overfed rats developed the condition of “diet-induced obesity.” The team then studied the hormone levels and brain cell physiology of the rats. They also tested their findings by experimenting with the biochemistry of key individual cells on the lab bench.
They found that in the obese rats, a key “machine” in the brain cells’ assembly line of protein-making, called the endoplasmic reticulum (ER), becomes stressed and overwhelmed. The overloaded ER apparently fumbles the proper handling of PC2, perhaps discarding it because it can’t be folded up properly. The PC2 levels they measured in obese rats, for example, were 53 percent lower than in normal rats. Alpha-MSH peptides were also barely more than half as abundant in obese rats as they were in healthy rats.
“In our study we showed that what actually prevents the production of more alpha-MSH peptide is that ER stress was decreasing the biosynthesis of POMC by affecting one key enzyme that is essential for the formation of alpha-MSH,” Nillni said. “This is so novel. Nobody ever looked at that.”
Novel as it was, the story — a stressed ER mishandles PC2, which leaves POMC unprocessed, which impedes alpha-MSH production — needed experimental confirmation.
The team provided that confirmation in several ways: In obese rats they measured elevated levels of known markers of ER stress. They also purposely induced ER stress in cells using pharmacological agents and saw that both PC2 and alpha-MSH levels dropped.
Next they conducted an experiment to see if fixing ER stress would improve alpha-MSH production. They treated lean and obese rats for two days with a chemical called TUDCA, which is known to alleviate ER stress. If ER stress was responsible for the alpha-MSH production problems, the researchers reasoned, alpha-MSH should recover in obese rats treated with TUDCA. Sure enough, while TUDCA didn’t increase alpha-MSH production in normal rats, it increased it markedly in the obese rats.
Similarly on the benchtop they took mouse neurons that produce PC2 and POMC and pretreated some with a similar chemical called PBA that prevents ER stress. They left others untreated. Then they induced ER stress in all the cells. Under that ER stress, those that had been pretreated with PBA produced about twice as much PC2 as those that had not.
Nillni cautioned that although his team found ways to restore PC2 and alpha-MSH by treating ER stress in living rats and individual cells, the agents used in the study are not readily applicable as medicines for treating obesity in humans. There could well be unknown and unwanted side effects, for example, and TUDCA is not approved for human use by the U.S. Food and Drug Administration.
But by laying out the exact mechanism responsible for why the brains of the obese rats failed to curb appetite or spur greater calorie burning, Nillni said, the study points drug makers to several opportunities where they can intervene to break this new, vicious cycle that helps obesity to perpetuate itself.
“Understanding the central control of energy-regulating neuropeptides during diet-induced obesity is important for the identification of therapeutic targets to prevent and or mitigate obesity pathology,” the authors wrote.
The instability of “white matter” may contribute to greater cognitive decline during aging in humans than in chimpanzees, scientists from the Yerkes National Primate Research Center at Emory University have found.
Yerkes scientists have discovered that white matter — the wires connecting the computing centers of the brain — begins to deteriorate earlier in the human lifespan than in the lives of aging chimpanzees.
This was the first examination of white matter integrity in aging chimpanzees. The results were published April 24 and are available online before print in the journal Neurobiology of Aging.
“Our study demonstrates that the price we pay for greater longevity than other primates may be the unique vulnerability of humans to neurodegenerative disease,” says research associate Xu (Jerry) Chen, first author of the paper. “The breakdown of white matter in later life could be part of that vulnerability.”
Both humans’ longer life spans and distinctive metabolism could lie behind the differences in the patterns of brain aging, says co-author Todd Preuss, PhD, associate research professor in Yerkes’ Division of Neuropharmacology and Neurologic Diseases.
“White matter integrity actually peaks around the same absolute age in both chimpanzees and humans, but humans may experience more degradation because they live longer. Perhaps the need to retain brain capacity late in life is one reason increased brain size was selected for in human evolution,” Preuss says.
The senior author is James Rilling, PhD, Yerkes researcher, associate professor of anthropology at Emory and director of the Laboratory for Darwinian Neuroscience. Collaborators at the University of Oslo also contributed to the paper.
In the brain, gray matter represents information processing centers, while white matter represents wires connecting these centers. White matter looks white because it is made up of myelin, a fatty electrical insulator that coats the axons of neurons.
If myelin deteriorates, neurons’ electrical signals are not transmitted as effectively, which contributes to cognitive decline. Myelin breakdown has been linked with cognitive decline both in healthy aging and in the context of Alzheimer’s disease.
The team’s data show that white matter integrity, as measured through a form of magnetic resonance imaging (MRI), peaks at age 31 in chimpanzees and at age 30 in humans. The average lifespan of chimpanzees is between 40 and 45 years, although in zoos or research facilities some have lived to 60. For comparison, human life expectancy in some developed countries is more than 80 years.
“The human equivalent of a 31-year-old chimpanzee is about 47 years,” Rilling says. “Extrapolating from chimpanzees, we could expect that human white matter integrity would peak at age 47, but instead it peaks and begins to decline at age 30.”
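Rilling's extrapolation is a simple lifespan scaling. A minimal sketch, using only the 31-to-47 equivalence from his quote (the function name and everything else here are illustrative):

```python
def human_equivalent_age(chimp_age, scaling=47 / 31):
    """Map a chimpanzee age onto the human lifespan using the stated
    equivalence: a 31-year-old chimp corresponds to a 47-year-old human."""
    return chimp_age * scaling

# Chimp white-matter integrity peaks at 31; scaled to the human lifespan,
# that predicts a peak near 47, yet human integrity peaks at about 30.
predicted_human_peak = human_equivalent_age(31)
observed_human_peak = 30
print(round(predicted_human_peak), observed_human_peak)  # 47 30
```

The 17-year gap between the predicted and observed peaks is what the authors interpret as humans outliving their white matter.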
The researchers collected MRI scans from 32 female chimpanzees and 20 female rhesus macaques and compared them with a pre-existing set of scans from human females. They used diffusion-weighted imaging (a form of MRI) to examine age-related changes in white matter integrity.
Diffusion-weighted imaging picks up microscopic changes in white matter by detecting directional differences in the ability of water molecules to diffuse. When the myelin coating of axons breaks down, water molecules in the brain can diffuse more freely, especially in directions perpendicular to axon bundles, Chen says.
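Diffusion-weighted studies typically summarize this directionality with fractional anisotropy (FA), a standard scalar computed from the three eigenvalues of the diffusion tensor. The paper's exact pipeline isn't described here, so the eigenvalues below are invented for illustration:

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """Fractional anisotropy from the three diffusion-tensor eigenvalues.

    FA approaches 1 when water diffuses mainly along one axis (intact axon
    bundles) and 0 when diffusion is equal in all directions."""
    num = math.sqrt((l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2)
    den = math.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    return math.sqrt(0.5) * num / den

# Invented eigenvalues (units of 10^-3 mm^2/s) for two tissue states:
healthy = fractional_anisotropy(1.7, 0.3, 0.3)   # strongly directional diffusion
degraded = fractional_anisotropy(1.0, 0.8, 0.7)  # myelin breakdown: more isotropic
print(healthy > degraded)  # True: loss of integrity lowers FA
```

This is the sense in which freer, less directional diffusion registers as lower white matter "integrity" in such studies.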
New technology developed at the University of California, Berkeley, uses wireless signals to provide real-time, non-invasive diagnoses of brain swelling or bleeding.
The device analyzes data from low energy electromagnetic waves that are similar to those used to transmit radio and mobile signals. The technology, described in the May 14 issue of the journal PLOS ONE, could potentially become a cost-effective tool for medical diagnostics and to triage injuries in areas where access to medical care, especially medical imaging, is limited.
The researchers tested a prototype in a small-scale pilot study of healthy adults and brain trauma patients admitted to a military hospital for the Mexican Army. The results from the healthy participants were clearly distinguishable from the patients with brain damage, and data for bleeding was distinct from data for swelling.
Boris Rubinsky, Professor of the Graduate School at UC Berkeley’s Department of Mechanical Engineering, led the research team along with César A. González, a professor in Mexico at the Instituto Politécnico Nacional, Escuela Superior de Medicina (National Polytechnic Institute’s Superior School of Medicine).
“There are large populations in Mexico and the world that do not have adequate access to advanced medical imaging, either because it is too costly or the facilities are far away,” said González. “This technology is inexpensive, it can be used in economically disadvantaged parts of the world and in rural areas that lack industrial infrastructure, and it may substantially reduce the cost and change the paradigm of medical diagnostics. We have also shown that the technology could be combined with cell phones for remote diagnostics.”
Rubinsky noted that symptoms of serious head injuries and brain damage are not always immediately obvious, and for treatment, time is of the essence. For example, the administration of clot-busting medication for certain types of strokes must be given within three hours of the onset of symptoms.
“Some people might delay traveling to a hospital to get examined because it is an hour or more away, or because it is exceedingly expensive,” said Rubinsky. “If people had access to an affordable device that could indicate whether there is brain damage or not, they could then make an informed decision about making that trip to a facility to get prompt treatment, which is especially important for head injuries.”
The researchers took advantage of the characteristic changes in tissue composition and structure in brain injuries. For brain edemas, swelling results from an increase in fluid in the tissue. For brain hematomas, internal bleeding causes the buildup of blood in certain regions of the brain. Because fluid conducts electricity differently than brain tissue, it is possible to measure changes in electromagnetic properties. Computer algorithms interpret the changes to determine the likelihood of injury.
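The paper's algorithms aren't spelled out here, but the underlying idea, matching shifts in the received signal against reference signatures for each condition, can be caricatured with a nearest-centroid sketch. All feature names and numbers below are invented:

```python
# Toy nearest-centroid classifier over invented (amplitude_shift, phase_shift)
# features; a real device would use calibrated signatures, not these numbers.
REFERENCE = {
    "healthy":  (0.02, 0.5),
    "edema":    (0.15, 2.0),   # extra fluid: conductivity rises
    "hematoma": (0.30, 4.5),   # pooled blood: larger shift still
}

def classify(amplitude_shift, phase_shift):
    """Return the reference condition whose signature is closest (Euclidean)."""
    def dist(sig):
        da = amplitude_shift - sig[0]
        dp = phase_shift - sig[1]
        return (da * da + dp * dp) ** 0.5
    return min(REFERENCE, key=lambda cond: dist(REFERENCE[cond]))

print(classify(0.28, 4.2))  # closest to the hematoma signature
```

The pilot study's key result maps onto this picture: healthy, edema, and hematoma readings clustered distinctly enough to tell apart.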
The study involved 46 healthy adults, ages 18 to 48, and eight patients with brain damage, ages 27 to 70.
The engineers fashioned two coils into a helmet-like device that was fitted over the heads of the study participants. One coil acted as a radio emitter and the other served as the receiver. Electromagnetic signals were broadcast through the brain from the emitter to the receiver.
“We have adjusted the coils so that if the brain works perfectly, we have a clean signal,” said Rubinsky. “Whenever there are interferences in the functioning of the brain, we detect them as changes in the received signal. We can tell from the changes, or ‘noises,’ what the brain injury is.”
Rubinsky noted that the waves are extremely weak, and are comparable to standing in a room with the radio or television turned on.
The device’s diagnoses for the brain trauma patients in the study matched the results obtained from conventional computerized tomography (CT) scans.
The tests also revealed some insights into the aging brain.
“With an increase in age, the average electromagnetic transmission signature of a normal human brain changes and approaches that of younger patients with a severe medical condition of hematoma in the brain,” said González. “This suggests the potential for the device to be used as an indication for the health of the brain in older patients in a similar way in which measurements of blood pressure, ECG, cholesterol or other health markers are used for diagnostic of human health conditions.”
Data from more than 180 research papers suggests fish oils could minimise the effects that junk food can have on the brain, a review by researchers at the University of Liverpool has shown.
The team at the University’s Institute of Ageing and Chronic Disease reviewed research from around the world to see whether there was sufficient data available to suggest that omega-3s had a role to play in aiding weight loss.
Stimulating the brain
Research over the past 10 years has indicated that high-fat diets could disrupt neurogenesis, a process that generates new nerve cells, but diets rich in omega-3s could prevent these negative effects by stimulating the areas of the brain that control feeding, learning and memory.
Data from 185 research papers revealed, however, that fish oils do not have a direct impact on this process in these areas of the brain, but are likely to play a significant role in blocking the ability of refined sugars and saturated fats to inhibit the brain’s control over the body’s intake of food.
Dr Lucy Pickavance, from the University’s Institute of Ageing and Chronic Disease, explains: “Body weight is influenced by many factors, and some of the most important of these are the nutrients we consume. Excessive intake of certain macronutrients, the refined sugars and saturated fats found in junk food, can lead to weight gain, disrupt metabolism and even affect mental processing.
“These changes can be seen in the brain’s structure, including its ability to generate new nerve cells, potentially linking obesity to neurodegenerative diseases. Research, however, has suggested that omega-3 fish oils can reverse or even prevent these effects. We wanted to investigate the literature on this topic to determine whether there is evidence to suggest that omega-3s might aid weight loss by stimulating particular brain processes.”
Research papers showed that, on high-fat diets, hormones that are secreted from body tissues into the circulation after eating, and which normally protect neurons and stimulate their growth, are prevented from passing into the brain by increased circulation of inflammatory molecules and a type of fat called triglycerides.
Molecules that stimulate nerve growth are also reduced, but it appears, in studies with animal models, that omega-3s restore normal function by interfering with the production of these inflammatory molecules, suppressing triglycerides, and returning these nerve growth factors to normal.
Dr Pickavance added: “Fish oils don’t appear to have a direct impact on weight loss, but they may take the brakes off the detrimental effects of some of the processes triggered in the brain by high-fat diets. They seem to mimic the effects of calorie restrictive diets and including more oily fish or fish oil supplements in our diets could certainly be a positive step forward for those wanting to improve their general health.”
The research is published in the British Journal of Nutrition. Dr Pickavance will also be discussing the effects of high-fat diets on meal patterns and the impacts of high-saturated fats on muscle composition at the 20th European Congress on Obesity at the Liverpool Arena and Convention Centre later this month.
Your brain often works on autopilot when it comes to grammar. That theory has been around for years, but University of Oregon neuroscientists have captured elusive hard evidence that people indeed detect and process grammatical errors with no awareness of doing so.
Participants in the study — native English-speaking people, ages 18 to 30 — had their brain activity recorded using electroencephalography, from which researchers focused on a signal known as the event-related potential (ERP). This non-invasive technique allows for the capture of changes in brain electrical activity during an event. In this case, events were short sentences presented visually one word at a time.
Subjects were given 280 experimental sentences, including some that were syntactically (grammatically) correct and others containing grammatical errors, such as “We drank Lisa’s brandy by the fire in the lobby,” or “We drank Lisa’s by brandy the fire in the lobby.” A 50-millisecond audio tone was also played at some point in each sentence, appearing either before or after a grammatical faux pas was presented. The auditory distraction also appeared in grammatically correct sentences.
This approach, said lead author Laura Batterink, a postdoctoral researcher, provided a signature of whether awareness was at work during processing of the errors. “Participants had to respond to the tone as quickly as they could, indicating if its pitch was low, medium or high,” she said. “The grammatical violations were fully visible to participants, but because they had to complete this extra task, they were often not consciously aware of the violations. They would read the sentence and have to indicate if it was correct or incorrect. If the tone was played immediately before the grammatical violation, they were more likely to say the sentence was correct even if it wasn’t.”
When tones appeared after grammatical errors, subjects detected 89 percent of the errors. In cases where subjects correctly declared errors in sentences, the researchers found a P600 effect, an ERP response in which the error is recognized and corrected on the fly to make sense of the sentence.
When the tones appeared before the grammatical errors, subjects detected only 51 percent of them. The tone before the event, said co-author Helen J. Neville, who holds the UO's Robert and Beverly Lewis Endowed Chair in psychology, created a "blink" in their attention. The key to conscious awareness, she said, is whether or not a person can declare an error, and the tones disrupted participants' ability to declare the errors. Even when the participants did not notice these errors, however, their brains responded to them, generating an early negative ERP response. These undetected errors also delayed participants' reaction times to the tones.
“Even when you don’t pick up on a syntactic error your brain is still picking up on it,” Batterink said. “There is a brain mechanism recognizing it and reacting to it, processing it unconsciously so you understand it properly.”
The study was published in the May 8 issue of the Journal of Neuroscience.
The brain processes syntactic information implicitly, in the absence of awareness, the authors concluded. “While other aspects of language, such as semantics and phonology, can also be processed implicitly, the present data represent the first direct evidence that implicit mechanisms also play a role in the processing of syntax, the core computational component of language.”
It may be time to reconsider some teaching strategies, especially how adults are taught a second language, said Neville, a member of the UO’s Institute of Neuroscience and director of the UO’s Brain Development Lab.
Children, she noted, often pick up grammar rules implicitly through routine daily interactions with parents or peers, simply hearing and processing new words and their usage before any formal instruction. She likened such learning to "Jabberwocky," the nonsense poem introduced by writer Lewis Carroll in 1871 in "Through the Looking-Glass," where Alice discovers a book in an unrecognizable language that turns out to be written inversely and readable in a mirror.
For a second language, she said, “Teach grammatical rules implicitly, without any semantics at all, like with jabberwocky. Get them to listen to jabberwocky, like a child does.”
Arizona had one of the worst allergy seasons in recent memory this year. Even people who normally don’t suffer found themselves with itchy eyes and runny noses.
Thankfully, allergy season lasts only a couple of months out of the year. But one Valley man had year-round symptoms: a runny nose, all the time.
He was shocked to find out, after years of suffering, that his runny nose was really a leaking brain.
Joe Nagy first noticed it when he sat up to get out of bed.
“Brooop! This clear liquid dribbled out of my nose like tears out of your eyes. I go what is this?”
A runny nose that got worse.
“Once or twice a week. Then pretty soon it was all the time.”
He started taking allergy medicine, but the runny nose didn’t stop.
“I got to the point where I had tissues all the time. My pocket was full of tissues; I always had them all folded up.”
He still remembers the embarrassing moments when he couldn’t get to the tissues in time, like when he was picking up blueprints for his model airplanes.
“It was about a teaspoon full. Splashed all over the top sheet… I said, these damn allergies. I was embarrassed as hell.”
Fed up with the runny nose, Joe went to a specialist to have the fluid dripping from his nose tested, and found out it wasn’t a runny nose at all. It was leaking brain fluid.
“I was scared to death if you want to know the truth.”
The membrane surrounding Joe’s brain had a hole in it and his brain fluid was leaking out.
“You don’t really think about it, but our brains are really just above our noses all of the time,” says Barrow Neurological Institute neurosurgeon Peter Nakaji.
“This is one of the more common conditions to be missed for a long time… because so many people have runny noses.”
Joe was ready to have brain surgery to fix the leak when he came down with a near-deadly case of meningitis: the leaking brain fluid had become infected.
“Some people come in with meningitis and at first they have to be treated to stop the infection itself. Then as soon as the infection is under control we repair the leak.”
You might wonder how Joe could have brain fluid leaking out of his nose for a year and a half. Wouldn’t the brain dry out?
Each day our bodies produce about 12 ounces of brain fluid, give or take, enough to keep the brain constantly bathed in liquid.
“These leaks can be very very tiny, a little like a puncture on a bicycle tire, that sometimes you have trouble even finding where it is.”
Dr. Nakaji eventually found the leak.
“If you look right here you can see a little tiny hole. You see a little bit of what looks like running water.”
Dr. Nakaji showed us how this problem is fixed with surgery.
“Nowadays we do quite a bit of surgery on the brain and base of brain through the nose. We never have to cut up into the brain. We’re getting a needle up into the space to check it out, and then to put a little bit of glue. This is just a bit of cartilage from the nose that we can get to repair over it and then the body will seal it up.”
Joe wasn’t convinced it would work. After all, he’d been dealing with the problem for so long. But days after the surgery, they removed the gauze from his nose.
“I was waiting for the dribble, this leaking, ’cause I was so used to it every day. I got my hankie. Nothing. It’s never come back.”
What has come back is his desire to work on the hobbies he loves, like his model airplanes. And bigger projects.
“Now I’m going to build a sailboat and the sailboat I’m building is called a Great Pelican.”
And after all he’s been through, Joe feels pretty confident this boat won’t leak.
Before you call a brain surgeon about your runny nose, Dr. Nakaji says it most likely is just a runny nose. Brain fluid differs from the discharge of an allergy-driven runny nose in that it is very, very clear.
So if you have a chronic runny nose, start with an allergist or an ear, nose and throat specialist. They can perform a simple test to determine if it’s a typical runny nose or something more serious.
The causes of this type of leak are numerous. Sometimes a past head injury can lead to brain fluid leaking, or it can be caused by complications of a spinal tap or surgery.
William Gibson’s popular science fiction tale “Johnny Mnemonic” foresaw sensitive information being carried by microchips in the brain by 2021. A team of American neuroscientists could be making this fantasy world a reality.
Their motivation is different but the outcome would be somewhat similar. Hailed by MIT Technology Review as one of 2013’s top ten technological breakthroughs, the work by the University of Southern California, North Carolina’s Wake Forest University and other partners has actually spanned a decade.
But the U.S.-wide team now thinks that it will see a memory device being implanted in a small number of human volunteers within two years and available to patients in five to 10 years. They can’t quite contain their excitement.
“I never thought I’d see this in my lifetime,” said Ted Berger, professor of biomedical engineering at the University of Southern California in Los Angeles. “I might not benefit from it myself but my kids will.”
Rob Hampson, associate professor of physiology and pharmacology at Wake Forest University, agrees. “We keep pushing forward, every time I put an estimate on it, it gets shorter and shorter.”
The scientists — who bring varied skills to the table, including mathematical modeling and psychiatry — believe they have cracked how long-term memories are made, stored and retrieved and how to replicate this process in brains that are damaged, particularly by stroke or localized injury.
Berger said they record a memory being made, in an undamaged area of the brain, then use that data to predict what a damaged area “downstream” should be doing. Electrodes are then used to stimulate the damaged area to replicate the action of the undamaged cells.
They concentrate on the hippocampus — part of the cerebral cortex which sits deep in the brain — where short-term memories become long-term ones. Berger has looked at how electrical signals travel through neurons there to form those long-term memories and has used his expertise in mathematical modeling to mimic these movements using electronics.
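The record-then-predict approach described above can be sketched in miniature. The team's published work uses nonlinear multi-input multi-output models of hippocampal signaling; the toy below substitutes a simple least-squares linear mapping and synthetic firing rates, purely to illustrate the logic: fit an input-to-output model while the circuit is intact, then use its predictions for a damaged region to set stimulation patterns. All numbers and variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "recordings": 500 time steps of firing rates from 8 upstream neurons.
upstream = rng.poisson(5.0, size=(500, 8)).astype(float)

# Pretend the healthy downstream region computes a fixed linear mix of its inputs,
# observed with a little measurement noise.
true_weights = rng.normal(size=(8, 4))
downstream = upstream @ true_weights + rng.normal(scale=0.1, size=(500, 4))

# Step 1: fit the input->output mapping from data recorded while the circuit works.
weights, *_ = np.linalg.lstsq(upstream, downstream, rcond=None)

# Step 2: with the downstream region damaged, predict what it *should* be doing
# from fresh upstream activity; these predictions would drive the stimulating
# electrodes in place of the lost cells.
new_upstream = rng.poisson(5.0, size=(10, 8)).astype(float)
predicted_output = new_upstream @ weights
print(predicted_output.shape)  # (10, 4)
```

The essential idea survives the simplification: the prosthesis does not need to understand what a memory "means," only to reproduce the transformation the damaged tissue used to perform on its inputs.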
Hampson, whose university has done much of the animal studies, adds: “We support and reinforce the signal in the hippocampus but we are moving forward with the idea that if you can study enough of the inputs and outputs to replace the function of the hippocampus, you can bypass the hippocampus.”
The team’s experiments on rats and monkeys have shown that certain brain functions can be replaced with signals via electrodes. You would think that the work of then creating an implant for people and getting such a thing approved would be a Herculean task, but think again.
For 15 years, people have been having brain implants to provide deep brain stimulation to treat epilepsy and Parkinson’s disease — a reported 80,000 people have now had such devices placed in their brains. Many of the hurdles have thus already been overcome — particularly the “yuck factor” and the fear factor.
“It’s now commonly accepted that humans will have electrodes put in them — it’s done for epilepsy, deep brain stimulation, (that has made it) easier for investigative research, it’s much more acceptable now than five to 10 years ago,” Hampson says.
Much of the work that remains now is in shrinking down the electronics.
“Right now it’s not a device, it’s a fair amount of equipment,” Hampson says. “We’re probably looking at devices in the five to 10 year range for human patients.”
The ultimate goal in memory research would be to treat Alzheimer’s disease, but unlike stroke or localized brain injury, Alzheimer’s tends to affect many parts of the brain, especially in its later stages, making these implants a less likely option any time soon.
Berger foresees a future, however, where drugs and implants could be used together to treat early dementia. Drugs could be used to enhance the action of cells that surround the most damaged areas, and the team’s memory implant could be used to replace a lot of the lost cells in the center of the damaged area. “I think the best strategy is going to involve both drugs and devices,” he says.
Unfortunately, the team found that its method can’t help patients with advanced dementia.
“When looking at a patient with mild memory loss, there’s probably enough residual signal to work with, but not when there’s significant memory loss,” Hampson said.
Constantine Lyketsos, professor of psychiatry and behavioral sciences at Johns Hopkins Medicine in Baltimore, which is trialing a deep brain stimulation implant for Alzheimer’s patients, was a little skeptical of the other team’s claims.
“The brain has a lot of redundancy; it can function pretty well if it loses one or two parts. But memory involves circuits diffusely dispersed throughout the brain, so it’s hard to envision.” However, he added that it was more likely to succeed in helping victims of stroke or localized brain injury, as indeed its makers are aiming to do.
The UK’s Alzheimer’s Society is cautiously optimistic.
“Finding ways to combat symptoms caused by changes in the brain is an ongoing battle for researchers. An implant like this one is an interesting avenue to explore,” said Doug Brown, director of research and development.
Hampson says the team’s breakthrough is “like the difference between a cane, to help you walk, and a prosthetic limb — it’s two different approaches.”
It will still take time for many people to accept their findings and their claims, he says, but they don’t expect to have a shortage of volunteers stepping forward to try their implant — the project is partly funded by the U.S. military which is looking for help with battlefield injuries.
There are U.S. soldiers coming back from operations with brain trauma and a neurologist at DARPA (the Defense Advanced Research Projects Agency) is asking “what can you do for my boys?” Hampson says.
“That’s what it’s all about.”