Posts tagged cognition
Human intelligence cannot be explained by the size of the brain’s frontal lobes, say researchers.
Research comparing the size of the frontal lobes in humans and other species has determined that they are not, as previously thought, disproportionately enlarged relative to other areas of the brain, according to the most accurate and conclusive study of this brain region to date.
It concludes that the size of our frontal lobes cannot solely account for humans’ superior cognitive abilities.
The study by Durham and Reading universities suggests that supposedly more ‘primitive’ areas, such as the cerebellum, were equally important in the expansion of the human brain. These areas may therefore play unexpectedly important roles in human cognition and its disorders, such as autism and dyslexia, say the researchers.
The study is published in the Proceedings of the National Academy of Sciences (PNAS) today.
The frontal lobes are an area in the brain of mammals located at the front of each cerebral hemisphere, and are thought to be critical for advanced intelligence.
Lead author Professor Robert Barton from the Department of Anthropology at Durham University, said: “Probably the most widespread assumption about how the human brain evolved is that size increase was concentrated in the frontal lobes.
“It has been thought that frontal lobe expansion was particularly crucial to the development of modern human behaviour, thought and language, and that it is our bulging frontal lobes that truly make us human. We show that this is untrue: human frontal lobes are exactly the size expected for a non-human brain scaled up to human size.
“This means that areas traditionally considered to be more primitive were just as important during our evolution. These other areas should now get more attention. In fact there is already some evidence that damage to the cerebellum, for example, is a factor in disorders such as autism and dyslexia.”
The scientists argue that many of our high-level abilities are carried out by more extensive brain networks linking many different areas of the brain. They suggest it may be the structure of these extended networks more than the size of any isolated brain region that is critical for cognitive functioning.
Previously, various studies have tried to establish whether humans’ frontal lobes are disproportionately enlarged compared with those of other primates such as apes and monkeys. These have produced a confused picture, with different methods and measurements leading to inconsistent findings.
The Durham and Reading researchers, funded by The Leverhulme Trust, analysed data sets from previous animal and human studies using phylogenetic, or ‘evolutionary family tree’, methods, and found consistent results across all their data. They used a new method to look at the speed with which evolutionary change occurred, concluding that the frontal lobes did not evolve especially fast along the human lineage after it split from the chimpanzee lineage.
Opposing thumbs, expressive faces, complex social systems: it’s hard to miss the similarities between apes and humans. Now a new study with a troop of zoo baboons and lots of peanuts shows that a less obvious trait—the ability to understand numbers—also is shared by man and his primate cousins.
“The human capacity for complex symbolic math is clearly unique to our species,” says co-author Jessica Cantlon, assistant professor of brain and cognitive sciences at the University of Rochester. “But where did this numeric prowess come from? In this study we’ve shown that non-human primates also possess basic quantitative abilities. In fact, non-human primates can be as accurate at discriminating between different quantities as a human child.”
“This tells us that non-human primates have in common with humans a fundamental ability to make approximate quantity judgments,” says Cantlon. “Humans build on this talent by learning number words and developing a linguistic system of numbers, but in the absence of language and counting, complex math abilities do still exist.”
Cantlon, her research assistant Allison Barnard, postdoctoral fellow Kelly Hughes, and other colleagues at the University of Rochester and the Seneca Park Zoo in Rochester, N.Y., reported their findings online May 2 in the open-access journal Frontiers in Psychology.
The study tracked eight olive baboons, ages 4 to 14, in 54 separate trials of guess-which-cup-has-the-most-treats. Researchers placed one to eight peanuts into each of two cups, varying the numbers in each container. The baboons received all the peanuts in the cup they chose, whether it was the cup with the most goodies or not. The baboons guessed the larger quantity roughly 75 percent of the time on easy pairs when the relative difference between the quantities was large, for example two versus seven. But when the ratios were more difficult to discriminate, say six versus seven, their accuracy fell to 55 percent.
That pattern, argue the authors, helps to resolve a standing question about how animals understand quantity. Scientists have speculated that animals may use two different systems for evaluating numbers: one based on keeping track of discrete objects—a skill known to be limited to about three items at a time—and a second approach based on comparing the approximate differences between counts.
The baboons’ choices, conclude the authors, clearly relied on this latter “more than” or “less than” cognitive approach, known as the analog system. The baboons were able to consistently discriminate pairs with numbers larger than three as long as the relative difference between the peanuts in each cup was large. Research has shown that children who have not yet learned to count also depend on such comparisons to discriminate between number groups, as do human adults when they are required to quickly estimate quantity.
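The ratio-dependent accuracy pattern described above is the signature of the analog system, and it is often modeled with "scalar variability": each quantity is represented with noise proportional to its magnitude. A minimal sketch of that idea follows; the noise level (`weber`) is an illustrative assumption, not a parameter estimated from this study.

```python
import random

def choose_larger(a, b, weber=0.3, trials=10_000):
    """Simulate analog-magnitude comparison: each quantity is
    perceived with Gaussian noise whose spread grows with the
    quantity itself (scalar variability). Returns the fraction
    of trials on which the truly larger quantity is chosen."""
    correct = 0
    for _ in range(trials):
        perceived_a = random.gauss(a, weber * a)
        perceived_b = random.gauss(b, weber * b)
        if (perceived_a > perceived_b) == (a > b):
            correct += 1
    return correct / trials

easy = choose_larger(2, 7)  # large relative difference
hard = choose_larger(6, 7)  # ratio close to 1
assert easy > hard  # accuracy falls as the ratio approaches 1
```

Under this noise model, accuracy depends on the ratio of the two quantities rather than on their absolute difference, reproducing the gap between easy pairs like two versus seven and hard pairs like six versus seven.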
Studies with other animals, including birds, lemurs, chimpanzees, and even fish, have also revealed a similar ability to estimate relative quantity, but scientists have been wary of the findings because much of this research is limited to animals trained extensively in experimental procedures. The concern is that the results could reflect more about the experimenters than about the innate ability of the animals.
“We want to make sure we are not creating a ‘Clever Hans effect,’” cautions Cantlon, referring to the horse whose alleged aptitude for math was shown to rest instead on the ability to read the unintentional body language of his human trainer. To rule out such influence, the study relied on zoo baboons with no prior exposure to experimental procedures. Additionally, a control condition tested for human bias by using two experimenters—each blind to the contents of the other cup—and found that the choice patterns remained unchanged.
A final experiment tested two baboons over 130 more trials. The monkeys showed little improvement in their choice rate, indicating that learning did not play a significant role in understanding quantity.
“What’s surprising is that without any prior training, these animals have the ability to solve numerical problems,” says Cantlon. The results indicate that baboons not only use comparisons to understand numbers, but that these abilities occur naturally and in the wild, the authors conclude.
Finding a functioning baboon troop for cognitive research was serendipitous, explains study co-author Jenna Bovee, the elephant handler at the Seneca Park Zoo who is also the primary keeper for the baboons. The African monkeys are hierarchical, with an alpha male at the top of the social ladder and lots of jockeying for status among the other members of the group. Many zoos have to separate baboons that don’t get along, leaving only a handful of zoos with functioning troops, Bovee explained.
Involvement in this study and ongoing research has been enriching for the 12-member troop, she said, noting that several baboons participate in research tasks about three days a week. “They enjoy it,” she says. “We never have to force them to participate. If they don’t want to do it that day, no big deal.
“It stimulates our animals in a new way that we hadn’t thought of before,” Bovee adds. “It kind of breaks up their routine during the day, gets them thinking. It gives them time by themselves to get the attention focused on them for once. And it reduces fighting among the troop. So it’s good for everybody.”
The zoo has adapted some of the research techniques, like a matching game on a touch-screen computer that dispenses treats, and taken them to the orangutans. “They’re using an iPad,” she says.
She also enjoys documenting the intelligence of her charges. “A lot of people don’t realize how smart these animals are. Baboons can show you that five is more than two. That’s as accurate as a typical three-year-old, so you have to give them that credit.”
Cantlon extends those insights to young children: “In the same way that we underestimate the cognitive abilities of non-human animals, we sometimes underestimate the cognitive abilities of preverbal children. There are quantitative abilities that exist in children prior to formal schooling or even being able to use language.”
Insomniacs desperate for some zzzs may one day have a safer way to get them. Scientists have developed a new sleep medication that has induced sleep in rodents and monkeys without apparently impairing cognition, a potentially dangerous side effect of common sleep aids. The discovery, which originated in work explaining narcolepsy, could lead to a new class of drugs that help people who don’t respond to other treatments.
Between 10% and 15% of Americans chronically struggle with getting to or staying asleep. Many of them turn to sleeping pills for relief, and most are prescribed drugs, such as zolpidem (Ambien) and eszopiclone (Lunesta), that slow down the brain by binding to receptors for GABA, a neurotransmitter that’s involved in mood, cognition, and muscle tone. But because the drugs target GABA indiscriminately, they can also impair cognition, causing amnesia, confusion, and other problems with learning and memory, along with a number of strange sleepwalking behaviors, including wandering, eating, and driving while asleep. This has led many researchers to seek out alternative mechanisms for inducing sleep.
Neuroscientist Jason Uslaner of Merck Research Laboratories in West Point, Pennsylvania, and colleagues decided to tap into the brain’s orexin system. Orexin (also known as hypocretin) is a protein that controls wakefulness and is missing in people with narcolepsy. Past studies successfully induced sleep by inhibiting orexin, but had not looked into its effects on cognition. The researchers developed a new orexin-inhibiting compound called DORA-22 and confirmed that it could induce sleep in rats and rhesus monkeys as effectively as the GABA-modulating drugs.
Then the researchers went about testing the drugs’ effects on the animals’ cognition. They measured the rats’ cognition and memory by assessing the rodents’ ability to recognize objects. They presented the rats with a new object—say, a cone or a sphere—that the rats then sniffed and explored. Then they took the object away for an hour. After that hour, the rats were exposed to a new object and the one they’d already gotten to know; if the rats remembered, they spent less time checking out the familiar object. With the primates, Uslaner’s team tested their ability to match colors on a touchscreen and to pay attention to and identify the origin of a flashing light. In all cases, the researchers found the GABA-modulating sleeping pills caused both the rats and the primates to respond more slowly and less accurately. Monkeys taking the memory and attention tests, for example, were 20% less accurate on the highest dose of each of the GABA-modulating drugs. But DORA-22 had no such effect on cognition, the team reports today in Science Translational Medicine.
“We were very excited,” Uslaner says. “Folks who take sleep medications need to be able to perform cognitive tasks when they awake, and this [compound] could help them do so without impairment.”
Although DORA-22 has not yet been tested in humans, it holds tremendous promise for helping people suffering from sleep disorders, says Emmanuel Mignot, a sleep researcher with the Stanford University School of Medicine in Palo Alto, California. “This study is encouraging and exciting, because there’s good reason to believe it would work differently from what we’ve used in the past,” says Mignot, who helped discover the link between orexin (or its absence) and narcolepsy. “Not every drug works for everyone, so it’s really, really good news to have a potential new drug on the horizon.”
Non-musicians who speak tonal languages may have a better ear for learning musical notes, according to Canadian researchers.
Tonal languages, found mainly in Asia, Africa and South America, have an abundance of high and low pitch patterns as part of speech. In these languages, differences in pitch can alter the meaning of a word. Vietnamese, for example, has eleven different vowel sounds and six different tones. Cantonese also has an intricate six-tone system, while English has no tones.
Researchers at Baycrest Health Sciences’ Rotman Research Institute (RRI) in Toronto have found the strongest evidence yet that speaking a tonal language may improve how the brain hears music. While the findings may boost the egos of tonal language speakers who excel in musicianship, they are exciting neuroscientists for another reason: they represent the first strong evidence that music and language – which share overlapping brain structures – have bi-directional benefits!
The findings are published today in PLOS ONE, an international, peer-reviewed open-access science journal.
The benefits of music training for speech and language are already well documented (showing positive influences on speech perception and recognition, auditory working memory, aspects of verbal intelligence, and awareness of the sound structure of spoken words). The reverse – the benefits of language experience for learning music – has largely been unexplored until now.
“For those who speak tonal languages, we believe their brain’s auditory system is already enhanced to allow them to hear musical notes better and detect minute changes in pitch,” said lead investigator Gavin Bidelman, who conducted the research as a post-doctoral fellow at Baycrest’s RRI, supported by a GRAMMY Foundation® grant.
“If you pick up an instrument, you may be able to acquire the skills faster to play that instrument because your brain has already built up these auditory perceptual advantages through speaking your native tonal language.”
But Bidelman, now assistant professor with the Institute for Intelligent Systems and School of Communication Science & Disorders at the University of Memphis, was quick to dispel the notion that people who speak tonal languages make better musicians. Musicianship requires much more than a keen sense of hearing, and plenty of English-speaking musical icons put that quick assumption to rest.
That music and language – two key domains of human cognition – can influence each other offers exciting possibilities for devising new approaches to rehabilitation for people with speech and language deficits, said Bidelman.
“If music and language are so intimately coupled, we may be able to design rehabilitation treatments that use musical training to help individuals improve speech-related functions that have been impaired due to age, aphasia or stroke,” he suggested. Bidelman added that similar benefits might also work in the opposite direction. Musical listening skills could be improved by designing well-crafted speech and language training programs.
Fifty-four healthy adults in their mid-20s were recruited for the study from the University of Toronto and Greater Toronto Area. They were divided into three groups: English-speaking trained musicians (instrumentalists) and Cantonese-speaking and English-speaking non-musicians. Wearing headphones in a sound-proof lab, participants were tested on their ability to discriminate complex musical notes. They were assessed on measures of auditory pitch acuity and music perception as well as general cognitive ability such as working memory and fluid intelligence (abstract reasoning, thinking quickly).
While the musicians demonstrated superior performance on all auditory measures, the Cantonese-speaking non-musicians showed performance similar to musicians on the music and cognitive behavioural tasks, testing 15 to 20 percent higher than the English-speaking non-musicians.
Bidelman added that not all tonal languages may offer the music listening benefits seen with the Cantonese speakers in his study. Mandarin, for example, has more “curved” tones and the pitch patterns vary with time – which is different from how pitch occurs in music. Musical pitch resembles “stair step, level pitch patterns” which happen to share similarities with the Cantonese language, he explained.
Previous evidence points to a causal link between playing action video games and enhanced cognition and perception. However, the benefits of playing other kinds of video games are under-investigated. We examined whether playing non-action games also improves cognition, comparing the transfer effects of an action game and several non-action games that place different cognitive demands on players.
We instructed five groups of non-gamer participants to play one game each on a mobile device (iPhone/iPod Touch) for one hour a day, five days a week, over four weeks (20 hours). The games included action, spatial memory, match-3, hidden-object, and agent-based life simulation titles. Participants performed four behavioral tasks before and after video game training to assess transfer effects: an attentional blink task, a spatial memory and visual search dual task, a visual filter memory task measuring multiple-object tracking and cognitive control, and a complex verbal span task. Action game playing eliminated attentional blink and improved cognitive control and multiple-object tracking. Match-3, spatial memory, and hidden-object games improved visual search performance, while the latter two also improved spatial working memory. Complex verbal span improved after match-3 and action game training.
Cognitive improvements were not limited to action game training alone, and different games enhanced different aspects of cognition. We conclude that frequently exercising specific cognitive abilities in a video game improves performance on tasks that share the same underlying demands. Overall, these results suggest that many video game-related cognitive improvements may not be due to training of general, broad cognitive systems such as executive attentional control, but instead to frequent use of specific cognitive processes during game play. Thus, many such improvements may be attributed to near-transfer effects.
Children as young as 3 years old know when they are not sure about a decision, and can use that uncertainty to guide decision making, according to new research from the Center for Mind and Brain at the University of California, Davis.
“There is behavioral evidence that they can do this, but the literature has assumed that until late preschool, children cannot introspect and make a decision based on that introspection,” said Simona Ghetti, professor of psychology at UC Davis and co-author of the study with graduate student Kristen Lyons, now an assistant professor at Metropolitan State University of Denver. [Preschoolers Use Introspection to Make Decisions]
The findings are published online by the journal Child Development and will appear in print in an upcoming issue.
Ghetti studies how reasoning, memory and cognition emerge during childhood. It is known that children get better at introspection through elementary school, she said. Lyons and Ghetti wanted to see whether this ability to ponder exists in younger children.
Previous studies have used open-ended questions to find out how children feel about a decision, but that approach is limited by younger children’s ability to report on the content of their mental activity. Instead, Lyons and Ghetti showed 3-, 4- and 5-year-olds ambiguous drawings of objects and asked them to point to a particular object, such as a cup, a car or the sun. Then they asked the children to point to one of two pictures of faces, one looking confident and one doubtful, to rate whether they were confident or not confident about a decision.
In one of the tests, children had to choose a drawing even if unsure. In a second set of tests they had a “don’t want to pick” option.
Across the age range, children were more likely to say they were not confident about their decision when they had in fact made a wrong choice. When they had the “don’t want to pick” option, they were most likely to take it if they had been unsure of their choice in the either/or test.
By opting not to choose when uncertain, the children could improve their overall accuracy on the test.
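Why declining uncertain trials raises accuracy can be shown with a toy simulation; the uniform "clarity" signal and the threshold value here are illustrative assumptions, not the study's model.

```python
import random

def accuracy(threshold=0.0, trials=100_000):
    """Each trial carries a 'clarity' signal c in [0, 1]; the
    chance of a correct choice is 0.5 + 0.5*c (a pure guess at
    c = 0). A chooser who can introspect declines any trial whose
    clarity falls below the threshold and answers the rest."""
    answered = correct = 0
    for _ in range(trials):
        c = random.random()
        if c < threshold:
            continue  # take the "don't want to pick" option
        answered += 1
        if random.random() < 0.5 + 0.5 * c:
            correct += 1
    return correct / answered

forced = accuracy(threshold=0.0)   # must answer every trial
opt_out = accuracy(threshold=0.5)  # decline low-clarity trials
assert opt_out > forced  # skipping uncertain trials raises accuracy
```

The point of the sketch is simply that if an internal confidence signal correlates with being right, then answering only when confident filters out the trials most likely to be wrong — which is what the children's behavior suggests they are doing.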
“Children as young as 3 years of age are aware of when they are making a mistake, they experience uncertainty that they can introspect on, and then they can use that introspection to drive their decision making,” Ghetti said.
The researchers hope to extend their studies to younger children to examine the emergence of introspection and reasoning.
Gentle electrical zaps to the brain can accelerate learning and boost performance on a wide range of mental tasks, scientists have reported in recent years. But a new study suggests there may be a hidden price: Gains in one aspect of cognition may come with deficits in another.
Researchers who study transcranial electrical stimulation, which uses electrodes placed on the scalp, see it as a potentially promising way to enhance cognition in neurological patients, struggling students, and perhaps even ordinary people. Scientists have used it to speed up rehab in people whose speech or movement has been affected by a stroke, and DARPA has studied it as a way to accelerate learning in intelligence analysts or soldiers on the lookout for bad guys and bombs.
Until now, the papers coming out of this field have reported one good-news finding after another.
“This is the first paper to my knowledge to show a cost associated with the gains in cognitive function,” said neuropsychologist Rex Jung of the University of New Mexico, who was not associated with the study. “It’s a really nice demonstration.”
Cognitive neuroscientist Roi Cohen Kadosh of the University of Oxford, who led the study, has been investigating brain stimulation to boost mathematical abilities. He has applied for a patent on a brain stimulator he hopes could help math-challenged students get a better grip on the basics, or help the mathematically inclined perform even better.
Cohen Kadosh and his colleague Teresa Iuculano investigated 19 volunteers as they learned a new numerical system by trial and error. The new system was based on arbitrary symbols: A cylinder represented the number five, for example, and a triangle represented the number nine. In several training sessions the volunteers viewed pairs of symbols on a computer screen and pressed a key to indicate which one represented a bigger quantity. At first they had to guess, but they eventually learned which symbols corresponded with which numbers.
All of the volunteers wore electrodes on their scalps during these training sessions. Some received mild electrical stimulation that targeted the posterior parietal cortex, an area implicated in previous studies of numerical cognition. Others received stimulation of the dorsolateral prefrontal cortex, an area involved in a wide range of functions, including learning and memory. A third group received sham stimulation that caused a slight tingling of the skin but no change in brain activity.
Those who had the parietal area involved in numerical cognition stimulated learned the new number system more quickly than those who got sham stimulation, the researchers report in the Journal of Neuroscience. But at the end of the weeklong study their reaction times were slower when they had to put their newfound knowledge to use to solve a new task that they hadn’t seen during the training sessions. “They had trouble accessing what they’d learned,” Cohen Kadosh said.
The volunteers who had the prefrontal area involved in learning and memory stimulated showed the opposite pattern. They were slower than the control group to learn the new numerical system, but they performed faster on the new test at the end of the experiment. The bottom line, says Cohen Kadosh, is that stimulating either brain region had both benefits and drawbacks. “Just like with drugs, there seem to be side effects,” he said.
Going forward, Cohen Kadosh says, more work is needed on how to maximize the benefits and minimize the costs of electrical brain stimulation. He thinks the approach has promise, but only when it’s used strategically, by picking the right brain regions to target and stimulating them while a person is training on the skill they want to improve. “I think it’s going to be useless unless you pair it with some type of cognitive training,” he said.
But that’s not stopping some people from giving it a try on their own. Although it should be obvious that DIY brain stimulation is a bad idea, both Jung and Cohen Kadosh say there seems to be growing interest among the general public in using it for cognitive enhancement.
“There are some do it yourself websites I’ve stumbled across that are pretty frightening,” Jung said. “People are definitely tinkering around with this in their garage.”
The new study suggests one way that could backfire. And that’s not all, said Jung. “You can burn yourself if nothing else.”
For more than a century, neurons have been the superstars of the brain. Their less glamorous partners, glial cells, can’t send electric signals, and so they’ve been mostly ignored.
Now scientists have injected some human glial cells into the brains of newborn mice. When the mice grew up, they were faster learners. The study, published Thursday in Cell Stem Cell, not only introduces a new tool to study the mechanisms of the human brain, it supports the hypothesis that glial cells — and not just neurons — play an important role in learning.
The scientific obsession with neurons really began at the end of the 19th century, when the Spanish anatomy professor Santiago Ramón y Cajal used a special dye to stain brain tissue. Under the microscope, neurons were revealed in exquisite detail. “A dense forest,” Ramón y Cajal called it — a field of little branching cells that would soon be named neurons.
With beautiful ink drawings, Ramón y Cajal painstakingly mapped neural networks and slowly developed the theory that neurons are the telegraph lines of thought (an idea later embraced by Schoolhouse Rock). Every idea and memory — every aspect of learning — could be traced back to the electric signals sent between neurons. Ramón y Cajal won the Nobel Prize for his work, and scientists focused on neurons for the next century.
But neurons aren’t the only cells in the brain.
“We’ve overlooked half the brain,” says Douglas Fields, a neuroscientist at the National Institutes of Health. “We’ve only been studying one kind of cell in the brain.” The other kind of cell — glia — is at least as abundant as neurons. But early scientists thought glia were so boring they didn’t even merit a singular noun. “Glia is plural — there is no singular,” Fields says. “We have ‘neuron’ but we don’t have ‘glion.’”
Glial cells lacked the ability to send electric signals, and most scientists thought they were housekeeping cells, helping provide nutrients and insulation.
It was only in the last decade or so that scientists realized glial cells were more than that. Special types of glial cells, called astrocytes, which are named for the star-like patterns of their cellular structure, have their own form of chemical signaling. They have the potential to coordinate whole groups of neurons. “Glia are in a position to regulate the flow of information through the brain,” Fields says. “This is all missing from our models.”
And there’s something else. These glial cells, the astrocytes, have changed a lot as humans have evolved, while neurons have stayed much the same. A mouse neuron and a human neuron look so much alike that even experienced neuroscientists can’t tell them apart.
“I can’t tell the differences between a neuron from a bird or a mouse or a primate or a human,” says Steve Goldman, a neuroscientist at the University of Rochester who has studied brain cells for decades. But Goldman says glial cells are easy to tell apart.
“Human glial cells — human astrocytes — are much larger than those of lower species,” he says. “They have more fibers and they send those fibers out over greater distances.”
The thought is that maybe these glial cells have played a role in making humans smarter. So Goldman teamed up with his wife, Maiken Nedergaard, to test the idea.
They injected some human glial cells into the brains of newborn mice. The mice grew up, and so did the human glial cells. The cells spread through the mouse brain, integrating perfectly with mouse neurons and, in some areas, outnumbering their mouse counterparts. All the while Goldman says the glial cells maintained their human characteristics.
“They very much thought that they were in the human brain, in terms of how they developed and integrated,” he says.
So what are these mice like, the ones with brains full of functioning human cells? Their neural circuitry is still just the same, so they act completely normal. They still socialize with other mice and still seem interested in mousey things.
But the researchers say these mice are measurably smarter. In classic maze tests, they learn faster. “They make many fewer errors, and it takes them less time to come to the appropriate answer,” Goldman says.
It might take a normal mouse four or five attempts to learn the correct route, for example. But a mouse with human brain cells could get it on the second try. Glial cells — those boring glial cells — somehow enhance learning.
In fact, they could be changing what it means to be a mouse, and that raises ethical questions for this kind of research.
“Maybe bioethicists have been a little bit too cavalier assuming that a mouse with some human brain cells in it is just your normal old mouse,” says Robert Streiffer, a bioethicist from the University of Wisconsin-Madison. “Well, it’s not going to be human, but that doesn’t mean it’s a normal old mouse either.”
Streiffer says it’s not just that these mice can get through a maze more quickly — they’re better at recognizing things that scare them. And perception of fear is one of the things bioethicists must weigh when they decide the types of experiments you can do on an animal.
“So you have to sort of step back and do some hardcore philosophy,” he says. Like, will these types of human-animal hybrids eventually get close enough to humanity that we would feel uncomfortable performing experiments on them?
The researchers in this study say we’re really, really far from that point. And if you want to investigate the role of glial cells, these hybrid mice are the best tools available.
Good mental health and clear thinking depend upon our ability to store and manipulate thoughts on a sort of “mental sketch pad.” In a new study, Yale School of Medicine researchers describe the molecular basis of this ability — the hallmark of human cognition — and describe how a breakdown of the system contributes to diseases such as schizophrenia and Alzheimer’s disease.
“Insults to these highly evolved cortical circuits impair the ability to create and maintain our mental representations of the world, which is the basis of higher cognition,” said Amy Arnsten, professor of neurobiology and senior author of the paper published in the Feb. 20 issue of the journal Neuron.
High-order thinking depends upon our ability to generate mental representations in our brains without any sensory stimulation from the environment. These cognitive abilities arise from highly evolved circuits in the prefrontal cortex. Mathematical models by former Yale neurobiologist Xiao-Jing Wang, now of New York University, predicted that in order to maintain these visual representations the prefrontal cortex must rely on a family of receptors that allow for slow, steady firing of neurons. The Yale scientists show that NMDA-NR2B receptors involved in glutamate signaling regulate this neuronal firing. These receptors, studied at Yale for more than a decade, are responsible for activity of highly evolved brain circuits found especially in primates.
Earlier studies have shown these types of NMDA receptors are often altered in patients with schizophrenia. The Neuron study suggests that those suffering from the disease may be unable to hold onto a stable view of the world. Also, these receptors seem to be altered in Alzheimer’s patients, which may contribute to the cognitive deficits of dementia.
The lab of Dr. John Krystal, chair of the department of psychiatry at Yale, has found that the anesthetic ketamine, abused as a street drug, blocks NMDA receptors and can mimic some of the symptoms of schizophrenia. The current study in Neuron shows that ketamine may reduce the firing of the same higher-order neural circuits that are disrupted in schizophrenia.
“Identifying the receptor needed for higher cognition may help us to understand why certain genetic insults lead to cognitive impairment and will help us to develop strategies for treating these debilitating disorders,” Arnsten said.
Cognitive brain researchers have studied a magic trick filmed in magician duo Penn & Teller’s theater in Las Vegas, to illuminate the neuroscience of illusion. Their results advance our understanding of how observers can be misdirected and will aid magicians as they work to improve their art.
The research team was led by Dr. Stephen Macknik, Director of the Laboratory of Behavioral Neurophysiology at Barrow Neurological Institute, in collaboration with fellow Barrow researchers Hector Rieiro and Dr. Susana Martinez-Conde, Director of the Laboratory of Visual Neuroscience. The study, titled “Perceptual elements in Penn and Teller’s ‘Cups and Balls’ magic trick,” was published today, Feb. 12, 2013, as part of the launch of PeerJ, a new peer-reviewed open-access journal in which all articles are freely available to everyone.
“Cups and Balls,” a magic illusion in which balls appear and disappear under the cover of cups, is one of the oldest magic tricks in history, with documented descriptions going back to Roman conjurors in 3 B.C. “But we still don’t know how it really works in the brain,” says Macknik, “because this is the first, long overdue, neuroscientific study of the trick.”
The discovery concerns the way magicians manipulate human cognition and perception. The “Cups and Balls” trick has many variations, but the most common one uses three balls and three cups. The magician makes the balls pass through the bottom of cups, jump from cup to cup, disappear from a cup and turn up elsewhere, turn into other objects, and so on. The cups are usually opaque and the balls brightly colored. Penn & Teller’s variant is performed with three opaque and then with three transparent cups. “The transparent cups mean that visual information about the loading of the balls is readily available to the brain, yet still the spectators cannot see how the trick is done!” said Martinez-Conde.
Magicians have performed and systematically developed the art and theory of this illusion for thousands of years, but each new generation of conjurors offers new insights and hypotheses about how and why it works for the audience. Here the scientists turned the power of the scientific method on the illusion. The experiments tracked when and where observers looked during video clips portraying specific elements of the performance, filmed by a NOVA scienceNOW TV crew. By quantifying how well observers tracked the loading and unloading of balls with and without transparent cups, the scientists determined that some aspects of the illusion were even more powerful at controlling attention than the magicians themselves had predicted.
The end result is that cognitive scientists now have an improved understanding of how (and by how much) observers can be misdirected. In addition, this knowledge can help magicians further hone their art.