Neuroscience

Articles and news from the latest research reports.

Neanderthals were not inferior to modern humans
If you think Neanderthals were stupid and primitive, it’s time to think again.
The widely held notion that Neanderthals were dimwitted, and that their inferior intelligence let them be driven to extinction by the much brighter ancestors of modern humans, is not supported by scientific evidence, according to a researcher at the University of Colorado Boulder.
Neanderthals thrived in a large swath of Europe and Asia between about 350,000 and 40,000 years ago. They disappeared after our ancestors, a group referred to as “anatomically modern humans,” crossed into Europe from Africa.
In the past, some researchers have tried to explain the demise of the Neanderthals by suggesting that the newcomers were superior to Neanderthals in key ways, including their ability to hunt, communicate, innovate and adapt to different environments.  
But in an extensive review of recent Neanderthal research, CU-Boulder researcher Paola Villa and co-author Wil Roebroeks, an archaeologist at Leiden University in the Netherlands, make the case that the available evidence does not support the opinion that Neanderthals were less advanced than anatomically modern humans. Their paper was published in the journal PLOS ONE.
"The evidence for cognitive inferiority is simply not there,” said Villa, a curator at the University of Colorado Museum of Natural History. “What we are saying is that the conventional view of Neanderthals is not true."
Villa and Roebroeks scrutinized nearly a dozen common explanations for Neanderthal extinction that rely largely on the notion that the Neanderthals were inferior to anatomically modern humans. These include the hypotheses that Neanderthals did not use complex, symbolic communication; that they were less efficient hunters who had inferior weapons; and that they had a narrow diet that put them at a competitive disadvantage to anatomically modern humans, who ate a broad range of things.
The researchers found that none of the hypotheses were supported by the available research. For example, evidence from multiple archaeological sites in Europe suggests that Neanderthals hunted as a group, using the landscape to aid them.
Researchers have shown that Neanderthals likely herded hundreds of bison to their death by steering them into a sinkhole in southwestern France. At another site used by Neanderthals, this one in the Channel Islands, fossilized remains of 18 mammoths and five woolly rhinoceroses were discovered at the base of a deep ravine. These findings imply that Neanderthals could plan ahead, communicate as a group and make efficient use of their surroundings, the authors said.
Other archaeological evidence unearthed at Neanderthal sites provides reason to believe that Neanderthals did in fact have a diverse diet. Microfossils found in Neanderthal teeth and food remains left behind at cooking sites indicate that they may have eaten wild peas, acorns, pistachios, grass seeds, wild olives, pine nuts and date palms depending on what was locally available.
Additionally, researchers have found ochre, a kind of earth pigment, at sites inhabited by Neanderthals, which may have been used for body painting. Ornaments have also been collected at Neanderthal sites. Taken together, these findings suggest that Neanderthals had cultural rituals and symbolic communication.
Villa and Roebroeks say that the past misrepresentation of Neanderthals’ cognitive ability may be linked to the tendency of researchers to compare Neanderthals, who lived in the Middle Paleolithic, to modern humans living during the more recent Upper Paleolithic period, when leaps in technology were being made.
“Researchers were comparing Neanderthals not to their contemporaries on other continents but to their successors,” Villa said. “It would be like comparing the performance of Model T Fords, widely used in America and Europe in the early part of the last century, to the performance of a modern-day Ferrari and concluding that Henry Ford was cognitively inferior to Enzo Ferrari.”
Although many still search for a simple explanation and like to attribute the Neanderthal demise to a single factor, such as cognitive or technological inferiority, archaeology shows that there is no support for such interpretations, the authors said.
But if Neanderthals were not technologically and cognitively disadvantaged, why didn’t they survive?
The researchers argue that the real reason for Neanderthal extinction is likely complex, but they say some clues may be found in analyses of the Neanderthal genome published over the last several years. These genomic studies suggest that anatomically modern humans and Neanderthals likely interbred and that the resulting male children may have had reduced fertility. They also suggest that Neanderthals lived in small groups. All of these factors could have contributed to the decline of the Neanderthals, who were eventually swamped and assimilated by growing numbers of modern immigrants.
(Image: Reconstruction by Kennis & Kennis / Photograph by Joe McNally)

Filed under Neanderthals intelligence cognitive ability cognitive inferiority evolution neuroscience science

Artificial intelligence ‘could be the worst thing to happen to humanity’: Stephen Hawking warns that rise of robots may be disastrous for mankind
A sinister threat is brewing deep inside the technology laboratories of Silicon Valley.
Artificial Intelligence, disguised as helpful digital assistants and self-driving vehicles, is gaining a foothold – and it could one day spell the end for mankind.
This is according to Stephen Hawking, who has warned that humanity faces an uncertain future as technology learns to think for itself and adapt to its environment.

Filed under AI robotics robots Stephen Hawking transcendence technology neuroscience science

Bioengineers create circuit board modeled on the human brain
Stanford bioengineers have developed faster, more energy-efficient microchips based on the human brain – 9,000 times faster and using significantly less power than a typical PC. This offers greater possibilities for advances in robotics and a new way of understanding the brain. For instance, a chip as fast and efficient as the human brain could drive prosthetic limbs with the speed and complexity of our own actions.
Stanford bioengineers have developed a new circuit board modeled on the human brain, possibly opening up new frontiers in robotics and computing.
For all their sophistication, computers pale in comparison to the brain. The modest cortex of the mouse, for instance, operates 9,000 times faster than a personal computer simulation of its functions.
Not only is the PC slower, it takes 40,000 times more power to run, writes Kwabena Boahen, associate professor of bioengineering at Stanford, in an article for the Proceedings of the IEEE.
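Taking the two figures above at face value, the energy gap compounds: the PC draws 40,000 times more power and also runs 9,000 times longer to simulate the same cortical activity. A quick back-of-the-envelope check (simple arithmetic, not from the article itself):

```python
# Figures quoted for a PC simulation of the mouse cortex.
speed_factor = 9_000    # cortex runs 9,000x faster than the simulation
power_factor = 40_000   # the PC draws 40,000x more power than the cortex

# Energy = power x time, so simulating one second of cortical activity
# costs the PC roughly this many times the cortex's own energy budget:
energy_factor = speed_factor * power_factor
print(energy_factor)  # 360000000
```

That is, per unit of simulated brain activity, the PC spends on the order of a few hundred million times the energy the biological cortex does.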
"From a pure energy perspective, the brain is hard to match," says Boahen, whose article surveys how "neuromorphic" researchers in the United States and Europe are using silicon and software to build electronic systems that mimic neurons and synapses.
Boahen and his team have developed Neurogrid, a circuit board consisting of 16 custom-designed “Neurocore” chips. Together these 16 chips can simulate 1 million neurons and billions of synaptic connections. The team designed these chips with power efficiency in mind. Their strategy was to enable certain synapses to share hardware circuits. The result was Neurogrid – a device about the size of an iPad that can simulate orders of magnitude more neurons and synapses than other brain mimics on the power it takes to run a tablet computer.
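The "1 million neurons" figure follows directly from the chip counts given in the article (16 Neurocores, each supporting 65,536 neurons, a capacity stated later in the piece); a quick sanity check of the arithmetic:

```python
# Neurogrid's scale, from the chip counts given in the article.
neurocores_per_board = 16
neurons_per_neurocore = 65_536  # per-chip capacity quoted in the article

total_neurons = neurocores_per_board * neurons_per_neurocore
print(total_neurons)  # 1048576 -- the "1 million neurons" cited
```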
The National Institutes of Health funded development of this million-neuron prototype with a five-year Pioneer Award. Now Boahen stands ready for the next steps – lowering costs and creating compiler software that would enable engineers and computer scientists with no knowledge of neuroscience to solve problems – such as controlling a humanoid robot – using Neurogrid.
Its speed and low power characteristics make Neurogrid ideal for more than just modeling the human brain. Boahen is working with other Stanford scientists to develop prosthetic limbs for paralyzed people that would be controlled by a Neurocore-like chip.
"Right now, you have to know how the brain works to program one of these," said Boahen, gesturing at the $40,000 prototype board on the desk of his Stanford office. "We want to create a neurocompiler so that you would not need to know anything about synapses and neurons to be able to use one of these."
Brain ferment
In his article, Boahen notes the larger context of neuromorphic research, including the European Union’s Human Brain Project, which aims to simulate a human brain on a supercomputer. By contrast, the U.S. BRAIN Project – short for Brain Research through Advancing Innovative Neurotechnologies – has taken a tool-building approach by challenging scientists, including many at Stanford, to develop new kinds of tools that can read out the activity of thousands or even millions of neurons in the brain as well as write in complex patterns of activity.
Zooming from the big picture, Boahen’s article focuses on two projects comparable to Neurogrid that attempt to model brain functions in silicon and/or software.
One of these efforts is IBM’s SyNAPSE Project – short for Systems of Neuromorphic Adaptive Plastic Scalable Electronics. As the name implies, SyNAPSE involves a bid to redesign chips, code-named Golden Gate, to emulate the ability of neurons to make a great many synaptic connections – a feature that helps the brain solve problems on the fly. At present a Golden Gate chip consists of 256 digital neurons each equipped with 1,024 digital synaptic circuits, with IBM on track to greatly increase the numbers of neurons in the system.
Heidelberg University’s BrainScales project has the ambitious goal of developing analog chips to mimic the behaviors of neurons and synapses. Their HICANN chip – short for High Input Count Analog Neural Network – would be the core of a system designed to accelerate brain simulations, to enable researchers to model drug interactions that might take months to play out in a compressed time frame. At present, the HICANN system can emulate 512 neurons each equipped with 224 synaptic circuits, with a roadmap to greatly expand that hardware base.
Each of these research teams has made different technical choices, such as whether to dedicate each hardware circuit to modeling a single neural element (e.g., a single synapse) or several (e.g., by activating the hardware circuit twice to model the effect of two active synapses). These choices have resulted in different trade-offs in terms of capability and performance.
In his analysis, Boahen creates a single metric to account for total system cost – including the size of the chip, how many neurons it simulates and the power it consumes.
Neurogrid was by far the most cost-effective way to simulate neurons, in keeping with Boahen’s goal of creating a system affordable enough to be widely used in research.
Speed and efficiency
But much work lies ahead. Each of the current million-neuron Neurogrid circuit boards cost about $40,000. Boahen believes dramatic cost reductions are possible. Neurogrid is based on 16 Neurocores, each of which supports 65,536 neurons. Those chips were made using 15-year-old fabrication technologies.
By switching to modern manufacturing processes and fabricating the chips in large volumes, he could cut a Neurocore’s cost 100-fold – suggesting a million-neuron board for $400 a copy. With that cheaper hardware and compiler software to make it easy to configure, these neuromorphic systems could find numerous applications.
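The projected cost figures are easy to verify from the numbers quoted (a $40,000 million-neuron board and a 100-fold reduction); a minimal check, with the per-neuron cost added as a derived figure:

```python
# Cost figures from the article: a $40,000 prototype board simulating
# 1 million neurons, with a projected 100-fold cost reduction.
board_cost = 40_000
reduction = 100
neurons = 1_000_000

projected = board_cost / reduction
print(projected)            # 400.0 dollars per board, as the article projects
print(projected / neurons)  # 0.0004 dollars, i.e. 0.04 cents per neuron
```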
For instance, a chip as fast and efficient as the human brain could drive prosthetic limbs with the speed and complexity of our own actions – but without being tethered to a power source. Krishna Shenoy, an electrical engineering professor at Stanford and Boahen’s neighbor at the interdisciplinary Bio-X center, is developing ways of reading brain signals to understand movement. Boahen envisions a Neurocore-like chip that could be implanted in a paralyzed person’s brain, interpreting those intended movements and translating them to commands for prosthetic limbs without overheating the brain.
A small prosthetic arm in Boahen’s lab is currently controlled by Neurogrid to execute movement commands in real time. For now it doesn’t look like much, but its simple levers and joints hold hope for robotic limbs of the future.
Of course, all of these neuromorphic efforts are beggared by the complexity and efficiency of the human brain.
In his article, Boahen notes that Neurogrid is about 100,000 times more energy efficient than a personal computer simulation of 1 million neurons. Yet it is an energy hog compared to our biological CPU.
"The human brain, with 80,000 times more neurons than Neurogrid, consumes only three times as much power," Boahen writes. "Achieving this level of energy efficiency while offering greater configurability and scale is the ultimate challenge neuromorphic engineers face."
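Boahen's closing ratios imply a per-neuron energy advantage for biology that the article leaves implicit; a small derived calculation (arithmetic on the quoted figures, not a claim from the article):

```python
# Quoted ratios: the brain has 80,000x Neurogrid's neurons
# while consuming only 3x the power.
neuron_ratio = 80_000
power_ratio = 3

# Implied per-neuron energy advantage of the brain over Neurogrid:
advantage = neuron_ratio / power_ratio
print(round(advantage))  # 26667 -- each biological neuron runs ~27,000x cheaper
```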

Filed under neurogrid microchip robotics neural networks brain modeling neuroscience science

Study: People Pay More Attention to the Upper Half of Field of Vision
A new study from North Carolina State University and the University of Toronto finds that people pay more attention to the upper half of their field of vision – a finding which could have ramifications for everything from traffic signs to software interface design.
“Specifically, we tested people’s ability to quickly identify a target amidst visual clutter,” says Dr. Jing Feng, an assistant professor of psychology at NC State and lead author of a paper on the work. “Basically, we wanted to see where people concentrate their attention at first glance.”
Researchers had participants fix their eyes on the center of a computer screen, and then flashed a target and distracting symbols onto the screen for 10 to 80 milliseconds. The screen was then replaced by an unconnected “mask” image to disrupt their train of thought. Participants were asked to indicate where the target had been located on the screen.
Researchers found that people were 7 percent better at finding the target when it was located in the upper half of the screen.
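The comparison behind that 7 percent figure is, at bottom, a per-field accuracy difference. A minimal, hypothetical sketch of how such a tabulation could work; the trial records below are invented for illustration, and the study's actual figure comes from its own data:

```python
# Hypothetical trial records: (field half where the target appeared,
# whether the participant localized it correctly). Invented toy data.
trials = [
    ("upper", True), ("upper", True), ("upper", True), ("upper", False),
    ("lower", True), ("lower", True), ("lower", False), ("lower", False),
]

def accuracy(half):
    """Fraction of trials in the given field half that were answered correctly."""
    results = [hit for field, hit in trials if field == half]
    return sum(results) / len(results)

advantage = accuracy("upper") - accuracy("lower")
print(advantage)  # 0.25 with this toy data; the study reports ~0.07
```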
“It doesn’t mean people don’t pay attention to the lower field of vision, but they were demonstrably better at paying attention to the upper field,” Feng says.
“A difference of 7 percent could make a significant difference for technologies that are safety-related or that we interact with on a regular basis,” Feng says. “For example, this could make a difference in determining where to locate traffic signs to make them more noticeable to drivers, or where to place important information on a website to highlight that information for users.”
The paper, “Upper Visual Field Advantage in Localizing a Target among Distractors,” is published online in the open-access journal i-Perception. The paper was co-authored by Dr. Ian Spence of the University of Toronto. The work was supported, in part, by the Natural Sciences and Engineering Research Council of Canada.

Filed under attention spatial attention vision visual field psychology neuroscience science

Fast contractions and depolarizations in mitochondria revealed with multiparametric imaging
When something bad happens to otherwise healthy neurons, it’s easy to blame the usual suspects—the mitochondria. In some cases the nucleus might be the one at fault, as in a de novo mutation in a critical gene or some other runaway error in the instruction pipeline. Other times there could be leakage into the brain of toxins, bacteria, or even overzealous patriot cells of the host. But by and large, it’s the mitochondria that bear responsibility for nearly everything the brain does, and so it is they that must accept the blame when it fails. To better understand how these organelles function, researchers have turned to special imaging methods that let them observe multiple aspects of their behavior all at once.
In one of the most revealing studies of its kind to date, researchers in Germany were able to observe the tiny contractions that mitochondria undergo during their complex shifts through different redox states and levels of depolarization. Publishing in a recent issue of Nature Medicine, they relate these effects to pH and calcium concentration in both the mitochondria and the surrounding axon, and also to the larger spiking activity of the neuron.

Filed under mitochondria neural activity neurons calcium concentration neuroscience science

95 notes

Low-fat diet helps fatigue in people with MS

People with multiple sclerosis who for one year followed a plant-based diet very low in saturated fat had much less MS-related fatigue at the end of that year — and significantly less fatigue than a control group of people with MS who didn’t follow the diet, according to an Oregon Health & Science University study being presented today at the American Academy of Neurology’s annual meeting in Philadelphia, Pa.

The study was the first randomized controlled trial to examine the potential benefits of the low-fat diet for the management of MS. It found no significant differences between the two groups in brain lesions detected on MRI brain scans or on other measures of MS. But while the number of trial participants was relatively small, study leaders believe the significantly improved fatigue symptoms merit further, larger studies of the diet.
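The comparison at the heart of a trial like this—diet group versus control group on a continuous fatigue score—can be sketched in a few lines. The numbers below are entirely hypothetical, not the OHSU data, and the study itself would have used its own fatigue instrument and statistical plan:

```python
# Hedged sketch of a two-arm comparison on end-of-study fatigue scores.
# All values are invented for illustration; lower score = less fatigue.
from scipy import stats

diet_fatigue    = [3.1, 2.8, 3.5, 2.9, 3.3, 2.7, 3.0, 3.2]   # diet group
control_fatigue = [4.2, 4.5, 3.9, 4.8, 4.1, 4.4, 4.0, 4.6]   # control group

# Independent-samples t-test on the two groups
t_stat, p_value = stats.ttest_ind(diet_fatigue, control_fatigue)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

With the invented numbers above the diet group scores clearly lower, so the test returns a negative t statistic and a small p-value; a real trial would also report effect sizes and adjust for baseline differences.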

"Fatigue can be a debilitating problem for many people living with relapsing-remitting MS," said Vijayshree Yadav, M.D., an associate professor of neurology in the OHSU School of Medicine and clinical medical director of the OHSU Multiple Sclerosis Center. "So this study’s results — showing some notable improvement in fatigue for people who follow this diet — are a hopeful hint of something that could help many people with MS."

The study investigated the effects of following a diet called the McDougall Diet, devised by John McDougall, M.D. The diet is partly based on an MS-fighting diet developed in the 1940s and 1950s by the late Roy Swank, M.D., a former head of the division of neurology at OHSU. The McDougall diet, very low in saturated fat, focuses on eating starches, fruits and vegetables and does not include meat, fish or dairy products.

The study, which began in 2008, looked at the diet’s effect on the most common form of MS, called relapsing-remitting MS. About 85 percent of people with MS have relapsing-remitting MS, characterized by clearly defined attacks of worsening neurological function followed by recovery periods when symptoms improve partially or completely.

The study measured indicators of MS among a group of people who followed the McDougall Diet for 12 months and a control group that did not. The study measured a range of MS indicators and symptoms, including brain lesions on MRI brain scans of study participants, relapse rate, disabilities caused by the disease, body weight and cholesterol levels.

It found no difference between the diet group and the control group in the number of MS-caused brain lesions detected on the MRI scans. It also found no difference between the two groups in relapse rate or level of disability caused by the disease. People who followed the diet did lose significantly more weight than the control group and had significantly lower cholesterol levels. People who followed the diet also had higher scores on a questionnaire that measured their quality of life and overall mood.

The study’s sample size was relatively small. Fifty-three people completed the study, with 27 in the control group and 22 people in the diet group who complied with the diet’s restrictions.

"This study showed the low-fat diet might offer some promising help with the fatigue that often comes with MS," said Dennis Bourdette, M.D., F.A.A.N., chair of OHSU’s Department of Neurology, director of OHSU’s MS Center and a study co-author. "But further study is needed, hopefully with a larger trial where we can more closely look at how the diet might help fatigue and possibly affect other symptoms of MS."

(Source: eurekalert.org)

Filed under MS fatigue McDougall Diet diet brain lesions brain scans neuroscience science

163 notes

Researchers reveal new cause of epilepsy
A team of researchers from Sanford-Burnham and SUNY Downstate Medical Center has found that deficiencies in hyaluronan, also known as hyaluronic acid or HA, can lead to spontaneous epileptic seizures. HA is a polysaccharide molecule widely distributed throughout connective, epithelial, and neural tissues, including the brain’s extracellular space (ECS). Their findings, published on April 30 in The Journal of Neuroscience, equip scientists with key information that may lead to new therapeutic approaches to epilepsy.
The multicenter study used mice to provide the first evidence of a physiological role for HA in the maintenance of brain ECS volume. It also suggests a potential role in human epilepsy for HA and for genes involved in hyaluronan synthesis and degradation.
While epilepsy is one of the most common neurological disorders—affecting approximately 1 percent of the population worldwide—it is one of the least understood. It is characterized by recurrent spontaneous seizures caused by the abnormal firing of neurons. Although epilepsy treatment is available and effective for about 70 percent of cases, a substantial number of patients could benefit from a new therapeutic approach.
“Hyaluronan is widely known as a key structural component of cartilage and important for maintaining healthy cartilage. Curiously, it has been recognized that the adult brain also contains a lot of hyaluronan, but little is known about what hyaluronan does in the brain,” said Yu Yamaguchi, M.D., Ph.D., professor in Sanford-Burnham’s Human Genetics Program.
“This is the first study that demonstrates the important role of this unique molecule for normal functioning of the brain, and that its deficiency may be a cause of epileptic disorders. A better understanding of how hyaluronan regulates brain function could lead to new treatment approaches for epilepsy,” Yamaguchi added.
The extracellular matrix of the brain has a unique molecular composition. Earlier studies focused on the role of matrix molecules in cell adhesion and axon pathfinding during neural development. In recent years, increasing attention has been focused on the roles of these molecules in the regulation of physiological functions in the adult brain.
In this study, the investigators examined the role of HA using mutant mice deficient in each of the three hyaluronan synthase genes (Has1, Has2, Has3).
“We showed that Has-mutant mice develop spontaneous epileptic seizures, indicating that HA is functionally involved in the regulation of neuronal excitability. Our study revealed that deficiency of HA results in a reduction in the volume of the brain’s ECS, leading to spontaneous epileptiform activity in hippocampal CA1 pyramidal neurons,” said Sabina Hrabetova, M.D., Ph.D., associate professor in the Department of Cell Biology at SUNY.
“We believe that this study not only addresses one of the longstanding questions concerning the in-vivo role of matrix molecules in the brain, but also has broad appeal to epilepsy research in general,” said Katherine Perkins, Ph.D., associate professor in the Department of Physiology and Pharmacology at SUNY.
“More specifically, it should stimulate researchers in the epilepsy field because our study reveals a novel, non-synaptic mechanism of epileptogenesis. The fact that our research can lead to new anti-epileptic therapies based on the preservation of hyaluronan adds further significance for the broader biomedical community and the public,” the authors added.

Filed under epilepsy epileptic seizures hyaluronic acid neurons neural activity neuroscience science

268 notes

Brain inflammation a recipe for chronic fatigue
Patients with chronic fatigue syndrome (CFS), also known as myalgic encephalomyelitis, experience severe and often disabling exhaustion. Other symptoms include cognitive dysfunction, pain and depression. Although brain inflammation is thought to be involved in the development of these symptoms, direct evidence of this relationship has proved elusive. 
Yasuyoshi Watanabe, Yasuhito Nakatomi, Kei Mizuno and colleagues from the RIKEN Center for Life Science Technologies and other institutes in Japan have now shown using a noninvasive brain imaging technique that the neuropsychological symptoms of patients with CFS are closely associated with widespread inflammation in the brain.
Positron emission tomography (PET) is a brain imaging technique that uses radioactive tracers attached to particular cell types or molecules to noninvasively track changes in the brain in disease states. To examine the effects of CFS, the researchers used a radioactive tracer that labels activated glial cells, which are associated with neuroinflammation. They performed PET imaging on nine CFS patients and ten healthy individuals to gauge the extent to which brain inflammation plays a role in CFS, and found that tracer binding was much higher in multiple brain regions of the CFS patients than in the corresponding regions of the healthy participants.
The investigation also found correlations between tracer binding in various brain regions and the severity of symptoms in the CFS patients. The researchers found that inflammation in the thalamus—a region of the brain responsible for relaying motor and sensory information to and from the cerebral cortex—correlated with the severity of both cognitive impairment and pain in the CFS patients. They also identified a correlation between inflammation in the amygdala—a part of the brain linked to emotional memory—and the severity of cognitive impairment. The severity of depression in CFS patients, on the other hand, was linked to the extent of inflammation in the hippocampus, which is a part of the brain known to be associated with depression.
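The region-by-symptom analysis described above amounts to correlating tracer binding in a given region with symptom ratings across patients. A minimal sketch, using invented numbers rather than the RIKEN data:

```python
# Hedged sketch: correlating regional PET tracer binding with symptom
# severity across patients. Values are hypothetical, not study data.
from scipy.stats import pearsonr

thalamus_binding = [1.2, 1.5, 1.1, 1.8, 1.4, 1.7, 1.3, 1.6, 1.9]  # binding (a.u.)
pain_severity    = [2.0, 3.1, 1.8, 4.0, 2.6, 3.5, 2.2, 3.2, 4.1]  # pain rating

# Pearson correlation across the nine simulated patients
r, p = pearsonr(thalamus_binding, pain_severity)
print(f"r = {r:.2f}, p = {p:.4f}")
```

The same computation would be repeated for each region–symptom pair (thalamus/pain, amygdala/cognitive impairment, hippocampus/depression), with correction for the multiple comparisons that entails.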
The findings suggest that inflammation in the brain plays a key role in CFS in humans. Drugs that fight inflammation in the brain may therefore offer promising therapies to prevent or treat CFS and its related symptoms of pain, depression and cognitive dysfunction.
“Because CFS is diagnosed based on subjective symptoms such as fatigue, pain, sleep problems and cognitive impairment,” says Mizuno, “neuroinflammation as observed by PET imaging could be helpful as a more objective biomarker for diagnosis of the disorder.”

Filed under chronic fatigue syndrome myalgic encephalomyelitis inflammation brain imaging cognitive impairment neuroscience science

154 notes

Investigators Discover How Key Protein Enhances Memory and Learning

Case Western Reserve researchers have discovered that a protein previously implicated in disease plays such a positive role in learning and memory that it may someday contribute to cures of cognitive impairments. The findings regarding the potential virtues of fatty acid binding protein 5 (FABP5) — usually associated with cancer and psoriasis — appear in the May 2 edition of The Journal of Biological Chemistry.

“Overall, our data show that FABP5 enhances cognitive function and that FABP5 deficiency impairs learning and memory functions in the brain hippocampus region,” said senior author Noa Noy, PhD, a professor of pharmacology at the School of Medicine. “We believe if we could find a way to upregulate the expression of FABP5 in the brain, we might have a therapeutic handle on cognitive dysfunction or memory impairment in some human diseases.”

FABP5 resides in many tissues and is especially highly expressed in the brain. Noy and her colleagues at the Case Western Reserve School of Medicine and the National Institute on Alcohol Abuse and Alcoholism particularly wanted to understand how this protein functions in neurons. They performed imaging studies comparing the activation of a key transcription factor in the brain tissue of normal mice and of FABP5-deficient mice. (A transcription factor is a protein that controls the flow of genetic information.) The investigations revealed that FABP5 performs two different functions in neurons. First, it facilitates the degradation of endocannabinoids, neurological modulators that control appetite, pain sensation, mood and memory. Second, FABP5 regulates gene expression, a process that essentially gives cells their marching orders on structure, appearance and function.

“FABP5 improves learning and memory both because it delivers endocannabinoids to cellular machinery that breaks them down and because it shuttles compounds to a transcription factor that increases the expression of cognition-associated genes,” Noy said.

Even though endocannabinoids affect essential physiological processes from appetite to memory, the “cannabinoid” part of the word signifies that these natural biological compounds act similarly to drugs such as marijuana and hashish. Too much endocannabinoid can lead to impaired learning and memory.

In simple terms, FABP5 transports endocannabinoids for processing. FABP5 functions like a bus, carrying the brain’s endocannabinoids and their biological products to two stations within the neuron. FABP5 captures endocannabinoids entering the neuron and delivers them to an enzyme that degrades them (station 1). Then that degraded product is picked up by the same protein (FABP5) and shuttled to the cell nucleus — specifically, to a transcription factor within it (station 2). Binding of the degraded product activates the transcription factor and allows it to induce expression of multiple genes. The genes induced in this case tell the cells to take steps that promote learning and memory.

Noy and associates also compared memory and learning in FABP5-deficient mice and in normal ones. In one test, both sets of mice repeatedly swam in mazes that had a platform in one established location where they could climb out of the water. During subsequent swims, the wild-type mice reached the platform quickly because they had learned — and remembered — its location. Their FABP5-deficient counterparts took much longer, typically finding the platform’s location by chance.

“In addition to regulating cell growth as in skin and in cancer cells, for example, FABP5 also plays a key role in neurons of the brain,” Noy said. “FABP5 controls the biological actions of small compounds that affect memory and learning and that activate a transcription factor, which regulates neuronal function.”

(Source: casemed.case.edu)

Filed under FABP5 cognitive function learning memory hippocampus endocannabinoids neuroscience science

94 notes

Study explores genetics behind Alzheimer’s resiliency

Autopsies have revealed that some individuals develop the cellular changes indicative of Alzheimer’s disease without ever showing clinical symptoms in their lifetime.

Vanderbilt University Medical Center memory researchers have discovered a potential genetic variant in these asymptomatic individuals that may make brains more resilient against Alzheimer’s.

“Most Alzheimer’s research is searching for genes that predict the disease, but we’re taking a different approach. We’re looking for genes that predict who among those with Alzheimer’s pathology will actually show clinical symptoms of the disease,” said principal investigator Timothy Hohman, Ph.D., a post-doctoral research fellow in the Center for Human Genetics Research and the Vanderbilt Memory and Alzheimer’s Center.

The article, “Genetic modification of the relationship between phosphorylated tau and neurodegeneration,” was published online recently in the journal Alzheimer’s & Dementia.

The researchers used a marker of Alzheimer’s disease found in cerebrospinal fluid called phosphorylated tau. In brain cells, tau is a protein that stabilizes the highways of cellular transport in neurons. In Alzheimer’s disease tau forms “tangles” that disrupt cellular messages.

Analyzing a sample of 700 subjects from the Alzheimer’s Disease Neuroimaging Initiative, Hohman and colleagues looked for genetic variants that modify the relationship between phosphorylated tau and lateral ventricle dilation — a measure of disease progression visible with magnetic resonance imaging (MRI). One genetic variant (rs4728029) was found to relate to both ventricle dilation and cognition, and appears to be a marker of neuroinflammation.
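Asking whether a genotype modifies the relationship between phosphorylated tau and ventricle dilation is, in statistical terms, a test of an interaction term in a regression model. A hedged sketch on simulated data — the model form, not the numbers, is the point:

```python
# Hedged sketch of a genotype x p-tau interaction model, as one might
# test whether a variant modifies the p-tau / dilation relationship.
# Data are simulated; nothing here reflects the actual ADNI cohort.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
ptau = rng.normal(0, 1, n)            # standardized p-tau level
genotype = rng.integers(0, 2, n)      # carrier (1) vs non-carrier (0)
# Simulate dilation that worsens with p-tau mainly in carriers:
dilation = 0.1 * ptau + 0.8 * genotype * ptau + rng.normal(0, 0.5, n)

df = pd.DataFrame({"ptau": ptau, "genotype": genotype, "dilation": dilation})
model = smf.ols("dilation ~ ptau * genotype", data=df).fit()
print(model.params["ptau:genotype"])  # interaction coefficient
```

A significant `ptau:genotype` coefficient says the slope of dilation on p-tau differs by genotype — the statistical signature of a "resilience" variant like the one described here; genome-wide versions of this test add covariates and stringent multiple-testing correction.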

“This gene marker appears to be related to an inflammatory response in the presence of phosphorylated tau,” Hohman said.

“It appears that certain individuals with a genetic predisposition toward a ‘bad’ neuroinflammatory response have neurodegeneration. But those with a genetic predisposition toward no inflammatory response, or a reduced one, are able to endure the pathology without marked neurodegeneration.”

Hohman hopes to expand the study to include a larger sample and investigate gene and protein expression using data from a large autopsy study of Alzheimer’s disease.

“The work highlights the possible mechanism behind asymptomatic Alzheimer’s disease, and with that mechanism we may be able to approach intervention from a new perspective. Future interventions may be able to activate these innate response systems that protect against developing Alzheimer’s symptoms,” Hohman said.

(Source: news.vanderbilt.edu)

Filed under alzheimer's disease neurodegeneration memory phosphorylated tau genetics neuroscience science
