Neuroscience

Articles and news from the latest research reports.

Posts tagged science

88 notes

Too early to learn
Researchers from Bochum and Warwick suggest consequences for planning school lessons

Being born preterm goes hand in hand with an increased risk for neuro-cognitive deficits. Psychologists from the Ruhr-Universität Bochum and the University of Warwick, UK, have investigated the relationship between the duration of pregnancy and cognitive abilities under varying workload conditions. “Cognitive performance deficits of children dramatically increase as cognitive workload of tasks increases and pregnancy duration decreases,” says Dr Julia Jäkel from the Ruhr-Universität. In the journal “PLOS ONE”, the researchers report a new cognitive workload model describing the association between task complexity and incremental performance deficits of preterm children.

Large numbers of preterm-born babies will place new demands on the education system
About 15 million babies, more than ten per cent of all babies worldwide, are born preterm every year, that is, before the 37th week of pregnancy, and the numbers are rising due to improvements in neonatal medicine and demographic changes. Recent studies suggest that delivery at any gestation other than full term (39 to 41 weeks gestational age) may impair brain development, rendering survivors at risk for adverse neuro-cognitive outcomes. Considering that 50 per cent of children are born before the 39th week of pregnancy, even small increases in cognitive impairments may have large effects at the population level. “As the total number of children born preterm increases there will be parallel increases in special education needs placing new demands on the education system,” Julia Jäkel and her colleagues say. To date, uncertainties remain regarding the nature and underlying causes of learning difficulties in preterm children. The new cognitive workload model reconciles previously inconsistent findings on the relationship between gestational age and cognitive performance.
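The population-level arithmetic behind that warning can be sketched directly from the article's figures; the rounding and the hypothetical 1-point impairment rise below are ours, for illustration only.

```python
# Back-of-envelope arithmetic using the article's figures (rounded by us):
# small per-child effects add up to large absolute numbers of children.

preterm_births = 15_000_000            # babies born preterm each year
preterm_share = 0.10                   # "more than ten per cent" of all births
total_births = preterm_births / preterm_share        # implies ~150 million births/year
born_before_week_39 = 0.50 * total_births            # 50% arrive before week 39

# Hypothetical: a 1-percentage-point rise in impairment rate among them
extra_affected = 0.01 * born_before_week_39
print(f"~{total_births / 1e6:.0f} million births/year worldwide")
print(f"~{extra_affected / 1e6:.2f} million additional affected children/year")
```

Since 15 million is "more than" ten per cent, the 150 million total slightly overestimates annual births; the point is the order of magnitude, not the exact figure.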
Cognitive deficits of children born preterm depend on the workload of the task
The research team tested 1,326 children, born between weeks 23 and 41 of pregnancy, at the age of eight. Data were collected as part of the prospective Bavarian Longitudinal Study. The children took part in a range of cognitive tests with varying workload. High-workload tasks require the simultaneous integration of different sources of information, thereby placing high demands on the so-called working memory. The results: the higher the workload and the shorter the pregnancy duration, the larger the cognitive performance deficits. Deficits were disproportionately higher for children born before the 34th week of pregnancy compared with children born after week 33. Being born preterm specifically affected the ability to solve high-workload tasks, whereas lower-workload tasks were largely unaffected.
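The qualitative shape of the reported pattern, deficits that grow with workload and with prematurity and jump below 34 weeks, can be sketched as a toy function. All coefficients here are invented for demonstration; this is not the authors' fitted model.

```python
# Toy illustration of the cognitive workload model (invented parameters):
# deficit rises with task workload, with weeks of prematurity, and shows a
# disproportionate jump for children born before week 34.

def predicted_deficit(gestational_weeks, workload):
    """Return an illustrative deficit score (higher = worse).

    workload: 0.0 (low) .. 1.0 (high)
    """
    weeks_early = max(0.0, 40.0 - gestational_weeks)  # weeks short of full term
    base = 0.1 * weeks_early                          # mild main effect of prematurity
    interaction = 0.3 * weeks_early * workload        # deficit scales with workload
    # Disproportionate penalty for very preterm children (< 34 weeks)
    very_preterm_penalty = 1.5 if gestational_weeks < 34 else 0.0
    return base + interaction + very_preterm_penalty * workload

# Low-workload tasks stay largely unaffected; high-workload tasks separate groups
for weeks in (40, 36, 32, 28):
    low, high = predicted_deficit(weeks, 0.1), predicted_deficit(weeks, 0.9)
    print(f"{weeks} weeks: low-workload {low:.2f}, high-workload {high:.2f}")
```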
Results are relevant for cognitive follow-ups and planning of school lessons
According to the researchers, these results should be taken into account for routine cognitive follow-ups of preterm children as well as for planning school lessons. “New studies suggest that computerized training can improve working memory capacity,” Prof Dieter Wolke from Warwick says. “In addition, educational interventions could be developed in which information is not presented simultaneously to preterm children but more slowly and sequentially to promote academic attainment.”

Filed under preterm children cognitive development cognitive performance cognitive deficits neuroscience science

134 notes

Rats have a double view of the world
Scientists from the Max Planck Institute for Biological Cybernetics in Tübingen, using miniaturised high-speed cameras and high-speed behavioural tracking, discovered that rats move their eyes in opposite directions in both the horizontal and the vertical plane when running around. Each eye moves in a different direction, depending on the change in the animal’s head position. An analysis of both eyes’ fields of view found that the eye movements exclude the possibility that rats fuse the visual information into a single image, as humans do. Instead, the eyes move in such a way that the space above the animal is permanently in view – presumably an adaptation to the major threat from predatory birds that rodents face in their natural environment.
Like many mammals, rats have their eyes on the sides of their heads. This gives them a very wide visual field, useful for the detection of predators. However, three-dimensional vision requires overlap of the visual fields of the two eyes. Thus, the visual system of these animals needs to meet two conflicting demands at the same time: maximum surveillance on the one hand, and detailed binocular vision on the other.
The research team from the Max Planck Institute for Biological Cybernetics have now, for the first time, observed and characterised the eye movements of freely moving rats. They fitted minuscule cameras weighing only about one gram to the animals’ heads, which could record the lightning-fast eye movements with great precision. The scientists also used another new method to measure the position and direction of the head, enabling them to reconstruct the rats’ exact line of view at any given time.
The Max Planck scientists’ findings came as a complete surprise. Although rats process visual information from their eyes through very similar brain pathways to other mammals, their eyes evidently move in a totally different way. “Humans move their eyes in a very stereotypical way for both counteracting head movements and searching around. Both our eyes move together and always follow the same object. In rats, on the other hand, the eyes generally move in opposite directions,” explains Jason Kerr from the Max Planck Institute for Biological Cybernetics.
In a series of behavioural experiments, the neurobiologists also discovered that the eye movements largely depend on the position of the animal’s head. “When the head points downward, the eyes move back, away from the tip of the nose. When the rat lifts its head, the eyes look forward: cross-eyed, so to speak. If the animal puts its head on one side, the eye on the lower side moves up and the other eye moves down,” says Jason Kerr.
In humans, the directions in which the eyes look must be precisely aligned, otherwise an object cannot be fixated. A deviation of less than a single degree of the visual field is enough to cause double vision. In rats, the opposing movements of the left and right eyes mean that the lines of vision can differ by as much as 40 degrees in the horizontal plane and up to 60 degrees in the vertical plane. The consequence of these unusual eye movements is that, irrespective of vigorous head movements in all planes, the eyes always move in such a way that the area above the animal stays in view of both eyes simultaneously – something that does not occur in any other region of the rat’s visual field.
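A toy geometric sketch can make this concrete for the head-roll case: opposite in-head eye movements cancel the tilt of each line of sight, so both eyes keep pointing skyward. The resting elevation and the perfect one-to-one compensation are our idealizations, not measured values.

```python
# Idealized sketch: during a head roll, the lower-side eye rotates up and the
# upper-side eye rotates down inside the head (opposite movements), so both
# lines of sight keep the same world-frame elevation above the horizon.

def gaze_during_roll(head_roll_deg):
    """Return (lower_eye_elev, upper_eye_elev, in_head_divergence) in degrees."""
    resting = 40.0                      # invented resting elevation above horizon
    lower_in_head = +head_roll_deg      # lower-side eye moves up in the head
    upper_in_head = -head_roll_deg      # upper-side eye moves down in the head
    lower_world = resting - head_roll_deg + lower_in_head   # head tilt cancelled
    upper_world = resting + head_roll_deg + upper_in_head   # head tilt cancelled
    divergence = lower_in_head - upper_in_head              # grows to tens of degrees
    return lower_world, upper_world, divergence

for roll in (0, 15, 30):
    print(roll, gaze_during_roll(roll))
```

The in-head divergence grows to twice the roll angle, on the order of the tens of degrees reported, while both world-frame elevations stay fixed above the horizon.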
These unusual eye movements appear to be the visual system’s adaptation to the animals’ living conditions, given that they are preyed upon by numerous species of birds. Although the observed eye movements prevent the fusion of the two visual fields, the scientists postulate that permanent visibility in the direction of potential airborne attackers dramatically increases the animals’ chances of survival.

Filed under rats eye movements binocular vision double vision visual system neuroscience science

70 notes

Pitt team finds mechanism that causes noise-induced tinnitus and drug that can prevent it

An epilepsy drug shows promise in an animal model at preventing tinnitus from developing after exposure to loud noise, according to a new study by researchers at the University of Pittsburgh School of Medicine. The findings, reported this week in the early online version of the Proceedings of the National Academy of Sciences, reveal for the first time the reason the chronic and sometimes debilitating condition occurs.

An estimated 5 to 15 percent of Americans hear whistling, clicking, roaring and other phantom sounds of tinnitus, which typically is induced by exposure to very loud noise, said senior investigator Thanos Tzounopoulos, Ph.D., associate professor and member of the auditory research group in the Department of Otolaryngology, Pitt School of Medicine.

"There is no cure for it, and current therapies such as hearing aids don’t provide relief for many patients," he said. "We hope that by identifying the underlying cause, we can develop effective interventions."

The team focused on an area of the brain that is home to an important auditory center called the dorsal cochlear nucleus (DCN). From previous research in a mouse model, they knew that tinnitus is associated with hyperactivity of DCN cells — they fire impulses even when there is no actual sound to perceive. For the new experiments, they took a close look at the biophysical properties of tiny channels, called KCNQ channels, through which potassium ions travel in and out of the cell.

"We found that mice with tinnitus have hyperactive DCN cells because of a reduction in KCNQ potassium channel activity," Dr. Tzounopoulos said. "These KCNQ channels act as effective 'brakes' that reduce excitability or activity of neuronal cells."

In the model, sedated mice are exposed in one ear to a 116-decibel sound, about the loudness of an ambulance siren, for 45 minutes, which was shown in previous work to lead to the development of tinnitus in 50 percent of exposed mice. Dr. Tzounopoulos and his team tested whether an FDA-approved epilepsy drug called retigabine, which specifically enhances KCNQ channel activity, could prevent the development of tinnitus. Thirty minutes into the noise exposure and twice daily for the next five days, half of the exposed group was given injections of retigabine.

Seven days after noise exposure, the team determined whether the mice had developed tinnitus by conducting startle experiments, in which a continuous, 70 dB tone is played for a period, stopped briefly, and then resumed before being interrupted with a much louder pulse. Mice with normal hearing perceive the gap in sound and are aware something has changed, so they are less startled by the loud pulse than mice with tinnitus, which hear phantom noise that masks the moment of silence between the background tones.
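A minimal sketch of how such gap-startle data might be scored follows; the function names and the 0.8 threshold are our invention for illustration, not the study's actual analysis.

```python
# Hypothetical scoring of the gap-startle test: a perceived silent gap before
# the loud pulse suppresses the startle reflex, while phantom noise that fills
# the gap leaves the startle almost as strong as with no gap at all.

def gap_inhibition_ratio(startle_with_gap, startle_without_gap):
    """Ratio near 1.0 means the gap barely reduced the startle (tinnitus-like)."""
    return startle_with_gap / startle_without_gap

def classify(ratio, threshold=0.8):
    # threshold is an arbitrary illustrative cut-off
    return "tinnitus-like" if ratio >= threshold else "gap perceived"

print(classify(gap_inhibition_ratio(2.0, 10.0)))  # strong suppression
print(classify(gap_inhibition_ratio(9.5, 10.0)))  # little suppression
```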

The researchers found that mice that were treated with retigabine immediately after noise exposure did not develop tinnitus. Consistent with previous studies, 50 percent of noise-exposed mice that were not treated with the drug exhibited behavioral signs of the condition.

"This is an important finding that links the biophysical properties of a potassium channel with the perception of a phantom sound," Dr. Tzounopoulos said. "Tinnitus is a channelopathy, and these KCNQ channels represent a novel target for developing drugs that block the induction of tinnitus in humans."

The KCNQ family comprises five different subunits, four of which are sensitive to retigabine. Dr. Tzounopoulos and his collaborators aim to develop a drug that is specific for the two KCNQ subunits involved in tinnitus, to minimize the potential for side effects.

"Such a medication could be a very helpful preventive strategy for soldiers and other people who work in situations where exposure to very loud noise is likely," Dr. Tzounopoulos said. "It might also be useful for other conditions of phantom perceptions, such as pain in a limb that has been amputated."

(Source: eurekalert.org)

Filed under tinnitus noise exposure potassium channels dorsal cochlear nucleus animal model neuroscience science

79 notes

Copper on the Brain
The value of copper has risen dramatically in the 21st century as many a thief can tell you, but in addition to the thermal and electrical properties that make it such a hot commodity metal, copper has chemical properties that make it essential to a healthy brain. Working at the interface of chemistry and neuroscience, Berkeley Lab chemist Christopher Chang and his research group at UC Berkeley have developed a series of fluorescent probes for molecular imaging of copper in the brain. Speaking at the recent national meeting of the American Chemical Society in New Orleans, he described the challenges of creating and applying live-cell and live-animal copper imaging probes and explained the importance of meeting these challenges.
“The human brain is a unique biological system, possessing unparalleled biological complexity in a compact space,” Chang said. “Although it accounts for only two percent of total body mass, it consumes 20 percent of the oxygen taken in through respiration. As a consequence of its high demand for oxygen and oxidative metabolism, the brain has among the highest levels of copper, as well as iron and zinc, in the body.”
Neurons and glial cells in the brain both require copper for the basic respiratory and antioxidant enzymes cytochrome c oxidase and superoxide dismutase. Copper is also necessary for brain-specific enzymes that control neurotransmitters such as dopamine, as well as neuropeptides and dietary amines. Disruption of copper oxidation in the brain has been linked to several neurodegenerative diseases, including Alzheimer’s, Parkinson’s, Menkes’ and Wilson’s.
“The complex relationships between copper status and various stages of health and disease have been difficult to determine in part because of a lack of methods for monitoring dynamic changes in copper pools in whole living organisms,” Chang said. “We’ve been designing fluorescent probes that can map the movement of copper in live cells, tissue or even model organisms, such as mice and zebrafish.”
Their first success was Coppersensor-3 (CS3), a small-molecule fluorescent probe that can be used to image labile copper pools in living cells at endogenous, basal levels. They used CS3 in conjunction with synchrotron-based X-ray fluorescence microscopy (XRFM) to discover that neuronal cells move significant pools of copper upon activation and that these copper movements are dependent on calcium signaling.
“This was the first established link between mobile copper and major cell signaling pathways,” Chang said. “Being able to map transient copper movements after neuronal depolarization revealed how neural activity triggers copper mobility, and enabled us to create a model for calcium/copper crosstalk in neurons.”
The CS3 probe was followed by Mitochondrial Coppersensor-1 (Mito-CS1), a fluorescent sensor that can selectively target mitochondria and detect basal and labile copper pools in living cells. Mitochondria, the organelles that generate most of the chemical energy used by cells, are important reservoirs for copper. By allowing direct, real-time visualization of exchangeable mitochondrial copper pools, the Mito-CS1 probe enabled Chang and his colleagues to discover that cells maintain copper homeostasis in mitochondria even in situations of copper deficiency and metabolic malfunctions.
“This work illustrated the importance of regulating copper stores in mitochondria,” Chang said.
The latest copper probe from Chang’s group is Coppersensor 790 (CS790), a fluorescent sensor that features near-infrared excitation and emission capabilities, ideal for penetrating thicker biological specimens. CS790 can be used to monitor fluctuations in exchangeable copper stores under basal conditions, as well as under copper overload or deficiency conditions. Chang and his group are using CS790 to study a mouse model of Wilson’s disease, a genetic disorder characterized by an accumulation of excess copper.
“The in vivo fluorescence detection of copper provided by CS790 and our other fluorescent probes is opening up unique opportunities to explore the roles that copper plays in the healthy physiology of the brain, as well as in the development and progression of copper-related diseases,” Chang said.

Filed under neurodegenerative diseases glia cells antioxidant enzymes copper copper oxidation neuroscience science

141 notes

Scientists discover the origin of a giant synapse
Humans and most mammals can determine the spatial origin of sounds with remarkable acuity. We use this ability all the time: crossing the street, or locating an invisible ringing cell phone in a cluttered bedroom. To accomplish this small daily miracle, the brain has developed a circuit that is rapid enough to detect the tiny lag between the moment auditory information reaches one of our ears and the moment it reaches the other. The mastermind of this circuit is the “Calyx of Held”, the largest known synapse in the brain. EPFL scientists have revealed the role that a certain protein plays in initiating the growth of these giant synapses.
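How tiny that lag is follows from simple acoustics. The numbers below are our back-of-envelope assumptions (a roughly human-sized head, sound in air), not figures from the article:

```python
# Back-of-envelope: the interaural time difference (ITD) that the
# sound-localization circuit must resolve is a few hundred microseconds at most.

import math

SPEED_OF_SOUND = 343.0    # m/s in air at room temperature
HEAD_WIDTH = 0.18         # m, approximate human ear-to-ear distance (assumed)

def interaural_delay_us(azimuth_deg):
    """Extra travel time (microseconds) to the far ear for a source at azimuth_deg."""
    path_difference = HEAD_WIDTH * math.sin(math.radians(azimuth_deg))
    return path_difference / SPEED_OF_SOUND * 1e6

print(f"{interaural_delay_us(90):.0f} us")   # source directly to the side: ~525 us
print(f"{interaural_delay_us(10):.0f} us")   # slightly off-center: ~91 us
```

Resolving delays this short is why the sub-millisecond transmission of the Calyx of Held matters.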
The discovery, published in Nature Neuroscience, could also help shed light on a number of neuropsychiatric disorders.
Enormous synapses enable faster communication
Ordinarily, neurons have thousands of contact points, known as synapses, with neighboring neurons. Within a given time frame, a neuron has to receive several signals from its neighbors in order to fire its own signal in response. Because of this, information passes from neuron to neuron in a relatively random manner.
In the auditory part of the brain, this is not the case. Synapses often grow to extremely large sizes, and these behemoths are known as “Calyx of Held” synapses. Because they have hundreds of contact points, they are capable of transmitting a signal singlehandedly to a neighboring neuron. “It’s almost like peer-to-peer communication between neurons,” explains EPFL professor Ralf Schneggenburger, who led the study. The result is that information is processed extremely quickly, in a few fractions of a millisecond, instead of the slower pace of more than 10 milliseconds that occurs in most other neuronal circuits.
Identifying the protein
To isolate the protein responsible for controlling the growth of this gigantic synapse, the scientists had to perform painstaking research. Using methods for analyzing gene expression in mice, they identified several members of the “BMP” family of proteins from among more than 20,000 possible candidates.
To verify that they had truly identified the right protein, the researchers disabled BMP protein receptors in the auditory part of a mouse brain. “The resulting electrophysiological signal of the Calyx of Held was significantly altered,” explains Le Xiao, first author on the study. “This would suggest a large anatomical difference.”
The scientists then reconstructed the synapses in three dimensions from slices observed under an electron microscope. Instead of a single, massive Calyx of Held encompassing nearly half the neuron, the 3D image of the neuron clearly shows several smaller synapses. “This shows that the process involving the BMP protein not only causes that one synapse to grow, but also performs a selection, by eliminating the others,” says Schneggenburger.
Synaptic connectivity, the key to many psychiatric puzzles
The impact of this study will go well beyond increasing our understanding of the auditory system. The results suggest that the BMP protein plays an important role in developing connectivity in the brain. Schneggenburger and his colleagues are currently investigating its role elsewhere in the brain. “Some neuropsychiatric disorders, such as schizophrenia and autism, are characterized by the abnormal development of synaptic connectivity in certain key parts of the brain,” explains Schneggenburger. By identifying and explaining the role of various proteins in this process, the scientists hope to be able to shed more light on these poorly understood disorders.

Scientists discover the origin of a giant synapse

Humans and most mammals can determine the spatial origin of sounds with remarkable acuity. We use this ability all the time—crossing the street; locating an invisible ringing cell phone in a cluttered bedroom. To accomplish this small daily miracle, the brain has developed a circuit that’s rapid enough to detect the tiny lag that occurs between the moment the auditory information reaches one of our ears, and the moment it reaches the other. The mastermind of this circuit is the “Calyx of Held,” the largest known synapse in the brain. EPFL scientists have revealed the role that a certain protein plays in initiating the growth of these giant synapses.

The discovery, published in Nature Neuroscience, could also help shed light on a number of neuropsychiatric disorders.

Enormous synapses enable faster communication

Ordinarily, neurons have thousands of contact points – known as synapses – with neighboring neurons. Within a given time frame, a neuron has to receive several signals from its neighbors in order to fire its own signal in response. Because of this, information passes from neuron to neuron in a relatively random manner.

In the auditory part of the brain, this is not the case. Synapses often grow to extremely large sizes, and these behemoths are known as “Calyx of Held” synapses. Because they have hundreds of contact points, they are capable of transmitting a signal singlehandedly to a neighboring neuron. “It’s almost like peer-to-peer communication between neurons,” explains EPFL professor Ralf Schneggenburger, who led the study. The result is that information is processed extremely quickly, in a few fractions of a millisecond, instead of the slower pace of more than 10 milliseconds that occurs in most other neuronal circuits.

Identifying the protein

To isolate the protein responsible for controlling the growth of this gigantic synapse, the scientists had to perform painstaking research. Using methods for analyzing gene expression in mice, they identified several members of the “BMP” family of proteins from among more than 20,000 possible candidates.

To verify that they had truly identified the right protein, the researchers disabled BMP protein receptors in the auditory part of a mouse brain. “The resulting electrophysiological signal of the Calyx of Held was significantly altered,” explains Le Xiao, first author on the study. “This would suggest a large anatomical difference.”

The scientists then reconstructed the synapses in three dimensions from slices that were observed under an electron microscope. Instead of a single, massive Calyx of Held, which would encompass nearly half the neuron, the 3D image of the neuron clearly shows several, smaller synapses. “This shows that the process involving the BMP protein not only causes that one synapse to grow, but also performs a selection, by eliminating the others,” says Schneggenburger.

Synaptic connectivity, the key to many psychiatric puzzles

The impact of this study will go well beyond increasing our understanding of the auditory system. The results suggest that the BMP protein plays an important role in developing connectivity in the brain. Schneggenburger and his colleagues are currently investigating its role elsewhere in the brain. “Some neuropsychiatric disorders, such as schizophrenia and autism, are characterized by the abnormal development of synaptic connectivity in certain key parts of the brain,” explains Schneggenburger. By identifying and explaining the role of various proteins in this process, the scientists hope to be able to shed more light on these poorly understood disorders.

Filed under brain synapses calyx of held synapses neurons auditory system psychiatric disorders neuroscience science

40 notes

Researchers identify genetic suspects in sporadic Lou Gehrig’s disease

Researchers at the Stanford University School of Medicine have identified mutations in several new genes that might be associated with the development of spontaneously occurring cases of the neurodegenerative disease known as amyotrophic lateral sclerosis, or ALS. Also known as Lou Gehrig’s disease, the progressive, fatal condition, in which the motor neurons that control movement and breathing gradually cease to function, has no cure.

Although researchers know of some mutations associated with inherited forms of ALS, the majority of patients have no family history of the disease, and there are few clues as to its cause. The Stanford researchers compared the DNA sequences of 47 patients who have the spontaneous form of the disease, known as sporadic ALS, with those of their unaffected parents. The goal was to identify new mutations that were present in the patient but not in either parent that may have contributed to disease development.

Several suspects are mutations in genes that encode chromatin regulators — cellular proteins that govern how DNA is packed into the nucleus of a cell and how it is accessed when genes are expressed. Protein members of one of these chromatin-regulatory complexes have recently been shown to play roles in normal development and some forms of cancer.

"The more we know about the genetic causes of the disorder, the greater insight we will have as to possible therapeutic targets," said Aaron Gitler, PhD, associate professor of genetics. "Until now, researchers have primarily relied upon large families with many cases of inherited ALS and attempted to pinpoint genetic regions that seem to occur only in patients. But more than 90 percent of ALS cases are sporadic, and many of the genes involved in these cases are unknown."

Gitler is the senior author of the study, published online May 26 in Nature Neuroscience. Postdoctoral scholar Alessandra Chesi, PhD, is the lead author. Gitler and Chesi collaborated with members of the laboratory of Gerald Crabtree, MD, professor of developmental biology and of pathology. Crabtree, a Howard Hughes Medical Institute investigator, is also a co-author of the study.

Chesi and Gitler combined deductive reasoning with recent advances in sequencing technology to conduct the work, which relied on the availability of genetic samples from not only ALS patients, but also the patients’ unaffected parents. Such trios can be difficult to obtain for diseases like sporadic ALS that strike well into adulthood when a patient’s parents may no longer be alive. Gitler and Chesi collaborated with researchers from Emory University and Johns Hopkins University to collect these samples.

The researchers compared the sequences of a portion of the genome called the exome, which directly contributes to the amino acid sequences of all the proteins in a cell. (Many genes contain intervening, non-protein-coding regions of DNA called introns that are removed prior to protein production.) Mutations found only in the patient’s exome, but not in the exomes of his or her parents, were viewed as potential disease-associated candidates – particularly if they affected the composition or structure of the resulting protein made from that gene.

Focusing on just the exome, which is about 1 percent of the total amount of DNA in each human cell, vastly reduced the total amount of DNA that needed to be sequenced and allowed the researchers to achieve relatively high coverage (or repeated sequencing to ensure accuracy) of each sample.

"We wanted to find novel changes in the patients," Chesi said. "These represent a class of mutations called de novo mutations that likely occurred during the production of the parents’ reproductive cells." As a result, these mutations would be carried in all the cells of patients, but not in their parents or siblings.

Using the exome sequencing technique, the researchers identified 25 de novo mutations in the ALS patients. Of these, five are known to be in genes involved in the regulation of the tightly packed form of DNA called chromatin — a proportion that is much higher than would have been expected by chance, according to Chesi.
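
Whether 5 chromatin-related hits out of 25 mutations beats chance can be checked with a one-sided binomial tail. The 2 per cent background rate below is purely an assumption for illustration; the study's actual expectation would come from gene annotations:

```python
from math import comb

def binomial_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): the probability of at least k
    hits among n mutations if each independently lands in the gene
    class with probability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Assumed background: 2% of genes encode chromatin regulators.
print(f"{binomial_tail(5, 25, 0.02):.1e}")  # a tail probability well below 0.001
```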

Furthermore, one of the five chromatin regulatory proteins, SS18L1, is a member of a neuron-specific complex called nBAF, which has long been studied in Crabtree’s laboratory. This complex is strongly expressed in the brain and spinal cord, and affects the ability of the neurons to form branching structures called dendrites that are essential to nerve signaling.

"We found that, in one sporadic ALS case, the last nine amino acids of this protein are missing," Gitler said. "I knew that Gerald Crabtree’s lab had been investigating SS18L1, so I asked him about it. In fact, they had already identified these amino acids as being very important to the function of the protein."

When the researchers expressed the mutant SS18L1 in motor neurons isolated from mouse embryos, they found the neurons were unable to extend and grow new dendrites as robustly as normal neurons in response to stimuli. They also showed that SS18L1 appears to physically interact with another protein known to be involved in cases of familial, or inherited, ALS.

Although the results are intriguing, the researchers caution that more work is necessary to conclusively prove whether and how mutations in SS18L1 contribute to sporadic cases of ALS. But now they have an idea of where to look in other patients, without requiring the existence of patient and parent trios. They are planning to sequence SS18L1 and other candidates in an additional few thousand sporadic ALS cases.

"This is the first systematic analysis of ALS triads for the presence of de novo mutations," Chesi said. "Now we have a list of candidate genes we can pursue. We haven’t proven that these mutations cause ALS, but we’ve shown, at least in the context of SS18L1, that the mutation carried by some patients is damaging to the protein and affects the ability of mouse motor neurons to form dendrites."

(Source: med.stanford.edu)

Filed under ALS Lou Gehrig's disease DNA sequence mutations neurodegenerative diseases neuroscience science

81 notes

‘Should I stay or should I go?’ CSHL scientists link brain cell types to behavior

You are sitting on your couch flipping through TV channels trying to decide whether to stay put or get up for a snack. Such everyday decisions about whether to “stay” or to “go” are supported by a brain region called the anterior cingulate cortex (ACC), which is part of the prefrontal cortex. Neuroscientists from Cold Spring Harbor Laboratory (CSHL) have now identified key circuit elements that contribute to such decisions in the ACC.

CSHL Associate Professor Adam Kepecs and his team publish results that, for the first time, link specific brain cell types to a particular behavior pattern in mice – a “stay or go” pattern called foraging behavior. The paper, published online in Nature, shows that the firing of two distinct types of inhibitory neurons, known as somatostatin (SOM) and parvalbumin (PV) neurons, has a strong correlation with the start and end of a period of foraging behavior.

Linking specific neuronal types to well-defined behaviors has proved extremely difficult. “There’s a big gap in our knowledge between our understanding of neuron types in terms of their physical location and their place in any given neural circuit, and what these neurons actually do during behavior,” says Kepecs.

Part of the problem is the technical challenge of doing these studies in live, freely behaving mice. Key to solving that problem is a mouse model developed in the laboratory of CSHL Professor Z. Josh Huang. The mouse has a genetic modification that allows investigators to target a specific population of neurons with any protein of interest.

Kepecs’ group, led by postdocs Duda Kvitsiani and Sachin Ranade, used this mouse to label specific neuron types in the ACC with a light-activated protein – a technique known as optogenetic tagging. Whenever they shone light onto the brains of the mice they were recording from, only the tagged PV and SOM neurons responded promptly with a ‘spike’ in their activity, enabling the researchers to pick them out from the vast diversity of cellular responses seen at any given moment.

The team recorded neural activity in the ACC of these mice while they engaged in foraging behavior. They discovered that the PV and SOM inhibitory neurons responded around the time of the foraging decisions — in other words, whether to stay and drink or go and explore elsewhere. Specifically, when the mice entered an area where they could collect a water reward, SOM inhibitory neurons shut down and entered a period of low-level activity, thereby opening a ‘gate’ for information to flow into the ACC. When the mice decided to leave that area and look elsewhere, PV inhibitory neurons fired and abruptly reset cell activity.

“The brain is complex and continuously active, so it makes sense that these two types of inhibitory interneurons define the boundaries of a behavior such as foraging, opening and then closing the ‘gate’ within a particular neural circuit through changes in their activity,” says Kepecs.

This is an important advance, addressing a problem in behavioral neuroscience that scientists call “the cortical response zoo.” When researchers record neural activity in cortex during behavior, and they don’t know which type of neurons they are recording from, a bewildering array of responses is seen. This greatly complicates the task of interpretation. Hence the significance of the Kepecs team’s results, for the first time showing that specific cortical neuron types can be linked to specific aspects of behavior.

“We think about the brain and behavior in terms of levels; what the cell types are and the circuits or networks they form; which regions of the brain they are in; and what behavior is modulated by them,” explains Kepecs. “By observing that the activity of specific cell types in the prefrontal cortex is correlated with a behavioral period, we have identified a link between these levels.”

Filed under anterior cingulate cortex prefrontal cortex foraging behavior animal model neurons neuroscience science

536 notes

Old schooled: You never stop learning like a child

The adult brain is far more malleable than we thought, and so learning can be child’s play if you know how.

Some 36-year-olds choose to collect vintage wine, vinyl records or sports memorabilia. For Richard Simcott, it is languages. His itch to learn has led him to study more than 30 foreign tongues – and he’s not ready to give up.

During our conversation in a London restaurant, he reels off sentences in Spanish, Turkish and Icelandic as easily as I can name the pizza and pasta on our menu. He has learned Dutch on the streets of Rotterdam, Czech in Prague and Polish during a house share with some architects. At home, he talks to his wife in fluent Macedonian.

What’s remarkable about Simcott isn’t just the number and diversity of languages he has mastered. It’s his age. Long before grey hairs appear and waistlines expand, the mind’s cogs are meant to seize up, making it difficult to pick up any new skill, be it a language, the flute, or archery. Even if Simcott had primed his mind for new languages while at school, he should have faced a steep decline in his abilities as the years went by – yet he still devours unfamiliar grammars and strange vocabularies to a high level. “My linguistic landscape is always changing,” he says. “If you’re school-aged, or middle-aged – I don’t think there’s a big difference.”

A decade ago, few neuroscientists would have agreed that adults can rival the learning talents of children. But we needn’t be so defeatist. The mature brain, it turns out, is more supple than anyone thought. “The idea that there’s a critical period for learning in childhood is overrated,” says Gary Marcus, a psychologist at New York University. What’s more, we now understand the best techniques to accelerate knowledge and skill acquisition in adults, so can perhaps unveil a few tricks of the trade of super-learners like Simcott. Whatever you want to learn, it’s never too late to charge those grey cells.

The idea that the mind fossilises as it ages is culturally entrenched. The phrase “an old dog will learn no tricks” is recorded in an 18th-century book of proverbs and is probably hundreds of years older.

When researchers finally began to investigate the adult brain’s malleability in the 1960s, their results appeared to agree with the saying. Most insights came indirectly from studies of perception, which suggested that an individual’s visual abilities were capped at a young age. For example, restricting young animals’ vision for a few weeks after birth means they will never manage to see normally. The same is true for people born with cataracts or a lazy eye – repair too late, and the brain fails to use the eye properly for life. “For a very long time, it seemed that those constraints were set in stone after that critical period,” says Daphne Bavelier at the University of Rochester, New York.

These are extreme circumstances, of course, but the evidence suggested that the same neural fossilisation would stifle other kinds of learning. Many of the studies looked at language development – particularly in families of immigrants. While the children picked up new tongues with ease, their parents were still stuttering broken sentences. But if there is a critical period for foreign language learning, everyone should be affected equally; Simcott’s ability to master a host of languages should be as impossible as a dog playing the piano.

Bearing this in mind, Ellen Bialystok at York University in Toronto, Canada, recently turned to the US census records, which detailed the linguistic skills of more than 2 million Hispanic and Chinese immigrants. A “critical period” for learning a second language in infancy should have created a sharp difference between those who had moved country in early childhood and those who were uprooted in adolescence. In reality? “There was absolutely no discontinuity,” Bialystok says. Instead, she saw a very gradual decline with age among immigrants – which could reflect differences in environment as much as the adults’ rusty brain circuits. “People talk more slowly and clearly to children in short, simple sentences,” she says. “And the child’s entire social and educational network is organised around that language.”

Yet while Bialystok’s study suggested that adult brains are more pliable than had once been imagined, there was still the suspicion that children might have the edge in certain skills. Adult learners sometimes find it harder to learn to sing in tune, hit a home run or mimic an accent convincingly. At first glance, the problem might seem to lie in adults’ perception and motor skills. Learning involving these abilities differs from the acquisition of factual knowledge, because it needs us to rewire the eyes, ears and muscles.

It’s something that Marcus can identify with. At the age of 38, he devoted himself to learning the guitar, an experience he detailed in his book Guitar Zero. “My family’s initial response was laughter – but they soon saw I was making progress,” he says. Still, during his research, he attended a musical summer camp for 8 to 15-year-olds. He says he was quicker to catch on to the structure of songs, but his younger bandmates had better coordination and sense of pitch.

Yet the available evidence hints that children may not always be inherently better at such tasks. One study by Yang Zhang at the University of Minnesota in Minneapolis, which focused on the acquisition of foreign accents, suggests that adult learners may simply suffer from poor tuition. When the researchers gave adults recordings that mimicked the exaggerated baby talk of cooing mothers, the learners progressed rapidly.

Nor do adults necessarily fumble over the intricate movements that are crucial for music or sport. When volunteers visiting Virginia Penhune’s lab at Concordia University in Montreal, Canada, learned to press keys in a certain sequence, at certain times – essentially a boiled-down version of keyboard practice – the adults tended to outshine the younger volunteers.

During a more challenging test of hand-eye coordination, nearly 1000 volunteers of all age groups learned to juggle over a series of six training sessions. As you might expect, the senior citizens aged 60 to 80 began with some hesitation, but they soon caught up with the 30-year-olds and by the end of the trials all the adults were juggling more confidently than the 5 to 10-year-olds.

Old dogs, then, are much more adaptable than folklore would have it – and if we do have deficits, they aren’t insurmountable. The reason that children appear to be better learners may have more to do with their environment, and factors such as physical fitness (see “Faster body, faster mind”).

Indeed, many researchers believe that an adult’s lifestyle may be the biggest obstacle. “A child’s sole occupation is learning to speak and move around,” says Ed Cooke, a cognitive scientist who has won many memory contests. “If an adult had that kind of time to spend on attentive learning, I’d be very disappointed if they didn’t do a good job.”

A glut of free time and a carefree existence are out of reach for most of us, but there are other behaviours that boost children’s learning, and these habits can be easily integrated into even an adult’s schedule. For example, children are continually quizzed on what they know – and for good reason: countless studies have shown that testing doubles long-term recall, outperforming all other memory tactics. Yet adults attempting to learn new skills must rely on self-testing, which, let’s be honest, happens all too rarely.

That’s why Cooke developed a website, called Memrise, which helps take some of the pain out of testing and, crucially, can integrate learning into the adult day. It is designed to track your learning curve with cunningly timed tests that force you to retrieve the information just as you are about to forget it.

"Memrise engages your brain to the greatest possible extent," says Cooke, who has himself used the site to learn thousands of words of foreign vocabulary. Users can create their own courses – the topics range from art to zoology – and importantly, it is easy to load the site in the few spare minutes of your lunch break or while you are waiting for a train. Cooke also plans to launch a smartphone app.

What about tasks that involve perceptual learning or motor skills – like battling against a lifetime of tone deafness, or perfecting that golf swing? Here too, there are guiding principles that can help you rediscover the seemingly effortless learning of youth.

Adults can hamper progress with their own perfectionism: whereas children throw themselves into tasks, adults often agonise over the mechanics of the movements, trying to conceptualise exactly what is required. This could be one of our biggest downfalls. “Adults think so much more about what they are doing,” says Gabriele Wulf at the University of Nevada, Las Vegas. “Children just copy what they see.”

Wulf’s work over the past decade shows that you should focus on the outcome of your actions rather than the intricacies of the movements. She applies this finding in her own life: as a keen golfer, she has found it is better to think about the swing of the club, for instance, rather than the position of her hands. “I’m always trying to find where best to focus my attention,” she says. Similarly, if you are learning to sing, then you should concentrate on the tone of the voice, rather than on the larynx or the placement of the tongue. Study after study shows that simply shifting your mindset in this way accelerates your learning – perhaps by encouraging the subconscious, automatic movements that mark proficiency.

Misplaced conscientiousness may also lead adults to rely on overly rigid practice regimes that stifle long-term learning. The adult talent for perseverance, it seems, is not always a virtue. Left to their own devices, most people segment their sessions into separate blocks – when learning basketball, for instance, they may work on each shot in turn, perhaps because they feel a desire to master it. The approach may bring rapid improvements at first, but a host of studies have found that the refined technique is soon forgotten.

Instead, you do better to take a carousel approach, quickly rotating through the different skills to be practised without lingering too long on each one. Although the reason is still unclear, it seems that jumping between skills makes your mind work a little harder when applying what you’ve learned, helping you to retain the knowledge in the long term – a finding that has helped people improve in activities ranging from tennis and kayaking to pistol shooting.
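
The difference between the two practice structures is easy to make concrete. A toy schedule generator (the drill names are invented):

```python
from itertools import cycle, islice

def interleaved(skills, total_reps):
    """'Carousel' order: rotate quickly through the skills rather than
    lingering on each one."""
    return list(islice(cycle(skills), total_reps))

def blocked(skills, total_reps):
    """Blocked order: mass all repetitions of one skill before moving on."""
    return [s for s in skills for _ in range(total_reps // len(skills))]

drills = ["serve", "volley", "backhand"]
print(interleaved(drills, 6))  # ['serve', 'volley', 'backhand', 'serve', 'volley', 'backhand']
print(blocked(drills, 6))      # ['serve', 'serve', 'volley', 'volley', 'backhand', 'backhand']
```

Both schedules contain the same repetitions; only the ordering differs, yet the studies above suggest the carousel version is better retained.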

Such an approach might not be to everyone’s taste – with intricate skills, it might feel like you are making no progress. But even if you do revert to stints of lengthy practice, you can still reap some of the same benefits by occasionally trying out your skills in an unfamiliar situation. In tennis, you might move to a different part of the court for a couple of serves before returning to the regular position; while playing scales on a musical instrument, you might switch hands temporarily. According to work by Arnaud Boutin at the Leibniz Research Centre for Working Environment and Human Factors in Dortmund, Germany, venturing out of your comfort zone in this way helps to ensure that you improve your overall performance rather than confining your progress to the single task at hand. “Otherwise, the longer you practise, the harder it becomes to transfer the skills that you’ve learned to new situations,” says Boutin.

If none of that helps you learn like a child, simply adopting the arrogance of youth may do no harm. “As we get older, we lose our confidence, and I’m convinced that has a big impact on performance,” says Wulf. To test the assumption, she recently trained a small group of people to pitch a ball. While half were given no encouragement, she offered the others a sham test, rigged to demonstrate that their abilities were above average. They learned to pitch on target with much greater accuracy than those who didn’t get an ego boost.

Whether your itch to learn will ever match Simcott’s appetite for foreign languages is another matter. “What I do – it’s like an extreme sport. There’s no need to learn that many languages,” he says. He has recently turned to Chinese, and has no plans to stop after that. “I’m like a linguistic butterfly. There’s always another, really far away, that suddenly feels appealing.”

Still, embrace the idea that your mind is as capable as Simcott’s, and the lure of extreme learning might take hold of you too.

-by David Robson, New Scientist

Filed under adult brain learning perception linguistic skills critical period psychology neuroscience science

196 notes

Painting through the power of thought enabled by scientists

To the viewer it is an accomplished semiabstract image of flowers and clouds, but in fact this painting was produced by a paralysed woman solely through the power of thought.

Heide Pfützner, a former teacher from Leipzig, Germany, was diagnosed with Amyotrophic Lateral Sclerosis, also known as Motor Neurone Disease, yet she has managed to produce a series of the paintings with the aid of a new brain controlled computer.

She has been trained to master the device that uses brain waves to take control of a palette of colours, shapes and brushes to produce digital artworks.

Building on decades of knowledge about the meaning of the tiny electrical impulses created by the brain during thought, scientists have been able to create a computer programme which translates thoughts into electronic images.

As well as helping patients with progressive brain diseases like Mrs Pfützner, other users of the device include those who are “locked in” to a physically unresponsive state and therefore unable to communicate with the rest of the world.

The system works by detecting changes in the pattern of the user’s brain waves to allow them to select options in software and to move a cursor around a screen in front of them.

Read more

Filed under BCI brainwaves ALS art brain painting device neuroscience science

132 notes

Help at hand for schizophrenics

Researchers from the Bergen fMRI Group at the University of Bergen (UiB) are working on how to help schizophrenics who hear voices. They do this by studying people who also hear voices but who do not suffer from a mental illness. Over a five-year period, the group is studying the brain processes that cause people to hear voices. A recent report published in Frontiers in Human Neuroscience presents some of the group’s startling results.

“We have found that the primary auditory cortex of healthy people who hear voices responds less to outside stimuli than the corresponding area of the brain in people who don’t hear voices,” says postdoctoral researcher Kristiina Kompus.

Kompus, who works at UiB’s Department of Biological and Medical Psychology, is the lead author of the newly published study.

Variations in cognitive control

The primary auditory cortex is the region of the brain that processes sound. Kompus’ study shows that healthy people who hear voices share some attributes with schizophrenics, as this cortical region reacts less to outside stimuli in both groups.

However, there is an important difference between the two groups of voice-hearers. Whilst those with schizophrenia have a reduced ability to regulate the primary auditory cortex through cognitive control, those who hear voices but are healthy retain this ability.

“Because of this cognitive control, healthy people who hear voices are able to direct their attention outwards. This sets them apart from schizophrenics, who have a tendency to direct their attention inwards due to their decreased ability to regulate their primary auditory cortex,” says Kompus, adding:

“These discoveries have brought us one step closer to understanding the hallucinations of schizophrenics and why the voices become a problem for some people but not for others.”

Many healthy people hear voices

So what is the next step for Kompus and her fellow researchers?

“We will do further research on the brain structure of people with auditory hallucinations. In particular, we wish to look at the brain networks that process outside voices, to establish whether these voice hallucinations and outside voices are processed in the same parts of the brain. We also wish to establish whether hearing voices is a genetic trait,” she says.

According to the researchers, approximately five per cent of people hear voices in their heads even though they are otherwise healthy. This figure is based on research and surveys from several countries. For their own study, Kompus and her team used local media in Bergen to call for people who hear voices. The response was overwhelming, with around 30 people getting in touch with the researchers to register for the study.

Filed under schizophrenia auditory cortex auditory hallucinations hallucinations neuroscience science

free counters