Neuroscience


Memory Function - Decaffeinated Coffee May Help

Article Date: 05 Feb 2012 - 0:00 PST

Drinking decaffeinated coffee may improve the brain energy metabolism associated with type 2 diabetes, according to a study published in Nutritional Neuroscience and carried out by researchers at Mount Sinai School of Medicine. Dysfunctional brain energy metabolism is a known risk factor for dementia and other neurodegenerative disorders such as Alzheimer’s disease.

Giulio Maria Pasinetti, MD, PhD, and his team investigated whether dietary supplementation with a standardized decaffeinated coffee preparation, given prior to diabetes onset, could improve insulin resistance and glucose utilization in mice with diet-induced type 2 diabetes.

The mice were given the supplement for five months, after which the researchers assessed the genetic response in the animals’ brains. They discovered that the brain could metabolize glucose more effectively, using it for cellular energy. People with type 2 diabetes have reduced glucose utilization in the brain, which often leads to neurocognitive problems.

Dr. Pasinetti stated:

"Impaired energy metabolism in the brain is known to be tightly correlated with cognitive decline during aging and in subjects at high risk for developing neurodegenerative disorders. This is the first evidence showing the potential benefits of decaffeinated coffee preparations for both preventing and treating cognitive decline caused by type 2 diabetes, aging, and/or neurodegenerative disorders."



Drinking coffee is not recommended for everyone because of its association with cardiovascular risk factors, including elevated blood cholesterol and blood pressure, both of which raise the risk of heart disease, stroke, and premature death. However, these negative effects are attributed mainly to coffee’s high caffeine content; the study findings show that some components of decaffeinated coffee have beneficial health effects in mice.

Dr. Pasinetti wants to investigate whether decaffeinated coffee as a dietary supplement in humans can act as a preventive measure.

He concludes:

"In light of recent evidence suggesting that cognitive impairment associated with Alzheimer’s disease and other age-related neurodegenerative disorders may be traced back to neuropathological conditions initiated several decades before disease onset, developing preventive treatments for such disorders is critical."


Petra Rattue 

Source: Medical News Today

Feb 6, 2012
#science #neuroscience #psychology #brain #memory
Hearing Metaphors Activates Brain Regions Involved in Sensory Experience

ScienceDaily (Feb. 3, 2012) — When a friend tells you she had a rough day, do you feel sandpaper under your fingers? The brain may be replaying sensory experiences to help understand common metaphors, new research suggests.

Regions of the brain activated by hearing textural metaphors are shown in green. Yellow and red show regions activated by sensory experience of textures visually and through touch. (Credit: Image courtesy of Emory University)

Linguists and psychologists have debated how much the parts of the brain that mediate direct sensory experience are involved in understanding metaphors. George Lakoff and Mark Johnson, in their landmark work “Metaphors We Live By,” pointed out that our daily language is full of metaphors, some of which are so familiar (like “rough day”) that they may not seem especially novel or striking. They argued that metaphor comprehension is grounded in our sensory and motor experiences.

New brain imaging research reveals that a region of the brain important for sensing texture through touch, the parietal operculum, is also activated when someone listens to a sentence with a textural metaphor. The same region is not activated when a similar sentence expressing the meaning of the metaphor is heard.

The results were published online this week in the journal Brain & Language.

"We see that metaphors are engaging the areas of the cerebral cortex involved in sensory responses even though the metaphors are quite familiar," says senior author Krish Sathian, MD, PhD, professor of neurology, rehabilitation medicine, and psychology at Emory University. "This result illustrates how we draw upon sensory experiences to achieve understanding of metaphorical language."

Sathian is also medical director of the Center for Systems Imaging at Emory University School of Medicine and director of the Rehabilitation R&D Center of Excellence at the Atlanta Veterans Affairs Medical Center.

Seven college students who volunteered for the study were asked to listen to sentences containing textural metaphors as well as sentences that were matched for meaning and structure, and to press a button as soon as they understood each sentence. Blood flow in their brains was monitored by functional magnetic resonance imaging. On average, response to a sentence containing a metaphor took slightly longer (0.84 vs 0.63 seconds).

In a previous study, the researchers had already mapped out, for each of these individuals, which parts of the students’ brains were involved in processing actual textures by touch and sight. This allowed them to establish with confidence the link within the brain between metaphors involving texture and the sensory experience of texture itself.

"Interestingly, visual cortical regions were not activated by textural metaphors, which fits with other evidence for the primacy of touch in texture perception," says research associate Simon Lacey, PhD, the first author of the paper.

The researchers did not find metaphor-specific differences in cortical regions well known to be involved in generating and processing language, such as Broca’s or Wernicke’s areas. However, this result doesn’t rule out a role for these regions in processing metaphors, Sathian says. Also, other neurologists have seen that injury to various areas of the brain can interfere with patients’ understanding of metaphors.

"I don’t think that there’s only one area responsible for metaphor processing," Sathian says. "Actually, several recent lines of research indicate that engagement with abstract concepts is distributed around the brain. I think our research highlights the role of neural networks, rather than a single area of the brain, in these processes. What could be happening is that the brain is conducting an internal simulation as a way to understand the metaphor, and that’s why the regions associated with touch get involved. This also demonstrates how complex processes involving symbols, such as appreciating a painting or understanding a metaphor, do not depend just on evolutionarily new parts of the brain, but also on adaptations of older parts of the brain."

Sathian’s future plans include asking whether similar relationships exist for other senses, such as vision. The researchers also plan to probe whether magnetic stimulation of the brain in regions associated with sensory experience can interfere with understanding metaphors.

The research was supported by the National Institutes of Health and the National Science Foundation.

Source: ScienceDaily

Feb 6, 2012
#science #neuroscience #psychology #brain
Feb 4, 2012
Treating Brain Injuries With Stem Cell Transplants - Promising Results

Article Date: 04 Feb 2012 - 10:00 PST

The February edition of Neurosurgery reports that experiments in brain-injured rats have shown that stem cells injected via the carotid artery travel directly to the brain, greatly enhancing functional recovery. According to lead researcher Dr Toshiya Osanai, of Hokkaido University Graduate School of Medicine in Sapporo, Japan, the study demonstrates that the carotid artery injection technique, together with some form of in vivo optical imaging to track the stem cells after transplantation, could potentially become part of a new approach to stem cell transplantation for traumatic brain injury (TBI) in humans.

Dr. Osanai and team assessed a new “intra-arterial” technique of stem cell transplantation in rats, with the aim of delivering the stem cells directly to the brain without sending them through the general circulation. They induced TBI in the animals, then injected stem cells into the carotid artery seven days later.

The stem cells were obtained from the rats’ bone marrow and were labeled with “quantum dots” prior to injection. Quantum dots are biocompatible, fluorescent semiconductor crystals created with nanotechnology; they emit near-infrared light at wavelengths long enough to penetrate bone and skin, enabling non-invasive monitoring of the stem cells for four weeks following transplantation.

This in vivo optical imaging technique enabled the scientists to observe that the injected stem cells entered the brain on the first pass, without entering the general circulation. The stem cells began migrating from the capillaries into the injured part of the brain within three hours.

At week four, the researchers noted that the rats in the stem cell transplant group had achieved a substantial recovery of motor function, whereas the untreated animals showed no signs of recovery.

After examining the treated brains, the team found that the stem cells had differentiated into various types of brain cells and had aided healing of the injured brain area.

Over the last few years, interest in the potential of stem cell therapy for treating illnesses and conditions has been growing rapidly.

Developing stem cell therapy for brain injury in human patients

Stem cells represent a potentially important new treatment for those who have suffered brain injuries, including TBI and stroke. But even though bone marrow stem cells, similar to the ones used in the new study, are a promising source of donor cells, many questions remain open regarding the optimal timing, dose, and route of stem cell delivery.


In the new animal study, the rats were injected with the stem cells one week after TBI. This is a “clinically relevant” time, given that this is the minimum time it takes to develop stem cells from bone marrow.

Transplanting the stem cells into the carotid artery is a fairly simple procedure that delivers the cells directly to the brain.

The experiments have also provided key evidence that stem cell treatment can promote healing after TBI with a substantial recovery of function.

Dr. Osanai and team write that by using in vivo optical imaging:

"The present study was the first to successfully track donor cells that were intra-arterially transplanted into the brain of living animals over four weeks."

A similar form of imaging technology could also prove beneficial for monitoring the effects of stem cell transplantation in humans, although tracking will pose challenges because the human skull and scalp are much thicker than a rat’s.

The researchers conclude:

"Further studies are warranted to apply in vivo optical imaging clinically."

Written by Petra Rattue

Source: Medical News Today

Feb 4, 2012
#science #neuroscience #psychology #brain
Discovery of Extremely Long-Lived Proteins May Provide Insight Into Cell Aging and Neurodegenerative Diseases

ScienceDaily (Feb. 3, 2012) — One of the big mysteries in biology is why cells age. Now scientists at the Salk Institute for Biological Studies report that they have discovered a weakness in a component of brain cells that may explain how the aging process occurs in the brain.

This microscope image shows extremely long-lived proteins, or ELLPs, glowing green on the outside of the nucleus of a rat brain cell. DNA inside the nucleus is pictured in blue. The Salk scientists discovered that the ELLPs, which form channels through the wall of the nucleus, lasted for more than a year without being replaced. Deterioration of these proteins may allow toxins to enter the nucleus, resulting in cellular aging. (Credit: Courtesy of Brandon Toyama, Salk Institute for Biological Studies)

The scientists discovered that certain proteins, called extremely long-lived proteins (ELLPs), which are found on the surface of the nucleus of neurons, have a remarkably long lifespan.

While the lifespan of most proteins totals two days or less, the Salk Institute researchers identified ELLPs in the rat brain that were as old as the organism, a finding they reported February 3 in Science.

The Salk scientists are the first to discover an essential intracellular machine whose components include proteins of this age. Their results suggest the proteins last an entire lifetime, without being replaced.

ELLPs make up the transport channels on the surface of the nucleus; gates that control what materials enter and exit. Their long lifespan might be an advantage if not for the wear-and-tear that these proteins experience over time. Unlike other proteins in the body, ELLPs are not replaced when they incur aberrant chemical modifications and other damage.

Damage to the ELLPs weakens the ability of the three-dimensional transport channels that are composed of these proteins to safeguard the cell’s nucleus from toxins, says Martin Hetzer, a professor in Salk’s Molecular and Cell Biology Laboratory, who headed the research. These toxins may alter the cell’s DNA and thereby the activity of genes, resulting in cellular aging.

Funded by the Ellison Medical Foundation and the Glenn Foundation for Medical Research, Hetzer’s research group is the only lab in the world that is investigating the role of these transport channels, called the nuclear pore complex (NPC), in the aging process.

Previous studies have revealed that alterations in gene expression underlie the aging process. But, until the Hetzer lab’s discovery that mammals’ NPCs possess an Achilles’ heel that allows DNA-damaging toxins to enter the nucleus, the scientific community has had few solid clues about how these gene alterations occur.

"The fundamental defining feature of aging is an overall decline in the functional capacity of various organs such as the heart and the brain," says Hetzer. "This decline results from deterioration of the homeostasis, or internal stability, within the constituent cells of those organs. Recent research in several laboratories has linked breakdown of protein homeostasis to declining cell function."

The results Hetzer and his team now report suggest that declining neuron function may originate in ELLPs that deteriorate as a result of damage over time.

"Most cells, but not neurons, combat functional deterioration of their protein components through the process of protein turnover, in which the potentially impaired parts of the proteins are replaced with new functional copies," says Hetzer.

"Our results also suggest that nuclear pore deterioration might be a general aging mechanism leading to age-related defects in nuclear function, such as the loss of youthful gene expression programs," he adds.

The findings may prove relevant to understanding the molecular origins of aging and such neurodegenerative disorders as Alzheimer’s disease and Parkinson’s disease.

In previous studies, Hetzer and his team discovered large filaments in the nuclei of neurons of old mice and rats, whose origins they traced to the cytoplasm. Such filaments have been linked to various neurological disorders including Parkinson’s disease. Whether the misplaced molecules are a cause, or a result, of the disease has not yet been determined.

Also in previous studies, Hetzer and his team documented age-dependent declines in the functioning of NPCs in the neurons of healthy aging rats, which are laboratory models of human biology.

Hetzer’s team includes his colleagues at the Salk Institute as well as John Yates III, a professor in the Department of Chemical Physiology of The Scripps Research Institute.

When Hetzer decided three years ago to investigate whether the NPC plays a role in initiating or contributing to the onset of aging and certain neurodegenerative diseases, some members of the scientific community warned him that such a study was too bold and would be difficult and expensive to conduct. But Hetzer was determined despite the warnings.

Source: ScienceDaily

Feb 4, 2012
#science #neuroscience #psychology #disease
Feb 4, 2012
#science #neuroscience #psychology #brain #brain wave
Human Brains Wire Up Slowly but Surely

by Jon Cohen on 1 February 2012, 6:00 PM

Synaptic division. Compared with chimpanzees, human children slowly wire their brains. Credit: Fotosearch

As the father-to-son exchange in the old Cat Stevens song advised, “take your time, think a lot, … think of everything you’ve got.” Turns out the mellow ’70s folkie had stumbled upon what may explain a key feature of our brains that sets us apart from our closest relatives: We unhurriedly make synaptic connections through much of our early childhoods, and this plasticity enables us to slowly wire our brains based on our experiences. Given that humans and chimpanzees share 98.8% of the same genes, researchers have long wondered what drives our unique cognitive and social skills. Yes, chimpanzees are smart and cooperative to a degree, but we clearly outshine them when it comes to abstract thinking, self-regulation, assimilation of cultural knowledge, and reasoning abilities. Now a study that looks at postmortem brain samples from humans, chimpanzees, and macaques collected from before birth to up to the end of the life span for each of these species has found a key difference in the expression of genes that control the development and function of synapses, the connections among neurons through which information flows.

As researchers describe in a report published online today in Genome Research, they analyzed the expression of some 12,000 genes—part of the so-called transcriptome—from each species. They found 702 genes in the prefrontal cortex (PFC) of humans that had a pattern of expression over time that differed from the two other species. (The PFC plays a central role in social behavior, working toward goals, and reasoning.) By comparison, genes in the chimpanzee PFC at various life stages had only 55 unique expression patterns—12-fold fewer than found in humans.

The genes the researchers analyzed have myriad functions. But when the researchers created five modules that lumped together genes that were co-expressed, they found that the module in humans that’s most closely tied to synapse formation and function had a “drastically” different developmental trajectory. These genes were turned on high from just after birth until about 5 years of age; the same genes in chimpanzees and macaques began to stop expressing themselves shortly after birth. “We might have discovered one of the differences that makes human brains work differently from chimpanzees and macaques,” says lead researcher Philipp Khaitovich, an evolutionary biologist who works at both the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and the Chinese Academy of Sciences (CAS) in Shanghai, China.

The researchers, including Svante Pääbo of the Leipzig institute and Xiling Liu of CAS, went a step further and actually counted more than 7000 synapses visible in electron micrographs from the three species at different ages. They found that the number of synapses skyrocketed shortly after birth in macaques and chimpanzees but did not peak in humans until about 4 years of age. “Humans have much more time to form synaptic connections,” Khaitovich concludes.

In their analyses, the researchers factored in that humans have much longer life spans than the other species and develop and mature more slowly in general. Their findings still stood out, even when adjusting for this developmental delay.

The work builds on behavioral evidence that showed the advantages of a prolonged childhood, as well as several other studies that have found differences in chimpanzee and human genes involved with synapse formation and function. But no group has ever done such a thorough comparative, longitudinal analysis of the brain transcriptomes of these three species, says Todd Preuss, a neuroscientist at the Yerkes National Primate Research Center in Atlanta. “The whole thing is a technical tour de force,” Preuss says.

Nenad Sestan, a neurobiologist at Yale University who published a comprehensive analysis of the transcriptome of human brains from embryos to late adulthood in the 27 October 2011 issue of Nature, says the new work “is novel and provocative.” Sestan says to clarify differences between the species, the field now needs to examine more brain regions “to have a clearer idea of how specific this may be to the dorsolateral prefrontal cortex.”

The findings from Khaitovich and colleagues promise to spark future studies that address profound questions about everything from evolution to gene regulation. For example, they suggest in their report that the differences they found may also separate us from Neandertals, as evidence suggests that these extinct humans had faster cranial and dental development than modern humans.

Neurologist Eric Courchesne of the University of California, San Diego, says the new findings also mesh with his own studies of autism and brain overgrowth. Courchesne has found that the brains of autistic children grow more quickly than normal, which he theorizes prevents them from having enough experiences to properly wire neurons. “This is an absolutely fascinating study that will have great importance for advancing understanding of human disorders of early brain development as well as illuminating the evolutionary changes in neural development,” Courchesne says.

Source: ScienceNow

Feb 4, 2012
#science #neuroscience #psychology #brain
New procedure repairs severed nerves in minutes, restoring limb use in days or weeks

February 3rd, 2012 in Neuroscience 

American scientists believe a new procedure to repair severed nerves could result in patients recovering in days or weeks, rather than months or years. The team used a cellular mechanism similar to that used by many invertebrates to repair damage to nerve axons. Their results are published today in the Journal of Neuroscience Research.

"We have developed a procedure which can repair severed nerves within minutes so that the behavior they control can be partially restored within days and often largely restored within two to four weeks," said Professor George Bittner from the University of Texas. "If further developed in clinical trials this approach would be a great advance on current procedures that usually imperfectly restore lost function within months at best."

The team studied the mechanisms all animal cells use to repair damage to their membranes and focused on invertebrates, which have a superior ability to regenerate nerve axons compared to mammals. An axon is a long extension arising from a nerve cell body that communicates with other nerve cells or with muscles.

This research success arises from Bittner’s discovery that nerve axons of invertebrates which have been severed from their cell body do not degenerate within days, as happens with mammals, but can survive for months, or even years.

The severed proximal nerve axon in invertebrates can also reconnect with its surviving distal nerve axon to produce much quicker and much better restoration of behaviour than occurs in mammals.

"Severed invertebrate nerve axons can reconnect proximal and distal ends of severed nerve axons within seven days, allowing a rate of behavioural recovery that is far superior to mammals," said Bittner. "In mammals the severed distal axonal stump degenerates within three days and it can take nerve growths from proximal axonal stumps months or years to regenerate and restore use of muscles or sensory areas, often with less accuracy and with much less function being restored."

The team described their success in applying this process to rats in two research papers published today. They were able to repair severed sciatic nerves in the upper thigh; the rats could use the limb within a week and recovered much of its function within two to four weeks, in some cases to almost full function.

"We used rats as an experimental model to demonstrate how severed nerve axons can be repaired. Without our procedure, the return of nearly full function rarely comes close to happening," said Bittner. "The sciatic nerve controls all muscle movement of the leg of all mammals, and this new approach to repairing nerve axons could almost certainly be just as successful in humans."

To explore the long-term implications and medical uses of this procedure, MDs and other scientist collaborators at Harvard Medical School and Vanderbilt Medical School and Hospitals are conducting studies to obtain approval to begin clinical trials.

"We believe this procedure could produce a transformational change in the way nerve injuries are repaired," concluded Bittner.

Provided by Wiley

"New procedure repairs severed nerves in minutes, restoring limb use in days or weeks." February 3rd, 2012. http://medicalxpress.com/news/2012-02-procedure-severed-nerves-minutes-limb.html

Feb 4, 2012
#science #neuroscience #psychology
Renowned physicist invents microscope that can peer at living brain cells

February 3, 2012

Schematic drawing of the upright STED microscope used for the experiments. Image: Science, DOI:10.1126/science.1215369

(PhysOrg.com) — Ever since scientists began studying the brain, they’ve wanted to get a better look at what was going on. Researchers have poked and prodded and looked at dead cells under electron microscopes, but never before have they been able to get high resolution microscopic views of actual living brain cells as they function inside of a living animal. Now, thanks to work by physicist Stefan Hell and his colleagues at the Max Planck Institute in Germany, that dream is realized. In a paper published in Science, Hell and his team describe the workings of their marvelous discovery.

Hell (which in German means “bright”) and others at the Institute have been working for years on ultra high resolution microscopes that go by the name “stimulated emission depletion” or STED microscopes. Now, they’ve taken their work to a whole new level by cutting away a small portion of a mouse’s skull and replacing it with a glass window and then placing their latest STED microscope against the glass to peer inside. To make it easier to see what is what, the team first genetically altered the mouse to make certain brain cells fluorescent. Then, to allow for focusing exclusively on just those cells that are lit up, they added software to the microscope to blot out anything that was not lit up. The result is super high resolution real time imagery of the neurons that exist on the exterior part of a living mouse brain. 

(video)

STED time-lapse recording of a single spine at an interval of 10 seconds. The measurement includes 128 z-stacks consisting of 5 slices each. Most of the rapid remodeling of the spine head appears continuous and smooth at this frame rate. No damage is observed at the dendrite or the spine after recording a total of 640 slices. The movie was acquired in a different experiment than the spines in Fig.1. Scale bar = 1µm. Video: DOI:10.1126/science.1215369

The new microscope provides clear resolution down to 70 nanometers, four times better than anything achieved before and enough to allow scientists to see the actual movement of dendritic spines, which may help researchers understand what drives that movement.

It is likely that researchers will find many varied uses for the new microscope. One prominent area will almost certainly involve looking into what psychiatric drugs are really doing within synapses, perhaps leading to breakthroughs in pharmaceutical drugs that are better able to target specific illnesses.

One downside to any new scientific breakthrough, however, is the natural tendency of many to move from excitement to wondering what will come next. In this case, Hell and his team have already started contemplating ways to allow researchers to study any cell in the living brain at such high resolution, not just those that lie on the surface.

More information: Nanoscopy in a Living Mouse Brain, Science 3 February 2012: Vol. 335 no. 6068 p. 551. DOI: 10.1126/science.1215369

"Renowned physicist invents microscope that can peer at living brain cells." February 3rd, 2012. http://www.physorg.com/news/2012-02-renowned-physicist-microscope-peer-brain.html

Feb 4, 2012
#brain #science #neuroscience #psychology #physics
Feb 3, 2012
#placebo #placebo effect #brain
Placebo Effect: New Study Shows How to Boost the Power of Pain Relief, Without Drugs

ScienceDaily (Feb. 3, 2012) — Placebos reduce pain by creating an expectation of relief. Distraction — say, doing a puzzle — relieves it by keeping the brain busy. But do they use the same brain processes? Neuroimaging suggests they do. When applying a placebo, scientists see activity in the dorsolateral prefrontal cortex. That’s the part of the brain that controls high-level cognitive functions like working memory and attention — which is what you use to do that distracting puzzle.

Now a new study challenges the theory that the placebo effect is a high-level cognitive function. The authors — Jason T. Buhle, Bradford L. Stevens, and Jonathan J. Friedman of Columbia University and Tor D. Wager of the University of Colorado Boulder — reduced participants’ pain in two ways: either with a placebo, or with a difficult memory task. Each alone reduced pain. But when they put the two together, “the level of pain reduction that people experienced added up. There was no interference between them,” says Buhle. “That suggests they rely on separate mechanisms.” The findings, published in Psychological Science, a journal of the Association for Psychological Science, could help clinicians maximize pain relief without drugs.

In the study, 33 participants came in for three separate sessions. In the first, experimenters applied heat to the skin with a little metal plate and calibrated each individual’s pain perceptions. In the second session, some of the people applied an ordinary skin cream they were told was a powerful but safe analgesic. The others put on what they were told was a regular hand cream. In the placebo-only trials, participants stared at a cross on the screen and rated the pain of numerous applications of heat — the same level, though they were told it varied. For other trials they performed a tough memory task — distraction and placebo simultaneously. For the third session, those who’d had the plain cream got the “analgesic” and vice versa. The procedure was the same.

The results: With either the memory task or the placebo alone, participants felt less pain than during the trials when they just stared at the cross. Together, the two effects added up; they didn’t interact or interfere with each other. The data suggest that the placebo effect does not require executive attention or working memory.

So what about that neuroimaging? “Neuroimaging is great,” says Buhle, “but because each brain region does many things, when you see activation in a particular area, you don’t know what cognitive process is driving it.” This study tested the theory about how placebos work with direct behavioral observation.

The findings are promising for pain relief. Clinicians use both placebos and distraction — for instance, virtual reality in burn units. But they weren’t sure if one might diminish the other’s efficacy. “This study shows you can use them together,” says Buhle, “and get the maximum bang for your buck without medications.”

Source: ScienceDaily

Feb 3, 2012
#science #neuroscience #psychology #placebo
Schizophrenia: When Hallucinatory Voices Suppress Real Ones, New Electronic Application May Help

ScienceDaily (Feb. 3, 2012) — When a patient afflicted with schizophrenia hears inner voices something is taking place inside the brain that prevents the individual from perceiving real voices. A simple electronic application may help the patient learn to shift focus.

 

Image captures of the brain show how neurons are activated in healthy control subjects when hearing actual voices (top row) whereas activation fails to occur in patients who experience auditory hallucinations. (Credit: Kenneth Hugdahl)

"The patient experiences the inner voices as 100 per cent real, just as if someone was standing next to him and speaking," explains Professor Kenneth Hugdahl of the University of Bergen. "At the same time, he can’t hear the voices of others actually present in the same room."

Auditory hallucinations are one of the most common symptoms associated with schizophrenia.

Neural activity ceases

Dr Hugdahl’s research group has made use of a variety of neuroimaging techniques, including functional magnetic resonance imaging (fMRI), to enable them quite literally to see what happens inside the brain when the inner voices make their presence known. The project received funding under the NevroNor national initiative on neuroscientific research, administered under the auspices of the Research Council of Norway.

Images of patients’ brains reveal a spontaneous activation of neurons in a particular area of the brain — specifically the rear, upper region of the left temporal lobe. This is the area responsible for speech perception, and when healthy people hear speech it becomes activated. So what happens when patients with schizophrenia hear a real voice and a hallucinatory one at the same time?

"It would be natural to assume that neural activity would increase somewhat — even twofold. But quite the opposite takes place; we actually observed that the activity ceased altogether," states Professor Hugdahl.

Losing contact with the outside world

In order to learn more about what was happening, Hugdahl and his colleagues Kristiina Kompus and René Westerhausen carried out a meta-analysis of 23 studies. These studies focused either on spontaneous inner-voice triggered neural activation in subjects with schizophrenia or the stimulatory reaction prompted by actual sounds in both healthy and schizophrenic subjects.

It emerged that many researchers had observed either that a spontaneous activation of neurons occurs in patients hearing inner voices or that the patients’ perception of actual voices becomes suppressed when these are heard simultaneously with inner voices. No one had seen the connection between these findings.

"Previously, we thought these were two separate phenomena. But our analyses revealed that the one causes the other: when neurons become activated by inner voices it inhibits perception of outside speech. The neurons become ‘preoccupied’ and can’t ‘process’ voices from the outside," explains Professor Hugdahl.

"This may explain why schizophrenic patients close themselves off so completely and lose touch with the outside world when experiencing hallucinations," he suggests.

Electronic app designed to improve impulse control

Hugdahl and his colleagues made yet another discovery that may well help explain how the lives of these individuals become consumed by inner voices. It turns out that the frontal lobe in the brains of schizophrenia patients does not function exactly the way it should. As a result, these patients have a lesser degree of impulse control and are unable to filter out their inner voices.

"Every one of us hears inner voices or melodies from time to time. The difference between non-afflicted individuals and schizophrenia patients is that the former manage to tune these out better," the professor points out.

If patients could learn to stifle inner noise it could have a huge impact on our ability to treat schizophrenia, he states. To this end, Professor Hugdahl’s research group has developed an application that can be used on mobile phones and other simple electronic devices, to help patients improve their filters.

Wearing headphones, the patient is exposed to simple speech sounds with different sounds played in each ear. The task is to practice hearing the sound in one ear while blocking out sound in the other. The application has only been tested on two patients with schizophrenia so far. The response from these patients is promising, Dr Hugdahl relates.

"The voices are still there, but the test subjects feel that they have control over the voices instead of the other way around. The patient feels it is a breakthrough since it means he can actively shift his focus from the inner voices over to the sounds coming from the outside," the professor explains.

Source: ScienceDaily

Feb 3, 2012 · 7 notes
#science #neuroscience #psychology #brain #schizophrenia
Noise Exposure Can Cause Long-Lasting Changes To Sensory Pathways; Touch-Sensing Nerve Cells May Lead To Future Tinnitus Treatments

Article Date: 03 Feb 2012 - 0:00 PST

We all know that it can take a little while for our hearing to bounce back after listening to our iPods too loud or attending a raucous concert. But new research at the University of Michigan Health System suggests over-exposure to noise can actually cause more lasting changes to our auditory circuitry - changes that may lead to tinnitus, commonly known as ringing in the ears.

U-M researchers previously demonstrated that after hearing damage, touch-sensing “somatosensory” nerves in the face and neck can become overactive, seeming to overcompensate for the loss of auditory input in a way the brain interprets - or “hears” - as noise that isn’t really there.

The new study, which appears in The Journal of Neuroscience, found that somatosensory neurons maintain a high level of activity following exposure to loud noise, even after hearing itself returns to normal.

The findings were made in guinea pigs, but mark an important step toward potential relief for people plagued by tinnitus, says lead investigator Susan E. Shore, Ph.D., of U-M’s Kresge Hearing Research Institute and a professor of otolaryngology and molecular and integrative physiology at the U-M Medical School.

“The animals that developed tinnitus after a temporary loss in their hearing after loud noise exposure were the ones who had sustained increases in activity in these neural pathways,” Shore says. “In the future it may be possible to treat tinnitus patients by dampening the hyperactivity by reprogramming these auditory-touch circuits in the brain.”

In normal hearing, a part of the brain called the dorsal cochlear nucleus is the first stop for signals arriving from the ear via the auditory nerve. But it’s also a hub where “multitasking” neurons process other sensory signals, such as touch, together with hearing information.

During hearing loss, the other sensory signals entering the dorsal cochlear nucleus are amplified, Shore’s earlier research found. This overcompensation by the somatosensory neurons, which carry information about touch, vibration, skin temperature and pain, is believed to fuel tinnitus in many cases.

Tinnitus affects up to 50 million people in the United States and millions more worldwide, according to the American Tinnitus Association. It can range from intermittent and mildly annoying to chronic, severe and debilitating. There is no cure.

It especially affects baby boomers, who, as they reach an age at which hearing tends to diminish, increasingly find that tinnitus moves in. The condition most commonly occurs with hearing loss, but can also follow head and neck trauma, such as after an auto accident, or dental work. Tinnitus is the number one disability afflicting members of the armed forces.

The involvement of touch sensing (or “somatosensory”) nerves in the head and neck explains why many tinnitus sufferers can change the volume and pitch of the sound by clenching their jaw, or moving their head and neck, Shore explains.

While the new study builds on previous discoveries by Shore and her team, many aspects are new.

“This is the first research to show that, in the animals that developed tinnitus after hearing returned to normal, increased excitation from the somatosensory nerves in the head and neck continued. This dovetails with our previous research, which suggests this somatosensory excitation is a major component of tinnitus,” says Shore, who serves on the scientific advisory committee of the American Tinnitus Association.

“The better we understand the underlying causes of tinnitus, the better we’ll be able to develop new treatments,” she adds.

Source: Medical News Today 

Feb 3, 2012 · 30 notes
#science #neuroscience #psychology #ear #tinnitus
Investigating The Neural Basis Of Prosopagnosia

Article Date: 03 Feb 2012 - 0:00 PST

For Bradley Duchaine, there is definitely more than meets the eye where faces are concerned.

With colleagues at Birkbeck College in the University of London, he is investigating the process of facial recognition, seeking to understand the complexity of what is actually taking place in the brain when one person looks at another.

His studies target people who display an inability to recognize faces, a condition long known as prosopagnosia. Duchaine is trying to understand the neural basis of the condition while also making inferences about what is going wrong in terms of information processing: at which of the stages our brains go through to recognize a face does the system break down? A paper published in Brain details the most recent experimental results.

“We refer to prosopagnosia as a ‘selective’ deficit of face recognition, in that other cognitive processes do not seem to be affected,” explains Duchaine, an associate professor of psychological and brain sciences. “[People with the condition] might be able to recognize voices perfectly, which demonstrates that it is really a visual problem. In what we call pure cases, people can recognize cars perfectly, and they can recognize houses perfectly. It is just faces that are a problem.”

The condition may be acquired as the result of a stroke, for example. But in the recent study, Duchaine focused on developmental prosopagnosia, in which a person fails to develop facial recognition abilities.

“Other parts of the brain develop apparently normally,” Duchaine says. “These are intelligent people who have good jobs and get along fine but they can’t recognize faces.”

The primary experimental tool in this experiment was the electroencephalogram (EEG), which has the advantage of providing excellent temporal resolution - pinpointing the timing of the brain’s electrical response to a given stimulus.

Duchaine and his colleagues placed a series of electrodes around the scalps of prosopagnosics and showed them images of famous faces and non-famous faces, recording their responses. As expected, many of the famous faces were not recognized.

They found an electrical response at about 250 milliseconds (ms) after the faces were seen. Among the control group of non-prosopagnosics, a clear difference was observed between their responses to famous and non-famous faces. In half of the prosopagnosics, there was no such difference. Surprisingly, however, the other half of the prosopagnosic test subjects did show a difference.

“On the many trials where half failed to categorize a famous face as familiar, they nevertheless showed an EEG difference around 250ms after stimulus presentation between famous and non-famous faces like normal subjects do. Normal subjects also show a difference between famous and non-famous about 600ms after presentation, but the prosopagnosics did not show this difference,” Duchaine observes.

This pattern of results suggests the prosopagnosics unconsciously recognized the famous faces at an early stage (250ms) but this information was lost by the later stage (600ms). Duchaine concludes that even though they are not consciously aware that this is a famous face, some part of their brain at this stage in the process is aware and is recognizing that face, a phenomenon termed covert face recognition.
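As an illustrative-only sketch (none of this is the authors’ analysis code, and all amplitudes, trial counts, and window bounds are invented), this is roughly how an ERP difference at 250 ms is measured: average many noisy single-trial epochs time-locked to stimulus onset, then compare the mean amplitude in a window around 250 ms between conditions.

```python
import math
import random

# Illustrative sketch only: simulated EEG epochs with invented amplitudes and
# trial counts. It shows how an event-related potential (ERP) difference
# around 250 ms is typically measured: average many noisy single-trial
# epochs, then compare conditions in a time window.

random.seed(0)
N = 700  # samples per epoch at 1 kHz, i.e. 0-700 ms after the face appears

def simulate_trial(familiar):
    """One noisy epoch; familiar faces get a Gaussian bump centred at 250 ms."""
    epoch = []
    for t in range(N):
        component = 2.0 * math.exp(-((t - 250) ** 2) / (2 * 30 ** 2)) if familiar else 0.0
        epoch.append(component + random.gauss(0, 1))
    return epoch

def erp(trials):
    """Point-by-point average across trials."""
    return [sum(trial[i] for trial in trials) / len(trials) for i in range(N)]

famous = erp([simulate_trial(True) for _ in range(200)])
nonfamous = erp([simulate_trial(False) for _ in range(200)])

# Mean famous-minus-non-famous amplitude in a 230-270 ms window.
window = range(230, 270)
diff_250 = sum(famous[i] - nonfamous[i] for i in window) / len(window)
print(round(diff_250, 2))  # clearly positive when the 250 ms component is present
```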

He suggests that the other half of the prosopagnosics, who showed no difference between responses at 250ms, were experiencing a malfunction in their face processing system already at this early stage, pointing to a different type of prosopagnosia.

“The temporal lobe contains a number of face processing areas, so you can imagine there are many different ways that this system can malfunction. Not only can an area not work, connections between areas might not work, yielding probably dozens of different variants of this condition,” he surmises.

Covert recognition has been demonstrated in prosopagnosia acquired through brain damage, but Duchaine’s work is the first convincing demonstration of covert recognition in developmental prosopagnosia, the much more common form. 

Source: Medical News Today

Feb 3, 2012 · 2 notes
#science #brain #psychology #neuroscience #prosopagnosia
An Explanation For Why The Brain May Become More Reluctant To Function As We Grow Older

Article Date: 03 Feb 2012 - 0:00 PST

New findings, led by neuroscientists at the University of Bristol and published this week in the journal Neurobiology of Aging, reveal a novel mechanism through which the brain may become more reluctant to function as we grow older.

It is not fully understood why the brain’s cognitive functions, such as memory and speech, decline as we age, although work published this year suggests that cognitive decline can be detectable before 50 years of age. The research, led by Professor Andy Randall and Dr Jon Brown from the University’s School of Physiology and Pharmacology, identified a novel cellular mechanism underpinning changes to the activity of neurones which may underlie cognitive decline during normal healthy aging.

The brain largely uses electrical signals to encode and convey information. Modifications to this electrical activity are likely to underpin age-dependent changes to cognitive abilities.

The researchers examined the brain’s electrical activity by making recordings of electrical signals in single cells of the hippocampus, a structure with a crucial role in cognitive function. In this way they characterised what is known as “neuronal excitability” - this is a descriptor of how easy it is to produce brief, but very large, electrical signals called action potentials; these occur in practically all nerve cells and are absolutely essential for communication within all the circuits of the nervous system.

Action potentials are triggered near the neurone’s cell body and once produced travel rapidly through the massively branching structure of the nerve cell, along the way activating the synapses the nerve cell makes with the numerous other nerve cells to which it is connected.

The Bristol group identified that in the aged brain it is more difficult to make hippocampal neurones generate action potentials. Furthermore, they demonstrated that this relative reluctance to produce action potentials arises from changes to the activation properties of membrane proteins called sodium channels, which mediate the rapid upstroke of the action potential by allowing a flow of sodium ions into neurones.

Professor Randall, Professor in Applied Neurophysiology, said: “Much of our work is about understanding dysfunctional electrical signalling in the diseased brain, in particular Alzheimer’s disease. We began to question, however, why even the healthy brain can slow down once you reach my age. Previous investigations elsewhere have described age-related changes in processes that are triggered by action potentials, but our findings are significant because they show that generating the action potential in the first place is harder work in aged brain cells.

“Also, by identifying sodium channels as the likely culprit for this reluctance to produce action potentials, our work even points to ways in which we might be able to modify age-related changes to neuronal excitability, and by inference cognitive ability.”

Source: Medical News Today

Feb 3, 2012
#science #neuroscience #psychology #brain
Gene regulator in brain's executive hub tracked across lifespan

February 2nd, 2012 in Genetics


A representative gene shows how sex can influence levels of methylation across the lifespan. Each dot represents a different brain. Credit: Barbara Lipska, Ph.D., NIMH Clinical Brain Disorders Branch

For the first time, scientists have tracked the activity, across the lifespan, of an environmentally responsive regulatory mechanism that turns genes on and off in the brain’s executive hub. Among key findings of the study by National Institutes of Health scientists: genes implicated in schizophrenia and autism turn out to be members of a select club of genes in which regulatory activity peaks during an environmentally-sensitive critical period in development. The mechanism, called DNA methylation, abruptly switches from off to on within the human brain’s prefrontal cortex during this pivotal transition from fetal to postnatal life. As methylation increases, gene expression slows down after birth.

Epigenetic mechanisms like methylation leave chemical instructions that tell genes what proteins to make - what kind of tissue to produce or what functions to activate. Although not part of our DNA, these instructions are inherited from our parents. But they are also influenced by environmental factors, allowing for change throughout the lifespan.

“Developmental brain disorders may be traceable to altered methylation of genes early in life,” explained Barbara Lipska, Ph.D., a scientist in the NIH’s National Institute of Mental Health (NIMH) and lead author of the study. “For example, genes that code for the enzymes that carry out methylation have been implicated in schizophrenia. In the prenatal brain, these genes help to shape developing circuitry for learning, memory and other executive functions which become disturbed in the disorders. Our study reveals that methylation in a family of these genes changes dramatically during the transition from fetal to postnatal life - and that this process is influenced by methylation itself, as well as genetic variability. Regulation of these genes may be particularly sensitive to environmental influences during this critical early life period.”

Lipska and colleagues report on the ebb and flow of the human prefrontal cortex’s (PFC) epigenome across the lifespan, February 2, 2012, online in the American Journal of Human Genetics.



Two representative genes show strikingly opposite trajectories of PFC methylation across the lifespan. Each dot represents a different brain. Usually, the more methylation, the less gene expression. Credit: Barbara Lipska, Ph.D., NIMH Clinical Brain Disorders Branch

“This new study reminds us that genetic sequence is only part of the story of development. Epigenetics links nurture and nature, showing us when and where the environment can influence how the genetic sequence is read,” said NIMH director Thomas R. Insel, M.D.

In a companion study published last October, the NIMH researchers traced expression of gene products in the PFC across the lifespan. The current study instead examined methylation at 27,000 sites within PFC genes that regulate such expression. Both studies examined post-mortem brains of non-psychiatrically impaired individuals ranging in age from two weeks after conception to 80 years old.

In most cases, when chemicals called methyl groups attach to regulatory regions of genes, they silence them. Usually, the more methylation, the less gene expression. Lipska’s team found that the overall level of PFC methylation is low prenatally when gene expression is highest and then switches direction at birth, increasing as gene expression plummets in early childhood. It then levels off as we grow older. But methylation in some genes shows an opposite trajectory. The study found that methylation is strongly influenced by gender, age and genetic variation.

For example, methylation levels differed between males and females in 85 percent of X chromosome sites examined, which may help to explain sex differences in disorders like autism and schizophrenia.

Different genes - and subsets of genes - methylate at different ages. Some of the suspect genes found to peak in methylation around birth code for enzymes, called methyltransferases, that are over-expressed in people with schizophrenia and bipolar disorder. This process is influenced, in turn, by methylation in other genes, as well as by genetic variation. So genes associated with risk for such psychiatric disorders may influence gene expression through methylation in addition to inherited DNA.

Provided by National Institutes of Health

“Gene regulator in brain’s executive hub tracked across lifespan.” February 2nd, 2012. http://medicalxpress.com/news/2012-02-gene-brain-hub-tracked-lifespan.html

Feb 3, 2012
#science #neuroscience #psychology #brain #genetics
Untangling the Mysteries of Alzheimer's

ScienceDaily (Feb. 2, 2012) — One of the most distinctive signs of the development of Alzheimer’s disease is a change in the behavior of a protein that neuroscientists call tau. In normal brains, tau is present in individual units essential to neuron health. In the cells of Alzheimer’s brains, by contrast, tau proteins aggregate into twisted structures known as “neurofibrillary tangles.” These tangles are considered a hallmark of the disease, but their precise role in Alzheimer’s pathology has long been a point of contention among researchers.

Now, University of Texas Medical Branch at Galveston researchers have found new evidence that confirms the significance of tau to Alzheimer’s. Instead of focusing on tangles, however, their work highlights the intermediary steps between a single tau protein unit and a neurofibrillary tangle — assemblages of two, three, four, or more tau proteins known as “oligomers,” which they believe are the most toxic entities in Alzheimer’s.

"What we discovered is that there are smaller structures that form before the neurofibrillary tangles, and they are much more toxic than the big structures," said Rakez Kayed, UTMB assistant professor and senior author of a paper on the work now online in the FASEB Journal. “And we established that they were toxic in real human brains, which is important to developing an effective therapy.”

According to Kayed, a key antibody developed at UTMB called T22 enabled the team to produce a detailed portrait of tau oligomer behavior in human brain tissue. Specifically designed to bond only to tau oligomers (and not lone tau proteins or neurofibrillary tangles), the antibody made it possible for the researchers to use a variety of analytical tools to compare samples of Alzheimer’s brain with samples of age-matched healthy brain.

"One thing that’s remarkable about this research is that before we developed this antibody, people couldn’t even see tau oligomers in the brain," Kayed said. "With T22, we were able to thoroughly characterize them, and also study them in human brain cells."

Among the researchers’ most striking findings: in some of the Alzheimer’s brains they examined, tau oligomer levels were as much as four times as high as those found in age-matched control brains.

Other experiments revealed specific biochemical behavior and structures taken on by oligomers, and demonstrated their presence outside neurons — in particular, on the walls of blood vessels.

"We think this is going to make a big impact scientifically, because it opens up a lot of new areas to study," Kayed said. "It also relates to our main focus, developing a cure for Alzheimer’s. And I find that very, very exciting."

Provided by University of Texas Medical Branch at Galveston

Source: ScienceDaily

Feb 2, 2012 · 14 notes
#Alzheimer's #brain #science #psychology #neuroscience
Scientists Have Now Discovered How Different Brain Regions Cooperate During Short-Term Memory

Article Date: 02 Feb 2012 - 1:00 PST

Holding information within one’s memory for a short while is a seemingly simple and everyday task. We use our short-term memory when remembering a new telephone number if there is nothing at hand to write it down with, or to find inside the store the beautiful dress we were just admiring in the shop window. Yet, despite the apparent simplicity of these actions, short-term memory is a complex cognitive act that entails the participation of multiple brain regions. However, whether and how different brain regions cooperate during memory has remained elusive. A group of researchers from the Max Planck Institute for Biological Cybernetics in Tübingen, Germany, has now come closer to answering this question. They discovered that synchronized oscillations between different brain regions are crucial for visually remembering things over a short period of time.

It has long been known that brain regions in the frontal part of the brain are involved in short-term memory, while processing of visual information occurs primarily at the back of the brain. However, to successfully remember visual information over a short period of time, these distant regions need to coordinate and integrate information.

To better understand how this occurs, scientists from the Max Planck Institute of Biological Cybernetics in the department of Nikos Logothetis recorded electrical activity both in a visual area and in the frontal part of the brain in monkeys. The scientists showed the animals identical or different images within short intervals while recording their brain activity. The animals then had to indicate whether the second image was the same as the first one.

The scientists observed that, in each of the two brain regions, brain activity showed strong oscillations in a certain set of frequencies called the theta-band. Importantly, these oscillations did not occur independently of each other, but synchronized their activity temporarily: “It is as if you have two revolving doors in each of the two areas. During working memory, they get in sync, thereby allowing information to pass through them much more efficiently than if they were out of sync,” explains Stefanie Liebe, the first author of the study, conducted in the team of Gregor Rainer in cooperation with Gregor Hörzer from the Technical University Graz. The more synchronized the activity was, the better the animals could remember the initial image. Thus, the authors were able to establish a direct relationship between what they observed in the brain and the performance of the animal.

The study highlights how synchronized brain oscillations are important for the communication and interaction of different brain regions. Almost all multi-faceted cognitive acts, such as visual recognition, arise from a complex interplay of specialized and distributed neural networks. How relationships between such distributed sites are established and how they contribute to represent and communicate information about external and internal events in order to attain a coherent percept or memory is still poorly understood.
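One common way to quantify this kind of cross-regional synchronization is the phase-locking value (PLV). The sketch below is a hypothetical illustration, not the study’s actual method: two theta-band phase time series are simulated, and the PLV approaches 1 when their phase difference is stable and falls toward 0 when the phases are unrelated.

```python
import cmath
import math
import random

# Hypothetical illustration, not the study's analysis code: the phase-locking
# value (PLV) is one standard measure of synchronization between two signals.
# When the phase difference between two ~5 Hz (theta-band) series is stable,
# PLV is near 1; when the phases are unrelated, PLV is near 0.

random.seed(1)

def plv(phases_a, phases_b):
    """Magnitude of the mean unit vector of the phase differences."""
    vectors = [cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b)]
    return abs(sum(vectors) / len(vectors))

n = 1000  # samples (1 s at 1 kHz)
theta = [2 * math.pi * 5 * (t / 1000) for t in range(n)]  # 5 Hz phase ramp

# "In sync": region B follows region A at a fixed lag, with small jitter.
region_a = [p + random.gauss(0, 0.1) for p in theta]
region_b = [p + 0.5 + random.gauss(0, 0.1) for p in theta]

# "Out of sync": region B's phase is unrelated to region A's.
region_b_random = [random.uniform(0, 2 * math.pi) for _ in range(n)]

print(round(plv(region_a, region_b), 2))         # near 1
print(round(plv(region_a, region_b_random), 2))  # near 0
```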

Source: Medical News Today

Feb 2, 2012 · 30 notes
#brain #neuroscience #science #memory #psychology
Brain capacity limits exponential online data growth

February 1st, 2012 in Physics / General Physics 

Scientists have found that the capacity of the human brain to process and record information - and not economic constraints - may constitute the dominant limiting factor for the overall growth of globally stored information. These findings have just been published in an article in EPJ B by Claudius Gros and colleagues from the Institute for Theoretical Physics at Goethe University Frankfurt in Germany.

The authors first looked at the distribution of 633 public internet files by plotting the number of video, audio, and image files against file size. They gathered files which were produced by humans or intended for human use with the spider file search engine Findfiles.net, focusing on files hosted on domains linked from the online encyclopaedia Wikipedia and the open web directory dmoz.

If the economic cost of data production were proportional to the amount of data produced, cost alone would drive the generation of information exponentially. However, the authors found that economic costs are not, in fact, the limiting factor for data production: the observed file-count distributions lack the exponential tails that cost-driven growth would produce.
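The tail argument can be sketched with synthetic data (illustrative only: the distributions, parameters, and threshold below are assumptions for demonstration, not the authors' actual dataset). An exponential distribution, the thin-tailed shape that purely cost-proportional production would suggest, loses essentially all its mass at large sizes, while a heavy-tailed distribution such as a log-normal retains far more:

```python
import random

random.seed(0)
N = 100_000

# Thin-tailed exponential sizes: what cost-proportional growth would predict.
exp_sizes = [random.expovariate(1.0) for _ in range(N)]  # mean 1

# Heavy-tailed log-normal sizes: illustrative of real file-size distributions.
ln_sizes = [random.lognormvariate(0.0, 1.5) for _ in range(N)]

def tail_fraction(xs, threshold):
    """Empirical survival function: fraction of samples above the threshold."""
    return sum(x > threshold for x in xs) / len(xs)

# Far out in the tail, the exponential has essentially vanished while the
# log-normal keeps a visible fraction of its mass.
print(tail_fraction(exp_sizes, 10.0))  # near zero
print(tail_fraction(ln_sizes, 10.0))   # orders of magnitude larger
```

The qualitative point is that the shape of the tail distinguishes the two hypotheses: absence of an exponential tail in the observed file counts is what rules out cost as the dominant constraint.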

They found that underlying neurophysiological processes constrain the brain's ability to handle information. For example, when people produce an image and attribute a subjective value to it, for example a given resolution, they are influenced by their perception of the quality of that image. The perceived gain in information from increasing the resolution of a low-quality image is substantially higher than from increasing the resolution of a high-quality photo by the same amount. This relation is known as the Weber-Fechner law.

The authors observed that file-size distributions obey this Weber-Fechner law. This means that the total amount of information cannot grow faster than our ability to digest or handle it.

More information: Gros C., Kaczor G., Markovic D. (2012) Neuropsychological constraints to human data production on a global scale, European Physical Journal B (EPJ B) 85: 28, DOI: 10.1140/epjb/e2011-20581-3

Provided by Springer

"Brain capacity limits exponential online data growth." February 1st, 2012.http://www.physorg.com/news/2012-02-brain-capacity-limits-exponential-online.html

Feb 2, 2012 · 1 note
#science #neuroscience #brain #physics #psychology