Neuroscience

Articles and news from the latest research reports.

Posts tagged neuroscience


New approach makes cancer cells explode
Researchers at Karolinska Institutet have discovered that a substance called Vacquinol-1 makes cells from glioblastoma, the most aggressive type of brain tumour, literally explode. When mice were given the substance, which can be given in tablet form, tumour growth was reversed and survival was prolonged. The findings are published in the journal Cell.
The established treatments available for glioblastoma include surgery, radiation and chemotherapy. But even when this treatment is given, the average survival is just 15 months, making it critical to find better treatments for malignant brain tumours.
Researchers at Karolinska Institutet and colleagues at Uppsala University have discovered an entirely new mechanism for killing tumour cells in glioblastoma. In an initial screening stage, the researchers exposed tumour cells to a wide range of molecules; any molecule that killed the cancer cells was considered of interest for further study, which initially applied to over 200 molecules. Following extensive follow-up studies, a single molecule was identified as being of particular interest, and the researchers set out to determine why it caused cancer cell death.
It was found that the molecule triggered uncontrolled vacuolization in the cancer cells, a process in which the cell carries substances from outside the cell into its interior. This transport takes place via vacuoles, which can roughly be described as blisters or bags formed from cell membrane. The process resembles the one behind last year’s Nobel Prize in Physiology or Medicine, which recognized the discovery of how cellular vesicles move substances from the interior of the cell to its surface.
Cell membranes collapsed
When the cancer cells filled with large numbers of vacuoles, the cell membrane, the outer wall of the cell, collapsed and the cell simply exploded and died by necrosis.
“This is an entirely new mechanism for cancer treatment. A possible medicine based on this principle would therefore attack the glioblastoma in an entirely new way. This principle may also work for other cancer diseases; we have not really explored this yet,” says Patrik Ernfors, professor of tissue biology at the Department of Medical Biochemistry and Biophysics at Karolinska Institutet.
Mice transplanted with human glioblastoma cells were given the substance orally for five days. In the control group, which did not receive the substance, average survival was about 30 days; of the mice that received the substance, six of eight were still alive after 80 days. The findings were considered of such interest that the scientific journal chose to publish the article immediately.
“We now want to try to take this discovery in basic research through preclinical development and all the way to the clinic. The goal is to get into a phase 1 trial,” says Patrik Ernfors.


Filed under cancer cells glioblastoma brain tumour vacquinol-1 cancer neuroscience science


Bioimaging: Visualizing real-time development of capillary networks in adult brains
Advances in microscopic imaging techniques have made it possible to visualize real-time cellular events in living organs. The brain capillary network forms a unique structure, the blood-brain barrier (BBB): an interface of vascular endothelial cells that controls the traffic of substances from the bloodstream into the brain. Damage to and disruption of the BBB are implicated in the pathogenesis and progression of neurological disorders such as Alzheimer’s disease and epilepsy. However, the cellular interactions within the BBB are extremely difficult to study in vivo, so understanding of these mechanisms in living brains is limited.
Now, Kazuto Masamoto and co-workers at the University of Electro-Communications in Tokyo, the National Institute of Radiological Sciences, and Keio University School of Medicine have used 4D live-imaging technology to study the effects of hypoxia (oxygen deprivation) on BBB plasticity in live adult mice.
The team focused on how plastic changes in the BBB counteract hypoxia, looking in particular at the endothelial cells and their communication with neighboring astrocytes: interactions that control BBB traffic to meet neural demands. Using genetically modified mice whose endothelial cells express green fluorescent protein, Masamoto and colleagues imaged real-time changes in the BBB before and during a three-week period of hypoxia in the adult mouse cortex.
Their results showed that the capillaries of the BBB, which prior to hypoxia showed no signs of activity, began to sprout new blood vessels, which in places joined to form new networks. Neighboring astrocytes reacted quickly, wrapping the outside of the new vessels, activity that the researchers believe helps stabilize BBB traffic and integrity.
Further investigation into the molecular mechanisms that control BBB plasticity is expected to lead to advances in the treatment of neurodegenerative disorders and cerebral ischemia, and may provide an effective way to prevent BBB dysfunction in diabetes, hypertension, and aging.


Filed under blood-brain barrier astrocytes endothelial cells neurodegenerative diseases neuroscience science


Genetic factor contributes to forgetfulness

University of Bonn psychologists prove genetic variation is underlying factor in higher incidence of forgetfulness

Misplaced your keys? Can’t remember someone’s name? Didn’t notice the stop sign? Those who frequently experience such cognitive lapses may now have an explanation. Psychologists from the University of Bonn have found a connection between such everyday lapses and the DRD2 gene: carriers of a certain variant of this gene are more easily distracted and experience a significantly higher incidence of lapses due to a lack of attention. The team reports its results in the May issue of Neuroscience Letters, which is already available online.


Most of us are familiar with such everyday lapses: you can’t find your keys, again! Or you walk into another room and forget what you actually went there for. Or you are on the phone with someone and cannot remember their name. “Such short-term memory lapses are very common, but some people experience them particularly often,” said Prof. Dr. Martin Reuter from the department of Differential and Biological Psychology at the University of Bonn. Mistakes arising from such short-term lapses can become a hazard, for example when a person overlooks a stop sign at an intersection. In the workplace, too, a lack of attention can become a problem, for example when it results in forgetting to save essential data.

A gene “directing” your brain

“A familial clustering of such lapses suggests that they are subject to genetic effects,” explained Dr. Sebastian Markett, the principal author and a member of Prof. Reuter’s team. In earlier lab experiments, the group had already found indications that the so-called dopamine D2 receptor gene (DRD2) plays a part in forgetfulness. DRD2 has an essential function in signal transmission within the frontal lobes. “This structure can be compared to a conductor coordinating the brain like an orchestra,” Dr. Markett added. In this simile, the DRD2 gene corresponds to the baton, because it plays a part in dopamine transmission in the brain. If the baton skips a beat, the orchestra gets confused.

The psychologists from the University of Bonn tested a total of 500 women and men, taking a saliva sample from each and analyzing it with molecular-biology methods. All humans carry the DRD2 gene, which comes in two variants distinguished by just one letter of the genetic code: one variant has a C (cytosine) at a particular locus, which is replaced by a T (thymine) in the other. According to the team’s analyses, about a quarter of the subjects carried only the cytosine variant, while three quarters had a genotype with at least one thymine base.

The scientists then wanted to find out whether this difference in the genetic code also affects everyday behavior. In a self-assessment survey, they asked the subjects to state how frequently they experience such lapses: how often they forget names or misplace their keys. The survey also included questions on impulsivity-related factors, such as how easily a subject was distracted from the task at hand and how long they were able to maintain concentration.

Lapses can clearly be tied to the gene variant

The scientists used statistical methods to check whether the forgetfulness symptoms elicited by the surveys could be associated with one of the DRD2 gene variants. The results showed that functions such as attention and memory are less pronounced in persons who carry the thymine variant of the gene than in those of the cytosine type. “The connection is obvious: such lapses can partially be attributed to this gene variant,” reported Dr. Markett. By their own account, subjects with the thymine DRD2 variant more frequently “fall victim” to forgetfulness or attention deficits, while the cytosine type seems to be protected. “This result matches the results of other studies very well,” added Dr. Markett.
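The kind of group comparison described here can be sketched in a few lines. The scores, group sizes, and the permutation-test approach below are invented for illustration; the study’s actual questionnaire data and statistical methods are not reproduced.

```python
# Toy sketch: do self-reported lapse scores differ between DRD2
# genotype groups? All numbers below are invented for illustration.
import random
import statistics

random.seed(0)

# Invented questionnaire scores (higher = more everyday lapses),
# with roughly a quarter of 500 subjects in the C/C group.
cc_scores = [random.gauss(48, 10) for _ in range(125)]   # C/C carriers
t_scores  = [random.gauss(53, 10) for _ in range(375)]   # at least one T

observed = statistics.mean(t_scores) - statistics.mean(cc_scores)

# Permutation test: shuffle group labels and count how often a
# difference at least this large arises by chance alone.
pooled = cc_scores + t_scores
n_cc = len(cc_scores)
count = 0
trials = 2000
for _ in range(trials):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[n_cc:]) - statistics.mean(pooled[:n_cc])
    if abs(diff) >= abs(observed):
        count += 1

p_value = count / trials
print(f"observed difference: {observed:.2f}, p ≈ {p_value:.4f}")
```

With group means this far apart, the shuffled labels almost never reproduce the observed difference, which is the sense in which a lapse score "can be tied to" a genotype.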

Carriers of the gene variant linked to forgetfulness might now take solace in the fact that they are not responsible for their genes and that this is simply their fate… but Dr. Markett does not agree. “There are things you can do to compensate for forgetfulness: writing yourself notes or making more of an effort to put your keys down in a specific location, and not just anywhere.” Those who develop such strategies for the different areas of their lives are better able to handle their deficit.

(Source: www3.uni-bonn.de)

Filed under forgetfulness DRD2 dopamine memory frontal lobe neuroscience science


Computers See Through Faked Expressions of Pain Better Than People
A joint study by researchers at the University of California, San Diego and the University of Toronto has found that a computer system can distinguish real from faked expressions of pain more accurately than people can.
The work, titled “Automatic Decoding of Deceptive Pain Expressions,” is published in the latest issue of Current Biology.
“The computer system managed to detect distinctive dynamic features of facial expressions that people missed,” said Marian Bartlett, research professor at UC San Diego’s Institute for Neural Computation and lead author of the study. “Human observers just aren’t very good at telling real from faked expressions of pain.”
Senior author Kang Lee, professor at the Dr. Eric Jackman Institute of Child Study at the University of Toronto, said “humans can simulate facial expressions and fake emotions well enough to deceive most observers. The computer’s pattern-recognition abilities prove better at telling whether pain is real or faked.”
The research team found that humans could not discriminate real from faked expressions of pain better than chance; even after training, their accuracy improved only to a modest 55 percent. The computer system attained 85 percent accuracy.
“In highly social species such as humans,” said Lee, “faces have evolved to convey rich information, including expressions of emotion and pain. And, because of the way our brains are built, people can simulate emotions they’re not actually experiencing – so successfully that they fool other people. The computer is much better at spotting the subtle differences between involuntary and voluntary facial movements.”
“By revealing the dynamics of facial action through machine vision systems,” said Bartlett, “our approach has the potential to elucidate ‘behavioral fingerprints’ of the neural-control systems involved in emotional signaling.”
The single most predictive feature of falsified expressions, the study shows, is the mouth: how and when it opens. Fakers’ mouths open with less variation and too much regularity.
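The over-regularity cue can be illustrated with a toy measure of timing variability. The event times and the coefficient-of-variation feature below are invented for illustration; the study’s actual features came from automated facial-action coding, not this simplified measure.

```python
# Toy illustration of the "over-regularity" cue: given timestamps of
# mouth-opening events, genuine expressions tend to show more variable
# intervals than deliberately produced ones. Data are invented.
import statistics

def interval_regularity(open_times):
    """Coefficient of variation of intervals between mouth openings.
    Lower values = more regular (machine-like) timing."""
    intervals = [b - a for a, b in zip(open_times, open_times[1:])]
    return statistics.stdev(intervals) / statistics.mean(intervals)

# Invented event times (seconds) for one genuine and one faked clip
genuine = [0.0, 0.7, 2.1, 2.6, 4.8, 5.2]
faked   = [0.0, 1.0, 2.1, 3.0, 4.1, 5.0]

print(f"genuine CV: {interval_regularity(genuine):.2f}")
print(f"faked CV:   {interval_regularity(faked):.2f}")
```

The faked clip’s near-constant intervals yield a much lower coefficient of variation, the kind of subtle dynamic signature a machine-vision system can quantify but human observers tend to miss.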
“Further investigations,” said the researchers, “will explore whether over-regularity is a general feature of fake expressions.”
In addition to detecting pain malingering, the computer-vision system might be used to detect other real-world deceptive actions in the realms of homeland security, psychopathology, job screening, medicine, and law, said Bartlett.
“As with causes of pain, these scenarios also generate strong emotions, along with attempts to minimize, mask, and fake such emotions, which may involve ‘dual control’ of the face,” she said. “In addition, our computer-vision system can be applied to detect states in which the human face may provide important clues as to health, physiology, emotion, or thought, such as drivers’ expressions of sleepiness, students’ expressions of attention and comprehension of lectures, or responses to treatment of affective disorders.”


Filed under pain emotion facial expressions computer-vision system psychology neuroscience science


What singing fruit flies can tell us about quick decisions

You wouldn’t hear the mating song of the male fruit fly as you reached for the infested bananas in your kitchen. Yet, the neural activity behind the insect’s amorous call could help scientists understand how you made the quick decision to pull your hand back from the tiny swarm.


Male fruit flies base the pitch and tempo of their mating song on the movement and behavior of their desired female, Princeton University researchers have discovered. In the animal kingdom, lusty warblers such as birds typically have a mating song with a stereotyped pattern. A fruit fly’s song, however, is an unordered series of loud purrs and soft drones made by wing vibrations, the researchers reported in the journal Nature. A male adjusts his song in reaction to his specific environment, which in this case is the distance and speed of a female — the faster and farther away she’s moving, the louder he “sings.”

While the actors are small, the implications of these findings could be substantial for understanding rapid decision-making, explained corresponding author Mala Murthy, a Princeton assistant professor of molecular biology and the Princeton Neuroscience Institute. Fruit flies are a common model for studying the systems of more advanced beings such as humans, and have the basic components of more complex nervous systems, she said.

The researchers have provided a possible tool for studying the neural pathways behind how an organism engaged in a task adjusts its behavior to sudden changes, be it a leopard chasing a zigzagging gazelle, or a commuter navigating stop-and-go traffic, Murthy said. She and her co-authors created a model that could predict a fly’s choice of song in response to its changing environment, and identified the neural pathways involved in these decisions.
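A caricature of such a predictive model can be written as a simple rule. The thresholds and the two-mode rule below are invented for illustration; the study fit a quantitative model to recorded courtship data rather than using a hand-set cutoff.

```python
# Toy caricature of a song-choice model: predict the male's song mode
# from the female's distance and speed. Fruit fly song alternates
# between louder "pulse" trains and softer "sine" hums; thresholds
# here are invented.
def predict_song(distance_mm: float, speed_mm_s: float) -> str:
    """Return 'pulse' (louder) when the female is far or fast,
    'sine' (softer) when she is near and slow."""
    if distance_mm > 4.0 or speed_mm_s > 2.0:
        return "pulse"
    return "sine"

print(predict_song(distance_mm=8.0, speed_mm_s=3.5))  # far and fast
print(predict_song(distance_mm=2.0, speed_mm_s=0.5))  # near and slow
```

The point of the real model is the same as this sketch’s: the male’s motor output is a moment-to-moment function of his sensory input, not a fixed stereotyped pattern.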

"Here we have natural courtship behavior and we have this discovery that males are using information about their sensory environment in real time to shape their song. That makes the fly system a unique model to study decision-making in a natural context," Murthy said.

"You can imagine that if a fly can integrate visual information quickly to modulate his song, the way in which it does that is probably a very basic equivalent of how a more complicated animal solves a similar problem," she said. "To figure out at the level of individual neurons how flies perform sensory-motor integration will give us insight into how a mammalian brain does it and, ultimately, maybe how a human brain does it."


Filed under fruit flies decision making mating song neural circuitry neuroscience science


Sniff study suggests humans can distinguish more than 1 trillion scents
The human sense of smell does not get the respect it deserves, new research suggests. In an experiment led by Andreas Keller, of Rockefeller’s Laboratory of Neurogenetics and Behavior, researchers tested volunteers’ ability to distinguish between complex mixtures of scents. Based on the sensitivity of these people’s noses and brains, the team calculated the human sense of smell can detect more than 1 trillion odor mixtures, far more discrete stimuli than previous smell studies have estimated.
The existing generally accepted number is just 10,000, says Leslie Vosshall, Robert Chemers Neustein Professor and head of the laboratory. “Everyone in the field had the general sense that this number was ludicrously small, but Andreas was the first to put the number to a real scientific test,” Vosshall says.
In fact, even 1 trillion may be understating it, says Keller. “The message here is that we have more sensitivity in our sense of smell than for which we give ourselves credit. We just don’t pay attention to it and don’t use it in everyday life,” he says.
The quality of an odor has multiple dimensions because the odors we encounter in real life are complex mixtures of molecules. The characteristic scent of rose, for instance, has 275 components, though only a small percentage of them dominate the perceived smell. That makes odor much harder to study than vision or hearing, where the stimulus varies along a single physical dimension (wavelength or frequency). For comparison, researchers estimate that we can distinguish between 2.3 and 7.5 million colors and about 340,000 audible tones.
To overcome this complexity, Keller combined odors and asked volunteers whether they could distinguish between mixtures with some components in common. “Our trick is we use mixtures of odor molecules, and we use the percentage of overlap between two mixtures to measure the sensitivity of a person’s sense of smell,” Keller says. To create his mixtures, Keller drew upon 128 odor molecules responsible for scents such as orange, anise and spearmint. He mixed these in combinations of 10, 20 and 30 with different proportions of components in common. The volunteers received three vials, two of which contained identical mixes, and they were asked to pick out the odd one.
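The mixture-construction procedure described here is easy to sketch: draw N components from the 128-molecule panel, then build a second mixture that shares a controlled number of them. The molecule names below are placeholders, not the study’s actual panel.

```python
# Sketch of building a pair of test mixtures with a controlled overlap,
# following the description above. Molecule names are placeholders.
import random

PANEL = [f"molecule_{i:03d}" for i in range(128)]

def make_mixture_pair(n_components, n_shared, rng):
    """Return two mixtures of n_components molecules sharing n_shared."""
    first = rng.sample(PANEL, n_components)
    shared = rng.sample(first, n_shared)
    remaining = [m for m in PANEL if m not in first]
    second = shared + rng.sample(remaining, n_components - n_shared)
    return set(first), set(second)

rng = random.Random(42)
a, b = make_mixture_pair(30, 15, rng)  # 30 components, 50% overlap
print(len(a), len(b), len(a & b))      # prints: 30 30 15
```

Varying `n_shared` sweeps the percentage of overlap, which is the knob the researchers turned to measure how similar two mixtures can be before people stop telling them apart.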
This approach was inspired by previous work at the Weizmann Institute in Israel, in which researchers combined odors at similar intensities to create neutral-smelling “olfactory white.” In that experiment, as in Keller’s study, the researchers were interested in the perception of odor qualities, such as fishy, floral or musky, not their intensity. But since intensity can interfere with the perceived qualities, both studies had to control for it.
The results, published this week in Science, show that while individual volunteers’ performance varied greatly, on average they could tell the difference between mixtures containing as much as 51 percent of the same components. Once the mixes shared more than half of their components, fewer volunteers could tell the difference between them. This was true for mixes of 10, 20 and 30 odors.
By analyzing the data, the researchers could calculate the total number of distinguishable mixtures.
“It turns out that the resolution of the olfactory system is not extraordinary – you need to change a fair fraction of the components before the change can be reliably detected by more than 50 percent of the subjects,” says collaborator Marcelo O. Magnasco, head of the Laboratory of Mathematical Physics at Rockefeller. “However, because the number of combinations is quite literally astronomical, even after accounting for this limitation the total number of distinguishable odor combinations is quite large.” The 1 trillion estimate is almost certainly too low, the researchers say, because there are many, many more odor molecules in the real world that can be mixed in many more ways.
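The "astronomical" number of combinations can be checked directly: it is the count of ways to choose a mixture from the 128-molecule panel. This counts possible stimuli, not distinguishable ones; the 1-trillion figure came from a further discriminability analysis of the volunteers’ data.

```python
# Count the possible mixtures of 10, 20 and 30 components drawn from
# a panel of 128 odor molecules, as used in the study's design.
from math import comb

for k in (10, 20, 30):
    print(f"C(128, {k}) = {comb(128, k):.3e}")
```

Even the smallest of these counts dwarfs a trillion, which is why a coarse discrimination threshold still leaves an enormous number of distinguishable odors.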
Keller theorizes that our ancestors had much more use and appreciation for the sense of smell than we do. Humans’ upright posture lifted our noses far from the ground, where most smells originate, and more recently, conveniences such as refrigerators and daily showers have effectively limited odors in the modern world. “This could explain our attitude that smell is unimportant, compared to hearing and vision,” he says.
Nevertheless, the sense of smell remains closely linked to human behavior, and studying it can tell us a lot about how our brains process complex information. The results of this study are a step toward an elusive quantitative science of odor perception that can help drive further research, Keller says.

Sniff study suggests humans can distinguish more than 1 trillion scents

The human sense of smell does not get the respect it deserves, new research suggests. In an experiment led by Andreas Keller, of Rockefeller’s Laboratory of Neurogenetics and Behavior, researchers tested volunteers’ ability to distinguish between complex mixtures of scents. Based on the sensitivity of these people’s noses and brains, the team calculated the human sense of smell can detect more than 1 trillion odor mixtures, far more discrete stimuli than previous smell studies have estimated.

The existing generally accepted number is just 10,000, says Leslie Vosshall, Robert Chemers Neustein Professor and head of the laboratory. “Everyone in the field had the general sense that this number was ludicrously small, but Andreas was the first to put the number to a real scientific test,” Vosshall says.

In fact, even 1 trillion may be understating it, says Keller. “The message here is that we have more sensitivity in our sense of smell than for which we give ourselves credit. We just don’t pay attention to it and don’t use it in everyday life,” he says.

The quality of an odor has multiple dimensions, because the odors we encounter in real life are composed of complex mixes of molecules. For instance, the characteristic scent of rose has 275 components, but only a small percentage of those dominate the perceived smell. That makes odor much more difficult to study than vision and hearing, which require us to detect variations in a single dimension. For comparison, researchers estimate the number of colors we can distinguish at between 2.3 and 7.5 million and audible tones at about 340,000.

To overcome this complexity, Keller combined odors and asked volunteers whether they could distinguish between mixtures with some components in common. “Our trick is we use mixtures of odor molecules, and we use the percentage of overlap between two mixtures to measure the sensitivity of a person’s sense of smell,” Keller says. To create his mixtures, Keller drew upon 128 odor molecules responsible for scents such as orange, anise and spearmint. He mixed these in combinations of 10, 20 and 30 with different proportions of components in common. The volunteers received three vials, two of which contained identical mixes, and they were asked to pick out the odd one.

This approach was inspired by previous work at the Weizmann Institute in Israel, in which researchers combined odors at similar intensities to create neutral smelling “olfactory white.” In that experiment and in Keller’s study, the researchers were interested in the perception of odor qualities, such as fishy, floral or musky — not their intensity. But since intensity can interfere with the perceived qualities, both had to account for it.

The results, published this week in Science, show that while individual volunteers’ performance varied greatly, on average they could tell the difference between mixtures containing as much as 51 percent of the same components. Once the mixes shared more than half of their components, fewer volunteers could tell the difference between them. This was true for mixes of 10, 20 and 30 odors.

By analyzing the data, the researchers could calculate the total number of distinguishable mixtures.

“It turns out that the resolution of the olfactory system is not extraordinary – you need to change a fair fraction of the components before the change can be reliably detected by more than 50 percent of the subjects,” says collaborator Marcelo O. Magnasco, head of the Laboratory of Mathematical Physics at Rockefeller. “However, because the number of combinations is quite literally astronomical, even after accounting for this limitation the total number of distinguishable odor combinations is quite large.” The 1 trillion estimate is almost certainly too low, the researchers say, because there are many, many more odor molecules in the real world that can be mixed in many more ways.
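The "astronomical" size of the mixture space is easy to verify with basic combinatorics. This is a back-of-the-envelope check, not the paper's actual estimator, which additionally factors in the measured ~50 percent discrimination threshold:

```python
from math import comb

# Number of distinct mixtures of each size drawn from 128 odorants,
# ignoring component proportions entirely.
for size in (10, 20, 30):
    print(f"{size}-component mixtures: {comb(128, size):.3e}")
```

Even before dividing this space into discriminable regions, the raw counts dwarf the few million distinguishable colors, which is why the final estimate of at least a trillion distinguishable odor stimuli is plausible.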

Keller theorizes that our ancestors had much more use and appreciation for our sense of smell than we do. Humans’ upright posture lifted our noses far from the ground where most smells originate, and more recently, conveniences such as refrigerators and daily showers have effectively limited odors in the modern world. “This could explain our attitude that smell is unimportant, compared to hearing and vision,” he says.

Nevertheless, the sense of smell remains closely linked to human behavior, and studying it can tell us a lot about how our brains process complex information. The results of this study are a step toward an elusive quantitative science of odor perception that can help drive further research, Keller says.

Filed under olfaction smell odor perception olfactory system neuroscience science

205 notes

The Aging Brain Needs REST

Why do neurodegenerative diseases such as Alzheimer’s affect only the elderly? Why do some people live to be over 100 with intact cognitive function while others develop dementia decades earlier?

Image: A new study shows that a gene regulator called REST, dormant in the brains of young people (left), switches on in normal aging brains (center) to protect against various stresses, including abnormal proteins associated with neurodegenerative diseases. REST is lost in critical brain regions of people with Alzheimer’s (right). Credit: Yankner Lab

More than a century of research into the causes of dementia has focused on the clumps and tangles of abnormal proteins that appear in the brains of people with neurodegenerative diseases. However, scientists know that at least one piece of the puzzle has been missing because some people with these abnormal protein clumps show few or no signs of cognitive decline.

A new study offers an explanation for these longstanding mysteries. Researchers have discovered that a gene regulator active during fetal brain development, called REST, switches back on later in life to protect aging neurons from various stresses, including the toxic effects of abnormal proteins. The researchers also showed that REST is lost in critical brain regions of people with Alzheimer’s and mild cognitive impairment.

(Source: hms.harvard.edu)


Filed under dementia neurodegenerative diseases REST genetics neuroscience science

61 notes

Rats’ brains may “remember” odor experienced while under general anesthesia
Rats’ brains may remember odors they were exposed to while deeply anesthetized, suggests research published in the April issue of Anesthesiology.
Previous research has led to the belief that sensory information is received by the brain under general anesthesia but not perceived by it. These new findings suggest the brain not only receives sensory information while anesthetized, but also registers it at the cellular level, even though the animal shows no behavioral sign of that information after recovering from anesthesia.
In the study, rats were exposed to a specific odor while under general anesthesia. Examination of the brain tissue after they had recovered from anesthesia revealed evidence of cellular imprinting, even though the rats behaved as if they had never encountered the odor before.
“It raises the question of whether our brains are being imprinted during anesthesia in ways we don’t recognize because we simply don’t remember,” said Yan Xu, Ph.D., lead author and vice chairman for basic sciences in the Department of Anesthesiology at the University of Pittsburgh School of Medicine. “The fact that an anesthetized brain can receive sensory information – and distinguish whether that information is novel or familiar during and after anesthesia, even if one does not remember receiving it – suggests a need to re-evaluate how the depth of anesthesia should be measured clinically.”
Researchers randomly assigned 107 rats to 12 different anesthesia and odor exposure paradigms: some were exposed to the same odor during and after anesthesia, some to air before and an odor after, some to familiar odors, others to novel odors, and still others were not exposed to odors at all. After the rats had recovered from the anesthesia, researchers observed how they searched for hidden odors or interacted with scented beads to gauge their memory of the smell. Researchers then analyzed the rats’ brains at the cellular level. While the rats behaved as if they had no memory of being exposed to the odor under anesthesia, cellular changes in their brain tissue suggested the rats “remembered” the exposure and no longer registered the odor as novel.
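Dividing 107 animals among 12 groups can be sketched as follows; the round-robin scheme and group sizes here are illustrative, not the study's actual randomization procedure:

```python
import random

def assign_groups(n_subjects, n_groups, seed=0):
    """Randomly assign subjects to groups as evenly as possible
    (107 subjects cannot split evenly into 12 groups)."""
    rng = random.Random(seed)
    ids = list(range(n_subjects))
    rng.shuffle(ids)
    # Deal subjects round-robin so group sizes differ by at most one.
    return [ids[i::n_groups] for i in range(n_groups)]

groups = assign_groups(107, 12)
print([len(g) for g in groups])  # [9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 8]
```

Shuffling before dealing ensures each paradigm receives a random sample of animals, which is the point of random assignment in a design like this.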
“This study reveals important new information about how anesthesia affects our brains,” said Dr. Xu. “The results highlight a need for additional research into the effects of general anesthesia on learning and memory.”


Filed under odors olfaction anesthesia memory learning neuroscience science

496 notes

Researchers Show How Lost Sleep Leads to Lost Neurons
Most people appreciate that not getting enough sleep impairs cognitive performance. For the chronically sleep-deprived such as shift workers, students, or truckers, a common strategy is simply to catch up on missed slumber on the weekends. According to common wisdom, catch-up sleep repays one’s “sleep debt,” with no lasting effects. But a new Penn Medicine study shows disturbing evidence that chronic sleep loss may be more serious than previously thought and may even lead to irreversible physical damage to, and loss of, brain cells. The research is published today in The Journal of Neuroscience.
Using a mouse model of chronic sleep loss, Sigrid Veasey, MD, associate professor of Medicine and a member of the Center for Sleep and Circadian Neurobiology at the Perelman School of Medicine, and collaborators from Peking University have determined that extended wakefulness is linked to injury to, and loss of, locus coeruleus (LC) neurons, which are essential for alertness and optimal cognition.
"In general, we’ve always assumed full recovery of cognition following short- and long-term sleep loss," Veasey says. "But some of the research in humans has shown that attention span and several other aspects of cognition may not normalize even with three days of recovery sleep, raising the question of lasting injury in the brain. We wanted to figure out exactly whether chronic sleep loss injures neurons, whether the injury is reversible, and which neurons are involved."
Mice were examined following periods of normal rest, short wakefulness, or extended wakefulness modeling a shift worker’s typical sleep pattern. The Veasey lab found that in response to short-term sleep loss, LC neurons upregulate the sirtuin type 3 (SirT3) protein, which is important for mitochondrial energy production and redox responses and protects the neurons from metabolic injury. SirT3 is essential during short-term sleep loss to maintain metabolic homeostasis, but in extended wakefulness the SirT3 response is missing. After several days of shift-worker sleep patterns, LC neurons in the mice began to display reduced SirT3 and increased cell death, and the mice lost 25 percent of these neurons.
"This is the first report that sleep loss can actually result in a loss of neurons," Veasey notes. Particularly intriguing, the findings suggest that mitochondria in LC neurons respond to sleep loss and can adapt to short-term sleep loss but not to extended wakefulness. This raises the possibility that increasing SirT3 levels in the mitochondria may help rescue neurons, or protect them across chronic or extended sleep loss. The study also demonstrates the importance of sleep for restoring metabolic homeostasis in mitochondria in LC neurons, and possibly other important brain areas, to ensure their optimal functioning during waking hours.
Veasey stresses that more work needs to be done to establish whether a similar phenomenon occurs in humans and to determine what durations of wakefulness place individuals at risk of neural injury. “In light of the role for SirT3 in the adaptive response to sleep loss, the extent of neuronal injury may vary across individuals. Specifically, aging, diabetes, high-fat diet and sedentary lifestyle may all reduce SirT3. If cells in individuals, including neurons, have reduced SirT3 prior to sleep loss, these individuals may be set up for greater risk of injury to their nerve cells.”
The next step will be putting the SirT3 model to the test. “We can now overexpress SirT3 in LC neurons,” explains Veasey.  “If we can show that we can protect the cells and wakefulness, then we’re launched in the direction of a promising therapeutic target for millions of shift workers.” 
The team also plans to examine shift workers post-mortem for evidence of increased LC neuron loss and signs of neurodegenerative disorders such as Alzheimer’s and Parkinson’s, since some previous mouse models have shown that lesions or injury to LC neurons can accelerate the course of those diseases. While not directly causing these diseases, “injuring LC neurons due to sleep loss could potentially facilitate or accelerate neurodegeneration in individuals who already have these disorders,” Veasey says.
While more research will be needed to settle these questions, the present study provides another confirmation of a rapidly growing scientific consensus: sleep is more important than was previously believed. In the past, Veasey observes, “No one really thought that the brain could be irreversibly injured from sleep loss.” It’s now clear that it can be.


Filed under locus coeruleus neurons sleep sleep loss sleep deprivation oxidative stress neuroscience science

311 notes

Out of mind, out of sight: suppressing unwanted memories reduces their unconscious influence on behaviour 

The study, part-funded by the Medical Research Council (MRC) and published online in PNAS, challenges the idea that suppressed memories remain fully preserved in the brain’s unconscious, allowing them to be inadvertently expressed in someone’s behaviour. The results of the study suggest instead that the act of suppressing intrusive memories helps to disrupt traces of the memories in the parts of the brain responsible for sensory processing.
The team at the MRC Cognition and Brain Sciences Unit and the University of Cambridge’s Behavioural and Clinical Neuroscience Institute (BCNI) have examined how suppression affects a memory’s unconscious influences in an experiment that focused on suppression of visual memories, as intrusive unwanted memories are often visual in nature.  
After a trauma, most people report intrusive memories or images, and people will often try to push these intrusions from their mind, as a way to cope. Importantly, the frequency of intrusive memories decreases over time for most people. It is critical to understand how the healthy brain reduces these intrusions and prevents unwanted images from entering consciousness, so that researchers can better understand how these mechanisms may go awry in conditions such as post-traumatic stress disorder.
Participants were asked to learn a set of word-picture pairs so that, when presented with the word as a reminder, an image of the object would spring to mind. After learning these pairs, brain activity was recorded using functional magnetic resonance imaging (fMRI) while participants either thought of the object image when given its reminder word, or instead tried to stop the memory of the picture from entering their mind.
The researchers studied whether suppressing visual memories altered people’s ability to see the content of those memories when they re-encountered it in the visual world. Without asking participants to consciously remember, the researchers simply asked them to identify very briefly displayed objects that were made difficult to see by visual distortion. In general, under these conditions, people are better at identifying objects they have seen recently, even if they do not remember seeing the object before—an unconscious influence of memory. Strikingly, the researchers found that suppressing visual memories made it harder for people to later see the suppressed object compared to other recently seen objects.
Brain imaging showed that people’s difficulty seeing the suppressed object arose because suppressing the memory from conscious awareness in the earlier memory suppression phase had inhibited activity in visual areas of the brain, disrupting visual memories that usually help people to see better. In essence, suppressing something from the mind’s eye had made it harder to see in the world, because visual memories and seeing rely on the same brain areas: out of mind, out of sight.
Over the last decade, research has shown that suppressing unwanted memories reduces people’s ability to consciously remember the experiences. The researchers’ studies on memory suppression have been inspired, in part, by trying to understand how people adapt memory after psychological trauma. Although this may work as a coping mechanism to help people adapt to the trauma, there is the possibility that if the memory traces were able to exert an influence on unconscious behaviour, they could potentially exacerbate mental health problems. The idea that suppression leaves unconscious memories that undermine mental health has been influential for over a century, beginning with Sigmund Freud.
These findings challenge the assumption that, even when suppressed, a memory remains fully intact and can then be expressed unconsciously. Moreover, this discovery pinpoints the neurobiological mechanisms underlying the suppression process, and could inform further research on uncontrolled ‘intrusive memories’, a classic characteristic of post-traumatic stress disorder.
Dr Michael Anderson, at the MRC Cognition and Brain Sciences Unit, said: “While there has been a lot of research looking at how suppression affects conscious memory, few studies have examined the influence this process might have on unconscious expressions of memory in behaviour and thought. Surprisingly, the effects of suppression are not limited to conscious memory. Indeed, it is now clear that the influence of suppression extends beyond areas of the brain associated with conscious memory, affecting perceptual traces that can influence us unconsciously. This may contribute to making unwanted visual memories less intrusive over time, and perhaps less vivid and detailed.”
Dr Pierre Gagnepain, lead author at INSERM in France, said: “Our memories can be slippery and hard to pin down. Out of hand and uncontrolled, their remembrance can haunt us and cause psychological troubles, as we see in PTSD. We were interested in whether the brain can genuinely suppress memories in healthy participants, even at the most unconscious level, and how it might achieve this. The answer is that it can, though not all people were equally good at this. The better understanding of the neural mechanisms underlying this process arising from this study may help to better explain differences in how well people adapt to intrusive memories after a trauma.”


Filed under memory neuroimaging visual memory mental health consciousness neuroscience science
