Neuroscience

Articles and news from the latest research reports.

Posts tagged neuroscience

Scientists Hunt Down Origin of Huntington’s Disease in the Brain and Provide Insights to Help Deliver Therapy
The gene mutation that causes Huntington’s disease appears in every cell in the body, yet kills only two types of brain cells. Why? UCLA scientists used a unique approach to switch the gene off in individual brain regions and zero in on those that play a role in causing the disease in mice.
Published in the April 28 online edition of Nature Medicine, the research sheds light on where Huntington’s starts in the brain. It also suggests new targets and routes for therapeutic drugs to slow the devastating disease, which strikes an estimated 35,000 Americans.
“From day one of conception, the mutant gene that causes Huntington’s appears everywhere in the body, including every cell in the brain,” explained X. William Yang, professor of psychiatry and biobehavioral sciences at the Semel Institute for Neuroscience and Human Behavior at UCLA. “Before we can develop effective strategies to treat the disorder, we need to first identify where it starts and how it ravages the brain.”
Huntington’s disease is passed from parent to child through a mutation in a gene called huntingtin. Scientists blame a genetic “stutter” — a repetitive stretch of DNA at one end of the altered gene—for the cell death and brain atrophy that progressively deprives patients of their ability to move, speak, eat and think clearly. No cure exists, and people with aggressive cases may die in as little as 10 years.
Huntington’s disease targets cells in two brain regions for destruction: the cortex and the striatum. Far more neurons die in the striatum—a cerebral region named after its striped layers of gray and white matter. But it has been unclear whether cortical neurons play a role in the disease, including in striatal neurons’ malfunction and death.
Yang’s team used a unique approach to uncover where the mutant gene wreaks the most damage in the brain.
In 2008, Yang collaborated with co-first author Michelle Gray, a former UCLA postdoctoral researcher now at the University of Alabama, to engineer a mouse model for Huntington’s disease. The scientists inserted the entire human huntingtin gene, including the stutter, into the mouse genome. As the animals’ brains atrophied, the mice developed motor and psychiatric-like problems similar to those seen in human patients.
In the current study, Yang and Nan Wang, co-first author and UCLA postdoctoral researcher, took the model one step further. They integrated a “genetic scissors” that snipped off the stutter and shut down the defective gene—first in the cortical neurons, then the striatal neurons and finally in both sets of cells. In each case, they measured how the mutant gene influenced disease development in the cells and affected the animals’ brain atrophy, motor and psychiatric-like symptoms.
“The genetic scissors gave us the power to study the role of any cell type in Huntington’s,” said Wang. “We were surprised to learn that cortical neurons play a key role in initiating aspects of the disease in the brain.”
The UCLA team discovered that reducing huntingtin in the cortex partially improved the animals’ symptoms. More importantly, shutting down mutant huntingtin in both the cortical and striatal neurons—while leaving it untouched in the rest of the brain—corrected every symptom they measured in the mice, including motor and psychiatric-like behavioral impairment and brain atrophy.
“We have evidence that the gene mutation hijacks communication between the cortical and striatal neurons,” explained Yang. “Reducing the defective gene in the cortex normalized this communication and helped lessen the disease’s impact on the striatum.”
“Our research helps to shed light on an age-old question in the field,” he added. “Where does Huntington’s disease start? Equally important, our findings provide crucial insights on where to target therapies to reduce mutant gene levels in the brain—we should target both cortical and striatal neurons.”
Some of the current experimental therapies can be delivered only to limited brain areas, because their properties do not allow them to broadly spread in the brain.
The UCLA team’s next step will be to study how mutant huntingtin affects cortical and striatal neurons’ function and communication, and to identify therapeutic targets that may normalize cellular miscommunication to help slow progression of the disease.

Filed under huntington’s disease huntingtin neurons cell death cortex striatum neuroscience science

How the Brain Decides When to Work and When to Rest: Dissociation of Implicit-Reactive from Explicit-Predictive Computational Processes
A pervasive cost-benefit problem is how to allocate effort over time, i.e., deciding when to work and when to rest. An economic decision perspective would suggest that the duration of effort is determined beforehand, depending on expected costs and benefits. However, the literature on exercise performance emphasizes that decisions are made on the fly, depending on physiological variables. Here, we propose and validate a general model of effort allocation that integrates these two views. In this model, a single variable, termed cost evidence, accumulates during effort and dissipates during rest, triggering effort cessation and resumption when reaching bounds. We assumed that such a basic mechanism could explain implicit adaptation, whereas the latent parameters (slopes and bounds) could be amenable to explicit anticipation. A series of behavioral experiments manipulating effort duration and difficulty was conducted in a total of 121 healthy humans to dissociate implicit-reactive from explicit-predictive computations. Results show 1) that effort and rest durations are adapted on the fly to variations in cost-evidence level, 2) that the cost-evidence fluctuations driving the behavior do not match explicit ratings of exhaustion, and 3) that actual difficulty impacts effort duration whereas expected difficulty impacts rest duration. Taken together, our findings suggest that cost evidence is implicitly monitored online, with an accumulation rate proportional to actual task difficulty. In contrast, cost-evidence bounds and dissipation rate might be adjusted in anticipation, depending on explicit task difficulty.
Full Article
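The accumulate-to-bound mechanism the abstract describes can be sketched in a few lines. The function name, parameter names, and values below are illustrative assumptions, not the fitted model from the paper:

```python
def simulate_effort_allocation(slope_up, slope_down, upper_bound, lower_bound,
                               total_time=10.0, dt=0.01):
    """Simulate the accumulate-to-bound model of effort allocation.

    Cost evidence rises at `slope_up` during effort and dissipates at
    `slope_down` during rest; effort stops when the upper bound is hit
    and resumes when the lower bound is reached again.
    Returns a list of ("effort" | "rest", duration) episodes.
    """
    cost = lower_bound
    working = True            # start in the effort state
    episodes = []
    t, t_state = 0.0, 0.0
    while t < total_time:
        if working:
            cost += slope_up * dt
            if cost >= upper_bound:      # exhaustion bound reached: rest
                episodes.append(("effort", t_state))
                working, t_state = False, 0.0
        else:
            cost -= slope_down * dt
            if cost <= lower_bound:      # recovered: resume effort
                episodes.append(("rest", t_state))
                working, t_state = True, 0.0
        t += dt
        t_state += dt
    return episodes
```

With a steeper accumulation slope (a harder task), effort bouts end sooner while rest durations are unchanged, which mirrors the dissociation the authors report: actual difficulty shapes effort duration, while bounds and dissipation rate shape rest.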

Filed under decision making computational models sensory perception neuroscience science

Extrasynaptic NMDA Receptor Involvement in Central Nervous System Disorders
NMDA receptor (NMDAR)-induced excitotoxicity is thought to contribute to the cell death associated with certain neurodegenerative diseases, stroke, epilepsy, and traumatic brain injury. Targeting NMDARs therapeutically is complicated by the fact that cell signaling downstream of their activation can promote cell survival and plasticity as well as excitotoxicity. However, research over the past decade has suggested that overactivation of NMDARs located outside of the synapse plays a major role in NMDAR toxicity, whereas physiological activation of those inside the synapse can contribute to cell survival, raising the possibility of therapeutic intervention based on NMDAR subcellular localization. Here, we review the evidence both supporting and refuting this localization hypothesis of NMDAR function and discuss the role of NMDAR localization in disorders of the nervous system. Preventing excessive extrasynaptic NMDAR activation may provide therapeutic benefit, particularly in Alzheimer disease and Huntington disease.
Full Article

Filed under NMDA receptors neurodegenerative diseases cell death CNS neuroscience science

Preparing for adulthood: thousands upon thousands of new cells are born in the hippocampus during puberty, and most survive with effortful learning
The dentate gyrus of the hippocampal formation generates new granule neurons throughout life. The number of neurons produced each day is inversely related to age, with thousands more produced during puberty than during adulthood, and many fewer produced during senescence. In adulthood, approximately half of these cells undergo apoptosis shortly after they are generated. Most of these cells can be rescued from death by effortful and successful learning experiences (Gould et al., 1999; Waddell and Shors, 2008; Curlik and Shors, 2011). Once rescued, the newly generated cells differentiate into neurons and remain in the hippocampus for at least several months (Leuner et al., 2004). Here, we report that many new hippocampal cells also undergo cell death during puberty. Because the juvenile brain is more plastic than the adult brain, and because many experiences are new, we hypothesized that a great number of cells would be rescued by learning during puberty. Indeed, adolescent rats that successfully acquired the trace eyeblink response retained thousands more cells than both untrained animals and animals that failed to learn. Because the hippocampus generates thousands more cells during puberty than during adulthood, these results support the idea that the adolescent brain is especially responsive to learning. This enhanced response can have significant consequences for the functional integrity of the hippocampus. Such a massive increase in cell proliferation is likely an adaptive response as the young animal must emerge from the care of its mother to face the dangers, challenges, and opportunities of adulthood.
Full Article

Filed under hippocampus neurogenesis dentate gyrus puberty adulthood learning neuroscience science

Diet Can Predict Cognitive Decline
The importance of long-chain polyunsaturated fatty acids (PUFAs) to brain health has been demonstrated in multiple studies. To assess whether lower dietary intakes of alpha-linolenic acid (ALA), eicosapentaenoic acid (EPA), and docosahexaenoic acid (DHA) were risk factors for cognitive decline, Tammy Scott, PhD, a scientist at the Jean Mayer USDA Human Nutrition Research Center on Aging (USDA HNRCA) at Tufts University, recently conducted a longitudinal, observational study using the Boston Puerto Rican Health Study cohort. Alice Lichtenstein, DSc, also from the USDA HNRCA at Tufts University, and Katherine Tucker, PhD, the cohort director from the University of Massachusetts-Lowell, were co-authors of the study, which has been published as an abstract.
“The participants were put through an intensive series of cognitive tests such as memory tests using a list of words, an attention test to repeat lists of numbers forward and backward, and a test of organization and planning involving copying complex figures,” said Dr. Scott. To determine the participants’ intake of PUFAs, they were given a dietary questionnaire. Results were determined by comparing baseline test scores with those from a two-year follow-up.
The researchers found that the intake of omega-3 PUFAs in the study sample of 895 participants was low. The 2010 U.S. Dietary Guidelines recommended an intake of 8 or more ounces of seafood per week (less for young children) to ensure an adequate intake of the very long chain omega-3 fatty acids (EPA and DHA). This translates to about 1,750 mg of EPA and DHA per week, which averages to 250 mg per day. Scott’s group reported that only 27% of the participants in their study met or exceeded that recommendation. The major source of EPA and DHA in their diets appeared to be from canned tuna. Based on the scientists’ findings, being in the lowest four quintiles of EPA and DHA intake was predictive of cognitive decline over 2 years.
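The guideline arithmetic quoted above is easy to verify. The per-ounce figure below is simply back-calculated from the quoted numbers and is an illustrative assumption, not a value from the study:

```python
# Back-check of the 2010 Dietary Guidelines figures quoted above.
ounces_per_week = 8
mg_epa_dha_per_ounce = 1750 / 8        # implied average: 218.75 mg/oz

weekly_mg = ounces_per_week * mg_epa_dha_per_ounce
daily_mg = weekly_mg / 7

print(f"{weekly_mg:.0f} mg EPA+DHA per week ≈ {daily_mg:.0f} mg per day")
# prints: 1750 mg EPA+DHA per week ≈ 250 mg per day
```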
What is the takeaway from this research? There is growing evidence that very long chain omega-3 fatty acids are beneficial for maintaining cognitive health, and many Americans do not have an adequate intake of these nutrients. “While more research is needed to determine whether intake of fatty fish such as salmon, tuna and trout can help prevent against cognitive decline, our preliminary data support previous research showing that intake of these types of fish have health benefits,” Scott said.

Filed under cognitive decline diet omega-3 memory nutrition Experimental Biology Meeting 2014 neuroscience science

Zinc Supplementation Shows Promise in Reducing Cell Stress After Blasts

Each year, approximately 2 million traumatic brain injuries (TBIs) occur in the USA, according to the Centers for Disease Control and Prevention. That number includes troops wounded in Iraq and Afghanistan, for whom TBI is considered an invisible wound of war, one that has few successful treatments. “We have nothing beyond ibuprofen for most TBIs,” said Dr. Angus Scrimgeour, who has been investigating the effects of low zinc diets on cell stress following a blast injury. “The adult brain does not self-repair from this kind of trauma.”

Scrimgeour works for the US Army Research Institute of Environmental Medicine and recently looked at the effects of five weeks of low- and adequate-zinc diets on a specific protein in muscle cells called MMP. The study recreated, in 32 rats, blast injuries similar to those soldiers experience from IEDs, including loss of consciousness. An equal number of rats served as a control group. Results suggest that zinc supplementation reduces blast-induced cell stress. He presented the results of his research at the American Society for Nutrition’s Scientific Sessions & Annual Meeting at EB on Sunday, April 27.

“We know that soldiers’ brain tissue cannot repair on low zinc diets,” said Scrimgeour. “And they are losing zinc through diarrhea and sweating.” The question moving forward is whether prevention through diet supplementation or post-blast treatment works best to repair behavioral deficits associated with mild TBI.

Scrimgeour added that further research is planned to investigate nutrient combinations for treating mild TBI, including omega-3, vitamin D, glutamine and/or zinc. Although the Army is conducting this research, the results can be applied outside of the military, according to Scrimgeour. “As the blast impacts experienced by Soldiers are similar to those experienced during head injuries received in a car accident or during an NFL concussion, these findings could translate from the Soldier to the civilian population.” Scrimgeour cautioned, however, that what works in animals doesn’t always work in soldiers, which is why more research is needed.

(Source: newswise.com)

Filed under TBI brain injury diet zinc Experimental Biology Meeting 2014 neuroscience science

Fight Memory Loss with a Smile (or Chuckle) 
Too much stress can take its toll on the body, mood, and mind. As we age it can contribute to a number of health problems, including high blood pressure, diabetes, and heart disease. Recent research has shown that the stress hormone cortisol damages certain neurons in the brain and can negatively affect memory and learning ability in the elderly. Researchers at Loma Linda University have delved deeper into cortisol’s relationship to memory and whether humor and laughter—a well-known stress reliever—can help lessen the damage that cortisol can cause. Their findings were presented on Sunday, April 27, at the Experimental Biology meeting.
Gurinder Singh Bains and colleagues showed a 20-minute laugh-inducing funny video to a group of healthy elderly individuals and a group of elderly people with diabetes. The groups were then asked to complete a memory assessment that measured their learning, recall, and sight recognition. Their performance was compared to a control group of elderly people who also completed the memory assessment but were not shown a funny video. Cortisol concentrations for both groups were also recorded at the beginning and end of the experiment.
The research team found a significant decrease in cortisol concentrations among both groups who watched the video. Video-watchers also showed greater improvement in all areas of the memory assessment when compared to controls, with the diabetic group seeing the most dramatic benefit in cortisol level changes and the healthy elderly seeing the most significant changes in memory test scores.
From the authors: “Our research findings offer potential clinical and rehabilitative benefits that can be applied to wellness programs for the elderly,” Dr. Bains said. “The cognitive components—learning ability and delayed recall—become more challenging as we age and are essential to older adults for an improved quality of life: mind, body, and spirit. Although older adults have age-related memory deficits, complementary, enjoyable, and beneficial humor therapies need to be implemented for these individuals.”
Study co-author and long-time psychoneuroimmunology humor researcher, Dr. Lee Berk, added, “It’s simple, the less stress you have the better your memory. Humor reduces detrimental stress hormones like cortisol that decrease memory hippocampal neurons, lowers your blood pressure, and increases blood flow and your mood state. The act of laughter—or simply enjoying some humor—increases the release of endorphins and dopamine in the brain, which provides a sense of pleasure and reward. These positive and beneficial neurochemical changes, in turn, make the immune system function better. There are even changes in brain wave activity towards what’s called the ‘gamma wave band frequency’, which also amp up memory and recall. So, indeed, laughter is turning out to be not only a good medicine, but also a memory enhancer adding to our quality of life.”

Filed under aging memory memory loss laughter stress cortisol Experimental Biology Meeting 2014 neuroscience science

397 notes

Laughter May Work Like Meditation in the Brain
Laughter triggers brain waves similar to those associated with meditation, according to a small new study.
It also found that other forms of stimulation produce different types of brain waves.
The study included 31 people whose brain waves were monitored while they watched humorous, spiritual or distressing video clips. While watching the humorous videos, the volunteers’ brains had high levels of gamma waves, which are the same ones produced during meditation, researchers found.
During the spiritual videos, the participants’ brains showed higher levels of alpha brain waves, similar to when a person is at rest. The distressing videos caused flat brain wave bands, similar to when a person feels detached, nonresponsive or doesn’t want to be in a certain situation.
The research team was led by Lee Berk, an associate professor in the School of Allied Health Professions and an associate research professor of pathology and human anatomy in the School of Medicine at Loma Linda University in California.
The study was scheduled to be presented Sunday at the Experimental Biology meeting held in San Diego. The data and conclusions should be viewed as preliminary until published in a peer-reviewed journal.
“What we have found in our study is that humor associated with mirthful laughter sustains high-amplitude gamma-band oscillations. Gamma is the only frequency found in every part of the brain,” Berk said in a university news release.
“What this means is that humor actually engages the entire brain — it is a whole brain experience with the gamma wave band frequency and humor, similar to meditation, holds it there; we call this being ‘in the zone,’” Berk explained.
He said that with laughter, “it’s as if the brain gets a workout.” This effect is important because it “allows for the subjective feeling states of being able to think more clearly and have more integrative thoughts,” Berk said. “This is of great value to individuals who need or want to revisit, reorganize or rearrange various aspects of their lives or experiences, to make them feel whole or more focused.”
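The study's findings are framed in terms of conventional EEG frequency bands: alpha during the spiritual videos, gamma during humor and meditation. As a reference point, the sketch below maps a dominant frequency to its band name; the cutoffs are one common convention from the EEG literature, not the study's own definitions.

```python
def eeg_band(freq_hz):
    """Map a dominant EEG frequency (Hz) to its conventional band name.
    Band boundaries vary slightly across the literature; these cutoffs
    are one common convention."""
    if freq_hz < 4:
        return "delta"
    if freq_hz < 8:
        return "theta"
    if freq_hz < 13:
        return "alpha"   # resting / relaxed -- the response to the spiritual videos
    if freq_hz < 30:
        return "beta"
    return "gamma"       # reported during humor and meditation

print(eeg_band(10))   # alpha
print(eeg_band(40))   # gamma
```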

Filed under laughter brainwaves meditation gamma oscillations Experimental Biology Meeting 2014 neuroscience science

2,860 notes

Human consciousness is simply a state of matter, like a solid or liquid – but quantum
Thanks to the work of a small group of neuroscientists and theoretical physicists over the last few years, we may finally have found a way of analyzing the mysterious, metaphysical realm of consciousness in a scientific manner. The latest breakthrough in this new field, published by Max Tegmark of MIT, postulates that consciousness is actually a state of matter. “Just as there are many types of liquids, there are many types of consciousness,” he says. With this new model, Tegmark says that consciousness can be described in terms of quantum mechanics and information theory, allowing us to scientifically tackle murky topics such as self-awareness, and why we perceive the world in classical three-dimensional terms, rather than the infinite number of objective realities offered up by the many-worlds interpretation of quantum mechanics.
Read more

Filed under consciousness quantum mechanics information theory neuroscience science

81 notes

The Influence of Spatiotemporal Structure of Noisy Stimuli in Decision Making
Decision making is a process of utmost importance in our daily lives, the study of which has received notable attention for decades. Nevertheless, the neural mechanisms underlying decision making are still not fully understood. Computational modeling has proved a valuable asset for addressing some of the fundamental questions. Biophysically plausible models, in particular, are useful in bridging the different levels of description that experimental studies provide, from the neural spiking activity recorded at the cellular level to the performance reported at the behavioral level. In this article, we review some of the recent progress made in understanding the neural mechanisms that underlie decision making. We critically evaluate the available results and address, from a computational perspective, aspects of both experimentation and modeling that have so far eluded comprehension. To guide the discussion, we have selected a central theme, which revolves around the following question: how does the spatiotemporal structure of sensory stimuli affect the perceptual decision-making process? This question is a timely one, as several unresolved issues stem from this central theme. These include: (i) the role of spatiotemporal input fluctuations in perceptual decision making, (ii) how to extend the current results and models derived from two-alternative choice studies to scenarios with multiple competing sources of evidence, and (iii) whether different types of spatiotemporal input fluctuations affect decision-making outcomes in distinctive ways. Although we restrict our discussion mostly to visual decisions, our main conclusions are arguably generalizable; hence, their possible extension to other sensory modalities is one of the points in our discussion.
Full Article
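The two-alternative choice models discussed in the abstract are commonly formalized as noisy evidence accumulation to a bound. The sketch below is a textbook drift-diffusion simulation, not the article's specific model: momentary evidence with mean drift and Gaussian noise accumulates until it crosses a decision threshold, yielding both a choice and a decision time.

```python
import random

def drift_diffusion_trial(drift, noise_sd, bound, dt=0.001, seed=None):
    """Accumulate noisy evidence until it crosses +bound (choice A)
    or -bound (choice B). Returns (choice, decision_time_s).
    A minimal textbook two-alternative model."""
    rng = random.Random(seed)
    x, t = 0.0, 0.0
    while abs(x) < bound:
        # Euler step of a drift-diffusion process: deterministic drift
        # plus Gaussian noise scaled as sqrt(dt) (a Wiener increment).
        x += drift * dt + rng.gauss(0.0, noise_sd) * dt ** 0.5
        t += dt
    return ("A" if x > 0 else "B"), t

# With positive drift, choice A should dominate across trials,
# and stronger noise or lower bounds would make errors more likely.
choices = [drift_diffusion_trial(drift=1.0, noise_sd=1.0, bound=1.0, seed=i)[0]
           for i in range(200)]
print("P(A) =", choices.count("A") / len(choices))
```

The "spatiotemporal input fluctuations" the article focuses on would enter this picture as structured, rather than white, noise in the momentary evidence term.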

Filed under decision making neural networks computational models neurons neuroscience science
