Posts tagged brain

July 6, 2012
(Medical Xpress) — Scientists at the University of Liverpool have found that a protein produced by a gene first identified in fruitflies is responsible for communication between nerve cells in the brain.

Dr Stephen Royle: “This research is another step towards fully understanding the complexities of the human brain.”
The ‘stoned’ gene was discovered in fruitflies by scientists in the 1970s. When this gene was mutated, the flies had problems walking and flying, which gave the gene its name. The same gene was found in mammals some years later, but until now scientists have not known precisely what it does or why mutations cause these problems with physical function.
‘Packets of chemicals’
Scientists at Liverpool have found that the protein the gene expresses in mammals, called stonin2, is responsible for retrieving ‘packets’ of chemicals that nerve cells in the brain release in order to communicate with each other. The inability of the gene to express this protein in the fruitfly study suggests why the insect was unable to walk or fly normally.
The team used advanced techniques to inactivate stonin2 for short and long periods of time in animal cells grown in the laboratory. The cells used were from an area of the brain associated with learning and memory. They showed that without stonin2 the nerve cells could not retrieve the ‘packets’ needed to transport the chemicals required for communication between nerve cells.
Dr Stephen Royle, from the University’s Institute of Translational Medicine, explains: “Nerve cells in the brain communicate by releasing ‘packets’ of chemicals. These ‘packets’ must be retrieved and refilled with chemicals so that they can be used once again. This recycling programme is very important for nerve cells to keep communicating with each other.
“We have shown that a protein called stonin2 is needed for the packets to be retrieved. There is currently no evidence to suggest that the gene which expresses this protein is mutated in human disease, but any failure in its function would be disastrous. The research is another step towards fully understanding the complexities of the human brain.”
The research is published in the journal Current Biology.
Provided by University of Liverpool
Source: medicalxpress.com
July 6, 2012 by Nancy Owano
(Phys.org) — Talk of fMRI may not be entirely familiar to many people, but that could change with new events highlighting efforts to link up humans and machines. Functional magnetic resonance imaging (fMRI) is a promising technology that could help humans move beyond joysticks and control robots via brain scanners instead. Now a research project exploring ways to develop robot surrogates with whom humans can interact has turned a corner: a university student successfully made his robot surrogate move around using fMRI technology. The experiment linked Israeli student Tirosh Shapira, in a lab at Bar-Ilan University, Israel, with a small robot far away at the Beziers Technology Institute in France.
Shapira merely had to think about moving his arms or legs and the robot, which carried a head-mounted camera whose image was displayed in front of Shapira, would do the same. If Shapira thought about moving forward or backward, the robot responded accordingly.
fMRI monitors blood flowing through the brain and can spot when areas associated with certain actions, such as movement, are in use. The fMRI read the student’s thoughts, which were translated via computer into commands relayed across the Internet to the robot in France.
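The loop described above — decode which motor-related brain region is active, then relay a matching command to the remote robot — can be sketched roughly as follows. This is purely illustrative: the region names, activation values, and threshold are invented, not taken from the Bar-Ilan experiment.

```python
# Illustrative sketch of an fMRI-to-robot command mapping.
# Region names and the threshold are hypothetical, for demonstration only.

def decode_command(activations, threshold=0.5):
    """Map decoded activation levels (0.0-1.0) for motor-related
    regions to a discrete robot command."""
    region_to_command = {
        "left_motor":  "TURN_LEFT",
        "right_motor": "TURN_RIGHT",
        "feet_motor":  "MOVE_FORWARD",
    }
    # Pick the most strongly activated region
    region = max(activations, key=activations.get)
    if activations[region] < threshold:
        return "STOP"  # no clear intent detected -> do nothing
    return region_to_command[region]

# Example: strong activation in the right-arm motor area
cmd = decode_command({"left_motor": 0.2, "right_motor": 0.9, "feet_motor": 0.1})
```

In the real system the decoded command would then be sent over the network to the robot's controller; here the decoding step alone is shown.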
There is much more work to be done to advance this approach, however. The researchers seek to devise a different type of scanning. An fMRI scanner is an expensive piece of equipment but the scientists believe that improvements in software might allow for a head-mounted device. Another research goal is to see if they can get humans to speak via the robot. The size of the robot will need modification, closer to the size and movement of a human, and engineered with a wider range of movement that would include hand gestures. In sum, according to the researchers, this experiment is only one of many steps ahead.
Medical applications for this technology are seen as promising, especially as scientists explore how patients with paralysis can interface with robots so that the patients can reconnect to the world. Another suggested application has been in the military, where robot surrogates rather than soldiers would be sent into battle.
Source: PHYS.ORG
July 6, 2012
(Medical Xpress) — Researchers decode a molecular mechanism that sheds light on how trauma can become engraved in the brain

Scientists at the Universities of Bonn and Berlin have discovered a mechanism which stops the process of forgetting anxiety after a stress event. In experiments they showed that feelings of anxiety don’t subside if too little dynorphin is released into the brain. The results can help open up new paths in the treatment of trauma patients. The study has been published in the current edition of the Journal of Neuroscience.
Feelings of anxiety very effectively prevent people from getting into situations that are too dangerous. Those who have had a terrible experience initially tend to avoid the place of the tragedy out of fear. If no other oppressive situation arises, the symptoms of fear normally subside over time. “The memory of the terrible events is not simply erased,” states first author PD Dr. Andras Bilkei-Gorzo, from the Institute for Molecular Psychiatry at the University of Bonn. “Rather, those affected learn, via an active learning process, that they no longer need to be afraid because the danger has passed.” But following extreme psychological stress resulting from wars, hostage-takings, accidents or catastrophes, chronic anxiety disorders can develop which don’t subside even after months.
Body’s own dynorphin weakens fears
Why is it that in some people terrible events are deeply engraved in their memory, while others, after a while, seem to have completely put aside any anxiety related to the incident? Scientists in the fields of psychiatry, molecular psychiatry and radiology at the University of Bonn are all involved in probing this issue. “We were able to demonstrate by way of a series of experiments that dynorphin plays an important role in weakening anxiety,” says Prof. Dr. Andreas Zimmer, Director of the Institute for Molecular Psychiatry at the University of Bonn. The substance group in question is the opioids, which also include, for instance, endorphins. The latter are released in the bodies of athletes and have an analgesic and euphoric effect. The reverse, however, is true of dynorphins: they are known for putting a damper on emotional moods.
Mice with disabled gene exhibit persistent anxiety
The team working with Prof. Zimmer tested the exact impact of dynorphins on the brain using mice whose gene for the formation of this substance had been disabled. After being exposed to a brief and unpleasant electric shock, the animals exhibited persistent anxiety symptoms, even when the negative stimulus had long since been removed. Mice that released a normal amount of dynorphin were initially anxious as well, but the symptoms quickly subsided. “This behavior is the same in humans: if you burn your hand on the stove once, you don’t forget the incident that quickly,” explains Prof. Zimmer. “Learning vocabulary, on the other hand, typically tends to be more tedious because it’s not tied to emotions.”
Results are transferable to people
Next the researchers showed that these results can be transferred to people. “We took advantage of the fact that people exhibit natural variations of the dynorphin gene that lead to different levels of this substance being released in the brain,” reports Prof. Dr. Henrik Walter, Director of the Research Area Mind and Brain at the Psychiatric University Clinic at the Charité in Berlin, who previously performed research in this area at the University Clinic in Bonn. A total of 33 healthy subjects were divided into two groups: one with genetically stronger dynorphin release, and one with less gene activity.
Unpleasant stimulus leads to stress reactions in the subjects
Wearing video glasses, the subjects lay in a magnetic resonance imaging (MRI) scanner and watched blue and green squares appear and then disappear. When the green square was visible, the scientists repeatedly applied an unpleasant laser stimulus to the hand. Increased sweating of the skin confirmed that these negative stimuli actually produced a stress reaction. At the same time, the researchers recorded the activity of various brain areas with the scanner. After this conditioning stage came part two of the experiment: the researchers showed the colored squares without any unpleasant stimuli and recorded how long the previously acquired stress reaction lasted. The next day the experiment was continued without the laser stimulus in order to monitor the longer-term development.
New paths in the treatment of trauma patients
It became apparent that, as in the mice, subjects with lower gene activity for dynorphin exhibited stress reactions lasting considerably longer than those who released considerably more of the substance. Moreover, the brain scans showed that the amygdala – a brain structure in the temporal lobes that processes emotional content – remained active even when, in later testing rounds, a green square was shown without the subsequent laser stimulus.
“After the negative laser stimulus stopped, this amygdala activity gradually became weaker. This means that the acquired anxiety reaction to the stimulus was forgotten,” reports Prof. Walter. This effect was less pronounced in the group with lower dynorphin activity and prolonged anxiety. “But the ‘forgetting’ of acquired anxiety reactions isn’t a fading but rather an active process, which involves the ventromedial prefrontal cortex,” emphasizes Prof. Walter. To corroborate this, the researchers found reduced coupling between the prefrontal cortex and the amygdala in the group with lower dynorphin activity. “In all likelihood dynorphins crucially affect the forgetting of fear through this structure,” says Prof. Walter. The scientists now hope to use these results to develop new long-term strategies for the treatment of trauma patients.
Provided by University of Bonn
Source: medicalxpress.com
ScienceDaily (July 5, 2012) — Sensory substitution devices (SSDs) use sound or touch to help the visually impaired perceive the visual scene surrounding them. The ideal SSD would assist not only in sensing the environment but also in performing daily activities based on this input: for example, accurately reaching for a coffee cup, or shaking a friend’s hand. In a new study, scientists trained blindfolded sighted participants to perform fast and accurate movements using a new SSD, called EyeMusic. Their results are published in the July issue of Restorative Neurology and Neuroscience.

Left: An illustration of the EyeMusic SSD, showing a user with a camera mounted on the glasses, and scalp headphones, hearing musical notes that create a mental image of the visual scene in front of him. He is reaching for the red apple in a pile of green ones. Top right: close-up of the glasses-mounted camera and headphones; bottom right: hand-held camera pointed at the object of interest. (Credit: Maxim Dupliy, Amir Amedi and Shelly Levy-Tzedek)
The EyeMusic, developed by a team of researchers at the Hebrew University of Jerusalem, employs pleasant musical tones and scales to help the visually impaired “see” using music. This non-invasive SSD converts images into a combination of musical notes, or “soundscapes.”
The device was developed by the senior author Prof. Amir Amedi and his team at the Edmond and Lily Safra Center for Brain Sciences (ELSC) and the Institute for Medical Research Israel-Canada at the Hebrew University. The EyeMusic scans an image and represents pixels at high vertical locations as high-pitched musical notes and low vertical locations as low-pitched notes according to a musical scale that will sound pleasant in many possible combinations. The image is scanned continuously, from left to right, and an auditory cue is used to mark the start of the scan. The horizontal location of a pixel is indicated by the timing of the musical notes relative to the cue (the later it is sounded after the cue, the farther it is to the right), and the brightness is encoded by the loudness of the sound.
The EyeMusic’s algorithm uses a different musical instrument for each of five colors: white (vocals), blue (trumpet), red (reggae organ), green (synthesized reed) and yellow (violin); black is represented by silence. Prof. Amedi mentions that “The notes played span five octaves and were carefully chosen by musicians to create a pleasant experience for the users.” Sample sound recordings are available at http://brain.huji.ac.il/em/.
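The encoding just described maps vertical position to pitch, horizontal position to note timing, and brightness to loudness. The following is a minimal sketch of that scheme for a grayscale image; the particular pentatonic scale, scan duration, and event format are assumptions for illustration, not the EyeMusic's actual parameters:

```python
# Sketch of an EyeMusic-style image-to-sound encoding (grayscale only).
# The scale, scan time, and event tuple are invented for this example.

# MIDI note numbers of a C-major pentatonic scale spanning five octaves
PENTATONIC = [36 + 12 * octave + step
              for octave in range(5)
              for step in (0, 2, 4, 7, 9)]

def encode_image(pixels, scan_seconds=2.0):
    """Convert a grayscale image (rows of 0-255 values, row 0 = top)
    into sound events: (onset_time, midi_pitch, volume)."""
    rows, cols = len(pixels), len(pixels[0])
    events = []
    for x in range(cols):                    # scan columns left to right
        onset = scan_seconds * x / cols      # later onset = farther right
        for y in range(rows):
            brightness = pixels[y][x]
            if brightness == 0:              # black -> silence
                continue
            # higher rows (smaller y) -> higher-pitched notes
            scale_index = ((rows - 1 - y) * (len(PENTATONIC) - 1)
                           // max(rows - 1, 1))
            volume = brightness / 255.0      # brightness -> loudness
            events.append((onset, PENTATONIC[scale_index], volume))
    return events

# A 2x2 image: bright pixel top-left, dimmer pixel bottom-right
events = encode_image([[255, 0], [0, 128]])
```

The bright top-left pixel becomes a loud, high note at the start of the scan, and the dim bottom-right pixel becomes a quieter, low note halfway through, matching the mapping described in the article.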
"We demonstrated in this study that the EyeMusic, which employs pleasant musical scales to convey visual information, can be used after a short training period (in some cases, less than half an hour) to guide movements, similar to movements guided visually," explain lead investigator Dr. Shelly Levy-Tzedek, an ELSC researcher at the Faculty of Medicine, Hebrew University of Jerusalem, and Prof. Amir Amedi. "The level of accuracy reached in our study indicates that performing daily tasks with an SSD is feasible, and indicates a potential for rehabilitative use."
The study tested the ability of 18 blindfolded sighted individuals to perform movements guided by the EyeMusic, and compared those movements to those performed with visual guidance. At first, the blindfolded participants underwent a short familiarization session, where they learned to identify the location of a single object (a white square) or of two adjacent objects (a white and a blue square).
In the test sessions, participants used a stylus on a digitizing tablet to point to a white square located either to the north, south, east or west. In one block of trials they were blindfolded (SSD block), and in the other block (VIS block) the arm was placed under an opaque cover, so they could see the screen but had no direct visual feedback from the hand. The endpoint location of their hand was marked by a blue square. In the SSD block, they received feedback via the EyeMusic. In the VIS block, the feedback was visual.
"Participants were able to use auditory information to create a relatively precise spatial representation," notes Dr. Levy-Tzedek.
The study lends support to the hypothesis that the brain’s representation of space may not depend on the modality through which the spatial information is received, and that very little training is required to create a representation of space without vision, using sounds to guide fast and accurate movements. “SSDs may have great potential to provide detailed spatial information for the visually impaired, allowing them to interact with their external environment and successfully make movements based on this information, but further research is now required to evaluate the use of our device in the blind,” concludes Dr. Levy-Tzedek. These results demonstrate the potential application of the EyeMusic to everyday tasks: from accurately reaching for the red (but not the green!) apples in the produce aisle to, perhaps one day, playing a Kinect/Xbox game.
Source: Science Daily
Using piezoelectric materials, researchers have replicated the muscle motion of the human eye to control camera systems in a way designed to improve the operation of robots. This new muscle-like action could help make robotic tools safer and more effective for MRI-guided surgery and robotic rehabilitation.
Read more: Robot vision: Muscle-like action allows camera to mimic human eye movement
July 5, 2012
Feeling full involves more than just the uncomfortable sensation that your waistband is getting tight. Investigators reporting online on July 5th in the Cell Press journal Cell have now mapped out the signals that travel between your gut and your brain to generate the feeling of satiety after eating a protein-rich meal. Understanding this back and forth loop between the brain and gut may pave the way for future approaches in the treatment and/or prevention of obesity.

Credit: Duraffourd et al., Cell
Food intake can be modulated through mu-opioid receptors (MORs, which also bind morphine) on nerves found in the walls of the portal vein, the major blood vessel that drains blood from the gut. Specifically, stimulating the receptors enhances food intake, while blocking them suppresses intake. Investigators have now found that peptides, the products of digested dietary proteins, block MORs, curbing appetite. The peptides send signals to the brain that are then transmitted back to the gut to stimulate the intestine to release glucose, suppressing the desire to eat.
Mice that were genetically engineered to lack MORs did not carry out this release of glucose, nor did they show signs of ‘feeling full’ after eating high-protein foods. Giving them MOR stimulators or inhibitors did not affect their food intake, unlike in normal mice.
Because MORs are also present in the neurons lining the walls of the portal vein in humans, the mechanisms uncovered here may also take place in people.
"These findings explain the satiety effect of dietary protein, which is a long-known but unexplained phenomenon,” says senior author Dr. Gilles Mithieux of the Université de Lyon, in France. “They provide a novel understanding of the control of food intake and of hunger sensations, which may offer novel approaches to treat obesity in the future,” he adds.
Provided by Cell Press
Source: medicalxpress.com
ScienceDaily (July 5, 2012) — The widely used diabetes drug metformin comes with a rather unexpected and alluring side effect: it encourages the growth of new neurons in the brain. The study, reported in the July 6th issue of Cell Stem Cell, a Cell Press publication, also finds that these neural effects make mice smarter.

New research finds that the widely used diabetes drug metformin comes with a rather unexpected and alluring side effect: it encourages the growth of new neurons in the brain. (Credit: iStockphoto/Guido Vrola)
The discovery is an important step toward therapies that aim to repair the brain not by introducing new stem cells but by spurring those already present into action, says the study’s lead author Freda Miller of the University of Toronto-affiliated Hospital for Sick Children. The fact that metformin is so widely used and so safe makes the news all the better.
Earlier work by Miller’s team highlighted a pathway known as aPKC-CBP for its essential role in telling neural stem cells where and when to differentiate into mature neurons. As it happened, others had found before them that the same pathway is important for the metabolic effects of the drug metformin, but in liver cells.
"We put two and two together," Miller says. If metformin activates the CBP pathway in the liver, they thought, maybe it could also do that in neural stem cells of the brain to encourage brain repair.
The new evidence lends support to that promising idea in both mouse brains and human cells. Mice taking metformin not only showed an increase in the birth of new neurons, but they were also better able to learn the location of a hidden platform in a standard maze test of spatial learning.
While it remains to be seen whether the very popular diabetes drug might already be serving as a brain booster for those now taking it, there are early hints that it may have cognitive benefits for people with Alzheimer’s disease. It had been thought those improvements were the result of better diabetes control, Miller says, but it now appears that metformin may improve Alzheimer’s symptoms by enhancing brain repair.
Miller says they now hope to test whether metformin might help repair the brains of those who have suffered brain injury due to trauma or radiation therapies for cancer.
Source: Science Daily
ScienceDaily (July 5, 2012) — Although many areas of the human brain are devoted to social tasks like detecting another person nearby, a new study has found that one small region carries information only for decisions during social interactions. Specifically, the area is active when we encounter a worthy opponent and decide whether to deceive them.

(Credit: © wtamas / Fotolia)
A brain imaging study conducted by researchers at the Duke Center for Interdisciplinary Decision Science (D-CIDES) put human subjects through a functional MRI brain scan while playing a simplified game of poker against a computer and human opponents. Using computer algorithms to sort out what amount of information each area of the brain was processing, the team found only one brain region — the temporal-parietal junction, or TPJ — carried information that was unique to decisions against the human opponent.
Some of the time, the subjects were dealt an obviously weak hand. The researchers wanted to see whether they could watch the player calculate whether to bluff his opponent. The brain signals in the TPJ told the researchers whether the subject would soon bluff against a human opponent, especially if that opponent was judged to be skilled. But against a computer, signals in the TPJ did not predict the subject’s decisions.
The TPJ is in a boundary area of the brain, and may be an intersection for two streams of information, said lead researcher McKell Carter, a postdoctoral fellow at Duke. It brings together a flow of attentional information and biological information, such as “is that another person?”
Carter observed that in general, participants paid more attention to their human opponent than their computer opponent while playing poker, which is consistent with humans’ drive to be social.
Throughout the poker game experiment, regions of the brain that are typically thought to be social in nature did not carry information specific to a social context. “The fact that all of these brain regions that should be specifically social are used in other circumstances is a testament to the remarkable flexibility and efficiency of our brains,” said Carter.
"There are fundamental neural differences between decisions in social and non-social situations," said D-CIDES Director Scott Huettel, the Hubbard Professor of Psychology and Neuroscience at Duke and senior author of the study. "Social information may cause our brain to play by different rules than non-social information, and it will be important for both scientists and policymakers to understand what causes us to approach a decision in a social or a non-social manner.
"Understanding how the brain identifies important competitors and collaborators — those people who are most relevant for our future behavior — will lead to new insights into social phenomena like dehumanization and empathy," Huettel added.
Source: Science Daily
July 4, 2012
(Medical Xpress) — Researchers at the UCL Institute of Neurology have found that giving the drug rotigotine as a skin patch can improve inattention in some stroke patients.
Hemi-spatial neglect, a severe and common form of inattention that can be caused by brain damage following a stroke, is one of the most debilitating symptoms, frequently preventing patients from living independently. When the right side of the brain has suffered damage, the patient may have little awareness of their left-hand side and have poor memory of objects that they have seen, leaving them inattentive and forgetful. Currently there are few treatment options.
The randomised controlled trial took 16 patients who had suffered a stroke on the right-hand side of the brain and assessed whether giving the drug rotigotine improved their ability to attend to their left-hand side. The results showed that even with treatment lasting just over a week, patients performed significantly better on attention tests when they received the drug than when they received the placebo.
Rotigotine acts by stimulating receptors on nerve cells for dopamine, a chemical normally produced within the brain.
Professor Masud Husain who led the study at the Institute of Neurology at UCL says: “Inattention can have a devastating effect on stroke patients and their families. It impacts on all aspects of their lives. If the results of our clinical trial are replicated in further, larger studies, we will have overcome a major hurdle towards providing a new treatment for this important consequence of stroke.
“Milder forms of inattention occur in other brain disorders, across all ages - from ADHD (attention deficit hyperactivity disorder) to Parkinson’s disease. Our findings show that it is possible to alter attention by using a drug that acts at specific receptors in the brain, and therefore have implications for understanding the mechanisms that might cause inattention in conditions other than stroke.”
Provided by University College London
Source: medicalxpress.com
ScienceDaily (July 3, 2012) — University of Granada researchers have developed an artificial cerebellum (a biologically-inspired adaptive microcircuit) that controls a robotic arm with human-like precision. The cerebellum is the part of the human brain that controls the locomotor system and coordinates body movements.
To date, although robot designers have achieved very precise movements, such movements are performed at very high speed, require strong forces and consume considerable power. This approach cannot be applied to robots that interact with humans, as a malfunction could be dangerous.
To solve this challenge, University of Granada researchers have implemented a new cerebellar spiking model that adapts to corrections and stores their sensorial effects; in addition, it records motor commands to predict the action or movement to be performed by the robotic arm. This cerebellar model allows the user to articulate a state-of-the-art robotic arm with extraordinary mobility.
Automatic Learning
The developers of the new cerebellar model have obtained a robot that performs automatic learning by extracting the input layer functionalities of the brain cortex. Furthermore, they have developed two control systems that enable accurate and robust control of the robotic arm during object handling.
The synergy between the cerebellum and the automatic control system enables the robot to adapt to changing conditions, i.e. it can interact with humans. The biologically-inspired architectures used in this model combine the error-training approach with predictive adaptive control.
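The error-training idea described above — a feedback controller drives the arm while an adaptive feedforward term learns from the residual error of each movement — can be illustrated with a toy one-joint example. Everything here (the gains, the constant disturbance, the update rule) is an invented simplification for illustration, not the Granada group's cerebellar spiking model:

```python
# Toy illustration of error-driven adaptive feedforward control.
# A proportional feedback controller moves a one-joint "arm" toward a
# target; an unknown constant disturbance causes a steady-state error,
# and a cerebellum-like feedforward term learns to cancel it over trials.

def run_trials(n_trials=50, lr=0.5, kp=2.0):
    target = 1.0
    disturbance = 0.4        # unknown constant load on the joint
    feedforward = 0.0        # learned cerebellum-like correction
    errors = []
    for _ in range(n_trials):
        position = 0.0
        for _ in range(100):             # one reaching movement
            error = target - position
            command = kp * error + feedforward - disturbance
            position += 0.05 * command   # simple first-order plant
        final_error = target - position
        # cerebellar-style update: the error remaining after the
        # movement trains the feedforward command for the next trial
        feedforward += lr * final_error
        errors.append(abs(final_error))
    return errors

errors = run_trials()
```

Across trials the residual error shrinks toward zero: the feedforward term absorbs the repeatable disturbance, so feedback no longer has to fight it. This is the basic logic of combining error training with predictive adaptive control, reduced to a few lines.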
The designers of this model are Silvia Tolu, Jesús Garrido and Eduardo Ros Vidal, at the University of Granada Department of Computer Architecture and Technology, and the University of Almería researcher Richard Carrillo.
Source: Science Daily