Posts tagged Human Brain Project

Brain simulation raises questions
What does it mean to simulate the human brain? Why is it important to do so? And is it even possible to simulate the brain separately from the body it exists in? These questions are discussed in a new paper published in the scientific journal Neuron today.
Simulating the brain means modeling it on a computer. But in real life, brains don’t exist in isolation. The brain is a complex, adaptive system seated within our bodies and entangled with all the other adaptive systems inside us that together make up a whole person. And the fact that the brain is a brain inside a body is something we can’t ignore when we attempt to simulate it realistically. Today, two Human Brain Project (HBP) researchers, Kathinka Evers, philosopher at the Centre for Research Ethics and Bioethics at Uppsala University, and Yadin Dudai, neuroscientist at the Weizmann Institute of Science, publish a paper in Neuron that discusses the questions raised by brain simulations within and beyond the EU flagship project HBP.
In the article, Kathinka Evers and Yadin Dudai discuss the goal of simulation. In broad terms, it has to do with understanding. But what does understanding mean in neuroscience? For many scientists, understanding means being able to create a mental model that allows them to predict how a system would behave under different conditions. For the brain sciences, this type of understanding is currently possible for only a limited number of basic functions.
As it dwells inside our bodies, the brain is always a product of what the individual has experienced up to that point. That is why, when we simulate the brain, we have to take this ‘experienced brain’ into account and try to reflect it.
According to Kathinka Evers, leader of the Ethics and Society part of the Human Brain Project, neglecting this experience would severely limit the outcome of any brain simulation. But if we are to include experience we have to simulate real-life situations.
“That is a daunting task: a large part of that experience is the brain’s interaction with the rest of the human body existing and interacting in a still larger social context”, says Kathinka Evers.
What outcome would be realistic to hope for in the Human Brain Project’s simulation? In neuroscience, computer simulations of specific systems are already in use. These simulations are a complement to other tools scientists use.
But there are some caveats. According to Kathinka Evers and Yadin Dudai, our knowledge to date is still very limited, and many neuroscientists think it is too early for large-scale brain simulations. Collecting the data we need is no easy task, and it is an open question whether we can truly understand what we are about to build. There are also technical limitations: there simply isn’t enough computing power available today.
But if we do manage to simulate the brain, would that mean we have created artificial consciousness? Can a computer be conscious at all? According to Kathinka Evers and Yadin Dudai, that depends on what consciousness is. If it is the result of certain types of organization or function of biological matter, like the cells in the human body, then a computer can never gain consciousness. But if it is a matter of organization alone, with no need for biological matter, then the answer could be yes. Either way, this remains a highly hypothetical stance.
Neuroscience: Where is the brain in the Human Brain Project?
Launched in October 2013, the Human Brain Project (HBP) was sold by charismatic neurobiologist Henry Markram as a bold new path towards understanding the brain, treating neurological diseases and building information technology. It is one of two ‘flagship’ proposals funded by the European Commission’s Future and Emerging Technologies programme (see go.nature.com/icotmi). Selected after a multiyear competition, the project seemed like an exciting opportunity to bring together neuroscience and IT to generate practical applications for health and medicine (see go.nature.com/2eocv8).
Contrary to public assumptions that the HBP would generate knowledge about how the brain works, the project is turning into an expensive database-management project with a hunt for new computing architectures. In recent months, the HBP executive board revealed plans to drastically reduce its experimental and cognitive neuroscience arm, provoking wrath in the European neuroscience community.
The crisis culminated with an open letter from neuroscientists (including one of us, G.L.) to the European Commission on 7 July 2014 (see www.neurofuture.eu), which has now gathered more than 750 signatures. Many signatories are scientists in experimental and theoretical fields, and the list includes former HBP participants. The letter incorporates a pledge of non-participation in a planned call for ‘partnering projects’ that must raise about half of the HBP’s total funding. This pledge could seriously lower the quality of the project’s final output and leave the planned databases empty.
With the initial funding, or ‘ramp-up’, phase now in full swing, the European Commission is currently evaluating the HBP directors’ plan for the larger second part of the project. This offers an opportunity to introduce reforms and reconciliation. Here, we offer our analysis of how the HBP project strayed off course and how it might be steered back.
Scientists Criticize Europe’s $1.6B Brain Project
Dozens of neuroscientists are protesting Europe’s $1.6 billion attempt to recreate the functioning of the human brain on supercomputers, fearing it will waste vast amounts of money and harm neuroscience in general.
The 10-year Human Brain Project is largely funded by the European Union. In an open letter issued Monday, more than 190 neuroscience researchers called on the EU to put less money into the effort to “build” a brain, and to invest instead in existing projects.
If the EU doesn’t adopt their recommendations, the scientists said, they will boycott the Human Brain Project and urge colleagues to do the same.

Thinking it through: Scientists seek to unlock mysteries of the brain
Understanding the human brain is one of the greatest challenges facing 21st century science. If we can rise to this challenge, we will gain profound insights into what makes us human, develop new treatments for brain diseases, and build revolutionary new computing technologies that will have far reaching effects, not only in neuroscience.
Scientists at the European Human Brain Project—set to announce more than a dozen new research partnerships worth €8.3 million in funding later this month—the Allen Institute for Brain Science, and the US BRAIN Initiative are developing new paradigms for understanding how the human brain works in health and disease. Today, their international, collaborative projects are defined, explored, and compared during “Inventing New Ways to Understand the Human Brain” at the 2014 AAAS Annual Meeting in Chicago.
Brain Simulation, Big Data, and a New Computing Paradigm
Henry Markram from the Ecole Polytechnique Fédérale de Lausanne (EPFL), in Switzerland, where the Human Brain Project is based, describes how the project will leverage available experimental data and basic principles of brain organization to reconstruct the detailed structure of the brain in computer models. The models will allow the HBP to run supercomputer-based simulations of the inner workings of the brain.
"Brain simulation allows measurements and manipulations impossible in the lab, opening the road to a new kind of in silico experimentation," Markram says.
Neuroscience is experiencing a data deluge, with a revolutionary volume of brain data already in hand and new initiatives planning to acquire even more. But searching, accessing, and analyzing these data remains a key challenge.
Sean Hill, also of EPFL and a speaker at AAAS, leads The Neuroinformatics Platform of the Human Brain Project (HBP). In this scientific panel, he explains how the platform will provide tools to manage, navigate, and annotate spatially referenced brain atlases, which will form the basis for the HBP’s modeling effort—turning Big Data into deep knowledge.
The Neuroinformatics Platform will bring together many different kinds of data. The University of Edinburgh’s Seth Grant, a key member of the HBP, describes how he is developing new methods to decode the molecular principles underlying the brain’s organization, such as how individual proteins assemble into larger complexes. As Grant explains in Chicago, this has important practical applications, as many mutations in schizophrenia and autism converge on these so-called supercomplexes in the brain.
As we understand more and more about the way the brain computes we can apply this knowledge to technology. Karlheinz Meier, of Heidelberg University in Germany and a speaker at AAAS, outlines how he is working to create entirely new computing systems as part of the HBP. These Neuromorphic Computing Systems will merge realistic brain models with new hardware for a completely new paradigm of computing—one that more closely resembles how the brain itself processes information.
"The brain has the ability to efficiently perform computations that are impossible even for the most powerful computers while consuming only 30 Watts of power," Meier says.
Brain: Get Ready For Your Close-up
At AAAS, Christof Koch lays out another ambitious, 10-year plan from the Allen Institute for Brain Science: to understand the structure and function of the brain by mapping cell types from mice and humans with computer simulations and figuring out how the cells connect, and how they encode, relay, and process information. The project, Koch says, promises massive, multimodal, and open-access datasets and methodology that will be reproducible and scalable.
At Harvard University, George Church is participating in the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative, which aims to map every neuron in the brain with rapidly advancing technologies. At AAAS, he describes progress on new tools for measurements of brain cell development, connectivity, and functional state dynamics in rodent and human clinical samples.
What do all of these projects have in common? They seek answers to some of science’s most elusive questions: what makes us human, how does the brain function, what causes neurological and mental illness and, most importantly, how can we treat or cure these afflictions?
"BigBrain" Study Provides Most Detailed 3-D Map of the Brain Yet
A landmark three-dimensional digital reconstruction of a complete human brain, called the BigBrain, shows the brain anatomy in microscopic detail at a spatial resolution of 20 micrometers—smaller than the size of one fine strand of hair.
The reconstruction, published in the 21 June issue of the journal Science, exceeds the resolution of all existing reference brains presently in the public domain, and will be made freely available to the broader scientific community.
The fine-grained anatomical resolution of the BigBrain will allow scientists who use it to gain insights into the neurobiological basis of cognition, language, emotions and other processes, according to the study. The resulting anatomical tool will serve as an atlas for neurosurgery and provide a framework for research in many directions, including enhanced understanding of brain diseases such as Alzheimer’s disease.
"It is a common basis for scientific discussions because everybody can work with this brain model," said Science co-author Karl Zilles, senior professor of the Jülich Aachen Research Alliance.
The new reference brain, which is part of the European Human Brain Project, “redefines traditional maps from the beginning of the 20th century,” explained lead author Katrin Amunts from the Research Centre Jülich. Amunts serves as director of the Cecile and Oskar Vogt Institute for Brain Research at the Heinrich Heine University Düsseldorf in Germany.
"The authors pushed the limits of current technology," said Science Senior Editor Peter Stern. Existing reference brains do not probe further than the macroscopic, or visible, components of the brain. The BigBrain provides a resolution much finer than the typical 1 millimeter resolution from MRI studies. "The spatial resolution the researchers achieved exceeds that of presently available reference brains by a factor of 50," said Stern.
"Of course, we would love to have spatial resolution going down to 1 micrometer," said Amunts in a 19 June press teleconference. However, "there are simply no computers at this moment which would be capable to process such data, to visualize this or to analyze it."
To create the detailed brain atlas, Amunts and colleagues took advantage of new advances in computing capacity and image analysis. Using a special tool called a microtome, they carefully cut the paraffin-embedded brain of a 65-year-old female into 20-micrometer-thick sections.
The project was “a tour-de-force to assemble images of over 7400 individual histological sections, each with its own distortions, rips and tears, into a coherent 3-D volume,” said Science co-author Alan Evans, a professor at the Montreal Neurological Institute at McGill University in Montreal, Canada.
The sections were mounted on slides, stained to detect cell structures and finally digitized with a high-resolution flatbed scanner so researchers could reconstruct the high-resolution 3-D brain model. It took approximately 1000 hours to collect the data.
The researchers’ future plans for using the map include extracting measurements of cortical thickness to gain insights into aging and neurodegenerative disorders. Eventually, Amunts and colleagues hope to build a brain model at the resolution of 1 micron to capture details of single cell morphology. Detailed brain maps can aid researchers who are exploring the full set of neural connections and real-time brain activity, as scientists discussed recently in a Capitol Hill briefing sponsored by AAAS.
The creation of such a detailed brain map, offering a gateway to unprecedented insights into the brain’s anatomy and organization, was long in the works. “It was a dream for almost 20 years,” Amunts said. “The dream came true because of an interdisciplinary and intercontinental collaboration spanning from Europe to Canada and from neuroanatomy to supercomputing.”
Though not directly related to the BRAIN Initiative announced by President Barack Obama earlier this year, the work by Amunts and colleagues supports the Initiative’s goal of giving scientists the best possible tools with which to obtain a dynamic picture of the brain.
To handle large amounts of data from detailed brain models, IBM, EPFL, and ETH Zürich are collaborating on a new hybrid memory strategy for supercomputers. This will help the Blue Brain Project and the Human Brain Project achieve their goals.

Motivated by the extraordinary requirements of neuroscience, IBM Research, EPFL, and ETH Zürich, through the Swiss National Supercomputing Centre CSCS, are exploring how to combine different types of memory—DRAM, the standard for computer main memory, and flash memory, akin to that in USB sticks—to deliver supercomputing performance at lower cost.
The Blue Brain Project, for example, is building detailed models of the rodent brain based on vast amounts of information – incorporating experimental data and a large number of parameters – to describe each and every neuron and how they connect to each other. The building blocks of the simulation consist of realistic representations of individual neurons, including characteristics like shape, size, and electrical behavior.
Given the roughly 70 million neurons in the brain of a mouse, a huge amount of data needs to be accessed for the simulation to run efficiently.
“Data-intensive research has supercomputer requirements that go well beyond high computational power,” says EPFL professor Felix Schürmann of the Blue Brain Project in Lausanne. “Here, we investigate different types of memory and how it is used, which is crucial to build detailed models of the brain. But the applications for this technology are much broader.”
70 Million Neurons for the New IBM Blue Gene/Q
The Blue Brain Project has acquired a new IBM Blue Gene/Q supercomputer to be installed at CSCS in Lugano, Switzerland. This machine has four times the memory of the supercomputer used by the Blue Brain Project up to now, but this still may not be enough to model the mouse brain at the desired level of detail.
The challenge for scientists is to modify the supercomputer so that it can model not only more neurons—as many as the 70 million in the mouse brain—but also model them in greater detail while using fewer resources. The researchers aspire to do just that by engineering different types of memory. The Blue Gene/Q comes equipped with 64 terabytes of DRAM. But this type of memory, which is ubiquitous in personal computers, loses data almost instantaneously when the power is turned off.
The scientists plan to boost the supercomputer’s capacity by combining DRAM with another type of memory that has made its way into everyday devices, from cameras to mobile phones: flash memory. Unlike DRAM, flash memory can retain information, even without power, and is much more affordable. The Blue Brain Project’s new supercomputer efficiently integrates 128 terabytes of flash memory with the 64 terabytes of DRAM memory.
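As a rough sanity check on these figures, the combined memory pool works out to a per-neuron budget on the order of a few megabytes. The sketch below uses the neuron count and memory sizes quoted in the article; the binary-unit convention and the arithmetic itself are our own illustrative assumptions:

```python
# Back-of-envelope: what per-neuron memory budget do 64 TB of DRAM
# plus 128 TB of flash allow for a ~70-million-neuron mouse model?
# Neuron count and memory sizes are from the article; the binary-unit
# convention (1 TB = 1024^4 bytes here) is an assumption.
TB = 1024 ** 4                      # bytes per terabyte (binary convention)
dram_bytes = 64 * TB
flash_bytes = 128 * TB
neurons = 70_000_000                # approximate mouse-brain neuron count

per_neuron_mib = (dram_bytes + flash_bytes) / neurons / 1024 ** 2
print(f"~{per_neuron_mib:.1f} MiB of combined memory per neuron")
```

At roughly 3 MiB per neuron, storing detailed morphology, ion-channel state, and synaptic parameters for every cell quickly consumes the pool, which is why tiering cheaper flash behind DRAM matters for this workload.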
“These technological advancements will not only help scientists model the brain, but they will also contribute to future evidence-based systems,” says IBM Research computational scientist Alessandro Curioni, who is based in Zurich.
To take full advantage of this novel mix of memory, IBM has been developing a scalable memory system architecture, while EPFL and ETH Zürich researchers are working on high-level software to optimize this hybrid memory for large-scale simulations and interactive supercomputing.
“The resulting machine may not necessarily be the fastest supercomputer in the world, but it will certainly open up new avenues for data-intensive science,” says ETH Zürich professor and CSCS director Thomas Schulthess. “The results of this collaboration will support scientific investigations across all types of data intensive applications including astronomy, geosciences and healthcare.”
Towards the Human Brain
The Blue Brain Project has recently become the core of an even more ambitious project, the European Flagship Human Brain Project, also coordinated by EPFL. The Human Brain Project faces the daunting task of providing the technical tools to integrate as much data as possible into detailed models of the human brain by 2023. With an estimated 90 billion neurons, the human brain contains roughly a thousand times more neurons than that of a mouse. The new hybrid-memory strategy is an important step towards helping the Human Brain Project meet its 10-year goal.
As is often the case in research and innovation, a scientific pursuit is pushing the boundaries of technology, leading to new and more powerful tools. The Blue Brain and Human Brain Projects have brought into focus the need to handle complex and unusual calculations, requiring supercomputer technology for which speed alone is not enough.
(Source: actu.epfl.ch)
Watch it to believe it: http://www.humanbrainproject.eu/
Full 10-year joint EU funding (2013–2023) of over €1 billion has now started!
Will we ever… simulate the human brain?
A billion dollar project claims it will recreate the most complex organ in the human body in just 10 years. But detractors say it is impossible. Who is right?
For years, Henry Markram has claimed that he can simulate the human brain in a computer within a decade. On 23 January 2013, the European Commission told him to prove it. His ambitious Human Brain Project (HBP) won one of two ceiling-shattering grants from the EC to the tune of a billion euros, ending a two-year contest against several other grandiose projects. Can he now deliver? Is it even possible to build a computer simulation of the most powerful computer in the world – the 1.4-kg (3 lb) cluster of 86 billion neurons that sits inside our skulls?
The very idea has many neuroscientists in an uproar, and the HBP’s substantial budget, awarded at a tumultuous time for research funding, is not helping. The common refrain is that the brain is just too complicated to simulate, and our understanding of it is at too primordial a stage.
Then, there’s Markram’s strategy. Neuroscientists have built computer simulations of neurons since the 1950s, but the vast majority treat these cells as single abstract points. Markram says he wants to build the cells as they are – gloriously detailed branching networks, full of active genes and electrical activity. He wants to simulate them down to their ion channels – the molecular gates that allow neurons to build up a voltage by shuttling charged particles in and out of their membrane borders. He wants to represent the genes that switch on and off inside them. He wants to simulate the 3,000 or so synapses that allow neurons to communicate with their neighbours.
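To make the contrast concrete, here is what the “single abstract point” style of neuron model looks like in practice: a leaky integrate-and-fire unit whose entire state is one voltage variable. This is a generic textbook sketch, not Markram’s method, and every parameter value is chosen purely for illustration:

```python
# Minimal leaky integrate-and-fire (LIF) point-neuron model -- the
# "single abstract point" approach most simulations use, as opposed to
# the ion-channel-level detail Markram describes. All parameter values
# are illustrative assumptions, not drawn from the article.
def simulate_lif(i_ext=1.5, t_max=100.0, dt=0.1,
                 tau=10.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Euler-integrate dV/dt = (-(V - v_rest) + i_ext) / tau,
    emitting a spike and resetting V whenever it crosses v_thresh."""
    steps = int(t_max / dt)
    v = v_rest
    spike_times = []
    for k in range(steps):
        v += dt * (-(v - v_rest) + i_ext) / tau   # leak plus input drive
        if v >= v_thresh:
            spike_times.append(k * dt)            # record spike time (ms)
            v = v_reset                           # reset after spiking
    return spike_times

spikes = simulate_lif()
print(f"{len(spikes)} spikes in {100.0:.0f} ms of simulated time")
```

A detailed model of the kind Markram advocates would replace that single voltage equation with thousands of coupled compartments, ion-channel kinetics, gene-expression switches, and synapse states per cell, which is precisely what drives the project’s supercomputing requirements.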
Erin McKiernan, who builds computer models of single neurons, is a fan of this bottom-up approach. “Really understanding what’s happening at a fundamental level and building up – I generally agree with that,” she says. “But I tend to disagree with the time frame. [Markram] said that in 10 years, we could have a fully simulated brain, but I don’t think that’ll happen.”
Even building McKiernan’s single-neuron models is a fiendishly complicated task. “For many neurons, we don’t understand well the complement of ion channels within them, how they work together to produce electrical activity, how they change over development or injury,” she says. “At the next level, we have even less knowledge about how these cells connect, or how they’re constantly reaching out, retracting or changing their strength.” It’s ignorance all the way down.
“For sure, what we have is a tiny, tiny fraction of what we need,” says Markram. Worse still, experimentally mapping out every molecule, cell and connection is completely unfeasible in terms of cost, technical requirements and motivation. But he argues that building a unified model is the only way to unite our knowledge, and to start filling in the gaps in a focused way. By putting it all together, we can use what we know to predict what we don’t, and to refine everything on the fly as new insights come in.

Why we’re building a €1 billion model of a human brain
We want to reach a unified understanding of the brain and the simulation on a supercomputer is the tool. Today you have neuroscientists working on a genetic, behavioural or cognitive level, and then you have informaticians, chemists and mathematicians. They all have their own understanding of how the brain functions and is structured. How do you get them all around the same table? We think of the project as like a CERN for the brain. The model is our way of bringing everyone, and our understanding, together.
RIKEN, OIST Dive into Human Brain Project
One of the major frontiers of modern science is a comprehensive understanding of the human brain and its functions to guide the development of new technologies in information and communication. In a major announcement for the globalization of science, two Japanese research organizations, the Okinawa Institute of Science and Technology Graduate University (OIST) and RIKEN, will join forces with a large European consortium on the Human Brain Project (HBP), which the European Commission has officially announced as one of two Future and Emerging Technology (FET) Flagship projects. The new project will federate international efforts to understand and simulate the human brain for the creation of new technological advances for society.
The goal of the Human Brain Project is to combine all existing knowledge about the human brain and to reconstruct the brain, piece by piece, in supercomputer-based models and simulations. The models will offer the prospect of a new understanding of the human brain and its diseases, and of completely new computing and robotics technologies. On January 28, the European Commission supported this vision, announcing that it has selected the HBP as one of two projects to be funded through the new FET Flagship Program. With more than 80 European and international research institutions, the Human Brain Project will last for ten years (2013-2023). At an estimated cost of 1.19 billion euros, the HBP will become one of the most ambitious undertakings in the history of science, focusing international efforts on research objectives expected to stimulate the global economy.
With three teams involved in the project, the RIKEN Brain Science Institute will contribute to the identification of the brain structures underlying mental capabilities. By listening to the brain’s activity during behavior, RIKEN investigators hope to reveal new principles of the mind and cognition. This information will guide the construction of the HBP brain model and stimulate the development of a new generation of brain-based computer and information technologies. Participating RIKEN faculty include Keiji Tanaka, Naotaka Fujii and Justin Gardner.
Dr. Naotaka Fujii’s team will contribute to the Language group by studying the neural network mechanisms by which primates learn proto-language via nested sequential stimuli. Drs. Keiji Tanaka and Justin Gardner will participate in the group studying the mechanisms of information integration in the brain: how semantic knowledge of the world develops from visual object representations, and how prior knowledge of the world influences visual perception.
Charles Yokoyama, Coordinator of the RIKEN Brain Science Institute-Human Brain Project collaboration, said: “The participation of RIKEN in the Human Brain Project marks a new era in international collaboration to study the brain; such a large-scale, coordinated effort is needed to produce consistent benefits for society.”
OIST’s contribution is led by Prof. Erik De Schutter, whose team participates in the development of the Brain Simulation Platform, a major software infrastructure effort. Specifically, the team at OIST will contribute its experience in programming software for the spatial simulation of the interaction between electrophysiological events and biochemical reactions in neurons.
"We are delighted that OIST will participate in this major international initiative," said De Schutter. "Our major challenge is how to integrate fine scale of modeling at the molecular level with large-scale modeling of whole brain regions."
The project will begin work in the closing months of 2013 and will be coordinated at the Ecole Polytechnique Fédérale de Lausanne (EPFL) in Switzerland by neuroscientist Henry Markram, with co-directors Karlheinz Meier of Heidelberg University, Germany, and Richard Frackowiak of the Centre Hospitalier Universitaire Vaudois (CHUV) and the University of Lausanne (UNIL).