IBM Moves Closer to Creating a Computer Based on Insights from the Brain
Today at SC 09, the supercomputing conference, IBM announced significant progress toward creating a computer system that simulates and emulates the brain's abilities for sensation, perception, action, interaction and cognition.
The cognitive computing team, led by IBM Research, has achieved significant advances in large-scale cortical simulation and a new algorithm that synthesizes neurological data -- two major milestones that indicate the feasibility of building a cognitive computing chip.
Scientists at IBM Research - Almaden, in collaboration with colleagues from Lawrence Berkeley National Lab, have performed the first near real-time cortical simulation of the brain that exceeds the scale of a cat cortex and contains 1 billion spiking neurons and 10 trillion individual learning synapses.
Additionally, in collaboration with researchers from Stanford University, IBM scientists have developed BlueMatter, a new algorithm that exploits the Blue Gene supercomputing architecture to noninvasively measure and map the connections between all cortical and sub-cortical locations within the human brain using magnetic resonance diffusion-weighted imaging. Mapping the wiring diagram of the brain is crucial to untangling its vast communication network and understanding how it represents and processes information.
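The announcement does not describe how BlueMatter itself traces these connections, but the basic idea behind diffusion-based tractography can be sketched: follow the principal direction of water diffusion from voxel to voxel until the path leaves the imaged volume or bends implausibly sharply. The Python sketch below is illustrative only; the field shape, step size and stopping rule are assumptions, not details of BlueMatter.

```python
# A minimal sketch of deterministic streamline tractography, the general
# technique behind mapping connections from diffusion-weighted MRI.
# The step size, curvature threshold and toy volume are assumptions;
# BlueMatter's actual algorithm is not described in this announcement.
import numpy as np

def track_streamline(directions, seed, step=0.5, max_steps=2000, min_dot=0.5):
    """Follow the principal diffusion direction from a seed point.

    directions: (X, Y, Z, 3) array of unit vectors, one per voxel.
    Stops when the path leaves the volume or bends too sharply
    (dot product of successive directions below min_dot).
    """
    pos = np.asarray(seed, dtype=float)
    path = [pos.copy()]
    prev = None
    for _ in range(max_steps):
        idx = tuple(np.round(pos).astype(int))
        if any(i < 0 or i >= s for i, s in zip(idx, directions.shape[:3])):
            break  # left the imaged volume
        d = directions[idx]
        if prev is not None:
            if np.dot(d, prev) < 0:
                d = -d  # diffusion directions are sign-ambiguous
            if np.dot(d, prev) < min_dot:
                break  # curvature too high; no coherent fiber here
        pos = pos + step * d
        path.append(pos.copy())
        prev = d
    return np.array(path)

# Toy volume whose "fibers" all run along the x axis.
field = np.zeros((10, 10, 10, 3))
field[..., 0] = 1.0
print(track_streamline(field, seed=(1, 5, 5)).shape)
```

A real pipeline seeds many such streamlines across the white matter and aggregates them into a connection map between cortical regions.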
These advancements will provide a unique workbench for exploring the computational dynamics of the brain, and they move the team closer to its goal of building a compact, low-power synaptronic chip using nanotechnology and advances in phase-change memory and magnetic tunnel junctions. The team's work aims to break the mold of conventional von Neumann computing to meet the system requirements of the instrumented and interconnected world of tomorrow.
As the amount of digital data we create continues to grow massively and the world becomes more instrumented and interconnected, there is a need for new kinds of computing systems: systems imbued with a new intelligence that can spot hard-to-find patterns in vastly varied kinds of data, both digital and sensory; analyze and integrate information in real time in a context-dependent way; and deal with the ambiguity found in complex, real-world environments.
Businesses will simultaneously need to monitor, prioritize, adapt and make rapid decisions based on ever-growing streams of critical data and information. A cognitive computer could quickly and accurately put together the disparate pieces of this complex puzzle, while taking into account context and previous experience, to help business decision makers arrive at a logical response.
"Learning from the brain is an attractive way to overcome power and density challenges faced in computing today," said Josephine Cheng, IBM Fellow and lab director of IBM Research - Almaden. "As the digital and physical worlds continue to merge and computing becomes more embedded in the fabric of our daily lives, it?s imperative that we create a more intelligent computing system that can help us make sense the vast amount of information that's increasingly available to us, much the way our brains can quickly interpret and act on complex tasks."
To perform the first near real-time cortical simulation of the brain that exceeds the scale of the cat cortex, the team built a cortical simulator that incorporates a number of innovations in computation, memory, and communication as well as sophisticated biological details from neurophysiology and neuroanatomy. This scientific tool, akin to a linear accelerator or an electron microscope, is a critical instrument used to test hypotheses of brain structure, dynamics and function. The simulation was performed using the cortical simulator on Lawrence Livermore National Lab's Dawn Blue Gene/P supercomputer with 147,456 CPUs and 144 terabytes of main memory.
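To make the scale concrete: 10 trillion synapses across 1 billion neurons is roughly 10,000 synapses per neuron. The announcement does not name the neuron model IBM used, but the general shape of a spiking simulation can be sketched with leaky integrate-and-fire neurons. Everything below, including the network size, constants and dense weight matrix, is a toy assumption chosen so the sketch runs on a laptop; it also omits the synaptic learning the real simulator included.

```python
# A minimal sketch of a spiking network step loop using leaky
# integrate-and-fire neurons. The model choice, constants and dense
# weights are illustrative assumptions, not details of IBM's simulator
# (which ran 1e9 neurons and 1e13 learning synapses on 147,456 CPUs).
import numpy as np

rng = np.random.default_rng(0)
N = 1000                           # toy network; the real run used 1e9 neurons
W = rng.normal(0.0, 0.4, (N, N))   # synaptic weights (dense here for brevity)
v = np.zeros(N)                    # membrane potentials
v_thresh, v_reset, leak, dt = 1.0, 0.0, 0.1, 1.0

for step in range(100):
    ext = rng.normal(0.0, 0.3, N)        # external drive to each neuron
    spikes = v >= v_thresh               # which neurons fire this step
    v[spikes] = v_reset                  # reset the neurons that fired
    syn = W[:, spikes].sum(axis=1) / N   # input from spiking neighbors
    v += dt * (-leak * v + syn + ext)    # leaky integration of inputs
    if step % 20 == 0:
        print(f"step {step:3d}: {spikes.sum():4d} spikes")
```

At scale, the hard engineering problems are exactly the ones the release alludes to: distributing the synaptic state across memory and communicating spikes between machines fast enough to stay near real time.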
The algorithm, when combined with the cortical simulator, allows scientists to experiment with various mathematical hypotheses of brain function and structure, and of how structure affects function, as they work toward discovering the brain's core computational micro and macro circuits.
After the successful completion of Phase 0, IBM and its university partners were recently awarded $16.1M in additional funding from the Defense Advanced Research Projects Agency (DARPA) for Phase 1 of DARPA's Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) initiative. This phase of research will focus on the components, brain-like architecture and simulations to build a prototype chip. The long-term mission of IBM's cognitive computing initiative is to discover and demonstrate the algorithms of the brain and deliver low-power, compact cognitive computers that approach mammalian-scale intelligence and use significantly less energy than today's computing systems. The team includes researchers from several of IBM's worldwide research labs and scientists from Stanford University, University of Wisconsin-Madison, Cornell University, Columbia University Medical Center and University of California, Merced.
"The goal of the SyNAPSE program is to create new electronics hardware and architecture that can understand, adapt and respond to an informative environment in ways that extend traditional computation to include fundamentally different capabilities found in biological brains," said DARPA program manager Todd Hylton, Ph.D.
Modern computing is based on a stored-program model, which has traditionally been implemented in digital, synchronous, serial, centralized, fast, hardwired, general-purpose circuits with explicit memory addressing that indiscriminately overwrite data and impose a dichotomy between computation and data. In stark contrast, cognitive computing, like the brain, will use replicated computational units, neurons and synapses, implemented in mixed-mode analog-digital, asynchronous, parallel, distributed, slow, reconfigurable, specialized and fault-tolerant biological substrates with implicit memory addressing that update state only when information changes, blurring the boundary between computation and data.
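The contrast drawn in that last paragraph, recomputing on every clock tick versus updating state only when an input actually changes, can be illustrated with a toy example. The two classes below are illustrative assumptions, not an IBM design; they simply count how much work each model does for the same single input event.

```python
# Toy contrast between the two models described above: a clocked unit
# that recomputes every cycle versus an event-driven unit that updates
# only when an input changes. Both classes are hypothetical sketches.
class ClockedUnit:
    """Von Neumann style: recompute on every tick, whether inputs moved or not."""
    def __init__(self):
        self.inputs = [0.0, 0.0]
        self.output = 0.0
        self.updates = 0

    def tick(self):
        self.output = sum(self.inputs)
        self.updates += 1

class EventDrivenUnit:
    """Neuron style: state changes only when an incoming event arrives."""
    def __init__(self):
        self.inputs = [0.0, 0.0]
        self.output = 0.0
        self.updates = 0

    def on_event(self, index, value):
        if self.inputs[index] != value:   # update state only on change
            self.inputs[index] = value
            self.output = sum(self.inputs)
            self.updates += 1

clocked, evented = ClockedUnit(), EventDrivenUnit()
for t in range(1000):
    clocked.tick()                        # burns an update every cycle
    if t == 500:
        evented.on_event(0, 1.0)          # one real event in 1000 cycles
print(clocked.updates, evented.updates)   # 1000 vs 1
```

The energy argument for neuromorphic hardware follows the same logic: when events are sparse, work and power scale with activity rather than with the clock.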