IBM Research Creates New Foundation to Program SyNAPSE Chips

Enterprise & IT | Aug 8, 2013

Scientists from IBM today unveiled a breakthrough software ecosystem designed for programming silicon chips that have an architecture inspired by the function, low power, and compact volume of the brain. The technology could enable a new generation of intelligent sensor networks that mimic the brain's abilities for perception, action, and cognition.

Dramatically different from traditional software, IBM's new programming model breaks the mold of sequential operation underlying today's von Neumann architectures and computers. It is instead tailored for a new class of distributed, highly interconnected, asynchronous, parallel, large-scale cognitive computing architectures.

"Architectures and programs are closely intertwined and a new architecture necessitates a new programming paradigm," said Dr. Dharmendra S. Modha, Principal Investigator and Senior Manager, IBM Research. "We are working to create a FORTRAN for synaptic computing chips. While complementing today's computers, this will bring forth a fundamentally new technological capability in terms of programming and applying emerging learning systems."

To advance and enable this new ecosystem, IBM researchers developed the following breakthroughs that support all aspects of the programming cycle from design through development, debugging, and deployment:

- Simulator: A multi-threaded, massively parallel and highly scalable functional software simulator of a cognitive computing architecture comprising a network of neurosynaptic cores.

- Neuron Model: A simple, digital, highly parameterized spiking neuron model that forms a fundamental information processing unit of brain-like computation and supports a wide range of deterministic and stochastic neural computations, codes, and behaviors. A network of such neurons can sense, remember, and act upon a variety of spatio-temporal, multi-modal environmental stimuli. (A minimal neuron sketch appears after this list.)

- Programming Model: A high-level description of a "program" that is based on composable, reusable building blocks called "corelets." Each corelet represents a complete blueprint of a network of neurosynaptic cores that specifies a base-level function. The inner workings of a corelet are hidden so that only its external inputs and outputs are exposed to other programmers, who can concentrate on what the corelet does rather than how it does it. Corelets can be combined to produce new corelets that are larger, more complex, or have added functionality. (A corelet-composition sketch appears after this list.)

- Library: A cognitive system store containing designs and implementations of consistent, parameterized, large-scale algorithms and applications that link massively parallel, multi-modal, spatio-temporal sensors and actuators together in real-time. In less than a year, the IBM researchers have designed and stored over 150 corelets in the program library.

- Laboratory: A novel teaching curriculum that spans the architecture, neuron specification, chip simulator, programming language, application library and prototype design models. It also includes an end-to-end software environment that can be used to create corelets, access the library, experiment with a variety of programs on the simulator, connect the simulator inputs/outputs to sensors/actuators, build systems, and visualize/debug the results.
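As a rough illustration of the kind of unit the neuron model describes, the sketch below implements a generic leaky integrate-and-fire spiking neuron in Python. The update rule and parameter names (weights, threshold, leak) are illustrative assumptions, not IBM's published neuron specification.

```python
# Minimal sketch of a generic digital spiking neuron (leaky integrate-and-fire).
# The parameters and update rule are illustrative assumptions, not IBM's
# published neuron specification for the SyNAPSE cores.

class SpikingNeuron:
    def __init__(self, weights, threshold=10.0, leak=1.0):
        self.weights = list(weights)  # synaptic weights, one per input axon
        self.threshold = threshold    # membrane potential needed to fire
        self.leak = leak              # charge drained from the potential each tick
        self.potential = 0.0          # current membrane potential

    def step(self, spikes):
        """Integrate one tick of incoming spikes (0/1 per axon); return True if the neuron fires."""
        self.potential += sum(w for w, s in zip(self.weights, spikes) if s)
        self.potential = max(0.0, self.potential - self.leak)
        if self.potential >= self.threshold:
            self.potential = 0.0      # reset after firing
            return True
        return False


# Drive a three-input neuron for a few ticks and print whether it spikes.
neuron = SpikingNeuron(weights=[4.0, 3.0, 5.0])
for tick, spikes in enumerate([(1, 0, 1), (1, 1, 0), (0, 1, 1)]):
    print(tick, neuron.step(spikes))
```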
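The corelet abstraction can likewise be pictured as a black box that exposes only named input and output ports and hides the network behind them. The sketch below is a hypothetical stand-in for that idea; the class and function names (Corelet, connect, compose) are assumptions, not IBM's actual corelet language.

```python
# Hypothetical sketch of corelet-style composition: each corelet hides its
# internal network of cores and exposes only named inputs and outputs.
# The names (Corelet, connect, compose) are illustrative, not IBM's API.

class Corelet:
    def __init__(self, name, inputs, outputs):
        self.name = name
        self.inputs = list(inputs)    # externally visible input ports
        self.outputs = list(outputs)  # externally visible output ports
        self.wiring = []              # hidden internal connections

    def connect(self, output_port, other, input_port):
        """Wire one of this corelet's outputs to another corelet's input."""
        assert output_port in self.outputs and input_port in other.inputs
        self.wiring.append((self.name, output_port, other.name, input_port))


def compose(name, parts, exposed_inputs, exposed_outputs):
    """Combine existing corelets into a larger one; only the chosen ports
    remain visible, everything else stays encapsulated."""
    composite = Corelet(name, exposed_inputs, exposed_outputs)
    composite.wiring = [w for part in parts for w in part.wiring]
    return composite


# Example: an edge detector feeding a motion classifier, packaged as one corelet.
edges = Corelet("edge_detector", ["pixels"], ["edges"])
motion = Corelet("motion_classifier", ["edges"], ["label"])
edges.connect("edges", motion, "edges")
vision = compose("vision_frontend", [edges, motion], ["pixels"], ["label"])
print(vision.name, vision.inputs, vision.outputs, len(vision.wiring))
```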

Modern computing systems were designed decades ago for sequential processing according to a pre-defined program. Although they are fast and precise "number crunchers," computers of traditional design become constrained by power and size, and operate at reduced effectiveness, when applied to real-time processing of the noisy, analog, voluminous Big Data produced by the world around us. In contrast, the brain, which operates comparatively slowly and at low precision, excels at tasks such as recognizing, interpreting, and acting upon patterns, while consuming the same amount of power as a 20-watt light bulb and occupying the volume of a two-liter bottle.

In August 2011, IBM successfully demonstrated a building block of a novel brain-inspired chip architecture based on a scalable, interconnected, configurable network of "neurosynaptic cores." Each core brings memory ("synapses"), processors ("neurons"), and communication ("axons") into close proximity, executing activity in an event-driven fashion. These chips serve as a platform for emulating and extending the brain's ability to respond to biological sensors and to analyze vast amounts of data from many sources at once.
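To picture that layout, the following sketch models a neurosynaptic core as co-located synapse memory, neuron state, and axon routing that only does work when a spike event arrives. The structure and field names are assumptions made for illustration, not the chip's actual design.

```python
# Illustrative sketch of an event-driven neurosynaptic core: synapse memory,
# neuron state, and outgoing axons live together, and computation happens only
# when a spike event arrives. Field names are assumptions, not the chip design.

from collections import deque

class NeurosynapticCore:
    def __init__(self, n_axons, n_neurons, threshold=2.0):
        # synapses: n_axons x n_neurons crossbar of weights ("memory")
        self.synapses = [[1.0] * n_neurons for _ in range(n_axons)]
        self.potentials = [0.0] * n_neurons   # neuron state ("neurons")
        self.threshold = threshold
        self.out_axons = deque()               # outgoing spikes ("axons")

    def deliver(self, axon):
        """Handle a single incoming spike event on one axon."""
        for j, w in enumerate(self.synapses[axon]):
            self.potentials[j] += w
            if self.potentials[j] >= self.threshold:
                self.potentials[j] = 0.0
                self.out_axons.append(j)       # spike travels on to other cores


# Event-driven loop: the core does nothing until spike events arrive.
core = NeurosynapticCore(n_axons=4, n_neurons=3)
for event in [0, 2, 1]:
    core.deliver(event)
print(list(core.out_axons))
```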

Having completed Phase 0, Phase 1, and Phase 2, IBM and its collaborators (Cornell University and iniLabs, Ltd) have recently been awarded approximately $12 million in new funding from the Defense Advanced Research Projects Agency (DARPA) for Phase 3 of the Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) project, thus bringing the cumulative funding to approximately $53 million.

IBM's long-term goal is to build a chip system with ten billion neurons and one hundred trillion synapses, while consuming merely one kilowatt of power and occupying less than two liters of volume.
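A quick back-of-the-envelope check shows what those targets imply per neuron: roughly ten thousand synapses and about 100 nanowatts each.

```python
# Back-of-the-envelope arithmetic for IBM's stated long-term target.
neurons  = 10e9     # ten billion neurons
synapses = 100e12   # one hundred trillion synapses
power_w  = 1000.0   # one kilowatt

print(synapses / neurons)        # 10000.0 synapses per neuron
print(power_w / neurons * 1e9)   # 100.0 nanowatts per neuron
```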

Systems built from these chips could bring the real-time capture and analysis of various types of data closer to the point of collection. They would gather not only symbolic data, which is fixed text or digital information, but also sub-symbolic data, which is sensory-based and whose values change continuously. This raw data reflects activity of every kind in the world around us, spanning commerce, social interaction, logistics, location, movement, and environmental conditions.

Take the human eyes, for example: they sift through over a terabyte of data per day. Emulating the visual cortex, low-power, lightweight eyeglasses designed to help the visually impaired could be outfitted with multiple video and auditory sensors that capture and analyze this optical flow of data.

These sensors would gather and interpret large volumes of data to signal how many individuals are ahead of the user, the distance to an upcoming curb, the number of vehicles in a given intersection, the height of a ceiling, or the length of a crosswalk. Like a guide dog, the glasses would use the sub-symbolic data they perceive to plot the safest pathway through a room or outdoor setting and help the user navigate the environment via embedded speakers or earbuds. The same technology, at increasing levels of scale, could provide sensory-based data input and on-board analytics for automobiles, medical imagers, healthcare devices, smartphones, cameras, and robots.

Tags: IBM