Technology giant IBM has taken the wraps off a prototype analog AI chip engineered to emulate the way the human brain works. The chip has demonstrated its ability to handle complex computations across a range of deep neural network (DNN) tasks.
Yet the chip’s potential extends beyond that. IBM asserts that it could significantly boost the efficiency of artificial intelligence while reducing the energy drain on computers and smartphones.
This innovation was published in a paper by IBM Research.
A paradigm shift in AI computation
Engineered at IBM’s Albany NanoTech Complex, the new AI chip consists of 64 analog in-memory compute cores. Drawing on fundamental principles of how neural networks operate in biological brains, IBM has embedded compact, time-based analog-to-digital converters in each tile, or core, allowing the chip to transition seamlessly between the analog and digital domains.
Each tile (or core) also incorporates lightweight digital processing units, responsible for executing elementary nonlinear neuronal activation functions and scaling operations. These innovations are elaborated upon in an IBM blog post published on August 10.
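To make the tile concept more concrete, here is a minimal, purely illustrative Python sketch of how such a tile could be modeled in software. It is not IBM’s design: the class name, the assumption of ideal analog-to-digital conversion, the choice of ReLU as the activation, and all parameters are hypothetical; only the broad idea, weights staying resident in the tile, an analog multiply-accumulate, then a lightweight digital scaling and activation step, comes from the description above.

```python
import numpy as np

class AnalogTile:
    """Illustrative model of one in-memory compute tile (hypothetical).

    Weights stay resident in the tile, so the matrix-vector product is
    performed where the weights live instead of being fetched from a
    separate memory for every operation.
    """

    def __init__(self, weights, scale=1.0):
        self.weights = np.asarray(weights, dtype=np.float32)  # stored in place
        self.scale = scale  # digital scaling factor applied after the analog step

    def forward(self, x):
        # Analog step (simulated): multiply-accumulate inside the tile.
        analog_out = self.weights @ x
        # A time-based ADC would digitize analog_out here; we assume ideal conversion.
        # Lightweight digital step: scaling plus a simple nonlinear activation (ReLU).
        return np.maximum(0.0, self.scale * analog_out)


# Usage: a toy 4-input, 3-output layer mapped onto one tile.
tile = AnalogTile(weights=np.random.randn(3, 4), scale=0.5)
print(tile.forward(np.array([1.0, -0.5, 0.25, 2.0], dtype=np.float32)))
```

The point of the sketch is simply that the weight matrix never leaves the tile; only the inputs and the digitized outputs move.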
Paving the way for chip evolution
IBM’s prototype chip holds the potential to replace the conventional chips that currently power resource-intensive AI applications in computers and mobile devices. According to the blog, “a global digital processing unit is integrated into the middle of the chip that implements more complex operations that are critical for the execution of certain types of neural networks.”
Given the influx of foundational models and generative AI tools entering the market, the performance and energy efficiency of the traditional computing methods they rely upon are approaching a critical juncture.
IBM seeks to bridge this gap. The company notes that in many contemporary chip designs, memory and processing units are kept physically separate, which creates computational bottlenecks. “This means the AI models are typically stored in a discrete memory location, and computational tasks require constantly shuffling data between the memory and processing units.”
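The following back-of-envelope Python sketch shows why that shuffling matters. The layer size, inference count, and one-byte weights are made-up illustrative numbers, and the model deliberately ignores caches and activation traffic; the only point taken from the article is that a conventional design re-fetches weights from memory while an in-memory design loads them once.

```python
def weight_traffic_conventional(n_out, n_in, n_inferences, bytes_per_weight=1):
    # Conventional flow: the weight matrix is re-fetched from a separate
    # memory for every inference (caches ignored for simplicity).
    return n_out * n_in * bytes_per_weight * n_inferences


def weight_traffic_in_memory(n_out, n_in, bytes_per_weight=1):
    # In-memory compute: weights are programmed into the tiles once and
    # stay there, so weight traffic is a one-time cost.
    return n_out * n_in * bytes_per_weight


# Hypothetical layer: 512 x 512 weights, 10,000 inferences, 1 byte per weight.
n_out, n_in, runs = 512, 512, 10_000
print("conventional:", weight_traffic_conventional(n_out, n_in, runs), "bytes of weight traffic")
print("in-memory:   ", weight_traffic_in_memory(n_out, n_in), "bytes of weight traffic")
```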
Thanos Vasilopoulos, a researcher at IBM’s Swiss research lab, compared the human brain with conventional computers, highlighting that the former “is able to achieve remarkable performance while consuming little power.”
This heightened energy efficiency in the IBM chip could enable the execution of “large and more complex workloads in low-power or battery-constrained environments,” such as automobiles, mobile phones, and cameras.
Vasilopoulos added that “cloud providers will be able to use these chips to reduce energy costs and their carbon footprint.”