Neuromorphic Computing and the Future of Artificial Intelligence


By David Lobina | 4Q 2021 | IN-6324


The Advent of Commercial Neuromorphic Chips


A specter is haunting Artificial Intelligence (AI): the specter of the environmental costs of Deep Learning (DL). As Neural Networks become ubiquitous in modern AI applications, the gains the industry has made by applying Deep Neural Networks (DNNs) to ever more complex problems increasingly come at a steep price. The computational power and data required to train state-of-the-art networks have grown exponentially. At the current pace of development, this translates into unsustainable power consumption and carbon emissions, and alternative methods will be needed before long. Recent advances in neuromorphic computing, in particular the commercialization of neuromorphic chips by industry giants such as Intel, with its recently released Loihi 2 boards and accompanying software, point to a future of ultra-low-power yet high-performance AI applications ideally suited to cloud and edge deployments.
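Chips in the Loihi family compute with spiking neurons rather than the dense matrix multiplications of conventional DNN accelerators, and this is where the potential power savings come from: a neuron that receives no input emits no spikes and, on event-driven hardware, does essentially no work. A minimal leaky integrate-and-fire (LIF) neuron, the textbook model behind such chips, can be sketched in a few lines of Python. All parameter values below are illustrative only and do not reflect Loihi 2 hardware:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: a sketch of the
# event-driven computation style behind spiking neuromorphic chips.
# Parameter values are illustrative, not taken from any real chip.
def lif_simulate(input_current, dt=1.0, tau=10.0, v_thresh=1.0, v_reset=0.0):
    """Simulate one LIF neuron; return the membrane trace and spike times."""
    v = 0.0
    trace, spikes = [], []
    for t, i_in in enumerate(input_current):
        # Leaky integration: the membrane voltage decays toward rest
        # and is driven upward by the input current.
        v += dt * (-v / tau + i_in)
        if v >= v_thresh:     # threshold crossing: emit a spike
            spikes.append(t)
            v = v_reset       # reset the membrane after spiking
        trace.append(v)
    return trace, spikes

# A constant drive strong enough to reach threshold yields periodic spikes;
# with no input, no spikes occur and (on hardware) almost no energy is spent.
_, spike_times = lif_simulate([0.15] * 100)
_, silent = lif_simulate([0.0] * 100)
```

The contrast with DNN accelerators is the core of the efficiency argument: a conventional accelerator performs dense multiply-accumulates on every input regardless of content, whereas a spiking substrate only expends energy when spikes actually occur.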

Ameliorating the Environmental Costs of Deep Learning


Neuromorphic comp…
