Neural Architecture Search Brings Power Efficiency and Optimized Performance to Deep Learning at the Edge and in the Cloud

By David Lobina | 3Q 2022 | IN-6597


Deci and the Push toward Accelerated Deep Learning


The deployment of Artificial Intelligence (AI), mostly in the form of Machine Learning (ML), and more specifically of Deep Learning (DL) algorithms, is nearly ubiquitous these days, especially in the cloud, where both training and inference can be carried out, and increasingly at the edge, currently mostly for inference. Despite the widespread use of ML, two issues pose particular challenges to commercially viable AI. One is that deploying ML algorithms does not always result in a clear Return on Investment (ROI); Intel, for instance, reports that only 20% of vendors employing AI see an ROI. The second issue, which may partly explain the first, is that model training demands ever greater quantities of both data and energy (bigger models tend to perform better, and models keep growing larger), and this is not always viable. Hence the current importance of accelerators across all aspects of the AI cycle, with one particularly interesting example to be found in Deci, an Israeli startup that offers an end-t…
