Brownfield AI Deployment Requires a Different Approach
The Artificial Intelligence (AI) industry is currently debating the merits of dedicated AI hardware for inference workloads at the edge, namely on devices, gateways, and on-premises servers. A previous ABI Insight, System-on-Chip versus Discrete AI Chipset: Bringing Artificial Intelligence Beyond Mobile Devices (IN-5694), thoroughly explored the benefits of dedicated AI hardware: it minimizes resource overhead and operational complexity while improving cost efficiency. This is particularly important for AI devices moving from the Research and Development (R&D) phase to large-scale deployment. Most AI workloads today are narrow AI, focused on a single task, and do not require a juggernaut-type chipset offering a wide range of capabilities.
However, these benefits apply only to greenfield deployment. Brownfield AI deployment is a very different scenario. In brownfield environments, many legacy devices already feature decent computational capabilities, such as embedded Microcontroller (M…