Make Distributed Intelligence Deployment Scalable

by Lian Jye Su | 4Q 2020 | IN-5930
Traditionally, centralizing Artificial Intelligence (AI) workloads in the cloud has brought the benefits of flexibility and scalability. The industry, however, has witnessed a shift in the AI paradigm: edge AI brings task automation and augmentation to the device and sensor levels across various sectors. Making these deployments as scalable and cost-effective as possible remains one of the biggest challenges confronting all stakeholders. This ABI Insight discusses three industry responses to the challenge: partnering with an ecosystem that offers a scalable business model, creating an open standard for edge AI hardware, and leveraging open source projects.
