As Artificial Intelligence (AI) moves to the edge, edge AI chipsets become increasingly important. Edge AI chipsets are computational chipsets focused on AI workloads that are typically deployed in edge environments, including end devices, gateways, and on-premise servers. These chipsets are generally designed for AI inference workloads, though in some cases they can also support some level of AI training, particularly the training of deep learning models.
Overall, ABI Research estimates that annual global edge AI chipset revenue for 2018 was US$10.6 billion. The market has experienced strong growth and is expected to reach US$71 billion by 2024, a CAGR of 31% between 2019 and 2024. This strong growth is propelled by the migration of AI inference workloads to the edge, particularly in the smartphone, smart home, automotive, wearables, and robotics industries.
This report explores the dynamic edge AI landscape. By examining chipset architectures, their respective computational requirements, and their use cases, the report provides a holistic view of the current state and future trends of the edge AI chipset market. The main players in the edge AI chipset industry are also profiled, with their key capabilities highlighted.
In addition, the report examines current developments in open-source chipsets. Building on RISC-V, open-source chipset startups have begun developing AI-dedicated chipsets with highly parallel computing capabilities. Thanks to participation and contributions from across the industry, open-source AI chipsets are likely to align more closely with market requirements and expectations, significantly reducing both the cost of errors and the development costs of product maintenance and upgrades.