New Generation Distributed and Edge Computing Hardware and Software to Advance Edge AI


By Yih-Khai Wong | 1Q 2023 | IN-6825

Traditional Artificial Intelligence (AI) requires large amounts of data, as well as powerful, centralized data processing infrastructure. Over the years, the industry has seen the adoption of edge AI, where AI algorithms and models are processed on edge devices without relying on a centralized processing environment. This ABI Insight discusses the emergence of new-generation distributed and edge computing hardware and software that look to further accelerate the growth of edge AI.


AI an Important Area of Growth for Semiconductor Producers and Technology Vendors

NEWS


Edge Artificial Intelligence (AI) adoption is accelerating, driven by technological advances such as Neural Processing Units (NPUs), Graphics Processing Units (GPUs), and pre-built AI toolkits, as well as improved AI learning models, such as Deep Learning (DL).

In January 2023, Dell and NVIDIA, both key players in edge AI, launched a suite of solutions built on Dell’s PowerEdge servers and accelerated by the full NVIDIA AI stack, including GPUs, Data Processing Units (DPUs), and NVIDIA’s AI Enterprise software suite. The partnership aims to help businesses accelerate automation across every industry by building AI-first systems, leveraging the two companies’ years of expertise in their respective fields.

During CES 2023, AMD announced its strategy for enabling pervasive AI, introducing innovations such as AI accelerators with industry-leading performance and energy efficiency for inference across multiple AI workloads, as well as an integrated data center Central Processing Unit (CPU) and DPU designed specifically for High-Performance Computing (HPC) and AI workloads. AMD’s 2022 acquisition of Xilinx further underscores the company’s ambitions in the AI computing space.

New Innovations in Distributed and Edge Computing a Boon for Advancement of Edge AI

IMPACT


While traditional AI algorithms and models are trained and processed in centralized data centers or on cloud platforms, edge AI refers to AI models designed to run in resource-constrained environments, such as edge servers and gateways, autonomous vehicles, sensor devices, drones, and smartphones.

Demand for stronger cybersecurity and compliance with data residency regulations also fuels the growth of edge AI. By keeping and processing data locally at the source, rather than transporting it to a centralized location, businesses can regulate the flow of data and reduce exposure to cyberattacks. For industries like financial services, government, and healthcare, edge AI can help ensure compliance with strict data residency laws by providing transparency into exactly when, where, and how data are processed and stored.

Several recent advancements have improved the speed and efficiency of edge AI processing, including:

  • Energy-Efficient Processing Chips: Energy-efficient chips, such as NPUs, consume far less power and are optimized for AI workloads with faster processing times, enabling edge AI devices to perform real-time tasks. They also produce less heat, reducing the risk of thermal issues that can degrade processing performance. Examples of NPUs include Google’s Tensor Processing Unit (TPU), Arm’s Ethos-N series, Qualcomm’s Snapdragon Neural Processing Engine, and Graphcore’s Intelligence Processing Units (IPUs).
  • Edge AI Toolkits/Platforms: Pre-built hardware and software AI toolkits/platforms make it easier to develop, deploy, and manage AI solutions at the edge; a short illustrative inference sketch follows this list. One example is NVIDIA’s Jetson AGX Xavier series, which enables AI inferencing on edge devices, particularly autonomous machines such as delivery and logistics robots.
  • Thermal-Efficient Edge AI Gateways: A thermal-efficient edge AI gateway balances processing power, thermal efficiency, and form factor in environments where heat dissipation and power consumption are critical considerations. Intel’s NUC series consists of compact Mini Personal Computer (PC) kits that use low-power processors and can serve as edge AI gateways, while Advantech’s UTC-520 is a compact, fanless edge AI gateway designed for commercial and industrial applications.
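
To illustrate what running a model locally with a pre-built toolkit looks like in practice, the minimal sketch below uses the open-source ONNX Runtime as a stand-in for any edge inference runtime; the model file name, input shape, and synthetic input are hypothetical placeholders and are not tied to any specific vendor platform mentioned above.

```python
# Minimal, illustrative edge inference sketch (assumptions: ONNX Runtime is
# installed and "edge_model.onnx" is a placeholder for a pre-trained,
# edge-optimized model).
import numpy as np
import onnxruntime as ort

# Load the model for local, CPU-only execution on the edge device.
session = ort.InferenceSession("edge_model.onnx", providers=["CPUExecutionProvider"])

# Query the model's expected input so preprocessing can match it.
input_meta = session.get_inputs()[0]

# Stand-in for a preprocessed camera frame (hypothetical 1x3x224x224 shape).
frame = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Run inference on the device itself; no data leaves the edge node.
outputs = session.run(None, {input_meta.name: frame})
print([o.shape for o in outputs])
```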

Accelerating Business Transformation through Edge AI

RECOMMENDATIONS


A wide range of industries stands to benefit from deploying AI on the edge. Some of these include:

  • Mining: Mining activities are often carried out in areas where connectivity is a challenge. AI-powered data analysis and digital twins can be used to ensure safe, sustainable, and efficient mining operations. Skycatch, a San Francisco-based startup and NVIDIA partner, uses NVIDIA’s Jetson platform at the edge to process data from drones and other sensors for mining pipeline visualization.
  • Transportation: The ability to process data in real time gives edge computing an “edge” in autonomous vehicle technology. Autonomous vehicles from Tesla and Google’s Waymo, as well as Nuro’s autonomous delivery robots, rely on AI algorithms deployed at the edge to provide a complete, multi-layered view of the surrounding environment.
  • Retail/Food & Beverage (F&B): Retailers and F&B chains often operate distributed locations where data from certain applications or workloads require real-time analytics. Wendy’s, a quick-service restaurant chain, is working with Google’s Anthos to explore edge AI use cases, primarily AI-enabled voice and computer vision deployed within its restaurants.

According to ABI Research’s Artificial Intelligence and Machine Learning for Distributed and Edge Computing report, worldwide shipments of on-premises and edge cloud AI servers are expected to grow at a Compound Annual Growth Rate (CAGR) of 56% from 2023 to 2028, while the installed base is expected to grow at a CAGR of 63% over the same period.
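
For context, the cumulative growth implied by these figures can be worked out directly from the CAGR formula; the short sketch below assumes a five-year compounding window (2023 to 2028) and uses only the growth rates stated above.

```python
# Worked example: cumulative growth implied by the reported CAGRs
# (assumption: 2023-2028 is treated as five years of compounding).
def cumulative_multiplier(cagr: float, years: int) -> float:
    """Total growth factor implied by a compound annual growth rate."""
    return (1 + cagr) ** years

shipments_x = cumulative_multiplier(0.56, 5)       # ~9.2x growth in shipments
installed_base_x = cumulative_multiplier(0.63, 5)  # ~11.5x growth in installed base
print(f"Shipments: ~{shipments_x:.1f}x; installed base: ~{installed_base_x:.1f}x")
```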

The edge AI market is expected to see continued growth, supported by advancements in distributed and edge computing. Businesses will look to edge AI to gain an advantage over the competition and to deliver a superior customer experience, both of which play an important role in business transformation.

 
