Facing Enormous Workloads, Public Clouds Turn to SmartNICs, DPUs, and IPUs

This Research Highlight covers the evolution of NICs to SmartNICs, introduces DPUs and IPUs, and lists some of the manufacturers of intelligent accelerators in the market.

Market Overview

Modern data centers are evolving to handle enormous workloads from cutting-edge technologies such as Artificial Intelligence (AI), Machine Learning (ML), the Internet of Things (IoT), and distributed ledger technology. These workloads demand significant computing resources, often with strict bandwidth and latency requirements, and can strain the Central Processing Unit (CPU), resulting in less-than-ideal performance.

To relieve this strain, semiconductor manufacturers have introduced three classes of intelligent accelerator that offload workloads and improve data center performance:

  1. Data Processing Units (DPUs): Specialized programmable processors designed to accelerate data processing and analytical workloads.
  2. Infrastructure Processing Units (IPUs): Introduced by Intel, designed to accelerate system-level resources and allow hyperscalers to leverage virtualized storage and network architecture.
  3. SmartNICs: An evolution of the Network Interface Card (NIC), SmartNICs are intelligent accelerators that can be programmed to run communication tasks more efficiently than server CPUs.

This Research Highlight covers the evolution of NICs to SmartNICs, introduces DPUs/IPUs, and lists some of the manufacturers of intelligent accelerators in the market.

“Cloud hyperscalers are processing highly specific, compute-intensive workloads that often require ultra-low latency and high-bandwidth networking. The rise of 5G wireless networks, AI/ML, IoT, etc. has forced data centers to evolve to be able to cope with the new wave of data flowing into the platform.” – Yih-Khai Wong, Senior Analyst at ABI Research

Key Decision Items

Define the Benefits of Intelligent Accelerators

The rise of cloud and edge computing, together with the increasing amount of data generated by businesses, is creating opportunities to use SmartNICs, DPUs, and IPUs as offloading engines for compute-intensive data processing tasks. The SmartNIC, DPU, and IPU market is growing rapidly, driven by increasing demand for high-performance, intelligent processing power.

Cloud hyperscalers, large telecommunications operators, and data center infrastructure providers are the main consumers of intelligent accelerators like SmartNICs, DPUs, and IPUs. Some of the benefits of implementing these accelerators include the following:

  • Network Acceleration: Reduce the network load from the host CPU, accelerating network functions and improving network performance.
  • Storage Virtualization: Act as storage controllers, communicating directly with the server storage and offloading storage data flow from the host CPU.
  • Security and Data Encryption: Accelerate data center servers by offloading data encryption from the host CPU to the SmartNIC, DPU, or IPU.

Understand the Evolution to SmartNICs

A NIC is a physical hardware component that connects a computer to a network. Within the Transmission Control Protocol/Internet Protocol (TCP/IP) stack, the NIC transmits signals at the physical layer and delivers data packets up to the network layer.

SmartNICs, by contrast, are programmable NICs. Whereas a standard NIC is essentially plug-and-play hardware with fixed functionality, a SmartNIC can be programmed to offload network, storage, and security functions from the host CPU, freeing the CPU to process application-specific data more efficiently. Cloud hyperscalers are the biggest customers of SmartNICs, with companies like Amazon Web Services (AWS), Microsoft Azure, Alibaba, and Google Cloud leveraging the technology.
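
To make the offload model concrete, the minimal Python sketch below models the division of labor just described. All class and function names are hypothetical and belong to no vendor SDK; real SmartNICs are typically programmed with frameworks such as DPDK, eBPF/XDP, or P4 rather than Python, so treat this purely as an illustration of which tasks move off the host CPU.

```python
# Conceptual sketch only: the SmartNIC absorbs switching and crypto work so the
# host CPU keeps its cycles for the application. All names are hypothetical.

from dataclasses import dataclass, field


@dataclass
class Packet:
    payload: bytes
    switched: bool = False
    encrypted: bool = False


@dataclass
class SmartNIC:
    """Programmable NIC: offloads network, storage, and security functions."""
    offloaded: list = field(default_factory=list)

    def virtual_switch(self, pkt: Packet) -> Packet:
        pkt.switched = True                 # stands in for an on-NIC vSwitch datapath
        self.offloaded.append("switching")
        return pkt

    def encrypt(self, pkt: Packet) -> Packet:
        pkt.encrypted = True                # stands in for inline IPsec/TLS engines
        self.offloaded.append("crypto")
        return pkt


class HostCPU:
    """With I/O plumbing handled by the NIC, the CPU runs only application logic."""

    def run_application(self, pkt: Packet) -> str:
        return (f"processed {len(pkt.payload)} bytes "
                f"(switched={pkt.switched}, encrypted={pkt.encrypted})")


if __name__ == "__main__":
    nic, cpu = SmartNIC(), HostCPU()
    pkt = nic.encrypt(nic.virtual_switch(Packet(b"application data")))
    print(cpu.run_application(pkt))
    print("offloaded to SmartNIC:", nic.offloaded)
```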

SmartNICs are commonly used to perform various networking functions, such as the following:

  • Load balancing and virtual switches
  • Network security orchestration
  • Security isolation for multi-tenanted servers
  • Storage and data access processing
  • Offloading virtualization tasks
  • Low-latency packet processing required by certain workloads, such as AI

Assess the Challenges Associated with SmartNICs

Currently, the cost of deploying SmartNICs can be prohibitively high for private enterprise data centers due to the high production cost and the need for skilled resources to program the device. Public cloud hyperscalers have a more compelling need for SmartNICs due to the sheer infrastructure scale and financial resources available to these players.

SmartNICs are also difficult to program and consume considerable power, so strong engineering resources are needed to maximize the Return on Investment (ROI). This can be a challenge for private enterprises, which lack the infrastructure, skilled resources, and financial capabilities of a Tier One cloud hyperscaler.

Enterprise adoption of SmartNIC solutions remains low. This represents a largely untapped opportunity for SmartNIC vendors, but the question remains: how can vendors address the high up-front cost, high power consumption, and programming difficulty of SmartNICs to reduce the Total Cost of Ownership (TCO) for customers?
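
As a rough illustration of the TCO question, the back-of-the-envelope Python sketch below compares a small enterprise fleet with a hyperscale fleet. Every figure in it is a hypothetical placeholder rather than ABI Research data; the point is simply that one-off engineering costs and power overheads are amortized far more easily at hyperscale.

```python
# Back-of-the-envelope sketch only: every figure below is a hypothetical
# placeholder, not measured or published data. It illustrates why offload
# savings must outweigh up-front cost, added power draw, and engineering effort.

SMARTNIC_UNIT_COST = 2_000      # USD per card (hypothetical)
ENGINEERING_COST = 50_000       # USD of one-off programming effort (hypothetical)
EXTRA_POWER_W = 75              # added draw per card, watts (hypothetical)
POWER_PRICE_KWH = 0.12          # USD per kWh (hypothetical)
CORE_VALUE_PER_YEAR = 300       # USD value of one freed host core-year (hypothetical)
CORES_FREED_PER_CARD = 8        # cores returned to applications (hypothetical)
HOURS_PER_YEAR = 8_760


def payback_years(num_cards: int) -> float:
    """Years until cumulative savings exceed up-front and running costs."""
    upfront = num_cards * SMARTNIC_UNIT_COST + ENGINEERING_COST
    yearly_power = num_cards * EXTRA_POWER_W / 1_000 * HOURS_PER_YEAR * POWER_PRICE_KWH
    yearly_savings = num_cards * CORES_FREED_PER_CARD * CORE_VALUE_PER_YEAR
    return upfront / (yearly_savings - yearly_power)


if __name__ == "__main__":
    for fleet in (10, 1_000):   # small enterprise fleet vs. hyperscale fleet
        print(f"{fleet:>5} cards -> payback in {payback_years(fleet):.1f} years")
```

With these placeholder numbers, the large fleet pays back in under a year, while the small fleet takes roughly three years, mirroring the scale argument above.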

Gauge the Capabilities of DPUs

DPUs are specialized programmable processors designed to accelerate data processing and analytical workloads. These accelerators take over networking and data processing work transferred from the server CPU.

DPUs take care of data-centric tasks at scale, combining dozens of processor cores with hardware accelerator blocks and a high-performance network interface. DPUs ensure that the right data arrive promptly and in the proper format. For now, and for the foreseeable future, DPUs are deployed mainly in the large data centers used by cloud hyperscalers and major Communication Service Providers (CSPs).

Some commercial use cases for DPUs include:

  • The training process for AI and ML algorithms involves huge datasets and can be very compute-intensive. DPUs can be used to offload network and storage data movement, leaving the CPU to focus on AI/ML-related tasks.
  • DPUs can also be leveraged in a security or surveillance scenario to process data from device sensors and camera images to detect anomalies or threats. DPUs allow data to be analyzed in real time, especially when hosted in an edge computing environment.
  • From an IoT perspective, DPUs can be used to process data generated by sensors and smart devices. The collected data can be filtered, sorted, and analyzed by the DPU to provide actionable results and outcomes, as sketched below.
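
As an illustration of the IoT use case, the short Python sketch below models a DPU stage that filters and sorts raw sensor readings before handing a summary to the host CPU. The sensor names, thresholds, and readings are hypothetical; production DPUs are programmed through vendor-specific SDKs and frameworks, not plain Python.

```python
# Illustrative sketch only: a DPU stage cleans and ranks raw telemetry so the
# host CPU only ever sees aggregated, pre-filtered data. All values hypothetical.

from statistics import mean

RAW_READINGS = [  # hypothetical temperature telemetry from edge sensors
    {"sensor": "cam-01", "temp_c": 21.4},
    {"sensor": "cam-02", "temp_c": 88.9},   # anomalous reading
    {"sensor": "cam-03", "temp_c": 22.1},
    {"sensor": "cam-04", "temp_c": None},   # corrupt sample to be dropped
]


def dpu_stage(readings, anomaly_threshold_c=60.0):
    """Runs on the DPU: drop corrupt samples, flag anomalies, sort by severity."""
    clean = [r for r in readings if r["temp_c"] is not None]
    for r in clean:
        r["anomaly"] = r["temp_c"] >= anomaly_threshold_c
    return sorted(clean, key=lambda r: r["temp_c"], reverse=True)


def host_stage(filtered):
    """Runs on the host CPU: only the filtered, summarized data arrives here."""
    alerts = [r["sensor"] for r in filtered if r["anomaly"]]
    return {"mean_temp_c": round(mean(r["temp_c"] for r in filtered), 1),
            "alerts": alerts}


if __name__ == "__main__":
    print(host_stage(dpu_stage(RAW_READINGS)))
    # {'mean_temp_c': 44.1, 'alerts': ['cam-02']}
```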

Determine How IPUs Can Help Processing-Intensive Data Centers

IPUs are programmable hardware networking devices that manage system-level resources by securely accelerating and optimizing networking and storage infrastructure functions in a data center. IPUs are designed for cloud hyperscalers and CSPs to reduce overhead and improve core server CPU performance.

Introduced in 2021, Intel’s IPU extends its SmartNIC capabilities and is designed to address the increasing complexity and inefficiency in the modern data center caused by the large amounts of data flowing through it. The IPU is built on a microservice-based architecture, with dedicated functionality for accelerating modern applications and workloads.

IPUs offer the possibility of:

  • Accelerating infrastructure functions, such as storage virtualization, network virtualization, and security functions with dedicated protocol accelerators.
  • Freeing up server CPU cores by offloading storage and network virtualization functions previously executed on the server CPU to the IPU.
  • Allowing flexible workload placement and, ultimately, improving data center optimization and utilization rates (see the sketch after this list).
  • Enabling cloud hyperscalers to customize infrastructure function deployments based on customer requirements.
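
The simplified Python sketch below illustrates the workload-placement benefit: moving infrastructure functions from host CPU cores to the IPU returns those cores to tenant applications. The core counts and per-function costs are hypothetical placeholders, not measured figures, and real deployments rely on vendor tooling rather than a model like this.

```python
# Simplified sketch only: models how offloading infrastructure functions to an
# IPU frees host CPU cores for applications. All numbers are hypothetical.

HOST_CORES = 64  # hypothetical server

# Hypothetical per-function core cost when run on the host CPU.
INFRA_FUNCTIONS = {
    "network_virtualization": 8,
    "storage_virtualization": 6,
    "security_functions": 4,
}


def place_workloads(offload_to_ipu: bool) -> dict:
    """Return how many host cores remain for tenant/application workloads."""
    infra_on_host = 0 if offload_to_ipu else sum(INFRA_FUNCTIONS.values())
    return {
        "infra_cores_on_host": infra_on_host,
        "cores_for_applications": HOST_CORES - infra_on_host,
        "runs_on_ipu": list(INFRA_FUNCTIONS) if offload_to_ipu else [],
    }


if __name__ == "__main__":
    print("without IPU:", place_workloads(offload_to_ipu=False))
    print("with IPU:   ", place_workloads(offload_to_ipu=True))
```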

Key Market Players to Watch

Dig Deeper for the Full Picture

This resource provides a solid overview of the accelerator technologies that can help data centers handle unprecedented demand for public cloud applications. However, some aspects are not covered here, including the following:

  • Identifying the opportunities and challenges within the SmartNIC, DPU, and IPU market.
  • Assessing the capabilities of competing companies if you’re a vendor.
  • Understanding the ongoing trends shaping intelligent accelerator adoption in data centers.

To acquire all this vital information, download ABI Research’s SmartNICs, DPUs, and IPUs: Intelligent Accelerators Driving Hyperscale Infrastructure and High-Performance Computing research report.

This content is part of the company’s Distributed & Edge Computing Research Service.