AI Development Software from Chipset Suppliers Critical to the Next Phase of AI Democratization

1Q 2023 | IN-6848

Artificial Intelligence (AI) development software is at the heart of AI democratization. Popular AI development platforms from public cloud and pure-play AI software vendors allow developers to build, train, and deploy AI models on their desired AI infrastructure. However, even with the right tools in place, AI development and deployment is becoming increasingly demanding and complicated. ABI Research believes that AI development software from chipset suppliers is critical to the next phase of AI democratization.

New AI Applications Create New Challenges

NEWS


Artificial Intelligence (AI) development software is at the heart of AI democratization. Popular AI development platforms from public cloud and pure-play AI software vendors allow developers to build, train, and deploy AI models on their desired AI infrastructure. Generally, they offer the following benefits to facilitate and accelerate AI design, development, and deployment:

  • Databases and data lakes to enable the capture, storage, and processing of data for AI training and testing, as well as effective data governance.
  • Large model zoos equipped with popular pre-trained models for various AI use cases, such as computer vision, conversational AI, and highly personalized recommendation systems.
  • Managed instances of popular AI frameworks, such as TensorFlow, PyTorch, and MXNet, with which developers can create their own Machine Learning (ML) models from scratch (see the sketch after this list).
  • Integration with other public cloud services and cloud AI hardware.
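As a rough illustration of how the second and third points fit together, the sketch below pulls a pre-trained computer vision model from a public model zoo (torchvision) and fine-tunes it with PyTorch. The model choice, class count, and hyperparameters are illustrative, not tied to any particular vendor platform.

```python
# Minimal sketch: pull a pre-trained computer vision model from a public
# model zoo (torchvision) and adapt it to a new task with PyTorch. The
# model, class count, and hyperparameters are illustrative.
import torch
import torch.nn as nn
from torchvision import models

# Load a ResNet-18 with pre-trained ImageNet weights from the model zoo.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Swap the final classification layer for a hypothetical 10-class task.
model.fc = nn.Linear(model.fc.in_features, 10)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

def fine_tune(dataloader, epochs=3):
    """Standard fine-tuning loop over a DataLoader of (images, labels)."""
    model.train()
    for _ in range(epochs):
        for images, labels in dataloader:
            optimizer.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            optimizer.step()
```

On a managed platform, the data pipelines, hardware placement, and experiment tracking around a loop like this would typically be handled by the service itself.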

To accommodate the widespread adoption of AI, these vendors are also adding new features to their solutions. These include AI model optimization tools that recommend and tune the right AI model for target environments, and AI model explainability tools that detect and resolve bias, drift, and other gaps in data and models. All of these capabilities are introduced with the aim of simplifying the AI development process.
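To make the optimization step concrete, the sketch below applies post-training dynamic quantization in PyTorch, one common technique such tools automate when tuning a model for a resource-constrained target environment. The model is a generic stand-in rather than any vendor's example.

```python
# Minimal sketch of post-training dynamic quantization in PyTorch. The
# toy model below is a stand-in, not any vendor's tooling or workload.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10))

# Convert the Linear layers to INT8 dynamic quantization to shrink the
# model and speed up CPU inference.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 512)
print(quantized(x).shape)  # torch.Size([1, 10])
```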

However, even with the right tools in place, AI development and deployment is becoming increasingly demanding and complicated. The industry is entering the next phase of AI development, as enterprises push for more accurate computer vision, highly personalized recommendation systems, and more natural interactions with conversational AI. In response, developers are experimenting with a myriad of new deep learning techniques. These techniques, such as large language models, reinforcement learning, multimodal learning, and graph neural networks, require the support of not only robust software, but also high-performance AI hardware infrastructure. As a result, developers often need custom hardware and software solutions to meet performance expectations.
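A rough, back-of-the-envelope calculation shows why these techniques lean so heavily on hardware; the parameter counts below are generic examples rather than figures for any specific model.

```python
# Sketch of why large models strain hardware: memory for the weights alone,
# ignoring activations, optimizer state, and KV caches. Parameter counts
# are illustrative.
def weight_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Memory needed just to hold the weights (FP16 = 2 bytes/param)."""
    return num_params * bytes_per_param / 1e9

for params in (7e9, 70e9, 175e9):
    print(f"{params/1e9:.0f}B params -> ~{weight_memory_gb(params):.0f} GB in FP16")
# 7B -> ~14 GB, 70B -> ~140 GB, 175B -> ~350 GB: well beyond a single
# commodity accelerator, which is why such workloads demand specialized,
# well-integrated hardware and software.
```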

In pursuing these custom solutions, developers face several key challenges:

  • Poor optimization between AI development software and the target chipset and hardware.
  • Constant need to learn new AI techniques and tools.
  • Complicated and unfamiliar tools that slow down the development process and time-to-market.
  • Specific demands of industry-specific applications, such as conversational AI and digital avatars in customer care, autonomous navigation in automotive and robotics, and medical diagnosis and drug synthesis in healthcare and pharmaceuticals.

AI Development Benefits from Hardware Integration and Optimization

IMPACT


Traditionally, AI chipset vendors have preferred to let public cloud and pure-play AI software vendors take the lead in working with AI developers, focusing instead on working behind the scenes to upgrade and optimize their AI chipsets. In recent years, however, AI chipset vendors have allocated much of their time and resources to developing industry-specific software to complement the hardware. This shift indicates that AI chipset vendors realize how important it is to simplify AI development processes if they are to remain competitive in the AI market. Having high-performance AI hardware is no longer sufficient to democratize AI development and adoption.

One area where AI chipset vendors are making a huge impact is the integration, optimization, and execution of AI models across heterogeneous AI hardware. Intel, widely considered the most industrious chipset vendor in the open-source community, offers software solutions such as OpenVINO, oneAPI, and Geti that remove much of the complexity of working with heterogeneous hardware. Mobile System on a Chip (SoC) giant Qualcomm has also launched the Qualcomm AI Stack, a collection of AI libraries, tools, and frameworks that enables developers to create AI applications for a wide range of Qualcomm products, such as Arm-based Snapdragon chips and its data center inference chipset, the Cloud AI 100.
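As an illustration of the kind of hardware abstraction involved, the minimal sketch below loads an ONNX model with the OpenVINO Python runtime and lets it select an execution device automatically; the model file name and input handling are illustrative assumptions.

```python
# Minimal sketch, assuming the OpenVINO Python runtime (openvino >= 2022.1)
# is installed and a static-shape ONNX model file is available locally.
# The file name, dummy input, and device choice are illustrative.
import numpy as np
from openvino.runtime import Core

core = Core()

# Read a framework-agnostic model (ONNX here) and let OpenVINO choose the
# best available device (CPU, integrated GPU, etc.) via the "AUTO" plugin.
model = core.read_model("model.onnx")
compiled = core.compile_model(model, device_name="AUTO")

# Build a dummy input matching the model's first input shape and run it.
input_shape = list(compiled.input(0).shape)
dummy = np.random.rand(*input_shape).astype(np.float32)
result = compiled([dummy])[compiled.output(0)]
print(result.shape)
```

In principle, the same script runs unchanged on whichever supported device OpenVINO selects, which is precisely the kind of portability these toolchains aim to deliver.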

Such software not only streamlines pipeline workloads, but also reduces the power consumption, bandwidth needs, and associated operating costs of running AI applications. Optimized, well-integrated code can, therefore, save developers a significant amount of money during commercialization.

At the same time, AI chipset vendors are supporting AI democratization through industry-specific AI solutions. NVIDIA's NeMo Large Language Model (LLM) cloud service, announced at GTC Autumn 2022, enables developers to design, train, and deploy large language models for automatic speech recognition, natural language processing, and text-to-speech synthesis. Taking things further, NVIDIA is delivering tailored solutions like BioNeMo, a large, pre-trained language model offering that speeds up drug discovery and protein sequencing. By providing software geared toward specific use cases and industries, these vendors help customers reduce time to market.
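For a sense of the developer experience, the minimal sketch below uses the open-source NeMo toolkit (rather than the managed NeMo LLM cloud service) to run a published pre-trained speech recognition checkpoint; the checkpoint name and audio file path are illustrative.

```python
# Minimal sketch using the open-source NeMo toolkit (not the managed NeMo
# LLM cloud service); assumes `nemo_toolkit[asr]` is installed. The
# checkpoint name and audio file path are illustrative.
import nemo.collections.asr as nemo_asr

# Download a published pre-trained English ASR checkpoint from NVIDIA's
# catalog and transcribe a local 16 kHz WAV file.
asr_model = nemo_asr.models.EncDecCTCModel.from_pretrained(
    model_name="QuartzNet15x5Base-En"
)
transcripts = asr_model.transcribe(["sample.wav"])
print(transcripts[0])
```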

The Shift in Business Model

RECOMMENDATIONS


Highly optimized, easy-to-use AI software is now key to the overall user experience and has become pivotal in AI chipset vendors' efforts to generate ecosystem stickiness with their user bases. This investment in AI software allows AI chipset companies to orient their businesses around AI, leveraging the strengths of their hardware technology to adequately address the computational demands of AI models.

These software offerings from AI chipset vendors are interoperable, hardware-agnostic, and easy to use, and they allow developers and data scientists to devote more time and resources to designing, testing, and deploying their AI models and applications. To further improve the user experience for AI developers, these companies are also building strong alliances with device Original Equipment Manufacturers (OEMs), Independent Software Vendors (ISVs), and System Integrators (SIs).

Finally, the most significant impact of this development is the change in business model. Rather than focusing on AI chipsets and hardware, chipset companies are shifting their focus to Software-as-a-Service (SaaS). What we are seeing is the demise of the "one and done" hardware purchase revenue model and the rise of subscription-based access to AI software platforms. While offering the combination of AI hardware and software may lead to direct competition with OEMs and ISVs, the benefits introduced through integration and optimization cannot simply be overlooked. AI chipset vendors are expected to double down on offering and monetizing more software solutions as AI continues to become more ubiquitous and complex.

 
