
AI & Machine Learning

Our Artificial Intelligence & Machine Learning coverage assesses and maps the value proposition offered by these technology implementations. Our research examines the various AI and ML business models, including platform-as-a-service, technology-as-a-service, software licensing, and edge device applications. We aim to provide technology implementers with insight into how these technologies are shaping new applications and business models.

Featured Research

Artificial Intelligence and Machine Learning

This report delivers a quantitative assessment of Machine Learning (ML) usage across numerous consumer, commercial, and industrial device markets. The assessment was informed and further refined by a qualitative analysis of the technological, business, and political drivers and constraints impacting the use of Artificial Intelligence (AI) technologies.

Detailed shipments and segmentation are provided in product-based Market Data (MD) research deliverables spanning automotive, mobile devices, wearables, smart home, robotics, drones, manufacturing, retail, video systems, buildings, and energy. Device categories, segmentation, and annual shipment volumes used in this Artificial Intelligence and Machine Learning MD are derived from existing device and product data sets.


Reports & Data


Executive Foresights

Co-Inference: An Artificial Intelligence Technique that 5G Will Unlock

4Q 2018

A team of researchers from Sun Yat-sen University in China has developed a new technique for Artificial Intelligence (AI) inference that spreads the work across both the edge and the cloud; the researchers call this technique “co-inference.” A combination of 5G and co-inference could massively improve flexibility in managing inference on devices. The researchers present an approach that marries the edge and the cloud in a framework called Edgenet, a deep-learning co-inference model. Co-inference relies on Deep Neural Network (DNN) partitioning: the layers of a DNN are adaptively split between the edge device and the cloud according to the bandwidth and compute available at each. The critical task is identifying the most computationally intensive layers of the DNN and running the inference of those layers in the cloud. Done correctly, this reduces latency while sending as little data to the cloud as possible.
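The sketch below illustrates the core partitioning decision in Python: given per-layer compute times on the device and in the cloud, the size of each layer's output, and the available uplink bandwidth, pick the split point that minimizes end-to-end latency. The layer profiles, bandwidth figures, and function and variable names are illustrative assumptions, not values from the Sun Yat-sen University work.

```python
def best_split(edge_ms, cloud_ms, out_kb, input_kb, uplink_kbps):
    """Choose the layer index k at which to hand inference off to the cloud.

    Layers [0, k) run on the edge device, layers [k, n) run in the cloud.
    Whatever crosses the split (the raw input when k == 0, otherwise the
    activation produced by layer k-1) must be uploaded, so total latency is
    edge compute + upload time + cloud compute.
    """
    n = len(edge_ms)
    best_k, best_latency = 0, float("inf")
    for k in range(n + 1):                      # k == n keeps every layer on the edge
        upload_kb = input_kb if k == 0 else out_kb[k - 1]
        latency = (
            sum(edge_ms[:k])                    # layers executed on the device
            + upload_kb / uplink_kbps * 1000.0  # transfer time in milliseconds
            + sum(cloud_ms[k:])                 # layers executed in the cloud
        )
        if latency < best_latency:
            best_k, best_latency = k, latency
    return best_k, best_latency


# Hypothetical five-layer profile: early layers are cheap on the device but
# produce large activations; later layers are compute-heavy but output little.
edge_ms  = [4.0, 6.0, 12.0, 40.0, 25.0]   # per-layer time on the edge device (ms)
cloud_ms = [0.5, 0.8, 1.5, 5.0, 3.0]      # per-layer time in the cloud (ms)
out_kb   = [600, 300, 150, 20, 1]         # size of each layer's output (kb)
input_kb = 900                            # size of the raw input frame (kb)

for bandwidth in (1_000, 50_000):         # ~1 Mbps uplink vs. a 5G-class uplink (kb/s)
    k, ms = best_split(edge_ms, cloud_ms, out_kb, input_kb, bandwidth)
    print(f"{bandwidth} kb/s -> split after layer {k}, ~{ms:.1f} ms end to end")
```

Running the sketch with the two bandwidth figures shows the adaptive behavior described above: on a constrained uplink the split moves later so only small activations are uploaded, while on a fast 5G-class link more of the compute-heavy layers shift to the cloud.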

Insights

IBC 2017 Preview – Artificial Intelligence (AI) and Machine Learning (ML) in Media

3Q 2017

The International Broadcasting Convention (IBC) runs from September 15th to 19th. One of the most significant new trends promoted at this conference will be the implementation of artificial intelligence (AI) and machine learning (ML) in video services. Some solutions marketed as AI or Machine Learning simply migrate from editor- or developer-coded optimization methods to neural-network-trained solutions, while a host of new solutions leverage video analytics to generate metadata.

Analyst Support

Every client is assigned a key member of our research team based on their organization’s needs and goals. In addition, an unlimited number of Analyst Inquiry calls are available to answer your specific questions.

Rian Whitton

Research Analyst

Malik Saadi

Vice President, Strategic Technologies

Lian Jye Su

Principal Analyst

Jack Vernon

Industry Analyst

Dimitris Mavrakis

Research Director

Michael Inouye

Principal Analyst

Pierce Owen

Principal Analyst

Stephanie Tomsett

Research Analyst

Nick Finill

Senior Analyst

Don Alusha

Senior Analyst

Webinar

Market Forces Driving Edge AI: Enabling Technologies, Value Proposition and Key Use Cases

AI is going to see dramatic growth in adoption across many verticals. However, the currently popular model of running AI inference and training in the cloud is simply not appropriate for many use cases, creating a sizable opportunity for edge AI hardware to flourish. In this webinar, Malik Saadi, Vice President of Strategic Technologies, and Jack Vernon, Industry Analyst, will cover the main drivers for shifting AI to the edge, how the AI technology stack is changing to reflect this shift, and the market opportunity in edge AI.

This webinar will address the following questions:

  • How is AI currently being implemented?
  • What is the case for shifting AI processing to the edge?
  • What are the use cases that will drive edge AI?
  • What are the hardware options for implementing AI at the edge?
  • How big is the market opportunity in edge AI implementation?
