Co-Inference: An Artificial Intelligence Technique that 5G Will Unlock


4Q 2018 | IN-5327

 

What Is Co-Inference, and How Can It Improve AI Inference?

NEWS


A team of researchers from Sun Yat-sen University in China has developed a new Artificial Intelligence (AI) inference technique that spreads inference across both the edge and the cloud; the researchers call this technique "co-inference." A combination of 5G and co-inference could greatly improve flexibility in managing where inference runs on devices. The researchers present an approach that marries the edge and the cloud in a framework called Edgenet, a deep-learning co-inference model. Co-inference relies on Deep Neural Network (DNN) partitioning: the layers of a DNN are adaptively split between the edge device and the cloud, with each layer assigned to one side or the other according to the bandwidth and compute available at both. The critical...
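In practice, this kind of partitioning amounts to a latency-minimization search over candidate split points. The sketch below illustrates the idea under simple assumptions: a linear chain of layers, hypothetical per-layer latency and activation-size profiles, and a latency model of edge compute plus activation upload plus cloud compute. It is not the Edgenet implementation itself, only an illustration of how a split point could be chosen adaptively as bandwidth changes.

```python
# Minimal sketch of adaptive DNN partitioning for edge-cloud co-inference.
# All per-layer numbers and the latency model are illustrative assumptions,
# not figures from the researchers' paper.

from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    edge_ms: float    # estimated execution time on the edge device (ms)
    cloud_ms: float   # estimated execution time in the cloud (ms)
    out_kb: float     # size of the layer's output activation (KB)

def best_partition(layers, uplink_kb_per_s, input_kb):
    """Pick the split point that minimizes end-to-end latency.

    Layers [0, k) run on the edge, the activation produced at the split
    is uploaded, and layers [k, N) run in the cloud. k = 0 means pure
    cloud inference (the raw input is uploaded); k = N means pure edge.
    """
    n = len(layers)
    best_k, best_ms = 0, float("inf")
    for k in range(n + 1):
        edge_ms = sum(l.edge_ms for l in layers[:k])
        cloud_ms = sum(l.cloud_ms for l in layers[k:])
        upload_kb = input_kb if k == 0 else layers[k - 1].out_kb
        upload_ms = 0.0 if k == n else upload_kb / uplink_kb_per_s * 1000
        total = edge_ms + upload_ms + cloud_ms
        if total < best_ms:
            best_k, best_ms = k, total
    return best_k, best_ms

# Hypothetical profile of a small convolutional network.
layers = [
    Layer("conv1", edge_ms=12.0, cloud_ms=1.5, out_kb=800.0),
    Layer("conv2", edge_ms=18.0, cloud_ms=2.0, out_kb=400.0),
    Layer("conv3", edge_ms=15.0, cloud_ms=1.8, out_kb=100.0),
    Layer("fc",    edge_ms=4.0,  cloud_ms=0.5, out_kb=4.0),
]

# Re-evaluate the split whenever the measured uplink bandwidth changes.
for uplink in (500, 5_000, 50_000):   # KB/s: roughly 4, 40, and 400 Mbps
    k, ms = best_partition(layers, uplink, input_kb=600.0)
    print(f"{uplink:>7} KB/s -> split after layer {k}, ~{ms:.1f} ms")
```

At low bandwidth the search keeps more layers on the edge so that only a small activation is uploaded; as bandwidth rises, offloading earlier layers to the faster cloud hardware becomes worthwhile, which is the behavior 5G is expected to unlock.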

