Paul Schell

Industry Analyst

Paul Schell In The News

Wire19 (2024-02-22)
“Cloud deployment will act as a bottleneck for generative AI to scale due to concerns about data privacy, latency, and networking costs. Solving these challenges requires moving AI inferencing closer to the end user – this is where on-device AI has a clear value proposition, as it eliminates these risks and can more effectively scale productivity-enhancing AI applications,” says Paul Schell, Industry Analyst at ABI Research. “What’s new is the generative AI workloads running on heterogeneous chipsets, which distribute workloads at the hardware level between the CPU, GPU, and NPU. Qualcomm, MediaTek, and Google were the first movers in this space, as all three are producing chipsets running LLMs on-device. Intel and AMD lead in the PC space.”