MIT’s Low-Power, 168-Core Eyeriss Designed for On-Device Deep Learning
DARPA-funded research into on-device deep neural networks has produced MIT’s Eyeriss, a low-power chip design that lets mobile devices run neural network algorithms locally, minimizing or eliminating the need to send data to cloud servers for processing.