MIT’s Low-Power, 168-Core Eyeriss Designed for On-Device Deep Learning
DARPA-funded research on on-device deep neural networks has produced MIT’s Eyeriss, a low-power chip design that lets mobile devices execute neural network algorithms locally, minimizing or eliminating the need to send data to cloud servers for further processing.