MIT’s Low Power, 168 Core Eyeriss Designed for On-Device Deep Learning
DARPA-funded research on on-device deep neural networks has produced MIT's Eyeriss, a low-power, 168-core chip design that lets mobile devices run neural network algorithms locally, minimizing or eliminating the need to send data to distributed cloud servers for further processing.