MIT’s Low-Power, 168-Core Eyeriss Designed for On-Device Deep Learning
DARPA-funded research on on-device deep neural networks has produced MIT’s Eyeriss, a low-power chip design that lets mobile devices run neural network algorithms locally, minimizing or eliminating the need to send data to cloud servers for processing.