Off-loading Onboard Computing in Driverless Vehicles: Role of HD Maps, 5G, and the (Edge) Cloud


By Dominique Bonte | 4Q 2018 | IN-5274


Deep Learning and Sensor Fusion Driving Onboard Compute Requirements 

NEWS


The consensus on how to perform advanced, deep learning-based sensor fusion in autonomous and driverless vehicles is converging around heavy-lifting Graphics Processing Unit (GPU)-based central embedded processing units delivering tens of teraflops (trillions of floating point operations per second, or TFLOPS). This computing power allows multiple deep learning inferencing applications to run concurrently, including machine vision, High Definition (HD) map-based positioning (e.g., Simultaneous Localization and Mapping, or SLAM), and driver or passenger monitoring. But this inflation of onboard computing comes at a price, not only in terms of the cost of the compute module but also in terms of the power budget, which impacts cooling needs and Electric Vehicle (EV) range.
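
To make the range impact concrete, the back-of-envelope sketch below (in Python) estimates how a constant compute-module power draw shares the traction battery with the drivetrain. All figures are assumed illustrative values, not measurements cited in this Insight.

# Back-of-envelope sketch: EV range with and without an always-on compute module.
# All numeric values are assumptions chosen for illustration only.

def range_km(battery_kwh: float, drive_wh_per_km: float, compute_w: float,
             avg_speed_kmh: float) -> float:
    """Estimate range when a constant compute load draws from the same battery."""
    # Energy the compute module consumes per km at the given average speed (Wh/km).
    compute_wh_per_km = compute_w / avg_speed_kmh
    return battery_kwh * 1000 / (drive_wh_per_km + compute_wh_per_km)

BATTERY_KWH = 75.0        # assumed pack size
DRIVE_WH_PER_KM = 160.0   # assumed drivetrain consumption
AVG_SPEED_KMH = 40.0      # assumed mixed urban/highway average speed

baseline = range_km(BATTERY_KWH, DRIVE_WH_PER_KM, 0.0, AVG_SPEED_KMH)
with_gpu = range_km(BATTERY_KWH, DRIVE_WH_PER_KM, 500.0, AVG_SPEED_KMH)  # assumed ~500 W GPU stack

print(f"Baseline range:     {baseline:.0f} km")
print(f"With 500 W compute: {with_gpu:.0f} km "
      f"({100 * (1 - with_gpu / baseline):.1f}% reduction)")

Under these assumptions, a 500 W onboard compute stack trims roughly 7% off the vehicle's range before cooling overhead is even counted, which is the core motivation for lighter-weight or off-loaded alternatives.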

This is prompting multiple hardware, software, and mapping startups to develop light- or lighter-weight solutions. French semiconductor vendor Kalray recently announced its Massively Parallel Processor Array (MPPA) architecture, which can run t…

