Harnessing Crowdsourced Data | NEWS
The crowdsourcing of car sensor data represents the “next step” in the role that connectivity plays in automotive, moving from use cases that support convenience and entertainment toward more mission-critical functions. The aggregation of camera, radar, LiDAR, and ultrasonic data from connected cars can enable a range of new services, including smart parking guidance, real-time traffic, and hyper-local weather information. The crowdsourcing paradigm will also play a key role in autonomous driving—firstly in the building and maintenance of 3D maps for relative positioning, and secondly for the training of deep learning networks to improve the performance of autonomous systems software.
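As a rough illustration of how such aggregation could work (a hypothetical schema, field names, and thresholds, not any OEM's actual pipeline), the sketch below groups crowdsourced ultrasonic parking-gap reports by map tile to estimate local parking availability:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class SensorReport:
    """One crowdsourced observation from a connected car (hypothetical schema)."""
    tile_id: str        # coarse map tile the report falls into
    free_gap_m: float   # ultrasonic-measured gap along the curb, in meters

def parking_availability(reports: list[SensorReport], min_gap_m: float = 5.0) -> dict[str, float]:
    """Return, per map tile, the fraction of reports that observed a parkable gap."""
    seen: dict[str, int] = defaultdict(int)
    parkable: dict[str, int] = defaultdict(int)
    for r in reports:
        seen[r.tile_id] += 1
        if r.free_gap_m >= min_gap_m:
            parkable[r.tile_id] += 1
    return {tile: parkable[tile] / seen[tile] for tile in seen}

# Example: three reports across two map tiles
reports = [
    SensorReport("tile_A", 6.2),
    SensorReport("tile_A", 2.1),
    SensorReport("tile_B", 5.5),
]
print(parking_availability(reports))  # {'tile_A': 0.5, 'tile_B': 1.0}
```

The same pattern (many noisy per-vehicle observations reduced to a per-tile estimate) generalizes to traffic and hyper-local weather services.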
Tesla has always been ahead of the competition in terms of how effectively it leverages connectivity to deliver a unique and improved user experience, particularly with its pioneering use of over-the-air (OTA) software and firmware updates to remotely fix bugs and add significant new functionality. Once again, Tesla is pushing the boundaries of what connectivity means for automotive, this time by capturing brief moments of camera data from its vehicles in order to “improve autonomous safety features and make self-driving a reality for [the consumer] as soon as possible.”
Didn't Tesla Always Collect Camera Data? | IMPACT
Tesla’s announcement came in the form of an updated data sharing agreement sent to users alongside a comprehensive upgrade of the second iteration of its Autopilot system. The new agreement asks users to consent to share “short video clips using the car’s external cameras”; the new Autopilot system features an impressive suite of eight external cameras. This request for camera data may come as a surprise to Tesla owners, given that CEO Elon Musk has boasted of the millions of miles of driving data that Tesla has collected to improve the Autopilot system.
However, it is now clear that the data Tesla employed for its continuous feedback loop related more to general driving behavior than to raw data from the on-board sensors. For example, did the driver intervene when Autopilot was activated? How did he or she intervene: by steering violently or by braking harshly? Certainly, this information is valuable in identifying corner cases where the system is not functioning safely or adequately, and the new camera data can add vital context to these driving events. Indeed, the fact that Tesla is looking to capture only short video clips suggests that this will function very much like a conventional event data recorder (EDR).
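A minimal sketch of that EDR-style pattern, assuming a hypothetical on-vehicle interface (a frame stream plus engagement and intervention flags): the vehicle keeps only a rolling buffer of recent camera frames and persists a short clip when the driver intervenes.

```python
from collections import deque

class InterventionClipRecorder:
    """EDR-style recorder: keep a rolling frame buffer, save a short clip on driver intervention."""

    def __init__(self, fps: int = 30, clip_seconds: int = 10):
        self.buffer = deque(maxlen=fps * clip_seconds)  # rolling pre-event window
        self.clips = []                                  # clips flagged for later upload

    def on_frame(self, frame, autopilot_engaged: bool, driver_intervened: bool):
        self.buffer.append(frame)
        # Only the moments around a disengagement are kept; everything else is discarded.
        if autopilot_engaged and driver_intervened:
            self.clips.append(list(self.buffer))
```

Bounding capture to a short pre-event window is what keeps fleet-scale collection tractable over cellular connections.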
Nevertheless, adding context to driver intervention events seems unlikely to be the limit of Tesla’s camera data crowdsourcing ambition. The new data sharing agreement states that Tesla “needs to collect short video clips using the car’s external cameras to learn how to recognize things like lane lines, street signs, and traffic light positions.” This strongly suggests that Tesla will be using crowdsourced data to train neural networks in object and situation recognition.
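The sort of training loop this implies might look like the following sketch (PyTorch, with a toy classifier and randomly generated stand-in data; the actual architectures and label sets are not public). The three classes here are illustrative: lane line, street sign, traffic light.

```python
import torch
from torch import nn

# Toy stand-in for a perception model trained on crowdsourced camera frames.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 3),   # 0 = lane line, 1 = street sign, 2 = traffic light
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Random tensors stand in for labeled crops extracted from uploaded video clips.
frames = torch.randn(8, 3, 64, 64)
labels = torch.randint(0, 3, (8,))

for _ in range(5):                      # a few illustrative gradient steps
    optimizer.zero_grad()
    loss = loss_fn(model(frames), labels)
    loss.backward()
    optimizer.step()
```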
Addressing Deep Learning's Achilles' Heel | COMMENTARY
Deep learning neural networks have been established as a key enabling technology for autonomous vehicles, providing an automated feedback loop to improve safety and functionality based on the combined experience of connected, autonomous vehicles already deployed on the road. Many networks have been trained on the experience of trial vehicles deployed across the globe, but this cannot compare to the diversity of experience that can be leveraged from thousands of Tesla cars. Although using camera sensor data to train neural networks has been common practice in autonomous vehicle development, the industry continues to lack the necessary massive data sets for the remaining suite of automotive sensors, including radar and LiDAR. Therefore, Tesla and other OEMs should architect their ADAS and low-level autonomous systems such that they can gather large data sets from these more exotic sensors, as well as from external cameras.
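One way to architect for that, sketched below with hypothetical field names and an assumed event-triggered upload policy, is to log a time-synchronized record that carries camera, radar, and LiDAR payloads together, so the same upload path used for camera crowdsourcing can later serve the richer sensors.

```python
from dataclasses import dataclass, field

@dataclass
class MultiSensorRecord:
    """One time-synchronized snapshot from the vehicle's sensor suite (hypothetical schema)."""
    timestamp_us: int
    camera_jpeg: bytes                                    # compressed frame from an external camera
    radar_targets: list = field(default_factory=list)     # e.g. (range_m, azimuth_deg, velocity_mps) tuples
    lidar_points: list = field(default_factory=list)      # e.g. (x, y, z, intensity) tuples

def should_upload(record: MultiSensorRecord, event_flagged: bool) -> bool:
    """Upload policy sketch: only event-flagged records with camera data leave the car, to bound bandwidth."""
    return event_flagged and bool(record.camera_jpeg)
```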