Uber Preliminary Report Released |
NEWS |
In March 2018, an experimental Uber autonomous vehicle struck and killed a pedestrian in Arizona. This was the first fatal accident of its kind, and it sent shockwaves through the automotive market, prompting widespread speculation about the cause and leading some companies to suspend their autonomous vehicle trials as a precaution.
The crash led the National Transportation Safety Board (NTSB) to send a team to Arizona to investigate the fatal collision, and on May 24, a preliminary report on the crash investigation was released.
Multiple Points of Failure |
IMPACT |
So, what does the preliminary report say? And what does this mean moving forward? The key takeaways from the report are that the vehicle's sensors detected the pedestrian roughly six seconds before impact, but the self-driving software struggled to classify the object, cycling between classifications of an unknown object, a vehicle, and a bicycle; that the Volvo's factory-fitted Automatic Emergency Braking (AEB) system had been disabled while the vehicle was under computer control; and that the system was not designed to alert the safety operator when emergency braking was needed.
The Need for Advanced Driver-Assistance Systems and Better Sensors |
RECOMMENDATIONS |
The fatal Uber crash took place at night, under poor lighting conditions, and the preliminary report makes clear that the set of sensors and software used by Uber struggled to identify the object. If the current sensor suite struggled with object detection, does that promote the need for higher-resolution Light Detection and Ranging (LiDAR) or thermal infrared cameras in autonomous vehicle applications? Either technology would provide better semantic information in poor lighting conditions and could supply vital additional sensor data.
Thermal imaging cameras are not currently used in autonomous vehicle applications. Companies that focus on this area, producing night-vision systems for premium vehicles, should now seriously consider how their sensors could be adapted for autonomous vehicles. These sensors could be positioned as part of a sensor suite that provides extra, vital semantic information under poor lighting conditions: just as radar complements LiDAR for range, thermal imaging cameras could complement LiDAR for semantic information in poor lighting.
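The complementary role described above can be sketched as a simple late-fusion step: each sensor channel produces its own classification, and the fused result favours agreement or, failing that, the more confident channel. This is a minimal illustration, not Uber's or any vendor's actual pipeline; the `Detection` type and `fuse_detections` function are hypothetical names invented for this sketch.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "pedestrian", "vehicle", "unknown"
    confidence: float  # classifier confidence in [0.0, 1.0]

def fuse_detections(lidar: Detection, thermal: Detection) -> Detection:
    """Naive late fusion of two independent detection channels.

    When both sensors agree on the label, confidence is boosted
    (treating the channels as independent evidence). When they
    disagree, the more confident channel wins -- in poor lighting,
    a thermal camera's confident "pedestrian" call can override a
    LiDAR classifier's low-confidence "unknown object".
    """
    if lidar.label == thermal.label:
        combined = 1.0 - (1.0 - lidar.confidence) * (1.0 - thermal.confidence)
        return Detection(lidar.label, combined)
    # Disagreement: defer to the more confident sensor channel.
    return max(lidar, thermal, key=lambda d: d.confidence)
```

For example, a low-confidence LiDAR classification of `Detection("unknown", 0.4)` fused with a thermal-camera `Detection("pedestrian", 0.8)` resolves to the pedestrian label, which is exactly the extra semantic information the paragraph above argues thermal imaging could contribute at night.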
The next takeaway from the report is the need for system diversity. From the preliminary report, it would seem that if AEB had been in use, it could have prevented the fatal collision. This demonstrates how system diversity can produce more holistic safety systems. By separating autonomous vehicle systems from current Advanced Driver-Assistance Systems (ADAS), OEMs can provide system diversity, mitigate risks, and thereby create safer systems. In this architecture, ADAS features such as AEB could provide ASIL-D functionality, separated from the main computing platform and associated hardware that delivers the core autonomous driving functionality. Although many see the path from ADAS to full vehicle autonomy as linear, current ADAS may still need to operate alongside future autonomous vehicle systems, ensuring safety through hardware and software diversity.
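One way to picture this separation is an independent AEB channel whose brake request can never be suppressed by the main autonomy stack. The sketch below assumes a hypothetical interface (the class and function names are invented for illustration) in which the AEB channel computes time-to-collision from its own radar measurements and a simple arbiter always honours the stronger brake request, whichever channel it comes from.

```python
def arbitrate_brake(planner_brake: float, aeb_brake: float) -> float:
    """Safety-biased arbitration: the final brake command is the
    stronger of the two requests, so an AEB intervention cannot be
    overridden by the main autonomy computer."""
    return max(planner_brake, aeb_brake)

class IndependentAEB:
    """Hypothetical stand-alone AEB channel.

    It relies only on its own range and closing-speed measurements
    (e.g. from a dedicated radar), with no dependency on the main
    autonomy stack -- illustrating hardware/software diversity.
    """
    def __init__(self, ttc_threshold_s: float = 1.5):
        self.ttc_threshold_s = ttc_threshold_s  # trigger threshold

    def brake_request(self, range_m: float, closing_speed_mps: float) -> float:
        """Return 1.0 (full braking) if time-to-collision falls
        below the threshold, otherwise 0.0."""
        if closing_speed_mps <= 0.0:
            return 0.0  # not closing on the object
        ttc = range_m / closing_speed_mps
        return 1.0 if ttc < self.ttc_threshold_s else 0.0
```

The key design choice is that `arbitrate_brake` sits below both channels: even if the main planner requests no braking at all, a triggered AEB channel still produces a full brake command, which is the diversity argument the report's findings motivate.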
Overall, the preliminary Uber crash report has probably raised more questions than answers. Vendors that produce high-resolution LiDAR or thermal imaging cameras should start thinking about how their hardware can be used or adapted to provide semantic information under poor lighting conditions in autonomous vehicle applications. Meanwhile, those operating in the ADAS market should also start thinking about how their current systems can be adapted into mission-critical safety systems for future autonomous vehicle applications.