Uber Fatal Crash Highlights Need for Safety System Independence


2Q 2018 | IN-5153


Uber Preliminary Report Released 

NEWS


In March 2018, an experimental Uber autonomous vehicle struck and killed a pedestrian in Arizona. This was the first fatal accident of its kind, and it sent shockwaves through the automotive market, prompting mass speculation about the cause and leading some companies to suspend their autonomous vehicle trials as a precaution.

The crash led the National Transportation Safety Board (NTSB) to send a team to Arizona to investigate the fatal collision, and on May 24th a preliminary report on the crash investigation was released.

Multiple Points of Failure

IMPACT


So, what does the preliminary report say? And what does this mean moving forward? The key takeaways from the report are:

  • An object was detected 6 seconds prior to the crash. At a speed of 43 mph (19.22 m/s), detection 6 seconds before impact would place the object at roughly 115 m from the vehicle. The first question we must ask is whether this is adequate: most Original Equipment Manufacturers (OEMs) suggest at least a 150+ m range for any object detection system. The next point of interest is the NTSB’s use of the word “object.” The NTSB details how the system first classified the object as an unknown object, then as a vehicle, and finally as a bicycle. When was the object classified correctly, and would this detection distance have been adequate at higher speeds?
  • The Autonomous Vehicle (AV) system determined that Automatic Emergency Braking (AEB) was required 1.3 seconds prior to the crash. The self-driving system determined, 1.3 seconds before collision, that AEB was required to avoid impact. Applying simple constant-deceleration (SUVAT) mechanics, this equates to roughly 25 m between vehicle and object at that moment. The estimated emergency stopping distance of a vehicle travelling at that speed is approximately 24 m, which suggests the software had correctly determined the required action, albeit with almost no margin to spare.
  • The vehicle impact speed was 39 mph. If the vehicle had detected an object, why had it not slowed down? Even though AEB had been disabled, Uber still permitted deceleration of up to 6.5 m/s². That alone would have been enough to bring the vehicle to a complete stop between initial object detection and the point of impact, without any emergency braking at all. Why, then, was the vehicle still travelling at nearly its initial speed at the collision? Clearly the software failed to apply any braking at all.
  • The AEB system had been disabled by Uber. Uber had disabled Volvo’s built-in AEB system to prevent conflicts with the self-driving system, relying instead on the human operator to take control of the vehicle; the system was not designed to alert the operator when intervention was needed. The preliminary report and data suggest that, had AEB been active, there would have been enough time to bring the vehicle to a near standstill from the point at which the AV system determined AEB needed to be applied.
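The arithmetic behind the points above can be checked directly. The figures (43 mph, 6 seconds, 1.3 seconds, 6.5 m/s²) come from the preliminary report as discussed above; the calculation itself is ordinary constant-deceleration mechanics:

```python
# Sanity-check of the kinematics implied by the NTSB preliminary report.
# Constant-speed and constant-deceleration (SUVAT) mechanics only.

MPH_TO_MS = 0.44704

speed = 43 * MPH_TO_MS          # ~19.22 m/s at first detection
detect_range = speed * 6.0      # distance covered in 6 s: ~115 m
aeb_range = speed * 1.3         # distance when AEB was deemed necessary: ~25 m

decel = 6.5                     # m/s^2, the braking Uber left enabled
stop_dist = speed**2 / (2 * decel)  # ~28 m to stop at that deceleration

print(f"detected at ~{detect_range:.0f} m, AEB decision at ~{aeb_range:.0f} m")
print(f"stopping distance at {decel} m/s^2: ~{stop_dist:.0f} m")
```

Even at the modest 6.5 m/s² deceleration that remained enabled, the stopping distance (~28 m) is a small fraction of the ~115 m available from first detection, which supports the point that no emergency braking would have been needed had the software reacted at detection.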

The Need for Advanced Driver-Assistance Systems and Better Sensors  

RECOMMENDATIONS


We know that the fatal Uber crash took place at night, under poor lighting conditions, and we now know that the sensors and software used by Uber struggled to identify the object. If the current sensor set struggled with object detection, does that argue for higher resolution Light Detection and Ranging (LiDAR) or thermal infrared cameras in autonomous vehicle applications? Either would deliver better semantic information in poor lighting conditions and could supply vital extra sensor data.

Thermal imaging cameras are not currently used in autonomous vehicle applications. Companies that focus on this area, producing night vision systems for premium vehicles, should now think seriously about how their sensors could be adapted for autonomous driving. These sensors could be positioned as part of a sensor suite that provides extra, vital semantic information under poor lighting conditions: in the same way that radar complements LiDAR for range, thermal imaging cameras could complement LiDAR for semantic information in poor lighting.
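The complementary-sensor argument can be illustrated with a minimal late-fusion sketch. The sensor names, confidence values, and the noisy-OR rule below are illustrative assumptions, not Uber's actual pipeline; real AV stacks use far richer fusion, but the principle is the same:

```python
# Illustrative late-fusion sketch: combine independent per-sensor detection
# confidences with a noisy-OR rule. A thermal camera that still "sees" at
# night can rescue a detection that a visible-light camera or LiDAR alone
# would score too low to act on.

def fuse_confidences(confidences):
    """Noisy-OR fusion: probability that at least one sensor is right,
    (naively) assuming independent error modes."""
    p_all_miss = 1.0
    for p in confidences.values():
        p_all_miss *= (1.0 - p)
    return 1.0 - p_all_miss

# Hypothetical night-time scores for a pedestrian walking a bicycle:
night_scores = {"camera": 0.20, "lidar": 0.55, "thermal": 0.80}

fused = fuse_confidences(night_scores)                      # ~0.93
fused_no_thermal = fuse_confidences(
    {k: v for k, v in night_scores.items() if k != "thermal"})  # ~0.64
```

Under these assumed numbers, dropping the thermal channel cuts the fused confidence from roughly 0.93 to 0.64, which is the sense in which a thermal camera "complements" the other sensors in poor lighting.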

The next takeaway from the report is the need for system diversity. The preliminary report suggests that, had AEB been in use, the fatal collision could have been prevented, demonstrating how system diversity can produce more holistic safety systems. By separating autonomous vehicle systems from current Advanced Driver-Assistance Systems (ADAS), OEMs can provide system diversity, mitigate risks, and thereby create safer systems. In this architecture, ADAS features such as AEB could be used to provide ASIL-D functionality, separated from the main computing platform and associated hardware that delivers the core autonomous driving functionality. Although many see the path from ADAS to full vehicle autonomy as linear, current ADAS may still need to operate alongside future autonomous vehicle systems, ensuring safety through hardware and software diversity.
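The separation described above can be sketched as a simple arbitration scheme: an independent, ADAS-style AEB monitor that can override the main AV planner. The function names and the 2-second time-to-collision threshold are illustrative assumptions, not a production ASIL-D design:

```python
# Minimal sketch of the system-diversity idea: an independent AEB channel
# that overrides the main planner whenever time-to-collision gets too low,
# regardless of what the planner decides.

def planner_command(obstacle_range_m, speed_ms):
    """Stand-in for the main AV stack; imagine it fails to brake."""
    return {"brake": 0.0, "throttle": 0.3}

def aeb_monitor(obstacle_range_m, speed_ms, ttc_threshold_s=2.0):
    """Independent channel: demand full braking if time-to-collision
    falls below the threshold; otherwise stay silent."""
    if speed_ms <= 0:
        return None
    ttc = obstacle_range_m / speed_ms
    if ttc < ttc_threshold_s:
        return {"brake": 1.0, "throttle": 0.0}  # override the planner
    return None

def arbitrate(obstacle_range_m, speed_ms):
    """The safety channel wins whenever it demands intervention."""
    override = aeb_monitor(obstacle_range_m, speed_ms)
    return override if override is not None else planner_command(
        obstacle_range_m, speed_ms)
```

With the crash figures from the report, an object 25 m away at 19.2 m/s gives a time-to-collision of about 1.3 s, so the monitor overrides the planner and brakes; at 115 m the planner remains in control. The design choice is that the monitor shares no code path with the planner, so a planner-side software failure cannot suppress the braking decision.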

Overall, the preliminary Uber crash report has probably raised more questions than answers. Vendors producing high-resolution LiDAR or thermal imaging cameras should start thinking about how their hardware can be used or adapted to provide semantic information in autonomous vehicle applications under poor lighting conditions. Meanwhile, those operating in the ADAS market should consider how their current systems can be adapted to serve as mission-critical safety systems in future autonomous vehicle applications.
