NTSB Slams Uber Self-Driving Car Unit’s Safety Culture
Uber Technologies Inc.’s self-driving vehicle unit lacked an effective safety culture at the time one of its test vehicles struck and killed a pedestrian in Tempe, Ariz., in 2018, National Transportation Safety Board Chairman Robert Sumwalt said Nov. 19.
“The inappropriate actions of both the automatic driving system as implemented and the vehicle’s human operator were symptoms of a deeper problem, the ineffective safety culture that existed at the time,” Sumwalt said as he opened a board meeting to determine the probable cause of the collision.
The probe is the NTSB’s first to examine a fatal crash involving a self-driving test vehicle. The case is being closely watched in the emerging autonomous vehicle industry, a sector that has attracted billions of dollars in investment from companies such as General Motors Co. and Alphabet Inc. in an attempt to transform transportation.
Elaine Herzberg, 49, was hit and killed by an Uber self-driving SUV as she walked her bicycle across a road at night. Uber halted self-driving car tests after the crash. Investigative information released since the March 2018 collision has highlighted a series of lapses, both technological and human, that the board may cite as having contributed to the crash. Uber resumed self-driving testing late last year in Pittsburgh.
The Uber vehicle’s radar sensors first observed Herzberg about 5.6 seconds before impact, before she entered the vehicle’s lane of travel, and initially classified her as a vehicle. The self-driving system then changed its classification of her several times, shifting among vehicle, bicycle and “other,” and failed to predict that her path would cross the lane of the self-driving test SUV, according to the NTSB.
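The NTSB’s account points to a common failure mode in perception pipelines: if an object’s classification keeps changing, its motion history may never accumulate long enough to project a path. The Python sketch below is purely illustrative and hypothetical, not Uber’s software; the TrackedObject class and the update_track() and predicted_path() functions are invented here to show why repeated reclassification can leave a tracker with nothing to extrapolate from.

from dataclasses import dataclass, field

@dataclass
class TrackedObject:
    """Minimal stand-in for a perception track; not Uber's data model."""
    label: str
    history: list = field(default_factory=list)  # past (x, y) positions

def update_track(track, new_label, position):
    """Hypothetical tracker: a label change discards the motion history,
    so no trajectory can be projected from earlier observations."""
    if new_label != track.label:
        track.history.clear()
        track.label = new_label
    track.history.append(position)
    return track

def predicted_path(track, steps=3):
    """Extrapolation needs at least two observations under a consistent label."""
    if len(track.history) < 2:
        return []  # too little history -> no path prediction
    (x0, y0), (x1, y1) = track.history[-2], track.history[-1]
    dx, dy = x1 - x0, y1 - y0
    return [(x1 + dx * k, y1 + dy * k) for k in range(1, steps + 1)]

# The label flips every frame (vehicle -> other -> bicycle), so the history
# never accumulates and predicted_path() keeps returning an empty list.
track = TrackedObject(label="vehicle")
for label, pos in [("vehicle", (0.0, 10.0)), ("other", (0.5, 9.0)), ("bicycle", (1.0, 8.0))]:
    update_track(track, label, pos)
    print(track.label, predicted_path(track))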
The modified Volvo SUV being tested by Uber wasn’t programmed to recognize and respond to pedestrians walking outside of marked crosswalks, nor did the system allow the vehicle to automatically brake ahead of an imminent collision. The responsibility to avoid accidents fell to the single safety driver monitoring the vehicle’s automation system, while other companies place a second human in the vehicle for added safety.
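As a rough illustration of the design choice described above, the hypothetical Python sketch below contrasts a system that can brake automatically when the time to collision gets short with one that, like the test configuration the NTSB described, can only alert a human operator. The function names, thresholds and numbers are assumptions made for the example, not values from the NTSB report.

def time_to_collision(distance_m, closing_speed_mps):
    """Seconds until impact if neither party changes speed; inf if not closing."""
    if closing_speed_mps <= 0:
        return float("inf")
    return distance_m / closing_speed_mps

def plan_response(distance_m, closing_speed_mps, auto_brake_enabled, brake_threshold_s=2.0):
    """Illustrative decision rule (not Uber's): brake automatically when time to
    collision is short, but only if automatic braking is enabled; otherwise the
    system can do no more than alert the human operator."""
    ttc = time_to_collision(distance_m, closing_speed_mps)
    if ttc < brake_threshold_s:
        return "AUTO_BRAKE" if auto_brake_enabled else "ALERT_OPERATOR_ONLY"
    return "CONTINUE"

# Same imminent-collision geometry, different outcomes depending on whether
# the vehicle is allowed to brake on its own (assumed numbers).
print(plan_response(distance_m=25.0, closing_speed_mps=17.0, auto_brake_enabled=False))  # ALERT_OPERATOR_ONLY
print(plan_response(distance_m=25.0, closing_speed_mps=17.0, auto_brake_enabled=True))   # AUTO_BRAKE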
The safety driver was streaming a television show on her mobile phone in the moments before the crash, despite company policy prohibiting drivers from using mobile devices, according to police. The NTSB has also said that Uber’s Advanced Technologies Group that was testing self-driving cars on public streets in Tempe didn’t have a stand-alone safety division, a formal safety plan, standard operating procedures or a manager focused on preventing accidents.
Uber made extensive changes to its self-driving system after several reviews of its operations and findings by NTSB investigators. The company told the NTSB that the new software would have correctly identified Herzberg and triggered controlled braking to avoid her more than four seconds before the original impact, the NTSB has said.
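For rough context on that four-second figure, the short sketch below works through basic stopping-distance kinematics. The speed and deceleration values are assumptions chosen for illustration, not figures from the NTSB investigation.

# Back-of-envelope kinematics with assumed numbers (not NTSB data).
speed_mps = 18.0      # assume roughly 40 mph, expressed in metres per second
decel_mps2 = 6.0      # assume firm but controlled braking, about 0.6 g
warning_s = 4.0       # lead time the newer software reportedly provides

distance_available = speed_mps * warning_s              # ground covered in 4 s at constant speed
stopping_distance = speed_mps ** 2 / (2 * decel_mps2)   # v^2 / (2a)
stopping_time = speed_mps / decel_mps2                  # v / a

print(f"distance available before impact point: {distance_available:.0f} m")  # 72 m
print(f"distance needed to stop:                {stopping_distance:.0f} m")   # 27 m
print(f"time needed to stop:                    {stopping_time:.1f} s")       # 3.0 s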