The National Transportation Safety Board (NTSB) has finally said what safety advocates have argued for years about self-driving cars and the companies that build them.
According to a report from the NTSB that was 18 months in the making, the National Highway Traffic Safety Administration (NHTSA) is failing to protect the public by allowing autonomous companies to self-regulate.
The NTSB report stems from a March 18, 2018, crash in which a self-driving Uber SUV struck and killed Elaine Herzberg, an Arizona pedestrian who was illegally crossing the street with her bicycle. The Uber, a 2017 Volvo XC90, was being tested in autonomous mode with a human operator in the driver's seat to monitor the vehicle.
The SUV's factory-installed forward collision warning and automatic emergency braking systems had been deactivated because the vehicle was equipped with Uber's own self-driving technology.
The report describes how the SUV's system failed to correctly identify Herzberg as a pedestrian and also failed to determine which direction she was moving.
According to NHTSA, the computer is considered the legal driver when a driverless car is in autonomous mode. But the NTSB report blames the human in the driver's seat who was there to monitor the Volvo's autonomous mode.
According to the report, the immediate cause of the crash was the operator's inattention: she was distracted by her cell phone throughout the trip.
The woman hired to monitor the vehicle spent her time "looking toward the bottom of the SUV’s center console, where she had placed her cell phone at the start of the trip. The operator redirected her gaze to the road ahead about 1 second before impact."
The report says such distraction is a typical consequence of self-driving technology, which creates "complacency" in the human "driver."
The NTSB further cited Uber's "inadequate safety culture, exhibited by a lack of risk assessment mechanisms, of oversight of vehicle operators, and of personnel with backgrounds in safety management."
Uber was also faulted for lacking a system to monitor the human operator in real time, and for making matters worse by removing a second vehicle operator during testing.
Arizona is also named as a contributing cause of the crash: the state's lack of a "safety-focused application-approval process for automated driving system (ADS) testing at the time of the crash, and its inaction in developing such a process since the crash, demonstrate the state’s shortcomings in improving the safety of ADS testing and safeguarding the public."
Although the NTSB cites Arizona officials' lack of oversight as a contributing factor, before the crash the governor had touted how Arizona's "low regulatory environment has led to increased investment and economic development throughout the state."
In addition, about three weeks before the crash, the governor issued a document that opens by touting Arizona's reputation for being friendly to autonomous companies.
Driverless car companies were welcomed with lax rules that required only that a driverless car meet the same standards as a conventional vehicle, even if the autonomous car had no human driver.
NHTSA is also named as a cause of the problem because of the absence of safety requirements for testing driverless cars on public roads. According to the NTSB report, NHTSA's voluntary guidelines do nothing meaningful to evaluate automated driving systems.
"Considering the lack of federal safety standards and assessment protocols for automated driving systems, as well as the National Highway Traffic Safety Administration’s inadequate safety self-assessment process, states that have no, or only minimal, requirements related to automated vehicle testing can improve the safety of such testing by implementing a thorough application and review process before granting testing permits." - NTSB Uber crash report