Tesla Responds to Questions About Death of Joshua Brown
Tesla tells Congress that self-driving deaths will likely occur as systems are refined.

Tesla is answering questions from the U.S. government after receiving a letter from U.S. Sen. John Thune (R-S.D.), chairman of the Senate Committee on Commerce, Science and Transportation.

Thune requested information about the crash that killed former Navy SEAL Joshua Brown, 40, who was driving his Tesla Model S on Autopilot when the car slammed into the side of a tractor-trailer. The crash sheared off the top of the Model S, which was traveling 74 mph when it went under the trailer, causing the battery to lose its connection with the electric motors.

The car flew out the other side of the trailer and traveled nearly 300 feet before it smashed into a utility pole, then traveled another 50 feet after breaking the pole.

Although Tesla warns drivers to keep their hands on the wheel at all times when using Autopilot, Brown was a firm believer in the technology and wasn't paying attention to the road or the car. Autopilot has a built-in system that brakes the car on its own, but only if it recognizes an approaching object, which in this case it didn't.

Thune's letter sought answers about Tesla's actions in response to the crash and the company's cooperation with the National Highway Traffic Safety Administration (NHTSA). Thune also asked what Tesla is doing to educate consumers not only about the benefits of autonomous technology, but also about the limitations of the systems.

Autopilot combines components that steer the car and maintain lane position with an automatic braking feature that slows the car if the driver fails to take action. A driver must activate the system and agree to stay alert and keep their hands on the steering wheel, as sketched below.
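As a rough illustration of that engagement rule, here is a minimal sketch; it is not Tesla's software, and the class name, method names, and checks are all assumptions. It simply shows a driver-assist feature that engages only after the driver has both activated it and acknowledged the hands-on-wheel requirement.

```python
# Minimal illustrative sketch, not Tesla's implementation. The class, methods,
# and checks are assumptions that mirror the behavior described above.

class DriverAssistSystem:
    def __init__(self) -> None:
        self.engaged = False

    def engage(self, driver_activated: bool, driver_acknowledged_terms: bool) -> bool:
        # Engage only if the driver turned the feature on AND agreed to stay
        # alert and keep their hands on the steering wheel.
        self.engaged = driver_activated and driver_acknowledged_terms
        return self.engaged

    def status(self, hands_on_wheel: bool) -> str:
        # While engaged, remind the driver to keep their hands on the wheel.
        if not self.engaged:
            return "system off"
        return "ok" if hands_on_wheel else "warning: hold the steering wheel"


if __name__ == "__main__":
    assist = DriverAssistSystem()
    assist.engage(driver_activated=True, driver_acknowledged_terms=True)
    print(assist.status(hands_on_wheel=False))  # "warning: hold the steering wheel"
```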

Tesla engineers told the U.S. Senate Commerce Committee they are examining two possible explanations for why the Model S made no attempt to slow down before the crash.

Engineers say it's possible the camera system failed to recognize the truck's white trailer as an obstacle, instead interpreting its side as part of the sky. Under the other theory, the radar and camera systems together analyzed the scene and classified the tractor-trailer as an overhead sign.

Tesla says the system was programmed to ignore certain objects such as overhead highway signs or tall bridge structures. This way the car won't slow down every time it comes near one of those objects.
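To make that filtering logic concrete, here is a minimal sketch of the idea, assuming a simple rule-based filter; it is not Tesla's actual code, and the object classes, clearance threshold, and function name are illustrative. The point is that anything classified as an overhead structure is excluded from braking decisions, so a trailer misclassified as an overhead sign would be ignored, which is the failure mode the two theories above describe.

```python
# Minimal illustrative sketch, not Tesla's actual code. The object classes,
# clearance threshold, and names below are assumptions used to show the rule:
# objects treated as overhead structures are ignored by automatic braking.

from dataclasses import dataclass

@dataclass
class DetectedObject:
    classification: str          # e.g. "vehicle", "overhead_sign", "bridge"
    height_above_road_m: float   # estimated height of the object's lower edge

# Assumed clearance: an object whose lower edge is well above the car's roof
# is treated as something the car can safely pass under.
OVERHEAD_CLEARANCE_M = 4.0

def should_brake(obj: DetectedObject) -> bool:
    """Brake only for obstacles in the car's path; ignore overhead structures."""
    if obj.classification in ("overhead_sign", "bridge"):
        return False  # ignored so the car doesn't slow for signs and bridges
    if obj.height_above_road_m >= OVERHEAD_CLEARANCE_M:
        return False  # high enough to pass under safely
    return True

if __name__ == "__main__":
    # A trailer misclassified as an overhead sign would be ignored, so the
    # system would make no attempt to slow down.
    trailer = DetectedObject(classification="overhead_sign", height_above_road_m=1.2)
    print(should_brake(trailer))  # False
```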

The automaker says the government shouldn't interfere with the development of automated technology and that deaths will likely occur as every self-driving company refines its systems. However, that viewpoint isn't popular with auto safety advocates, who see the process as one that uses drivers as guinea pigs.

In a letter to President Obama, consumer advocacy groups asked for solid rules and regulations to be put in place before self-driving technology is allowed into the hands of everyday drivers. The groups also say the administration's self-driving car policies have been "developed in the shadows" and that NHTSA has apparently bought the driverless car "hype."
