View from the cockpit of a Tesla.

“Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”

A Tesla driver was killed in a collision with a tractor trailer in Florida while the vehicle was in “Autopilot” mode, the carmaker announced Thursday.

It is the first known fatality in more than 130 million miles driven with Autopilot activated, Tesla said in a statement that also expressed condolences to the driver’s family.

The National Highway Traffic Safety Administration said it was investigating the fatality to determine whether the Autopilot system was to blame, and Tesla acknowledged that the accident might have been the fault of the computer.

The crash occurred May 7 when Joshua David Brown, 40, of Canton, Ohio, driving his black 2015 Tesla Model S in Autopilot mode on U.S. Route 27 in Williston, Fla., hit the side of a tractor trailer that was crossing the road to make a turn.

“Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied,” Tesla said in a blog post titled “A Tragic Loss.”

The Model S passed under the trailer, crushing the top of the car and the windshield. “Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents,” Tesla said.

“This is the first known fatality in just over 130 million miles where Autopilot was activated,” Tesla said. “Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.”

Ron Montoya, senior consumer advice editor at Edmunds.com, said the incident shows that no “single technology on the market today can make a vehicle 100 percent safe” and that “there is not a true ‘Autopilot.’”

In its post, Tesla described Brown as “a friend to Tesla and the broader EV community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission.”

This is a sad story. It is a worst-case impact scenario that in all likelihood would have played out the same way in any sedan with a fully engaged driver at the wheel. It is also a story that is going to draw a lot of attention to the autonomous vehicle concept and how it is being regulated. And it is giving the naysayers and the Musk-haters an easy shot.
Three things stand out to me.
  1. Clearly it’s early days, but right now the Tesla record is roughly 40 percent better than the U.S. average and more than 100 percent better than the worldwide average. To put this in context, “Some 1.25 million people die each year as a result of road traffic crashes,” according to the World Health Organization Global status report on road safety 2015. That suggests that if every vehicle were equipped with autopilot technology, a substantial number of lives would be spared every year (see the rough calculation after this list).
  2. Tesla knows what happened because Tesla makes it its business to track every one of its vehicles in real time. This is a significant reason they have advanced their systems so quickly. It is ironic that the expectation that Tesla will be able to provide information about the accident is a direct result of this unique capability. No other automotive manufacturer, with the possible exception of Google, has ever been able to do this.
  3. This news stayed out of the media until today (6/30/16), when the NHTSA announced it was opening an investigation. Because of its monitoring, Tesla knew at the time of the accident or shortly after; in the blog post, the company states that it notified the NHTSA immediately. Both Tesla and the NHTSA are already taking heat for the delay between the accident and the announcement. And Tesla just completed its annual shareholders meeting without disclosing the accident – thin ice for a publicly traded company. All of which suggests they were trying to keep a lid on this as long as they could.
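Here is a quick back-of-the-envelope check, my own illustration using only the figures quoted above, that turns the miles-per-fatality numbers into relative improvements and a deliberately naive lives-saved extrapolation. It ignores that Autopilot miles skew toward highway driving and that a single fatality is a tiny sample, so treat it as a sketch, not an analysis.

```python
# Fatality-rate comparison using the figures quoted by Tesla and the WHO.
# Naive by design: Autopilot miles are mostly highway miles, and one
# fatality is far too small a sample for firm conclusions.

MILES_PER_FATALITY = {
    "Tesla Autopilot":   130e6,  # one fatality in ~130 million Autopilot miles
    "US average":         94e6,  # one fatality every ~94 million vehicle miles
    "Worldwide average":  60e6,  # one fatality every ~60 million vehicle miles
}

autopilot = MILES_PER_FATALITY["Tesla Autopilot"]
for name, miles in MILES_PER_FATALITY.items():
    if name != "Tesla Autopilot":
        pct = (autopilot / miles - 1) * 100
        print(f"Autopilot vs {name}: {pct:.0f}% more miles per fatality")

# Lives-saved extrapolation: 1.25 million road deaths per year (WHO) at
# the worldwide rate implies roughly 75 trillion vehicle miles per year.
annual_deaths = 1.25e6
annual_miles = annual_deaths * MILES_PER_FATALITY["Worldwide average"]
deaths_at_autopilot_rate = annual_miles / autopilot
print(f"Deaths/year at the Autopilot rate: {deaths_at_autopilot_rate:,.0f} "
      f"(~{annual_deaths - deaths_at_autopilot_rate:,.0f} fewer)")
```

The arithmetic works out to about 38 percent more miles per fatality than the U.S. average, about 117 percent more than the worldwide average, and on the order of 670,000 fewer deaths per year under the (very generous) assumption that every vehicle everywhere matched the Autopilot rate.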
UPDATE: Excellent context from WaPo.
The Tesla’s “Autopilot” feature was turned on. But the model was not designed to be and should not have been considered to be fully self-driving. The car’s semi-autonomous systems, which use onboard sensors to guide it away from hazards, were not advanced enough to steer and brake the car without the driver paying continuous attention and correcting when necessary. In fact, none of the semi-autonomous cars on the market are trustworthy enough to allow drivers to sit back and zone out.

Read more at washingtonpost.com.

 
