[Graphic from the Tesla website showing the dashboard: automatic steering with traffic-aware cruise control]

I’ll take my chances with a fender bender that I may or may not be responsible for rather than turn my steering wheel, gas pedal, and brake pedal over to a pile of plastic circuit boards.
[comment on Marketing Daily]

As predicted last week, things are going to get plenty hot for Elon Musk and Tesla. It is just a matter of time until the SEC gets involved. Here’s why:

On May 18, eleven days after Brown died, Tesla and CEO Elon Musk, in combination (roughly three parts Tesla, one part Musk), sold more than $2 billion of Tesla stock in a public offering at a price of $215 per share—and did it without ever having released a word about the crash.

To put things baldly, Tesla and Musk did not disclose the very material fact that a man had died while using an auto-pilot technology that Tesla had marketed vigorously as safe and important to its customers.

Deciding to publish this story this morning, Fortune tried most of yesterday—yes, the Fourth of July—to reach someone at Tesla to a) inform them as to what the story would say and b) see if Tesla wished to comment. Ultimately, we reached a public relations executive who chose to emphasize the action of the stock last Friday, when the stock closed up for the day despite the bad news announced on Thursday. That outcome, the executive said, proves that the crash news was not a material fact.

Then Elon Musk himself suddenly entered the email conversation. He first thought, mistakenly, that Fortune was criticizing the price at which Tesla and he had sold stock. This writer replied that was not the case and that the issue was the non-disclosure of a material fact. That, Musk replied in a second e-mail, “is not material to the value of Tesla.”

He continued, “Indeed, if anyone bothered to do the math (obviously, you did not) they would realize that of the over 1M auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available. Please, take 5 mins and do the bloody math before you write an article that misleads the public.”

“Mislead”—a good word to think about in this matter.
Fortune
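
Out of curiosity, here is what “doing the bloody math” might look like. A minimal back-of-the-envelope sketch in Python: the 130-million-mile Autopilot figure comes from Tesla’s own statements quoted below, while the worldwide baselines are rough assumptions of mine, not numbers from Fortune or Tesla.

```python
# Hedged back-of-the-envelope check of the "half a million saved" claim.
# The Autopilot figure is Tesla's; the worldwide baselines are assumptions.

AUTOPILOT_MILES_PER_DEATH = 130e6   # Tesla: ~130M Autopilot miles, one fatality
WORLD_DEATHS_PER_YEAR = 1.25e6      # assumed WHO-style figure ("over 1M")
WORLD_MILES_PER_DEATH = 60e6        # assumed rough worldwide average

autopilot_rate = 1 / AUTOPILOT_MILES_PER_DEATH   # deaths per mile (n = 1!)
world_rate = 1 / WORLD_MILES_PER_DEATH

fraction_saved = 1 - autopilot_rate / world_rate
lives_saved = fraction_saved * WORLD_DEATHS_PER_YEAR
print(f"Implied lives saved per year: {lives_saved:,.0f}")  # roughly 670,000
```

On those assumptions the arithmetic does land in the ballpark of “half a million,” but note what it rests on: a rate derived from a single fatality, logged on highways only, in newer and heavier cars. Which brings us to the next point.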

The concept of misleading is also being applied to Tesla’s heretofore amazing safety claim of 130 million miles driven without a fatality.

Elon Musk has asserted that “of the over 1 million auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available.” But those comparisons are questionable, according to experts. Autopilot only drives on highways, so it can’t be directly compared with U.S.-wide statistics. And Tesla’s cars are bigger and safer than many on the road, so you’d expect fewer fatalities.

A recent report suggested that automated cars may have to drive hundreds of billions of miles before their performance can be fairly compared with statistics from human drivers.
MIT Technology Review
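
To see why so many miles are needed, consider a simple Poisson “rule of three” estimate, sketched below. The roughly-94-million-miles-per-fatality U.S. baseline is my assumption of a rough 2016-era figure, not a number taken from the report.

```python
# Hedged sketch: how many fatality-free miles must an autonomous fleet log
# before we can claim, at ~95% confidence, that it is merely no worse than
# human drivers? The baseline rate is an assumed rough U.S. figure.

HUMAN_MILES_PER_DEATH = 94e6            # assumption: ~1 fatality per 94M miles
human_rate = 1 / HUMAN_MILES_PER_DEATH  # deaths per mile

# Rule of three: zero events in n trials bounds the true rate below 3/n at
# ~95% confidence, so showing parity requires 3/n < human_rate.
miles_needed = 3 / human_rate
print(f"Fatality-free miles needed: {miles_needed:.3g}")  # ~2.8e8 miles
```

And that only establishes parity. Demonstrating a specific improvement, say 20% fewer deaths, with reasonable statistical power pushes the requirement into the billions of miles, which is why the report’s numbers run so high.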

The court of public perception will be further swayed by the media, who now have a story to sink their fangs into.

Okay, Joshua Brown’s Tesla Model S ($70K to $100K) crashes; Brown dies in the accident.

Police arrive, emergency responders arrive; yellow police tape cordons off the crash site; a police photographer snaps pictures, and police reports are filed. Photos of Brown are all over the Internet, as is YouTube video (see below). Previously, Brown had taken video of his Tesla in Autopilot (the Tesla autonomous driving system) avoiding a truck. The National Highway Traffic Safety Administration (NHTSA) has opened an inquiry into the fatal crash, while nearly every media outlet has covered nearly every nuance of this, the first known death caused by a self-driving car.

Brown was killed on a highway built by taxpayers, and his license was the state of Florida’s permission for the Ohio man to drive its highways.

Terribly tragic as it was, that’s about as public an event as there is. And everything about it should be fully public: the authorities, whose job it is to safeguard the Nation’s highways, need to know what happened and why…as soon as possible!

Strangely, Tesla Motors has a non-disclosure agreement (part of every Tesla car sale) that precludes the car owner from contacting NHTSA regarding safety concerns.
Robotics Business Review

IEEE Spectrum took a slightly different perspective, focusing on the problem with thinking a computer can be infallible and then considering what Tesla might do to remedy the situation. People’s willingness to trust a computer when they shouldn’t is an emerging theme we have seen before.

I don’t believe that it’s Tesla’s intention to blame the driver in this situation, but the issue (and this has been an issue from the beginning) is that it’s not entirely clear whether drivers are supposed to feel like they can rely on the Autopilot or not. I would guess Tesla’s position on this would be that most of the time, yes, you can rely on it, but because Tesla has no idea when you won’t be able to rely on it, you can’t really rely on it. In other words, the Autopilot works very well under ideal conditions. You shouldn’t use it when conditions are not ideal, but the problem with driving is that conditions can very occasionally turn from ideal to not ideal almost instantly, and the Autopilot can’t predict when this will happen. Again, this is a fundamental issue with any car that has an “assistive” autopilot that asks for a human to remain in the loop, and is why companies like Google have made it their explicit goal to remove human drivers from the loop entirely.

The fact that this kind of accident has happened once means that there is a reasonable chance that it, or something very much like it, could happen again.
IEEE Spectrum

See this article for an interesting discussion in IEEE Spectrum of the different approaches various players are taking to advance autonomous vehicle technology, including something like UTM (UAS traffic management)…

Adding insult to injury, a 2016 Tesla Model X was involved in an accident on the Pennsylvania Turnpike on July 6. The police officer investigating the scene said that the driver told him that the Autopilot feature was engaged.

“…it hit a guard rail off the right side of the roadway. It then crossed over the eastbound lanes and hit the concrete median.” After that, the Tesla Model X rolled onto its roof and came to rest in the middle eastbound lane.

In a follow-up article in the Detroit Free Press, Tesla said that “there is no evidence that the Autopilot was engaged.” What a closer reading of the story reveals is that because the car flipped and the antenna was destroyed, Tesla did not receive a signal from the car. Plausible deniability at work, for the moment.

“We received an automated alert from this vehicle on July 1 indicating air bag deployment, but logs containing detailed information on the state of the vehicle controls at the time of the collision were never received,” Tesla said in a statement. “This is consistent with damage of the severity reported in the press, which can cause the antenna to fail.”

Going back to the original story, we learn that:

Frank Baressi, 62, the driver of the truck and owner of Okemah Express, said the Tesla driver was “playing Harry Potter on the TV screen” at the time of the crash and driving so quickly that “he went so fast through my trailer I didn’t see him.”

The movie “was still playing when he died,” Baressi told the Associated Press in an interview from his home in Palm Harbor, Fla., saying the careening car “snapped a telephone pole a quarter-mile down the road.” He acknowledged he didn’t see the movie, only heard it.

Tesla Motors said it is not possible to watch videos on the Model S touch screen. There was no reference to the movie in initial police reports.

Admittedly, it is completely unclear to me how Baressi heard the video; perhaps through the windshield, assuming that he stopped to render aid… It seems highly unlikely that he heard Professor Dumbledore yelling “look out, Harry” at the moment of the crash.

And let’s not even get into what he did and didn’t see, since he was making what clearly appears to be an illegal left turn into oncoming traffic.

[Police diagram of the Tesla accident]

UPDATE: Apparently I am wrong. I found this thread on HuffPost discussing the expected SEC investigation:

Dallas E. Weaver
The truck was making a left turn in front of the oncoming car. With no signal or stop sign, the Tesla had the right-of-way and it was the trucker’s fault (at least in California).

Dale Johnson
Dallas E. Weaver: Wrong. The truck started his turn before the Tesla had crested the hill and come into view. You can’t give right-of-way to a vehicle not there. Yes, I’m a trucker and know the facts of this accident very well.

Dale Johnson
So far, what is known is that the Tesla was traveling 85 mph, 20 mph over the limit. No brakes were applied, so the driver was distracted.

Based on other reports I have read, the driver had a history of speeding violations.

So perhaps you are wondering by now why I am bringing this up and what it has to do with drones.
Well, it has nothing to do with drones directly and everything to do with advancing the state of the art and gaining public acceptance for the concept of autonomous operations. As Sally French pointed out in her excellent article, A Tale of Two Robotics Policies, the NHTSA is a whole lot friendlier to ideas like autonomous operations than the FAA is.

Self-driving cars have been expected to be a boon to safety because they’ll eliminate human errors. Human error is responsible for about 94% of crashes.

Later this month, NHTSA, which is authorized to set the safety rules for all motor vehicles, will issue guidelines intended to set the near-term rules of the road in autonomous vehicle research. Autonomous vehicle advocates argue that over time these features, which include things such as adaptive cruise control, rear object detection, lane departure alert and blind-spot detection already in many newer models, can make a quantum improvement in safety.

But regulators must operate in an environment where technology is moving faster than their ability to understand all its ramifications.
The Detroit Free Press

Sound familiar? I thought so.