Wednesday, April 5, 2017

Tesla's fatal Autopilot accident: Why the New York Times got it wrong



In Saturday's New York Times, coverage of Tesla's Autopilot crash was framed as a failure of technology on the part of Tesla. But leaving out the context of the accident is misleading.


On the front page of Saturday's New York Times, the fatal accident that occurred when Joshua Brown's Tesla Model S, operating in Autopilot mode, failed to brake was featured in two articles examining the circumstances of the crash: the limits of Autopilot, and what the accident means for Tesla.

The problem is that, in this case, the Times' reporting, which was highly critical of Tesla and its CEO Elon Musk, was incomplete and failed to capture the full context of the accident. Here's what the coverage got wrong:

Conflation of Autopilot with a self-driving car. The headline of the front-page piece, "A Fatality in a Self-driving Car Forces Tesla to Confront its Limits," is misleading. While the car was in Autopilot mode, akin to an advanced cruise control, it is not accurate to say that the car was self-driving. Tesla does not make a self-driving car, a fully autonomous car, or a driverless car. The headline sensationalizes the incident and plays into the fear factor involved when new technology is released.

Not enough information on the caveats of driving with Autopilot. Autopilot is a new technology, and one that, Tesla acknowledges, is imperfect. It is optional, and meant only for highway driving. Drivers using Autopilot are required to keep both hands on the wheel; an alert, similar to the warning that sounds if you don't wear a seatbelt, goes off if a driver fails to do so. Autopilot drivers are instructed to remain alert at all times while the system is enabled.

Drivers are expected to re-engage by taking hold of the wheel or pressing the brake when needed. In this case, Brown did not re-engage or apply the brake himself. It is also likely that he was not paying attention: a Florida Highway Patrol sergeant found a portable DVD player loaded with a Harry Potter DVD in Brown's Tesla. According to the AP, "Frank Baressi, 62, the driver of the truck and owner of Okemah Express LLC, said the Tesla driver was playing Harry Potter on the TV screen at the time of the crash ... He acknowledged he couldn't see the movie, only heard it."

Prematurely insinuating that Tesla is to blame for the accident. One of the Times' pieces calls Brown "a victim of an innovation geared precisely to people like him." But the fact is, while we know that the accident happened while Autopilot was engaged, we still don't know what, exactly, went wrong. Until the US Department of Transportation's investigation is complete, it is inaccurate to blame the technology.

Few sources from the tech world. The Times quoted analysts from Autotrends and Edmunds, but no one from the AI or self-driving car world. This is a mistake: with technology so central to the story, leaving out voices from inside the tech industry omits the most important perspective on the issue.

Few real-world Autopilot drivers sourced. The Times piece quoted one Autopilot driver who expressed concern for safety after the accident, and he had only used Autopilot in a loaner car, not regularly.

But over the last few weeks, I've spoken to half a dozen Autopilot drivers, all of whom have expressed a feeling of safety with the technology, even post-accident. When I talked to a few more directly after the fatal crash, their opinion was unchanged, and they reinforced the fact that they keep their hands on the wheel while using the technology, as intended. Daniel Nasserian said the accident "doesn't concern me at all. The Autopilot feature is still in its infancy, and I think the media ran with the story because of how new the concept is. The amount of miles driven without a fatality on Autopilot still remains impressive compared to the human error factor."

Why does this matter? When a new technology is released and malfunctions, there is often an overreaction and a rush to judgment. Cruise control, for example, is not perfect. Nor are air bags, which can kill a driver. Nor are seat belts, which can also be a cause of death. But each of these innovations is designed to protect drivers, and each has been shown to save more lives than it takes. Likewise, Autopilot is intended as a safety feature, and has a high success rate. This is the first known fatality in 130 million miles driven on Autopilot, while, according to a 2015 report by the US National Safety Council, the estimated mileage death rate is 1.3 deaths per 100 million vehicle miles traveled.
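To make that rate comparison concrete, here is a rough back-of-envelope sketch in Python (my own illustration, using only the two figures cited above; a single fatality is far too small a sample for firm statistical conclusions):

    # Back-of-envelope comparison of the fatality rates cited above:
    # 1 known fatality in ~130 million miles driven on Autopilot, versus the
    # US National Safety Council's 2015 estimate of 1.3 deaths per
    # 100 million vehicle miles traveled overall.

    AUTOPILOT_FATALITIES = 1
    AUTOPILOT_MILES = 130_000_000
    NSC_RATE_PER_100M_MILES = 1.3

    # Normalize the Autopilot figure to deaths per 100 million miles.
    autopilot_rate_per_100m = AUTOPILOT_FATALITIES / (AUTOPILOT_MILES / 100_000_000)

    print(f"Autopilot:   {autopilot_rate_per_100m:.2f} deaths per 100M miles")  # ~0.77
    print(f"All driving: {NSC_RATE_PER_100M_MILES:.2f} deaths per 100M miles (NSC, 2015)")

By that crude measure, the cited Autopilot figure works out to roughly 0.77 deaths per 100 million miles, below the overall estimate, which is the comparison the paragraph above is drawing.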

Bryant Walker Smith, a professor at the University of South Carolina and an expert on the legal aspects of self-driving vehicles, makes the following point: "The article depicted Tesla/Musk as indifferent to safety rather than struggling with, and perhaps reaching a different conclusion about, the same issues that are bedeviling the rest of the industry."

"In my experience," said Smith, "the company is attuned to safety. Perfection is not possible, and every company is in some ways learning from its customers experiences. Teslas beta labeling is candid about this?and may even reflect an effort to emphasize the need for caution to its users."

Smith said that the piece "seems to treat Autopilot as nothing more than a convenience feature with no safety benefits?and, indeed, safety detriments."

This is not to let Tesla off the hook. The company's blog post about the accident left much to be desired: it was somewhat robotic and tone-deaf, attempting to address the tragedy while simultaneously celebrating and defending Autopilot's achievements. And TechRepublic is not a cheerleader for Tesla: we will hold the company accountable for answering several critical questions about the timeline of the incident and what can be learned from it.

Smith poses the following questions, which we plan to cover as answers emerge:

  • Should Tesla have monitored its users more?
  • Should Tesla have more strongly communicated the limitations?
  • Should Tesla have designed and marketed these technologies more as safety features than as convenience features?
  • Should the systems have been limited to a backup rather than a primary role in directing steering and braking?
  • Should the system have been named something more modest?
  • Why did Tesla wait to announce this crash, and why was there a lag between the fatal accident and the public announcement?
  • Has it already updated its software to account for this scenario (and others)? If so, how?
  • What has Tesla learned from the accident?
  • Will the accident do anything to change Tesla's approach?

Still, it is important to keep the big picture in mind.


"What if Autopilot ultimately prevents more deaths than it causes? 

What if the lessons learned through this early deployment helps to advance the state of safety technologies much more quickly than they would otherwise develop?" said Smith.

"In these cases, introducing Autopilot may make sense," said Smith, "and may, indeed, save lives."

The New York Times has since published several more pieces on the Tesla accident, some of which include more details about how the crash occurred. But when such a prestigious publication prints a front-page story that leaves out key points needed to tell the full story, it risks misleading the public on a very important issue.


