Monday, April 26, 2021

The Fatal Tesla Crash in Houston: What Really Happened?


Late on Saturday, April 17, Dr. Will Varner, an anesthesiologist, said goodbye to his wife and to the wife of his friend Everett Talbot as the two women left Varner's Tesla Model S at a home in a gated community in The Woodlands outside Houston, Texas.  Apparently, Varner then offered to show Talbot "how it can drive itself" on a short test drive around the neighborhood, as subsequent witness testimony indicates.  What is certain is that around 11:25 PM, the Tesla left the road and crashed at considerable speed into a tree.  The car's massive lithium-ion battery caught fire, and photographs of the wreckage taken after the bodies were removed show only two door uprights standing on either side of the otherwise flattened and blackened hulk.  First responders found Talbot's body in the front passenger seat and Varner's in the back seat.  Neither man was at the wheel at the time of the crash.


Harris County Constable Mark Herman told reporters that "no one was driving" the 2019 Tesla at the time of the crash.  But in a tweet the following Monday, Tesla CEO Elon Musk stated, "Data logs recovered so far show Autopilot was not enabled and this car did not purchase FSD."  FSD stands for Full Self-Driving, a mode that still requires driver supervision.  Musk went on to say, "Moreover, standard Autopilot would require lane lines to turn on, which this street did not have."


In defense of Tesla and its CEO, Tesla drivers are warned repeatedly to keep their hands on the wheel even when Full Self-Driving mode is engaged.  This is like telling a five-year-old to keep his hand in the cookie jar but not take any cookies.  Many Tesla drivers have given in to the temptation to engage Autopilot or otherwise surrender control of the vehicle to the system computer and let their attention stray, or even to leave the driver's seat altogether, as Dr. Varner apparently did.  And the self-driving capabilities of the car are good enough that most of the time, absentee drivers get away with it.


Musk bases his claim that Autopilot was not enabled on the fact that Teslas send "periodic" telemetry updates to the company over wireless links.  Leaving aside the question of whether having your car report your every driving move to Tesla is compatible with privacy, it is not clear how frequent these updates are.  If you read Musk's tweet like a lawyer, the phrase "data logs recovered so far" could cover the possibility that the most recent log Tesla has from the vehicle was recorded many minutes before the crash.  In other words, Musk could be saying nothing more significant than, "We know that ten minutes before the crash happened, Autopilot was not engaged."  But a lot can happen in ten minutes.
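
If you want to see the lawyer's reading in numbers, here is a minimal sketch under an assumed ten-minute upload cycle.  Tesla's actual telemetry schedule is not public, so the interval and timestamps here are hypothetical:

```python
# Hypothetical illustration only: Tesla's real telemetry format and
# upload schedule are not public.  The point: with uploads every
# `interval_s` seconds, the newest log on file can lag a crash by
# almost a full interval.

def last_upload_before(crash_time_s: float, interval_s: float) -> float:
    """Timestamp of the most recent periodic upload before the crash."""
    return (crash_time_s // interval_s) * interval_s

crash_time_s = 1130.0   # seconds into the drive (made-up number)
interval_s = 600.0      # assumed ten-minute upload cycle
last = last_upload_before(crash_time_s, interval_s)
print(f"last upload at t={last:.0f}s; "
      f"blind gap before crash: {crash_time_s - last:.0f}s")
# prints: last upload at t=600s; blind gap before crash: 530s
```

Under those made-up numbers, nothing in the recovered logs would tell you what the car was doing in the final 530 seconds.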


The Houston police authorities have impounded the wreckage of the Tesla and stated that they "eagerly wait" for the data that Tesla has recovered remotely.  It is unclear at this writing whether any data logs can be recovered from the incinerated wreck.  Unless Tesla has hardened the housing of the car's computer memory with something like the fireproofing and waterproofing that aviation black boxes get, I'd say the remote data is all the data anyone is likely to recover.


In the long run, removing the human element from driving entirely may significantly reduce automotive fatalities and injuries.  And studies have shown that so-called "driver-assist" systems such as lane-keeping, adaptive cruise control that holds a fixed distance behind the vehicle ahead, and automatic emergency braking do reduce accidents when they are used as intended: as aids to a real driver at the wheel, not as a substitute for one.


But the way Tesla has marketed its vehicles and promoted driver-assist features under names like "Full Self-Driving" and "Autopilot" is misleading on its face.  Musk's cowboy reputation, which he appears to relish, may be a big reason why only 14% of Americans say they "would trust riding in a vehicle that drives itself."  If Musk really intends to sell a mass-market car rather than one only doctors and stockbrokers can afford, that 14% will have to rise a great deal before a truly self-driving car can succeed.


In the meantime, deceptive and hypocritical marketing such as Tesla engages in contributes to the perception that, alone among automakers, Tesla has really arrived at what the Society of Automotive Engineers calls "Level 5" autonomous driving.  In a letter to California's Department of Motor Vehicles in March of this year, Tesla representatives admitted that the most advanced features of any Tesla vehicle amount only to Level 2 autonomy.  According to the SAE, Level 2 automation is simply driver support, not automated driving, and requires the driver to supervise the car's operation constantly.  Clearly, many Tesla drivers are going beyond that.  Dr. Varner gambled on getting away with it and lost.
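
For readers who don't have the SAE scale memorized, here is a paraphrased summary of the six J3016 levels (my wording, not the standard's text); note how far the Level 2 Tesla admits to sits from the Level 5 its branding suggests:

```python
# Paraphrased summary of the SAE J3016 driving-automation levels
# (my wording, not the standard's text).
SAE_LEVELS = {
    0: "No automation: the human does all the driving",
    1: "Driver assistance: steering OR speed support; the human drives",
    2: "Partial automation: steering AND speed support; "
       "the human must supervise constantly",
    3: "Conditional automation: the system drives in limited conditions; "
       "the human must take over on request",
    4: "High automation: the system drives itself within a defined domain",
    5: "Full automation: the system drives anywhere, in all conditions",
}

admitted_to_dmv = 2    # per Tesla's March 2021 letter to the California DMV
branding_implies = 5   # what "Full Self-Driving" sounds like
for level in (admitted_to_dmv, branding_implies):
    print(f"Level {level}: {SAE_LEVELS[level]}")
```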


There's nothing new about automakers providing features that some drivers abuse.  The muscle cars of the 1960s had power and acceleration far beyond anything normal driving required, and as a consequence some people wound up dying in fiery crashes after 140-mph joyrides.  But bigger engines and faster cars were incremental changes in a progression that dates back to the invention of the automobile.


Cars that seem to drive themselves are something truly new in automotive history, and we are still in the early stages of seeing how autonomous driving will play out.  While Tesla deserves credit for marketing what is probably the most technically advanced combination of driver-assist technologies available today, it has also created a dangerous situation in which even a few spectacular crashes such as the one in Houston can put a damper on an entire technology and scare people away from it.


If Tesla is smart enough to make a nearly autonomous car, it is smart enough to figure out how to keep drivers from absenting themselves from the wheel until the future date when doing so is reasonably safe.  By Tesla's own admission, that date is not here yet, and it is apparently ridiculously simple to fool a Tesla into driving itself while you play your violin in the back seat.  Other carmakers are taking more sophisticated precautions, such as eye-motion detection, to ensure that the driver-assist system always has an attentive driver to assist.  It's way past time for Tesla to do the same.
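
To see how little logic the eye-monitoring idea actually requires, here is a toy sketch.  It is not any automaker's real system, just the principle: assistance stays enabled only while the camera sees an attentive driver, which a weight hung on the steering wheel cannot fake:

```python
# Toy sketch of camera-based driver monitoring (not any automaker's
# actual system): assistance stays enabled only while recent gaze
# samples show eyes on the road.

from collections import deque

class DriverMonitor:
    def __init__(self, window: int = 10, min_attentive: int = 7):
        self.samples = deque(maxlen=window)  # recent eyes-on-road readings
        self.min_attentive = min_attentive

    def record(self, eyes_on_road: bool) -> None:
        self.samples.append(eyes_on_road)

    def assist_allowed(self) -> bool:
        # Require a quorum of attentive samples, not just wheel torque.
        return sum(self.samples) >= self.min_attentive

monitor = DriverMonitor()
for reading in [True] * 6 + [False] * 4:  # driver stops watching the road
    monitor.record(reading)
print(monitor.assist_allowed())  # False: alert the driver, disengage assist
```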


Sources:  I consulted the following articles in the preparation of this blog:  "Feds investigating fatal Tesla crash in Texas" (Austin American-Statesman, 4/20/2021); "Tesla crash shows Autopilot isn't there yet" (Austin American-Statesman, 4/22/2021); Car and Driver at https://www.caranddriver.com/news/a36189237/tesla-model-s-fire-texas-crash-details-fire-chief/; Reuters at https://www.reuters.com/business/autos-transportation/us-probes-fatal-tesla-crash-believed-be-driverless-2021-04-19/; the New York Post at https://nypost.com/2021/04/21/victims-in-deadly-houston-telsa-crash-identified/; Click2Houston.com at https://www.click2houston.com/news/local/2021/04/18/2-men-dead-after-fiery-tesla-crash-in-spring-officials-say/; and the Society of Automotive Engineers at https://www.sae.org/news/press-room/2018/12/sae-international-releases-updated-visual-chart-for-its-%E2%80%9Clevels-of-driving-automation%E2%80%9D-standard-for-self-driving-vehicles.
