Monday, July 21, 2025

Can Teslas Drive Themselves? Judge and Jury To Decide

 

On a night in April 2019, Naibel Benavides and her boyfriend Dillon Angulo had parked the Tahoe SUV they were in near the T-intersection of Card Sound Road and County Road 905 in the upper Florida Keys.  George McGee was heading toward the intersection "driving" a Tesla Model S.  I put "driving" in quotes because he had engaged the vehicle's misleadingly named "full self-driving" mode.  The car was going about 70 MPH when McGee dropped his cellphone and bent down to look for it.

 

According to dashcam evidence, the Tesla ran a stop sign, ignoring the blinking red light above the intersection, crashed through reflective warning signs, and slammed into the SUV, spinning it so violently that it struck Naibel and threw her 75 feet into the bushes, where her lifeless body was found by first responders shortly thereafter.  Dillon survived, but received a brain injury from which he is still recovering.

 

Understandably, the families of the victims have sued Tesla, and in an unusual move, the company is refusing to settle and is letting the case go to trial. 

 

The firm's position is that McGee was solely at fault for not following instructions on how to operate his car safely.  The driver should be prepared to take over manual control at all times, according to Tesla, and McGee clearly did not do that. 

 

The judge in the federal case, Beth Bloom, has dismissed claims of defective manufacturing and negligent misrepresentation, but says she is open to arguments that Tesla "acted in reckless disregard of human life for the sake of developing their product and maximizing profit."

 

Regardless of the legal details, the outlines of what happened are fairly clear.  Tesla claims that McGee was pressing the accelerator, "speeding and overriding the car's system at the time of the crash."  While I am not familiar with exactly how one overrides the autopilot in a Tesla, if it is like the cruise control on many cars, the driver's manual interventions take priority over whatever the autopilot is doing.  If you press the brake, the car's going to stop, and if you press the accelerator, it's going to speed up, regardless of what the computer thinks should be happening. 

 

The Society of Automotive Engineers (SAE) has promulgated its six levels of vehicle automation, from Level 0 (plain old human-driven cars without even cruise control) up to Level 5 (complete automation in which the driver can be asleep or absent and the car will still operate safely).  The 2019 Tesla involved in the Florida crash was a Level 2 vehicle, which can steer, accelerate, and brake itself in some conditions and locations, but requires the driver to stay attentive and be prepared to take over at any time. 

 

McGee appears to have done at least two things wrong.  First, he was using the "full self-driving" mode on a rural road at night, although the system is intended for limited-access freeways with clear lane markings.  Second, for whatever reason, when he dropped his phone he hit the accelerator at the wrong time.  This could conceivably have happened even if he had been driving a Level 0 car.  But I think it is much less likely, and here's why.

 

Tesla drivers obviously accumulate experience with their "self-driving" vehicles, and just as drivers of non-self-driving cars learn how hard you have to brake and how far you have to turn the steering wheel to go where you want, Tesla drivers learn what they can get by with when the car is in self-driving mode.  It appears that McGee had set the car in that mode, and while I don't know what was going through his mind, it is likely that he'd been able to do things such as look at his cellphone in the past while the car was driving itself, and nothing bad had happened.  That may be what he was doing just before he dropped the phone.

 

At 70 MPH, a car is traveling over 100 feet per second.  In a five-second pause to look for a phone, the car would have traveled nearly a tenth of a mile.  If McGee had been consciously driving a non-autonomous car the whole time, he probably would have seen the blinking red light ahead and mentally prepared to slow down.  But the way things happened, his eyes might have been on the phone the whole time, even after it dropped, and when he (perhaps accidentally) pressed the accelerator, the rest of the situation played out as it did, tragically for Naibel and Dillon.
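For readers who want to check the arithmetic, here is a quick back-of-the-envelope calculation in Python (the five-second glance is my illustrative assumption, not a figure from the trial record):

```python
# Distance covered at 70 MPH during a five-second glance away from the road.
MPH_TO_FPS = 5280 / 3600  # feet per mile divided by seconds per hour

speed_fps = 70 * MPH_TO_FPS            # about 102.7 feet per second
distance_ft = speed_fps * 5            # about 513 feet in five seconds
fraction_of_mile = distance_ft / 5280  # just under a tenth of a mile

print(f"{speed_fps:.1f} ft/s, {distance_ft:.0f} ft, {fraction_of_mile:.2f} mi")
```

At those speeds, even a brief distraction consumes more road than most drivers intuitively expect.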

 

So while Tesla may be technically correct that McGee's actions were the direct cause of the crash, there is plenty of room to argue that the way Tesla presents their autonomous-driving system encourages drivers to over-rely on it.  Tesla says they have upgraded the system since 2019, and while that may be true, the issue at trial is whether they had cut corners and encouraged ways of driving in 2019 that could reasonably have led to this type of accident.

 

In an article unrelated to automotive issues but focused on the question of AI in general, I recently read that the self-driving idea has "plateaued."  A decade or so ago, the news was full of forecasts that we would all be able to read our phones, play checkers with our commuting partners, or catch an extra forty winks on the way to work as the robot drove us through all kinds of traffic and roads.  That vision obviously has not come to pass, and while there are a few fully autonomous driverless vehicles plying the streets of Austin right now—I've seen them—they are "geofenced" to traverse only certain areas, and equipped with many more sensors and other devices than a consumer could afford to purchase for a private vehicle. 

 

So we may find that unless you live in certain densely populated regions of large cities, the dream of riding in a robot-driven car will remain just that:  a dream.  But when Tesla drivers presume that the dream has become reality and withdraw their attention from their surroundings, the dream can quickly become a nightmare.

 

Sources:  I referred to an Associated Press article on the trial beginning in Miami at https://apnews.com/article/musk-tesla-evidence-florida-benavides-autopilot-3ffab7fb53e93feb4ecfd3023f2ea21f.  I also referred to news reports on the accident and trial at https://www.nbcmiami.com/news/local/trial-against-tesla-begins-in-deadly-2019-crash-in-key-largo-involving-autopilot-feature/3657076/ and https://www.nbcmiami.com/news/local/man-wants-answers-after-deadly-crash/124944/. 

 
