Monday, May 23, 2022

Driver of Tesla on Autosteer Charged with Vehicular Manslaughter

 

Shortly after the first commercially available vehicles equipped with driver-assist autopilot features appeared on roadways, pundits raised the question of who would be responsible in case of an accident:  the driver, the vehicle maker, or both?  That question is about to get legs in the case of Kevin George Aziz Riad, whom prosecutors have charged with two counts of vehicular manslaughter as a result of an accident that occurred on Dec. 29, 2019 in Gardena, California.

 

As we described the crash in this space shortly after it occurred, a couple in a Honda Civic were making a left turn that evening at the intersection where the Gardena Freeway terminates.  The traffic light at the intersection was green for them, and red for oncoming traffic to their right coming off the freeway.  Neither Riad nor the Tesla Model S he was driving paid any attention to the numerous slow-down signs or the red traffic signal as the Tesla barreled into the Civic at 74 MPH, killing both of its occupants and slightly injuring Riad and his passenger. 

 

Subsequent investigation proved that the Tesla had both Autosteer and Traffic Aware Cruise Control engaged at the time.  Tesla's instructions to drivers using these features are clear:  the driver must keep a hand on the steering wheel at all times and "be prepared to take over at any moment."  The Gardena Freeway is basically straight for at least five or six miles before its termination, and data recovered from the Tesla showed that Riad had not moved the steering wheel significantly nor applied the brakes for six minutes before the crash. 

 

Riad's defense attorney asked Los Angeles County Superior Court Judge Teresa Magno to reduce the charges to misdemeanors, arguing that if no crash had occurred, the worst charge would have been running a red light.  The judge declined to follow that line of reasoning and ruled instead that the trial for vehicular manslaughter will proceed.  According to a report in the Orange County Register and a subsequent AP story, this is believed to be the first such trial involving a commercial version of automated driving technology.

 

Car accidents due to inattention are nothing new, but the novel feature of this case is that technology has allowed inattention to soar to new heights. 

 

In the days before driver-assist technologies such as autosteer and cruise control, one of the chief dangers of a long, straight, dull stretch of freeway was that a fatigued driver might simply fall asleep from monotony.  I'm sure this has happened to most drivers at least once or twice, and most of the time the consequences are minor:  a slight drift out of one's lane, a jerk awake once you realize you've been dozing, and a quick flurry of attention to get things back on track.  If a fatal crash results, the prospect of vehicular homicide charges arises.  While juries may be sympathetic to someone who simply falls asleep at the wheel, part of driving responsibly is knowing not to drive when you're very sleepy, and it's reasonable to charge such drivers with negligent homicide, and sometimes to convict them.

 

With the sophisticated driver-assist technologies of cars such as Tesla's Model S, the driver receives mixed messages.  According to an article in Popular Science, the Society of Automotive Engineers has established a scale of Levels 0 through 5 for assessing how self-driving a self-driving car is, and the Model S's features get it only to Level 2.  At Level 2, the driver is still in charge, even though the system can automatically brake, accelerate, and steer.  But according to Popular Science, that is not the impression Tesla gives many of its drivers, who apparently play chicken with the system to see what they can get away with, keeping one hand on the steering wheel while their attention is otherwise engaged for many seconds or minutes, as Mr. Riad apparently did. 

 

And despite the efforts of Tesla engineers to prevent this kind of thing, they have not yet developed a psychic feature that reads the driver's mind to find out whether he or she is really paying attention, or just acting like it with one hand on the wheel and the rest of the body doing something else altogether.  The number of fatal crashes involving Teslas with some form of driver assist engaged has reached the point that the National Highway Traffic Safety Administration (NHTSA) now requires automakers, Tesla and several other firms among them, to report any crashes on public roads involving such systems. 

 

While the absolute number of fatalities in such crashes is small, they form the leading edge of a worrisome upward trend in auto casualties generally.  After reaching a minimum in 2011, the number of U.S. auto fatalities has crept upward from fewer than 33,000 in that year to 38,824 in 2020.  A significant number of these crashes have been linked to inattentive driving in which technology (smartphones, videos, and driver-assist devices, among others) was a factor.

 

Driver-assist technology will only become more widespread as its cost declines and its performance improves.  That makes it all the more important that we figure out how to manage the transition between complete driver responsibility, in which nothing automated intervenes between the driver and the road and the driver recognizes that, and complete driver irresponsibility:  the sought-for SAE Level 5, in which the car harbors an effectively ideal chauffeur who allows the passengers to play pinochle, sleep, or do whatever else they want during the completely programmed ride. 

 

No Level 5 technology currently exists outside laboratories and highly controlled environments, and it is far from clear that we will ever get there without radical changes in our entire approach to automotive transport.  In the meantime, we have to figure out a way to keep crashes like the one in Gardena from occurring.  Prevention may have to take the form of genuinely annoying features, such as having to press a button whenever a random light blinks.  Although that would go against the grain of the Tesla-style "Look, Ma, no hands!" approach, we as a society will have to decide how much we want to save lives at the price of a little inconvenience.

 

Sources:  The AP story on Judge Magno's ruling to proceed to trial was carried in numerous places, including ABC News at https://abcnews.go.com/Technology/wireStory/driver-stand-trial-deadly-tesla-crash-california-84851701.  I also referred to an article in the Orange County Register at https://www.ocregister.com/2022/05/19/driver-of-tesla-on-autopilot-in-fatal-gardena-crash-to-face-trial-judge-rules/, and used the sites https://www.popsci.com/technology/autonomous-vehicles-explained/ and https://www.kktv.com/2022/05/18/federal-agency-sends-team-probe-tesla-crash-that-killed-3/ as well.  My first blog on this incident is at https://engineeringethicsblog.blogspot.com/2020/01/are-self-driving-cars-more-dangerous.html.
