Monday, July 31, 2023

Uber Backup Driver in Self-Driving Car Fatality Pleads Guilty

 

When Elaine Herzberg started to cross a dark street in Tempe, Arizona, with her bicycle on the night of March 18, 2018, she didn't see the Volvo approaching.  Unfortunately for her, it wasn't an ordinary Volvo.  It was an experimental self-driving vehicle operated by Uber, and behind the wheel was Rafaela Vasquez, whose job was to monitor the car's performance and intervene before anything serious went wrong.  The Volvo came with factory-installed safety features such as automatic emergency braking, but Uber had disabled them because they interfered with its own self-driving system.  In other words, it was Vasquez's job to stop the car if it was about to run over something, or somebody.

 

Video footage recorded at the time showed that Vasquez was looking at something in her lap instead of watching where the car was going.  Prosecutors said it was a TV show she was streaming on her phone.  Her defense attorney claimed it was a messaging program on a work cellphone.  Whichever it was, it distracted Vasquez enough that by the time she noticed Herzberg only a few feet ahead of the car, it was too late to do anything.  Herzberg died in the accident, and Vasquez was charged with negligent homicide.  Last Friday, she pled guilty to a lesser charge and was sentenced to three years of supervised probation.

 

Most sources agree that this was the first pedestrian fatality involving a self-driving car.  As such, it presented something of a puzzle to the prosecutors.  Vasquez was clearly not an ordinary driver; her role was to monitor the car's performance and intervene only when it looked like the system was getting into trouble.  Numerous other parties were involved as well:  Volvo itself, whose automatic braking features had been disabled; the Uber engineers who developed the self-driving system being tested; and the Uber supervisors who issued orders and instructions to Vasquez.

 

More than five years have passed since the accident, and the charge Vasquez pled guilty to is not the one she was originally charged with.  In a plea deal, she agreed to plead guilty to an "undesignated felony," which apparently can be converted into a misdemeanor if she successfully completes three years of supervised probation.  Vasquez is a convicted felon, having served prison time for attempted armed robbery, so she is no newcomer to the criminal justice system.  While Uber deserves credit for employing ex-convicts, it may have erred in judgment in this case.

 

When I blogged on this accident back in 2018, the autonomous-vehicle landscape was very different.  Self-driving cars were a novelty, found mostly in experimental trials in a few cities.  Since then the field has progressed, but probably not as fast as its promoters wished.  Tesla, which has probably fielded the largest number of partially self-driving cars of any U.S. manufacturer, is currently under scrutiny by the National Highway Traffic Safety Administration (NHTSA) for accidents involving its so-called Autopilot feature, which handles much of the driving without direct human intervention but is not supposed to be used without the driver's constant attention.  I can testify that these days I see at least one Tesla, and often several, almost any time I'm out on the road for any length of time.  As the more advanced versions of Autopilot are expensive options for an already costly car, I don't know how many drivers have them or use them, but so far I haven't seen a Tesla tooling down the expressway with nobody at the wheel, at least not yet.

 

Precisely because the current driving environment is so complex, the ultimate vision of so-called Level 5 autonomous vehicles, which could drive anywhere a human could without any human intervention at all, may never come to pass.  Fully autonomous vehicles can work in highly restricted and controlled environments such as open-pit mines, but the average city street is full of so many surprises and hard-to-identify obstacles that current technology cannot be trusted to navigate it without human help.

 

If we are to realize the dream of totally autonomous cars, we might have to accept some geographic restrictions that will not be popular.  For example, if certain streets or blocks were designated for autonomous vehicles only, and laws against jaywalking were strictly enforced, the environment could be constrained enough that fully autonomous Level 5 cars would work with reasonable safety.  But that would require coordination among local, state, and national governments, not to mention car manufacturers, that is so far lacking and may never be achieved.

 

A few people have always enjoyed the equivalent of a fully autonomous car:  those who can afford to hire a chauffeur.  The fact that some rich folks are driven around by hired drivers has had negligible impact on the transportation system so far, and if such an experience never makes it to prime time via Level 5 autonomous vehicles, it will not signal the failure of transportation technology in general.  The only people who would benefit in a major way from Level 5 vehicles are those who cannot drive:  the disabled, the elderly, and children.  I am told that children used to ride the New York subway system unsupervised all the time, and some may still do so.  But we have a long way to go before anyone would trust their five-year-old to get into an autonomous vehicle alone for a ride to day care.

 

That dark night in Tempe, Rafaela Vasquez unwittingly made history through her negligence in trusting too much to the self-driving capabilities of the experimental Uber-modified vehicle she was hired to supervise.  The same mistake is being made by people who don't follow instructions to keep their hands on the wheel of an Autopilot-equipped Tesla, and the unfortunate thing is that they often get away with it.  But sometimes they don't, and that's what the NHTSA is looking into.  Uber was not charged in the Tempe accident, probably because there was evidence that it had told Vasquez to be vigilant and that she clearly failed to be.  But it's human nature to assume that if you can get away with something for a long time, you'll be able to get away with it indefinitely.  And the temptation to do that with advanced self-driving features is too great for some people, who should be held responsible, along with their car's manufacturer, when something goes wrong.

 

Sources:  The AP report on the Vasquez trial can be found at https://apnews.com/article/autonomous-vehicle-death-uber-charge-backup-driver-1c711426a9cf020d3662c47c0dd64e35.  I also referred to a CNBC report at https://www.cnbc.com/2023/07/06/nhtsa-presses-tesla-for-more-records-in-autopilot-safety-probe.html.  I blogged on the Tempe accident here and here.
