Mill
Avenue near Curry Road in Tempe, Arizona, is a wide four-lane surface road with
few streetlights. Around 10:30 PM on the night of Sunday, March 18, it was
quite dark where Elaine Herzberg, 49, chose to walk her bicycle across the
road in the middle of a block, instead of crossing at a designated
crosswalk. Judging from a dashcam video of the accident that occurred a few seconds later, her bicycle does not appear to have had any lights or reflectors on it, and she wasn't looking in the direction of oncoming traffic as she slowly rolled it across.
Meanwhile,
in a self-driving car operated by the ride service Uber, Rafael (or
Rafaela—sources differ) Vasquez, 44, was behind the wheel. But as well-functioning self-driving vehicles tend to do, the car had driven itself so reliably in the past that Vasquez had gotten used to doing things other than keeping her eyes on the road all the time. A
video of the car interior taken simultaneously with the dashcam video shows
Vasquez glancing up occasionally, but most of the time she has her eyes on
something in her lap—possibly a cellphone.
Herzberg
and her bicycle were well into the car's lane before they showed up in the
headlights. But the bicyclist was
moving too slowly—or the car was going too fast—for her to get out of the way
before it hit her. She died at a
local hospital a short time later, becoming the first pedestrian to die in an
accident involving an autonomous vehicle.
In
the week just ended, several experts have criticized Uber for fielding a
self-driving system that could fail so easily. Uber's autonomous vehicles are equipped with radars and
lidars (essentially, radars using light) and should have detected Herzberg or
her bicycle before she came into view in the car's headlights. But the video reveals no indication, such as swerving or braking, that the system knew she was there. And Vasquez happened to glance up only a second or two before the collision, so she was unable to do anything before it was too late.
This death draws attention to possible defects in Uber's self-driving cars and also calls into question the usefulness of backup drivers in such vehicles.
Most self-driving cars today are in a kind of gray area: not completely independent of human assistance, yet not needing it most of the time. In the National Highway Traffic Safety
Administration's classification of self-driving cars, which runs from Level 0 to Level 5, the highest level out there, according to one expert, is Level 3, at which the
automated driving system (ADS) performs all driving tasks under "some
circumstances," but the human driver must be prepared to take over at the
request of the ADS. I don't know
about you, but I would have a lot of trouble sitting in the driver's seat and
waiting for a call that might not come for hours or days. Boredom sets in, and it's no wonder
Vasquez was looking at her cellphone when she should have been watching the
road. So as long as the system is designed with humans in the loop of controlling the car, those humans will have opportunities to drop the ball, and last Sunday, that's what happened.
On
the other hand, it would be hard for designers to make the leap straight from
Level 0 (totally human-driven cars) to Level 5 (humans are passengers only, and
the car does all the driving all the time) without months and years of
real-life testing, as well as computer simulations. And so you get into a chicken-and-egg situation: how do you design a car with Level 5 competence without testing it in real traffic, and if it's not Level 5 yet, how do you test it without a human driver to help out? The answer is, you don't, so we have Level 2 and Level 3 cars out there with people behind the wheel, but inevitably, some of the drivers start treating the car as if it were a Level 5, and you get accidents.
This
mishap will be thoroughly investigated, and until that happens it is too soon to draw any definite
conclusions about the cause. It
may turn out that some of the vital systems in the Uber car were temporarily
out of commission, or that the pedestrian's bicycle confused the sensors
somehow, or that the explanation is something we can't even imagine now. Uber has commendably suspended all operations of its self-driving cars. But once the cause or causes are found, the state of
self-driving-car engineering can move forward with added understanding, which
is how engineering usually works—learning sometimes more from failures than from successes.
It's
usually dangerous to commit a known wrong in the present on the chance that it will lead to something better in the future. That's how we got Communism's mass
slaughters in the name of the glorious egalitarian future that never came. But there is no indication that Uber
has been so negligent as to knowingly allow this accident to happen. The first fatality involving steam
locomotives didn't stop the progress of railroads, and this first fatality
involving a pedestrian and a self-driving car will not stop the development of
Level 5 cars, which promise eventually to reduce the already declining number
of automotive deaths both in the U.S. and abroad. If we could give every person convicted of DWI a self-driving car right now, the net auto fatality rate might well plunge significantly. I'm not
recommending that, but if we don't have the gumption to stop drunks from
driving, we should take the keys out of their hands and give them to robots as
soon as the market and the technology are ready.
I'm
sure that Elaine Herzberg had no desire to become posthumously famous by being
the first pedestrian to die from being hit by a self-driving car. But now that it's happened, it's the
job of engineers to make sure that more lives are saved than lost by the
advancement of self-driving car technology.
Sources: A story on this accident by Associated Press writers Tom
Krisher and Jacques Billeaud was picked up by numerous outlets such as the
Phoenix ABC-TV station at https://www.abc15.com/news/region-southeast-valley/tempe/experts-uber-suvs-autonomous-system-should-have-seen-woman-in-tempe. I also referred to articles in the Washington Post at https://www.washingtonpost.com/news/dr-gridlock/wp/2018/03/24/waymo-ceo-on-fatal-autonomous-uber-crash-our-car-would-have-been-able-to-handle-it/?utm_term=.a25e8b99ab08
and Forbes at https://www.forbes.com/sites/michaeltaylor/2018/03/22/fatal-uber-crash-inevitable-says-bmws-top-engineer/#7f99b0f45568. The dashcam and interior camera videos
leading up to the moment before the crash can be viewed at https://arstechnica.com/tech-policy/2018/03/video-uber-driver-looks-down-for-seconds-before-fatal-crash/.