On May 7 of this year, Joshua Brown, owner of a
wireless-network technology company and Tesla car enthusiast, was riding in his
Tesla Model S on a divided highway in Florida. Mr. Brown loved his car and posted numerous YouTube videos that
showed him using the autopilot function in the "look, Ma, no hands!"
mode. By all accounts, Mr. Brown was a
generous, enthusiastic risk-taker (his specialty when he was in the military
was disarming explosives, according to a New
York Times report), and hands-free driving went against the explicit
instructions Tesla provides for the autopilot feature. But Tesla owners do it all the time,
apparently, and until May 7, Mr. Brown had gotten away with it.
Then a tractor-trailer rig made a left turn in front of Mr.
Brown's Tesla. According to a
statement by Tesla, the trailer's high ground clearance and light color, which
gave it low visual contrast against the brightly lit sky, kept the car's sensors
from recognizing it as an obstacle, so the brakes were never applied. The Tesla ran
underneath the trailer, fatally injuring Mr. Brown. After the accident, a neighbor recalled Mr. Brown saying in
another context a few weeks earlier, "For something to catch Elon
Musk's eye, I can die and go to heaven now." No one knows how serious Mr. Brown was when he said that. But he will go down in history as the
first person in the U.S., and perhaps in the world, to die in a car that was
operating in its self-driving mode.
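Tesla's explanation describes a classic perception failure: a bright target against a bright background. Purely to illustrate the principle (this is emphatically not Tesla's actual algorithm, which pairs a camera with radar, and the luminance numbers and threshold below are invented), here is a toy Python calculation showing why a detector that keys on brightness contrast can miss a white trailer against a washed-out sky:

# Toy illustration of a low-contrast detection failure (Python).
# This is NOT Tesla's algorithm; the luminance values and threshold
# are invented solely to show why a white trailer against a bright
# sky is a hard case for a camera that keys on brightness contrast.

def weber_contrast(target: float, background: float) -> float:
    """Weber contrast: (L_target - L_background) / L_background."""
    return (target - background) / background

DETECTION_THRESHOLD = 0.10  # hypothetical: detector needs ~10% contrast

bright_sky = 0.90       # relative luminance, arbitrary units
white_trailer = 0.92    # nearly as bright as the sky behind it
dark_truck_cab = 0.20   # a dark object would stand out easily

for name, lum in [("white trailer", white_trailer),
                  ("dark truck cab", dark_truck_cab)]:
    c = abs(weber_contrast(lum, bright_sky))
    print(f"{name}: contrast = {c:.2f}, detected = {c >= DETECTION_THRESHOLD}")

# Output:
# white trailer: contrast = 0.02, detected = False
# dark truck cab: contrast = 0.78, detected = True

Tesla's posting adds that the radar is deliberately tuned to ignore returns that resemble overhead road signs, which a high-riding trailer can apparently mimic, so neither sensor flagged the obstacle in time.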
Will this tragedy spell doom for self-driving cars? Almost certainly not. The first widely reported steam-locomotive railway
fatality was that of the English politician William Huskisson, who attended the
opening ceremonies of the Liverpool and Manchester Railway on Sept. 15, 1830, which
featured inventor George Stephenson's locomotive the Rocket. Wanting to
shake the hand of his former political enemy the Duke of Wellington, Huskisson
walked over to the Duke's railway carriage, then saw that the Rocket was bearing down on him on a
parallel track. He panicked, tried
to climb onto the carriage, and fell back onto the track, where the locomotive
ran over his leg and caused injuries that were ultimately fatal. Passengers had been warned to stay
inside the train, but many paid no attention.
If Huskisson's death had been mysterious and
incomprehensible, it might have led to a wider fear of railways in
general. But everyone who learned
of it took away the useful lesson that hanging around in front of oncoming
steam locomotives wasn't a good idea, and railways became an essential feature
of modern life. Still,
every accident can teach engineers and the rest of us useful lessons about how to
prevent the next one, and the same is true in Mr. Brown's sad case.
It's not clear exactly how long Version 7.0 of the Model S
software, which introduced the autopilot function, has been available, but it has
probably been out for the better part of a year.
Multiply that time by the number of Model S owners and how far they
drive, and you have a track record showing that if anything much is wrong with
the software, it's not very wrong.
Model S owners aren't dying like flies in autopilot accidents. Still, telling drivers how great a
self-driving feature is, and then expecting them to pay constant attention as
though the car were a driver's-ed student and they were the instructor, is
sending a mixed message.
Tesla's own posting about the accident cites statistics showing that,
if anything, cars running on autopilot have a lower fatality rate per mile than
the U.S. average, and that may be true. But as Tesla's
public profile rises, the firm has some delicate maneuvering ahead of it to
avoid becoming a target for lawyers who will want to portray Tesla in court as
heedless of driver safety.
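For the record, the numbers in that posting were roughly these: one fatality in about 130 million miles driven with Autopilot active, against a U.S. average of one fatality per 94 million vehicle miles. A back-of-envelope comparison makes the gap concrete, though with exactly one Autopilot fatality on the books, the error bars are enormous:

# Back-of-envelope comparison using the figures from Tesla's posting.
# One fatality is a single data point; treat with heavy skepticism.

autopilot_miles = 130e6       # miles driven with Autopilot engaged (Tesla's figure)
autopilot_fatalities = 1      # Mr. Brown's accident

us_miles_per_fatality = 94e6  # U.S. average across all vehicles (Tesla's figure)

autopilot_rate = autopilot_fatalities / autopilot_miles
us_rate = 1 / us_miles_per_fatality

print(f"Autopilot: {autopilot_rate * 1e8:.2f} fatalities per 100 million miles")
print(f"U.S. avg:  {us_rate * 1e8:.2f} fatalities per 100 million miles")

# Autopilot: 0.77 fatalities per 100 million miles
# U.S. avg:  1.06 fatalities per 100 million miles

The comparison is also somewhat apples-to-oranges, since Autopilot is used mostly on divided highways, which are among the safest roads per mile to begin with.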
We've known since the earliest days of automobiles that they
are dangerous in careless hands and require constant vigilance on the part of
the operator. Plenty of people
ignore that fact and pay for it with injuries or their lives, and take the
lives of others as well. But
everybody, whether safe or careless, still admits it's a good idea to pay
attention while you're driving.
Now, however, something fundamentally new has been added. When a car has a self-driving feature
that nevertheless requires you to be ready to take command at a moment's notice,
the driver is torn between letting the machine take over and keeping a constant
lookout for trouble. You can't
both be constantly vigilant and also watch a Harry Potter movie, as Mr. Brown
may have been doing at the time of the accident. In most of us, especially guys, attention is a focused thing
that has to be directed at one primary target at a time. Even if I had a self-driving car (which I don't) and had
driven it for a while to learn what it typically can and can't do, I
wouldn't feel very comfortable just sitting there waiting for something
awful to happen and then having to spring into action once I decided that the
car wasn't doing the right thing.
That's a big change of operating modes to ask of a person, especially
one who has been lulled into total trust of the software by many miles of
watching it perform well. Who
wouldn't be tempted to watch a movie, or read the paper, or even sleep?
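To put some numbers on that handoff problem: at divided-highway speeds, a car covers a lot of ground in the seconds it takes a human to re-engage. The takeover times in this little Python sketch are assumptions chosen for illustration, not measurements; published human-factors studies range from a couple of seconds for an alert driver to well over ten for a deeply distracted one:

# Illustrative arithmetic only: how far a car travels during a takeover.
# The takeover times below are assumptions, not measurements.

MPH_TO_MPS = 0.44704
speed_mps = 65 * MPH_TO_MPS   # 65 mph, a typical divided-highway speed

scenarios = [("attentive driver", 2.0),
             ("distracted driver", 7.0),
             ("watching a movie", 15.0)]

for label, takeover_s in scenarios:
    distance_m = speed_mps * takeover_s
    feet = distance_m * 3.281
    print(f"{label}: {takeover_s} s -> {distance_m:.0f} m ({feet:.0f} ft)")

# attentive driver: 2.0 s -> 58 m (191 ft)
# distracted driver: 7.0 s -> 203 m (667 ft)
# watching a movie: 15.0 s -> 436 m (1430 ft)

More than a quarter of a mile can go by before the movie-watcher is back in the loop, which is longer than many highway emergencies last.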
I'm afraid we've got some institutionalized hypocrisy here
that most auto companies are fortunately free of. But Tesla is a different kind of beast, founded at a time
when anybody who installs software either has to lie about having read dozens
of pages of legal gobbledegook, or actually read them, before clicking the "I
Agree" button. The impression
I have of the arrangement between Tesla and Model S owners is that Tesla
pretends that owners have to keep
their hands on the wheel, and the owners pretend that they're following
instructions. And the pretense has
made the lawyers happy, I suppose—until now.
Now that the much-anticipated First Fatality has happened,
things could go in any of several directions. The National Highway Traffic Safety Administration,
which is investigating the accident, could come out with a bunch of
heavy-handed federal regulations that could squash or set back autonomous
vehicles in the U.S. for many years.
Joshua Brown's relatives could mount a lawsuit that could cripple
Tesla. Or (and this is the outcome I'm
hoping for) Tesla's engineers can learn what went wrong in Mr. Brown's case,
fix it, and deliver clearer, more practical instructions about how to use the
self-driving feature, along with the human-factors engineering that now seems
to be missing, so that the remaining Tesla drivers can lessen their
chances of becoming Fatality No. 2.
Sources: Many news outlets carried reports of
Mr. Brown's death. Tesla's own
posting concerning the incident appeared June 30 at https://www.teslamotors.com/blog/tragic-loss. I also referred to the report in Fortune's online edition at http://fortune.com/2016/07/02/fatal-tesla-crash-blind-spot/,
the New York Times report on Mr.
Brown's background at http://www.nytimes.com/2016/07/02/business/joshua-brown-technology-enthusiast-tested-the-limits-of-his-tesla.html,
the Tesla press kit on its autopilot at https://www.teslamotors.com/presskit/autopilot,
and the Wikipedia article on William Huskisson. Thanks to my wife for notifying me about the incident.
Mr. Brown illustrates that one of the arguments for self-driving cars—that poor, risk-taking drivers are better off with the technology—may need reconsideration.
The same risk-taking mindset that makes someone a poor driver on his own is likely to mean that he makes poor decisions about when to use an automatic driving system, as well as how he uses it. Depending on the system on an extremely bright day with traffic crossing in front of him illustrates the first. Watching a movie instead of the road illustrates the second. There's only so much you can do to help a fool.
Personally, I see no need for a self-driving car. My record shows I'm a safe driver, and there are ways to make use of my hands-on-wheel-and-active driving time for other purposes—such as listening to audiobooks—that don't affect my driving ability.
If anything, an audiobook keeps me from being bored and inattentive on long trips. In 2012, I listened to several audiobooks on a 2900-mile, cross-country move.