Monday, August 23, 2021

Federal Safety Agency Investigates Tesla Autopilot


In 2015, the upstart automaker Tesla introduced its Autopilot feature, an advanced artificial-intelligence-enabled system that takes over most of the routine operations that a human driver normally performs.  At the same time, Tesla warned Autopilot users that they should remain attentive with their hands on the wheel at all times, even when Autopilot is engaged.


This is a little bit like taking a hungry child into a candy store and telling them not to touch anything.  Most kids will obey, but it's hard on the kid and it can lead to embarrassing situations.


It's not surprising that, according to the U.S. National Highway Traffic Safety Administration (NHTSA), ten people have died since 2016 in eight crashes of Tesla vehicles in which the Autopilot feature was in use.  Lately, there have been numerous crashes, one of them fatal, in which Teslas with Autopilot engaged have run into the rear of emergency vehicles with flashing lights.


Finally, the NHTSA has had enough.  It has launched a formal investigation into how the Autopilot system works, how it is implemented, what its defects are, and what steps Tesla has taken to make sure that drivers are paying attention as they are supposed to when Autopilot is driving the car.  There is abundant evidence that in many of the crashes, the driver was doing something other than watching the road:  watching a movie, playing a video game, or even sitting in a seat other than the driver's seat.  The Autopilot system is supposed to sense torque on the steering wheel as evidence that the driver's hands are on it, but according to some sources, this feature is very easy to defeat, and many people appear to have done so.  And most of them probably get away with it most of the time.  But not always.
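
To see why a hands-on-wheel check can be so easy to fool, here is a minimal sketch in Python.  It is purely hypothetical and is not Tesla's code; the thresholds, sampling rate, and variance heuristic are all assumptions made for illustration.  The idea is that a check that only asks "is some torque present?" is satisfied forever by a weight clipped to the wheel, while a check that asks whether the torque actually varies, the way a real hand's small corrections do, is not.

# Hypothetical, simplified hands-on-wheel monitor -- not Tesla's actual code.
# All numbers below are assumptions chosen for illustration only.

from statistics import pvariance

WARN_AFTER_S = 10.0        # assumed: warn after 10 s without driver input
SAMPLE_PERIOD_S = 0.1      # assumed torque-sensor sampling period
TORQUE_THRESHOLD_NM = 0.3  # assumed minimum torque counted as "hands on"

def hands_on_naive(torque_nm: float) -> bool:
    """Naive check: any torque above a threshold counts as a human hand.
    A weight or clip hung on the wheel passes this test indefinitely."""
    return abs(torque_nm) > TORQUE_THRESHOLD_NM

def hands_on_variance(recent_torques_nm: list[float]) -> bool:
    """Stricter check: real hands make small, irregular corrections, so
    require the torque to vary over the recent window, not merely exist."""
    if len(recent_torques_nm) < 2:
        return False
    return pvariance(recent_torques_nm) > 0.01  # assumed variance floor

def seconds_until_warning(torque_samples_nm: list[float]) -> float:
    """Walk a stream of torque samples and report when a warning would fire."""
    window: list[float] = []
    idle_s = 0.0
    for t in torque_samples_nm:
        window = (window + [t])[-50:]   # keep roughly the last 5 s of samples
        if hands_on_variance(window):
            idle_s = 0.0                # varying input detected: reset the timer
        else:
            idle_s += SAMPLE_PERIOD_S
        if idle_s >= WARN_AFTER_S:
            return idle_s               # warning fires here
    return float("inf")                 # no warning needed for this trace

# A defeat device applying a constant 0.5 Nm fools the naive check forever,
# but the variance-based check triggers a warning after about ten seconds:
constant_torque = [0.5] * 200
print(hands_on_naive(0.5))                     # True: naive check is fooled
print(seconds_until_warning(constant_torque))  # ~10.0: variance check warns

Even the variance version is only a toy; production driver-monitoring systems reportedly combine steering input with cabin cameras and other cues, and a determined driver can still defeat them, which is presumably part of what the NHTSA will be looking at.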


In human-machine safety issues, there is a tradeoff between two poles, each representing an extreme approach to operating a device safely.  One pole relies entirely on training the individual not to do dangerous things, or to do them in a carefully controlled way.  Think of stunt drivers in the movie business:  they do things with cars that cars are not designed to do, but with careful planning and finely-honed skills they manage to survive car flips, crashes, and other tricks, tricks that may be passing into history now that CGI technology is good enough to put real stunt drivers out of work.  But the point is that this approach to safety concentrates on the knowledge and attention of the operator or driver, and basically tells him or her to drive safely.


The other pole of safety is building foolproof safety features into the machine itself, so that even an ignorant five-year-old turned loose with the keys couldn't get hurt.  It's not possible to make a car at a reasonable price that is completely safe no matter what you do—at least not yet.  But many of the autonomous-vehicle-type features that are now showing up on other makes besides Tesla move in this direction:  lane-keeping assistance, automatic emergency braking to avoid running into obstacles ahead, and so on.  They make up for a driver's deficiencies, inattention, or errors.  But they are far from perfect yet, and so the attention and intelligence of the driver are still needed to fill in the gaps where systems like Tesla's Autopilot can't figure out the situation, such as an emergency vehicle stopped in your lane.


I expect the NHTSA will encounter some headwinds in trying to figure out Tesla's Autopilot system.  Elon Musk has, shall we say, a rather cavalier attitude toward convention and traditional ways of doing things, and recently abolished Tesla's public-relations department.  Perhaps he thinks a few tweets from him do just as well, and in the absence of more formal ways of getting information from the company, he may be right.  But nobody can stop the NHTSA from renting or buying some Teslas and putting them through various scenarios and seeing what they do with and without human supervision.  Whatever is going on under the hood, the results will be clear to see.


But testing the hardware and software is only part of the issue.  The poisonous mixture that the NHTSA is dealing with combines an Autopilot system that is very good—so good that people really can let it drive the car for many minutes at a time and get away with it—and drivers who either intentionally put too much trust in the Autopilot system, or simply get distracted and fail to do what they know they ought to be doing, which is looking at the road.  But nobody accidentally starts watching a movie or playing a video game, so we must conclude that in at least some of the cases where inattention and Autopilot combined to cause a crash, people simply ignored Tesla's warning not to let Autopilot drive the car by itself, and paid the penalty for their inattention.


Now in some countries and cultures (and political persuasions—notably extreme libertarianism), this would not be a concern of the government's.  If people want to do foolish things and ignore instructions, well, let them do it and suffer the consequences.  The problem with this attitude is that it ignores everybody else, particularly other people who might be harmed or killed in the same accident.


My point is simply that we in the U.S. have grown accustomed to holding automakers to safety standards that avoid preventable accidents, "preventable" in the sense that the accidents follow a consistent pattern which reasonable interventions, at not too much cost, can prevent.


We are in a curious transition phase in which systems like Autopilot are good enough to fool us into thinking they can really drive our cars without us paying any attention, but not good enough to do it for real.  And until it is just as safe to play pinochle from the driver's seat as it is at home, we need some way to remind drivers that they can't ignore the road even if the car seems to be driving itself.


Sources:  I consulted an article in Consumer Reports at https://www.consumerreports.org/autonomous-driving/nhtsa-safety-defect-investigation-tesla-autopilot-crashes-a6996819019/ and an article in the Aug. 16 online edition of the Austin American-Statesman, "Feds Open Investigation Into Tesla's Autopilot System."  The statistic on total fatalities and accidents involving Autopilot since 2016 was obtained from https://thehill.com/changing-america/sustainability/infrastructure/561717-increasing-number-of-crashes-involving-teslas#:~:text=In%20total%2C%20at%20least%2010,each%20year%20in%20the%20U.S.
