The National Health Service
(NHS) in England is one of the oldest government health-care systems in the
world, founded in 1948 when the Labour Party was in power. Despite consuming some 30% of the public
service budget, by many accounts it is underfunded, especially when it comes to
capital equipment such as IT systems. This
may be a factor in a scandal involving a wayward algorithm that prevented some half-million Englishwomen from receiving mammograms over the last nine years. Estimates vary as to how serious a problem
this is, but it's likely that at least a few women have lost their lives due to
breast cancer that was caught too late as a result of this computer error.
A report carried in the
IEEE's "Risk Factor" blog describes how in 2009, an algorithm
designed to schedule older women for breast cancer screening was set up
incorrectly. As a result, over the next
nine years almost 500,000 women aged 68 to 71 were not allowed to have
mammograms that they otherwise would have been scheduled for. When the error was caught, the news media had
a field day with headlines like "Condemned to Death . . . by an NHS
Computer." Depending on who's
making the statistical estimate, the consequences are either tragic or possibly
beneficial.
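The exact coding error has never been published, but failures of this kind often trace back to a simple boundary condition at the edge of a policy. Here is a minimal, purely hypothetical sketch in Python of how such a slip could look; the function names, age limits, and comparison operators are my assumptions, not the actual NHS logic:

# Hypothetical sketch of a boundary bug in a screening-eligibility test.
# The names and age limits below are illustrative assumptions only; the
# actual NHS scheduling code has not been made public.

LOWER_AGE = 50   # first routine screening invitation
UPPER_AGE = 70   # policy: women stay eligible up to their 71st birthday

def eligible_buggy(age: int) -> bool:
    # Strict '<' ends eligibility a year early, so a woman whose final
    # three-yearly invitation falls at age 70 is silently never invited.
    return LOWER_AGE <= age < UPPER_AGE

def eligible_fixed(age: int) -> bool:
    # Inclusive upper bound matches the stated policy.
    return LOWER_AGE <= age <= UPPER_AGE

for age in (68, 69, 70, 71):
    print(age, eligible_buggy(age), eligible_fixed(age))

Only the age-70 case flips, but multiplied across a national patient database and nine years of invitation cycles, even a one-year boundary slip can add up to hundreds of thousands of missed scans.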
The UK's own Health Secretary had his statisticians run the numbers, and they came up with a range of 135 to 270 women who may have died as a result of this error. But others claim that as many as 800 women
may be better off because of not having to go through surgical and other
procedures based on the false positives that inevitably result from a large
number of mammograms.
While the actual consequences of this problem are ambiguous, it raises a larger issue: what should we do when computer algorithms that affect the fates of thousands go awry?
As a practical matter,
computer algorithms are part of the fabric of modern industrial society
now. If you want to borrow money, the
bank uses algorithms to decide whether you're a good credit risk. If you look for something online,
sophisticated algorithms take note of it and decide what other kinds of ads you
see. And if you're in England or another country where health care is allocated by a central computerized authority, a computer is going to tell you when you can get certain kinds of preventive health care, and if you're ill, it may even tell you when you can get treated, if at all.
From a utilitarian
engineering perspective, computer algorithms are the ideal solution for
large-scale resource-allocation problems.
Health care these days is very complicated. Each person has a unique combination of
health history, genetic makeup, and needs, and the arsenal of treatments is
constantly changing too. If you are working in an environment of centralized, fixed resources (as the NHS is), then you
will naturally turn to computers as a way of implementing policies that can be
shown mathematically to treat everyone equally.
Unless they don't, of course, as happened with the older women who were
left out of mammogram screenings by the badly programmed algorithm.
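A standard safeguard against exactly this failure mode is boundary-value testing: exercising the rule at, and just beyond, each edge the written policy specifies, since that is where off-by-one mistakes live. A brief sketch along the same hypothetical lines as above, with assumed rather than actual age limits:

# Hypothetical boundary-value tests for an age-eligibility rule.
# The limits (50 through 70 inclusive) are assumptions for illustration.

LOWER_AGE, UPPER_AGE = 50, 70

def eligible(age: int) -> bool:
    return LOWER_AGE <= age <= UPPER_AGE

def test_boundaries() -> None:
    # Check exactly at and adjacent to each policy edge.
    assert not eligible(LOWER_AGE - 1)   # 49: too young
    assert eligible(LOWER_AGE)           # 50: first eligible year
    assert eligible(UPPER_AGE)           # 70: last eligible year
    assert not eligible(UPPER_AGE + 1)   # 71: aged out

test_boundaries()
print("all boundary checks passed")

Had checks like the last two assertions been run against the scheduling code, the discrepancy between policy and program would presumably have surfaced in seconds rather than nine years.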
There's an old saying,
"To err is human, but to screw up royally requires a computer." The NHS flap is a good example of how one
mistake can affect thousands or millions when multiplied by the power of a
large system.
The U.S., with its much more hodgepodge mixture of private, commercial, and government health care systems, is still not immune to such errors, but because the federal government doesn't run the whole show, its mistakes are somewhat limited in extent. There are also numerous outside agents keeping tabs on things, so that an egregious error by, say, Medicare, comparable to what happened with the NHS algorithm in England, would probably be caught by private insurers before it got too far.
Just as a power grid with a number of small distributed generating
stations is more robust than one that relies exclusively on one giant power
plant, the U.S. health care system, even with all its flaws, is less likely to
be felled by a single coding mistake.
Instead, we have widely
distributed minor errors that cause more inconvenience than tragedy. But precisely because the system is so
kludged together, it doesn't take much to cause a problem.
Here's a simple example: the day I am writing this, my wife is scheduled for a routine well-person exam that requires a referral from her general practitioner (GP). Dutiful, organized person that she is, several weeks ago she went by her doctor's office and asked them to do the referral so she could schedule the appointment, and the staff at the office said they'd take care of it. Yesterday (the day before the procedure), she got a call from the office that was going to perform it, saying they hadn't received the referral yet, and if they didn't get it they were going to cancel the procedure or make us pay cash for it.
There ensued a half-hour or so of near panic, during which we ran down to her doctor's office and
discovered that the lady who was supposed to send the referral out had quit the
previous day. And that was one of the
things she left undone.
When the GP's office staff
figured out what had happened, they were very nice about it: they faxed the referral to the proper office, handed us a copy, which we carried by hand to the office that needed it, and everything is fine now. But until all medical offices are staffed by
robots and all paperwork is untouched by human hands, people will always be
involved in medical care, and people sometimes make mistakes.
Personally, I much prefer a
system in which I can drive over to the office where the mistake was made and
talk to the people responsible. If we
had something like the NHS here, the mistake might have been made in Crystal City, Virginia, by an anonymous person whom it would take the FBI to discover, and my wife would have been out of luck.