On May 12, thousands of Windows users around the globe suddenly saw a red screen with a big padlock image and a headline that read,
"Ooops, your files have been encrypted!" It turned out to be a ransom note generated by an Internet
worm called WannaCry. The ransom
demanded was comparatively small—about US $300—but the attack itself was
not. The most serious damage was done in Great Britain, where many National Health Service computers locked up, causing delays in surgery and preventing access to files containing critical patient data. Fortunately, a security researcher found a kill switch for the worm, halting its spread, but over 200,000 computers were affected in over 100 countries, according to Wikipedia.
No one knows for sure who implemented this attack, although we do know the
source of the software that was used:
the U.S. National Security Agency, which developed something called the EternalBlue exploit to spy on computers. Somehow it got into the wild and was weaponized by a group that may be based in North Korea, but no one is sure.
At this writing, the attack is mostly over except for the cleanup, which is
costing millions as backup files are installed or re-created from scratch, if
possible. Experts recommended not
paying the ransom, and it's estimated that the perpetrators didn't make much money on the deal. The ransom was payable only in bitcoin, the digital currency that is difficult to trace.
Writing in the New York Times,
editorialist Zeynep Tufekci of the School of Information and Library Science at
the University of North Carolina put the blame for the attack on software
companies. She argues that the way upgrades and security patches are handled is itself exploitative and does a disservice to customers, who may have good reasons not to upgrade a system. This was painfully obvious in Great Britain, where the National Health Service was running many old Windows XP systems, although the vast majority of the computers affected were running the more recent Windows 7.
Her point was that life-critical systems such as MRI machines and
surgery-related instruments are sold as a package, and incautious upgrading can
upset the delicate balance that is struck when a Windows system is embedded
into a larger piece of technology.
She suggested that companies like Microsoft spend some of the $100 billion in cash they are sitting on to provide free upgrades to customers who would normally have to pay for the privilege.
There is plenty of blame to go around in this situation: the NSA, the NHS, Microsoft, and
ordinary citizens who were too lazy to install patches that they had even paid
for. But such a large-scale failure of what has become an essential part of modern technological society raises questions that, until now, we have mostly been able to ignore.
When I described a much smaller-scale ransomware attack in this space back
in March, I likened it to a foreign military invasion. That analogy doesn't seem to be too
popular right now, but I still think it's valid. What keeps us from viewing the two cases similarly has to do
with the way we've been trained to look at software, and the way software
companies have managed to use their substantial monopolistic powers to set up
conditions in their favor.
Historically, such monopolistic abuse has come to an end only through
vigorous government action to call the monopoly to account. The U.S. National Highway Traffic Safety Administration, for example, can conduct investigations and levy penalties on auto companies that violate the rules or behave negligently. So far, software firms have almost completely avoided any
form of government regulation, and the free-marketers among us have pointed to
them as an example of how non-intervention by government can benefit an
industry.
Well, yes and no. People have
made a lot of money in the software and related industries—a few people,
anyway, because the field is notorious for the huge returns it can give a few
dozen employees and entrepreneurs who happen to get a good idea first,
implement it, and dominate a new field (think Facebook). But when you realize that the same companies charge customers over and over again for the ever-required upgrades and security patches (which are often bundled together, so you can't keep the software you like without having it get hacked sooner or later), it becomes hard, in some ways, to tell the difference between a software company and an old-fashioned protection racket, in which a guy flipping a blackjack in his hand comes into your candy store, looks around, and says, "Nice place you got here. It would be a shame if anything should happen to it."
Software performs a valuable service for billions of people, and I'm not calling for a massive takeover of software firms by the government. And users of software have some responsibility for doing maintenance, assuming that the maintenance is reasonably priced, isn't impossibly hard to do, and doesn't lead to situations that make the software less useful. But
when a major disaster like WannaCry can cause such global havoc, it's time to
rethink the fundamentals of how software is designed, sold (technically, it's licensed, not sold), and maintained.
And like it or not, the U.S. market has a huge influence on these things.
Even the threat of regulation can have a most salutary effect on monopolistic firms, which, to avoid government oversight, often enter voluntarily into industry-wide agreements to implement reforms rather than let the government take over the job. It's
unlikely that the current chaos going on in Washington is a good environment in
which to undertake this task, but there needs to be a coordinated, technically
savvy, but also ethically deep conversation among the principals—software
firms, major customers, and government regulators—to find a different way of
doing security and upgrades, which are inextricably tied together.
I don't know what the answer is, but companies like Microsoft may have to
accept some form of restraint on their activities in exchange for remaining
free of the heavy hand of government regulation. The alternative is that we continue muddling along as we
have been while the growth of the Internet of Things (IoT) spreads highly
vulnerable gizmos all across the globe, setting us up for a tragedy that will
make WannaCry look like a minor hiccup.
And nobody wants that to happen.
Sources: Zeynep Tufekci's op-ed piece "The World Is Getting Hacked. Why Don't We Do More to Stop It?" appeared on the website of the New York Times on May 13, 2017, at https://www.nytimes.com/2017/05/13/opinion/the-world-is-getting-hacked-why-dont-we-do-more-to-stop-it.html. I also referred to the Wikipedia article "WannaCry ransomware attack." My blog post "Ransomware Comes to the Heartland" appeared on Mar. 27, 2017.