Until I saw the title of Andrew Futter’s Hacking the Bomb: Cyber Threats and Nuclear Weapons on the
new-books shelf of my university library, I had never given any thought to what
the new threat of cyber warfare means to the old threat of nuclear war. Quite a lot, it turns out.
Futter is an associate professor of international politics at the University
of Leicester in the UK, and has gathered whatever public-domain information he
could find on what the world’s major nuclear players—chiefly Russia, China, and
the U.S.—are doing both to modernize their nuclear command-and-control systems
to bring them into the cyber era, and to keep both state and non-state actors
(e.g., terrorists) from doing what his title mentions—namely, hacking a nuclear
weapon, as well as other meddlesome things that could affect a nuclear nation’s
ability to respond to threats.
The problem is a complicated one. The worst-case scenario would be for a hacker
to launch a live nuclear missile. This
almost happened in the 1983 film WarGames,
back when cyberattacks were primitive attempts by hobbyists using phone-line
modems. Since then, of course, cyber
warfare has matured. Probably the best-known cases are the Stuxnet attack on
Iranian uranium-enrichment facilities (probably carried out by a U.S.-Israeli
team), discovered in 2010, and the 2015 cyberattack, attributed to Russia, that
blacked out part of Ukraine’s power grid. While there are no
known instances in which a hacker has gained direct control of a nuclear
weapon, that is only one side of the hacker coin—what Futter calls the enabling
side. Just as potentially dangerous from
a strategic point of view is the disabling side: the potential to interfere with a nation’s
ability to launch a nuclear strike if needed.
Either kind of hacking could raise the possibility of nuclear war to
unacceptable levels.
At the end of his book, Futter recommends three principles
to guide those charged with maintaining control of nuclear weapons. The problem is that two of the three
principles he calls for run counter to the tendencies of modern computer
networks and systems. His three
principles are (1) simplicity, (2) security, and (3) separation from
conventional weapons systems.
Security is perhaps the most important principle, and judging by the fact that
we have not yet seen an accidental detonation of a nuclear weapon, those in
charge of such weapons have done at least an adequate job of keeping that sort
of accident from happening. But anyone who has dealt with computer systems
today, which means virtually everyone, knows that simplicity went out the
window decades ago. Time and again,
Futter emphasizes that while the old weapons-control systems were basically
hard-wired equipment that the average technician could understand and
repair, any modern computer replacement will probably involve many levels of
complexity in both hardware and software.
Nobody will have the same kind of synoptic grasp of the entire system
that was possible with 1960s-type hardware, and Futter is concerned that what
we can’t fully understand, we can’t fully control.
Everyone outside the military organizations charged with
control of nuclear weapons is at the disadvantage of having to guess at what
those organizations are doing along these lines. One hopes that they are keeping the newer
computer-control systems as simple as possible, consistent with
modernization. The principle more likely to be honored than simplicity is
separation: keeping a clear boundary between the control systems for
conventional weapons and those controlling nuclear weapons.
Almost certainly, the nuclear-weapons control networks are
“air-gapped,” meaning that there is no physical or intentional electromagnetic
connection between the nuclear system and the outside world of the
Internet. This was true of the control
system that Iran built for its uranium centrifuges, but despite their air-gap
precaution, the developers of Stuxnet were able to bridge the gap, evidently
through the carelessness of someone who brought in a USB flash drive containing
the Stuxnet virus and inserted it into a machine connected to the
centrifuges.
Such air-gap breaches could still occur today. And this is where the disabling part of the
problem comes in.
One problem with live nuclear weapons is that you never get
to test the entire system from initiating the command to seeing the mushroom
cloud form over the target. So we never really know from direct experience
whether the whole chain will work as planned in the highly undesirable event
that the decision is made to use nuclear weapons.
The entire edifice of nuclear strategy thus relies on faith
that each major player’s system will work as intended. Anything that undermines
that faith would be highly destabilizing for the permanent standoff that exists
among nuclear powers. Imagine, say, a message from a hacker demanding money or
a diplomatic favor, with the threat that otherwise all of a nation’s nuclear
weapons will be disabled in some way its own engineers cannot detect.
Though it’s easy to forget, Russia and the U.S. are like
two gunslingers out in front of a saloon, each covering the other with a loaded
pistol. Neither one will fire unless he
is sure the other one is about to fire.
But if one gunman thought that in a few seconds somebody was going to
snatch his gun out of his hands, he might be tempted to fire first. That’s how the threat of an effective
disabling hack might lead to unacceptable chances of nuclear war.
These rather dismal speculations may not rise to the top of
your worry list for the day, but it’s good that someone has at least asked the
questions, and has found that the adults in the room, namely the few military
brass who are willing to speak on the record, are trying to do something
about them. Still, it would be a shame
if after all these decades of successfully avoiding nuclear war, we wound up
fighting one because of a software error.
Sources: Hacking the Bomb: Cyber Threats and Nuclear Weapons by Andrew Futter
was published by Georgetown University Press in 2018. I also referred to the
Wikipedia article on Stuxnet.