Today's topic may be a little far afield of conventional engineering ethics, but it involves billions of
dollars at risk and the future of an entire engineering-intensive industry, and
that is enough to make it an ethical concern.
Most engineers have heard of
Moore's Law, the observation that eventually became the backbone of the
semiconductor industry's road map, or marching orders: the doubling of the number of
transistors on a chip (and with it, roughly, computing power) every two
years. In recent years, Moore's Law has
run into difficulties, because conventional transistor logic gates can be made
only so small before the electrons stop staying where you want them to,
thanks to their quantum nature.
But not to worry: for close to four decades now, we've been told
that when conventional computer logic circuits can no longer be improved, the
industry will switch to "quantum computers," which are based on an
entirely different principle that takes advantage of quantum effects, and
Moore's Law or its quantum equivalent will keep advancing computer power
indefinitely into the future. This
transition to quantum computing has been held out as the best hope for
continued progress, and currently it's taken quite seriously by major players
in hardware, software, and finance. IBM
and Microsoft, among others, are spending tons of money on quantum computing,
and each year thousands of research
papers (mostly theoretical ones) are published about it.
In the face of all this
optimism comes one Mikhail Dyakonov, a Russian-born physicist currently at the
Université de Montpellier and CNRS in France.
Dyakonov is well known for his discoveries in plasma and quantum physics
over a long career (he is 78). And last
November, the website of the professional engineering magazine IEEE Spectrum published his article
"The Case Against Quantum Computing," in which he expresses serious
doubts that a practical quantum computer capable of doing anything more than what
conventional computers can do now will ever be built.
Along the way, he gives the
most straightforward non-technical explanation of what a quantum computer is
that I have seen, and I've seen many over the years. The gist of the difficulty, he says, is that
conventional computers store information as transistor states that are either
on or off. With a clear definition of on
and off in terms of a current, say, it's not that challenging to set up and
process data in the form of on-or-off bits, which are the essence of what we
mean by "digital." Discrete,
unambiguous states are the key to the entire conventional-computer intellectual
construct, and while errors do occur, there are well-known and not terribly
demanding ways to correct them. That is
how we got to where we are today.
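To make that point concrete, here is a minimal sketch (my own illustration, not anything from Dyakonov's article; the voltage levels and noise figure are made-up assumptions): a noisy stored voltage gets snapped back to a clean 0 or 1 by a simple threshold, and a majority vote over three redundant copies corrects an occasional bad reading.

```python
# Toy illustration of why discrete digital states are forgiving.
# The logic levels and noise figure below are assumptions for the example.
import random

V_HIGH, V_LOW, THRESHOLD = 5.0, 0.0, 2.5   # assumed logic levels, in volts

def read_bit(nominal_volts: float, noise_sigma: float = 0.8) -> int:
    """Read a stored voltage through Gaussian noise, then threshold it to 0 or 1."""
    measured = nominal_volts + random.gauss(0.0, noise_sigma)
    return 1 if measured > THRESHOLD else 0

def majority(bits: list) -> int:
    """Triple-modular-redundancy vote: corrects any single bad copy."""
    return 1 if sum(bits) >= 2 else 0

# Store a 1 as three redundant copies, read each back through the noise, and vote.
copies = [read_bit(V_HIGH) for _ in range(3)]
print(copies, "->", majority(copies))
```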
But the fundamental logical
unit in a quantum computer is not a conventional on-or-off current or
voltage. It is the quantum state of a
"qubit," which can be embodied in, for instance, the direction in which
the magnetic axis (the spin) of an electron points.
And as long as you are not taking a measurement (roughly equivalent to
reading out data), the information that makes quantum computing work is the
exact angle of that spin with respect to some reference direction. And that angle is not just up or down, 1 or 0; it can take on any value between plus and minus 90 degrees.
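For readers who want the usual textbook notation (this is standard quantum mechanics, not something specific to Dyakonov's article), the state of a single qubit is written in terms of two continuous angles, which is precisely the analog character at issue:

```latex
% Bloch-sphere parameterization of a single qubit: theta and phi are continuous.
\[
  \lvert \psi \rangle = \cos\!\frac{\theta}{2}\,\lvert 0 \rangle
                      + e^{i\varphi}\sin\!\frac{\theta}{2}\,\lvert 1 \rangle,
  \qquad 0 \le \theta \le \pi,\quad 0 \le \varphi < 2\pi .
\]
```

Setting the first angle to one extreme or the other recovers the digital 0 and 1; everything in between is what a quantum computer has to keep track of.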
Back where I come from, a
computer which stores information in the form of continuous physical states is
called an analog computer. Most people
younger than 40 have little or no memory of analog computers, but surprisingly
sophisticated problems were solved on these things from the early 20th century
up to the 1960s. However, they were comparatively
slow and had very limited accuracy, typically a percent or so. And when digital computers came along,
virtually all analog computers became museum pieces (think of how many people
you see using slide rules these days). One
of the last ones to go was a curious system that took synthetic-aperture radar
(SAR) data from a flying airplane and transformed the data into light and dark
patches on photographic film. Then the
film was placed into an optical system that performed a Fourier transform on
the data, and presto! you obtained the
real-space version of the SAR image:
the actual mountains and valleys that the plane flew over. Since this gizmo used light waves, and light
waves are fundamentally quantum in nature, I suppose you could have called that
a quantum computer, though nobody did.
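As a rough digital stand-in for what that optical processor was doing (a toy sketch only; real SAR focusing is far more involved, and the "scene" below is made up), you can Fourier-transform a simple image and then transform it back:

```python
# Toy digital stand-in for the optical Fourier-transform step described above.
# Real SAR processing is far more involved; this only shows that an inverse
# transform recovers a scene from its Fourier-domain record.
import numpy as np

scene = np.zeros((64, 64))                 # made-up "scene": a bright square target
scene[28:36, 28:36] = 1.0

recorded = np.fft.fft2(scene)              # roughly what the exposed film holds
recovered = np.fft.ifft2(recorded).real    # roughly what the optics undo

print("max reconstruction error:", np.max(np.abs(recovered - scene)))
```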
And you can bet nobody who
is promoting quantum computing is going to refer to their goal as an analog
computer, because for decades, "analog" has been an embarrassing term
in the world of computation. But guess
what: Dyakonov has explained to us mortals that quantum computers have to
manipulate and store data in analog form.
And the same kinds of problems of accuracy and errors that caused the
analog-computer dinosaurs to die off are currently keeping quantum computers
from getting any farther than they have so far, which is not very far (no practical
quantum computers are in commercial production). You think reading out an analog computer's
shaft position accurately is hard? Try
measuring the spin of a single electron without disturbing it. I may be oversimplifying things, but that
seems to be the essence of what has to be done.
And Dyakonov points out that the experts themselves say they'll need
thousands of logical qubits to do anything useful, and perhaps a thousand
or more physical qubits per logical qubit to have enough redundancy to correct the
inevitable errors.
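The back-of-the-envelope arithmetic implied by those figures is sobering (the numbers below are just the rough estimates quoted above, not precise engineering requirements):

```python
# Rough arithmetic using the estimates quoted above.
logical_qubits_needed = 1_000     # "thousands" of logical qubits, taken at the low end
physical_per_logical = 1_000      # roughly a thousand physical qubits per logical qubit
total_physical = logical_qubits_needed * physical_per_logical
print(f"{total_physical:,} physical qubits")   # on the order of a million
```

That is on the order of a million continuously varying physical qubits to keep under control before such a machine can do anything a conventional computer cannot.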
In sum, Dyakonov thinks the
quantum-computing fad may be going the way of the superconducting-computer fad,
which flared in the 1980s and died in the early 2000s when conventional
silicon-based computers overtook superconducting ones in performance. For a time, it was easier to build smaller
logic gates out of something called Josephson junctions than it was to make
silicon gates. The problem with
Josephson junctions is that they have to be cooled to liquid-helium temperatures
of a few kelvin, which leads to all kinds of interface problems. Ironically, Josephson junctions are one of
the leading contenders for the best path to qubits, although the superconducting
qubits built from them have to be kept even colder, at millikelvin temperatures,
and handling such frigid circuits hasn't gotten much easier in the meantime.
The late science-fiction
writer Arthur C. Clarke made a famous comment about elderly scientists: "When
a distinguished but elderly scientist states that something is possible, he is
almost certainly right. When he states that something is impossible, he is very
probably wrong." By this criterion,
we should ignore Dyakonov and keep working on quantum computers. But it would be interesting if he turned out
to be right.
Sources: I read Dyakonov's article in the March 2019
hard-copy issue of IEEE Spectrum, pp.
24-29, but a version is also available online at https://spectrum.ieee.org/computing/hardware/the-case-against-quantum-computing. I also referred to https://prabook.com/web/mikhail.dyakonov/448309
for Dyakonov's date of birth and the Wikipedia article on him, and the Arthur
C. Clarke entry in Wikiquote for what is known as Clarke's First Law.