
Monday, February 15, 2021

Major Embarrassment for Microsoft: No Majorana Particle After All

 

In 1937, the Italian physicist Ettore Majorana published a paper predicting the existence of something that came to be known as the Majorana particle.  In the society of subatomic particles, the Majorana is rather standoffish:  without a positive or negative charge, without an antiparticle (technically, it's its own antiparticle) and without even a magnetic or electric dipole moment.  Even the famously neutral neutron has a magnetic dipole moment.  A few months after writing the paper, Majorana sent an enigmatic note to a colleague saying he was sorry for what he was about to do, got on a boat bound from Palermo to Naples, and was never seen again. 

 

Of course, the physics community started looking for Majorana particles right away, and the search intensified after people began trying to make quantum computers.  Theoretically, a quantum computer can perform certain kinds of calculations many orders of magnitude faster than ordinary bit-based computers, because each "qubit" can hold a combination of states and thus process more information in a given amount of time.  (That explanation probably gives physicists a headache, but it's the closest I can get in the space I have.)
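
For readers who want to see the bookkeeping behind that claim, here is a minimal sketch in Python (my own illustration, not anything from the Microsoft work): the state of an n-qubit register is a list of 2-to-the-n complex amplitudes, and a standard one-gate-per-qubit operation spreads the register across every one of those slots at once.

    # Minimal sketch: an n-qubit register is described by 2**n complex amplitudes.
    # Applying a Hadamard gate to every qubit puts the register into an equal
    # superposition of all 2**n basis states.  (Illustrative only -- a real
    # quantum computer does not store this vector explicitly; that is the point.)
    import numpy as np

    n = 3                                          # number of qubits
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # single-qubit Hadamard gate

    state = np.zeros(2**n, dtype=complex)
    state[0] = 1.0                                 # start in the state |000>

    U = H
    for _ in range(n - 1):                         # build H x H x H
        U = np.kron(U, H)
    state = U @ state

    print(len(state))                  # 8 amplitudes for 3 qubits
    print(np.round(state.real, 3))     # each amplitude is 1/sqrt(8), about 0.354
    print(np.sum(np.abs(state)**2))    # probabilities still add up to 1.0

Adding one more qubit doubles the length of that amplitude list, which is the (very loose) sense in which a quantum machine can process more information in the same amount of time.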

 

Anyway, it turns out that if engineers could make Majorana particles, their standoffish nature would become a virtue, because the quantum computers people have devised up to now all suffer from a common problem:  insufficient isolation from the environment.  The quantum states needed to do quantum calculations are very delicate, and any little disturbance from magnetic or electric fields, or just the passage of time, busts up the party so much that extensive error correction and multiple processing of the same problem are necessary.  Theorists say that a quantum computer using Majorana particles would be much less prone to such errors because the particles are so inert, relatively speaking. 
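
To put a toy number on how "the passage of time busts up the party," here is a rough Python sketch (my own illustration of garden-variety dephasing, not the Microsoft group's physics): each run of an experiment picks up a small random phase error, and averaging over runs washes out the coherence that quantum algorithms depend on.

    # Toy dephasing model: a qubit starts in the superposition (|0> + |1>)/sqrt(2).
    # Each time step adds a small random phase kick; averaging over many runs
    # shows the usable coherence decaying toward zero, which is why heavy error
    # correction is needed.  (Numbers are arbitrary and purely illustrative.)
    import numpy as np

    rng = np.random.default_rng(0)
    n_runs = 5000
    kick = 0.15                       # phase-noise strength per step (arbitrary)

    for steps in (0, 10, 50, 200):
        phases = rng.normal(0.0, kick, size=(n_runs, steps)).sum(axis=1)
        coherence = abs(np.mean(np.exp(1j * phases)))
        print(f"after {steps:3d} steps: coherence = {coherence:.3f}")
    # Prints values sliding from 1.000 down toward roughly 0.1.

A Majorana-based qubit would, in effect, start out with a much smaller "kick," which is why the theorists are so interested.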

 

So the quantum-computing world was quite impressed back in 2018 when researchers funded by Microsoft announced that they'd finally made a Majorana particle.  The alleged particle wasn't "fundamental" in the sense of being a single elementary entity.  Rather, they said it was a kind of collective phenomenon created by electron interactions in a cold semiconductor. 

 

There's nothing fishy about that.  Even my EE undergrads learn about positively charged "particles" called holes, which turn out to be a collective effect of electrons in a semiconductor.  But in January of 2021, the same research group published a new paper saying basically, "Oops, we screwed up."  Some critical data tending to falsify the result was omitted from the 2018 paper, which they are withdrawing. 

 

The new paper came about when Sergey Frolov, another physicist, questioned the results of the 2018 paper and obtained their raw data, which included points that were not shown in the 2018 paper. 

 

Leo Kouwenhoven, the leader of the Microsoft research team, released the new paper ahead of peer review, along with a note retracting the earlier one.  He declined to comment further while the new paper is under review, but it's fairly clear what happened, as described in a recent report in Wired. 

 

Under pressure to deliver results, the Microsoft team omitted a part of their data, allegedly for "esthetic" reasons, and published the 2018 claim to have discovered a Majorana particle.  In retrospect, omitting the esthetically displeasing data was not a good idea.  But they did the right thing in providing Frolov with unpublished as well as published data, and in issuing a new paper showing that they were basically incorrect in their 2018 interpretation of the same data.

 

Physics is hard enough even when the only motivation is intellectual curiosity.  When the auxiliary pressures of continued funding, fame, fortune, or tenure get into the mix, it's tempting to make claims that later can't withstand intense scrutiny.

 

In my own peculiar little field of ball lightning research, I see this quite often.  Ball lightning is an atmospheric phenomenon which thousands of people have seen over the centuries.  There is a consistent set of characteristics which leaves little doubt that there is a real thing there which occurs rarely, but not so rarely that people never see it.  However, there is as yet no generally accepted scientific explanation for ball lightning, and no one has ever been able to produce anything in a lab that shows the most common characteristics of ball lightning.  Worse yet, there are no photographs or videos that are generally accepted as showing an actual ball lightning object.

 

Of course, taking a photo or video or obtaining other kinds of objective data on ball lightning would be a major accomplishment, and many people have claimed to do so over the years.  But subsequent investigations, either by the original researchers or by someone else, usually show that there is a simpler explanation than ball lightning for what was photographed, or simply leave the question unresolved.

 

I don't attribute base motives to people who publish exciting-looking data that later turns out to be not so exciting.  There's always the chance it will prove to be the real thing, and one important reason for publishing scientific data and interpretations is to get them out in the open so others can look at them and criticize them if necessary, just as Frolov did with the Microsoft data.  And while it's embarrassing and can lead to adverse career consequences, admitting that you made a mistake is part of being an adult, and Kouwenhoven and his group have done the right thing by publishing the later paper and retracting the 2018 one.   

 

Some people would look at this situation and say there's something wrong with the way physics works, but I disagree.  As my wife says in a different context, "More communication is better than less communication."  Let anybody who even thinks they have something worth publishing go ahead and publish it, and let the reviewers and critics have at it as hard as they like, without being mean, of course.  That's the way progress happens.  

 

Allegedly, a person matching Majorana's description was seen in the late 1950s in Valencia, Venezuela.  For several years leading up to his disappearance, Majorana had become increasingly isolated, almost like his eponymous particle.  He may have lived through World War II and afterward in professional silence in Venezuela, or he may have ended his life in the waters off the Italian coast on March 25, 1938.  Some things we just can't know for sure yet, and that goes for physics too.

 

Sources:  Tom Simonite's report "Microsoft's Big Win in Quantum Computing Was an 'Error' After All" appeared on Feb. 12, 2021, at https://www.wired.com/story/microsoft-win-quantum-computing-error/.  I also referred to Wikipedia articles on Ettore Majorana and the Majorana particle. 

Monday, March 25, 2019

Quantum Computers Are Analog Computers


Today's topic may be a little afield of conventional engineering ethics, but it involves billions of dollars at risk and the future of an entire engineering-intensive industry, so that's enough to make it an ethical concern already. 

Most engineers have heard of Moore's Law, the observation that eventually became the backbone of the semiconductor industry's road map, or marching orders:  the number of transistors on a chip, and with it computing power, doubles roughly every two years.  In recent years, Moore's Law has run into difficulties because you can't make conventional transistor logic gates arbitrarily small:  below a certain size, the electrons no longer stay where you want them to, because of their quantum nature. 
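
As a rough worked example (my own back-of-the-envelope arithmetic, not anything from the Spectrum article), starting from the roughly 2,300 transistors of the first microprocessor, the Intel 4004 of 1971, and doubling every two years gives

    $$ N(2019) \;\approx\; 2300 \times 2^{(2019-1971)/2} \;=\; 2300 \times 2^{24} \;\approx\; 3.9\times 10^{10}, $$

which is in the same ballpark as the tens of billions of transistors on today's largest chips.  That is the kind of exponential ride the industry does not want to get off of.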

But not to worry:  for close to four decades now, we've been told that when conventional computer logic circuits can no longer be improved, the industry will switch to "quantum computers," which are based on an entirely different principle that takes advantage of quantum effects, and Moore's Law or its quantum equivalent will keep advancing computer power indefinitely into the future.  This transition to quantum computing has been held out as the best hope for continued progress, and currently it's taken quite seriously by major players in hardware, software, and finance.  IBM and Microsoft, among others, are spending tons of money on quantum computing, and each year  thousands of research papers (mostly theoretical ones) are published about it.

In the face of all this optimism comes one Mikhail Dyakonov, a Russian-born physicist currently at the Université de Montpellier and the CNRS in France.  Dyakonov is well known for his discoveries in plasma and quantum physics over a long career (he is 78).  And last November, the website of the professional engineering magazine IEEE Spectrum published his article "The Case Against Quantum Computing," in which he expresses serious doubts that a practical quantum computer capable of anything more than what conventional computers can do now will ever be built.

Along the way, he gives the most straightforward non-technical explanation of what a quantum computer is that I have seen, and I've seen many over the years.  The gist of the difficulty, he says, is that conventional computers store information as transistor states which are either on or off.  With a clear definition of on and off in terms of a current, say, it's not that challenging to set up and process data in the form of on-or-off bits, which are the essence of what we mean by "digital."  Discrete unambiguous states are the key to the entire conventional-computer intellectual construct, and while errors do occur, there are well-known and not terribly demanding ways to correct them.  That is how we got to where we are today.
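
Here is about the simplest possible illustration of what "not terribly demanding" error correction means on the digital side (a minimal sketch in Python, my own, standing in for the fancier codes real hardware uses): store each bit three times and take a majority vote when you read it back.

    # Classical error correction at its most basic: triple redundancy plus a
    # majority vote.  A single flipped copy is corrected automatically because
    # the states are discrete and unambiguous.  (Real memory uses smarter codes
    # such as Hamming/ECC, but the principle is the same.)

    def encode(bit):
        return [bit, bit, bit]

    def decode(copies):
        return 1 if sum(copies) >= 2 else 0    # majority vote

    word = encode(1)      # stored as [1, 1, 1]
    word[0] ^= 1          # a noise event flips one copy: [0, 1, 1]
    print(decode(word))   # prints 1 -- the error never reaches the user

Nothing comparable is that easy when the "bit" is a continuous quantum state you are not allowed to look at directly.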

But the fundamental logical unit in a quantum computer is not a conventional on-or-off current or voltage.  It is the quantum state of a "qubit," which can be embodied, for instance, in the direction that the magnetic axis of an electron points.  And as long as you are not taking a measurement (roughly equivalent to reading out data), the information that makes quantum computing work is the exact angle of that spin with respect to some reference direction.  And that angle is not just up or down, 1 or 0, but can take on any value between plus and minus 90 degrees. 
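
In the standard textbook notation (which measures the angle from the reference axis rather than from the equator, but the point about continuity is the same), a single qubit's state is written as

    $$ |\psi\rangle \;=\; \cos\tfrac{\theta}{2}\,|0\rangle \;+\; e^{i\varphi}\sin\tfrac{\theta}{2}\,|1\rangle, \qquad 0 \le \theta \le \pi,\quad 0 \le \varphi < 2\pi, $$

and both of those angles are continuous quantities, not 1s and 0s.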

Back where I come from, a computer which stores information in the form of continuous physical states is called an analog computer.  Most people younger than 40 have little or no memory of analog computers, but surprisingly sophisticated problems were solved on these things from the early 20th century up to the 1960s.  However, they were comparatively slow and had very limited accuracy, typically a percent or so.  And when digital computers came along, virtually all analog computers became museum pieces (think of how many people you see using slide rules these days).  One of the last ones to go was a curious system that took synthetic-aperture radar (SAR) data from a flying airplane and transformed the data into light and dark patches on photographic film.  Then the film was placed into an optical system that performed a Fourier transform on the data, and presto!  you obtained the real-space version of the SAR image:  the actual mountains and valleys that the plane flew over.  Since this gizmo used light waves, and light waves are fundamentally quantum in nature, I suppose you could have called that a quantum computer, though nobody did.
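
For the curious, the mathematical heart of that optical trick, a lens forming a spatial Fourier transform of whatever pattern sits on the film, is easy to mimic numerically today.  Here is a rough numpy stand-in for just that one stage (my own illustration, not the actual SAR processing chain):

    # Digital stand-in for the optical stage of the old SAR processor: a lens
    # forms the two-dimensional Fourier transform of the transparency placed in
    # its front focal plane.  Here we transform a simple rectangular aperture,
    # whose transform is the familiar sinc-squared diffraction pattern.
    # (A real SAR focusing chain involves more than this single step.)
    import numpy as np

    N = 512
    film = np.zeros((N, N))
    film[N//2 - 8:N//2 + 8, N//2 - 32:N//2 + 32] = 1.0        # rectangular "slit"

    pattern = np.abs(np.fft.fftshift(np.fft.fft2(film)))**2   # what the lens gives

    print(pattern.shape)                       # (512, 512)
    print(pattern.max())                       # bright central peak: (16*64)**2
    print(pattern[N//2, N//2 + 12] / pattern.max())   # a much weaker sidelobe

Swap the toy aperture for real recorded data and you have, in essence, the computation that optical system did with nothing but light.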

And you can bet nobody who is promoting quantum computing is going to refer to their goal as an analog computer, because for decades "analog" has been an embarrassing term in the world of computation.  But guess what:  Dyakonov has explained to us mortals that quantum computers have to manipulate and store data in analog form.  And the same kinds of problems of accuracy and errors that caused the analog-computer dinosaurs to die off are currently keeping quantum computers from getting any farther than they have so far, which is not very far (no practical quantum computers are in commercial production).  You think reading out an analog computer's shaft position accurately is hard?  Try measuring the spin of a single electron without disturbing it.  I may be oversimplifying things, but that seems to be the essence of what has to be done.  And Dyakonov points out that the experts themselves say they'll need thousands of logical qubits to do anything useful, and perhaps up to a thousand physical qubits per logical qubit to have enough information to correct the inevitable errors. 
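
Just to multiply out the round numbers in that last sentence (my arithmetic, not Dyakonov's exact figures):

    $$ N_{\text{physical}} \;\approx\; \underbrace{1000}_{\text{logical qubits}} \times \underbrace{1000}_{\text{physical per logical}} \;=\; 10^{6}\ \text{physical qubits}, $$

every one of which has to be controlled, isolated, and read out to analog precision at the same time.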

In sum, Dyakonov thinks the quantum-computing fad may be going the way of the superconducting-computer fad, which flared in the 1980s and died in the early 2000s when conventional silicon-based computers overtook superconducting ones performance-wise.  For a time, it was easier to build smaller logic gates out of something called Josephson junctions than it was to make silicon gates.  The problem with Josephson junctions is that they have to be cooled with liquid helium to within a few degrees of absolute zero (and qubit versions need to be colder still, down in the millikelvin range), which leads to all kinds of interface problems.  Ironically, Josephson junctions are one of the leading contenders for the best path to qubits, but handling millikelvin circuits hasn't gotten much easier in the meantime. 

The late science-fiction writer Arthur C. Clarke made a famous comment about elderly scientists: "When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong."  By this criterion, we should ignore Dyakonov and keep working on quantum computers.  But it would be interesting if he turns out to be right. 

Sources:  I read Dyakonov's article in the March 2019 hard-copy issue of IEEE Spectrum, pp. 24-29, but a version is also available online at https://spectrum.ieee.org/computing/hardware/the-case-against-quantum-computing.  I also referred to https://prabook.com/web/mikhail.dyakonov/448309 for Dyakonov's date of birth and the Wikipedia article on him, and the Arthur C. Clarke entry in Wikiquote for what is known as Clarke's First Law. 