Monday, February 22, 2021

The Texas Power Crisis—Will It Happen Again?


I don't often get an email from a colleague in Scotland asking how I'm weathering the power cuts in Texas.  But I did last Friday, after the worst was over. 


For much of last week, millions of Texans had to endure the loss of electric power, and all that entails, during some of the coldest weather on record. 


Early Monday morning, Feb. 15, the Electric Reliability Council of Texas (ERCOT), the nonprofit organization that operates the Texas power grid, ran out of power.  That is to say, the soaring demand due to millions of heating units working overtime exceeded the dropping supply due to equipment failures caused by the same cold weather.  In order to avoid an uncontrolled system crash that would take weeks to recover from, ERCOT commanded its grid customers (operating companies that distribute the power and collect power bills) to implement rolling blackouts, handing out percentages of their load they had to shed.  How they shed it was up to them, but shed they must. 


This put increasing numbers of electric customers in the dark, both physically and informationally.  Although many people could access news and websites through their phones, there wasn't that much to learn about how the power cuts were being decided or how long they would last.  Here in San Marcos, our house had power all day Monday. That evening I was on a Zoom conference with some people in Austin, who'd been told that the rolling blackouts would last only an hour or so.  Instead, it seemed that almost everybody in Austin lost power and stayed in the dark except for a lucky few who happened to be on a "critical feeder" that powered a hospital or fire station. 


In the middle of the Zoom call, our power went out.  My laptop battery kept my computer going, though, and I managed to power our modem and wireless network with an emergency battery powerpack and inverter, and I got online long enough to complete the call.  But then I went to bed at 8:30, as it was getting cold in the house.  We had an hour or two of power every so often for the next two days, but it was mostly off until Wednesday afternoon.  Many people in Austin and other parts of Central Texas fared much worse, losing power for two or three days straight, and when their pipes froze they had to seek out an emergency shelter or stay with friends. 


Like many engineering failures, this one had multiple causes.  On Friday, the IEEE (Institute of Electrical and Electronics Engineers) Smart Grid Initiative sponsored a webinar panel discussion on the crisis, and much of what follows is taken from the information presented in that webinar. 


As you may know, Texas has its own power grid that operates independently of the grids in the rest of the country.  A "grid" is defined by a region where all the power is synchronized to 60 Hz and can be fairly easily shipped back and forth as the need arises.  For historical reasons having to do with the criticality of Texas infrastructure during World War II (and a bit of Texas stubborn independence), most of Texas is covered by the grid supervised by ERCOT.  There are a few power interties between Texas and other grids, but they require special equipment and cannot provide significant amounts of power transfer.


So when an extraordinarily cold air mass charged through Texas beginning on Sunday, ERCOT was going to have to handle it by itself.  On paper, they were prepared.  About half of the 80+ gigawatts (billions of watts, abbreviated GW) of nominal capacity for the grid is natural-gas generation.  ERCOT has over 500 generating stations of various kinds to draw from, and some are more reliable than others.  Coal and nuclear plants are the most reliable kind, but coal plants are being shut down these days and no new nuclear plants are being built.  A quarter of Texas' installed capacity is wind, but wind generators are notoriously unreliable, and once the cold air arrived it stopped moving, idling most of the wind turbines and freezing up many of them. 


The usual alternative in such a situation is to turn on gas-fired turbine generators, which in principle can be started up in minutes to augment the grid's energy sources.  But the same weather that caused demand to soar also crippled the natural-gas infrastructure, freezing well heads and control valves and causing other problems that eventually eliminated some 26 GW of gas-fired generators that could have otherwise been used.  Although at the peak of the crisis, the grid was producing some 62 GW of power, ERCOT would have had to come up with another 10 GW to meet the extraordinary demand produced by single-digit temperatures in Houston and other normally balmy parts of central Texas.  Hence the rolling blackouts that quit rolling and just sat there until pipes froze and people had to seek warmer shelter.
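To make the arithmetic concrete, here is a minimal sketch of the load-shedding fraction ERCOT had to distribute among its operating companies. The figures are my rounding of the numbers above (about 62 GW generated against roughly 72 GW of demand), not official ERCOT data:

```python
# Rough back-of-the-envelope numbers from the crisis (not official
# ERCOT figures): ~62 GW actually generated, ~10 GW short of demand.

def shed_fraction(supplied_gw: float, demand_gw: float) -> float:
    """Fraction of total load that must be shed to balance the grid."""
    shortfall = max(demand_gw - supplied_gw, 0.0)
    return shortfall / demand_gw

supplied = 62.0           # GW on the grid at the peak of the crisis
demand = supplied + 10.0  # GW of estimated demand (about 72 GW)

frac = shed_fraction(supplied, demand)
print(f"Load to shed: {frac:.1%} of {demand:.0f} GW of demand")
```

On these rough numbers, roughly one-seventh of all demand had to go dark at any given moment, which is why the "rolling" blackouts stopped rolling: there was not enough slack to rotate the pain.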


How could this have been avoided?  The panel experts proposed a number of solutions. 


One was much better interconnections to other grids, both from Texas to other grids and across North America.   This is an expensive long-term solution, costing many billions of dollars over many years, and it would increase the robustness of electric power nationwide.  But as one expert pointed out, the grid covering much of the Midwest, just north of ERCOT's grid, was experiencing its own rolling blackouts of less severity, and had no power to spare.  So even if Texas had not been electrically independent, we still would have had blackouts, but perhaps not as severe.


Another idea that has been widely adopted in places like Italy is "demand response."  This is a smart-grid technology that allows the power company to adjust the demand from individual consumers.  For example, in exchange for a discount on your power bill, you might allow your electric utility the right to reduce your thermostat setting or cut off your electric dryer in power emergencies such as the one Texas just experienced.  If demand response had been tested and widely deployed in Texas, the blackouts might have been less severe, but they would probably still have been needed.
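A loose sketch of how a utility's software might apply demand response: given a shortfall, curtail the loads that enrolled customers have volunteered before resorting to blackouts. All the numbers here are hypothetical, chosen only to illustrate the idea:

```python
# Toy demand-response model: each enrolled customer offers some
# curtailable load (kW). The utility curtails those loads first;
# only the remaining shortfall must be met with rolling blackouts.

def curtail(shortfall_kw: float, offers_kw: list[float]) -> tuple[float, float]:
    """Return (kW curtailed via demand response, kW still to black out)."""
    curtailed = 0.0
    for offer in offers_kw:
        if curtailed >= shortfall_kw:
            break
        curtailed += min(offer, shortfall_kw - curtailed)
    return curtailed, shortfall_kw - curtailed

# Hypothetical: 5,000 kW short, 1,500 customers offering small loads
offers = [3.5, 1.2, 4.0] * 500  # kW per enrolled customer
dr, blackout = curtail(5000.0, offers)
print(f"Demand response covers {dr:.0f} kW; {blackout:.0f} kW left for blackouts")
```

Note that in this toy example the enrolled loads cover most, but not all, of the shortfall, which matches the experts' judgment: demand response softens a crisis of this size but does not eliminate the need for blackouts.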


Fortunately, there were not many fatalities directly attributable to the power outages, and so as crises go, this one was greatly inconvenient but not deadly.  Better enforcement of winterization protection for natural-gas plants is the most urgent thing to do that will keep something like this from happening again, but the final call is up to Mother Nature.


Sources:  A webinar called "The Texas Energy Crisis" available to IEEE members and hosted by Peter Wung, IEEE Smart Grid chair, is the source of much of the information in this blog.  A more politically oriented report that gets the technical matters mostly right was written by Texan Kevin Williamson and was posted on the National Review website at

Monday, February 15, 2021

Major Embarrassment for Microsoft: No Majorana Particle After All


In 1937, the Italian physicist Ettore Majorana published a paper predicting the existence of something that came to be known as the Majorana particle.  In the society of subatomic particles, the Majorana is rather standoffish:  without a positive or negative charge, without an antiparticle (technically, it's its own antiparticle) and without even a magnetic or electric dipole moment.  Even the famously neutral neutron has a magnetic dipole moment.  A few months after writing the paper, Majorana sent an enigmatic note to a colleague saying he was sorry for what he was about to do, got on a boat bound from Palermo to Naples, and was never seen again. 


Of course, the physics community started looking for Majorana particles right away, and the search intensified after people began trying to make quantum computers.  Theoretically, a quantum computer can perform certain kinds of calculations many orders of magnitude faster than ordinary bit-based computers, because each "qubit" can hold a combination of states and thus process more information in a given amount of time.  (That explanation probably gives physicists a headache, but it's the closest I can get in the space I have.)


Anyway, it turns out that if engineers could make Majorana particles, their standoffish nature would become a virtue, because the quantum computers people have devised up to now all suffer from a common problem:  insufficient isolation from the environment.  The quantum states needed to do quantum calculations are very delicate, and any little disturbance from magnetic or electric fields, or just the passage of time, busts up the party so much that extensive error correction and multiple processing of the same problem are necessary.  Theorists say that a quantum computer using Majorana particles would be much less prone to such errors because the particles are so inert, relatively speaking. 


So the quantum-computing world was quite impressed back in 2018 when researchers funded by Microsoft announced that they'd finally made a Majorana particle.  The alleged particle wasn't "fundamental" in the sense that it was a single entity.  Rather, they said it was a kind of collective phenomenon created by electron interactions in a cold semiconductor. 


There's nothing fishy about that.  Even my EE undergrads learn about positively charged "particles" called holes, which turn out to be a collective effect of electrons in a semiconductor.  But in January of 2021, the same research group published a new paper saying basically, "Oops, we screwed up."  Some critical data tending to falsify the result was omitted from the 2018 paper, which they are withdrawing. 


The new paper came about when Sergey Frolov, another physicist, questioned the results of the 2018 paper and obtained their raw data, which included points that were not shown in the 2018 paper. 


Leo Kouwenhoven, the leader of the Microsoft research team, released the new paper before peer review along with a note that retracted their earlier paper.  He refused to comment further on the new paper because of peer review, but it's fairly clear what has happened, as described by a recent report in Wired. 


Under pressure to deliver results, the Microsoft team omitted a part of their data, allegedly for "esthetic" reasons, and published the 2018 claim to have discovered a Majorana particle.  In retrospect, omitting the esthetically displeasing data was not a good idea.  But they did the right thing in providing Frolov with unpublished as well as published data, and in issuing a new paper showing that they were basically incorrect in their 2018 interpretation of the same data.


Physics is hard enough even when the only motivation is intellectual curiosity.  When the auxiliary pressures of continued funding, fame, fortune, or tenure get into the mix, it's tempting to make claims that later can't withstand intense scrutiny.


In my own peculiar little field of ball lightning research, I see this quite often.  Ball lightning is an atmospheric phenomenon which thousands of people have seen over the centuries.  There is a consistent set of characteristics which leaves little doubt that there is a real thing there which occurs rarely, but not so rarely that people never see it.  However, there is as yet no generally accepted scientific explanation for ball lightning, and no one has ever been able to produce anything in a lab that shows the most common characteristics of ball lightning.  Worse yet, there are no photographs or videos that are generally accepted as showing an actual ball lightning object.


Of course, taking a photo or video or obtaining other kinds of objective data on ball lightning would be a major accomplishment, and many people have claimed to do so over the years.  But subsequent investigations, either by the original researchers or someone else, usually show that there is a simpler explanation than ball lightning for what was photographed, or just leave the question unresolved.


I don't attribute base motives to people who publish exciting-looking data that later turns out to be not so exciting.  There's always the chance it will prove to be the real thing, and one important reason for publishing scientific data and interpretations is to get them out in the open so others can look at them and criticize them if necessary, just as Frolov did with the Microsoft data.  And while it's embarrassing and can lead to adverse career consequences, admitting that you made a mistake is a part of being an adult, and Kouwenhoven and his group have done the right thing by publishing the later paper and retracting the 2018 one.


Some people would look at this situation and say there's something wrong with the way physics works, but I disagree.  As my wife says in a different context, "More communication is better than less communication."  Let anybody who even thinks they have something worth publishing go ahead and publish it, and let the reviewers and critics have at it as hard as they like, without being mean, of course.  That's the way progress happens.  


Allegedly, a person matching Majorana's description was seen in the late 1950s in Valencia, Venezuela.  For several years leading up to his disappearance, Majorana had become increasingly isolated, almost like his eponymous particle.  And while he may have lived through World War II and after in professional silence in Venezuela, he may have ended his life in the waters off the Italian coast on March 25, 1938.  Some things we just can't know for sure yet, and that goes for physics too.


Sources:  The report by Tom Simonite "Microsoft's Big Win in Quantum Computing Was an 'Error' After All," appeared on Feb. 12, 2021 at  I also referred to Wikipedia articles on Ettore Majorana and the Majorana particle. 

Monday, February 08, 2021

Can Democracy in America Survive Big Tech?


Two articles I came across recently raise the question in the headline of today's column.  One is by a journalist named Allum Bokhari, who gave a speech last November at Hillsdale College, one of the very small number of U. S. colleges that do not accept Federal grants, loans, or other funding.  The other is by Robert D. Kaplan, a geopolitics specialist at the Foreign Policy Research Institute.  Both gentlemen are deeply concerned that social media, as they now work, constitute an existential threat to American small-d democratic government.


Kaplan is concerned that social media may create conditions in which the "fragile, perhaps even ephemeral" experiment called American democracy cannot survive.  His studies of nation-states range widely over time and geography.  The old USSR, he points out, was not defeated from without by nuclear or conventional warfare.  Rather, it was destroyed by internal weaknesses and a crisis of purpose that led to its disintegration.  Regarding the present rivalry between the U. S. and China, he sees social media playing radically different roles in the two countries.


In China, the authoritarian government ensures that everything on social media reinforces the "blood-and-soil nationalism" of the dominant Han cultural matrix.  Traces of dissent are ruthlessly stamped out, and ethnic minorities such as Tibetans and Uighurs are suppressed and even locked up in concentration camps.  There is basically one political story available in China, and social media reinforce it.


In the U. S., on the other hand, Big Tech effectively control social media, and recent events emphasize the subtle but increasingly effective control they exert.  The dominant vision embraced by those who inhabit the upper reaches of corporate and cultural America is a transnational one which, when it looks at American history at all, sees a story of exploitation and shame, exemplified by the New York Times's "1619 Project" that attempted to show that the founders based America on slavery, not on anything noble.  Even worse, the economics of social media have come to embrace the divide-and-conquer principle that feeding different kinds of people what they most want to hear means cutting up the citizenry into "racial, gender, political, or sexual" identity groups that are often pitted against each other, to the great loss of the basic unity that any nation needs to survive.


Allum Bokhari brings his experience with Breitbart News to the table.  While I am no fan of Breitbart News, the old principle of free speech (much abused lately) says that every voice deserves to be heard, if not believed.  And he brings some indisputable facts to the table that are worth considering.


Unlike the early days of the Internet when no single social-media platform was dominant and everybody had more or less equal access to everybody else's website, today's Internet is a creature of the Google-Facebook-Amazon complex of corporate control.  And control is the right word.  The velvet glove of free apps and fun-looking websites conceals an iron hand of manipulation that is so subtle and complex, powered by advanced AI software, that the vast majority of users have little or no idea that they are being manipulated.  But they are.  


Cadres of software engineers spend countless hours devising complex algorithms to change behavior, not only to the benefit of advertisers on Big Tech's media, but for other reasons as well.  One quote that Bokhari reports from a source he interviewed at Facebook says it all:  “We have thousands of people on the platform who have gone from far right to center in the past year, so we can build a model from those people and try to make everyone else on the right follow the same path.”  If this isn't manipulation, I don't know what is.


In recent months, the manipulation and control have come above ground for everyone to see.  Bokhari cites the actions of Facebook, Twitter, and other Big Tech firms in de-platforming President Trump, and of Amazon and Apple in kicking the upstart social-media platform Parler off their equipment (or in the case of Apple, off the privately owned phones of millions of users).  One can argue about the motivations for such actions.  But the bare fact of the actions remains:  privately owned companies, largely unhindered and in fact protected by government regulation from lawsuits that private individuals can be subject to (that is what Section 230 of the Communications Decency Act does), unilaterally censored an entire social-media network regardless of who or what was on it, and also censored the sitting President of the United States.


For those who can remember the old days of only three television networks, the only analogous action I can imagine would be if the President decided to make a speech one day, and in the middle of his words spoken to the "pool" camera that all three networks were taking their video feed from, executives decided to pull the switch and return to their regular programming of the Beverly Hillbillies or whatever.  Nothing like that ever happened, but if it had, the roars of outrage from common citizens of every political viewpoint would have been deafening. 


Today, roars—or anything else—can't be heard unless Big Tech approves of the roar.  The dominant progressive political views of the transnational cultural elite who are in charge are squeezing out the wide spectrum of views that, no matter how annoying some of the extremes are, turn out to be vital to the survival of democracy.  To those who deplore disagreement and debate, I would say this:  disagreement and debate are features of democracy, not bugs.  Cut them off and you are left with a softer form of what China has:  a homogenized, uniform, expert-driven technocracy that maintains the form of democracy, perhaps, but denies its power.  If this nation, which has endured for 245 years, is to preserve government "of the people, by the people, for the people," the malignant effects of social media and corporate control must be dealt with.  And soon, before it is too late.


Sources:  Allum Bokhari's post, based on a modified version of his Nov. 8, 2020 speech at Hillsdale College, is available at  Robert D. Kaplan's article "How We Lose Against China" appeared in the Feb. 8, 2021 issue of National Review on pp. 27-29.

Monday, February 01, 2021

Six Die from Liquid Nitrogen Accident at Georgia Poultry Plant


Last Thursday, January 28, emergency responders in Gainesville, Georgia began to receive 911 calls from Plant 4 of the Foundation Foods Prepared Food Division in that city.  One caller reported, "I've got two people not breathing."  Another said he had gotten a call from another part of the plant saying that someone could be "frozen from liquid nitrogen."  Firemen recovered five bodies from the facility and took ten people to hospitals, where one later died from injuries received in the accident.


Liquid nitrogen is manufactured by the ton for food processing applications such as flash freezing.  At a news conference following the accident, the U. S. Chemical Safety Board Chairman and CEO Katherine Lemos said that the factory receives two or three tank trucks of liquid nitrogen (LN2) every day.  At a temperature of -320 F, liquid nitrogen freezes nearly everything it touches, as long as you have enough of it.  For one summer in my somewhat misspent youth, I worked at a facility that made the frozen cornbread sticks known in the South as hushpuppies.  After being squeezed out of a gizmo that worked like a gang of toothpaste tubes, the sticks were cut up and went through a stainless-steel tunnel about three feet on a side and ten feet long.  By the time they came out they were hard enough to use as a murder weapon, though that never occurred to me at the time. 


Something along the same lines was probably in use at Foundation Foods, only scaled up to handle tons of chicken parts.  With proper ventilation and control of the release rate, liquid-nitrogen flash freezing can be a safe and efficient way of making frozen foods.  But obviously, too much LN2 at once in one place can create major hazards.  If the skin contacts LN2 for more than a second or so, the tissues freeze, creating serious burn-like injuries, and if such injuries are extensive they can result in death.  And even people who are not in direct contact with the super-cold liquid can suffocate when it boils and expands roughly 700-fold as a gas, displacing the oxygen in ambient air.


A combination of these hazards ended the lives of six workers at the Foundation Foods plant and injured nine more in one of the most serious cryogenic accidents in recent memory.  The CSB news conference revealed that the freezing system involved had been installed in the last four to six weeks, and tools were found in the vicinity of the system, indicating there may have been a maintenance problem.  Within a short time after the accident occurred, the main shutoff valves from the LN2 tanks outside the building were turned off, limiting the damage.  But by that time a number of serious injuries had been sustained.


Working at a meat-processing plant is no picnic.  The work environment is dictated by microbiologists whose paramount concern is to avoid contaminating the product with bacteria or other pathogens.  Employees must follow strict protocols such as stepping in sanitizing baths before entering certain spaces, wearing clothing to prevent contamination, and enduring long hours of repetitive and strenuous work in windowless, noisy, and often cramped spaces.  Add to these trials the new complications imposed by COVID-19 restrictions on employee spacing and contact, and it is no wonder that meat-processing jobs are not more popular.  But the work is steady and for those who can take it, it is often one of the very few options in rural parts of the country.


Moving from facts to speculation, it's pretty clear that a whole lot of LN2 got loose at once where it wasn't supposed to be.  A properly designed plant would have had sufficient ventilation to keep ahead of whatever LN2 is released under normal operating circumstances, but could have been overwhelmed if a supply pipe cracked under the stress of extreme cold or a joint came loose. 


In the hushpuppy plant I worked in back in the 1970s, the largest pipe going to the LN2 tunnel might have been an inch in diameter.  If that pipe had broken, there were numerous ways to escape, but all the same, I wouldn't have wanted to try to get out of there in a hurry.  While I didn't particularly enjoy that job, it never occurred to me to worry about the possibility of death by LN2, although the supervisors warned us to be careful when opening up the tunnel to clear a jam, as there might be puddles of LN2 lying around and we shouldn't touch it.


The conditions that some of our lowest-paid workers endure in the U. S. would surprise many people.  As news reports noted, we never get to see news footage of conditions inside meat-processing plants.  We buy inexpensive frozen chicken at my house, but I rarely give a thought to the conditions under which it was packed, despite having worked in the industry briefly and taken a tour of an Idaho hamburger plant in connection with some technical work.  The tour convinced me never to accept another such invitation again. 


The Bureau of Labor Statistics says that the average hourly wage of animal slaughtering and processing workers in the U. S. is $13.76.  If the Biden administration succeeds in passing a national $15-an-hour minimum-wage law, that will significantly alter the economics of frozen chicken manufacturing, perhaps pushing it outside the country altogether.  Would that be a good thing?  Would you rather have a lousy uncomfortable dangerous job, or no job at all?  Six of the employees of Foundation Foods are now beyond such concerns, but for the sake of the others in such plants across the country, I hope both the wages and the working conditions improve after we learn what went wrong in Gainesville.


Sources:  I used material from Atlanta's 11Alive TV station at, which includes 911 calls and the CSB news conference.  I thank my wife Pam for pointing out this story to me.