Twenty years ago this week, a late-night experiment at an obscure nuclear power plant in the former Soviet Union turned into the worst nuclear accident in history. During the early morning hours of April 26, 1986, operators at the graphite-core plant in Chernobyl, some eighty miles north of the Ukrainian capital of Kiev, violated numerous regulations and disabled safety mechanisms during an ill-considered reactor test. The reactor blew apart and the graphite (carbon) core caught fire like a giant nuclear barbecue pit, sending radioactive smoke into the atmosphere. The accident was compounded by the criminally slow response of the Soviet government, which first attempted to cover up the incident. When Scandinavian nations detected abnormal levels of airborne radioactivity and started asking questions, the USSR reluctantly admitted there was a problem, but not before thousands of people living near the plant had been exposed to dangerous levels of radioactivity.
An Associated Press story by Mara D. Bellaby published this week recounts estimates of the total number of fatalities and illnesses caused by the accident. Not as many people died from Chernobyl as was originally feared. Eventually the government got around to evacuating some 116,000 people who lived within twenty miles of the plant. Official reports recently released by United Nations agencies say that only 50 people have died so far as a direct result of radiation poisoning traceable to the accident. Surprisingly, this figure includes even those who fought the fire in the first hours of the accident and were exposed to the most intense levels of radiation. The most significant problem in the general public has turned out to be a sharp increase in thyroid cancer among young people. Since radioactive iodine is taken up preferentially by the thyroid in children and adolescents, this increase was expected. Careful screening for early signs of thyroid cancer and prompt treatment have cured nearly all of those who contracted the disease, according to the reports. So if the world's worst nuclear accident caused only 50 deaths, why is it that no new nuclear power plants have been ordered in the United States since 1978?
The last nuclear plant to be completed in this country was finished in 1996. The nearly twenty-year span between these two dates alone gives you some idea as to why utilities are reluctant to order nuclear plants. For a variety of reasons, many of them good, the nuclear power industry in the U. S. is hedged with an incredible number of regulations, permit processes, and controls from overlapping Federal, state, and local jurisdictions. Our own worst nuclear-plant disaster, Three Mile Island, happened in Pennsylvania in 1979, and compares to Chernobyl as a fender-bender compares to a bus full of children tumbling down a mountain. Nevertheless, it was serious enough to create political turmoil that effectively shut down the nuclear power construction industry in this country. There are still U. S. companies that make nuclear plants; they just don't sell them here.
As a consequence, the increased demand for electricity in the U. S. has been met since the 1980s largely by more coal-fired plants, with a small but significant amount contributed by renewable sources such as wind power. There are many good reasons to oppose nuclear power: the problem of what to do with the highly hazardous wastes created by plant operation, the danger of nuclear proliferation to unstable countries, and the "yuck factor" that some people will always feel about a technology associated with nuclear weapons. But assuming that the nation's use of electric energy is not going to decrease in absolute terms any time soon, the power has to come from somewhere, and in the last few years that somewhere has mostly been coal. And opponents of greenhouse-gas emissions, many of whom also oppose nuclear power, know (or should know) that you can't burn coal without making carbon dioxide, which is the greenhouse gas of most concern. Nuclear power, whatever its other drawbacks, produces virtually no greenhouse gases, which is one reason that even "greens" have been giving it a second glance lately.
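The chemistry behind that last point admits no loopholes. A back-of-the-envelope sketch (using only approximate molar masses, not data about any particular plant) shows the scale of the problem:

```python
# Back-of-the-envelope stoichiometry: every atom of carbon burned ends up in CO2.
# Molar masses are approximate; no plant-specific data are used here.
molar_mass_c = 12.0    # grams per mole of carbon
molar_mass_co2 = 44.0  # grams per mole of carbon dioxide

co2_per_carbon = molar_mass_co2 / molar_mass_c
print(f"about {co2_per_carbon:.1f} tonnes of CO2 per tonne of carbon burned")
# Roughly 3.7 tonnes of CO2 for every tonne of carbon; a large coal-fired plant
# burns thousands of tonnes of coal per day.
```

However efficient the plant, that ratio is fixed by the chemistry of combustion.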
Some countries such as France never abandoned nuclear power. France's example shows that given a moderate, stable regulatory environment and good engineering, nuclear power can be a safe and reliable source of electricity, leaving aside the question of wastes. Still, it is not at all clear that the nuclear industry will ever be able to build substantial numbers of new plants in the U. S. The new free-enterprise model of partially deregulated utilities makes it even more risky to plan a long-term capital investment such as a nuclear plant, which sucks in millions of dollars for years before even starting to produce revenue. So if we can't build new nuclear plants, and we don't want to contribute to global warming by building new coal-, oil-, or natural-gas-fired plants, where will the energy come from?
Radical conservation combined with renewable and distributed energy generation is one possible answer. Here and there, enterprising architects have built houses and even commercial buildings whose net use of externally supplied energy in the form of electricity or natural gas is only a small fraction of what typical construction uses. The drawback, of course, is that it takes expensive custom engineering and materials to achieve these radical savings, and in the current economic environment there is no incentive to do these things. Perhaps some radical economic experimentation is in order here. If large tax breaks or even subsidies were provided for building structures whose energy usage was, say, 50% or less of the average level, the expense could be regarded as a kind of loan, since in an economy where energy is costly, the country as a whole benefits from every kilowatt-hour that is never used. A whole raft of vested interests would first have to be placated, but that is what politics is for.
As the aftermath of Chernobyl shows, our worst fears sometimes turn out to be overblown. But before we in the U. S. go nuclear in a big way, we have time to consider other options.
Sources: An article by Mara Bellaby similar to the one carried in the Austin American-Statesman is at http://www.newsobserver.com/104/story/431637.html.
Monday, April 24, 2006
Wednesday, April 19, 2006
Patent or Blackmail?
Here is a list of some of the great human achievements of the past five hundred years: the Scientific Revolution, the Industrial Revolution, the patent system . . . . What's that last one doing there? Historians of technology rightly regard the development of patent law as one of the most significant intellectual innovations of the early modern period. Beginning in Renaissance Europe and spreading to America, the idea that an inventor's rights to make and sell his invention should be protected by law for a limited period encouraged innovation while protecting the general public from monopolies of indefinite lifetime. Engineers, whose ideas form the basis of many patents, should be interested to know that the present U. S. patent system is being gamed in a major way, to the detriment of nearly all concerned.
The most recent example of this concerns the firm Research in Motion, which makes the popular Blackberry wireless communication system. It used to be the case that patents were fairly difficult to obtain. The inventor's patent attorneys were pretty evenly matched by the U. S. government's patent examiners, whose job it was to make sure that trivial, obvious, or otherwise meritless patents were not issued. Patenting an idea was a serious and sometimes difficult undertaking, but when you got one, you knew you had something, and so did everyone else.
Not so anymore. A combination of factors (inadequate Patent Office funding, a hyper-pro-business attitude in government, and speedups in the pace of innovation) has made it much easier to get a patent in the last ten to twenty years. This includes dubious ones sometimes called "submarine patents": not patents on submarines, but patents deliberately designed to cover all parts of an emerging field, whether or not the supposed inventor has any genuinely innovative ideas. In the past, these types of patents would never have been issued, but in the current almost-anything-goes atmosphere, all it takes is enough money paid to a good patent firm.
What happened to Research in Motion this year shows what kind of harm can result from this over-liberalized issuing of patents. In the early 1990s, one Thomas Campana patented some ideas for wireless email. In the meantime, Research in Motion put in a lot of work to develop the Blackberry and obtained its own patents. In 2001, a company named NTP, formed to exploit Campana's patents, sued RIM for patent infringement. The resulting legal battle threatened to produce an injunction that would shut down all Blackberry service in the U. S., clearly an outcome that would benefit no one. This was despite the fact that the U. S. Patent and Trademark Office re-examined and rejected at least seven of NTP's patents along the way. In March of this year, RIM announced a settlement in which NTP would receive over $600 million. No doubt RIM views this as part of the cost of staying in business. But if the shady NTP patents had never been issued in the first place, none of this would have happened.
What has this got to do with engineering ethics? A lot. First, engineers can refrain from participating in the generation of "junk" patents. Unfortunately, this may not have much of an effect, since unscrupulous patent lawyers don't need much in the way of technical help to cobble together useless patents. This is not to say that patenting is unethical in general. Properly used in a well-conducted system, patents help to achieve the balance between monopolistic profit, innovation, and reasonably priced new products and services that characterizes modern industrial societies. But the pendulum has swung way too far in favor of patent owners and patent attorneys, to the detriment of the general public and of those who actually do the hard work of developing and marketing new products, only to have their resources diverted into pointless patent battles. Under the present circumstances, the danger is that innovation will be stifled by artificially extended patents that allow established firms to exclude competition indefinitely. This is already happening in the pharmaceutical industry as some firms patent repackaged versions of old drugs to keep cheaper generic forms off the market. Who pays for this? The beleaguered patient who has to pay beaucoup bucks for the name-brand drug longer than necessary.
The second thing engineers can do is to make a political issue out of the patent system. True, it doesn't have the popular appeal of antiwar movements or tax reform. But it is critically important to fix a badly broken system before the R&D departments of multinational firms decide to relocate to countries where the system is more rational. Ever since the U. S. patent system was founded in 1790, it has differed in significant ways from most European systems. One of the most important differences is that most European patent holders must show that they are either licensing their patents to others or using them themselves, while there is no such requirement in the U. S. This allows U. S. patent holders to "sit on" submarine patents that lie dormant until a well-heeled company comes within the sights of the patent-holder's legal gun. Besides changes in the legal structure of patents, the U. S. Patent and Trademark Office simply needs a lot more good help, in the form of funding and staff, to stay competitive with the best private patent lawyers. Only then will it be able to reinstate the rigorous examination of patents that prevailed before the recent gold-rush atmosphere developed.
With their specialized training, engineers stand in a unique position to make an important political difference in this situation. Consider writing your U. S. senator or congressman about this matter, and see what happens. The worst that can happen is nothing, and the best could be a lot better than that.
Sources: The New York Times article "In Silicon Valley, A Man Without a Patent" by John Markoff was published online on Apr. 16, 2006, and is available from the NYT archives at http://select.nytimes.com/gst/abstract.html?res=F20811FA3D5B0C758DDDAD0894DE404482 for a fee. The Forbes.com article "More Patents Rejected in BlackBerry Case" by Arik Hesseldahl is at http://www.forbes.com/business/2005/06/22/rim-patent-infringement-cx_ah_0622rim.html.
Thursday, April 13, 2006
Earthquake Prediction: Ready for Prime Time?
Earthquakes and the tsunamis that sometimes accompany them are among the most frightening and deadly of natural disasters. The December 26, 2004 earthquake and tsunami that struck in and around the Indian Ocean killed more than 200,000 people, and millions more have died in similar disasters over the centuries. One of the main ways people die in an earthquake is in collapsing buildings, and over the years civil engineers have developed building codes and other techniques that reduce (but do not eliminate) the danger of structural collapse during an earthquake. Unfortunately for the billions of people who live in developing countries, these measures are expensive. If the choice is between living in shaky but affordable housing on the one hand, and going without shelter on the other, most people take their chances with a house that may fall down in an earthquake. The poor of this world have more pressing things to worry about than earthquake safety, but that doesn't make their lives any less valuable.
Viewed as an engineering problem, the question of how to save lives in earthquakes and tsunamis has several possible solutions. The only one we have pursued to any great extent up to now is to make sure that structures will withstand the likely force of an earthquake. (As far as tsunamis go, there is little one can do except run for higher ground.) If—and this is a big "if"—earthquakes could be predicted with good accuracy, the problem becomes simpler. A few hours before an earthquake strikes, simply clear everyone out of dangerous buildings until the danger is past. This second solution is not without its own problems, but if it could be implemented, the cost of an early-warning system would be much less than earthquake-proof buildings for everybody, and the potential to save lives would therefore be much greater. The only problem is, how do you predict earthquakes?
Historically, earthquake prediction has been regarded as a pseudo-science. The abundance of post-earthquake "premonition" stories (animals acting strangely, unusual sounds, lights in the sky) forms a body of data that few scientists take seriously, and with some justification. Human beings are not emotionless recording machines, and memory is a highly subjective thing. Perfectly ordinary and random incidents that happen just before a frightening event take on an ominous cast when recalled later. But the shady neighborhood that earthquake prediction has resided in up to now should not prevent scientists and engineers from exploring ideas about how to do it.
The December 2005 issue of IEEE Spectrum, a highly regarded magazine for professional electrical and electronic engineers, carried an article on recent efforts to develop technical means of predicting earthquakes. (The article can be found at http://www.spectrum.ieee.org/dec05/2367.) The lead author, Tom Bleier, described how ELF waves (extremely-low-frequency electromagnetic waves) and other measures such as satellite-sensed electromagnetic waves and surface temperatures have appeared at times to be correlated with certain large earthquakes. He made what sounded to this author like a good case that such correlations are real. However, a good physical explanation for why they should occur is presently lacking.
The article inspired three geophysicists to write a letter to the editors of IEEE Spectrum protesting the publication of claims that, in their view, should have been rejected (the letter can be viewed at http://www.spectrum.ieee.org/apr06/3275). Robert J. Geller, Alex I. Braginski, and Wallace H. Campbell argued that there is no scientific basis for the kind of earthquake prediction that Bleier and his colleagues are doing. They claim there is so much noise from other natural and man-made sources at the frequencies in question that any exercise in earthquake prediction amounts to sophisticated tea-leaf reading. In their opinion, the scientific community has examined the methods of Bleier and company and found them wanting.
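To get a feel for the skeptics' point, consider a small numerical sketch. The amplitudes below are invented purely for illustration, not measurements from anyone's instruments: a weak low-frequency "precursor" tone buried in a much stronger background ends up tens of decibels below the noise.

```python
# Illustrative only: a weak "precursor" tone buried in much stronger background
# noise. All amplitudes are invented for the sake of the example.
import math
import random

random.seed(0)
fs = 100.0           # samples per second
f_precursor = 5.0    # Hz, a pretend low-frequency signal
amp_signal = 0.1     # weak precursor amplitude (arbitrary units)
amp_noise = 3.0      # strong background: lightning, power lines, machinery...

record = [
    amp_signal * math.sin(2 * math.pi * f_precursor * n / fs)
    + random.gauss(0.0, amp_noise)
    for n in range(10_000)
]

signal_power = amp_signal ** 2 / 2                       # average power of a sine
total_power = sum(x * x for x in record) / len(record)   # mostly noise
snr_db = 10 * math.log10(signal_power / (total_power - signal_power))
print(f"estimated signal-to-noise ratio: {snr_db:.1f} dB")  # tens of dB below the noise
```

Whether clever processing and long observation times can dig a real signature out of a background like that is precisely what the two sides disagree about.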
This controversy reminds me of the early days of tornado prediction. From the late 19th century until 1938, forecasters at the U. S. Weather Bureau were forbidden even to use the word "tornado" in a forecast. The prevailing opinion was that there was no reliable way to predict tornadoes and that such a forecast was likely only to cause needless panic. It wasn't until 1948, when some U. S. Air Force weathermen at Tinker Air Force Base in Oklahoma had their airfield trashed by a tornado, that anyone began to apply serious scientific effort to the problem of tornado forecasting. They came up with a combination of conditions that looked like it would work. Five days later, they noted that the same conditions prevailed, and, not being under the restrictions of the civilian Weather Bureau, took it upon themselves to issue a tornado forecast to Air Force personnel. Later that same evening, probably the only tornado in history to be greeted with jubilation struck Tinker Air Force Base again! The weathermen published their findings in 1950 and 1951, but for several years afterwards tornado forecasts were restricted to military facilities unless they were leaked to the media. Other researchers attempting to publish papers on tornado forecasting were blocked by skeptical reviewers. It took the better part of a decade to overcome the attitude that forecasting tornadoes was too chancy to be worth upsetting the public. But in combination with the radar-based early warning systems put in place in the 1950s, forecasting drove annual tornado fatalities in the Midwest down sharply. (The story of tornado prediction is told in Marlene Bradford's Scanning the Skies: A History of Tornado Forecasting.)
Time will tell whether the new techniques of earthquake forecasting will bear fruit in the form of reliable, specific predictions. In the meantime, their proponents should prepare themselves for a long battle with skeptics. We can hope that if there is anything to it, engineers, scientists, and the public will be open-minded enough to take the practice seriously and use it to save lives in the future.
Sources: See URLs above referring to items in IEEE Spectrum. Marlene Bradford's Scanning the Skies: A History of Tornado Forecasting was published in 2001 by the University of Oklahoma Press, Norman.
Friday, April 07, 2006
The Engineer and Grandma Millie: The California Energy Crisis Revisited
Engineers like to think that what they do professionally helps people. The payoff in this regard is not as direct as the one a doctor gets from taking in a dying patient and sending that patient home feeling fine. But most engineers, I suspect, would like to believe that the work they do makes a positive difference in the lives of people who use their products and services.
This connection is especially visible in the area of electric utilities. In the aftermath of the numerous hurricanes and storms of 2005, we saw teams of hundreds of linemen coming from all across America to repair the damaged distribution infrastructure. Linemen aren't engineers, true, but I have encountered the same "keep the power coming" attitude in power engineers whose job it is to direct the operations of regional power pools, which maintain a moment-by-moment balance between fluctuating electricity demand and the available supply. Since electricity cannot be stored in large quantities, it must be produced as needed, and keeping abreast of changing demand can be a headache even when no one is playing financial games on top of it.
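As a rough illustration of that balancing act (not any utility's actual dispatch software; the unit names, capacities, and demand figures below are hypothetical), an operator can be thought of as running the cheapest available generators first and working up the cost ladder until demand is met, hour after hour:

```python
# Toy sketch of power-pool dispatch. All unit names, capacities, costs, and
# demand figures are hypothetical; real dispatch is far more complicated.

UNITS = [  # (name, capacity in MW, marginal cost in $/MWh), cheapest first
    ("hydro", 500, 5),
    ("coal", 2000, 25),
    ("gas_peaker", 800, 60),
]

def dispatch(demand_mw):
    """Commit units in order of increasing cost until demand is covered."""
    remaining = demand_mw
    schedule = {}
    for name, capacity, _cost in UNITS:
        output = min(capacity, remaining)
        if output > 0:
            schedule[name] = output
        remaining -= output
    if remaining > 0:
        # Not enough capacity: in a real pool this means imports, voltage
        # reductions, or, in the worst case, rolling blackouts.
        schedule["unserved"] = remaining
    return schedule

# Demand swings through the day; the operator must re-balance continuously.
for hour, demand in [(3, 1200), (14, 3100), (17, 3500)]:
    print(f"{hour:02d}:00  demand {demand} MW ->", dispatch(demand))
```

The job is hard enough when everyone offers their capacity in good faith; it gets much harder when someone stands to profit from scarcity.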
This week, two of the all-time top financial game-players are on trial for lying about their company's profitability. Jeffrey Skilling, former CEO of Enron, and Kenneth Lay, the firm's founder, are being tried in a fraud and conspiracy case that federal prosecutors have framed in simple terms. In 2000 and 2001, some parts of Enron were losing lots of money, and the claim is that Skilling and Lay knew this and lied about it to investors and the general public. Ironically, one part of Enron that was extremely profitable (some would say sinfully so) was the energy-trading division, which the Attorney General of the State of California claims was responsible for many of the rolling blackouts that hit that state in the same years. Skilling and Lay didn't have to lie about that; they just had to live with their consciences.
What happened to the energy market in California in 2000 has been described as the perfect storm of electric-utility deregulation. To make a long and complex story short, increasing demand and partial deregulation led to a situation in which, during several stretches of unusually hot weather or short supply, there was simply not enough electricity available in California. The new tariffs allowed companies like Enron to charge whatever the traffic would bear for energy imports and futures, and as a result rates soared to the stratosphere in only a few months. The Attorney General claims that Enron and other utility interests deliberately took generating facilities offline in order to increase their profits. The fallout in terms of accusations, lawsuits, bankruptcy proceedings, and other effects continues to this day. In the process, state investigators unearthed a set of recorded phone conversations among energy brokers at Enron and other firms.
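Why would taking plants offline be so profitable? A toy spot-market calculation makes the mechanism clear (the offers and demand below are invented for illustration and bear no relation to actual California market data): in a market where the most expensive accepted offer sets the price for everyone, withholding even a modest slice of cheap capacity can multiply the price of every megawatt-hour sold.

```python
# Toy single-hour spot market: offers are accepted cheapest-first, and the
# marginal (most expensive accepted) offer sets the price paid to all sellers.
# All capacities and prices are invented for illustration.

def clearing_price(offers, demand_mw):
    """offers: list of (capacity_mw, price_per_mwh); returns the clearing price."""
    accepted = 0
    for capacity, price in sorted(offers, key=lambda offer: offer[1]):
        accepted += capacity
        if accepted >= demand_mw:
            return price  # the marginal offer sets the price for everyone
    raise RuntimeError("not enough capacity offered to meet demand")

demand = 1800  # MW needed this hour
normal_offers = [(1000, 30), (1000, 40), (500, 250)]   # (MW, $/MWh)
withheld_offers = [(500, 30), (1000, 40), (500, 250)]  # 500 MW of cheap capacity "down for maintenance"

print("normal market:    ", clearing_price(normal_offers, demand), "$/MWh")    # 40
print("capacity withheld:", clearing_price(withheld_offers, demand), "$/MWh")  # 250
```

Whether that is what actually happened is for the courts and regulators to sort out, but the arithmetic shows why the incentive was there.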
These recorded conversations make for depressing listening. One exchange took place as the state legislature was debating whether to cap the spot price of energy on the open market. "So the rumor's true, they're taking all the f---ing money back from you guys? All the money you guys stole from those poor grandmothers in California?"
"Yeah, Grandma Millie, man. She's the one who couldn't figure out how to vote on the butterfly ballot. Now she wants her f---ing money back from the power utilities. . . ." And these are some of the less obscene samples. Many more such recordings can be found at http://ag.ca.gov/antitrust/energy/index.htm.
The engineers who participated, willingly or unwillingly, in the events of the California energy crisis have not received as much attention as the financial traders who clearly profited from the situation. I have seen a few references in the open literature to their activities and the difficult situation they faced. In the tightly coordinated world of electric power-pool operation, individual action is nearly impossible, since the decision to shut down a facility or to make purchases of power here or there is one that only a few individuals can make. Anything other than following orders in a case like this would amount to industrial sabotage, since an uncoordinated attempt to shut down a generator or bring one online would cause serious damage. Whether or not the engineers involved liked what they were doing, and whether or not they knew its implications, they had few options at the time. Like soldiers on a battlefield, their horizon was limited to their immediate surroundings and the technical circumstances they had to deal with at each moment. It is possible that they realized only in retrospect the wider implications of what they did during the crisis.
If any power engineers involved in the California energy crisis care to share their experiences, it would be appreciated. Engineers are generally far from the centers of political and corporate power where rate setting and related issues are decided. But to the extent that such matters interfere with the average engineer's desire to serve the public, not penalize it, there is something wrong structurally with the way electric utilities are set up and administered.
Since 2001, Enron has gone bankrupt, the California economy has cooled off, continued efforts in energy conservation have alleviated summer blackout threats, and additional generating capacity has been added to the nation's power grid. Sometimes a crisis has to happen in order to galvanize politicians and corporations into action, so we might actually be thankful that the California energy crisis happened when it did, and was no more severe than it was. All the same, it would be easy to become complacent in the face of new schemes for shortchanging Grandma Millie in order to profit the powerful, and we should be wary of them in the future.
Sources: The best source of Enron tapes I have found online is on the California Attorney General's website http://ag.ca.gov/antitrust/energy/index.htm.