Monday, September 17, 2018

The Massachusetts Gas Disaster

Being one of the longest-settled regions of the U. S., the Commonwealth of Massachusetts has been home to gas utility companies for close to a century and a half.  Unfortunately, some of the pipes installed in the 19th century are still in use in older parts of the state, notably around Lawrence, Andover, and North Andover in the Merrimack Valley north of Boston.  So earlier this month, Columbia Gas of Massachusetts, the gas utility serving the area, announced that it was going to start replacing some older gas lines. 

On Thursday afternoon, Sept. 13, residents of these towns would have been justified in thinking that they had suddenly been transported into the midst of a bad horror movie.  House after house exploded and caught fire.  The Massachusetts State Police logged over 30 calls reporting fires, and by Saturday authorities counted over 60 suspected gas fires in the area.  When one house exploded, its chimney toppled over onto a car, and one teenager in the car later died of his injuries.  Twenty-five people suffered injuries, some serious.

As soon as the scope of the disaster was known, authorities shut off gas and electric utilities to the areas affected and ordered an evacuation, which lasted in some cases until Saturday.  Citing Columbia Gas’s lack of cooperation, Massachusetts Governor Charlie Baker said he was replacing Columbia Gas with another utility, Eversource, to lead recovery efforts, and declared a state of emergency in the region. 

A thorough understanding of what went wrong in Lawrence will have to wait for the results of investigations by local, state, and Federal officials, including members of a National Transportation Safety Board team that were dispatched to the scene.  Nevertheless, similar incidents have happened before, and their history is better known.

In the early days of gas utilities, gas was typically manufactured by the destructive distillation of coal and stored in large accumulator tanks at close to atmospheric pressure.  Only enough pressure to send it through distribution pipes and gas meters was used, and the delivered pressure was so low it was measured in inches of water rather than pounds per square inch (PSI).  About 7 inches of water was the standard delivery pressure then, and it remains so today, equivalent to about 0.25 PSI.  Such a low pressure reduces the requirements on residential piping and means that even a wide-open pipe will not leak enough gas to form anything more than a moderate flame.  I have seen a gas utility worker at a work site on my street leave a delivery pipe open and unattended for a few minutes, with no apparent concern.
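That conversion is simple arithmetic; here is a minimal sketch in Python, using the standard approximation of about 0.0361 PSI per inch of water column:

```python
# Convert a gauge pressure in inches of water column to PSI.
# 1 inch of water column is approximately 0.0361 PSI at room temperature.
PSI_PER_INCH_WC = 0.0361

def inches_wc_to_psi(inches: float) -> float:
    return inches * PSI_PER_INCH_WC

# The standard residential delivery pressure of about 7 inches of water:
print(round(inches_wc_to_psi(7), 2))  # prints 0.25
```

The same factor run the other way shows why a 50 to 100 PSI transmission pressure is so dangerous at the house end: it is hundreds of times the pressure the residential piping was designed for.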

But by the same token, systems designed for such low pressure behave badly if by some mishap a higher transmission-line pressure reaches them.  For transmission over long distances, pressures in the 50 to 100 PSI range are used to deliver gas to substations, where pressure regulators lower the pressure to the low levels needed at customers’ houses. 

Experts agree that somehow, a pressure greatly in excess of the normal 0.25 PSI was mistakenly applied to the distribution system in Andover, North Andover, and Lawrence.  The exact effects on each home depended on what kind of appliances were connected and operating at the time.  Fortunately, the weather was mild—highs in the 70s, lows in the 60s—so probably few domestic heating systems were in use.  Still, older furnaces and stoves use pilot lights, and one official was quoted as saying that with enough pressure, a pilot light can turn into a torch.  Minor flaws in piping that would withstand 0.25 PSI may give way under higher pressures, resulting in major leaks of gas that can be triggered by an electric spark, a burning candle, or any other source of ignition.  Out of the many hundreds of homes served by Columbia Gas in that region, some 60 or so suffered either major leaks or fires and explosions as a result.

When I lived in Massachusetts for a time after growing up in Texas, I noticed a rather widespread prejudice against gas for domestic heating, as opposed to either electricity or heating oil.  I encountered more than one person who said they would never buy or rent a place with gas, simply because of the danger.  Having grown up in a house heated with gas floor furnaces and space heaters (most of which would be too dangerous to install in new construction today), I found this attitude strange.  But Columbia Gas has not done its industry’s reputation any good by first allowing this accident to happen, and then by failing to take quick, decisive public actions to mitigate the disaster.  Columbia Gas’s parent company NiSource saw its stock price fall 12 percent on the Friday following the disaster, an unusual occurrence for a generally conservative investment such as a fully regulated utility firm. 

Environmentally speaking, the use of natural gas for domestic heating is more efficient than electric heating, even if natural gas is used to power the electric generation plants.  It can be argued that in making so many new natural-gas discoveries over the past decade, the U. S. has done more in recent years to replace higher-carbon-emission oil with natural gas than any other nation, thus decreasing the world’s carbon footprint. 

But these arguments will not console those who have lost loved ones or property in the Massachusetts gas explosions.  I wouldn’t blame them if some of them return to what is left of their homes and rip out all the gas lines completely.  I can also imagine the troops of lawyers who must be descending upon the region to file lawsuits against Columbia Gas and anyone else with deep pockets who might have been involved.  There is justice in compensating victims for their losses to the extent possible.  A trust has been betrayed when a utility’s normally benign facilities suddenly turn into engines of fire and destruction.  It will be interesting to learn what combination of mechanical failure, lack of understanding, and management errors led to this tragedy.  But even if we learn everything there is to know, that won’t fix the death and damage that resulted from it. 

Sources:  I referred to articles carried by CNN on Sept. 14 at, the Boston Globe at, and Fortune (a Bloomberg News story) at, as well as Wikipedia articles on natural gas and Lawrence, Massachusetts. 

Monday, September 10, 2018

Boyan Slat’s Pacific Trash Collector: Noble or Quixotic?

Earlier this month, a 2000-foot-long (600-meter-long) floating boom set out from California to head for the Great Pacific Garbage Patch.  The instigator of this ambitious project, a Dutchman named Boyan Slat, hopes that the boom will demonstrate its ability to clean up the garbage floating in the top layer of the ocean.  If it does, he wants to use more of the $35 million he’s raised so far to build more booms and make a sizable dent in the garbage patch, which is reportedly twice the size of Texas.

According to a Sept. 8 Associated Press story, Slat got inspired to clean up the ocean when he went scuba diving in the Mediterranean Sea when he was sixteen, and saw more plastic junk than fish.  Now 24, he heads the nonprofit organization called simply The Ocean Cleanup, which he has single-mindedly guided to create the boom that is now undergoing its initial tests.  Wikipedia’s article on him notes that he first presented his garbage-collecting boom idea in a TEDx talk in 2012, and raised $2 million for it shortly thereafter with crowdfunding.  Only six years later, he has realized his initial dream and is hoping that storms won’t reduce his garbage collector to pieces that will themselves become floating garbage, although this is a small but real possibility.

Slat is a product of his times, and while we must salute his drive and ingenuity, he depends vitally on the good will and priorities of the thousands of people, wealthy and otherwise, who have supported his enterprise. 

The Ocean Cleanup represents something fairly new in engineering organizations:  an explicitly non-profit entity whose goal is to do something that indirectly benefits the entire world but directly benefits no one in particular.  Smaller-scale organizations such as Engineers Without Borders also try to do good rather than simply support outfits that make money, but EWB tends to take on small-scale specific projects, not mega-ambitious things such as a fleet of booms to clean up the Pacific Ocean. 

Nevertheless, Slat is doing engineering, and it remains to be seen whether the project will succeed on its own terms.  The task Slat is undertaking is not easy.

The phrase “garbage patch” conjures up a picture of a solid layer of large pieces of floating plastic trash so thick you could almost walk on it, like you might see in a puddle at a trash dump.  But even the worst of the Great Pacific Garbage Patch is nowhere near that dense.  Some estimates say that at the highest concentration, there are only about 4 particles per cubic meter, and most particles are on the order of a few millimeters across, which makes the area difficult to assess through satellite imagery.  You have to go out there with a sieve and drag the surface to find it; even visual observations from a boat will miss most of it. 

There were no details in the AP article about the size of the screen that Slat’s boom uses, but obviously there are compromises involved.  A screen fine enough to catch 5-mm pieces of plastic will also bother fish, so Slat is sending marine biologists along with the boom to monitor any harm to wildlife. 

Another question is, how many of these booms will have to be deployed to make a dent in the problem?  The Wikipedia article on the Great Pacific Garbage Patch cites an estimate of 80,000 metric tons of trash in the area.  George Leonard, a spokesman for the Ocean Conservancy, said that while he hoped Slat’s effort would succeed, some 9 million tons of plastic waste go into the oceans each year.  No source was given for that statistic.

Even if we ignore the question of whether metric or English tons are meant, the ratio of 9 million to 80,000 is a factor of more than 100.  Assuming the 9-million-ton figure is accurate, even if Slat gets the garbage patch completely cleaned up, that will represent only about 1% of the plastic entering the ocean each year, which is (pardon the expression) a drop in the ocean.  And if the 9-million-ton figure is in error, somebody needs to correct it.
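The back-of-the-envelope comparison above can be checked directly; a quick sketch, taking both published figures at face value:

```python
# Estimated trash in the Great Pacific Garbage Patch vs. the estimated
# annual flow of plastic waste into the oceans.
patch_tons = 80_000            # Wikipedia's estimate for the patch itself
annual_input_tons = 9_000_000  # the (unsourced) annual-input figure

ratio = annual_input_tons / patch_tons   # how many "patches" enter per year
share = patch_tons / annual_input_tons   # the patch as a fraction of one year's input

print(round(ratio))    # prints 112, i.e. a factor of more than 100
print(f"{share:.1%}")  # prints 0.9%, i.e. roughly 1%
```

So even a complete cleanup of the patch would remove less than one percent of a single year's estimated input, which is the point made in the paragraph above.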

Now, cleaning up the environment is a noble goal, and I hope Slat’s boom collects all the garbage that his heart desires.  Of course, once the garbage is brought to land we’ll face the problem of what to do with it, but at least it will be out of the ocean.  Still, I can’t help wondering how a 24-year-old, even a very determined one, can raise $35 million for a giant project that will not directly improve the life of any single individual on earth, while obvious human needs such as pure water and sanitation go unmet in many places around the globe, resulting in thousands of deaths every year. 

The answer, I think, is the distortion of priorities that has occurred in the global culture, a distortion that can be traced to a vacuum where knowledge ought to be.  Most traditional cultures presented people with an integrated vision of what the world is about, and what one’s place in the world was.  Leaving aside any question of which vision is actually true, a person growing up in such a culture usually conformed to the culture’s vision, and if that vision was benign, things went well with that culture, generally speaking.

But in today’s atomistic, fragmented, individualistic culture dominated by global media that present a highly selective view of the world, projects such as Slat’s can attract attention and support from people who may not know or care that their next-door neighbor could use help a lot better than the fish in the Pacific Ocean.  There is no united vision of the world and its purposes, so anybody who comes along with a good-sounding solution to one of the problems highlighted by the media can gain the kind of support that Slat has with his Ocean Cleanup project.

I wish Boyan Slat’s project the best, but I hope he learns from failure as well as success, and redirects his future efforts toward something that will help others a little more directly. 

Sources:  The Chicago Tribune website carried the Associated Press article by Olga Rodriguez “Two Texases' worth of plastic is floating in the Pacific. A new device will spend 20 years trying to clean it” on Sept. 8, 2018 at  I also referred to the Wikipedia articles on the Great Pacific Garbage Patch and on Boyan Slat. 

Monday, September 03, 2018

Transcranial Direct-Current Stimulation and the Point of Sports

Full disclosure:  on a scale of sports-fan tendencies, I am off the scale in the negative direction.  But my almost complete lack of intrinsic interest in sports makes me a somewhat dispassionate observer, I hope, of a phenomenon that engages the attention of billions around the globe, and has its ethical aspects as well.  When technology gets in the mix, you have engineering ethics concerns.  And so that’s why we’re looking today at something called transcranial direct-current stimulation (tDCS) and its increasing use by both professional and amateur athletes.

The technique of tDCS consists of connecting two or more electrodes to your skull and sending a small DC current of a few milliamps through the electrodes, where some of it finds its way to your brain.  Depending on which part of the brain is stimulated, the effects can range from nothing to the triggering of an epileptic fit in susceptible individuals.  Most of the time, though, the effects are subtle and have to be documented through elaborate studies.

According to a report in IEEE Spectrum that appeared in 2016, two young tDCS researchers named Daniel Chao and Brett Wingeier decided to take what they learned from working at a brain-implant company that sold anti-epileptic devices, and turn it into some kind of profitable business.  They experimented with a non-invasive tDCS setup instead of an invasive implant, and found that the area of the brain that seemed to respond most positively to tDCS was the motor cortex, which controls voluntary movements.  They founded a company called Halo Neuroscience, and for the last year or two the firm has been selling a product that looks like an odd kind of headphone with foam spikes pointing inward around the headband.  The spikes are the electrodes, and the Spectrum reporter who tried an early prototype found that using the device enhanced her performance on a simple exercise:  biceps curls on a weight machine.

The article also raised the question of whether tDCS would be viewed unfavorably by sports-regulating commissions such as the World Anti-Doping Agency (WADA), which is the outfit that tries to police Olympic sports to prevent certain categories of drug-taking and other activities deemed to be unfair. 

The criteria the WADA uses to decide whether a given substance or practice should be banned are as follows:  (1)  Does it have the potential to enhance, or does it actually enhance, sport performance?  (2)  Does it represent an actual or potential health risk to the athlete?  (3)  Is it contrary to the spirit of the sport?

By now, Halo and similar tDCS firms have been able to show repeatable positive results when athletes train for their particular sports while wearing tDCS rigs.  The devices are not used during an actual competition, because their usefulness consists in aiding what is sometimes called “muscle memory.”  In stimulating the neurons associated with voluntary movement, tDCS makes it easier to acquire the particular patterns of nerve behavior that make optimum use of one’s muscles.  Even now the fine details of how tDCS does this are not entirely clear, but a lack of total understanding of a technology has never kept entrepreneurs from selling something that works.  And although the results are not spectacular (increases in performance in the 2 to 5 percent range are typical), these marginal improvements are most valuable to professional athletes looking for that little extra something.

Of course, if tDCS becomes as common as Gatorade and everybody uses it, we’ll be right back where we started, except that the tDCS companies will have a guaranteed market indefinitely into the future. 

The second criterion of the WADA about safety seems not to be much of a concern with tDCS.  The technique itself has been studied with modern techniques for at least forty years, and no one has discovered any notable ill effects that tDCS has on most people, unless there is some underlying condition already present such as epilepsy. 

So if there is any ethical objection to tDCS, it would be based on the third criterion, namely that using it is not in “the spirit of the sport.”  And that’s a rather fuzzy phrase.

There are some things you can imagine that would enhance performance, wouldn’t be dangerous to the athlete, but would definitely be contrary to the spirit of the sport.  For example, if a shot-putter got into the ring and brought a carbide cannon with him (a little device that generates acetylene and then sets it off behind a projectile), and used it to hurl the shot a couple thousand feet, this would clearly not be in the spirit of the sport.  The point of shot-putting is to see how far you can throw the thing, not how far your cannon can throw it.  But once you start banning technological aids, it’s hard to draw the line, because all technology is the same kind of thing, in one sense.  It’s just the degree to which it helps that varies. 

It turns out that the WADA has effectively given tDCS a pass, and its use is spreading among athletes in a wide variety of sports, from swimming to cycling and beyond.  One practical concern that doesn’t show up explicitly in the WADA criteria is the question of how easy it is to detect a given technology’s use.  Carrying a carbide cannon into a shot-put ring is a fairly obvious thing to do.  But using a tDCS device only in training and not on the field would be almost impossible to detect after the fact; doing so would require continuous supervision by WADA personnel that the agency simply does not have.  I suspect this near-impossibility of detection played a major role in the agency’s decision to allow tDCS.

But that doesn’t answer the question of whether tDCS, or any other advanced technology that makes the body something other than what it was before, is truly in the spirit of any sport.  The answer to that question hinges upon one’s philosophy of what sports is all about.  Is it just a way that humans entertain themselves and others, no different in principle from watching a Star Wars movie?  Or is it a striving toward an ideal, a direct assault on the possible using only what you were born with and can acquire through personal discipline? 

Having no discernible interest in sports myself, I’m the wrong person to answer these questions.  But those who care need to think about what sports is really about before simply accepting advanced technologies such as tDCS, because one day you may wake up and realize that the sport you loved has turned into something a lot closer to Star Wars than you may like.

Sources:  The article on Halo Neuroscience’s prototype tDCS headset, “A New Kind of Juice” was written by Eliza Strickland and appeared on pp. 34-40 of the September 2016 print issue of IEEE Spectrum.  An online version of the article can be found at  I also referred to articles on tDCS and sports regulation at and  A news release about Halo equipment being used for the USA cycling team can be found on the company’s website at  I also referred to the Wikipedia article on tDCS.

Monday, August 27, 2018

The Morandi Bridge Collapse: Style Over Substance?

Riccardo Morandi (1902-1989) was an Italian civil engineer and bridge designer who was one of the earliest proponents of designs that used mainly prestressed concrete rather than mostly steel.  In 1967, a bridge he designed was put into service in Genoa, Italy.  It spanned a river, some railroad tracks, and other portions of the city with three tall pylons, each of which had concrete stays reaching diagonally down to the roadway, which was suspended some 145 feet (44 m) above the ground.  It came to be known as the Morandi Bridge, after its designer.

On Tuesday, August 14, during an intense rainstorm one of the tower-supported sections of the bridge suddenly collapsed.  As of today (Aug. 26), a total of 43 people have died as a result of the accident, not to mention injuries and property damage, which will total in the millions.  Government officials have called for the revocation of the contract with Autostrade per l’Italia, the private firm that handles highway maintenance in Italy.  One mourner at the state funeral held for many of the victims said that “In Italy, we prefer ribbon-cuttings to maintenance.” 

Engineering experts consulted by the media all said it was too soon to draw any conclusions about what might have caused the bridge to fall.  Bridges designed by Morandi have a history of requiring more maintenance than more conventional designs do.  The stark elegance that appealed to clients around the world looking for something distinctive to add to a city skyline was achieved by asking a great deal of the material in the bridges Morandi designed.  As we have mentioned before, plain concrete has almost no strength in tension, so to use it as a structural material, it has to be reinforced with steel “rebars” and other components that can withstand pulling stresses.  This would be especially true of the stays that slanted down from the tops of the towers to support the roadbed.  Over time, corrosion can attack these tension members, sometimes invisibly, deep within a vital part of the structure. 

The evidence of why the bridge collapsed lies buried in the huge piles of rubble that workers will need to clear carefully and methodically, and because such work is both enormous in scale and demanding in detail, it may be months or even years before we have an answer.  After a bridge in Minneapolis collapsed in August of 2007, it took over a year for the U. S. National Transportation Safety Board to issue its final report on the accident, which attributed the collapse to a design flaw that made a gusset plate too weak. 

The problem with forensic investigation of prestressed-concrete bridges is that concrete is a much more complex material than steel.  Unlike steel, which is fabricated under carefully controlled conditions in a steel mill, concrete is often formed onsite, and the way it is mixed, poured, and treated after pouring can influence its ultimate strength and other properties.  Nevertheless, most prestressed-concrete bridges withstand the stresses they were designed for, and so the reasons for the Morandi collapse will be interesting to discover, if they can be found.

While we still do not know whether the collapse was due to an initial design flaw or faulty maintenance, the question of maintenance for bridges and other vital pieces of infrastructure is an urgent one that industrialized nations all around the world are struggling with.  In 2017, the American Society of Civil Engineers (ASCE) gave the U. S. a D+ in its “infrastructure report card,” saying that 56,000 bridges (about 9% of the total) were “structurally deficient” in 2016.  While the situation has not reached such a crisis that we see bridges falling down every month, tragedies like the Morandi collapse remind us that the price of deferred maintenance is sometimes much higher than anyone would like to pay. 
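As a quick sanity check on the ASCE numbers (the 9% figure is rounded, so the implied total is only approximate):

```python
# If about 56,000 structurally deficient bridges are 9% of the U.S. total,
# the implied nationwide bridge count follows directly.
deficient_bridges = 56_000
deficient_fraction = 0.09

implied_total = deficient_bridges / deficient_fraction
print(round(implied_total))  # about 622,000 bridges nationwide
```

That implied total of roughly six hundred thousand bridges is why even a single-digit percentage of deficient structures represents tens of thousands of individual maintenance problems.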

It’s a little bit like preparing for war.  The only way you know you didn’t spend enough money on preparing for a war is if you lose it.  You can win with barely enough resources, or with three times more resources than you need, and the result is the same.  The art and science of maintenance consists in doing enough to prevent nearly all major tragedies and to do something about minor problems fast enough, while not simply wasting resources on painting a wall that doesn’t need painting, for example. 

Judging by the rarity of bridge collapses, most bridges were either built well enough to start with to survive many decades with whatever maintenance they’ve received, or have been maintained well enough to keep standing.  But the shock value of a major bridge collapse is one of the main motivators for public funding of infrastructure maintenance, which has none of the appeal of new construction. 

Engineers are mostly used to working out of the limelight, doing dull but necessary things like scheduling expensive maintenance that takes money away from flashier and more popular government activities.  Riccardo Morandi was something of an exception to this rule, attaching his name to striking bridge designs that caught the eye of the public time after time.  If there’s the equivalent of an Internet connection wherever he is, I’m sure he’s sorry to see what has happened to his creation in Genoa, whether the failure is due to him personally or to insufficient maintenance over the five decades the bridge has carried traffic since it opened.  But maintenance is a job for the living, not the dead, and engineers in charge of maintenance owe it to their constituent publics to make sure that tragedies such as the Morandi bridge collapse don’t happen.  We look forward to finding out what went wrong in Genoa a couple of weeks ago, and to applying those lessons so that future failures can be prevented before more people get killed.

Sources:  I referred to news reports on the accident carried by Time’s website at and The Guardian at  I also referred to Wikipedia articles on Riccardo Morandi and the I-35W Mississippi River bridge, and the ASCE report card at

Monday, August 20, 2018

Some Answers About the Panhandle Cornfield Meet of 2016

A “cornfield meet” in railroad parlance is a head-on collision between two locomotive engines.  Needless to say, such occurrences are avoided if at all possible.  But on the morning of June 28, 2016, two freight trains collided head-on in the Texas Panhandle, killing three people and causing an estimated $16 million in damage.  At the time I blogged about it, the only information available was news reports.  A few weeks later, the National Transportation Safety Board (NTSB) issued a preliminary report on the accident.  While the NTSB has not made public any additional data on the accident since then, the preliminary report makes clear that human error was likely at fault.

The BNSF line through the town of Panhandle is a single-track line, and two-way traffic is managed with a series of sidings.  The dispatchers, probably in the Fort Worth regional train control center, planned to switch the westbound train to a siding near the town, where it would remain while the eastbound train passed by on the main line.  In case the eastbound train arrived before the westbound train had time to move completely off the main line, two signals were set along the main line west of the eastern switch, where the westbound train was to leave the main line for the siding.  The first signal the eastbound train would encounter was solid yellow, which tells the engineer to slow the train to a maximum of 40 MPH and be prepared to stop at the next signal.  The second signal was set to red, which forbids the engineer from moving any part of the train past it. 

So the plan was for the eastbound train to slow down at the yellow signal and stop at the red signal, while the westbound train arrived at the eastern switch and eventually cleared the main line by running onto the siding.

What happened instead was this.  Before the dispatchers had a chance to change the eastern switch from the main line to the siding, the eastbound train passed the yellow signal on the main line going 62 MPH and the red signal at 65 MPH, heading through the switch on the main line straight for the westbound train.  When the engineer on the westbound train saw what was happening, he managed to jump from the cab.  But his conductor died in the resulting crash, as did the engineer and conductor on the eastbound train.  The NTSB report somewhat ruefully notes that positive train control (PTC) was scheduled to be installed on this section of track later in 2016, although planned PTC installations have suffered repeated delays.

PTC is a semi-automated system that promises to reduce the chances for human error in train operations.  A PTC system would have figured out that the two trains were heading toward a collision and would have at least slowed them down, if not prevented the accident entirely.  As it stands, the physical evidence points toward the crew of the eastbound train, who failed to respond to the clearly visible yellow and red signals in time. 
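The core idea behind that kind of automatic enforcement can be sketched in a few lines of Python.  This is a hypothetical simplification, not the actual PTC protocol; the rules simply follow the signal aspects described above (a 40 MPH limit past a yellow signal, a full stop required at a red):

```python
# A toy model of automatic signal enforcement: given the last signal aspect
# a train has passed and its current speed, decide whether an onboard system
# should intervene.  (Hypothetical logic, not the real PTC specification.)

def should_apply_brakes(last_signal: str, speed_mph: float) -> bool:
    if last_signal == "yellow":
        return speed_mph > 40   # must slow to 40 MPH or less past a yellow
    if last_signal == "red":
        return speed_mph > 0    # no movement past a red signal at all
    return False                # clear signal: no restriction

# The eastbound train at Panhandle passed the yellow at 62 MPH and the red at 65:
print(should_apply_brakes("yellow", 62))  # prints True
print(should_apply_brakes("red", 65))     # prints True
```

A real system must also account for braking distance, track grade, and train length, but even this toy logic would have flagged both signal violations at Panhandle before the collision.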

We may never know what distracted them, but people make mistakes from time to time.  And some mistakes exact a fearful penalty. 

While even one death due to preventable causes is a tragedy, some context to this accident is provided by a slim volume I have on my shelves:  Confessions of a Railroad Signalman, by James O. Fagan, copyright 1908.  It was written at a time when railroad-related fatalities (passengers and railroad employees combined) were running at about 5,000 a year, a much higher rate per train-mile than today.  Fagan’s concern was that railroad employees of his day had to deal with on-the-job pressures that encouraged them to take risks and shortcuts that flouted the rules, and that the management system was ill-equipped to discipline misbehaving employees. 

While much has changed in railroading since 1908, any system that relies on a human being’s alertness can still fail if the person’s attention flags.  And that seems to be what happened outside Panhandle, Texas on that summer morning in 2016. 

If and when PTC is installed on most stretches of U. S. railways, the hope is that fatal and costly accidents will decline to even lower levels than what we see today.  The limiting factor after that will be mechanical malfunctions, perhaps, or dispatching errors at a high enough level to overrule the PTC system.  In any case, we can expect rail travel and shipping to be even safer than it is now, which compared to 1908 is pretty safe already.

Machines and systems are deceptively solid-looking.  It doesn’t seem possible that thousands of tons of steel rolling stock and rails can change very fast.  But the way it’s used can change, and PTC promises to do that.  Eventually, I suppose that the nation’s entire rail system will be run by computers and will resemble nothing so much as a giant version of a tabletop model train, running smoothly and without collisions or hazards.  Of course, automobile drivers will still manage to stop on grade crossings and people will walk on train trestles, so those types of accidents can’t be prevented even by PTC.  To eliminate those types of accidents, we’d have to tear up the whole system and rebuild it the way the English built their rail systems from the start:  fenced-off railroad property, virtually no grade crossings (tunnels and bridges instead), and other means to keep people and trains permanently separated. 

But I suspect we as a society are not that exercised to eliminate the last possible railroad fatality from the country.  So instead, we will enjoy whatever benefits PTC brings along and hope that we personally can stay out of the way of the trains. 

And modern-day cornfield meets will at last join their ancestors as a historic footnote, a quaint disaster that simply can’t happen anymore.  Like soldiers dying on the last day of a war, the crew members who died in the 2016 accident may be among the last to depart in that singularly violent way.  But for those of us who remain, and whose continued survival depends on our being alert, whether behind the throttle of a locomotive or the wheel of a car, this story is a good reminder to keep awake and pay attention.

Sources:  The NTSB report on the June 28, 2016 Panhandle, Texas accident can be found in the agency’s listing of railroad incident reports at  For those with a certain type of morbid curiosity, there is a collection of silent movies of three or four intentionally-staged cornfield meets between steam locomotives that can be viewed on YouTube at  Confessions of a Railroad Signalman was published by Houghton-Mifflin. 

Monday, August 13, 2018

Exporting Enslavement: China’s Illiberal Artificial Intelligence

In 1989, I had the privilege of visiting Tiananmen Square in Beijing only a few months after the famous June Fourth protests that the Chinese government violently suppressed.  Our tour guide showed us black marks on the pavement that were left by fires during the conflict—a memory that has not faded.

Much has changed since then.  China is now a world leader in many areas of science and technology, including artificial intelligence (AI).  But the nature of the Chinese government has not changed, and as Ryan Khurana points out in a recent online article in National Review, its illiberal policies may transform AI into a weapon that similar governments around the world can use to enslave their citizens. 

To avoid confusion, I should define a couple of political terms.  In the sense I intend here, the term liberal refers to what political scientists call “classical liberalism.”  Simply put, a liberal government in this sense encourages the liberty of its citizens.  It acknowledges  fundamental rights such as the right to life, the rights to worship freely and express one’s views without fear of government reprisal, and the right to participate meaningfully in political affairs.  The intention of the founders of the United States of America was to form a liberal government in this sense.

By contrast, illiberal governments are top-down operations in which those in charge have essentially unlimited power over the mass of citizens.  Most monarchies were set up this way in theory, and from its founding the People’s Republic of China has behaved in a consistently illiberal way; it continues to do so under President Xi Jinping.  Anything that assists the Chinese government in spying on its citizens, learning about their private as well as public actions, and controlling their behavior so that they conform to a model pleasing to the government is going to get a lot of support.  And AI fits this bill perfectly, which is one reason why China is not only pouring billions into AI R&D, but exporting it to other countries that want to spy on their people too.

Khurana points out that Zimbabwe, an African country well-known for its human-rights abuses, has received advanced Chinese AI technology from a startup company in exchange for letting the firm have access to the country’s facial-recognition database.  So China is helping the government of Zimbabwe to keep tabs on its citizens as well.  Maybe Zimbabwe will come up with something like China’s recently announced Social Credit system, which is a sort of personal reliability rating that eventually every person in China will receive.  Think credit rating, only one based on the government’s electronic dossier of your behavior:  what stores you visit, what friends you have, what meetings you go to, what TV shows you watch, and whether you go to church. 

Khurana says that we are engaged in a kind of arms race reminiscent of the old Cold War conflict between the Soviet Union and its satellites, and what used to be called the Free World.  Back then, the game was to see whether the U. S. or the U. S. S. R. could dangle the most attractive technological baubles in front of this or that country to tempt it toward one side or the other.  It wasn’t only military technology, but weaponry was the trump card. 

Things are different now, and AI is not like a howitzer—you can do lots of things with it, both peaceful and warlike.  Or liberal and illiberal.  But unless a smaller country has already developed a capable AI technological base of its own, it is likely to want only turn-key systems already designed to do particular things.  And companies in China who have learned how to help the government use AI to monitor and control people will naturally find it easiest to help other governments do the same illiberal thing.

Khurana says the U. S. side is currently losing this battle.  The federal government and military have been slow to get up to speed on using AI for defensive purposes.  When the Department of Defense tried to engage Google in a cooperative AI project to identify terrorists, the company pulled out, and other attempts to use AI in the military have been stymied because critical pieces of intellectual property turn out to be linked to Russian or Chinese ownership. 

There are two aspects to this problem.  The international aspect is that around the world, Chinese AI is coming with illiberal strings attached, and governments with little interest in protecting their citizens’ freedom are eagerly following China’s lead in using AI to spy on and suppress their peoples.  The domestic aspect is that the U. S. is going perhaps too far in the direction of pretending that we are all one big happy AI family, and that we can get AI technology from anywhere in the world and use it for our own private, liberal, or defensive purposes. 

But the world is not that way.  Back when wars depended mainly on hardware, nations contemplating future conflicts made sure they stockpiled essential materials such as tungsten and vanadium before starting a war.  Now that international conflicts increasingly involve cyberwarfare and AI-powered technology, it is foolish and shortsighted to ignore the fact that China is flooding the globe with its AI products and services, and to pretend we don’t have to worry about it and will always be able to outsmart them.  Physical weapons have a way of being used, and the same is true of AI designed for illiberal purposes.  Let’s hope that freedom doesn’t get trampled underfoot in the rush of many countries to get on the Chinese AI bandwagon.

Sources:  Ryan Khurana’s article “The Rise of Illiberal Artificial Intelligence” appeared on the website of National Review on Aug. 10, 2018 at

Monday, August 06, 2018

In Professionals We Trust—Or Do We?

In a recent New York Times opinion piece, science journalist Melinda Wenner Moyer bemoans the fact that vaccine researchers are getting paranoid about publishing scientific papers that contain anything negative about vaccines, out of fear that the anti-vaccine movement will weaponize such results.  This problem has important implications for public trust in professionals generally, including engineers.

First, a little background.  Life before vaccines was shorter and riskier.  Smallpox, diphtheria, tetanus, and the lesser but still potentially fatal childhood diseases of measles and mumps killed millions and left survivors scarred for life or otherwise disabled.  That is why the world’s most advanced thinkers, including the New England minister and Princeton president Jonathan Edwards, embraced smallpox inoculation, the cruder and riskier forerunner of the vaccination that Edward Jenner would introduce in 1796.  Unfortunately, when Edwards was inoculated during an outbreak of the disease in 1758, it led to full-blown smallpox that killed him.

Such methods were crude back then, and over the following decades, the smallpox vaccine was refined to the point that in 1980, the World Health Organization declared that smallpox had been eradicated.  But Edwards’ death is a reminder that progress isn’t uniform, and bad news as well as good news has to be shared among professional practitioners if progress in any technology is to be made.

Up to about the year 2000, the attitude of the public in most industrialized nations toward vaccines was almost uniformly positive, and not controversial.  Each new and more effective vaccine, such as the Salk and Sabin vaccines against polio in the 1950s, was hailed as one more example of science’s triumph over disease.  Then in 1998, a gastroenterologist named Andrew Wakefield published the results of a small study based on 12 cases that seemed to indicate a link between autism and the measles-mumps-rubella (MMR) vaccine that was routinely given to millions of small children every year.

Wakefield’s paper was published in the respected medical journal Lancet, and created a huge controversy.  Parents of autistic children now had something on which to blame the mysterious syndrome, and as time went on, activist groups of parents formed and made Wakefield a hero.  The nascent Internet became a powerful tool in the hands of these groups, as it bypassed the usual peer-review process that scientists must adhere to and enabled isolated parents of autistic children to band together.  The failure of any subsequent scientific studies to confirm Wakefield’s findings didn’t slow down the anti-vaccine movement significantly.

It wasn’t until 2004 that serious questions were raised about Wakefield’s integrity.  It turned out that he was being paid by attorneys who wanted to sue vaccine manufacturers, and after further investigation revealed that Wakefield had fabricated some data, Lancet withdrew the paper and Wakefield had his British medical license revoked.  But the horse had left the barn long before that.  Currently, many well-educated and otherwise rational people refuse to have their children vaccinated for what are generally termed “philosophical reasons.”  As epidemiologists know, there is a threshold for the percent of unvaccinated people in a given population above which the risk of epidemics increases rapidly, and widespread refusal to vaccinate is partly blamed for recent outbreaks such as the 147 cases of measles centered at Disneyland in California in 2015. 
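The threshold the epidemiologists refer to can be sketched with a standard back-of-the-envelope formula (my illustration, not from the sources cited here; the measles figures are rough literature estimates): if a disease has a basic reproduction number R0, the average number of people one case infects in a fully susceptible population, then sustained spread becomes possible once the immune fraction falls below 1 − 1/R0.

```python
# Illustrative herd-immunity threshold calculation.  For a disease with
# basic reproduction number R0, outbreaks can be sustained once the
# immune fraction of the population falls below 1 - 1/R0.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to block sustained spread."""
    if r0 <= 1:
        return 0.0  # a disease with R0 <= 1 dies out on its own
    return 1.0 - 1.0 / r0

# Measles is among the most contagious diseases known, with R0 commonly
# estimated at roughly 12 to 18.
for r0 in (12, 18):
    print(f"R0 = {r0}: about {herd_immunity_threshold(r0):.0%} must be immune")
```

For measles this lands in the low-to-mid 90s percent, which is why even a modest pocket of vaccine refusal, on the order of 5 to 10 percent of a community, can reopen the door to outbreaks like the one at Disneyland.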

This story of the anti-vaccination trend is perhaps one of the clearest examples of what is a relatively new thing in Western civilization: widespread distrust of expert authority.  Back when everyone knew someone who had died of smallpox and many survivors bore scars, the promise of being able to immunize yourself and your offspring against such a terrible disease was so attractive that intelligent people such as Jonathan Edwards took the risks of what was by modern standards a very dangerous procedure.

Today, when the chances of anything bad happening from a vaccination are well known and down in the fifth decimal place (a few per 100,000), and the ill effects of not getting vaccinated are also well known and clearly worse than taking the vaccine, why would anybody refuse, especially on behalf of their innocent children?  Clearly, because they believe in something or someone other than the conventional scientific wisdom represented by institutions such as the medical profession, government and private research organizations, and even people as supposedly trustworthy as their own family doctor.

The problem with all this is that some professionals really do know more about a subject than non-professionals, and when experts talk about their own fields, they are generally more worth listening to than some random website you find with Google.  The paranoia among vaccine researchers that Moyer discusses is a sad result of ignoring this basic fact of life. 

It’s like a child who is repeatedly accused falsely of stealing from the cookie jar.  If he’s punished often enough for something he didn’t do, he may go ahead and steal anyway, figuring he’s going to get blamed for it whether or not he’s done it, so he might as well enjoy the ill-gotten gains of stealing, because the negative consequences will be the same.

In embracing bogus and disproved theories of harm from vaccines, anti-vaccine groups appear to be creating the very behavior they suspected was already happening among scientists:  namely, a reluctance to report negative aspects of vaccine use.  Of course, this will cripple any efforts to improve vaccines, because you have to know what went wrong before you can fix it. 

Let’s hope that engineers keep their collective noses clean in this regard.  Few polls of trust in the professions even ask the public about engineers.  I had to dig for a while before I came up with a global poll from 2015 that lumped engineers in with technicians, and that combined group came in on the trust scale about in the middle, just below pilots and just above soldiers.  Firefighters were the most trusted profession, and bankers the least.

Things could be worse, certainly.  In this fishbowl Internet age when anybody who says anything eye-catching, whether true or not, is liable to become world-famous overnight, engineers need to be especially careful in their public pronouncements.  It’s good to let the public know your considered expert opinion about something.  But first, be sure you’re right.  Lying about a matter of expert opinion that’s of vital interest can create harmful effects that go on for decades, as the anti-vaccine movement has shown.

Sources:  Melinda Wenner Moyer’s opinion piece entitled “Anti-Vaccine Activists Have Taken Vaccine Science Hostage” appeared on the New York Times website on Aug. 4, 2018 at  I referred to the website for information about the Wakefield controversy, to about the Disneyland measles outbreak, and to for the global survey of trust in various professions.  Jonathan Edwards’ death as the result of a smallpox vaccination is well known and reported in numerous sources.