Regular readers of this blog—all five of you—know that in the past I have had less than kind words for NASA regarding safety issues. The agency has become something of a poster child in engineering ethics circles, mainly because of the Challenger and Columbia space-shuttle disasters. Recently, however, under the leadership of Michael Griffin, NASA has shown signs of getting its act together. It has stuck to long-overdue plans to retire the current fleet of space shuttles by 2010, which as of this Thursday will be only next year. And it has embarked on ambitious but generally well-considered plans to develop a new way of getting to the moon and beyond: the Constellation program, which includes the new Ares series of partially reusable rockets and space capsules. In these plans, NASA is at least trying to do what's right, which is to get away from the increasingly antiquated and hazardous shuttle system toward newly designed systems that take advantage of thirty years of aerospace progress, while embracing safe tried-and-true technologies.
For example, the personnel-transporting Ares I rocket uses a liquid-fueled J-2X upper-stage engine, an updated version of the J-2 that powered the Saturn rockets of the Apollo program. The solid-fuel boosters employed by the space shuttle were a bargain-basement compromise that led directly to the first major shuttle disaster, and it will be good to see them go.
But the best-laid plans of mice and men, at least those employed by the government, are subject to political winds. News reports of a few weeks ago carried a story about a dust-up between Griffin and Lori Garver, a former NASA public relations officer who now heads the Obama transition team in charge of deciding what to do with NASA.
According to Time, the dispute arose when Garver and Griffin met in a NASA library during a book signing. Although details of the encounter vary by source, one issue was the possibility of either canceling the Ares I program altogether or switching to a cheaper alternative using existing launch vehicles such as the Atlas-Centaur, currently used for unmanned space projects. Griffin reportedly questioned Garver's engineering qualifications and said that such rockets are not safe enough to use for people.
Of course, it is within the prerogative of a new administration to replace Griffin with someone more cooperative. But if NASA has finally learned something from its past mistakes and is headed in a good direction, it would be a shame if the incoming administration forced it into the same nickel-and-dime mode that led to the twenty-eight-year dangerous compromise called the space shuttle.
The environment that NASA operates in now is very different from the situation three decades ago. In the late 1970s, the only credible competitor in space was the old USSR, which had pretty much thrown in the towel when the U. S. reached the moon in 1969. Today, all kinds of folks are getting into the act: China, Japan, and India all have serious space programs, and Russia, far from having abandoned its own, will be our only link to the International Space Station from the time the shuttle is retired until we develop something better—assuming there are funds and political support to do so.
The appeal of space flight—especially manned space flight—has always been more emotional than rational, more tied to political prestige than to economic realities. From a strictly business point of view, the market is very thin—there simply aren't that many super-rich people willing to pay $20 million for a joyride in space. And while there is money to be made from orbiting satellites, that end of the business has become truly a business, in which the cheapest rocket that will do the job is used. For the foreseeable future, flying people outside the atmosphere is something that will always be a money-loser. So the agencies charged with doing it have to justify their existence on grounds other than profit.
For NASA, this means two things: science and the romantic appeal of space exploration. As far as science goes, the Hubble Space Telescope has proved very fruitful, but note that the only time we have to send people up to it is when it breaks down. With advances in autonomous robotic instruments, even exploration of other planets can be done much more economically with unmanned probes. So sending people into space doesn't really further science, except to add to our knowledge of what happens to people when they spend long periods of time in space. (The short answer to that is, nothing much good.)
So that leaves romance and a kind of quasi-religious feeling as the real basis for manned space flight. Griffin is reportedly a believer in the idea that mankind's long-term destiny is to colonize space, whether other planets in our solar system or planets in other solar systems. I have had some minor dealings with people who think this way, and it really amounts to a kind of religion. If you don't believe in the supernatural, and you think we are well on the way to trashing the only planet we can live on, that leaves space exploration as the only hope for immortality of the human species. After all, the sun is going to run out of gas here in a few billion more years, and then what?
I am generally well disposed toward idealism, if it is directed at worthy goals. And I'm all in favor of the scientific aspects of space exploration. But sending people into space costs a lot more than unmanned projects, and (needless to say) is more dangerous to those participating as well. In a democracy, public expenditures have to be justified by the economic or political good that they can achieve. These political goods include maintaining what prestige we have in the community of nations, satisfying the desires of those who see space as the final frontier, and making heroes, without whom few nations make it for long. It is up to the Obama administration and the new Congress to decide whether NASA's plans are worth the cost. Whatever happens, I hope that we end up with something to replace the shuttles that is orders of magnitude safer and will serve the public in the best way possible—whatever that is.
Sources: Reports of the Griffin-Garver controversy I used can be found at the Time Magazine website http://www.time.com/time/nation/article/0,8599,1866045,00.html
and the Orlando Sentinel website http://blogs.orlandosentinel.com/news_space_thewritestuff/2008/12/nasa-has-become.html.
Monday, December 29, 2008
Monday, December 22, 2008
A Riveting Story: The Titanic's 94-Year-Old Mystery Solved
The sinking of the White Star Line's luxury passenger liner Titanic on the night of April 14-15, 1912 has got to be one of the most famous engineering failures in history. Everybody knows the story: how the ship ran full speed into an iceberg despite warnings relayed by the then-new wireless, and sank less than three hours later with the loss of over 1500 lives. Since the discovery of the wreck in 1985, researchers have been able to recover hundreds of artifacts and subject them to modern forensic analyses. Two of these researchers—metallurgical experts Jennifer Hooper McCarty and Tim Foecke—have written a book about their discoveries. What Really Sank the Titanic clears up a long-standing mystery about the tragedy and points the finger of blame in a surprising direction.
Boards of inquiry held immediately after the disaster obtained enough information from survivors to piece together the following story. At about 11:40 PM, as the Titanic moved at about 22 knots through a near-freezing sea "as smooth as glass," lookouts spotted an iceberg in the path of the ship. The steersman had just begun to turn the bow to port when the berg scraped along the starboard side of the ship, making a long-lasting noise that was described variously as tearing, jarring, or ripping. Although the doors between the ship's sixteen watertight compartments were immediately sealed when it was discovered that water was coming in, there were enough holes along the hull that six compartments flooded, more than the ship was designed to survive, and she sank. The ship's designer, Edward Wilding, said at one inquiry that a long, narrow series of slit-like openings about two hundred feet long and only an inch or so wide would have accounted for the fashion and speed with which the ship foundered. But since steel is much harder than iceberg ice, he could not explain how such openings could have occurred. There the mystery lay at the bottom of the North Atlantic for more than seventy years, until salvage expeditions began to bring pieces to the surface.
In a decade-long investigation, McCarty and Foecke, respectively graduate student at Johns Hopkins University and staff member at the Gaithersburg, Maryland office of the National Institute of Standards and Technology, obtained samples of the Titanic's hull, which consisted of large steel plates held together by rivets (electric-arc welding was not to become the standard steel-fabrication method until World War II). McCarty's archival research in England revealed that Harland & Wolff, the ship's Belfast builders, used two kinds of rivets: the more modern machine-formed steel rivets for the central part of the ship, and the old-fashioned hand-formed wrought-iron type for the stern and bow sections, where much of the collision impact probably occurred.
The making and installing of wrought-iron rivets was largely a manual operation. The Titanic needed over three million rivets in all, and this huge demand led flocks of entrepreneurial ironmakers to enter the field. The hand-stirred "puddling" process then used to make wrought iron from pig iron required strong and highly experienced workers, of whom there were not enough at the time. So it turned out that Harland & Wolff bought wrought iron from a wide variety of suppliers, some of whom were much less experienced than others. McCarty and Foecke have proof of this in the form of long, stringy slag inclusions they found in some of the recovered rivets. These inclusions tended to make wrought iron, already a less satisfactory material than steel, even weaker.
Why weren't steel rivets used throughout? Besides reasons of cost, steel rivets had to be formed with hydraulic riveters—large U-shaped steel machines upwards of six feet high that had to be laboriously positioned on either side of the plate to be riveted. Then the rivet, shaped much like a blunt round-headed nail, would be heated, inserted into its hole through the two overlapping hull plates to be joined, and squeezed between the jaws of the riveter. This squeezing formed heads on both ends, and as the rivet cooled, the resulting shrinkage provided tension that held the two steel hull plates together in a watertight joint.
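As a rough sanity check on that mechanism, the clamping stress from cooling can be estimated with the standard restrained-thermal-contraction formula, stress = E times alpha times the temperature drop. Every number in the sketch below (the modulus, the expansion coefficient, the assumed temperature drop, the yield strength) is a generic textbook value for mild steel, not a figure taken from the book:

```python
# Back-of-envelope estimate of the clamping stress a cooling rivet develops.
# All values are generic illustrations, not measurements from the Titanic.
E = 200e9               # Young's modulus of mild steel, Pa
alpha = 12e-6           # thermal expansion coefficient of steel, 1/K
dT = 600.0              # assumed cooling from red heat to ambient, K
yield_strength = 250e6  # typical mild-steel yield stress, Pa

# If the shank were perfectly restrained, the thermal strain alpha*dT
# would convert entirely into elastic tensile stress:
thermal_stress = E * alpha * dT

# That estimate far exceeds yield, so in reality the rivet simply yields
# as it cools, and the clamping stress saturates near the yield strength:
clamping = min(thermal_stress, yield_strength)

print(f"fully-restrained elastic estimate: {thermal_stress / 1e6:.0f} MPa")
print(f"clamping stress (capped at yield): {clamping / 1e6:.0f} MPa")
```

The point of the estimate is that a properly driven rivet ends up clamping the plates at roughly its own yield stress, which is why a sound riveted lap joint can be watertight in the first place.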
At least, that was how it was supposed to work. The problem was that there was not enough room in the bow and stern areas to maneuver the hydraulic riveter. So in those areas the builders resorted to the older hand-forming way of riveting, which couldn't use steel rivets, because steel cools and hardens too quickly to be closed reliably by hand. Wrought iron was more forgiving of the delays and variations involved when a boy tossed a red-hot rivet from a portable stove to the rivet gang, which placed it in its hole and pounded it in by hand.
When the Titanic embarked on her maiden voyage on April 10, 1912, her bow hull plates were held together by wrought-iron rivets. The iron itself had probably never undergone any systematic quality testing, and the only quality test done on the finished riveting job was a hurried hammer tap by an inspector, who listened to the sound it made. All this inspection could detect was loose rivets, not those made from defective wrought iron.
Then came the iceberg. While ice itself will crumble if forced against solid steel, the typical iceberg was a lot heavier than the Titanic. So in a glancing collision, the iceberg exerted tremendous localized force against only a few hull plates at a time. While even poorly-made rivets can withstand the mainly sideways (shear) stress that uniform pressure causes (e. g. hydrostatic pressure on a water tank or a ship's hull), some of the forces that the iceberg caused tended to pull the hull plates apart, causing tensile stress. And the researchers found that wrought-iron rivets made of bad iron with lots of slag inclusions pop their heads off much more easily than either steel rivets or wrought-iron rivets made with better material. Significantly, many of the steel plates recovered from the wreckage were missing their rivets altogether. And riveting is not a gracefully-degrading fastening method. Once one rivet in a row pops, the ones next to it get much higher stress and are likely to fail as well, leading to a kind of chain-reaction zipper effect.
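The chain-reaction zipper effect can be illustrated with a toy load-sharing model. Everything below (the number of rivets, the strength ranges, the unit load, the random seed) is invented purely for illustration and is in no way the researchers' actual analysis; the point is only the qualitative behavior, namely that a row of sound rivets holds while the same row salted with a few weak ones can unzip entirely:

```python
import random

def rivets_lost(n_rivets=50, n_defective=5, seed=2):
    """Toy cascade model: each rivet carries one unit of load; when a
    rivet's load exceeds its strength, it fails and its load shifts to
    the nearest surviving rivet. Returns how many rivets fail."""
    random.seed(seed)
    # Sound rivets hold their one-unit share with room to spare...
    strength = [random.uniform(1.5, 2.0) for _ in range(n_rivets)]
    # ...but a few slag-ridden ones cannot even carry one unit.
    for i in random.sample(range(n_rivets), n_defective):
        strength[i] = random.uniform(0.5, 0.9)

    load = [1.0] * n_rivets
    failed = [False] * n_rivets
    changed = True
    while changed:
        changed = False
        for i in range(n_rivets):
            if not failed[i] and load[i] > strength[i]:
                failed[i] = True
                changed = True
                survivors = [j for j in range(n_rivets) if not failed[j]]
                if not survivors:
                    break
                # Dump the failed rivet's load on its nearest neighbor.
                nearest = min(survivors, key=lambda j: abs(j - i))
                load[nearest] += load[i]
                load[i] = 0.0
    return sum(failed)

print(rivets_lost(n_defective=0))  # all sound rivets: the row holds
print(rivets_lost(n_defective=5))  # a few weak rivets unzip the whole row
```

With no defective rivets, every rivet carries its share comfortably and nothing fails; with a handful of weak ones, each pop overloads a neighbor past its strength and the failures march down the row until none are left.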
That is exactly what McCarty and Foecke say must have happened as the iceberg bounced repeatedly along the side of the ship, popping rivets and opening up long, narrow slots between hull plates—exactly what designer Wilding said in 1912 must have happened, though he couldn't explain exactly how. The researchers also show in detail how a rival theory—one that says the cold Atlantic waters made the plates themselves brittle enough to shatter like glass—is full of holes, so to speak.
So the roots of the Titanic disaster prove to go in several directions: to the heedlessness of the captain who failed to slow down in a known field of icebergs, to the rulemakers who didn't require lifeboats for everybody, and, surprisingly, to little mom-and-pop wrought-iron puddling operations that sprang up all over the United Kingdom in response to increased demand for wrought iron. McCarty and Foecke conclude that if all the ship's rivets had been steel, the ship still might have sustained serious damage, but not so much as to sink it in less than three hours. Even a few hours longer afloat could have given time for nearby ships to arrive and save most or all the passengers. But that was not the way it happened.
Sources: What Really Sank the Titanic was published in 2008 by Citadel Press. I also thank my wife Pamela for her thoughtfulness in this birthday-gift selection.
Monday, December 15, 2008
Explosion in Tyler: More Questions than Answers
Over the weekend I heard a short news item about an explosion and fire at an oil refinery in Tyler, Texas. Here is the fruit of less than an hour's web research on what happened:
Tyler is a town of about 100,000 in East Texas. Among its industrial facilities is a smallish oil refinery (it's the 94th largest in the U. S.) owned by an Israeli company called Delek USA. On Thursday afternoon, Nov. 20 of this year, a part of its saturated gas unit exploded and caught fire. Four employees of the company were hospitalized, and two of them died later of their injuries. About 2,000 gallons of gasoline spilled into a nearby creek during the accident, but didn't catch fire. Five employees have filed suit in a Houston court over injuries suffered in the blast. The U. S. Chemical Safety and Hazard Investigation Board has not yet posted a notice of investigation for this accident.
The plant has been shut down since the explosion and no one is saying when it might start up again.
As oil refinery accidents go, this one is not in the major leagues. The 2005 BP refinery accident in Texas City, Texas that killed 15 people is probably the most recent leader in that regard, if we judge by the number of fatalities. But the death or injury of even one person in any engineered facility is the result of something that shouldn't have happened.
I wish I could present you with a complete story of exactly what went wrong in Tyler that day and how it could have been prevented. But alas, it is not my job to gather such information from primary sources. Investigators from insurance companies or perhaps the Federal government will undertake that arduous and exacting task, equipped with tools and knowledge that such specialists have. In a few months, perhaps, the truth will emerge about what caused the accident. If history is any guide, human error will have turned out to play some role.
There are a lot of good reasons why refineries are such dangerous places. Just handling millions of gallons of highly flammable liquids and gases involves considerable risks on its own, even if you're not doing anything to make matters worse. But that is how refineries operate: you take combustible crude, run it through pipes surrounded by intense flames, squeeze it under tremendous pressure that will use any slight excuse to shoot flammable stuff out into the air where it will spontaneously burn because it's so hot, and then subject it to all sorts of chemical indignities with catalysts, further heating, pressure, toxic acids, and so on.
It would be bad enough if it were simple, but refineries are some of the most complex large systems on Earth: thousands of pipes, pumps, valves, tanks, sensors, actuators, and other stuff, all having to be operated just so or else you're in big trouble fast. The fact that we don't have refinery accidents every day is a testimony to the incredibly disciplined management and training that has to go into good refinery operations. Refinery employees have hard, dangerous jobs, and most of them do their jobs well. But not always, especially if they or their managers get careless or squeezed by cost issues into deferring necessary maintenance or safety practices.
Equipment failure, while not unheard of, is something that can almost always be prevented. The physics and chemistry of how steel fails and how chemicals behave are well enough understood that chemical engineers can predict what's going to happen in almost any given case. The problem is making sure that knowledge is applied at the right time in the right way.
The fact that the Tyler plant is owned by an offshore firm may not have any bearing on the accident. But responsibility is a funny thing: like radio waves, it tends to weaken over long distances. This is not to say that all companies based in the U. S. act more responsibly than any foreign firm—that is silly. But the point is that if nationalism means anything at all, and there is abundant evidence to show that it does, people are going to be more conscientious about protecting the lives and wellbeing of fellow citizens before they will look out for the interests of foreign nationals. It is a natural human tendency, but one that must be fought against when the situation of foreign ownership of plants arises. Years ago, when it was more common to find American firms owning offshore factories, the temptation was to neglect safety precautions for the local populations abroad, and tragedies such as the Union Carbide Bhopal disaster in India were the result. Now that ownership of manufacturing facilities seems to have become anathema to U. S. businesses, the shoe is on the other foot. The refineries and plants that we do have here are increasingly owned by foreign firms, whose owners may or may not be as careful to ensure safe working environments as U. S. owners might be.
We will simply have to wait to find out more about what happened in Tyler last November 20. But let's hope that any lessons we can find out will be learned well by everyone who has anything to do with oil refinery safety or engineering.
Sources: I have used information from the following sources: http://www.haaretz.com/hasen/spages/1044426.html, http://www.tylerpaper.com/article/20081120/NEWS08/811200292, and http://oilspot2.dtnenergy.com/e_article001272178.cfm?x=b11,0,w.
Monday, December 08, 2008
Michael Polanyi and the Purity of Science
Everybody agrees that engineering and technology these days depend on science. Modern engineering is inconceivable without the advances made possible by the Scientific Revolution, both in our state of knowledge and in our approach to knowledge itself. One of the twentieth century's most profound thinkers about the scientific way of knowing and its connection to technology was Michael Polanyi (1891-1976).
Born of Jewish parentage in Hungary, he obtained a Ph. D. in chemistry and began a career as a research chemist in Germany. He married a Catholic there and left with his family for England shortly after Hitler came to power in 1933. At the University of Manchester, his interests turned gradually from chemistry to the philosophy of science. When he was invited to give the prestigious Gifford Lectures for 1951-52, he focused on the personal nature of supposedly "objective" scientific knowledge and published his thoughts in his best-known work, Personal Knowledge, in 1958. Fifty years later, his arguments about the purity of science and its relationship to technology are probably worth heeding even more now than when he wrote them.
Personal Knowledge is a deep and wide-ranging book, and all I want to do today is to give you a small sample that touches on the relationship between science and engineering, and how we allow the practical needs of technology to dominate the way we do science at our peril.
If you write a proposal these days to the U. S. National Science Foundation for even the most abstruse and theoretical project, you will have to address two questions in your proposal. The first, appropriately, is "What is the intellectual merit of the proposed activity?" That is, how will this work advance the state of scientific knowledge? This is an entirely reasonable criterion, and one which has been in place since the early days of the Foundation in the 1950s. However, the answer to a second question is now given equal weight in the funding reviews: "What are the broader impacts of the proposed activity?" In an explanatory paragraph, the Foundation expands on this second question thus: "How well does the proposed activity broaden the participation of underrepresented groups (e.g., gender, ethnicity, disability, geographic, etc.)? . . . What may be the benefits of the proposed activity to society?" Unless an investigator can muster adequate answers to both main questions, the proposal stands practically no chance of getting funded.
Now the NSF distributes taxpayers' money, and the taxpayers have a right to know what they're getting. But contrast that second question with Polanyi's words about why scientists should do science. In describing how science fares in developing countries, he says that "it suffers from a lack of response to its true values. Consequently, the authorities grant insufficient time for research; politics play havoc with appointments; businessmen deflect interest from science by subsidizing only practical projects. . . . Encircled today between the crude utilitarianism of the philistine and the ideological utilitarianism of the modern revolutionary movement, the love of pure science may falter and die. And if this sentiment were lost, the cultivation of pure science would lose the only driving force which can guide it toward the achievement of true scientific value. The opinion is widespread that the cultivation of science would always be continued for the sake of its practical advantages. . . . The scientific method was devised precisely for the purpose of elucidating the nature of things under more carefully controlled conditions and by more rigorous criteria than are present in the situations created by practical problems. These conditions and criteria can be discovered only by taking a purely scientific interest in the matter, which again can exist only in minds educated in the appreciation of scientific value. Such sensibility cannot be switched on at will for purposes alien to its inherent passion [e.g., for writing answers to questions about the "broader impact" of research]. No important discovery can be made in science by anyone who does not believe that science is important—indeed, supremely important—in itself."
I wonder how far we in the U. S. have gone in the direction that Polanyi warned about. I see several signs that maybe we've gone quite a distance. Corporations have long since shuttered their basic-research divisions: Bell Laboratories is no more, IBM and Xerox are no longer spending large fractions of their income on basic research, and even before the present economic downturn, any CEO who authorized spending that couldn't promise a return in one to three quarters was asking for trouble from the stockholders. Following the fiasco of the Superconducting Super Collider in the 1990s, the axis of high-energy physics research left the U. S. and returned to its birthplace, Europe. We are relying on the Russians for the next few years of access to the International Space Station once the Space Shuttle is (finally!) terminated. And native-born Americans seem to be allergic to almost any kind of graduate study that doesn't promise quick financial rewards. That eliminates science and engineering graduate study for most of them, which explains the highly international flavor of most graduate schools, and by now most engineering and science college faculties.
That is great news for the folks who come to the U. S. to better their education, of course. We are 99% a nation of immigrants anyway, and I do not begrudge anyone a place in the great adventure of science and technology, no matter where they came from. But we can't rely exclusively on immigration from places which, in another generation or so, will have graduate schools as good or better than ours, partly because people in those countries seem to have learned what we're in the process of forgetting: that some science, in fact all true science, has to be pursued for philosophia—the love of knowledge, not for what it can do or how much money it can make. Some engineers and politicians may not like to hear that, but unless we're prepared to do without new science, we should at least recognize the problem, which, as in any engineering ethics issue, is the first step toward solving it.
Sources: The Polanyi quotations are from pp. 182-183 of Personal Knowledge: Toward a Post-Critical Philosophy (Chicago: Univ. of Chicago Press, 1958). The questions an NSF applicant must answer can be found in the Grant Proposal Guide on the NSF website, http://www.nsf.gov/pubs/policydocs/pappguide/nsf08_1/gpg_3.jsp.
Monday, December 01, 2008
Engineering Social Capital
Social capital is the network of personal relationships—memberships in associations, personal friendships, even people you say hi to at the grocery store—without which a society becomes just a collection of isolated individuals. Of course, even a highly dysfunctional society has some social capital, unless everyone is living as a hermit or a Robinson Crusoe with no human interaction of any kind. But social capital is at least as important to a society's well-being as the more familiar financial and physical kinds. In his 2000 book Bowling Alone, Harvard professor of public policy Robert D. Putnam showed the vital importance of social capital for all sorts of things, ranging from personal health to national prosperity. He also exhibited tons of evidence that the U. S. has suffered a long-term decline in social capital beginning around the 1960s and progressing through the rest of the twentieth century.
As I read Bowling Alone, I kept finding little pieces of my life being explained here and there, and I'm sure nearly everyone who reads it will as well. That Junior League thing that my mother belonged to—social capital. Those bridge games and backyard barbecues my parents were always having—social capital. (Did you know that in 1958, one of every three adults was a bridge player?) The stubborn decline in the percentage of eligible U. S. engineers who belong to professional societies such as the Institute of Electrical and Electronics Engineers over the last twenty years—social capital again.
Much of the change is generational. It turns out that the baby-boomer-parent generation which came of age in the 1940s and 50s also participated in a peak in social capital, as measured by everything from memberships in voluntary organizations, to voting and political action, to union membership and writing letters to newspapers. Those who came afterwards do less of all these things. In a series of state-by-state studies that compare the degree of social capital with public health, efficiency in government, economic strength, and so on, Putnam shows that the right kinds of social capital (there are bad kinds, it turns out) benefit these other socially desirable factors as well. In other words, if you live in a state with higher social capital, you're likely to make more money, live longer, and even be happier with your life, on average. So the fact that social capital in the U. S. is sinking is cause for concern.
What has this got to do with engineering? Only nearly everything. Many of the sources of the decline Putnam describes have their roots in technological changes: the rise of television in the late 1940s, the increased urbanization (and suburbanization) of America, even (although he doesn't mention it) technologies like air conditioning, which make staying inside with the windows closed on really hot days a viable option. Although the Internet was just gathering steam as Putnam finished his tome, I'm sure he'd have a lot to say today about the rise of computer-mediated relationships at the expense of face-to-face encounters.
Far from being a technology-is-bad Luddite, Putnam acknowledges that technology has helped to increase social capital in some ways. The automobile made travelling for social reasons easier, especially for young people. Having online relationships is better than no relationships at all. His main point is that many technologies have externalities—unintended consequences, basically—that often adversely affect social capital. When your choice of what to do in the evening was to go to a movie with friends or play a pickup basketball game, both activities involved you with other people. But now the options might be between watching your latest Netflix DVD or trying out your new videogame—typically both solitary pursuits.
Every engineer whose work affects the way people spend their time or money—which is nearly every engineer—should read this book. It's a big book, some 400 pages and notes, but it's easy to read, entertaining in spots, and toward the end Putnam goes on a little Sherlock Holmes quest to find the "culprits" responsible for the decline of social capital. Technology isn't the only one—the shift of women into the workplace, urbanization and sprawl, and several others helped as well—but it is a major enough player to justify the attention of any engineer whose product or service might tend either to bring people together or to isolate them from others.
A good example of a newly engineered product that potentially leads to beneficial social capital is Nintendo's Wii video game console. Unlike other games which require only rapid finger movements, the Wii console encourages the use of games that involve the player's whole body and can more easily involve two players at once. I haven't tried it myself, but if the reports are true, a video game that encourages two people to be playing the same game in the same room produces more social capital than otherwise.
So the next time you stop what you're doing and think about the social consequences of your work, try thinking along the lines of social capital: Will this product bring people together? Or will it be one more reason for a guy to sit alone in his room with his machinery? It's not the only factor, but it is a factor that engineers seldom consider. And Bowling Alone makes it clear that something needs to be done soon.
Sources: Robert D. Putnam's Bowling Alone was published by Simon & Schuster in 2000. For an explanation of externalities, see my blog "The Ethics of Externalities" on Nov. 3, 2008.
Monday, November 24, 2008
The Ethics of Financial Engineering
As I write this, the Dow Jones Industrial Average is somewhere south of 8100, down 35% or more from its 2007 high and showing few signs of fatigue in its downward trek. General Motors hourly threatens to go bankrupt, credit markets are doing an imitation of the last Ice Age, and newspapers are running old pictures of soup lines during the Great Depression of the 1930s, I guess to get us used to what's coming. For such a young recession, it's already gotten plenty of publicity. But in all the finger-pointing about whose fault it was that we got into this mess, I have not read anyone who has addressed the question of what might be called economic and financial engineering, and the ethics associated with it.
I remember being surprised the first time I heard that a couple of our better electrical engineering graduates got high-paying jobs with a credit-card company, of all places. But I'm surprised no longer when I hear that present or former engineers often get hired by banks, brokerage firms, and other outfits that deal in highly technical and complex financial machinery. The attention to detail and problem-solving skills that engineers learn can be applied fruitfully to finance and securities trading as well as electronics, and the pay can be better, too. I call it "machinery" although in reality it's mostly software and rules devised by lawyers and technical types such as former engineers and physicists. But the complexity is there, and there is a good argument that such complexity played a significant role in the current recession.
From what little I do understand about the situation, when all sorts of home loans (the good, the bad, and the ugly) were bundled together by means of software-mediated deals, they sold like two-dollar Miley Cyrus concert tickets at a middle school full of teenage girls. To make things more complicated still, financial institutions started selling things called "credit default swaps," which were some sort of unregulated insurance against the eventuality of loans turning bad. My point is not to explain these things in all their gory details (which I couldn't even if I had to), but to show that computers and technical people who can keep track of these things, and figure out the rules by which they operated, played essential roles in this situation.
Before electronic computers became generally available, the most complex math a banker had to deal with was figuring out compound interest, and there were tables for that sort of thing. The complexity of a given financial deal was limited by, among other things, the labor it would take to figure it out. If somebody came up with some kind of security that came with a formula that would take three women punching calculators for three days to figure out, nobody would have bought it.
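That old compound-interest arithmetic, once the outer limit of a banker's mathematics, is a one-line formula by modern standards. A quick sketch (the principal, rate, and term below are made-up figures for illustration, not from any source mentioned here):

```python
# Compound interest: A = P * (1 + r/n)**(n*t), where P is the principal,
# r the annual rate, n the number of compounding periods per year, and
# t the term in years. Illustrative figures only.

def compound_amount(principal, annual_rate, periods_per_year, years):
    """Future value of a principal under periodic compounding."""
    return principal * (1 + annual_rate / periods_per_year) ** (periods_per_year * years)

# $1,000 at 5% compounded monthly for 10 years
amount = compound_amount(1000.0, 0.05, 12, 10)
print(round(amount, 2))  # about 1647.01
```

This is the entire computational burden that the pre-computer trading world placed on its practitioners; everything since has been complexity that only machines can carry.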
Not so today. If you took all the computers away from today's traders, the whole system would come to an instant halt, not only because computers are the medium of communication (so-called "electronic trading" is involved in virtually all transactions), but because a lot of trades are initiated by automatic triggers that write buy and sell orders based on electronically reported prices.
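The automatic triggers described above can be sketched in a few lines. This is a hypothetical toy, not any real trading system's logic; the symbols, thresholds, and prices are all invented for illustration:

```python
# Toy sketch of a price-triggered order generator: each electronically
# reported price is checked against preset thresholds, and an order is
# written automatically, with no human in the loop. All thresholds and
# prices here are invented for illustration.

def check_trigger(price, buy_below, sell_above):
    """Return the order (if any) that one reported price would fire."""
    if price <= buy_below:
        return ("BUY", price)
    if price >= sell_above:
        return ("SELL", price)
    return None  # price within the band: no order

# A stream of reported prices for some hypothetical security
reported_prices = [101.2, 99.8, 100.5, 103.1]
orders = [order for p in reported_prices
          if (order := check_trigger(p, buy_below=100.0, sell_above=103.0))]
print(orders)  # [('BUY', 99.8), ('SELL', 103.1)]
```

Even this toy shows why the system halts without computers: the orders are a mechanical function of the price feed itself, generated at whatever speed the feed arrives.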
This is not to say that speculative booms and busts are possible only when you have engineer types and horribly complicated automated trading involved. The classic textbook example of a boom-and-bust phenomenon was the tulip-bulb mania of the early 1600s. Substitute tulip bulbs for bundled home mortgages, and you can see the same psychology at work: rising prices, a spreading perception that investing in tulip bulbs is a great way to make money fast, a few people made richer but only if they cash out early, and then reality sets in: hey, we're only talking about tulip bulbs here! What's the big deal? And the crash follows, wiping out thousands of tulip-bulb plutocrats.
Engineers or technical people are not to blame for the mass psychology of crashes. But as they are often endowed with perhaps an above-average grasp of logic and what used to be called common sense, I would hope that they could serve as a kind of reality check or brake on things when matters really get out of hand. Of course, engineers working for a firm whose whole existence is based on complex derivatives or credit-default swaps or tulip-bulb futures are not going to have long, stable careers in such firms if they start questioning the fundamental assumptions on which the operation is based. On the other hand, it's looking like they won't have long, stable careers anyway, now that many of the outfits are going broke.
I have no illusions that many of my readers are working in the financial industry. But if you ever happen to end up either working in it or dealing with it, remember that when a deal gets so complicated and computerized that even the people who are buying and selling it don't really understand it—then maybe it's too complicated. Complexity in the service of necessity is one thing, but complexity simply to confuse the buyer is wrong. And it looks like there were a lot of confused buyers out there who have lost faith in their vendors, to the detriment of the economy as a whole.
Monday, November 17, 2008
Gateways to Engineering: "Our Mr. Sun" 52 Years On
In my perhaps overly generous definition of engineering ethics, I consider the question of why people become engineers a legitimate topic of enquiry within the field. If for some mysterious reason young people all of a sudden lost interest in becoming engineers, we'd have real problems getting engineering projects done, and all the good things that result from engineering wouldn't happen. Also, even though it's been two years since I did a blog on the movie "The Prestige," I'm still getting comments about it. So these two factors lead me to draw your attention to a curious film called "Our Mr. Sun." But first, some context.
The nineteen-fifties were unique in many ways, some good, some not so good. Having helped to knock flat nearly every other industrialized country's manufacturing capabilities during World War II, the United States enjoyed an unprecedented decade of prosperity as we produced a lot of neat new stuff that the rest of the world wanted. One beneficiary of this abundance was the Bell System, which back then legally monopolized the U. S. telecommunications market. With a small portion of their government-regulated profits, Ma Bell devoted itself to what it perceived as good works, the nature of which some historian of technology ought to explore one of these days. One of these good works was a well-funded series of entertaining and (they hoped) educational films, which became known as the Bell Laboratories Science Series.
It sounds like what we call "infotainment" today, but compared to today's thirty-minute ads for weight-loss nostrums, the Bell films are almost the exact opposite. Today's infotainment is produced as cheaply as the advertisers can get by with it; Bell went out and hired top-notch directors such as Frank Capra, and gave them pretty nearly a free hand and generous funding. The whole point of today's infotainment is to sell you something; the first film in the Science Series, an hour-long production called "Our Mr. Sun," has about three minutes devoted to the Bell System's early research in solar cells, and one gets the feeling that if Mr. Capra had decided to cut it for a good artistic reason, the Bell people would have swallowed their pride and gone along with the cut.
When one asks why a telecommunications monopoly would spend their shareholders' money on such an apparently profitless enterprise, the non-historian is reduced to guesswork (the real answer may be buried in the AT&T Archives in New Jersey, but my historical-research travel budget for this blog is busted). One reason might have been that Bell, and a lot of other people besides, were worried that U. S. citizens didn't know enough about science, and needed to know more. Although the film's format combined animation and live action, it was clearly intended for a wider audience than the kiddies, since it reportedly aired on television at 10 PM, which was not a good time if you wanted a lot of children in your audience. On the other hand, over 600 prints were later distributed to schools and civic organizations, so reaching the younger generation must have been at least a part of their intentions.
One way to get a grip on why they did it is to watch the film, which I did last night. (It is now in the public domain, and a URL for a streaming source is given below.) Technically, the story is in the form of an allegory a la Pilgrim's Progress, with characters such as Mr. Research (played by professor of literature Frank Baxter), Mr. Writer (Eddie Albert), and the voices of Mr. Sun (Marvin Miller) and Father Time (allegedly played by Lionel Barrymore, in his last role). These latter two worthies appear via animated segments, which makes it easier to conduct interviews with beings such as stars and personifications of non-material ideas.
Capra knew his way around lively stories, and the film holds up surprisingly well in both the dramatic and the technical senses. Dramatically, it does not induce that cringing sensation that the hyper-corniness of so many didactic films of the 1950s produces in us today. Capra manages to get across three thousand years' worth of history, from the sun's original status as a god to the present view of it as a flaming ball of gas, and pinpoints the turning point with the name of a specific Greek philosopher. Then the viewer is treated to such things as the physics of nuclear fusion as explained by a magician, the problem of future sources of energy as symbolized by an "energy bank" measured in horsepower-hours, and speculation that future energy shortages will be solved in the short run by nuclear energy, of course, but that eventually we might run out of uranium and then would have to develop better solar cells.
It's easy to throw rocks at such clouded crystal balls (to mix a metaphor), but the science that was state-of-the-art then was explained well. We get to see a clip of Hans Bethe, who originated the explanation for the carbon cycle of nuclear fusion in the sun, and get treated to what scientists knew about how chlorophyll (or rather, Chloro Phyll, a diminutive cook in a plant's metaphorical chemical kitchen) turns sunlight and water and carbon dioxide into sugar. In one of the weirder sequences, a cartoon scientist dressed in a chef's outfit starts with beach sand, purifies it in a blender, adds a "dash of arsenic" (!), cooks it in a boron oven, and voila! out pops a pan of solar cells from the oven, like cookies. (You're tempted to say, "Kids, don't try this at home," but the Bell System, in a separate but related public-education program, made available to public schools a do-it-yourself solar-cell kit, complete with a set of little fire bricks to build your boron-diffusion oven with.)
Your correspondent was too young at the time the film was released in 1956 to see it in its initial release on TV, but seven others came out over the next eight years, and chances are that anybody going to school in the 1960s saw at least one of these films. In an essay on the series, David Templeton notes that many young people who saw the films later became scientists and engineers, and some cite the series as at least one reason why they chose their technical fields. So in that sense, at least, it looks like Ma Bell got her money back.
It's hard to imagine anything like this taking place today, for a number of reasons. Telecomm monopolies have passed from the scene, corporate altruism is not popular with shareholders these days (what is?), and there is no chance in perdition that a modern film director could get by with the framing motif that Capra, a committed Catholic, chose to begin and end "Our Mr. Sun" with. The first words you see are "The Heavens declare the glory of God" (the first line of the Old Testament Psalm 19), and nearly the last words you hear are those of St. Francis of Assisi, who viewed Nature not as our mother, to be enslaved to, nor as our mistress or subject, to be exploited and dominated, but as our sister, to be loved, cared for, and regarded as a fellow creature of one's Creator. Of course, this was back when prayer in schools was not only permitted but often required by law. Whether getting rid of that kind of religious intrusion in public education has contributed to our current parlous state in which the future of engineering in the U. S. is at least somewhat in doubt, I will leave as a puzzle for the reader.
Sources: "Our Mr. Sun" can be viewed at the AVGeeks archive at http://www.archive.org/details/our_mr_sun. David Templeton's essay can be found at http://www.metroactive.com/papers/sonoma/09.23.99/bellscience-9938.html.
Monday, November 10, 2008
Watching Teenage Drivers with Webcams
Over two hundred teenagers in southern Maryland are now driving around with a webcam on the rear-view mirror of their cars. Whenever they turn or brake sharply, the resulting g-force triggers the camera to record a 20-second sequence of what went on inside and outside the car before and after the incident. These dynamic snippets go via the web to a company in San Diego that reviews them, attaches little helpful comments about how such dangerous driving incidents can be avoided, and notifies the teenager's parents that the video is now available for viewing.
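The news reports describe the trigger mechanism only in outline, but the idea of an accelerometer-triggered ring buffer is straightforward to sketch. The threshold, frame rate, and buffer lengths below are illustrative guesses on my part, not the actual product's parameters:

```python
from collections import deque

G_THRESHOLD = 0.5   # assumed trigger level, in g's (illustrative only)
FPS = 10            # assumed frame rate
PRE_SECONDS = 10    # seconds of video kept from before the trigger
POST_SECONDS = 10   # seconds recorded after the trigger

def record_trip(frames_and_gforces):
    """Yield roughly 20-second clips around hard braking or turning.

    `frames_and_gforces` is an iterable of (frame, g_force) pairs.
    A ring buffer always holds the most recent PRE_SECONDS of video,
    so each clip can include what happened *before* the incident.
    """
    buffer = deque(maxlen=PRE_SECONDS * FPS)
    post_remaining = 0
    clip = []
    for frame, g in frames_and_gforces:
        buffer.append(frame)
        if post_remaining == 0 and abs(g) >= G_THRESHOLD:
            # Trigger: start the clip with the buffered pre-event frames.
            clip = list(buffer)
            post_remaining = POST_SECONDS * FPS
        elif post_remaining > 0:
            clip.append(frame)
            post_remaining -= 1
            if post_remaining == 0:
                yield clip   # ship this clip off for review
                clip = []
```

Because the buffer always holds the most recent ten seconds, a saved clip shows what led up to the incident as well as its aftermath, which is what makes such snippets useful for coaching a driver afterward.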
Although deaths and injuries in automobile accidents have been declining slowly for years, over 40,000 people died in highway-related accidents in 2007. Anything that makes that number go down without severely compromising some other desirable outcome of automobile use is worth considering. And at first glance, the prospect of Mom or Pop looking over the teen driver's shoulder, so to speak, seems like a good idea. A similar study done in 2006 showed that drivers who started out triggering the webcam a lot with their jerky, high-risk driving eventually reduced their triggering rate (and thus drove more safely) by four-fifths. It's too early to tell whether a similar improvement will result from the Maryland experiment. But one thing is already clear: the teens don't like the idea, even though some grudgingly admit that the system has improved their driving.
Do the teens have a point? Is the webcam an intrusion on their privacy? Obviously it is, but then you have to ask whether the chance of saving someone's life is worth a little less privacy. And it's not as though the thing is on all the time. Teens do other things with, and in, cars that I'm sure they wouldn't want their parents to see. But when the company that operates the system says it won't forward anything that is, in its words, "embarrassing to the teen," that seems to be enough to satisfy most young drivers. Of course, if the company were ever to betray that trust, the entire system might suffer a black eye it would never recover from.
This system is just one example of how technology is making it possible to monitor more and more aspects of our daily lives, in ways that were unthinkable back in the days when George Orwell wrote 1984. One of the creepier images of that novel was the spy cameras everywhere, monitored by secret police whose presence the citizens were reminded of through the slogan "Big Brother is watching you." A sure-fire argument against that kind of thing ever happening in reality used to be that you'd never be able to man every camera everywhere, because eventually you'd end up hiring one half of the populace to watch the other half. But notice that the in-car webcam uses smart technology—namely, accelerometers—to select only those incidents worthy of study, thereby reducing the work of human editors to manageable proportions.
So as time goes on, it will be more and more practical to acquire webcam data on all sorts of activities, and still be able to handle the massive amounts of raw input intelligently. Is this a fundamental threat to privacy, liberty, and all that? Or is it a tempest in a teapot?
The answer hinges on those who are doing the spying, or monitoring, or whatever you want to call it. In the case we're discussing today, a private company is involved with consenting families, and if the company does anything out of line, they are liable to lose business fast. That's one of the best constraints against misbehavior. Governments do not have such a negative incentive, which is why government-sponsored monitoring of behavior can be more problematic. A case in point is the increasingly obtrusive safety inspections for airline passengers. In certain airports, systems are now in place that use millimeter-wave sensors to see through a person's clothes. The people who inspect these images are not co-located at the inspection point, but still, you wonder if and when this kind of thing will be abused.
It seems the best thing to do in these cases is to ask whether the system is doing any good. In the case of the in-car webcams, it looks like they may well improve driver safety, which is good for everybody. In other situations, such as in-flight security, it's harder to evaluate effectiveness except with tests in which people try to sneak by the inspection stations on purpose. And the news regularly carries reports that inspectors often fail these tests. On the other hand, we haven't had any U. S. planes get bombed or turned into flying missiles since Sept. 11, 2001, so something is working, at any rate.
The other factor to consider is the continuing decline in monitoring technology cost. The current webcam system costs $900 plus a $30 monthly fee, but if it proves popular, these costs could go down to where it would be offered as an option when you buy a new car. If insurance companies like it, you could get a discount on your teenager's insurance rate if you agreed to install the device. And once it's in there, it will work for everybody: the teens, Mom, Pop, and Grandpa. So one day we may all be driving around with spycams in our rear-view mirrors, who knows? Let's just hope the people operating the cameras then are as trustworthy as they are today.
Sources: An early report on the in-car webcam can be found at the Washington Post website for Oct. 24, 2008 at http://www.washingtonpost.com/wp-dyn/content/article/2008/10/23/AR2008102303821.html. The 2007 automotive fatality statistics can be found at the U. S. National Highway Traffic Safety Administration website http://www-nrd.nhtsa.dot.gov/Pubs/811017.PDF.
Monday, November 03, 2008
The Ethics of Externalities
You may have never heard of an externality, but engineers (as well as nearly everybody else) deal with them all the time without realizing it. The term comes from economics, and means an effect of an economic transaction that happens to somebody who was not directly involved in the transaction. That's pretty dry, so let me give a juicy example.
In the early nineteenth century, the chemical called sodium carbonate (washing soda) was obtained by burning a type of seaweed found off the coast of Spain. But when Napoleon ticked off England so much that the British blockaded French ports, that cut off France's supply of soda from Spain. The French government thereupon offered a prize for the best process of making soda without seaweed. A chemist named Le Blanc found that if he heated ordinary table salt with sulfuric acid, he got an intermediate chemical (sodium sulfate) that was easily transformed into washing soda. Le Blanc won the prize, the French were able to wash clothes again, and eventually the Le Blanc process took over as the main commercial way of making soda.
The trouble was that a by-product of the process was hydrochloric acid. At first the manufacturers just let it go up the chimney, but nearby farmers began to complain that it was killing their crops. This effect was an externality to the economy of making, buying and selling soda. Eventually chemical engineers found a way to capture the acid and sell it too, but not all air pollution problems are so easily solved.
One of the major externality issues these days is the problem of carbon dioxide emissions. Every time anyone burns a carbon-bearing fuel (coal, especially, but to some extent gasoline and oil as well), the resulting carbon dioxide goes into the air and plays some role (exactly how much isn't totally clear) in global warming. The prophecies about what consequences global warming will have if we don't do something about it range from the negligible to the apocalyptic. This fuzziness about how much carbon dioxide does what amount of damage is one of the classic problems with externalities. In a straightforward economic transaction between informed parties, the price paid says a lot about the relative value of the commodity. If the price goes up or down, that represents information that buyers, sellers, and even economists can use about the thing being traded. But the person harmed (or occasionally helped) by an externality doesn't spend any money, and therefore the economic equivalent of the externality is much harder to determine in many cases.
What have externalities got to do with engineering ethics? A lot, as it turns out. Many externalities are hidden, often from the traders and sometimes even from the third parties being affected. Returning to environmental externalities, such infamous incidents as the terrible ground pollution in the Love Canal area of Niagara Falls, New York came about because standard practices at the time allowed chemical companies to dump toxic waste into the ground with only minimal precautions, and nobody gave much thought to the possibility that someone in the far future might want to come along and build a school on the former toxic waste dump site.
Once you start looking for externalities, you'll see them everywhere. Anyone who buys a piece of new electronic gear is creating a future externality that arises when the thing is no longer useful: where does it go then? Into a landfill? A landfill next to whose property? Or maybe it goes off to some third-world reprocessing facility—then what? As the saying goes, "you can't just throw things away anymore because there is no 'away' anymore."
Even the cyberworld has externalities. Say some online game clogs up a server so much that other people just trying to get their work done experience slowdowns. That's an externality, and one that's hard to evaluate as well.
Just knowing about an externality doesn't mean that it's easy to deal with. Economists say you can pass laws or tax externalities to right the potential wrongs they represent. But that assumes you can put a value, either economic or moral, on the externality. Obviously, if a third party is injured or inconvenienced by some transaction that he or she has no control over, there is at least the potential for injustice, and in a just world such things wouldn't happen. Then you have to ask whether in the grand scheme of things, this particular injustice due to the externality is worth worrying about or fixing compared to everything else that's going on. This is the problem known as life, and we don't do life guiding here.
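A toy version of the economists' corrective-tax idea can make the point concrete. Suppose (the numbers here are entirely made up, loosely echoing the Le Blanc story above) that demand for soda is linear and each ton of output does a known dollar amount of crop damage; taxing that damage into the price shrinks output toward the socially efficient level:

```python
def market_quantity(demand_intercept, demand_slope, unit_cost):
    """Quantity where a linear demand curve P = a - b*Q meets a flat
    marginal cost: solve a - b*Q = cost for Q."""
    return (demand_intercept - unit_cost) / demand_slope

# Hypothetical soda market: demand P = 100 - 2*Q, private marginal
# cost of $40/ton, and an estimated $15/ton of acid damage to the
# neighboring farmers (the externality).
private_q = market_quantity(100, 2, 40)       # firms ignore the damage
social_q = market_quantity(100, 2, 40 + 15)   # tax makes them pay for it
```

The arithmetic is trivial; the ethical difficulty lies in the inputs, since the $15 damage figure is precisely the number that externalities make so hard to pin down.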
All I wanted to do in today's blog was to let you know about a concept that I have found useful in thinking about a wide variety of engineering ethics problems. Perhaps the most important thing about externalities is to recognize them when they occur. Depending on how serious they are, the ethical engineer may or may not want to address the issue, but if you don't see it, you'll never be able to do anything about it.
Sources: Wikipedia's article on "externality" is helpful, although it concentrates mainly on the economics of the concept. The story of Napoleon's washing comes from a modern reproduction copy of Asher & Adams' Pictorial Album of American Industry, 1876, p. 19.
All I wanted to do in today's blog was to let you know about a concept that I have found useful in thinking about a wide variety of engineering ethics problems. Perhaps the most important thing about externalities is to recognize them when they occur. Depending on how serious they are, the ethical engineer may or may not want to address the issue, but if you don't see it, you'll never be able to do anything about it.
Sources: Wikipedia's article on "externality" is helpful, although it concentrates mainly on the economics of the concept. The story of Napoleon and the washing soda comes from a modern reproduction copy of Asher & Adams' Pictorial Album of American Industry, 1876, p. 19.
Monday, October 27, 2008
Cheesy Products: The Case of the Solar-Powered Lamp
Less than a year ago, I responded to my wife's request to install a light on the stairway leading down from our back deck into the back yard. Her elderly father, who lives with us, goes down those stairs at night when he lets his dog out for the usual reason that dogs like to go out, and even with a flashlight the stairs can be tricky. She mentioned seeing advertisements for solar-powered light-emitting-diode (LED) lamps, so I found one at a local hardware chain store and screwed it to the stairway near the bottom.
From the start, the thing was somewhat of a disappointment. After dusk fell the first night, I was hoping its light would be enough to see the stairs by. But frankly, it reminded me of Mark Twain's candle supplied by a skinflint innkeeper. Twain complained that the candle was so dim he needed a second candle to see the first one by. If you looked carefully out in the back yard after dark, you'd see a dim bluish light hovering somewhere in the blackness, but it served more like a lighted buoy in a channel than as a source of illumination for the steps. Still, it was better than nothing.
Time went on, winter, spring, summer, and somewhere along the line, the lamp quit working. It was getting as much sunlight as it ever did, so I decided to do a post-mortem on the thing. It's 98% plastic, of course, and the works are all in the top. A white LED shines down on a conical reflector in the base that scatters light back up along a cylindrical diffuser behind a translucent white plastic box. Inside the lid I found connections to the solar cell itself (installed in a square opening on the top), a cadmium sulfide photocell, and wires leading to a circuit board. I haven't bothered to trace out the whole thing, but it looks like they used a CMOS-type integrated circuit (IC) to detect the light level with the photocell and switch the 2.4-volt rechargeable battery pack to the LED when it gets dark.
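For readers who like to see logic written down, here's a toy sketch of what I think that IC is doing. I haven't traced the actual circuit, so the threshold and cutoff numbers below are my guesses for illustration, not anything I measured:

```python
# A minimal sketch of the inferred dusk-switching logic in the lamp's controller.
# All numeric values are assumptions for illustration only.

def lamp_on(photocell_lux: float, battery_volts: float,
            dark_threshold_lux: float = 10.0,
            min_battery_volts: float = 2.0) -> bool:
    """Return True when the LED should be lit: the photocell says it is
    dark enough, and the 2.4 V rechargeable pack still has usable charge."""
    return photocell_lux < dark_threshold_lux and battery_volts >= min_battery_volts

# Midday: plenty of light, so the LED stays off and the cell charges the pack.
assert not lamp_on(photocell_lux=5000.0, battery_volts=2.4)
# After dusk: dark and charged, so the LED comes on.
assert lamp_on(photocell_lux=1.0, battery_volts=2.4)
# Small hours: dark but the pack has sagged, so the lamp gives up.
assert not lamp_on(photocell_lux=1.0, battery_volts=1.5)
```

A real controller would also want some hysteresis around the threshold so the lamp doesn't flicker at twilight, but the basic comparator-plus-switch idea is about this simple. Which is part of the point: the electronics were fine; the weather seal killed it.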
That is all fine and good, but this thing sat out in the weather. Somehow water got on the top (not a startling eventuality), made its way inside, and created a nice little rust spot on the circuit board next to the IC. CMOS ICs are notoriously sensitive to small leakage currents, and the conductive rust likely shorted out something or other, causing the entire apparatus to fail.
Now in the grand scheme of engineering ethics, this is not a big deal. My father-in-law didn't trip in the dark and break his hip, the monetary losses are small (twenty-five bucks or so, if I recall correctly), and after all, the lamp did give us nearly a year of service, such as it was. And I suppose it may have even come with a one-year warranty with which, if I bothered to fill out the paperwork, get a return authorization, and ship it back, I could get a new one. But what would happen then? In a year or so I'd have to do the same thing all over again.
Cheesy bargain-basement products intentionally made to last just long enough but no longer are nothing new. As industrialization during the 1800s made possible the mass production of stamped-metal products, complaints arose about how the market was flooded with goods that barely lasted long enough to take home. But somehow, there is always a market for a thing that's a little cheaper than the next comparable product, even if it doesn't work as well. These products are generally made by anonymous factories in Asian countries (the lamp in question is made in the People's Republic of China, to use the official name), sold under nice-sounding brand names (this unit carries the brand of Hampton Bay, which makes a decent line of ceiling fans), and carried by the Wal-Marts and Lowe's Hardwares of the world. And yes, it takes that kind of a system to deliver goods at the lowest prices possible.
But see what you get: a product whose impermanence is almost guaranteed. The problem I'm speaking of could have been prevented with a better weather seal, but that would have added manufacturing steps, labor costs, maybe some added R&D costs, and the price would have gone up a dollar or two. And with the ruthless international market to deal with, the designer said to heck with it, let's ship it as is.
Of course, in principle I could have spent a few more dollars and gotten a better product, but only if one were available. But I did the easy thing, which was to go to the big-box store, find the cheapest thing that did what I wanted (or at least claimed to), and bought it. Judging by the selection available, that's what most people do. I'm a believer in spending a little more if you know you'll get a better product that will last longer, but such options are not always available. In some areas of consumer electronics, the tendency is to drive toward the bottom of the product lineup, cutting costs while maintaining a minimum of functionality. And we as consumers vote with our money to encourage such behavior.
Was it wrong for that designer to neglect the problem that killed the lamp after less than a year? I can't say unequivocally yes, and yet this situation falls into a kind of gray area of ethics that I personally would not want to spend a lot of time in. Cutting corners and trusting to warranties to get you out of legal trouble does not add to one's reputation, but then brand recognition and reputation are such ephemeral things nowadays that I'm not sure anyone worries about them much anymore. If I'm in the market for a ceiling fan any time soon (although in contrast to this lamp, ceiling fans seem to run forever), I might consider Hampton Bay, but not for solar-powered lamps.
As for the backyard stairs, I went out and bought a set of three lights fed from a pole-mounted solar panel and battery pack. The new lights are much brighter than the old lamp, and you can actually see the stairs by them. But the other night my wife told me one of the lights had gone out, and a few days later the whole system died. I've just finished exchanging the solar unit for a new one and tracing out a short that then mysteriously disappeared. It works for now, but we'll see how long it lasts.
Monday, October 20, 2008
Ethics of Career Choice: Nuclear Engineering
Since most engineers are not self-employed, the type of work they do is largely determined by the organizations they work for. This means that one of the most ethically significant decisions engineers make is deciding on what job offer to accept. These days, it may seem like finding any engineering job at all is a challenge, but even in the worst of times you are still free to choose where to look for work. Today I'd like to show how you can begin to work through the ethical implications of a whole field of work, namely, nuclear engineering.
Only a small number of nuclear engineers design bombs, but the fact that the first application of nuclear fission was to kill thousands of Japanese in and around the cities of Hiroshima and Nagasaki in World War II has cast its shadow over the field ever since. There are those who are unalterably opposed to any use of nuclear energy, peaceful or otherwise. They argue that besides the danger of nuclear-weapons proliferation, the problem of nuclear waste hasn't been solved and the danger of a Chernobyl-type accident is too great to allow any further growth of nuclear power. (Chernobyl was the name of a city in the present country of Ukraine, then the USSR, near which a nuclear power plant exploded and caught fire in 1986, spewing tons of radioactive material over the countryside and forcing the evacuation of hundreds of thousands of people.)
On the other hand, both current U. S. Presidential candidates favor at least some expansion of nuclear generating capacity in the U. S. as a way of decreasing the nation's dependence on foreign oil. Nuclear generation releases essentially no greenhouse gases such as CO2, in contrast to the burning of fossil fuels such as oil or coal. Proponents argue that the waste problem is manageable and point to countries such as France that generate most of their electricity with nuclear-powered facilities.
You may have heard someone say that technology is ethically neutral, it's what human beings do with technology that makes for good or bad consequences. Most of the time, that view is at least an oversimplification, but it turns out to be almost exactly true of a particular kind of nuclear-related technology: the gas centrifuge. Naturally occurring uranium does not have a high enough percentage of the isotope U-235 to be useful either in nuclear reactors (which need slightly enriched uranium) or weapons (which need almost pure U-235). It turns out that the most efficient way to increase the fraction of the lighter U-235 isotope in uranium, compared to the heavier U-238 one, is to turn it into a gas by attaching six fluorine atoms to each uranium atom, and send the gas through an extremely high-speed centrifuge in the form of a hollow aluminum cylinder spinning in a vacuum. You need hundreds of these centrifuges to do the job, but they can be as small as only six feet high, and a centrifuge plant uses only about as much energy as a food-processing plant of the same size. The very same plant can be used either for making slightly enriched uranium for peaceful nuclear reactors, or highly enriched uranium for bombs.
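To see just how thin the line between the two uses is, consider a back-of-the-envelope cascade calculation. The per-stage separation factor of 1.3 used below is an assumed round number for illustration, not a figure for any real machine:

```python
import math

def ideal_stages(feed_frac: float, product_frac: float, alpha: float) -> int:
    """Rough number of enriching stages in an ideal cascade, using the
    abundance-ratio form n = ln(R_product / R_feed) / ln(alpha),
    where R = x / (1 - x) for isotopic fraction x."""
    r_feed = feed_frac / (1.0 - feed_frac)
    r_prod = product_frac / (1.0 - product_frac)
    return math.ceil(math.log(r_prod / r_feed) / math.log(alpha))

NATURAL = 0.00711  # U-235 fraction in natural uranium

reactor = ideal_stages(NATURAL, 0.04, alpha=1.3)  # reactor-grade (~4%) -> 7 stages
weapons = ideal_stages(NATURAL, 0.90, alpha=1.3)  # weapons-grade (~90%) -> 28 stages
```

Under these assumptions, going from reactor grade to weapons grade takes only about four times as many stages—or running material through the same cascade repeatedly. The hardware is identical either way, which is the whole dual-use problem in one arithmetic exercise.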
This is one reason why Iran's gas-centrifuge facilities are so controversial: Iran's government says it's for peaceful applications, but the International Atomic Energy Agency inspectors know the same facility could be used to make bomb-grade material. It all depends on what the engineers do with it.
We've only scratched the surface of what is clearly a complex and multifaceted issue. One of the world's most infamous parties in nuclear proliferation, a metallurgist named Abdul Qadeer Khan, used his knowledge of gas centrifuges to make plans available to black-market buyers such as Iran and Libya. Khan is a bad example of how technical knowledge can be abused, but even that viewpoint might be debated by some of the people in countries that benefited from his expertise.
Perhaps more than most other fields of engineering, nuclear engineering is fraught with ethical questions: What products are made? How safe are the facilities being designed? What are the long-term consequences of use for future generations? Young people who are uncomfortable dealing with these issues may consider other fields instead. But the field needs some good engineers—both in a technical and an ethical sense—if the benefits of the technology are to be realized with a minimum of harm.
Personally, I would like no better outcome concerning America's energy situation than if we built a bunch of safe, uniform, French-style nuclear plants, junked our gas-guzzling nineteenth-century internal-combustion cars, replaced them with electrics charged from the nuclear-powered grid, and got in a position to thumb our collective noses at foreign oil producers. But a lot of things would have to change before that vision is realized, and a lot of nuclear engineers would be involved in the change.
Sources: The September 2008 issue of Physics Today carried an informative article by H.G. Wood, A. Glaser, and R. S. Kemp on the gas centrifuge and its role in nuclear-weapons proliferation, pp. 40-45.
Monday, October 13, 2008
Expert Witnessing Ethically
I have never been an expert witness in a courtroom or legal situation. But I have known engineers who have been. And sooner or later, many engineers and academics who teach engineering may get a call from a legal firm wanting to pay for their services as an expert witness. What are the ethical implications of serving as a paid expert witness? Can you both take money from only one side of a contentious legal battle and still preserve your integrity and objectivity?
Let's look at a few of the issues that might arise. To keep things concrete, so to speak, let's say you are an expert in concrete, and have been called in by the owner of a shopping mall whose sidewalks are cracking up after only two years of use. The owner is suing the contractor who poured the sidewalks. Should you accept the job, and if you do, what are your ethical obligations?
Many professional codes of ethics have a lot to say about this kind of situation. Although electrical engineers don't deal much with concrete, the Institute of Electrical and Electronics Engineers (IEEE) has a code of ethics that is representative of many engineering codes, so we will take it as an example since I'm most familiar with the code of my own professional organization.
The first item in the IEEE code that speaks to our hypothetical question says that an engineer should "undertake technological tasks for others only if qualified by training and experience, or after full disclosure of pertinent limitations." In other words, if you don't know beans about concrete and how it cures and why it would crack, you shouldn't trade on your educational qualifications in an unrelated field just to impress a jury. Some lay persons are awed by anyone who can write "Ph. D." after their name, thinking that it confers infinite wisdom. Those of us who work around Ph. D.s every day know that except for the narrow specialization that the Ph. D. represents, people with doctorates in engineering tend to be all over the map when it comes to wisdom or judgment. And anyway, what your client is paying you for is specific technical expertise that you claim to have. If you don't have it, you've lied to your client, and besides, the defendant's lawyer, if he's any good at all, will take you apart with great glee during cross-examination. And being humiliated in front of a crowd of people would be just deserts for claiming expertise you don't have.
Suppose you are well qualified to pass judgment on the matter at hand. The plaintiff or his attorneys will offer you a fee for your services. There is generally nothing wrong with this, because everyone understands that an expert's time is valuable and in the course of ordinary affairs, people have to pay experts for their professional time. Of course, if you feel strongly about a certain matter and want to provide pro bono services (for free), there is nothing to stop you from doing that. However, most expert witnesses are paid for their time and effort, which may be considerable in a complicated technical matter, and this is nothing you should be ashamed of.
On the other hand, if there are other connections involved that would look fishy to outsiders, and you don't disclose them, you run up against two more ethical principles covered in the IEEE code: to "reject bribery in all its forms" and to "avoid real or perceived conflicts of interest wherever possible, and to disclose them. . . ." Say for instance that you're married to the owner's daughter and stand to inherit the shopping center when the owner passes on. Most people would say that there is at least a chance that this fact will influence your professional judgment, since you stand to gain a lot more than just your fee for your testimony in that situation. At the very least, this fact should be made known to everyone, to the defendant as well as your client, the plaintiff.
Well, what if you agree to testify, set a fee, and then find that, contrary to the owner's hopes, the contractor he sued wasn't at fault? Maybe the owner made false claims to the contractor about the nature of the subsoil, and it shifted and cracked what was otherwise perfectly good concrete. All sorts of weird things like that can happen to make a case turn out to be otherwise than what it looked like in the beginning. What do you do then?
There's something you're obliged to do, and then you take whatever consequences arise from it. You have to tell the owner the results of your study, whether they're good news or bad news for him. Obviously, if your testimony isn't going to help your client, he won't be wanting you to testify. Whether or not you get paid depends on the nature of your contract with the client. The fairest thing might be just to write it off as a learning experience and not send a bill, but if you spent several weeks working on the issue, you can't afford to undertake too many projects like that. But what you should not do under any circumstances is fudge the data or your analysis to make things look better for your client than they actually are. Here's where the real temptation comes, and here is where it has to be resisted, whether it loses you your fee or not.
Besides all these matters, I have my personal opinion I'll throw in at this point. Like chili powder in soup, expert witnessing is probably best if done sparingly. The picture conveyed to juries is generally that of a busy professional who does "real work" most of the time and has undertaken to benefit the legal system with the expertise that he or she has acquired elsewhere. But expert witnessing pays well, and some are tempted to turn it into a profitable enterprise that takes up a lot of their time. This doesn't seem to me like a good idea for someone who wishes to keep a professional edge on their technical expertise. While the temptations to bend the truth can be successfully resisted if you play the witness role only once in a while, being objective for only one side in a dispute is always a strain. Most people can support the strain once in a while, but I wouldn't advise making a career out of it.
To sum up, expert witnessing can be a genuine public service that can "improve the understanding of technology, its appropriate application, and potential consequences" (another item in the IEEE code of ethics). But this kind of work is fraught with more than the usual quota of ethical hazards, and it takes judgment and wisdom to negotiate them without slipping up. And not everybody—not even those with a Ph. D.—can do it successfully.
Sources: The IEEE Code of Ethics can be found at http://www.ieee.org/portal/pages/iportals/aboutus/ethics/code.html.
Let's look at a few of the issues that might arise. To keep things concrete, so to speak, let's say you are an expert in concrete, and have been called in by the owner of a shopping mall whose sidewalks are cracking up after only two years of use. The owner is suing the contractor who poured the sidewalks. Should you accept the job, and if you do, what are your ethical obligations?
Many professional codes of ethics have a lot to say about this kind of situation. Although electrical engineers don't deal much with concrete, the Institute of Electrical and Electronics Engineers (IEEE) has a code of ethics that is representative of many engineering codes, so we will take it as an example since I'm most familiar with the code of my own professional organization.
The first item in the IEEE code that speaks to our hypothetical question says that an engineer should "undertake technological tasks for others only if qualified by training and experience, or after full disclosure of pertinent limitations." In other words, if you don't know beans about concrete and how it cures and why it would crack, you shouldn't trade on your educational qualifications in an unrelated field just to impress a jury. Some lay persons are awed by anyone who can write "Ph. D." after their name, thinking that it confers indefinite wisdom. Those of us who work around Ph. D.s every day know that except for the narrow specialization that the Ph. D. represents, people with doctorates in engineering tend to be all over the map when it comes to wisdom or judgment. And anyway, what your client is paying you for is specific technical expertise that you claim to have. If you don't have it, you've lied to your client, and anyway, the defendant's lawyer, if he's any good at all, will take you apart with great glee during cross-examination. And being humiliated in front of a crowd of people would be just deserts for claiming expertise you don't have.
Suppose you are well qualified to pass judgment on the matter at hand. The plaintiff or his attorneys will offer you a fee for your services. There is generally nothing wrong with this, because everyone understands that an expert's time is valuable and in the course of ordinary affairs, people have to pay experts for their professional time. Of course, if you feel strongly about a certain matter and want to provide pro bono services (for free), there is nothing to stop you from doing that. However, most expert witnesses are paid for their time and effort, which may be considerable in a complicated technical matter, and this is nothing you should be ashamed of.
On the other hand, if there are other connections involved that would look fishy to outsiders, and you don't disclose them, you run up against two more ethical principles covered in the IEEE code: to "reject bribery in all its forms" and to "avoid real or perceived conflicts of interest wherever possible, and to disclose them. . . ." Say for instance that you're married to the owner's daughter and stand to inherit the shopping center when the owner passes on. Most people would say that there is at least a chance that this fact will influence your professional judgment, since you stand to gain a lot more than just your fee for your testimony in that situation. At the very least, this fact should be made known to everyone, to the defendant as well as your client, the plaintiff.
Well, what if you agree to testify, set a fee, and then find that, contrary to the owner's hopes, the contractor he sued wasn't at fault? Maybe the owner made false claims to the contractor about the nature of the subsoil, and it shifted and cracked what was otherwise perfectly good concrete. All sorts of weird things like that can happen to make a case turn out quite differently from how it looked at the beginning. What do you do then?
There's something you're obliged to do, and then you take whatever consequences arise from it. You have to tell the owner the results of your study, whether they're good news or bad news for him. Obviously, if your testimony isn't going to help your client, he won't want you to testify. Whether or not you get paid depends on the nature of your contract with the client. The fairest thing might be just to write it off as a learning experience and not send a bill, but if you spent several weeks working on the issue, you can't afford to undertake too many projects like that. What you should not do under any circumstances is fudge the data or your analysis to make things look better for your client than they actually are. Here is where the real temptation comes, and here is where it has to be resisted, whether it costs you your fee or not.
Besides all these matters, I have a personal opinion I'll throw in at this point. Like chili powder in soup, expert witnessing is probably best done sparingly. The picture conveyed to juries is generally that of a busy professional who does "real work" most of the time and has undertaken to benefit the legal system with the expertise that he or she has acquired elsewhere. But expert witnessing pays well, and some are tempted to turn it into a profitable enterprise that takes up a lot of their time. This doesn't seem to me like a good idea for someone who wishes to keep a professional edge on their technical expertise. The temptation to bend the truth can be successfully resisted if you play the witness role only once in a while, but being objective while working for only one side in a dispute is always a strain. Most people can bear that strain occasionally; I wouldn't advise making a career out of it.
To sum up, expert witnessing can be a genuine public service that can "improve the understanding of technology, its appropriate application, and potential consequences" (another item in the IEEE code of ethics). But this kind of work is fraught with more than the usual quota of ethical hazards, and it takes judgment and wisdom to negotiate them without slipping up. And not everybody—not even those with a Ph. D.—can do it successfully.
Sources: The IEEE Code of Ethics can be found at http://www.ieee.org/portal/pages/iportals/aboutus/ethics/code.html.
Monday, October 06, 2008
NASA At Fifty: A Modest Proposal
Five decades ago this month, a brand-new agency of the U. S. government called the National Aeronautics and Space Administration went into business. In 1969, only a little more than a decade later, NASA scored the biggest triumph of its short existence by putting men on the moon. While it would not be fair to say it's been downhill ever since, there is general agreement that NASA is now a troubled, conflicted, underfunded, and rudderless organization. As a recent Associated Press retrospective points out, the Space Shuttle is a flying antique that NASA can't afford to keep and can't afford to get rid of. The Shuttle is our only way of getting to the International Space Station, and current plans are that when (or if) the Shuttle retires, we will rely on the Russians until we can come up with a new vehicle on our own. These days, relying on the Russians looks about as smart as relying on the housing market to keep rising.
This is not to deny that NASA has pockets of excellence here and there. But a few pockets don't make a garment, and clearly something needs to be done about NASA. In the spirit of Jonathan Swift's "Modest Proposal," I offer the following suggestions.
One way to find out what NASA is really worth is to have a garage sale. You could have different sales for hardware—things like the Deep Space Network, Shuttle spare parts, the giant Vehicle Assembly Building (VAB) in Florida—and the software—outfits like the Goddard Space Flight Center in Maryland, the Jet Propulsion Laboratory in California, the Marshall Space Flight Center in Alabama, and so on. If this garage sale is anything like ones I've had, we'll have to offer some real bargains. On the other hand, I can see some entrepreneurs who might see possibilities in selling rides on high-G centrifuges and swims in zero-G swimming pools. Rocket-engine firings on test stands will always attract crowds on the Fourth of July. And think how many loft-style condos you could make out of the VAB, once the Florida real estate market comes back.
And here's an idea to make the sale go better. Instead of sending a bunch of dull old highly trained engineers up to the Space Station in the next Shuttle flight, we go around the world and offer free rides to the most popular entertainers in the world, regardless of nationality. I have no idea who these people might be, but you can ask any young Chinese or Russian or Indian, and I'm sure they'll have plenty of suggestions. We send them up there with a couple of years' supply of food, and then sit back and say, "Surprise, young people of the world! You've got to build the rocket to get them back!" This will do two things: it will probably move a lot more NASA surplus stuff off the shelves, and it will motivate a lot of young people to get interested in space flight real fast.
That ties in with my next idea: the deregulation of space. It is high time that we let the free market determine what we do out there, rather than a bunch of bureaucrats and politicians. Of course, the first step is advertising and publicity. The drama of rescuing those entertainers will make great reality TV. And of course, everybody wants to travel to places where famous people have been, so space tourism will get a tremendous boost. Tourism means motels, restaurants, and all the other things that go with development. Having your latte at an altitude of 200,000 miles will give a whole new meaning to the word "Starbucks."
Naysayers will object that space travel is expensive, dangerous, and ought not to be approached with the reckless exuberance of a prospector looking for gold in a newly discovered territory. I counter that this is exactly the attitude we want. Every new generation looks around for some object to focus its idealism on. There are people out there who want to travel in space more than anything else, and we ought to get clunky old organizations like NASA out of their way and let them. The free market will determine the size of the effort, whether it's one private-enterprise rocket a year or a weekly space-bus trip from starports around the world. The good pieces of NASA that can contribute will find their places in this new order of the ages, and the rest, well, some things are better off simply coming to an end.
Nothing will ever take away the fact that once upon a time, an organization of people and machines known as NASA put men on the moon. But that was close to forty years ago. Five hundred years ago, Queen Isabella funded Columbus's voyages to the New World. But nobody has tried to keep the Spanish court going ever since simply to send more Niñas and Pintas and Santa Marias out to do battle with the wind and the waves. If NASA's time has come to close its doors, let's at least try to get some of our money back in the process. And let's encourage the world at large to do what it really wants to do with space—by putting its money where its mouth is.
Sources: The AP story on NASA's troubled 50th anniversary can be found at http://www.newsvine.com/_news/2008/09/29/1930411-analysis-is-the-right-stuff-now-lost-in-space. Jonathan Swift's "Modest Proposal" for the Irish to solve their overpopulation and poverty problems by eating their children was obviously intended to be ironic, as is the case with my proposals above. Swift's original essay can be read at http://www.uoregon.edu/~rbear/modest.html.
Monday, September 29, 2008
Where Will China's Walk in Space Take Us?
Over the weekend, three Chinese astronauts landed safely in Inner Mongolia after completing a 68-hour flight that included a 20-minute spacewalk. After the burst of patriotism from the Chinese people that the world witnessed during the Beijing Olympics, China now has even more to celebrate. As a successful demonstration that China has mastered the extreme engineering complexities of manned space flight, the exploit's message is unambiguous. But as with any technology, its ethical implications depend on how it is used and why.
It's no surprise that I obtained one of the more comprehensive news reports on the flight from New Delhi Television Limited's website. As China's nearest large neighbor to the south, India is more than a little interested in any signs that China's ability to throw complicated machinery a long distance has improved. The space race between the old U. S. S. R. and the United States was about many things, but at its core was the technology needed to launch intercontinental ballistic missiles (ICBMs) halfway around the world. Just as war games provide a way for a country to show off its military might without actually fighting the enemy, the race to the moon provided the U. S. with a peaceful means of showing off the advanced state of our aerospace technology which, with relatively small modifications, was fully capable of blowing the U. S. S. R. to pieces.
Something similar is going on with China's space program, which has surprisingly long roots. As long ago as 1967, Chinese government officials announced their intentions to put a man in space. Unfortunately, a few things like the Great Cultural Revolution, Mao's demise, and the resultant governmental and social turmoil got in the way. It wasn't until 2003 that one Yang Liwei climbed aboard a rocket and became the first Chinese astronaut. But since then, the Chinese space program has made great strides. Considering that the U. S. took eight years to go from its first manned spaceflight in 1961 to the first moon landing in 1969, the Chinese program probably won't keep up quite that pace. But a moon landing is clearly in the works, as well as extensive Earth-orbiting doings such as a Chinese space station.
Unlike the International Space Station currently in orbit that involves astronauts and technology from numerous countries, China has chosen to go it alone almost completely in space. For many years the U. S. did the same, and it is tempting to lay out other parallels between the Chinese and U. S. space efforts. But they are different countries, and the reasons behind the Chinese space program may differ considerably from ours.
In a sense, space is the best of places and the worst of places. Some of the most idealistic and noble ambitions (and people, too) are directed toward the exploration of space, either for purely scientific reasons or for reasons of national prestige. The old Latin phrase ad astra ("to the stars") captures the quasi-religious feeling that many people have when they think about manned space exploration. At the same time, the worst kind of mass destruction that mankind is capable of inflicting, in the form of intercontinental ballistic missiles, would pass through the void of space on the way to vaporizing millions of people back on Earth. It is too early to tell what China will do with its new-found space capabilities. So far, all China has done is perform the same kind of stunts that the U. S. and the U. S. S. R. did in the harmless but significant space race of the 1960s. That race, you will recall, did result in the dissolution of one of the two parties, although how much the Soviet Union's diversion of resources to its space effort contributed to its demise is a fight for the historians.
China is a different situation altogether. Although it has territorial ambitions of its own, China is a much more homogeneous country than the U. S. S. R. ever was. And while I deplore dictatorships and Communist governments, from a technocratic point of view they can provide the long-range stability that tends to go away when you have a newly elected government every two to four years or so. Let's hope that China will put its efforts into showing how it can master the peaceful challenges of space, instead of one day trying to pull some kind of international space blackmail on the rest of the world.
Sources: Wikipedia has a good article on the Chinese space program. An article on the recent Chinese spacewalk can be found at http://www.ndtv.com/convergence/ndtv/story.aspx?id=NEWEN20080067038.
A Note to Readers Requesting Private Responses
From time to time, a reader may wish to communicate with me in a way that requests a private response: for example, a query for more information, a question requiring a specific answer, etc. Unfortunately, until recently I neglected to post my email address on the profiles page of this blog. (For the record, it is kdstephan@txstate.edu, and can also now be found on the profiles page.) Some readers unable to locate my email address sent queries to the comments section of this blog, assuming that I could obtain their email addresses from the system and respond.
That is not the case. The system does not reveal to me the email addresses of anyone who sends in a comment. So to those readers who sent me queries or requests via the comment section, I apologize for not responding and for not posting my email address. You probably think by now that I'm one of those arrogant bloggers who's too busy to respond to individual inquiries. I assure you that this is not the case; I simply cannot recover your email address when you post in the comments section. Of course, if you give your email address or other identifying information within the post itself, I can respond that way, but some correspondents failed to do that, or are reluctant to post their email address in a public location.
So in the future, please feel free to post comments either anonymously or with identifying information within the post. But if you expect an individual response from me, please either include your email in the post, or email me directly at kdstephan@txstate.edu. Thank you.
Monday, September 22, 2008
What Is Distributism, and Why Should Engineers Care?
Engineering is an unavoidably economic activity, since it always involves applying knowledge to achieve an end within the constraint of limited resources. Engineers have worked under every kind of economic system, from radical Communism to the nearly unrestrained free market of places like Singapore. There seems to be a growing consensus that the only kind of economic system with a future is free-market capitalism, which even the leaders of the People's Republic of China have embraced. I will now take a moment during this more-than-usually-political season to introduce you to a system that is more than economics and really more than politics, but would profoundly change both if it were adopted seriously. It is a third alternative to both capitalism and socialism, one that almost no one has heard of: distributism.
Historically, distributism was the way most economies operated in most parts of the world for centuries until the rise of the mercantile states in the seventeenth century, when capitalism began to take its modern form. Then socialism arose as an attempt to correct the flaws of capitalism, but sometimes the cure is worse than the disease. Both capitalism and socialism share many concepts in common, including the philosophical assumption that man is Homo economicus: that is, the most important thing about man is his economic activity and behavior. Socialism puts the government in charge of the economy and capitalism bows to the free market, but both systems assume that when you have solved the economic problem, you have solved the most important problems.
Distributism, which had its heyday in England in the 1930s, starts from a different place altogether. It says that the economy was made for man, and not man for the economy.
Here's a little quiz: how many of the following items do you find appealing? Never mind how they would come about, just react positively or negatively to each:
--- Working at home, rather than in an office at the end of a long commute
--- Eating fresh fruits and vegetables you grew yourself or bought from a local farmer
--- Owning your own business
--- Being better off economically for having children rather than the reverse
--- Buying things made and sold by people who live in your neighborhood
None of these things are impossible or cloud-cuckoo-land pipe dreams. Millions of Americans enjoy one or more of them every day. All these things, and more that space doesn't allow me to list, are pieces of a distributist program that would encourage movement toward the wider distribution of ownership of productive property. That is distributism in a nutshell.
Where would engineers fit in a distributist economy? That is a good question, but one I would have to take time off and write a book about to answer adequately. Because large-scale capitalism is so deeply entrenched worldwide, most engineers work for firms that are either large multinationals themselves or depend on them. It is silly to pretend that you could take a multi-billion-dollar semiconductor foundry and turn it into dozens of little mom-and-pop IC plants spread all over the world. But it may seem silly simply because no one has thought along those lines for decades.
Many technical innovations that have taken place since the 1930s are potentially very friendly to a distributist economy. For instance, before the advent of the Internet it was impossible for a three-person company with limited capital to do worldwide marketing of any kind. There were simply no advertising media that such a small company could afford. Now all it takes is a website and maybe some translation software, and there you are. Already many firms are outsourcing specific engineering functions to private contractors, although in a haphazard way motivated by capitalistic concerns rather than other factors. The profession of engineering itself began largely as a group of quasi-independent professionals with what amounted to consulting practices, rather than as large staffs of wage-earning employees, which is the norm today.
These are idle musings at this point, admittedly, but the point is that bigger is not always better, and more means exist today to make small, owner-operated engineering firms viable than possibly ever before. There will always be a need for large organizations to deal with large projects such as aerospace programs, public works, and so on. But they need not be the rule. One day they could be the rare exception in a distributist economy, in which most engineers would work either for themselves or in small local firms.
After decades of neglect, distributism is now seeing something of a renaissance, with books and websites showing up with some regularity. One of distributism's most prominent early exponents was the British author G. K. Chesterton, whose writings on distributism (The Outline of Sanity, Utopia of Usurers) are easier to find than some others. Wendell Berry, an author and farmer associated with what is known as the Southern Agrarian movement, takes positions that are often sympathetic to distributist principles. And the Amish, who are often thought to eschew all forms of technology, actually take advantage of certain modern technologies, but only after carefully considering how their use will affect individual and communal life.
You will probably never see a distributist candidate for President or a Distributist Party playing power politics. It is inherently a small-scale, local movement, but for that reason it can be much easier to live a practical distributist life here and now, in some ways, than it is to become an instant successful capitalist, for instance. If you think my treatment of distributism has been wacky and out of place, I promise not to bring it up again at least till after the November elections. But it's not impossible to imagine engineers doing well and doing good in a distributist economy as well as in the one we have now. And maybe, just maybe, things might be better than they are.
Sources: Books such as Distributist Perspectives I and II and Beyond Capitalism and Socialism are available from IHS Press (www.ihspress.com), which also publishes other works of Catholic social thought, where distributism finds many of its origins. On the web there are peppery blogs and information on distributism to be found in The Distributist Review at http://www.distributism.blogspot.com. IEEE Technology and Society Magazine carried an excellent article by Jameson Wetmore on the Amish and their attitude toward technology in its Summer 2007 issue, pp. 10-21.
Historically, distributism was the way most economies operated in most parts of the world for centuries, until the rise of the mercantile states in the seventeenth century, when capitalism began to take its modern form. Then socialism arose as an attempt to correct the flaws of capitalism, but sometimes the cure is worse than the disease. Capitalism and socialism share many concepts, including the philosophical assumption that man is Homo economicus: that is, the most important thing about man is his economic activity and behavior. Socialism puts the government in charge of the economy and capitalism bows to the free market, but both systems assume that when you have solved the economic problem, you have solved the most important problems.
Distributism, which had its heyday in England in the 1930s, starts from a different place altogether. It says that the economy was made for man, and not man for the economy.
Here's a little quiz: how many of the following items do you find appealing? Never mind how they would come about, just react positively or negatively to each:
--- Working at home, rather than in an office at the end of a long commute
--- Eating fresh fruits and vegetables you grew yourself or bought from a local farmer
--- Owning your own business
--- Being better off economically for having children rather than the reverse
--- Buying things made and sold by people who live in your neighborhood
None of these things are impossible or cloud-cuckoo-land pipe dreams. Millions of Americans enjoy one or more of them every day. All these things, and more that space doesn't allow me to list, are pieces of a distributist program that would encourage movement toward the wider distribution of ownership of productive property. That is distributism in a nutshell.
Where would engineers fit in a distributist economy? That is a good question, but one I would have to take time off and write a book about to answer adequately. Because large-scale capitalism is so deeply entrenched worldwide, most engineers work for firms that are either large multinationals themselves or depend on them. It is silly to pretend that you could take a multi-billion-dollar semiconductor foundry and turn it into dozens of little mom-and-pop IC plants spread all over the world. But it may seem silly simply because no one has thought along those lines for decades.
Many technical innovations that have taken place since the 1930s are potentially very friendly to a distributist economy. For instance, before the advent of the Internet it was impossible for a three-person company with limited capital to do worldwide marketing of any kind. There were simply no advertising media that such a small company could afford. Now all it takes is a website and maybe some translation software, and there you are. Already many firms are outsourcing specific engineering functions to private contractors, although in a haphazard way motivated by capitalistic concerns rather than other factors. The profession of engineering itself began largely as a group of quasi-independent professionals with what amounted to consulting practices, rather than as large staffs of wage-earning employees, which is the norm today.
These are idle musings at this point, admittedly, but the point is that bigger is not always better, and more means exist today to make small, owner-operated engineering firms viable than possibly ever before. There will always be a need for large organizations to deal with large projects such as aerospace programs, public works, and so on. But they need not be the rule; one day they could be the rare exception in a distributist economy, in which most engineers would work either for themselves or in small local firms.
After decades of neglect, distributism is now seeing something of a renaissance, with books and websites showing up with some regularity. One of distributism's most prominent early exponents was the British author G. K. Chesterton, whose writings on distributism (The Outline of Sanity, Utopia of Usurers) are easier to find than some others. Wendell Berry, an author and farmer associated with what is known as the Southern Agrarian movement, takes positions that are often sympathetic with distributist principles. The Amish, who are often thought to eschew all forms of technology, actually take advantage of certain carefully chosen modern technologies, but only after carefully considering how their use will affect their individual and communal life.
You will probably never see a distributist candidate for President or a Distributist Party playing power politics. It is inherently a small-scale, local movement, but for that reason it can be much easier to live a practical distributist life here and now, in some ways, than it is to become an instantly successful capitalist, for instance. If you think my treatment of distributism has been wacky and out of place, I promise not to bring it up again at least till after the November elections. But it's not impossible to imagine engineers doing well and doing good in a distributist economy as well as in the one we have now. And maybe, just maybe, things might be better than they are.
Sources: Books such as Distributist Perspectives I and II and Beyond Capitalism and Socialism are available from IHS Press (www.ihspress.com), which also publishes other works of Catholic social thought, where distributism finds many of its origins. On the web there are peppery blogs and information on distributism to be found in The Distributist Review at http://www.distributism.blogspot.com. IEEE Technology and Society Magazine carried an excellent article by Jameson Wetmore on the Amish and their attitude toward technology in its Summer 2007 issue, pp. 10-21.
Monday, September 15, 2008
Will Peers Process Patents Perspicaciously?
Well, once you get on one of those alliteration kicks, it's hard to stop. This is a story about a big problem with the U. S. patent system, which is of concern to any engineer whose work is valuable enough to patent, and about one small attempt to make it better.
For some years now, there has been general agreement that the patent system has major flaws. Basically, it's too easy to get a bad patent, and too easy to clog the legal system with patent lawsuits that never should have been started in the first place, based on overly broad patents that never should have been issued. Partly because patents are so easy to get, more are being filed every year, but the U. S. Patent and Trademark Office (USPTO) can't keep up—it now takes an average of more than two years to get a patent. And since many technologies such as software engineering come up with a whole new generation of products every few months or so, the patent system starts to look like a glacier stuck up on a mountain while a flood of water rushes by in the valley.
Part of the problem is that there aren't enough good patent examiners. Those are the government folks who pass judgment on whether a patent should be granted or not. The ideal patent examiner has advanced degrees in both law and a technical field, plus the patience and incorruptibility of a good detective. Such people have never been easy to find, and attracting them with a government pay scale is even harder. Faced with the rising flood of patents, patent examiners nowadays err on the side of generosity, allowing all sorts of patents through which in more rigorous days would have been tossed out. But to toss out a patent you need a good reason such as a citation of "prior art," and apparently doing a thorough job in that area is simply not something the patent office can handle very well anymore.
A recent news article highlights an attempt to improve the situation with something called Peer-to-Patent, a collaboration between the USPTO and New York Law School professor Beth Noveck. She has set up a website at which ordinary citizens (you and me included) can review selected patent applications, read and interpret the claims, cite prior art, and, in short, pretend to be patent examiners. If the "community" of volunteer examiners votes to forward your citations to the patent office, one of them may make a Top Ten list that the examiner actually consults in deciding whether the application is granted or, more likely if your prior-art citation is a good one, denied.
I viewed the little video on the site that gives an overview of the process. While it puts the best face on the matter, even my passing familiarity with patents (I have managed to obtain a couple over the years) tells me that to do a good job on just one application would require at least as much work as it takes to do your average income-tax form, if not more. When I read about the Peer-to-Patent idea, my first question was, "Why would anybody bother to donate several hours of their highly marketable expertise to such a thing?" and after looking at the website, my first question remains unanswered.
As a practical matter, the only people I can imagine who would want to fool with this and devote the serious amount of work it would take would be rivals of the inventors who made the original application, who are of course highly motivated to see it fail. If you translate this idea to a more familiar setting, I think you can see its problems better. Suppose you sue your neighbor for building a corner of his garage on your property. And suppose the judge in the civil suit, instead of hearing testimony from duly sworn-in experts such as surveyors and land-title experts, opens a website, posts the records of the case online, and invites all and sundry to make comments, without even requiring them to give their real names. (The Peer-to-Patent website doesn't require real names, although it is recommended that you not hide behind an alias.) Who is the person most motivated to get online and trash your side of the case? Your neighbor, of course, or maybe his lawyer.
The analogy is not exact, but it does seem to me that by asking for "volunteers" to put in such a large amount of work—effort that the government can't seem to hire on the open market—the site automatically selects only for the people who have the greatest motivation to criticize an application—that is, rivals of the original applicant who would dearly love to see it fail. And maybe that's exactly what Prof. Noveck is trying to do. But if that's the case, it seems more than a little hypocritical to pretend that the volunteers are random, public-spirited citizens whose only motivation is the honor of having one of their prior-art citations selected for use by the USPTO. I mean, wouldn't that send you into orbit for weeks? Maybe there are some people like that, but I'm not optimistic that they'll be coming out of the woodwork to make the Peer-to-Patent idea succeed.
You have to give the USPTO and the New York Law School credit for trying something. The other day I heard a review of a rather cynical book by a fellow who says that the real motivation for Republicans who gain control of the federal government is to make it run so badly that people will lose faith in it, and not complain when it eventually withers away to the small government of many decades ago. I sincerely hope he's wrong about that, since if he's right we have been watching institutionalized hypocrisy in action for a long time. But weird ideas like this Peer-to-Patent business make me wonder. Maybe I'm wrong and Peer-to-Patent will be the answer to many of the USPTO's problems. But we'll have to wait a while to see.
Sources: The Associated Press article on Peer-to-Patent by Joelle Tessler was carried by many papers, including the Baltimore Sun on Sept. 15 at http://www.baltimoresun.com/technology/bal-patent0915,0,1444023.story. The USPTO's main website is http://www.uspto.gov, and the Peer-to-Patent website is http://www.peertopatent.org.