Wednesday, July 26, 2006

Is MySpace a Safer Place?

Back on June 20, I wrote about the Texas Attorney General's efforts to track down cyber predators who abuse popular social-networking websites such as MySpace. At last report, he had rounded up eighty alleged criminals who tried to meet cute under-age girls or boys for nefarious purposes, only to find themselves at the wrong end of a sting operation. The very next day, on June 21, MySpace announced a series of new restrictions to help fix the problem. I am certain that this blog played no role in MySpace's decision, but it is equally certain that publicity about the potential for abuse, as well as the potential for lawsuits, did have an effect.

According to an Associated Press report, the changes make it impossible for anyone registered as being over 18 to view the full profiles of members under 16, unless the older user knows the younger one's email address or full name. (MySpace has long had a lower age limit of 14.) While this is undoubtedly an improvement, the report also pointed out that MySpace simply takes a user's word about age. There is still nothing like the credit-card verification mechanism recommended by the Texas Attorney General to verify the user's age by independent means. So if I decided to masquerade as a 14-year-old boy in order to view the full profiles of 14-year-old girls, I could still do so.
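The rule the AP describes, together with the loophole it leaves open, can be sketched as a simple access-control check. This is a hypothetical illustration; the function name and logic are my assumptions, not MySpace's actual code:

```python
# Hypothetical sketch of the profile-visibility rule described in the AP
# report: a user registered as over 18 may not view the full profile of a
# member under 16 unless the older user already knows the younger one's
# email address or full name. All names here are illustrative assumptions.

def can_view_full_profile(viewer_age, target_age,
                          knows_email=False, knows_full_name=False):
    """Return True if the viewer may see the target's full profile."""
    if viewer_age > 18 and target_age < 16:
        # The older user must already know the younger one offline.
        return knows_email or knows_full_name
    return True  # no restriction applies in other cases

# The loophole: ages are self-reported, so an adult who simply registers
# as a 14-year-old passes the check with no independent verification.
print(can_view_full_profile(viewer_age=14, target_age=14))  # prints True
```

The single unverified input, `viewer_age`, is the whole weakness: without something like the credit-card check the Texas Attorney General recommends, the guard condition never fires for a lying adult.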

The controversy over MySpace is just one battle in the larger war about privacy and technology. These days, "technology" usually means computers, networks, and the whole communications infrastructure of iPods, websites, and other hardware and software that makes us the most connected society in history. In examining a problem, engineers sometimes like to cook up a worst-case scenario in which everything that could conceivably go wrong does go wrong. If the system they are designing nevertheless withstands such a perfect storm of Murphy's Law ("whatever can go wrong will go wrong"), then the engineers can generally breathe a sigh of relief that the system will make it through more likely incidents in which only some things go wrong. Of course, this assumes that the system is simple enough, and the engineers are imaginative enough, to come up with a truly worst-case situation. But even if these conditions don't always apply, the technique is still a useful one.

What is a worst-case scenario in terms of privacy and technology? The answer may depend on what your own worst fears are.

Say you feel strongly that your financial matters are nobody else's business, and that you value your good credit rating. Your worst cyber-privacy nightmare might then be to have your identity stolen by a gang of hot-check-writing, heroin-using, credit-card-busting criminals who pay for a million-dollar orgy of consumer spending with your financial resources and then flee the country, leaving your credit rating in tatters that will take years to repair.

Say that you like to speak your mind about politics or anything else. Then your worst fears might be that a kind of super-Patriot Act would allow the government to spy on everything you email, blog, say, or see online. Imagine what Joseph Stalin would have done with a Communist version of the Internet. In the old days of manual telephone taps and flesh-and-blood spies, the ability of a government to spy on its citizens was limited by the fact that you could hire only so many spies, and there were never enough to keep tabs on all the citizens all the time. But new automated spyware has lifted that restriction and brought the blessings of increased productivity to the espionage business. My blog on "Engineering Censorship in China" shows how a totalitarian government can use technology to monitor or censor the online activities of over a billion people, with the help of companies like Microsoft.

Say that you have a rare genetic disorder that has a good, but not certain, chance of striking you as a young adult. It won't be fatal, but it will require many thousands of dollars' worth of specialized health care over the rest of your lifetime. Do you want your prospective employers or health insurance companies to know this fact about you? Even if they say they will not let it influence their decisions about you, do you believe them? There are laws currently under consideration by the U. S. Congress that would mandate the electronic storage of medical data, which is now largely maintained in the form of paper files. This change would not by itself let any Joe or Jane off the street see your medical records, but it is not clear that it would safeguard them perfectly, either.

In each of these cases, something that was at first intended to be a good, convenient, or more efficient way of doing things gets twisted around and used to harm. Systems designed to make it easier to buy things also make it easier to steal things. Those who built features into the Internet to encourage the small-d democratic exchange of ideas now find that some governments use it to repress ideas. Attempts to make medical records more accurate and accessible can also hurt someone with a costly medical problem if insurers or employers use their medical records against them. And a great idea about how to bring people closer together with technology-assisted social networking occasionally helps cyber predators carry out their evil intentions.

While there are many laws of physics that engineers ignore at their peril, there is also one principle of human behavior that is equally important. It goes by various names. In the Christian tradition, it is called "original sin," which means that everyone on Earth has an inherent tendency to do the wrong thing, even while knowing the right thing. G. K. Chesterton called this doctrine "the only part of Christian theology which can really be proved." The proof, of course, is empirical: there has never been a technology in actual use that has not ended up causing at least some harm as well as good. And it is foolish to design anything without taking this tried-and-true human factor into account.

Sources: The Associated Press report on MySpace's new restrictions is at One view of the issue of medical privacy rights (the patient-advocate view) can be found at The Chesterton quote is from Orthodoxy (New York: Doubleday, 1990, orig. published 1908), p. 15.

Wednesday, July 19, 2006

The Big Dig in Big Trouble

Boston's Big Dig project to put much of Interstate 93 underground and extend I-90 to Logan Airport spanned parts of two centuries and cost more than any other single highway project in the United States. On July 11, when the project was mostly finished and people in Massachusetts thought they could begin to put the disruption and cost overruns behind them, a three-ton ceiling tile came loose in a connector tunnel and killed a newlywed woman. Further investigation has revealed that over a thousand fasteners used to hold up similar tiles are probably defective. What can we learn from all this?

The first lesson is an old one: nothing draws attention like death and destruction. According to a report by Sean Murphy and Raja Mishra in the July 18 Boston Globe, lab tests of the epoxy glue used to hold the fasteners in place were originally scheduled during construction. But officials of Bechtel/Parsons Brinckerhoff, the engineering firm in charge of the Big Dig, felt so confident in the epoxy that they canceled the tests. Now it looks like the tests would have been a good idea, because they might have revealed the kind of problems that ultimately led to the fatal ceiling collapse. But there was no immediate harm that resulted from skipping the tests, so the incident went by unnoticed.

The next lesson is one we hear starting in kindergarten: be sure to follow instructions. Engineering is a constant battle between expensive over-caution on the one hand, and reckless negligence on the other hand. Where lives are at stake, as in the construction of bridges and tunnels, laws require licensed engineers to sign off on plans and specifications. But all the licensed engineers in the world won't do any good if the contractors and builders don't carry out the engineers' instructions to the letter.

Speculation by experts centers on the possibility that the epoxy used to hold the concrete ceiling tiles up was either not prepared and applied correctly, or used with oily steel. Steel as it comes from the factory has a thin coating of oil on it, and unless this oil is cleaned off prior to use, adhesives such as epoxy cannot form a good bond. Even if the steel was clean, the widely varying temperatures at a Boston construction site may have interfered with the chemical changes that epoxy goes through in order to harden. Inadequately hardened plastic adhesives can "creep" under stress, moving a tiny fraction of an inch every month, until the entire joint fails. Whatever was done wrong, it appears to have been done wrong consistently, because Governor Mitt Romney has announced that over 1300 fasteners are suspect and will have to be removed or replaced.
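As a toy illustration of the creep mechanism just described, assume an under-cured anchor slips a small, steady amount each month and lets go after some fixed total slip. Both numbers below are made up for illustration; they are not figures from the Big Dig investigation:

```python
# Toy linear-creep model of an epoxy ceiling anchor. The creep rate and the
# slip at which the anchor pulls free are assumed illustrative values, not
# data from the actual investigation.

creep_rate = 0.01      # assumed slip per month, inches
slip_at_failure = 0.5  # assumed total slip at which the anchor lets go, inches

months_to_failure = slip_at_failure / creep_rate
print(months_to_failure)  # -> 50.0, i.e. failure years after construction
```

The point of the exercise: a joint that passes every inspection at installation can still fail years later, which is why a slow failure mode like creep is so easy to overlook.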

Further investigations will eventually reveal what went wrong, and possibly who was responsible. Structural engineering is based mostly on physical science, and things don't generally fall down for no reason at all. But finding the physical cause gets us only part way toward preventing similar accidents in the future. Until the human organizations that let such things happen are repaired and kept in order, the same thing can happen again. In a way, it has.

The Boston tunnel collapse is strangely similar in some ways to a much more serious tragedy that happened twenty-five years ago this month. On July 17, 1981, several hundred people gathered on suspended concrete walkways to watch a dance party in the newly opened Hyatt Regency Hotel in Kansas City, Missouri. The walkways were held up by steel rods which should have been strong enough to support the weight of the crowd. If they had been installed according to the original engineering plan, everything would have been fine. But on the site, a contractor decided to make a subtle change in the way the rods were made and assembled. This change greatly weakened the structure and caused it to collapse that evening, killing 114 people and injuring more than 200. Again, we had heavy concrete slabs, dangerous to life, suspended by thin steel rods. Again, if the plans had been carried out to the letter, the disaster would not have occurred. This is not to say that nobody should ever suspend heavy concrete slabs with thin steel rods again, or that engineers never make mistakes. They do. But the point is that responsibility inheres not only in those who make plans, but also in those who carry them out and those charged with making sure that the work agrees with the plans.
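The documented change at the Hyatt was to replace each continuous rod running through both stacked walkways with two shorter offset rods, so that the lower walkway hung from the upper one instead of from the ceiling. A back-of-the-envelope sketch (normalized, illustrative loads; not figures from the investigation) shows why that roughly doubled the load on the upper connections:

```python
# Normalized load comparison for the Hyatt walkway rod change. The loads are
# illustrative units, not investigation figures.

upper_load = 1.0  # upper walkway plus its crowd
lower_load = 1.0  # lower walkway plus its crowd

# Original design: a continuous rod carries the lower walkway's load straight
# up to the ceiling, so the upper walkway's connection bears only its own load.
original_connection_load = upper_load

# As built: the lower walkway hangs from the upper one, so the upper
# walkway's connection must carry both loads.
as_built_connection_load = upper_load + lower_load

print(as_built_connection_load / original_connection_load)  # prints 2.0
```

A "subtle" field change, in other words, doubled the demand on a connection that had little margin to begin with.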

Everyone involved in a building project, from those who pay for it, to the architects and engineers, to the contractors, to inspectors, down to the lowliest laborer cleaning up afterwards, has to walk that same line between over-caution and reckless carelessness. Since the vast majority of engineering projects work without major failures or loss of life, we can assume that most of these folks do their jobs well enough most of the time. But an accident like the Big Dig tunnel collapse reminds us of what has to happen at every step of the way, and what can go wrong if somebody doesn't pay enough attention to details that don't seem to matter at the time.

Sources: The Boston Globe articles cited are at (Gov. Romney's announcement) and (the neglected lab tests). A string of technical discussions on the general subject of epoxy ceiling fasteners and how they can fail is at the Engineering Tips website The Wikipedia article about the Kansas City Hyatt Regency walkway collapse is at

Monday, July 10, 2006

Counterfeit Electronics: Coming to a Store Near You

Nine days ago, on July 1, 2006, it became illegal in the European Union to sell electronics that contain more than a very small amount of lead, mercury, cadmium, and a few other hazardous substances. These new Restriction of Hazardous Substances (RoHS) regulations present a golden opportunity for electronics counterfeiters to re-label and re-package lead-containing electronics to look like they meet the RoHS requirements.

What is electronics counterfeiting? Anyone who has strolled through a crowded street-level market in New York City has had the chance to buy things like "Rolix" watches and maybe even "Ipods" (not "iPods"). This kind of counterfeiting, where someone makes a cheap imitation of an expensive product and labels it with an almost-like name, is pretty easy to spot and avoid. But it is only the tip of a huge iceberg that costs legitimate manufacturers up to $100 billion a year in lost revenue, according to some estimates.

Most of the counterfeiting goes on far out of sight of consumers, among the thousands of manufacturers, suppliers, and parts brokers who provide the components for both consumer items and industrial electronics systems. Electronics supply chains are increasingly global, and increasingly use the Internet as a marketing and communications tool. The problem with global Internet-based supply chains is that purchasers and suppliers rarely meet face-to-face. This makes it easy for an unethical firm to pose as a legitimate manufacturer and resell used ICs salvaged from old computers as new parts, for example. Another ploy is to relabel cheap, poorly performing parts as expensive, better-performing ones. The manufacturer who trusts the part's label and builds a bogus two-dollar IC into a five-hundred-dollar motherboard, which thereupon fails, has a huge financial headache on its hands. Even worse, the part can perform just well enough to leave the factory, only to fail when it gets to the consumer.

A recent article in IEEE Spectrum Magazine by Michael Pecht and Sanjay Tiku describes some of the ways manufacturers can guard against these problems. One obvious way would be to test parts as they arrive. Years ago, this practice was not uncommon, but it is costly and recent trends have been to move component testing away from the user and toward the supplier. But this requires a level of trust between supplier and user that some suppliers obviously don't deserve.

If the supply chain consisted of just two links, a manufacturer might be able to vet each supplier thoroughly and establish trustworthiness that way. But take the example of a criminally incompetent supplier a few years ago who stole a formula for the electrolyte used in electrolytic capacitors, a very common type of cheap electronic component. He got the formula wrong, but went ahead and mixed up a batch anyway and sold it to some capacitor manufacturers. They used it to make their capacitors and sold the capacitors to a board-making company, which sold the boards to computer makers. Some time later, the bad electrolyte began to fail and ruined hundreds, if not thousands, of computers. There were at least five links in this defective supply chain, not counting middlemen, and the only problem was at the head of the chain, where it was hard to detect. The harm in this case was a flurry of failed computers, but suppose a bad capacitor went into a heart pacemaker? The harm that counterfeit parts cause isn't only financial. Reputations can be ruined and people can die. But connecting the dots to find out who was responsible is often an impossible task.

Counterfeit electronics is an obvious case of unethical engineering. Someone with enough technical expertise to know what parts are in demand and how to fake them is profiting illegally and immorally from counterfeiting of this kind. Although it happens all over the world, including the United States, the fact that a huge part of all electronics manufacturing is done in Asia means that many counterfeiters also hail from the East. Ironically, a friend of mine who is a native of Hong Kong characterizes the engineering environment in China in recent years as "the wild wild West," associating it with California gold rushes, wide-open cities, and general hell-raising. This anything-goes atmosphere encourages fly-by-night counterfeiting operations and worse. Although China has anti-counterfeiting laws on the books and stages highly publicized raids on counterfeiters from time to time, the sheer volume of fake goods produced means that most fakers never get caught.

If there weren't so many fakers in the first place, things would improve on their own. What if more engineers in China joined professional organizations with a strong commitment to ethical behavior? The Chinese government is suspicious of any organization that is not tightly under its control, but it would certainly have no objection to professional organizations that oblige their members not to engage in counterfeiting.

By many measures, the economies in China, India, Malaysia, Singapore, and elsewhere in Asia are still maturing. In the 1800s, when the British Empire's economy vastly overshadowed that of the United States, it was very common for unethical U. S. publishers to print unauthorized editions of British authors' works. Eventually, an international copyright agreement was hammered out, and as more U. S. publishers agreed to pay royalties to British authors, British publishers did the same for U. S. authors, and the marketplace became more efficient overall. Something like this may take place in Asia, but first, as in the United States, the professional culture will have to change.

Counterfeiting electronics, like counterfeiting money, is an act that benefits the counterfeiter substantially (for a while, anyway) while spreading harm randomly and diffusely everywhere else. There will always be some criminals, but wherever there are enough professionals to band together to take common action and to declare themselves committed to upholding the highest principles of their profession, they can bring about a change in their culture. And this is something no amount of law enforcement can do.

Sources: The IEEE Spectrum article "Bogus" is at

Thursday, July 06, 2006

Willie Nelson, Environmental Engineer

The last time I drove from San Marcos up to Fort Worth on Interstate 35, I passed a billboard that bore the grizzled visage of Willie Nelson, the living legend of country music. But instead of advertising his latest album, the billboard urged me to go "BioWillie." Mr. Nelson, it turns out, is using his popularity among truckers to promote biodiesel, a fuel in which diesel made from animal and vegetable fats is blended with ordinary petroleum diesel. A recent New York Times article said that he drives cars and runs tractors on his farm which are modified to operate on 100% renewable oil, which according to some reports makes the exhaust smell like French fries. So far, biodiesel is available at only a few truck stops, mostly in Texas, but the entertainer has high hopes that his environmentally friendly fuel will become at least as popular as his music.

Does this make Willie Nelson an environmental engineer? I'm not sure he can even spell "methyl ester," much less synthesize it from the used restaurant frying oil that forms much of the raw stock that his refinery uses to make the stuff. But his interest in biodiesel and his clever promotion of the fuel to a market of likely users shows the kind of imagination and initiative that characterizes good engineers.

For that matter, the definition of a good engineer has been changing. It used to be the case in my grandfather's day that technical ability was the only thing expected of engineers. Before the dawn of the computer age, designs of any complexity, from a bridge to a telephone network, needed lengthy, tedious calculations combined with the kind of judgment learned only from experience. But today, technical expertise surpassing even the best of the earlier engineers has been canned into computer software packages. It requires a different kind of genius to use these packages, but the need to spend time on all the nitty-gritty details is less than it used to be.

In many fields, engineers have been freed by these changes to consider other matters beyond the strictly technical features of a project. These include safety concerns, marketing and cost factors, manufacturing problems, and environmental issues. Not that the earlier engineers ignored these factors altogether. But back then simply getting a design to work took so much effort that the other things didn't receive as much attention as they could have.

Biodiesel is a good example of a product whose appeal derives from the simple fact that it is made or grown in an environmentally friendly way, even if it costs more and doesn't perform much better than a competing product. These so-called "soft" issues can actually be harder to deal with than the "hard" technical questions, which nowadays can often be settled in a few computer runs rather than having to build prototype after prototype until the right combination of design factors falls together. And the soft issues are where engineering ethics comes in.

Take for example the thing that is making biodiesel and other bio-derived fuels such as ethanol (made from corn) so attractive: the relatively high price of oil, seventy-five dollars a barrel at this writing. There is one school of economic thought that favors minimal interference in markets, from the convenience store down the street to the global market for oil or any other commodity. If oil becomes too expensive, they say, people will scout around for other ways to get from A to B: a hybrid car, biodiesel, hydrogen, or even a bicycle. In the meantime, such meddlesome practices as higher fuel taxes to force drivers to conserve are counterproductive. When the price of oil gets high enough, the chance to make money with alternative fuels will attract inventors, engineers, and entrepreneurs like Willie Nelson, and in the meantime, we should leave things alone.

That argument is fine as far as it goes, but the trouble is, sometimes it doesn't go far enough. Simple free-market analyses often leave out what are called "externalities." These are things like air pollution, global warming, and other effects that result from the use of a certain commodity, but are not easily expressed in terms of the commodity's market cost. In an insightful article in IEEE Technology and Society Magazine, regional planning expert Clint Andrews showed what happens if you look at global energy costs in recent history and include the externality of military expenditures.

Andrews supposes for the sake of argument that concerns over energy security represent half of the reasons that the U. S. went to war in Iraq in 2003. Estimating the annual cost of the war at $40 billion, half of that figure is $20 billion a year. Andrews points out that $20 billion is also about what the U. S. spends on imported Persian Gulf oil annually. So if we include only half of a modest estimate of what we spend on the Iraq war as an externality of our oil supply and "internalize" it, we really spend $40 billion a year, not $20 billion. And of course this neglects the cost in human lives, which is—or should be—incalculable.
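Andrews' internalization arithmetic is simple enough to lay out explicitly, using the estimates quoted above:

```python
# Andrews' back-of-the-envelope externality calculation, with the dollar
# figures quoted in the text. Variable names are my own.

annual_war_cost = 40e9    # estimated annual cost of the Iraq war, dollars
energy_share = 0.5        # fraction attributed to energy-security concerns
gulf_oil_spending = 20e9  # annual U.S. spending on Persian Gulf oil, dollars

externality = annual_war_cost * energy_share      # $20 billion a year
internalized_cost = gulf_oil_spending + externality

print(internalized_cost / 1e9)  # prints 40.0 (billion dollars per year)
```

Doubling the apparent cost of a commodity is exactly the kind of result a pure market-price analysis misses, which is Andrews' point.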

Andrews concludes that while a reasonably free market is a necessary condition to good energy policies, it isn't sufficient. When you include externalities such as wars and other government interventions in energy markets (the billions of dollars in state and federal highway taxes are another example), we are very far from the ideal free market envisioned by libertarians.

An ethical engineer will not simply sell technical services to the highest bidder, but will also think about the far-reaching effects of each project or job. That's exactly what Willie Nelson is doing with his French-fry-smelling tractors and BioWillie billboards. May all engineers do the same.

Sources: Willie Nelson's activities in biodiesel were described in an article by Eric O'Keefe in the New York Times on July 5, 2006 at Mr. Nelson's website describing his project is at Clinton Andrews' article "Energy security as a rationale for government action" was in the Summer 2005 issue of IEEE Technology and Society Magazine, available through many university libraries and at