Trust is a fragile thing. But it's also the mortar that holds organizations together. Two ongoing news items have brought to mind the critical role trust plays in engineering and what can happen when it's betrayed.
Shortly after the Sept. 11, 2001 attacks, envelopes containing a white powder that turned out to be anthrax spores showed up in the offices of two U.S. Senators and several news organizations, killing a total of five people and shutting down a Senate office building for a time. The FBI investigation of the incidents progressed largely out of public view until a scientist named Bruce Ivins at the U.S. Army Medical Research Institute of Infectious Diseases (USAMRIID, for short) committed suicide last week. Although much remains to be revealed about the situation, it appears that a recently developed genetic test has linked the anthrax spores used in the 2001 attacks to anthrax that Ivins was working on. Ironically, Ivins was one of several scientists the FBI called on to assist with the original investigation.
The second item concerned a computer engineer named Terry Childs, who worked for the city of San Francisco in a highly responsible position in which he had exclusive control of certain passwords needed to make changes in the city's computer systems. It looks like Mr. Childs and his colleagues got into some kind of dispute that devolved into Mr. Childs being arrested on four felony counts of computer tampering. When it was discovered that nobody else in San Francisco knew those passwords, Mayor Gavin Newsom accepted an invitation from Childs' attorney to meet Childs in person at the jail, and got the passwords out of him, thus averting a potential computer disaster if changes had needed to be made to the system.
Both of these cases are far from over, and I hold no particular brief for either side of either dispute. But if either Bruce Ivins or Terry Childs turns out to have done what it looks like they might have done, we've got two failures on our hands. And to continue the theme of double trouble, both failures are of two kinds.
First, the personal failures. Suppose Ivins in fact did what it seems the FBI thinks he may have done: took some of the anthrax spores he was developing solely for the purpose of coming up with defenses against them, and used them in real attacks. His motivation for such a heinous act can only be guessed at. One newswriter speculated that if Ivins was trying to gain attention and funding for what he thought was a neglected area of research, he succeeded, but at the price of five lives and the anxiety of millions. That kind of thing gets an F on anyone's moral calculus exam. And although Childs' accusations that the information technology department in San Francisco is corrupt and incompetently run may in fact be true, that doesn't justify his holding the entire system hostage by absconding with passwords, even though there were no service disruptions as a result of his actions. There is, I hope, little or no debate that these individuals did wrong if the accusations against them turn out to be true.
But what about the organizational failures? So often, engineering tragedies come about not because any one person did something wicked or devious, but simply because the system allowed small slipups and bits of ignorance here and there to cascade into a disaster. If Ivins really was able to take anthrax spores outside his lab and mail them from post offices in New Jersey, there is something wrong with the security system at USAMRIID. But short of 100% body searches of everyone coming in and out of the labs, I'm not sure how you would improve it.
I don't know what the organization's policy is on allowing scientists to work alone, but if they allow such things, maybe they ought to stop. If there are always at least two people present any time hot stuff like anthrax spores is being worked on, you now have to have a conspiracy in order to take some away for nefarious purposes. Conspiracies aren't impossible, but they're less likely than the act of a single individual with malicious intent working alone.
And the same goes for the San Francisco IT organization. Computer engineers can be notoriously poor communicators, and it is quite possible that nobody other than Childs knew that he had these powerful passwords under his exclusive control. There just seems to be something about the type of personality drawn to that line of work that delights in exclusive control of things. But once you trade your own personal computer games for a system that is essential for the safety and livelihoods of thousands of people, the penchant for exclusivity has to go out the window. No amount of organizational incompetence, personal distrust of others' motives, or the like can justify a computer engineer's taking matters into his or her own hands that way. This is an elementary lesson that ought to be drilled into the head of every computer-engineering student, but such uniformity in education is just a pipe dream at this point.
You can remember the lesson here with the adage, "two heads are better than one." Usually it's taken to mean that it's easier to solve problems with help, and that's true. But in technical organizations where life-critical matters are being dealt with, it's always dangerous when the system allows solitary individuals to do things that threaten the system's integrity. Rules enforcing the principle of never working alone or of always sharing system-critical passwords go against the personality grain of some types of engineers. But they're needed, and might have prevented the problems that were the focus of the news items we've just discussed.
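To make the password-sharing point concrete, here is a minimal sketch, in Python and purely illustrative (it does not reflect how San Francisco's network was actually administered, and the credential and share counts shown are hypothetical), of one way an organization can ensure that no single administrator holds a critical credential alone: split it into shares held by different people, so that reconstructing it requires everyone's cooperation.

    import secrets
    from functools import reduce

    def _xor(a: bytes, b: bytes) -> bytes:
        # Byte-wise XOR of two equal-length byte strings.
        return bytes(x ^ y for x, y in zip(a, b))

    def split_secret(secret: bytes, n_shares: int) -> list:
        # Split `secret` into n_shares pieces; every piece is needed to rebuild it.
        # Each share on its own is just random bytes and reveals nothing by itself.
        random_shares = [secrets.token_bytes(len(secret)) for _ in range(n_shares - 1)]
        final_share = reduce(_xor, random_shares, secret)
        return random_shares + [final_share]

    def recover_secret(shares) -> bytes:
        # XOR all shares together to recover the original secret.
        return reduce(_xor, shares)

    if __name__ == "__main__":
        password = b"core-router-enable-password"   # hypothetical credential
        shares = split_secret(password, 3)          # e.g., one share per senior admin
        assert recover_secret(shares) == password   # all three together recover it
        print("recovered:", recover_secret(shares).decode())

A real organization would more likely rely on a threshold scheme such as Shamir secret sharing (so that, say, any two of three custodians suffice if one is unavailable) or on an access-controlled password vault with audit logs; the point of the sketch is only that the design forces cooperation, which is the organizational "two heads" rule expressed in software.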
Sources: An early report on the Ivins case can be found in the Los Angeles Times at http://www.latimes.com/news/nationworld/nation/la-na-anthrax1-2008aug01,0,2864223.story. The San Francisco Chronicle reported on the Childs incident at http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2008/07/22/BAGF11T91U.DTL&tsp=1
A reader comments: Terry Childs did not abscond with anything. He was suspended from work. Do government employees who are tasked with securing systems have a duty to ensure that the systems are secured from unqualified users? The persons within the DTIS department who pressured Terry for access (complete with a threat of arrest) do not meet the minimum qualifications set by the city itself to administer the network.